Sample records for project dose code

  1. Acute Radiation Risk and BRYNTRN Organ Dose Projection Graphical User Interface

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Hu, Shaowen; Nounu, Hatem N.; Kim, Myung-Hee

    2011-01-01

    The integration of risk projection models for organ dose and acute radiation risk in human space applications has been a key problem. NASA has developed an organ dose projection model using the BRYNTRN and SUMDOSE computer codes, and a probabilistic model of Acute Radiation Risk (ARR). BRYNTRN is a baryon transport code and SUMDOSE is an output data processing code; the risk projection models of organ doses and ARR take the output from BRYNTRN as input to their calculations. Because BRYNTRN operation requires extensive input preparation, a graphical user interface (GUI) is needed to handle input and output so that the response models can be connected to BRYNTRN easily and correctly. The GUI for the ARR and BRYNTRN Organ Dose (ARRBOD) projection code provides seamless integration of the input and output manipulations required to operate the ARRBOD modules (BRYNTRN, SUMDOSE, and the ARR probabilistic response model) in assessing the acute risk and organ doses from significant Solar Particle Events (SPEs). The ARRBOD GUI is intended for mission planners, radiation shield designers, space operations personnel in the Mission Operations Directorate (MOD), and space biophysics researchers. The assessment of astronauts' radiation risk from SPEs supports mission design and operational planning to manage radiation risks in future space missions. The ARRBOD GUI can identify proper shielding solutions using gender-specific organ dose assessments in order to avoid ARR symptoms and to stay within the current NASA short-term dose limits.
The quantified evaluation of ARR severities for any given shielding configuration and a specified EVA or other mission scenario can then guide alternative solutions for attaining the objectives set by mission planners. The ARRBOD GUI estimates the whole-body effective dose, organ doses, and acute radiation sickness symptoms for astronauts, from which operational strategies can be developed for the protection of astronauts from SPEs in the planning of future lunar surface scenarios, exploration of near-Earth objects, and missions to Mars.
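The data flow this record describes, a transport run (BRYNTRN), output post-processing (SUMDOSE), and a probabilistic response model (ARR) chained behind one interface, can be sketched as a simple pipeline. Everything below is a toy illustration: the function names, dose formula, and response curve are invented stand-ins, not the real FORTRAN/C codes.

```python
# Toy sketch of an ARRBOD-style pipeline. All numbers and formulas are
# illustrative placeholders, not the real BRYNTRN/SUMDOSE/ARR physics.

def run_transport(shield_g_cm2):
    # Stand-in for a BRYNTRN run: thicker shielding -> lower organ dose.
    return {"organ_dose_cGy": 10.0 / (1.0 + shield_g_cm2)}

def sum_dose(transport_out):
    # Stand-in for SUMDOSE-style post-processing of transport output.
    return transport_out["organ_dose_cGy"]

def arr_symptom_risk(dose_cGy):
    # Stand-in for the ARR probabilistic response model: a simple
    # saturating dose-response curve.
    return dose_cGy / (dose_cGy + 50.0)

def arrbod_pipeline(shield_g_cm2):
    # The GUI's job in the abstract: wire the three modules together.
    dose = sum_dose(run_transport(shield_g_cm2))
    return dose, arr_symptom_risk(dose)

# Thicker shielding should lower both the dose and the projected risk.
thin = arrbod_pipeline(1.0)
thick = arrbod_pipeline(10.0)
```

The point of the sketch is the wiring, not the physics: the GUI's value is that the transport output is handed to the response models programmatically rather than by manual file manipulation.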

  2. Identification of Trends into Dose Calculations for Astronauts through Performing Sensitivity Analysis on Calculational Models Used by the Radiation Health Office

    NASA Technical Reports Server (NTRS)

    Adams, Thomas; VanBaalen, Mary

    2009-01-01

    The Radiation Health Office (RHO) determines each astronaut's cancer risk by using models that associate the radiation dose astronauts receive from spaceflight missions with risk. The baryon transport code (BRYNTRN), the high charge (Z) and energy transport code (HZETRN), and computer risk models are used to determine the effective dose received by astronauts in low Earth orbit (LEO). These codes use an approximation of the Boltzmann transport equation. The purpose of the project is to run the codes for various International Space Station (ISS) flight parameters in order to gain a better understanding of how they respond to different scenarios. The project will determine how variations in one set of parameters, such as the point in the solar cycle and the altitude, can affect the radiation exposure of astronauts during ISS missions. This project will benefit NASA by improving mission dosimetry.
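The sensitivity study described above amounts to a parameter sweep: hold the model fixed, vary one input set (solar-cycle point, altitude), and compare the resulting doses. A minimal sketch, with an invented surrogate function standing in for an actual HZETRN/BRYNTRN run:

```python
import itertools

def effective_dose(solar_phase, altitude_km):
    # Toy surrogate for a transport-code run: dose grows with altitude
    # and is higher near solar minimum (phase 0). Purely illustrative
    # numbers, not a real LEO environment model.
    return (0.1 + 0.001 * altitude_km) * (1.5 - 0.5 * solar_phase)

phases = [0.0, 0.5, 1.0]      # 0 = solar minimum, 1 = solar maximum
altitudes = [340, 400, 460]   # approximate ISS altitude range, km

# Sweep every combination and tabulate the dose response.
results = {(p, a): effective_dose(p, a)
           for p, a in itertools.product(phases, altitudes)}
```

In the real study each table entry would be a full transport-code run; the sweep structure, however, is exactly this nested iteration over scenario parameters.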

  3. Development of Graphical User Interface for ARRBOD (Acute Radiation Risk and BRYNTRN Organ Dose Projection)

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee; Hu, Shaowen; Nounu, Hatem N.; Cucinotta, Francis A.

    2010-01-01

    The space radiation environment, particularly solar particle events (SPEs), poses the risk of acute radiation sickness (ARS) to humans; and organ doses from SPE exposure may reach critical levels during extravehicular activities (EVAs) or within lightly shielded spacecraft. NASA has developed an organ dose projection model using the BRYNTRN with SUMDOSE computer codes, and a probabilistic model of Acute Radiation Risk (ARR). The codes BRYNTRN and SUMDOSE, written in FORTRAN, are a baryon transport code and an output data processing code, respectively. The ARR code is written in C. The risk projection models of organ doses and ARR take the output from BRYNTRN as an input to their calculations. BRYNTRN code operation requires extensive input preparation. With a graphical user interface (GUI) to handle input and output for BRYNTRN, the response models can be connected to BRYNTRN easily, correctly, and in a user-friendly way. A GUI for the Acute Radiation Risk and BRYNTRN Organ Dose (ARRBOD) projection code provides seamless integration of input and output manipulations, which are required for operations of the ARRBOD modules: BRYNTRN, SUMDOSE, and the ARR probabilistic response model. The ARRBOD GUI is intended for mission planners, radiation shield designers, space operations personnel in the mission operations directorate (MOD), and space biophysics researchers. The ARRBOD GUI will serve as a proof-of-concept example for future integration of other human space applications risk projection models. The current version of the ARRBOD GUI is a new self-contained product and will have follow-on versions as options are added: 1) human geometries of MAX/FAX in addition to CAM/CAF; 2) shielding distributions for spacecraft, Mars surface and atmosphere; 3) various space environmental and biophysical models; and 4) other response models to be connected to BRYNTRN.
The major components of the overall system, the subsystem interconnections, and external interfaces are described in this report; and the ARRBOD GUI product is explained step by step in order to serve as a tutorial.

  4. Development of Safety Assessment Code for Decommissioning of Nuclear Facilities

    NASA Astrophysics Data System (ADS)

    Shimada, Taro; Ohshima, Soichiro; Sukegawa, Takenori

    A safety assessment code, DecDose, for decommissioning of nuclear facilities has been developed, based on experience from the decommissioning project of the Japan Power Demonstration Reactor (JPDR) at the Japan Atomic Energy Research Institute (currently JAEA). DecDose evaluates the annual exposure dose of the public and workers according to the progress of decommissioning, and also evaluates the public dose in accidental situations, including fire and explosion. For the public, both internal and external doses are calculated by considering inhalation, ingestion, direct radiation from radioactive aerosols and radioactive deposition, and skyshine radiation from waste containers. The external dose for workers is calculated as the dose rate from the contaminated components and structures to be dismantled. The internal dose for workers is calculated by considering dismantling conditions, e.g., cutting speed, cutting length of the components, and exhaust velocity. Estimation models for dose rate and staying time were verified by comparison with the actual external doses of workers acquired during the JPDR decommissioning project. The DecDose code is expected to contribute to the safety assessment for decommissioning of nuclear facilities.

  5. Overview of Graphical User Interface for ARRBOD (Acute Radiation Risk and BRYNTRN Organ Dose Projection)

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Hu, Shaowen; Nounu, Hatem N.; Cucinotta, Francis A.

    2010-01-01

    Solar particle events (SPEs) pose the risk of acute radiation sickness (ARS) to astronauts, because organ doses from large SPEs may reach critical levels during extravehicular activities (EVAs) or within lightly shielded spacecraft. NASA has developed an organ dose projection model of the baryon transport code (BRYNTRN) with an output data processing module, SUMDOSE, and a probabilistic model of acute radiation risk (ARR). BRYNTRN code operation requires extensive input preparation, and the risk projection models of organ doses and ARR take the output from BRYNTRN as an input to their calculations. With a graphical user interface (GUI) to handle input and output for BRYNTRN, these response models can be connected easily and correctly to BRYNTRN in a user-friendly way. The GUI for the Acute Radiation Risk and BRYNTRN Organ Dose (ARRBOD) projection code provides seamless integration of input and output manipulations required for operations of the ARRBOD modules: BRYNTRN, SUMDOSE, and the ARR probabilistic response model. The ARRBOD GUI is intended for mission planners, radiation shield designers, space operations personnel in the mission operations directorate (MOD), and space biophysics researchers. Assessment of astronauts' organ doses and ARS from exposure to historically large SPEs supports mission design and operational planning to avoid ARS and stay within the current NASA short-term dose limits. The ARRBOD GUI will serve as a proof-of-concept for future integration of other risk projection models for human space applications. We present an overview of the ARRBOD GUI product, which is a new self-contained product, covering the major components of the overall system, subsystem interconnections, and external interfaces.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKenzie-Carter, M.A.; Lyon, R.E.; Rope, S.K.

    This report contains information to support the Environmental Assessment for the Burning Plasma Experiment (BPX) Project proposed for the Princeton Plasma Physics Laboratory (PPPL). The assumptions and methodology used to assess the impact to members of the public from operational and accidental releases of radioactive material from the proposed BPX during the operational period of the project are described. A description of the tracer release tests conducted at PPPL by NOAA is included; dispersion values from these tests are used in the dose calculations. Radiological releases, doses, and resulting health risks are calculated and summarized. The computer code AIRDOS-EPA, which is part of the computer code system CAP-88, is used to calculate the individual and population doses for routine releases; FUSCRAC3 is used to calculate doses resulting from off-normal releases where direct application of the NOAA tracer test data is not practical. Where applicable, doses are compared to regulatory limits and guideline values. 48 refs., 16 tabs.

  7. Regional Atmospheric Transport Code for Hanford Emission Tracking (RATCHET). Hanford Environmental Dose Reconstruction Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramsdell, J.V. Jr.; Simonen, C.A.; Burk, K.W.

    1994-02-01

    The purpose of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate radiation doses that individuals may have received from operations at the Hanford Site since 1944. This report deals specifically with the atmospheric transport model, Regional Atmospheric Transport Code for Hanford Emission Tracking (RATCHET). RATCHET is a major rework of the MESOILT2 model used in the first phase of the HEDR Project; only the bookkeeping framework escaped major changes. Changes to the code include (1) significant changes in the representation of atmospheric processes and (2) incorporation of Monte Carlo methods for representing uncertainty in input data, model parameters, and coefficients. To a large extent, the revisions to the model are based on recommendations of a peer working group that met in March 1991. Technical bases for other portions of the atmospheric transport model are addressed in two other documents. This report has three major sections: a description of the model, a user's guide, and a programmer's guide. These sections discuss RATCHET from three different perspectives. The first provides a technical description of the code with emphasis on details such as the representation of the model domain, the data required by the model, and the equations used to make the model calculations. The technical description is followed by a user's guide to the model with emphasis on running the code. The user's guide contains information about the model input and output. The third section is a programmer's guide to the code. It discusses the hardware and software required to run the code. The programmer's guide also discusses program structure and each of the program elements.

  8. DOSE COEFFICIENTS FOR LIVER CHEMOEMBOLISATION PROCEDURES USING MONTE CARLO CODE.

    PubMed

    Karavasilis, E; Dimitriadis, A; Gonis, H; Pappas, P; Georgiou, E; Yakoumakis, E

    2016-12-01

    The aim of the present study is the estimation of radiation burden during liver chemoembolisation procedures. Organ dose and effective dose conversion factors, normalised to dose-area product (DAP), were estimated for chemoembolisation procedures using a Monte Carlo transport code in conjunction with an adult mathematical phantom. Exposure data from 32 patients were used to determine the exposure projections for the simulations. Equivalent organ doses (HT) and effective doses (E) were estimated using individual DAP values. The organs receiving the highest doses during these procedures were the lumbar spine, liver and kidneys. The mean effective dose conversion factor was 1.4 Sv Gy^-1 m^-2. Dose conversion factors can be useful for estimating patient-specific radiation burden during chemoembolisation procedures.
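Applying the reported mean conversion factor is a one-line calculation: multiply the patient's DAP by 1.4 Sv Gy^-1 m^-2 to get an effective dose estimate. A sketch, where the example DAP value is invented for illustration:

```python
def effective_dose_Sv(dap_Gy_m2, k_Sv_per_Gy_m2=1.4):
    # E = k * DAP, using the study's mean conversion factor
    # k = 1.4 Sv Gy^-1 m^-2. The DAP below is a made-up example.
    return k_Sv_per_Gy_m2 * dap_Gy_m2

e = effective_dose_Sv(0.05)  # a hypothetical 0.05 Gy·m² procedure
```

Note this gives an effective dose for a reference mathematical phantom; patient-specific anatomy would shift individual organ doses.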

  9. MO-E-18C-02: Hands-On Monte Carlo Project Assignment as a Method to Teach Radiation Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pater, P; Vallieres, M; Seuntjens, J

    2014-06-15

    Purpose: To present a hands-on project on Monte Carlo methods (MC) recently added to the curriculum and to discuss the students' appreciation. Methods: Since 2012, a 1.5 hour lecture dedicated to MC fundamentals follows the detailed presentation of photon and electron interactions. Students also program all sampling steps (interaction length and type, scattering angle, energy deposit) of a MC photon transport code. A handout structured in a step-by-step fashion guides students in conducting consistency checks. For extra points, students can code a fully working MC simulation that computes a dose distribution for 50 keV photons. A kerma approximation to dose deposition is assumed. A survey was conducted to which 10 out of the 14 attending students responded. It compared MC knowledge prior to and after the project, questioned the usefulness of radiation physics teaching through MC and surveyed possible project improvements. Results: According to the survey, 76% of students had no or a basic knowledge of MC methods before the class and 65% estimate to have a good to very good understanding of MC methods after attending the class. 80% of students feel that the MC project helped them significantly to understand simulations of dose distributions. On average, students dedicated 12.5 hours to the project and appreciated the balance between hand-holding and questions/implications. Conclusion: A lecture on MC methods with a hands-on MC programming project requiring about 14 hours was added to the graduate study curriculum in 2012. MC methods produce "gold standard" dose distributions and are slowly entering routine clinical work, so a fundamental understanding of MC methods should be a requirement for future students. Overall, the lecture and project helped students relate cross-sections to dose depositions and presented the numerical sampling methods behind the simulation of these dose distributions. Research funding from governments of Canada and Quebec. 
PP acknowledges partial support by the CREATE Medical Physics Research Training Network grant of the Natural Sciences and Engineering Research Council (Grant number: 432290)
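The first sampling step the students implement, drawing an interaction (free-path) length, is typically done by inverse-CDF sampling of the exponential attenuation law, s = -ln(u)/mu. A minimal sketch of that single step (not the course's actual handout code):

```python
import math
import random

def sample_path_length(mu_per_cm, rng):
    # Inverse-CDF sampling of the exponential free-path distribution:
    # u ~ Uniform(0,1], s = -ln(u)/mu. Using 1 - random() keeps u > 0.
    return -math.log(1.0 - rng.random()) / mu_per_cm

def mean_free_path_estimate(mu_per_cm, n, seed=1):
    # Consistency check in the spirit of the handout: the sample mean
    # should converge to the analytic mean free path 1/mu.
    rng = random.Random(seed)
    return sum(sample_path_length(mu_per_cm, rng) for _ in range(n)) / n

# With mu = 0.2 cm^-1 the mean free path is 5 cm.
est = mean_free_path_estimate(0.2, 200_000)
```

The remaining steps (interaction type, scattering angle, energy deposit) follow the same pattern: sample a uniform deviate and invert the relevant cumulative distribution.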

  10. Minimizing human error in radiopharmaceutical preparation and administration via a bar code-enhanced nuclear pharmacy management system.

    PubMed

    Hakala, John L; Hung, Joseph C; Mosman, Elton A

    2012-09-01

    The objective of this project was to ensure correct radiopharmaceutical administration through the use of a bar code system that links patient and drug profiles with on-site information management systems. This new combined system would minimize the amount of manual human manipulation, which has proven to be a primary source of error. The most common reason for dosing errors is improper patient identification when a dose is obtained from the nuclear pharmacy or when a dose is administered. A standardized electronic transfer of information from radiopharmaceutical preparation to injection will further reduce the risk of misadministration. Value stream maps showing the flow of the patient dose information, as well as potential points of human error, were developed. Next, a future-state map was created that included proposed corrections for the most common critical sites of error. Transitioning the current process to the future state will require solutions that address these sites. To optimize the future-state process, a bar code system that links the on-site radiology management system with the nuclear pharmacy management system was proposed. A bar-coded wristband connects the patient directly to the electronic information systems. The bar code-enhanced process linking the patient dose with the electronic information reduces the number of crucial points for human error and provides a framework to ensure that the prepared dose reaches the correct patient. Although the proposed flowchart is designed for a site with an in-house central nuclear pharmacy, much of the framework could be applied by nuclear medicine facilities using unit doses. An electronic connection between information management systems to allow the tracking of a radiopharmaceutical from preparation to administration can be a useful tool in preventing the mistakes that are an unfortunate reality for any facility.

  11. Peak Dose Assessment for Proposed DOE-PPPO Authorized Limits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maldonado, Delis

    2012-06-01

    The Oak Ridge Institute for Science and Education (ORISE), a U.S. Department of Energy (DOE) prime contractor, was contracted by the DOE Portsmouth/Paducah Project Office (DOE-PPPO) to conduct a peak dose assessment in support of the Authorized Limits Request for Solid Waste Disposal at Landfill C-746-U at the Paducah Gaseous Diffusion Plant (DOE-PPPO 2011a). The peak doses were calculated based on the DOE-PPPO Proposed Single Radionuclides Soil Guidelines and the DOE-PPPO Proposed Authorized Limits (AL) Volumetric Concentrations available in DOE-PPPO 2011a. This work is provided as an appendix to the Dose Modeling Evaluations and Technical Support Document for the Authorized Limits Request for the C-746-U Landfill at the Paducah Gaseous Diffusion Plant, Paducah, Kentucky (ORISE 2012). The receptors evaluated in ORISE 2012 were selected by the DOE-PPPO for the additional peak dose evaluations. These receptors included a Landfill Worker, Trespasser, Resident Farmer (onsite), Resident Gardener, Recreational User, Outdoor Worker and an Offsite Resident Farmer. The RESRAD (Version 6.5) and RESRAD-OFFSITE (Version 2.5) computer codes were used for the peak dose assessments. Deterministic peak dose assessments were performed for all the receptors and a probabilistic dose assessment was performed only for the Offsite Resident Farmer at the request of the DOE-PPPO. In a deterministic analysis, a single input value results in a single output value. In other words, a deterministic analysis uses single parameter values for every variable in the code. By contrast, a probabilistic approach assigns parameter ranges to certain variables, and the code randomly selects the values for each variable from the parameter range each time it calculates the dose (NRC 2006). The receptor scenarios, computer codes and parameter input files were previously used in ORISE 2012. A few modifications were made to the parameter input files as appropriate for this effort.
Some of these changes included increasing the time horizon beyond 1,050 years (yr), and using the radionuclide concentrations provided by the DOE-PPPO as inputs into the codes. The deterministic peak doses were evaluated within time horizons of 70 yr (for the Landfill Worker and Trespasser), 1,050 yr, 10,000 yr and 100,000 yr (for the Resident Farmer [onsite], Resident Gardener, Recreational User, Outdoor Worker and Offsite Resident Farmer) at the request of the DOE-PPPO. The time horizons of 10,000 yr and 100,000 yr were used at the request of the DOE-PPPO for informational purposes only. The probabilistic peak of the mean dose assessment was performed for the Offsite Resident Farmer using Technetium-99 (Tc-99) and a time horizon of 1,050 yr. The results of the deterministic analyses indicate that among all receptors and time horizons evaluated, the highest projected dose, 2,700 mrem/yr, occurred for the Resident Farmer (onsite) at 12,773 yr. The exposure pathways contributing to the peak dose are ingestion of plants, external gamma, and ingestion of milk, meat and soil. However, this receptor is considered an implausible receptor. The only receptors considered plausible are the Landfill Worker, Recreational User, Outdoor Worker and the Offsite Resident Farmer. The maximum projected dose among the plausible receptors is 220 mrem/yr for the Outdoor Worker and it occurs at 19,045 yr. The exposure pathways contributing to the dose for this receptor are external gamma and soil ingestion. The results of the probabilistic peak of the mean dose analysis for the Offsite Resident Farmer indicate that the average (arithmetic mean) of the peak of the mean doses for this receptor is 0.98 mrem/yr and it occurs at 1,050 yr. This dose corresponds to Tc-99 within the time horizon of 1,050 yr.
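The deterministic-versus-probabilistic distinction quoted from NRC 2006 can be shown in a few lines: one run with fixed parameter values versus many runs with each parameter drawn from its assigned range. The dose formula and parameter ranges below are invented stand-ins, not actual RESRAD inputs:

```python
import random

def dose_model(distribution_coeff, ingestion_rate):
    # Toy stand-in for a RESRAD-style dose calculation; the formula
    # and parameter names are illustrative only.
    return 0.5 * distribution_coeff + 2.0 * ingestion_rate

# Deterministic: a single value for every parameter -> a single dose.
det = dose_model(1.0, 0.3)

# Probabilistic: draw each parameter from a range on every realization,
# then summarize the resulting distribution of doses.
rng = random.Random(42)
samples = [dose_model(rng.uniform(0.5, 1.5), rng.uniform(0.1, 0.5))
           for _ in range(10_000)]
mean_dose = sum(samples) / len(samples)
```

With symmetric ranges centred on the deterministic values, the mean of the probabilistic doses converges to the deterministic result; the added value of the probabilistic run is the spread, e.g. the peak-of-the-mean statistic reported above.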

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKenzie-Carter, M.A.; Lyon, R.E.

    This report contains information to support the Environmental Assessment for the Compact Ignition Tokamak (CIT) Project proposed for Princeton Plasma Physics Laboratory (PPPL). The assumptions and methodology used to assess the impact to members of the public from operational and accidental releases of radioactive material from the proposed CIT during the operational period of the project are described. A description of the tracer release tests conducted at PPPL by NOAA is included; dispersion values from these tests are used in the dose calculation. Radiological releases, doses, and resulting health risks are calculated. The computer code AIRDOS-EPA is used to calculate the individual and population doses for routine releases; FUSCRAC3 is used to calculate doses resulting from off-normal releases where direct application of the NOAA tracer test data is not practical. Where applicable, doses are compared to regulatory limits and guideline values. 44 refs., 5 figs., 18 tabs.

  13. Calculated organ doses for Mayak production association central hall using ICRP and MCNP.

    PubMed

    Choe, Dong-Ok; Shelkey, Brenda N; Wilde, Justin L; Walk, Heidi A; Slaughter, David M

    2003-03-01

    As part of an ongoing dose reconstruction project, equivalent organ dose rates from photons and neutrons were estimated using the energy spectra measured in the central hall above the graphite reactor core located in the Russian Mayak Production Association facility. Reconstruction of the work environment was necessary due to the lack of personal dosimeter data for neutrons in the time period prior to 1987. A typical worker scenario for the central hall was developed for the Monte Carlo Neutron Photon-4B (MCNP) code. The resultant equivalent dose rates for neutrons and photons were compared with the equivalent dose rates derived from calculations using the conversion coefficients in the International Commission on Radiological Protection Publications 51 and 74 in order to validate the model scenario for this Russian facility. The MCNP results were in good agreement with the results of the ICRP publications indicating the modeling scenario was consistent with actual work conditions given the spectra provided. The MCNP code will allow for additional orientations to accurately reflect source locations.

  14. Automation of PCXMC and ImPACT for NASA Astronaut Medical Imaging Dose and Risk Tracking

    NASA Technical Reports Server (NTRS)

    Bahadori, Amir; Picco, Charles; Flores-McLaughlin, John; Shavers, Mark; Semones, Edward

    2011-01-01

    Purpose: To automate astronaut organ and effective dose calculations from occupational X-ray and computed tomography (CT) examinations incorporating PCXMC and ImPACT tools, and to estimate the associated lifetime cancer risk per the National Council on Radiation Protection & Measurements (NCRP), using MATLAB(R). Methods: NASA follows guidance from the NCRP on its operational radiation safety program for astronauts. NCRP Report 142 recommends that astronauts be informed of the cancer risks from reported exposures to ionizing radiation from medical imaging. MATLAB(R) code was written to retrieve exam parameters for medical imaging procedures from a NASA database, calculate associated dose and risk, and return results to the database, using the Microsoft .NET Framework. This code interfaces with the PCXMC executable and emulates the ImPACT Excel spreadsheet to calculate organ doses from X-rays and CTs, respectively, eliminating the need to utilize the PCXMC graphical user interface (except for a few special cases) and the ImPACT spreadsheet. Results: Using MATLAB(R) code to interface with PCXMC and replicate the ImPACT dose calculation allowed for rapid evaluation of multiple medical imaging exams. The user inputs the exam parameter data into the database and runs the code. Based on the imaging modality and input parameters, the organ doses are calculated. Output files are created for record, and organ doses, effective dose, and cancer risks associated with each exam are written to the database. Annual and post-flight exposure reports, which are used by the flight surgeon to brief the astronaut, are generated from the database. Conclusions: Automating PCXMC and ImPACT for evaluation of NASA astronaut medical imaging radiation procedures allowed for a traceable and rapid method for tracking projected cancer risks associated with over 12,000 exposures.
This code will be used to evaluate future medical radiation exposures, and can easily be modified to accommodate changes to the risk calculation procedure.
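The batch pattern described, reading exam parameters from a database, dispatching to the X-ray or CT dose tool, and writing doses and risks back, can be sketched as follows. The dose formulas, field names, and risk coefficient here are all hypothetical placeholders; the real system calls the PCXMC executable and replicates the ImPACT spreadsheet calculation.

```python
# Hypothetical sketch of the exam-batch loop described above.
# All dose formulas and the risk coefficient are invented stubs.

def xray_organ_dose(kvp, mas):
    # Stub standing in for a PCXMC-style X-ray dose result (mGy).
    return 0.001 * kvp * mas

def ct_organ_dose(ctdi_vol, length_cm):
    # Stub standing in for an ImPACT-style CT dose result (mGy).
    return 0.02 * ctdi_vol * length_cm

def process_exams(exams, risk_per_mGy=1e-5):
    # Dispatch each exam by modality and accumulate dose and risk,
    # as the MATLAB code does against the NASA database.
    report = []
    for exam in exams:
        if exam["modality"] == "XRAY":
            dose = xray_organ_dose(exam["kVp"], exam["mAs"])
        else:  # CT
            dose = ct_organ_dose(exam["CTDIvol"], exam["length_cm"])
        report.append({"id": exam["id"], "dose_mGy": dose,
                       "risk": dose * risk_per_mGy})
    return report

exams = [{"id": 1, "modality": "XRAY", "kVp": 100, "mAs": 5},
         {"id": 2, "modality": "CT", "CTDIvol": 10, "length_cm": 20}]
report = process_exams(exams)
```

The design benefit the abstract claims, traceability, comes from the same loop: every exam record, its computed dose, and its risk are written back to one database rather than living in per-exam spreadsheets.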

  15. Automated ISS Flight Utilities

    NASA Technical Reports Server (NTRS)

    Offermann, Jan Tuzlic

    2016-01-01

    During my internship at NASA Johnson Space Center, I worked in the Space Radiation Analysis Group (SRAG), where I was tasked with a number of projects focused on the automation of tasks and activities related to the operation of the International Space Station (ISS). As I worked on a number of projects, I have written short sections below to give a description for each, followed by more general remarks on the internship experience. My first project is titled "General Exposure Representation EVADOSE", also known as "GEnEVADOSE". This project involved the design and development of a C++/ROOT framework focused on radiation exposure for extravehicular activity (EVA) planning for the ISS. The utility helps mission managers plan EVAs by displaying information on the cumulative radiation doses that crew will receive during an EVA as a function of the egress time and duration of the activity. SRAG uses a utility called EVADOSE, employing a model of the space radiation environment in low Earth orbit to predict these doses, since astronauts outside the ISS have less shielding from charged particles such as electrons and protons. However, EVADOSE output is cumbersome to work with, and prior to GEnEVADOSE, querying data and producing graphs of ISS trajectories and cumulative doses versus egress time required manual work in Microsoft Excel. GEnEVADOSE automates all this work, reading in EVADOSE output file(s) along with a user-provided plaintext file of input parameters. GEnEVADOSE will output a text file containing all the necessary dosimetry for each proposed EVA egress time, for each specified EVADOSE file. It also plots cumulative dose versus egress time and the ISS trajectory, and displays all of this information in an auto-generated presentation made in LaTeX.
New features have also been added, such as best-case scenarios (egress times corresponding to the least dose), interpolated curves for trajectories, and the ability to query any time in the EVADOSE output. As mentioned above, GEnEVADOSE makes extensive use of ROOT version 6, the data analysis framework developed at the European Organization for Nuclear Research (CERN), and the code is written to the C++11 standard (as are the other projects). My second project is the Automated Mission Reference Exposure Utility (AMREU). Unlike GEnEVADOSE, AMREU is a combination of three frameworks written in both Python and C++, also making use of ROOT (and PyROOT). Run as a combination of daily and weekly cron jobs, these macros query the SRAG database system to determine the active ISS missions, and query minute-by-minute radiation dose information from ISS-TEPC (Tissue Equivalent Proportional Counter), one of the radiation detectors onboard the ISS. Using this information, AMREU creates a corrected data set of daily radiation doses, addressing situations where TEPC may be offline or locked up by correcting doses for days with less than 95% live time (the total amount of time the instrument acquires data) by averaging the doses from the past 7 days. As not all errors may be automatically detectable, AMREU also allows for manual corrections, checking an updated plaintext file each time it runs. With the corrected data, AMREU generates cumulative dose plots for each mission, and uses a Python script to generate a flight note file (.docx format) containing these plots, as well as information sections to be filled in and modified by the space weather environment officers with information specific to the week. AMREU is set up to run without requiring any user input, and it automatically archives old flight notes and information files for missions that are no longer active.
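The AMREU live-time correction described above, replacing the dose for any day with under 95% live time by the average of the preceding 7 daily doses, can be sketched directly. The daily values below are invented, and the function name is illustrative, not AMREU's actual code:

```python
# Sketch of a live-time correction pass over (dose, live_fraction) pairs.
# A day whose detector live time is below the threshold gets its dose
# replaced by the mean of up to 7 previously corrected daily doses.

def correct_daily_doses(days, live_time_threshold=0.95, window=7):
    corrected = []
    for dose, live_fraction in days:
        if live_fraction < live_time_threshold and corrected:
            recent = corrected[-window:]
            dose = sum(recent) / len(recent)
        corrected.append(dose)
    return corrected

# Seven clean days at 0.30 mGy, then a day where TEPC was mostly offline.
days = [(0.30, 1.0)] * 7 + [(0.05, 0.50)]
out = correct_daily_doses(days)
```

A pass like this runs before plotting, so the cumulative dose curves in the flight notes are not distorted by detector outages; the manual-correction file mentioned above would override individual days afterward.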
My other projects involve cleaning up a large data set from the Charged Particle Directional Spectrometer (CPDS), joining together many different data sets in order to clean up information in SRAG SQL databases, and developing other automated utilities for displaying information on active solar regions that may be used by the space weather environment officers to monitor solar activity. I consulted my mentor Dr. Ryan Rios and Dr. Kerry Lee for project requirements and added features, and ROOT developer Edmond Offermann for advice on using the ROOT library. I also received advice and feedback from Dr. Janet Barzilla of SRAG, who tested my code. Besides these inputs, I worked independently, writing all of the code by myself. The code for all these projects is documented throughout, and I have attempted to write it in a modular format. Assuming that ROOT is updated accordingly, these codes are also Y2038-compliant (and Y10K-compliant). This allows the code to be easily referenced, modified and possibly repurposed for non-ISS missions in the future, should the necessary inputs exist. These projects have taught me a lot about coding and software design - I have become a much more skilled C++ programmer and ROOT user, and I also learned to code in Python and PyROOT (and its advantages and disadvantages compared to C++/ROOT). Furthermore, I have learned about space radiation and radiation modeling, topics that greatly interest me as I pursue a degree in physics. Working alongside experimental physicists like Dr. Rios, I have developed a greater understanding and appreciation for experimental science, something I have always leaned towards but to which I lacked significant exposure. My work in SRAG has also given me the invaluable opportunity to witness the work environment for physicists at NASA, and what a career in academia may look like at a government laboratory such as NASA Johnson Space Center.
As I continue my studies and look forward to graduate school and a future career, this experience at NASA has given me a meaningful and enjoyable opportunity to put my skills to use and see what my future career path might hold.

  16. SU-G-JeP3-06: Lower KV Image Dose Are Expected From a Limited-Angle Intra-Fractional Verification (LIVE) System for SBRT Treatments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, G; Yin, F; Ren, L

    Purpose: To track tumor movement for patient positioning verification during arc treatment delivery, or in between 3D/IMRT beams, for stereotactic body radiation therapy (SBRT), the acquisition of limited-angle kV projections simultaneously with arc delivery or in between static treatment beams (as the gantry moves to the next beam angle) has been proposed. The purpose of this study is to estimate the additional imaging dose resulting from multiple tomosynthesis acquisitions in between static treatment beams and to compare it with that of a conventional kV-CBCT acquisition. Methods: The kV imaging system integrated into Varian TrueBeam accelerators was modeled using the EGSnrc Monte Carlo user code BEAMnrc, and the DOSXYZnrc code was used for the dose calculations. The simulated realistic kV beams from the Varian TrueBeam OBI 1.5 system were used to calculate dose to the patient based on CT images. Organ doses were analyzed using DVHs. The imaging dose to the patient resulting from realistic multiple tomosynthesis acquisitions, each with a 25–30 degree kV source rotation, between 6 treatment beam gantry angles was studied. Results: For a typical lung SBRT treatment delivery, much lower kV imaging doses (20–50%) were observed from the sum of six realistic tomosynthesis acquisitions, each with a 25–30 degree x-ray source rotation between six treatment beam gantry angles, compared to that from a single CBCT image acquisition. Conclusion: This work indicates that the kV imaging in the proposed Limited-angle Intra-fractional Verification (LIVE) system for SBRT treatments adds a negligible imaging dose. It is worth noting that the MV imaging dose caused by MV projection acquisition in between static beams in LIVE can be minimized by restricting the imaging to the target region and reducing the number of projections acquired.
For arc treatments, MV imaging acquisition in LIVE does not add additional imaging dose, as the MV images are acquired directly from the treatment beams during the treatment.

  17. Determination of the spatial resolution required for the HEDR dose code. Hanford Environmental Dose Reconstruction Project: Dose code recovery activities, Calculation 007

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Napier, B.A.; Simpson, J.C.

    1992-12-01

    A series of scoping calculations has been undertaken to evaluate the doses that may have been received by individuals living in the vicinity of the Hanford site. This scoping calculation (Calculation 007) examined the spatial distribution of potential doses resulting from releases in the year 1945. This study builds on the work initiated in the first scoping calculation, of iodine in cow's milk; the third scoping calculation, which added additional pathways; the fifth calculation, which addressed the uncertainty of the dose estimates at a point; and the sixth calculation, which extrapolated the doses throughout the atmospheric transport domain. A projection of dose to representative individuals throughout the proposed HEDR atmospheric transport domain was prepared on the basis of the HEDR source term. Addressed in this calculation were the contributions to iodine-131 thyroid dose of infants from (1) air submersion and groundshine external dose, (2) inhalation, (3) ingestion of soil by humans, (4) ingestion of leafy vegetables, (5) ingestion of other vegetables and fruits, (6) ingestion of meat, (7) ingestion of eggs, and (8) ingestion of cows' milk from Feeding Regime 1 as described in scoping calculation 001.

  18. Total-dose radiation effects data for semiconductor devices, volume 1. [radiation resistance of components for the Galileo Project]

    NASA Technical Reports Server (NTRS)

    Price, W. E.; Martin, K. E.; Nichols, D. K.; Gauthier, M. K.; Brown, S. F.

    1981-01-01

    Steady-state, total-dose radiation test data are provided in graphic format, for use by electronic designers and other personnel using semiconductor devices in a radiation environment. Data are presented by JPL for various NASA space programs on diodes, bipolar transistors, field effect transistors, silicon-controlled rectifiers, and optical devices. A vendor identification code list is included along with semiconductor device electrical parameter symbols and abbreviations.

  19. A system to track skin dose for neuro-interventional cone-beam computed tomography (CBCT)

    NASA Astrophysics Data System (ADS)

    Vijayan, Sarath; Xiong, Zhenyu; Rudin, Stephen; Bednarek, Daniel R.

    2016-03-01

    The skin-dose tracking system (DTS) provides a color-coded illustration of the cumulative skin-dose distribution on a closely-matching 3D graphic of the patient during fluoroscopic interventions in real time, for immediate feedback to the interventionist. The skin-dose tracking utility of the DTS has been extended to include cone-beam computed tomography (CBCT) for neuro-interventions. While the DTS was developed to track the entrance skin dose including backscatter, a significant part of the dose in CBCT is contributed by exit primary radiation and scatter due to the many overlapping projections during the rotational scan. The variation of backscatter inside and outside the collimated beam was measured with radiochromic film, and a curve was fit to obtain a scatter spread function that could be applied in the DTS. Likewise, the exit dose distribution was measured with radiochromic film for a single projection, and a correction factor was determined as a function of path length through the head. Both of these sources of skin dose are added for every projection in the CBCT scan to obtain a total dose mapping over the patient graphic. Results show the backscatter to follow a sigmoidal falloff near the edge of the beam, extending outside the beam as far as 8 cm. The exit dose measured for a cylindrical CTDI phantom was nearly 10% of the entrance peak skin dose for the central ray. The dose mapping performed by the DTS for a CBCT scan was compared to that measured with radiochromic film and a CTDI head phantom, with good agreement.
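The sigmoidal falloff of backscatter outside the field edge could be modeled roughly as below; the logistic form, width, and clamp distance are illustrative assumptions, not the curve actually fitted to the film measurements:

```python
import math

def backscatter_fraction(d_cm, width_cm=1.5, tail_cm=8.0):
    """Relative backscatter dose versus distance d_cm from the collimated
    field edge (negative values are inside the beam). A logistic (sigmoidal)
    falloff clamped to zero beyond tail_cm; width_cm and tail_cm are
    illustrative parameters only."""
    if d_cm >= tail_cm:
        return 0.0
    return 1.0 / (1.0 + math.exp(4.0 * d_cm / width_cm))
```

Deep inside the beam the fraction approaches 1, it is 0.5 at the field edge, and it falls smoothly toward zero out to the clamp distance.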

  20. Six steps to a successful dose-reduction strategy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, M.

    1995-03-01

    The increased importance of demonstrating achievement of the ALARA principle has helped produce a proliferation of dose-reduction ideas. Across a company there may be many dose-reduction items being pursued in a variety of areas. However, companies have a limited amount of resources; therefore, ensuring that funding is directed to the items which will produce the most benefit, and that all areas apply a common policy, requires a dose-reduction strategy. Six steps were identified in formulating the dose-reduction strategy for Rolls-Royce and Associates (RRA): (1) collating the ideas; (2) quantitatively evaluating them on a common basis; (3) prioritizing the ideas in terms of cost benefit; (4) implementing the highest-priority items; (5) monitoring their success; and (6) periodically reviewing the strategy. Central to producing the dose-reduction strategy have been a comprehensive dose database and the RRA-developed dose management computer code DOMAIN, which allows prediction of dose rates and doses. The database enabled high task-dose items to be identified, assisted in evaluating dose benefits, and monitored dose trends once items had been implemented. The DOMAIN code was used to quantify some of the project dose benefits, and its results, such as dose contours, were used in some of the dose-reduction items themselves. In all, over fifty dose-reduction items were evaluated in the strategy process, and the items which will give the greatest benefit are being implemented. The strategy has been successful in giving renewed impetus and direction to dose-reduction management.
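Steps (2) and (3), evaluating ideas on a common basis and prioritizing them by cost benefit, might look like this in outline; the item names and figures are invented for illustration:

```python
def prioritize(items):
    """Rank dose-reduction ideas by collective dose saved per unit cost,
    highest benefit first (steps 2-3 of the strategy)."""
    return sorted(items, key=lambda it: it["dose_saved"] / it["cost"], reverse=True)

# Invented example items: dose saved in person-mSv, cost in arbitrary units.
ideas = [
    {"name": "extra shielding", "dose_saved": 40.0, "cost": 100.0},
    {"name": "remote tooling",  "dose_saved": 90.0, "cost": 120.0},
    {"name": "task rehearsal",  "dose_saved": 15.0, "cost": 10.0},
]
ranked = prioritize(ideas)
```

Ranking by the ratio rather than by raw dose saved is what lets a cheap procedural change outrank an expensive engineering fix.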

  1. Lens of the eye dose calculation for neuro-interventional procedures and CBCT scans of the head

    NASA Astrophysics Data System (ADS)

    Xiong, Zhenyu; Vijayan, Sarath; Rana, Vijay; Jain, Amit; Rudin, Stephen; Bednarek, Daniel R.

    2016-03-01

    The aim of this work is to develop a method to calculate the lens dose for fluoroscopically-guided neuro-interventional procedures and for CBCT scans of the head. EGSnrc Monte Carlo software is used to determine the dose to the lens of the eye for the projection geometry and exposure parameters used in these procedures. This information is provided by a digital CAN bus on the Toshiba Infinix C-arm system and is saved in a log file by the real-time skin-dose tracking system (DTS) we previously developed. The x-ray beam spectra on this machine were simulated using BEAMnrc. These spectra were compared to those determined by SpekCalc and validated through measured percent-depth-dose (PDD) curves and half-value-layer (HVL) measurements. We simulated CBCT procedures in DOSXYZnrc for a CTDI head phantom and compared the surface dose distribution with that measured with Gafchromic film, and also for an SK150 head phantom and compared the lens dose with that measured with an ionization chamber. Both methods demonstrated good agreement. Organ doses calculated for a simulated neuro-interventional procedure using DOSXYZnrc with the Zubal CT voxel phantom agreed within 10% with those calculated by the PCXMC code for most organs. To calculate the lens dose in a neuro-interventional procedure, we developed a library of normalized lens dose values for different projection angles and kVp settings. The total lens dose is then calculated by summing the values over all beam projections and can be included in the DTS report at the end of the procedure.
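The summation over projections described in the last sentences can be sketched as follows; the dictionary keys and per-exposure normalization are hypothetical stand-ins for the angle- and kVp-dependent values the authors tabulated:

```python
def total_lens_dose(projections, library):
    """Sum the lens dose over all beam projections of a procedure.
    `library` maps (projection_angle_deg, kvp) to a normalized lens dose
    (dose per unit exposure -- a hypothetical normalization); each
    projection record carries its own exposure value."""
    return sum(library[(p["angle"], p["kvp"])] * p["exposure"]
               for p in projections)
```

A real implementation would interpolate between tabulated angles and kVp settings rather than require exact key matches.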

  2. NASA Space Radiation Program Integrative Risk Model Toolkit

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Hu, Shaowen; Plante, Ianik; Ponomarev, Artem L.; Sandridge, Chris

    2015-01-01

    NASA Space Radiation Program Element scientists have been actively involved in the development of an integrative risk model toolkit that includes models for acute radiation risk and organ dose projection (ARRBOD), NASA space radiation cancer risk projection (NSCR), hemocyte dose estimation (HemoDose), the GCR event-based risk model code (GERMcode), relativistic ion tracks (RITRACKS), NASA radiation track image (NASARTI), and the On-Line Tool for the Assessment of Radiation in Space (OLTARIS). This session will introduce the components of the risk toolkit, with opportunities for hands-on demonstrations. Brief descriptions of each tool follow: ARRBOD, for organ dose projection and acute radiation risk calculation from exposure to solar particle events; NSCR, for projection of cancer risk from exposure to space radiation; HemoDose, for retrospective dose estimation using multi-type blood cell counts; GERMcode, for basic physical and biophysical properties of an ion beam, and biophysical and radiobiological properties of beam transport to the target in the NASA Space Radiation Laboratory beam line; RITRACKS, for simulation of heavy-ion and delta-ray track structure, radiation chemistry, DNA structure, and DNA damage at the molecular scale; NASARTI, for modeling the effects of space radiation on human cells and tissue by incorporating a physical model of tracks, cell nucleus, and DNA damage foci with image segmentation for automated counting; and OLTARIS, an integrated tool set utilizing HZETRN (High Charge and Energy Transport) intended to help scientists and engineers study the effects of space radiation on shielding materials, electronics, and biological systems.

  3. Recalculation with SEACAB of the activation by spent fuel neutrons and residual dose originated in the racks replaced at Cofrentes NPP

    NASA Astrophysics Data System (ADS)

    Ortego, Pedro; Rodriguez, Alain; Töre, Candan; Compadre, José Luis de Diego; Quesada, Baltasar Rodriguez; Moreno, Raul Orive

    2017-09-01

    In order to increase the storage capacity of the East Spent Fuel Pool at the Cofrentes NPP, located in the province of Valencia, Spain, the existing stainless steel storage racks were replaced by a new design of compact borated stainless steel racks, allowing a 65% increase in fuel storage capacity. Calculation of the activation of the used racks was successfully performed with the MCNP4B code. Additionally, the dose rate in contact with a row of racks in standing position and behind a wall of shielding material was calculated, also using the MCNP4B code. These results allowed a preliminary definition of the bunker required for the storage of the racks. Recently, the activity in the racks has been recalculated with the SEACAB system, which combines the mesh tally of the MCNP codes with the activation code ACAB, applying the rigorous two-step (R2S) method developed in-house, benchmarked against FNG irradiation experiments and usually applied in fusion calculations for the ITER project.

  4. Early Results from the Advanced Radiation Protection Thick GCR Shielding Project

    NASA Technical Reports Server (NTRS)

    Norman, Ryan B.; Clowdsley, Martha; Slaba, Tony; Heilbronn, Lawrence; Zeitlin, Cary; Kenny, Sean; Crespo, Luis; Giesy, Daniel; Warner, James; McGirl, Natalie; et al.

    2017-01-01

    The Advanced Radiation Protection Thick Galactic Cosmic Ray (GCR) Shielding Project leverages experimental and modeling approaches to validate a predicted minimum in the radiation exposure versus shielding depth curve. Preliminary results of space radiation models indicate that a minimum in the dose equivalent versus aluminum shielding thickness may exist in the 20-30 g/cm2 region. For greater shield thickness, dose equivalent increases due to secondary neutron and light particle production. This result goes against the long held belief in the space radiation shielding community that increasing shielding thickness will decrease risk to crew health. A comprehensive modeling effort was undertaken to verify the preliminary modeling results using multiple Monte Carlo and deterministic space radiation transport codes. These results verified the preliminary findings of a minimum and helped drive the design of the experimental component of the project. In first-of-their-kind experiments performed at the NASA Space Radiation Laboratory, neutrons and light ions were measured between large thicknesses of aluminum shielding. Both an upstream and a downstream shield were incorporated into the experiment to represent the radiation environment inside a spacecraft. These measurements are used to validate the Monte Carlo codes and derive uncertainty distributions for exposure estimates behind thick shielding similar to that provided by spacecraft on a Mars mission. Preliminary results for all aspects of the project will be presented.
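Locating the predicted minimum in tabulated dose-equivalent-versus-depth output from a transport code is a simple search; the curve values below are invented for illustration, not model results:

```python
def find_dose_minimum(depths, doses):
    """Return (depth, dose) at the minimum of a tabulated dose-equivalent
    versus shielding-depth curve produced by a transport code."""
    i = min(range(len(doses)), key=doses.__getitem__)
    return depths[i], doses[i]

# Invented curve with a shallow minimum in the 20-30 g/cm^2 region:
depths = [0, 10, 20, 30, 40, 50]                # aluminum depth, g/cm^2
doses = [1.00, 0.90, 0.85, 0.84, 0.86, 0.89]    # dose equivalent, arbitrary units
```

The upturn past the minimum corresponds to the secondary neutron and light-particle production described in the abstract.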

  5. SU-E-I-37: Eye Lens Dose Reduction From CT Scan Using Organ Based Tube Current Modulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, H; Rensselaer Polytechnic Inst., Troy, NY; Liu, T

    Purpose: To investigate eye lens dose reduction in CT scans with organ-based tube current modulation (OBTCM) using the GPU Monte Carlo code ARCHER-CT. Methods: 36 x-ray sources and bowtie filters were placed around the patient's head at projection angle intervals of 10° for one rotation of the CT scan, and each projection was simulated separately. Voxel eye models with high resolution (0.1 mm × 0.1 mm × 0.1 mm) were used in the simulation, and tube voltages of 80 kVp, 100 kVp, 120 kVp, and 140 kVp were considered. Results: The radiation dose to the eye lens increased as the tube voltage was raised from 80 kVp to 140 kVp, and the doses from the 0° (AP) direction were much higher than those from the 180° (PA) direction for all four tube voltages investigated. This 360° projection dose characteristic enables organ-based TCM, which can reduce the eye lens dose by more than 55%. Conclusion: As the eye lens is a superficial tissue, its radiation dose from external exposure such as CT is direction sensitive, and this characteristic makes organ-based TCM an effective way to reduce the eye lens dose; wider clinical use of this technique is recommended. National Nature Science Foundation of China (No. 11475047)

  6. An accurate model for the computation of the dose of protons in water.

    PubMed

    Embriaco, A; Bellinzona, V E; Fontana, A; Rotondi, A

    2017-06-01

    The accurate and fast calculation of the dose in proton radiation therapy is an essential ingredient for successful treatments. We propose a novel approach with a minimal number of parameters. The approach is based on the exact calculation of the electromagnetic part of the interaction, namely the Molière theory of multiple Coulomb scattering for the transverse 1D projection and the Bethe-Bloch formula for the longitudinal stopping-power profile, including Gaussian energy straggling. To this electromagnetic contribution the nuclear proton-nucleus interaction is added with a simple two-parameter model. Then, the non-Gaussian lateral profile is used to calculate the radial dose distribution with a method that assumes cylindrical symmetry of the distribution. The results, obtained with a fast C++ based computational code called MONET (MOdel of ioN dosE for Therapy), are in very good agreement with the FLUKA MC code, within a few percent in the worst case. This study provides a new tool for fast dose calculation or verification, possibly for clinical use.

  7. Modeling Acute Health Effects of Astronauts from Exposure to Large Solar Particle Events

    NASA Technical Reports Server (NTRS)

    Hu, Shaowen; Kim, Myung-Hee Y.; Cucinotta, Francis A.

    2011-01-01

    In space exploration outside the Earth's geomagnetic field, radiation exposure from solar particle events (SPEs) presents a health concern for astronauts that could impair their performance and result in possible failure of the mission. Acute risks are of special concern during extra-vehicular activities because of the rapid onset of SPEs. However, most SPEs will not lead to acute risks, but they can lead to mission disruption if accurate projection methods are not available. Acute Radiation Sickness (ARS) is a group of clinical syndromes developing acutely (within several seconds to 3 days) after high-dose whole-body or significant partial-body ionizing radiation exposures. The manifestation of these syndromes reflects the disturbance of physiological processes of the various cellular groups damaged by radiation. Hematopoietic cells, skin, epithelium, intestine, and vascular endothelium are among the tissues of the human body most sensitive to ionizing radiation. Most ARS symptoms are directly related to these tissues and to other systems (nervous, endocrine, cardiovascular, etc.) with coupled regulation. Here we report progress in bio-mathematical models describing the dose- and time-dependent early human responses to ionizing radiation. The responses include lymphocyte depression, granulocyte modulation, fatigue and weakness syndrome, and upper gastrointestinal distress. The modest doses and dose rates of SPEs are predicted to lead to large sparing of ARS; however, detailed experimental data on a range of proton dose rates for organ doses from 0.5 to 2 Gy are needed to validate the models. We also report on the ARRBOD code, which integrates the BRYNTRN and SUMDOSE codes, used to estimate SPE organ doses for astronauts under various space travel scenarios, with our models of ARS. A more recent effort is to provide easy web access to space radiation risk assessment using the ARRBOD code.

  8. Space Radiation Cancer Risk Projections and Uncertainties - 2010

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Kim, Myung-Hee Y.; Chappell, Lori J.

    2011-01-01

    Uncertainties in estimating health risks from galactic cosmic rays greatly limit space mission lengths and potential risk mitigation evaluations. NASA limits astronaut exposures to a 3% risk of exposure-induced death and protects against uncertainties using an assessment of 95% confidence intervals in the projection model. Revisions to this model for lifetime cancer risks from space radiation and new estimates of model uncertainties are described here. We review models of space environments and transport code predictions of organ exposures, and characterize uncertainties in these descriptions. We summarize recent analysis of low linear energy transfer radio-epidemiology data, including revision to Japanese A-bomb survivor dosimetry, longer follow-up of exposed cohorts, and reassessments of dose and dose-rate reduction effectiveness factors. We compare these projections and uncertainties with earlier estimates. Current understanding of radiation quality effects and recent data on factors of relative biological effectiveness and particle track structure are reviewed. Recent radiobiology experiment results provide new information on solid cancer and leukemia risks from heavy ions. We also consider deviations from the paradigm of linearity at low doses of heavy ions motivated by non-targeted effects models. New findings and knowledge are used to revise the NASA risk projection model for space radiation cancer risks.

  9. Monte Carlo determination of the conversion coefficients Hp(3)/Ka in a right cylinder phantom with 'PENELOPE' code. Comparison with 'MCNP' simulations.

    PubMed

    Daures, J; Gouriou, J; Bordy, J M

    2011-03-01

    This work has been performed within the frame of the European Union ORAMED project (Optimisation of RAdiation protection for MEDical staff). The main goal of the project is to improve standards of protection for medical staff in procedures resulting in potentially high exposures, and to develop methodologies for better assessing and reducing exposures to medical staff. Work Package WP2 is involved in the development of practical eye-lens dosimetry in interventional radiology. This study is complementary to the part of the ENEA report concerning the calculation, with the MCNP-4C code, of the conversion factors related to the operational quantity H(p)(3). In this study, a set of energy- and angular-dependent conversion coefficients (H(p)(3)/K(a)) in the newly proposed right-cylinder phantom made of ICRU tissue has been calculated with the Monte Carlo codes PENELOPE and MCNP5. The H(p)(3) values have been determined in terms of absorbed dose, according to the definition of this quantity, and also with the kerma approximation as formerly reported in ICRU reports. At low photon energies (up to 1 MeV), the results obtained with the two methods are consistent. Nevertheless, large differences appear at higher energies. This is mainly due to the lack of electronic equilibrium, especially for small-angle incidences. The values of the conversion coefficients obtained with the MCNP-4C code published by ENEA agree well with the kerma-approximation calculations obtained with PENELOPE. We also performed the same calculations with the MCNP5 code with two types of tallies: F6 for the kerma approximation and *F8 for estimating the absorbed dose, which is, as is known, deposited by secondary electrons. The PENELOPE and MCNP5 results agree for both the kerma approximation and the absorbed-dose calculation of H(p)(3), and prove that, for photon energies larger than 1 MeV, the transport of the secondary electrons has to be taken into account.

  10. Estimation of 1945 to 1957 food consumption. Hanford Environmental Dose Reconstruction Project: Draft

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, D.M.; Bates, D.J.; Marsh, T.L.

    This report details the methods used and the results of the study on the estimated historic levels of food consumption by individuals in the Hanford Environmental Dose Reconstruction (HEDR) study area from 1945–1957. This period includes the time of highest releases from Hanford and is the period for which data are being collected in the Hanford Thyroid Disease Study. These estimates provide the food-consumption inputs for the HEDR database of individual diets. This database will be an input file in the Hanford Environmental Dose Reconstruction Integrated Code (HEDRIC) computer model that will be used to calculate the radiation dose. The report focuses on fresh milk, eggs, lettuce, and spinach. These foods were chosen because they have been found to be significant contributors to radiation dose based on the Technical Steering Panel dose decision level.

  12. Proton Dose Assessment to the Human Eye Using Monte Carlo N-Particle Transport Code (MCNPX)

    DTIC Science & Technology

    2006-08-01


  13. Data and methods to estimate fetal dose from fluoroscopically guided prophylactic hypogastric artery balloon occlusion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Solomou, G.; Stratakis, J.; Perisinakis, K.

    Purpose: To provide data for the estimation of fetal radiation dose (D(F)) from prophylactic hypogastric artery balloon occlusion (HABO) procedures. Methods: The Monte Carlo N-Particle (MCNP) transport code and mathematical phantoms representing a pregnant patient in the ninth month of gestation were employed. PA, RAO 20°, and LAO 20° fluoroscopy projections of the left and right internal iliac arteries were simulated. Projection-specific normalized fetal dose (NFD) data were produced for various beam qualities. The effects of projection angle, x-ray field location relative to the fetus, field size, maternal body size, and fetal size on NFD were investigated. The presented NFD values were compared to corresponding values derived using a physical anthropomorphic phantom simulating pregnancy at the third trimester and thermoluminescent dosimeters. Results: NFD did not vary considerably when the projection angle was altered by ±5°, whereas it was found to depend markedly on tube voltage, filtration, x-ray field location and size, and maternal body size. Differences in NFD of <7.5% were observed for naturally expected variations in fetal size. A difference of less than 13.5% was observed between NFD values estimated by MCNP and direct measurements. Conclusions: The data and methods provided allow for reliable estimation of the radiation burden to the fetus from HABO.

  14. RADTRAD: A simplified model for RADionuclide Transport and Removal And Dose estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humphreys, S.L.; Miller, L.A.; Monroe, D.K.

    1998-04-01

    This report documents the RADTRAD computer code developed for the U.S. Nuclear Regulatory Commission (NRC) Office of Nuclear Reactor Regulation (NRR) to estimate the transport and removal of radionuclides and the dose at selected receptors. The document includes a users' guide to the code, a description of the technical basis for the code, the quality assurance and code acceptance testing documentation, and a programmers' guide. The RADTRAD code can be used to estimate the containment release using either the NRC TID-14844 or NUREG-1465 source terms and assumptions, or a user-specified table. In addition, the code can account for a reduction in the quantity of radioactive material due to containment sprays, natural deposition, filters, and other natural and engineered safety features. The RADTRAD code uses a combination of tables and/or numerical models of source term reduction phenomena to determine the time-dependent dose at user-specified locations for a given accident scenario. The code system also provides the inventory, decay chain, and dose conversion factor tables needed for the dose calculation. The RADTRAD code can be used to assess occupational radiation exposures, typically in the control room; to estimate site boundary doses; and to estimate dose attenuation due to modification of a facility or accident sequence.
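The removal mechanisms listed above (sprays, deposition, filtration, decay) are first-order processes acting in parallel, so a toy one-compartment sketch simply sums their rate constants; this illustrates the principle only and is not the RADTRAD model itself:

```python
import math

def airborne_activity(a0, t_h, removal_rates_per_h):
    """Activity still airborne in a single compartment after t_h hours,
    with first-order removal processes (decay, sprays, deposition,
    filtration) acting in parallel, so their rate constants add."""
    return a0 * math.exp(-sum(removal_rates_per_h) * t_h)
```

With a single rate constant equal to ln 2 per hour, half the initial activity remains after one hour, as expected for a one-hour half-life.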

  15. Determination of dose distributions and parameter sensitivity. Hanford Environmental Dose Reconstruction Project; dose code recovery activities; Calculation 005

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Napier, B.A.; Farris, W.T.; Simpson, J.C.

    1992-12-01

    A series of scoping calculations has been undertaken to evaluate the absolute and relative contribution of different radionuclides and exposure pathways to doses that may have been received by individuals living in the vicinity of the Hanford site. This scoping calculation (Calculation 005) examined the contributions of numerous parameters to the uncertainty distribution of doses calculated for environmental exposures and accumulation in foods. This study builds on the work initiated in the first scoping study of iodine in cow`s milk and the third scoping study, which added additional pathways. Addressed in this calculation were the contributions to thyroid dose ofmore » infants from (1) air submersion and groundshine external dose, (2) inhalation, (3) ingestion of soil by humans, (4) ingestion of leafy vegetables, (5) ingestion of other vegetables and fruits, (6) ingestion of meat, (7) ingestion of eggs, and (8) ingestion of cows` milk from Feeding Regime 1 as described in Calculation 001.« less

  16. INDOS: conversational computer codes to implement ICRP-10-10A models for estimation of internal radiation dose to man

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Killough, G.G.; Rohwer, P.S.

    1974-03-01

    INDOS1, INDOS2, and INDOS3 (the INDOS codes) are conversational FORTRAN IV programs, implemented for use in time-sharing mode on the ORNL PDP-10 system. These codes use ICRP-10-10A models to estimate the radiation dose to an organ of the body of Reference Man resulting from the ingestion or inhalation of any one of various radionuclides. Two patterns of intake are simulated: intakes at discrete times, and continuous intake at a constant rate. The INDOS codes provide tabular output of dose rate and dose vs. time, graphical output of dose vs. time, and punched-card output of organ burden and dose vs. time. The models of internal dose calculation are discussed and instructions for the use of the INDOS codes are provided. The INDOS codes are available from the Radiation Shielding Information Center, Oak Ridge National Laboratory, P.O. Box X, Oak Ridge, Tennessee 37830.
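The continuous constant-rate intake pattern can be illustrated with a single-exponential retention function; this is a simplification for illustration, not the full ICRP-10-10A formulation the INDOS codes implement:

```python
import math

def organ_burden_continuous(intake_rate, f_organ, half_life_d, t_d):
    """Organ burden at time t_d (days) under continuous intake at a
    constant rate, with fraction f_organ reaching the organ and a
    single-exponential retention of the given effective half-life:
    burden(t) = I * f * (1 - exp(-lambda t)) / lambda."""
    lam = math.log(2.0) / half_life_d
    return intake_rate * f_organ * (1.0 - math.exp(-lam * t_d)) / lam
```

The burden rises toward the equilibrium value intake_rate * f_organ / lambda, which it approaches once the intake has continued for many half-lives.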

  17. TEPC measurements in commercial aircraft.

    PubMed

    Taylor, G C; Bentley, R D; Horwood, N A; Hunter, R; Iles, R H; Jones, J B L; Powell, D; Thomas, D J

    2004-01-01

    The collaborative project involving the Mullard Space Science Laboratory (MSSL), Virgin Atlantic Airways (VAA), the UK Civil Aviation Authority (CAA) and the UK National Physical Laboratory (NPL) has been performing tissue-equivalent proportional counter measurements of cosmic ray doses in commercial aircraft since January 2000. In that time data have been recorded on over 700 flights, including over 150 flights with Air New Zealand (ANZ). This substantial set of data from the southern hemisphere is an ideal complement to the London-based measurements performed primarily on VAA flights. Although some ANZ data remains to be analysed, dose information from 111 flights has been compared with the CARI and EPCARD computer codes. Overall, the agreement between the measurements and EPCARD was excellent (within 1% for the total ambient dose equivalent), and the difference in the total effective doses predicted by EPCARD and CARI was <5%.

  18. Epp: A C++ EGSnrc user code for x-ray imaging and scattering simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lippuner, Jonas; Elbakri, Idris A.; Cui Congwu

    2011-03-15

    Purpose: Easy particle propagation (Epp) is a user code for the EGSnrc code package based on the C++ class library egspp. A main feature of egspp (and Epp) is the ability to use analytical objects to construct simulation geometries. The authors developed Epp to facilitate the simulation of x-ray imaging geometries, especially in the case of scatter studies. While direct use of egspp requires knowledge of C++, Epp requires no programming experience. Methods: Epp's features include calculation of dose deposited in a voxelized phantom and photon propagation to a user-defined imaging plane. Projection images of primary, single Rayleigh scattered, single Compton scattered, and multiple scattered photons may be generated. Epp input files can be nested, allowing for the construction of complex simulation geometries from more basic components. To demonstrate the imaging features of Epp, the authors simulate 38 keV x rays from a point source propagating through a water cylinder 12 cm in diameter, using both analytical and voxelized representations of the cylinder. The simulation generates projection images of primary and scattered photons at a user-defined imaging plane. The authors also simulate dose scoring in the voxelized version of the phantom in both Epp and DOSXYZnrc and examine the accuracy of Epp using the Kawrakow-Fippel test. Results: The results of the imaging simulations with Epp using voxelized and analytical descriptions of the water cylinder agree within 1%. The results of the Kawrakow-Fippel test suggest good agreement between Epp and DOSXYZnrc. Conclusions: Epp provides the user with useful features, including the ability to build complex geometries from simpler ones and the ability to generate images of scattered and primary photons. There is no inherent computational time saving arising from Epp, except for those arising from egspp's ability to use analytical representations of simulation geometries. Epp agrees with DOSXYZnrc in dose calculation, since they are both based on the well-validated standard EGSnrc radiation transport physics model.

  19. Common Errors in the Calculation of Aircrew Doses from Cosmic Rays

    NASA Astrophysics Data System (ADS)

    O'Brien, Keran; Felsberger, Ernst; Kindl, Peter

    2010-05-01

    Radiation doses to air crew are calculated using flight codes. Flight codes integrate dose rates, calculated by transport codes or obtained from measurements, over the aircraft flight path from takeoff at one airport to landing at another. The dose rates are stored in various ways, such as by latitude and longitude, or in terms of the geomagnetic vertical cutoff. The transport codes are generally quite satisfactory, but the treatment of the boundary conditions is frequently incorrect. Both the treatment of solar modulation and the treatment of the geomagnetic field are often defective, leading to systematic overestimates of crew doses.
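
The integration step a flight code performs can be sketched as a simple trapezoidal integral of stored dose rates over the flight profile; waypoint times and rates here are illustrative inputs, not any particular code's data format.

```python
def route_dose(times_h, dose_rates_usv_per_h):
    """Trapezoidal integration of dose rate over a flight profile.
    times_h: waypoint times in hours; dose_rates_usv_per_h: rate at each waypoint.
    Returns total route dose in microsieverts."""
    total = 0.0
    for i in range(1, len(times_h)):
        dt = times_h[i] - times_h[i - 1]
        total += 0.5 * (dose_rates_usv_per_h[i] + dose_rates_usv_per_h[i - 1]) * dt
    return total
```

The systematic errors the paper describes enter through the stored rates themselves (solar modulation and geomagnetic boundary conditions), not through this bookkeeping step.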

  20. HADOC: a computer code for calculation of external and inhalation doses from acute radionuclide releases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strenge, D.L.; Peloquin, R.A.

    The computer code HADOC (Hanford Acute Dose Calculations) is described and instructions for its use are presented. The code calculates external dose from air submersion and inhalation doses following acute radionuclide releases. Atmospheric dispersion is calculated using the Hanford model, with options to determine maximum conditions. Building wake effects and terrain variation may also be considered. Doses are calculated using dose conversion factors supplied in a data library. Doses are reported for one- and fifty-year dose commitment periods for the maximum individual and the regional population (within 50 miles). The fractional contributions to dose by radionuclide and exposure mode are also printed if requested.
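
For a single nuclide and the inhalation pathway, the calculation chain described (dispersion factor times release, times breathing rate, times a library dose conversion factor) reduces to a product of factors. A minimal sketch, with parameter names and values chosen for illustration rather than taken from HADOC:

```python
def inhalation_dose_sv(release_bq, chi_over_q_s_per_m3,
                       breathing_m3_per_s, dcf_sv_per_bq):
    """Committed inhalation dose for one nuclide:
    time-integrated air concentration (release * chi/Q) times
    breathing rate gives activity inhaled; the dose conversion
    factor converts inhaled activity to committed dose."""
    inhaled_bq = release_bq * chi_over_q_s_per_m3 * breathing_m3_per_s
    return inhaled_bq * dcf_sv_per_bq
```

Summing such products over nuclides and pathways yields the fractional contributions HADOC can print.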

  1. Evaluation Of Shielding Efficacy Of A Ferrite Containing Ceramic Material

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Verst, C.

    2015-10-12

    The shielding evaluation of the ferrite-based Mitsuishi ceramic material produced comparative dose attenuation measurements and simulated projections for several radiation sources and candidate shielding sizes. High-resolution gamma spectroscopy provided uncollided and scattered photon spectra at three energies, confirming theoretical estimates of the ceramic's mass attenuation coefficient, μ/ρ. High-level irradiation experiments were performed using Co-60, Cs-137, and Cf-252 sources to measure penetrating dose rates through steel, lead, concrete, and the provided ceramic slabs. The results were used to validate the radiation transport code MCNP6, which was then used to generate dose rate attenuation curves as a function of shielding material, thickness, and mass for photons and neutrons ranging in energy from 200 keV to 2 MeV.
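
The quantity the spectroscopy confirmed, the mass attenuation coefficient μ/ρ, enters the uncollided photon transmission through a slab via the Beer-Lambert law. A minimal sketch (values in the example are illustrative, not the ceramic's measured coefficients):

```python
import math

def attenuated_fraction(mu_over_rho_cm2_per_g, density_g_per_cm3, thickness_cm):
    """Uncollided transmission through a slab: I/I0 = exp(-(mu/rho) * rho * x).
    Scattered (build-up) photons are not included in this narrow-beam estimate."""
    return math.exp(-mu_over_rho_cm2_per_g * density_g_per_cm3 * thickness_cm)
```

Broad-beam dose rates, which include scatter, are what the MCNP6 model was validated against.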

  2. The MONET code for the evaluation of the dose in hadrontherapy

    NASA Astrophysics Data System (ADS)

    Embriaco, A.

    2018-01-01

    MONET is a code for the computation of the 3D dose distribution of protons in water. For the lateral profile, MONET is based on the Molière theory of multiple Coulomb scattering. To account for nuclear interactions as well, we add to this theory a Cauchy-Lorentz function, whose two parameters are obtained by a fit to a FLUKA simulation. We have implemented the Papoulis algorithm for the passage from the projected distribution to the 2D lateral distribution. For the longitudinal profile, we have implemented a new calculation of the energy loss that is in good agreement with simulations. The inclusion of straggling is based on the convolution of the energy loss with a Gaussian function. To complete the longitudinal profile, the nuclear contributions are also included using a linear parametrization. The total dose profile is calculated on a 3D mesh by evaluating at each depth the 2D lateral distribution and scaling it to the value of the energy deposition. We have compared MONET with FLUKA in two cases: a single Gaussian beam and a lateral scan. In both cases, we have obtained good agreement for different proton energies in water.
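
The lateral model can be sketched as a normalized mixture of a narrow core (multiple Coulomb scattering) and a Cauchy-Lorentz tail (nuclear halo). The sketch below uses a Gaussian as a stand-in for the Molière core and a mixing weight w_nuc as a stand-in for the fitted parameters; it illustrates the core-plus-tail structure, not MONET's actual implementation.

```python
import math

def lateral_profile(x_cm, sigma_cm, gamma_cm, w_nuc):
    """Projected lateral dose profile: (1 - w_nuc) * Gaussian core
    + w_nuc * Cauchy-Lorentz tail. Both components are normalized
    over x, so the mixture integrates to 1."""
    gauss = math.exp(-0.5 * (x_cm / sigma_cm) ** 2) / (sigma_cm * math.sqrt(2.0 * math.pi))
    lorentz = (gamma_cm / math.pi) / (x_cm ** 2 + gamma_cm ** 2)
    return (1.0 - w_nuc) * gauss + w_nuc * lorentz
```

The heavy Lorentz tail is what lets the fit reproduce the low-dose halo that a pure multiple-scattering model misses.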

  3. Experimental benchmarking of a Monte Carlo dose simulation code for pediatric CT

    NASA Astrophysics Data System (ADS)

    Li, Xiang; Samei, Ehsan; Yoshizumi, Terry; Colsher, James G.; Jones, Robert P.; Frush, Donald P.

    2007-03-01

    In recent years, there has been a desire to reduce CT radiation dose to children because of their susceptibility and prolonged risk for cancer induction. Concerns arise, however, as to the impact of dose reduction on image quality and thus potentially on diagnostic accuracy. To study the dose and image quality relationship, we are developing a simulation code to calculate organ dose in pediatric CT patients. To benchmark this code, a cylindrical phantom was built to represent a pediatric torso, which allows measurements of dose distributions from its center to its periphery. Dose distributions for axial CT scans were measured on a 64-slice multidetector CT (MDCT) scanner (GE Healthcare, Chalfont St. Giles, UK). The same measurements were simulated using a Monte Carlo code (PENELOPE, Universitat de Barcelona) with the applicable CT geometry including bowtie filter. The deviations between simulated and measured dose values were generally within 5%. To our knowledge, this work is one of the first attempts to compare measured radial dose distributions on a cylindrical phantom with Monte Carlo simulated results. It provides a simple and effective method for benchmarking organ dose simulation codes and demonstrates the potential of Monte Carlo simulation for investigating the relationship between dose and image quality for pediatric CT patients.

  4. Space Radiation Cancer Risks and Uncertainties for Mars Missions

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; Schimmerling, W.; Wilson, J. W.; Peterson, L. E.; Badhwar, G. D.; Saganti, P. B.; Dicello, J. F.

    2001-01-01

    Projecting cancer risks from exposure to space radiation is highly uncertain because of the absence of data for humans and because of the limited radiobiology data available for estimating late effects from the high-energy and charge (HZE) ions present in the galactic cosmic rays (GCR). Cancer risk projections involve many biological and physical factors, each of which has a differential range of uncertainty due to the lack of data and knowledge. We discuss an uncertainty assessment within the linear-additivity model using the approach of Monte Carlo sampling from subjective error distributions that represent the lack of knowledge in each factor to quantify the overall uncertainty in risk projections. Calculations are performed using the space radiation environment and transport codes for several Mars mission scenarios. This approach leads to estimates of the uncertainties in cancer risk projections of 400-600% for a Mars mission. The uncertainties in the quality factors are dominant. Using safety standards developed for low-Earth orbit, long-term space missions (>90 days) outside the Earth's magnetic field are currently unacceptable if the confidence levels in risk projections are considered. Because GCR exposures involve multiple particle or delta-ray tracks per cellular array, our results suggest that the shape of the dose response at low dose rates may be an additional uncertainty for estimating space radiation risks.
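
The uncertainty assessment described, Monte Carlo sampling from subjective error distributions for each factor, can be sketched as follows. The use of lognormal error factors and the particular widths are illustrative assumptions, not the paper's actual distributions.

```python
import random

def risk_interval(point_risk, factor_sds, n=20000, seed=1):
    """Propagate factor uncertainties by Monte Carlo: multiply the point
    risk estimate by one median-1 lognormal error factor per uncertain
    model factor (quality factor, dosimetry, ...), then report the
    2.5th and 97.5th percentiles of the sampled risk."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        r = point_risk
        for sd in factor_sds:
            r *= rng.lognormvariate(0.0, sd)
        samples.append(r)
    samples.sort()
    return samples[int(0.025 * n)], samples[int(0.975 * n)]
```

Wide subjective distributions on a dominant factor (here, the quality factors) directly widen the resulting confidence interval, which is how 400-600% uncertainties arise from a single point estimate.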

  5. Monte Carlo dose calculations of beta-emitting sources for intravascular brachytherapy: a comparison between EGS4, EGSnrc, and MCNP.

    PubMed

    Wang, R; Li, X A

    2001-02-01

    The dose parameters for the beta-particle emitting 90Sr/90Y source for intravascular brachytherapy (IVBT) have been calculated by different investigators. At larger distances from the source, noticeable differences are seen in these parameters calculated using different Monte Carlo codes. The purpose of this work is to quantify and to understand these differences. We have compared a series of calculations using the EGS4, EGSnrc, and MCNP Monte Carlo codes. Data calculated and compared include the depth dose curve for a broad parallel beam of electrons, and radial dose distributions for point electron sources (monoenergetic or polyenergetic) and for a real 90Sr/90Y source. For the 90Sr/90Y source, the doses at the reference position (2 mm radial distance) calculated by the three codes agree within 2%. However, the differences between the doses calculated by the three codes can be over 20% in the radial distance range of interest in IVBT. The difference increases with radial distance from the source, reaching 30% at the tail of the dose curve. These differences may be partially attributed to the different multiple scattering theories and Monte Carlo models for electron transport adopted in the three codes. Doses calculated by the EGSnrc code are more accurate than those by EGS4. The two calculations agree within 5% for radial distances <6 mm.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Detilleux, Michel; Centner, Baudouin

    The paper describes different methodologies and tools developed in-house by Tractebel Engineering to facilitate the engineering works to be carried out, especially in the frame of decommissioning projects. Three examples of tools with their corresponding results are presented: - The LLWAA-DECOM code, a software developed for the radiological characterization of contaminated systems and equipment. The code constitutes a specific module of more general software that was originally developed to characterize radioactive waste streams in order to declare the radiological inventory of critical nuclides, in particular difficult-to-measure radionuclides, to the Authorities. In the case of LLWAA-DECOM, deposited activities inside contaminated equipment (piping, tanks, heat exchangers...) and scaling factors between nuclides, at any given time of the decommissioning time schedule, are calculated on the basis of physical characteristics of the systems and of operational parameters of the nuclear power plant. This methodology was applied to assess decommissioning costs of Belgian NPPs, to characterize the primary system of Trino NPP in Italy, to characterize the equipment of miscellaneous circuits of Ignalina NPP and of Kozloduy unit 1, and to calculate remaining dose rates around equipment in the frame of the preparation of decommissioning activities; - The VISIMODELLER tool, a user-friendly CAD interface developed to ease the introduction of lay-out areas in a software named VISIPLAN. VISIPLAN is a 3D dose rate assessment tool for ALARA work planning, developed by the Belgian Nuclear Research Centre SCK.CEN. Both tools were used for projects such as the steam generator replacements in Belgian NPPs and the preparation of the decommissioning of units 1 and 2 of Kozloduy NPP; - The DBS software, developed to manage the different kinds of activities that are part of the general time schedule of a decommissioning project.
    For each activity, when relevant, algorithms estimate, on the basis of local inputs, the radiological exposures of the operators (collective and individual doses), the production of primary, secondary and tertiary waste and its characterization, the production of conditioned waste, the release of effluents, etc., and enable the calculation and presentation (histograms) of the global results for all activities together. An example of application in the frame of the Ignalina decommissioning project is given. (authors)

  7. The influence of patient size on dose conversion coefficients: a hybrid phantom study for adult cardiac catheterization

    NASA Astrophysics Data System (ADS)

    Johnson, Perry; Lee, Choonsik; Johnson, Kevin; Siragusa, Daniel; Bolch, Wesley E.

    2009-06-01

    In this study, the influence of patient size on organ and effective dose conversion coefficients (DCCs) was investigated for a representative interventional fluoroscopic procedure—cardiac catheterization. The study was performed using hybrid phantoms representing an underweight, average and overweight American adult male. Reference body sizes were determined using the NHANES III database and parameterized based on standing height and total body mass. Organ and effective dose conversion coefficients were calculated for anterior-posterior, posterior-anterior, left anterior oblique and right anterior oblique projections using the Monte Carlo code MCNPX 2.5.0, with the metric dose-area product being used as the normalization factor. Results show body size to have a clear influence on DCCs, which increased noticeably as body size decreased. It was also shown that if patient size is neglected when choosing a DCC, the organ and effective dose will be underestimated for an underweight patient and overestimated for an overweight patient, with errors as large as 113% for certain projections. Results were further compared with those published for a KTMAN-2 Korean patient-specific tomographic phantom. The published DCCs aligned best with the hybrid phantom which most closely matched in overall body size. These results highlighted the need for and the advantages of phantom-patient matching, and it is recommended that hybrid phantoms be used to create a more diverse library of patient-dependent anthropomorphic phantoms for medical dose reconstruction.
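
The phantom-matching idea reduces to: pick the DCC of the reference phantom closest in size to the patient, then scale by the measured dose-area product. The sketch below uses body mass alone as the matching metric and placeholder coefficient values; the paper's DCCs are also projection-specific and parameterized by height.

```python
def pick_dcc(patient_mass_kg, dcc_by_phantom_mass):
    """Return the DCC of the reference phantom whose mass is closest
    to the patient's. dcc_by_phantom_mass maps phantom reference mass
    (kg) to a DCC in mGy per Gy*cm^2 (values here are illustrative)."""
    nearest = min(dcc_by_phantom_mass, key=lambda m: abs(m - patient_mass_kg))
    return dcc_by_phantom_mass[nearest]

# Illustrative underweight / average / overweight phantom DCCs:
dcc_by_phantom = {55.0: 1.8, 78.0: 1.2, 105.0: 0.9}
# Organ dose estimate = DCC * measured dose-area product (20 Gy*cm^2 here):
dose_mgy = pick_dcc(62.0, dcc_by_phantom) * 20.0
```

Using the average-phantom DCC (1.2) for this 62 kg patient instead of the matched one (1.8) would underestimate the dose, which is the error mode the paper quantifies.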

  8. Color-Coded Prefilled Medication Syringes Decrease Time to Delivery and Dosing Error in Simulated Emergency Department Pediatric Resuscitations.

    PubMed

    Moreira, Maria E; Hernandez, Caleb; Stevens, Allen D; Jones, Seth; Sande, Margaret; Blumen, Jason R; Hopkins, Emily; Bakes, Katherine; Haukoos, Jason S

    2015-08-01

    The Institute of Medicine has called on the US health care system to identify and reduce medical errors. Unfortunately, medication dosing errors remain commonplace and may result in potentially life-threatening outcomes, particularly for pediatric patients when dosing requires weight-based calculations. Novel medication delivery systems that may reduce dosing errors resonate with national health care priorities. Our goal was to evaluate novel, prefilled medication syringes labeled with color-coded volumes corresponding to the weight-based dosing of the Broselow Tape, compared with conventional medication administration, in simulated pediatric emergency department (ED) resuscitation scenarios. We performed a prospective, block-randomized, crossover study in which 10 emergency physician and nurse teams managed 2 simulated pediatric arrest scenarios in situ, using either prefilled, color-coded syringes (intervention) or conventional drug administration methods (control). The ED resuscitation room and the intravenous medication port were video recorded during the simulations. Data were extracted from video review by blinded, independent reviewers. Median time to delivery of all doses for the conventional and color-coded delivery groups was 47 seconds (95% confidence interval [CI] 40 to 53 seconds) and 19 seconds (95% CI 18 to 20 seconds), respectively (difference=27 seconds; 95% CI 21 to 33 seconds). With the conventional method, 118 doses were administered, with 20 critical dosing errors (17%); with the color-coded method, 123 doses were administered, with 0 critical dosing errors (difference=17%; 95% CI 4% to 30%). A novel color-coded, prefilled syringe decreased time to medication administration and significantly reduced critical dosing errors by emergency physician and nurse teams during simulated pediatric ED resuscitations. Copyright © 2015 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.
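
The error source the color-coded syringes remove is the bedside weight-based calculation itself. A minimal sketch of that calculation for a conventional ampoule; the numeric values in the example are illustrative stand-ins, not dosing guidance for any actual drug.

```python
def dose_volume_ml(weight_kg, dose_mg_per_kg, concentration_mg_per_ml):
    """Volume to draw up for a weight-based dose:
    (patient weight * dose per kg) / drug concentration.
    This is the under-pressure arithmetic a prefilled, color-coded
    syringe replaces with a single length-based color band."""
    return weight_kg * dose_mg_per_kg / concentration_mg_per_ml
```

A tenfold error in any one factor (a common failure mode with decimal points) propagates directly into the administered volume, which is why eliminating the calculation eliminated the critical dosing errors observed.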

  9. Verification of BWR Turbine Skyshine Dose with the MCNP5 Code Based on an Experiment Made at SHIMANE Nuclear Power Station

    NASA Astrophysics Data System (ADS)

    Tayama, Ryuichi; Wakasugi, Kenichi; Kawanaka, Ikunori; Kadota, Yoshinobu; Murakami, Yasuhiro

    We measured the skyshine dose from turbine buildings at Shimane Nuclear Power Station Unit 1 (NS-1) and Unit 2 (NS-2), and then compared it with the dose calculated with the Monte Carlo transport code MCNP5. The skyshine dose values calculated with the MCNP5 code agreed with the experimental data within a factor of 2.8, when the roof of the turbine building was precisely modeled. We concluded that our MCNP5 calculation was valid for BWR turbine skyshine dose evaluation.

  10. Comment on ‘egs_brachy: a versatile and fast Monte Carlo code for brachytherapy’

    NASA Astrophysics Data System (ADS)

    Yegin, Gultekin

    2018-02-01

    In a recent paper, Chamberland et al (2016 Phys. Med. Biol. 61 8214) develop a new Monte Carlo code called egs_brachy for brachytherapy treatments. It is based on EGSnrc and written in the C++ programming language. To benchmark the egs_brachy code, the authors use it in various test-case scenarios involving complex geometry conditions. Another EGSnrc-based brachytherapy dose calculation engine, BrachyDose, is used for dose comparisons. The authors fail to prove that egs_brachy can produce reasonable dose values for brachytherapy sources in a given medium. The dose comparisons in the paper are erroneous and misleading. egs_brachy should not be used in any further research studies unless and until all the potential bugs are fixed in the code.

  11. Comparison of Radiation Transport Codes, HZETRN, HETC and FLUKA, Using the 1956 Webber SPE Spectrum

    NASA Technical Reports Server (NTRS)

    Heinbockel, John H.; Slaba, Tony C.; Blattnig, Steve R.; Tripathi, Ram K.; Townsend, Lawrence W.; Handler, Thomas; Gabriel, Tony A.; Pinsky, Lawrence S.; Reddell, Brandon; Clowdsley, Martha S.; hide

    2009-01-01

    Protection of astronauts and instrumentation from galactic cosmic rays (GCR) and solar particle events (SPE) in the harsh environment of space is of prime importance in the design of personal shielding, spacecraft, and mission planning. Early entry of radiation constraints into the design process enables optimal shielding strategies, but demands efficient and accurate tools that can be used by design engineers in every phase of an evolving space project. The radiation transport code HZETRN is an efficient tool for analyzing the shielding effectiveness of materials exposed to space radiation. In this paper, HZETRN is compared to the Monte Carlo codes HETC-HEDS and FLUKA for a shield/target configuration comprised of a 20 g/cm^2 aluminum slab in front of a 30 g/cm^2 slab of water exposed to the February 1956 SPE, as modeled by the Webber spectrum. Neutron and proton fluence spectra, as well as dose and dose equivalent values, are compared at various depths in the water target. This study shows that there are many regions where HZETRN agrees with both HETC-HEDS and FLUKA for this shield/target configuration and SPE environment. However, there are also regions with appreciable differences between the three computer codes.

  12. The MARS15-based FermiCORD code system for calculation of the accelerator-induced residual dose

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grebe, A.; Leveling, A.; Lu, T.

    The FermiCORD code system, a set of codes based on MARS15 that calculates the accelerator-induced residual doses at experimental facilities of arbitrary configurations, has been developed. FermiCORD is written in C++ as an add-on to Fortran-based MARS15. The FermiCORD algorithm consists of two stages: 1) simulation of residual doses on contact with the surfaces surrounding the studied location and of radionuclide inventories in the structures surrounding those locations using MARS15, and 2) simulation of the emission of the nuclear decay gamma-quanta by the residuals in the activated structures and scoring the prompt doses of these gamma-quanta at arbitrary distances from those structures. The FermiCORD code system has been benchmarked against similar algorithms based on other code systems and showed good agreement. The code system has been applied to calculation of the residual dose of the target station for the Mu2e experiment, and the results have been compared to approximate dosimetric approaches.

  13. The MARS15-based FermiCORD code system for calculation of the accelerator-induced residual dose

    NASA Astrophysics Data System (ADS)

    Grebe, A.; Leveling, A.; Lu, T.; Mokhov, N.; Pronskikh, V.

    2018-01-01

    The FermiCORD code system, a set of codes based on MARS15 that calculates the accelerator-induced residual doses at experimental facilities of arbitrary configurations, has been developed. FermiCORD is written in C++ as an add-on to Fortran-based MARS15. The FermiCORD algorithm consists of two stages: 1) simulation of residual doses on contact with the surfaces surrounding the studied location and of radionuclide inventories in the structures surrounding those locations using MARS15, and 2) simulation of the emission of the nuclear decay γ-quanta by the residuals in the activated structures and scoring the prompt doses of these γ-quanta at arbitrary distances from those structures. The FermiCORD code system has been benchmarked against similar algorithms based on other code systems and against experimental data from the CERF facility at CERN, and FermiCORD showed reasonable agreement with these. The code system has been applied for calculation of the residual dose of the target station for the Mu2e experiment and the results have been compared to approximate dosimetric approaches.

  14. A comparative study of space radiation organ doses and associated cancer risks using PHITS and HZETRN.

    PubMed

    Bahadori, Amir A; Sato, Tatsuhiko; Slaba, Tony C; Shavers, Mark R; Semones, Edward J; Van Baalen, Mary; Bolch, Wesley E

    2013-10-21

    NASA currently uses one-dimensional deterministic transport to generate values of the organ dose equivalent needed to calculate stochastic radiation risk following crew space exposures. In this study, organ absorbed doses and dose equivalents are calculated for 50th percentile male and female astronaut phantoms using both the NASA High Charge and Energy Transport Code to perform one-dimensional deterministic transport and the Particle and Heavy Ion Transport Code System to perform three-dimensional Monte Carlo transport. Two measures of radiation risk, effective dose and risk of exposure-induced death (REID) are calculated using the organ dose equivalents resulting from the two methods of radiation transport. For the space radiation environments and simplified shielding configurations considered, small differences (<8%) in the effective dose and REID are found. However, for the galactic cosmic ray (GCR) boundary condition, compensating errors are observed, indicating that comparisons between the integral measurements of complex radiation environments and code calculations can be misleading. Code-to-code benchmarks allow for the comparison of differential quantities, such as secondary particle differential fluence, to provide insight into differences observed in integral quantities for particular components of the GCR spectrum.
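
Both transport approaches feed the same downstream bookkeeping: effective dose is the tissue-weighted sum of organ dose equivalents, E = Σ_T w_T H_T. A minimal sketch with illustrative organ names and weights, not the full ICRP tissue-weighting table:

```python
def effective_dose(organ_h_sv, tissue_weights):
    """Effective dose E = sum over tissues T of w_T * H_T, where H_T is
    the organ dose equivalent (Sv) and the weights w_T sum to 1."""
    assert abs(sum(tissue_weights.values()) - 1.0) < 1e-9
    return sum(tissue_weights[t] * organ_h_sv[t] for t in tissue_weights)
```

Because E is an integral quantity, two codes can agree on it while disagreeing on the underlying differential fluences, which is exactly the compensating-error caveat the study raises.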

  15. A comparative study of space radiation organ doses and associated cancer risks using PHITS and HZETRN

    NASA Astrophysics Data System (ADS)

    Bahadori, Amir A.; Sato, Tatsuhiko; Slaba, Tony C.; Shavers, Mark R.; Semones, Edward J.; Van Baalen, Mary; Bolch, Wesley E.

    2013-10-01

    NASA currently uses one-dimensional deterministic transport to generate values of the organ dose equivalent needed to calculate stochastic radiation risk following crew space exposures. In this study, organ absorbed doses and dose equivalents are calculated for 50th percentile male and female astronaut phantoms using both the NASA High Charge and Energy Transport Code to perform one-dimensional deterministic transport and the Particle and Heavy Ion Transport Code System to perform three-dimensional Monte Carlo transport. Two measures of radiation risk, effective dose and risk of exposure-induced death (REID) are calculated using the organ dose equivalents resulting from the two methods of radiation transport. For the space radiation environments and simplified shielding configurations considered, small differences (<8%) in the effective dose and REID are found. However, for the galactic cosmic ray (GCR) boundary condition, compensating errors are observed, indicating that comparisons between the integral measurements of complex radiation environments and code calculations can be misleading. Code-to-code benchmarks allow for the comparison of differential quantities, such as secondary particle differential fluence, to provide insight into differences observed in integral quantities for particular components of the GCR spectrum.

  16. Gamma irradiator dose mapping simulation using the MCNP code and benchmarking with dosimetry.

    PubMed

    Sohrabpour, M; Hassanzadeh, M; Shahriari, M; Sharifzadeh, M

    2002-10-01

    The Monte Carlo transport code MCNP has been applied to simulating the dose rate distribution in the IR-136 gamma irradiator system. Isodose curves, cumulative dose values, and system design data such as throughputs, over-dose ratios, and efficiencies have been simulated as functions of product density. Simulated isodose curves and cumulative dose values were compared with dosimetry values obtained using polymethyl methacrylate, Fricke, ethanol-chlorobenzene, and potassium dichromate dosimeters. The resulting system design data were also found to agree favorably with the system manufacturer's data. MCNP has thus been found to be an effective transport code for handling various dose mapping exercises for gamma irradiators.

  17. Test of 3D CT reconstructions by EM + TV algorithm from undersampled data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evseev, Ivan; Ahmann, Francielle; Silva, Hamilton P. da

    2013-05-06

    Computerized tomography (CT) plays an important role in medical imaging for diagnosis and therapy. However, CT imaging involves ionizing radiation exposure of patients. Therefore, dose reduction is an essential issue in CT. In 2011, the Expectation Maximization and Total Variation Based Model for CT Reconstruction (EM+TV) was proposed. This method can reconstruct a better image using fewer CT projections in comparison with the usual filtered back projection (FBP) technique. Thus, it could significantly reduce the overall dose of radiation in CT. This work reports the results of an independent numerical simulation for cone beam CT geometry with alternative virtual phantoms. As in the original report, the 3D CT images of 128 × 128 × 128 virtual phantoms were reconstructed. It was not possible to implement phantoms with larger dimensions because of the slowness of code execution, even on a Core i7 CPU.
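
The TV half of the EM+TV objective penalizes the total variation of the reconstructed volume, which favors piecewise-smooth images and suppresses undersampling streaks. A sketch of the anisotropic discrete TV for a small nested-list volume (pure Python, so practical only at toy sizes; the reported slowness at 128^3 is consistent with this cost scaling with voxel count):

```python
def total_variation_3d(vol):
    """Anisotropic total variation of a 3D volume stored as nested lists:
    the sum of absolute forward differences along each of the three axes.
    EM+TV minimizes a Poisson (EM) data term plus a multiple of this penalty."""
    nx, ny, nz = len(vol), len(vol[0]), len(vol[0][0])
    tv = 0.0
    for i in range(nx):
        for j in range(ny):
            for k in range(nz):
                if i + 1 < nx:
                    tv += abs(vol[i + 1][j][k] - vol[i][j][k])
                if j + 1 < ny:
                    tv += abs(vol[i][j + 1][k] - vol[i][j][k])
                if k + 1 < nz:
                    tv += abs(vol[i][j][k + 1] - vol[i][j][k])
    return tv
```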

  18. Color-coded prefilled medication syringes decrease time to delivery and dosing errors in simulated prehospital pediatric resuscitations: A randomized crossover trial

    PubMed Central

    Stevens, Allen D.; Hernandez, Caleb; Jones, Seth; Moreira, Maria E.; Blumen, Jason R.; Hopkins, Emily; Sande, Margaret; Bakes, Katherine; Haukoos, Jason S.

    2016-01-01

    Background Medication dosing errors remain commonplace and may result in potentially life-threatening outcomes, particularly for pediatric patients where dosing often requires weight-based calculations. Novel medication delivery systems that may reduce dosing errors resonate with national healthcare priorities. Our goal was to evaluate novel, prefilled medication syringes labeled with color-coded volumes corresponding to the weight-based dosing of the Broselow Tape, compared to conventional medication administration, in simulated prehospital pediatric resuscitation scenarios. Methods We performed a prospective, block-randomized, cross-over study, where 10 full-time paramedics each managed two simulated pediatric arrests in situ using either prefilled, color-coded-syringes (intervention) or their own medication kits stocked with conventional ampoules (control). Each paramedic was paired with two emergency medical technicians to provide ventilations and compressions as directed. The ambulance patient compartment and the intravenous medication port were video recorded. Data were extracted from video review by blinded, independent reviewers. Results Median time to delivery of all doses for the intervention and control groups was 34 (95% CI: 28–39) seconds and 42 (95% CI: 36–51) seconds, respectively (difference = 9 [95% CI: 4–14] seconds). Using the conventional method, 62 doses were administered with 24 (39%) critical dosing errors; using the prefilled, color-coded syringe method, 59 doses were administered with 0 (0%) critical dosing errors (difference = 39%, 95% CI: 13–61%). Conclusions A novel color-coded, prefilled syringe decreased time to medication administration and significantly reduced critical dosing errors by paramedics during simulated prehospital pediatric resuscitations. PMID:26247145

  19. Color-coded prefilled medication syringes decrease time to delivery and dosing errors in simulated prehospital pediatric resuscitations: A randomized crossover trial.

    PubMed

    Stevens, Allen D; Hernandez, Caleb; Jones, Seth; Moreira, Maria E; Blumen, Jason R; Hopkins, Emily; Sande, Margaret; Bakes, Katherine; Haukoos, Jason S

    2015-11-01

    Medication dosing errors remain commonplace and may result in potentially life-threatening outcomes, particularly for pediatric patients where dosing often requires weight-based calculations. Novel medication delivery systems that may reduce dosing errors resonate with national healthcare priorities. Our goal was to evaluate novel, prefilled medication syringes labeled with color-coded volumes corresponding to the weight-based dosing of the Broselow Tape, compared to conventional medication administration, in simulated prehospital pediatric resuscitation scenarios. We performed a prospective, block-randomized, cross-over study, where 10 full-time paramedics each managed two simulated pediatric arrests in situ using either prefilled, color-coded syringes (intervention) or their own medication kits stocked with conventional ampoules (control). Each paramedic was paired with two emergency medical technicians to provide ventilations and compressions as directed. The ambulance patient compartment and the intravenous medication port were video recorded. Data were extracted from video review by blinded, independent reviewers. Median time to delivery of all doses for the intervention and control groups was 34 (95% CI: 28-39) seconds and 42 (95% CI: 36-51) seconds, respectively (difference=9 [95% CI: 4-14] seconds). Using the conventional method, 62 doses were administered with 24 (39%) critical dosing errors; using the prefilled, color-coded syringe method, 59 doses were administered with 0 (0%) critical dosing errors (difference=39%, 95% CI: 13-61%). A novel color-coded, prefilled syringe decreased time to medication administration and significantly reduced critical dosing errors by paramedics during simulated prehospital pediatric resuscitations. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
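
    The headline rates above follow directly from the reported counts; a quick sketch (illustrative only) reproduces the arithmetic:

```python
# Recomputing the reported error rates from the stated counts (illustrative).
conventional_doses, conventional_errors = 62, 24
prefilled_doses, prefilled_errors = 59, 0

rate_conventional = conventional_errors / conventional_doses
rate_prefilled = prefilled_errors / prefilled_doses
difference = rate_conventional - rate_prefilled
print(f"critical dosing errors: {rate_conventional:.0%} vs {rate_prefilled:.0%} "
      f"(difference = {difference:.0%})")  # 39% vs 0% (difference = 39%)
```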

  20. SU-E-I-42: Normalized Embryo/fetus Doses for Fluoroscopically Guided Pacemaker Implantation Procedures Calculated Using a Monte Carlo Technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Damilakis, J; Stratakis, J; Solomou, G

Purpose: It is well known that pacemaker implantation is sometimes needed in pregnant patients with symptomatic bradycardia. To our knowledge, there is no reported experience regarding radiation doses to the unborn child resulting from fluoroscopy during pacemaker implantation. The purpose of the current study was to develop a method for estimating embryo/fetus dose from fluoroscopically guided pacemaker implantation procedures performed on pregnant patients during all trimesters of gestation. Methods: The Monte Carlo N-Particle (MCNP) radiation transport code was employed in this study. Three mathematical anthropomorphic phantoms representing the average pregnant patient at the first, second and third trimesters of gestation were generated using Bodybuilder software (White Rock Science, White Rock, NM). The normalized embryo/fetus doses from the posteroanterior (PA), the 30° left-anterior oblique (LAO) and the 30° right-anterior oblique (RAO) projections were calculated for a wide range of kVp (50–120 kVp) and total filtration values (2.5–9.0 mm Al). Results: The results consist of radiation doses normalized to a) entrance skin dose (ESD) and b) dose area product (DAP) so that the dose to the unborn child from any fluoroscopic technique and x-ray device used can be calculated. ESD-normalized doses ranged from 0.008 (PA, first trimester) to 2.519 μGy/mGy (RAO, third trimester). DAP-normalized doses ranged from 0.051 (PA, first trimester) to 12.852 μGy/Gy cm2 (RAO, third trimester). Conclusion: Embryo/fetus doses from fluoroscopically guided pacemaker implantation procedures performed on pregnant patients during all stages of gestation can be estimated using the method developed in this study. This study was supported by the Greek Ministry of Education and Religious Affairs, General Secretariat for Research and Technology, Operational Program ‘Education and Lifelong Learning’, ARISTIA (Research project: CONCERT).
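
    The normalized coefficients above are meant to be multiplied by a measured quantity to obtain the dose to the unborn child. A minimal sketch of that use, with a hypothetical DAP value and the DAP-normalized coefficient quoted above:

```python
# Hypothetical example: fetal dose from a measured DAP and the DAP-normalized
# coefficient quoted for the third-trimester RAO projection.
def fetal_dose_ugy(dap_gy_cm2, coeff_ugy_per_gy_cm2):
    """Fetal dose (uGy) = DAP (Gy*cm^2) * DAP-normalized coefficient."""
    return dap_gy_cm2 * coeff_ugy_per_gy_cm2

# A DAP of 20 Gy*cm^2 is an assumed, illustrative procedure value.
print(round(fetal_dose_ugy(20.0, 12.852), 2))  # 257.04
```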

  1. Using the Monte Carlo technique to calculate dose conversion coefficients for medical professionals in interventional radiology

    NASA Astrophysics Data System (ADS)

    Santos, W. S.; Carvalho, A. B., Jr.; Hunt, J. G.; Maia, A. F.

    2014-02-01

The objective of this study was to estimate doses to the physician and the nurse assistant at different positions during interventional radiology procedures. In this study, effective doses obtained for the physician and at points occupied by other workers were normalised by the air kerma-area product (KAP) to yield dose conversion coefficients (CCs). The simulations were performed for two X-ray spectra (70 kVp and 87 kVp) using the radiation transport code MCNPX (version 2.7.0), and a pair of anthropomorphic voxel phantoms (MASH/FASH) was used to represent both the patient and the medical professional at positions from 7 cm to 47 cm from the patient. The X-ray tube was represented by a point source positioned in the anterior-posterior (AP) and posterior-anterior (PA) projections. The CCs can be used to calculate effective doses, which in turn are related to stochastic effects. With knowledge of the CC values and the KAP measured on the X-ray equipment under similar exposure conditions, medical professionals will be able to estimate their own effective dose.
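
    The workflow described above reduces to multiplying a conversion coefficient by the measured KAP. A minimal sketch, with hypothetical CC and KAP values:

```python
# Hypothetical example: occupational effective dose from a conversion
# coefficient (CC) and a measured kerma-area product (KAP). Both values
# are assumed placeholders, not results from the study.
def effective_dose_msv(cc_msv_per_gy_cm2, kap_gy_cm2):
    """Effective dose E (mSv) = CC (mSv per Gy*cm^2) * KAP (Gy*cm^2)."""
    return cc_msv_per_gy_cm2 * kap_gy_cm2

print(round(effective_dose_msv(0.02, 150.0), 2))  # 3.0
```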

  2. Comparison of EGS4 and MCNP Monte Carlo codes when calculating radiotherapy depth doses.

    PubMed

    Love, P A; Lewis, D G; Al-Affan, I A; Smith, C W

    1998-05-01

The Monte Carlo codes EGS4 and MCNP have been compared when calculating radiotherapy depth doses in water. The aims of the work were to study (i) the differences between calculated depth doses in water for a range of monoenergetic photon energies and (ii) the relative efficiency of the two codes for different electron transport energy cut-offs. The depth doses from the two codes agree with each other within the statistical uncertainties of the calculations (1-2%). The relative depth doses also agree with data tabulated in the British Journal of Radiology Supplement 25. A discrepancy in the dose build-up region may be attributed to the different electron transport algorithms used by EGS4 and MCNP. This discrepancy is considerably reduced when the improved electron transport routines are used in the latest (4B) version of MCNP. Timing calculations show that EGS4 is at least 50% faster than MCNP for the geometries used in the simulations.

  3. Evaluation of the medical and occupational shielding in cerebral angiography using Monte Carlo simulations and virtual anthropomorphic phantoms

    NASA Astrophysics Data System (ADS)

    Santos, William S.; Neves, Lucio P.; Perini, Ana P.; Caldas, Linda V. E.; Maia, Ana F.

    2015-12-01

Cerebral angiography (CA) exams may provide valuable diagnostic information for patients with suspected cerebral disease, but they may also deliver high radiation doses to the patients and medical staff. In order to evaluate the medical and occupational exposures under different irradiation conditions, Monte Carlo (MC) simulations were employed. Virtual anthropomorphic phantoms (MASH) were used to represent the patient and the physician inside a typical fluoroscopy room, also simulated in detail, incorporated in the MCNPX 2.7.0 MC code. The evaluation was carried out by means of dose conversion coefficients (CCs) for equivalent (H) and effective (E) doses normalized by the air kerma-area product (KAP). The CCs for the entrance skin dose (ESD) of the patient and the equivalent dose to the eyes of the medical staff were determined, because CA exams present higher risks for those organs. The tube voltage was 80 kVp, and Al filters with thicknesses of 2.5 mm, 3.5 mm and 4.0 mm were positioned in the beams. Two projections were simulated: posterior-anterior (PA) and right-lateral (RLAT). In all situations the CC values increased with increasing Al filtration. The highest dose was obtained for the RLAT projection with a 4.0 mm Al filter. In this projection, the ESD/KAP and E/KAP values for the patient were 11 (14%) mGy/Gy cm2 and 0.12 (0.1%) mSv/Gy cm2, respectively. For the physician, the use of a suspended lead glass shield and a lead curtain attached to the surgical table resulted in a significant reduction of the CCs. The use of MC simulations proved to be a very important tool in radiation protection dosimetry; in this study several parameters could be evaluated that would not be possible to assess experimentally.

  4. Dose Modeling Evaluations and Technical Support Document For the Authorized Limits Request for the DOE-Owned Property Outside the Limited Area, Paducah Gaseous Diffusion Plant Paducah, Kentucky

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boerner, A. J.; Maldonado, D. G.; Hansen, Tom

    2012-09-01

Environmental assessments and remediation activities are being conducted by the U.S. Department of Energy (DOE) at the Paducah Gaseous Diffusion Plant (PGDP), Paducah, Kentucky. The Oak Ridge Institute for Science and Education (ORISE), a DOE prime contractor, was contracted by the DOE Portsmouth/Paducah Project Office (DOE-PPPO) to conduct radiation dose modeling analyses and derive single radionuclide soil guidelines (soil guidelines) in support of the derivation of Authorized Limits (ALs) for 'DOE-Owned Property Outside the Limited Area' ('Property') at the PGDP. The ORISE evaluation specifically included the area identified by DOE restricted area postings (public use access restrictions) and areas licensed by DOE to the West Kentucky Wildlife Management Area (WKWMA). The licensed areas are available without restriction to the general public for a variety of (primarily) recreational uses. Relevant receptors impacting current and reasonably anticipated future use activities were evaluated. In support of soil guideline derivation, a Conceptual Site Model (CSM) was developed. The CSM listed radiation and contamination sources, release mechanisms, transport media, representative exposure pathways from residual radioactivity, and a total of three receptors (under present and future use scenarios). Plausible receptors included a Resident Farmer, Recreational User, and Wildlife Worker. Single radionuclide soil guidelines (outputs specified by the software modeling code) were generated for the three receptors and thirteen targeted radionuclides. These soil guidelines were based on satisfying the project dose constraints. For comparison, soil guidelines applicable to the basic radiation public dose limit of 100 mrem/yr were generated. Single radionuclide soil guidelines from the most limiting (restrictive) receptor, based on a target dose constraint of 25 mrem/yr, were then rounded and identified as the derived soil guidelines. An additional evaluation using the derived soil guidelines as inputs into the code was also performed to determine the maximum (peak) dose for all receptors. This report contains the technical basis in support of the DOE's derivation of ALs for the 'Property.' A complete description of the methodology, including an assessment of the input parameters, model inputs, and results, is provided in this report. This report also provides initial recommendations on applying the derived soil guidelines.
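
    The limiting-receptor selection described above can be sketched as follows; the dose-to-source ratios are hypothetical placeholders, not values from the report:

```python
# Sketch of deriving a single-radionuclide soil guideline: guideline =
# dose constraint / dose-to-source ratio, taking the most restrictive
# (smallest) guideline across receptors. Ratios below are illustrative.
DOSE_CONSTRAINT = 25.0  # mrem/yr target dose constraint

dose_per_unit_conc = {  # mrem/yr per pCi/g, per receptor (assumed values)
    "Resident Farmer": 0.50,
    "Recreational User": 0.05,
    "Wildlife Worker": 0.10,
}

guidelines = {r: DOSE_CONSTRAINT / d for r, d in dose_per_unit_conc.items()}
limiting_receptor = min(guidelines, key=guidelines.get)
print(limiting_receptor, round(guidelines[limiting_receptor], 1))  # Resident Farmer 50.0
```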

  5. SU-F-J-99: Dose Accumulation and Evaluation in Lung SBRT Among All Phases of Respiration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Azcona, JD; Barbes, B; Aristu, J

Purpose: To calculate the total planning dose on lung tumors (GTV) by accumulating the dose received in all respiration phases. Methods: A patient 4D planning CT (phase-binned, from a Siemens Somatom CT) was used to locate the GTV of a lung tumor in all respiratory phases with Pinnacle (v9.10). GTV contours defined in all phases were projected to the reference phase, where the ITV was defined. Centroids were calculated for all the GTV projections. No deformation or rotation was taken into account. Only the GTV contour as defined in the reference phase was voxelized to track each voxel individually. We accumulated the absorbed dose in different phases on each voxel. A 3DCRT and a VMAT plan were designed on the reference phase fulfilling the ITV dosimetric requirements, using the 10MV FFF photon model of an Elekta Versa linac. ITV-to-PTV margins were set to 5mm. In-house developed MATLAB code was used for tumor voxelization and dose accumulation, assuming that the dose distribution planned in the reference phase behaved as a “dose-cloud” during patient breathing. Results: We tested the method on a patient 4DCT set of images exhibiting limited tumor motion (<5mm). For the 3DCRT plan, D95 was calculated for the GTV with motion and for the ITV, showing an agreement of 0.04%. For the VMAT plan, we calculated the D95 for every phase as if the GTV in that phase had received the whole treatment. Differences in D95 for all phases are within 1%, and estimate the potential interplay effect during delivery. Conclusion: A method for dose accumulation and assessment was developed that can compare GTV motion with ITV dosage, and estimate the potential interplay effect for VMAT plans. Work in progress includes the incorporation of deformable image registration and 4D CBCT dose calculation for dose reconstruction and assessment during treatment.
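
    The voxel-wise accumulation and D95 evaluation can be sketched as below (not the authors' MATLAB code); the per-phase doses are random placeholders, and no deformation is modeled, matching the dose-cloud assumption:

```python
import numpy as np

# Sketch: each GTV voxel sums the dose it receives across all respiratory
# phases, then D95 is read from the accumulated dose distribution.
# Doses here are random placeholders, not plan data.
rng = np.random.default_rng(0)
n_voxels, n_phases = 1000, 10
dose_per_phase = rng.uniform(4.8, 5.2, size=(n_phases, n_voxels))  # Gy, illustrative

accumulated = dose_per_phase.sum(axis=0)  # total dose per voxel over all phases
d95 = np.percentile(accumulated, 5)       # dose received by at least 95% of voxels
print(f"D95 = {d95:.2f} Gy")
```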

  6. Linear energy transfer in water phantom within SHIELD-HIT transport code

    NASA Astrophysics Data System (ADS)

    Ergun, A.; Sobolevsky, N.; Botvina, A. S.; Buyukcizmeci, N.; Latysheva, L.; Ogul, R.

    2017-02-01

The effect of irradiation in tissue is important in hadron therapy for dose measurement and treatment planning. This biological effect is described by an equivalent dose H, which depends on the Linear Energy Transfer (LET). Usually, H can be expressed in terms of the absorbed dose D and the quality factor K of the radiation under consideration. In the literature, various types of transport codes have been used for modeling and simulation of the interaction of beams of protons and heavier ions with tissue-equivalent materials. In this presentation we used the SHIELD-HIT code to simulate the decomposition of the absorbed dose by LET in water for 16O beams. A more detailed description of the capabilities of the SHIELD-HIT code can be found in the literature.
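
    The relation between H, D and the quality factor can be illustrated with a short sketch; the step-wise Q(L) form used here is the ICRP 60 recommendation, which is an assumption since the abstract does not specify which quality-factor model the authors applied:

```python
# Sketch: H = Q(LET) * D, with the ICRP 60 step-wise quality factor Q(L)
# assumed as the model (an assumption, not stated in the abstract).
def quality_factor(let_kev_um):
    """ICRP 60 quality factor vs. unrestricted LET in water (keV/um)."""
    L = let_kev_um
    if L < 10:
        return 1.0
    if L <= 100:
        return 0.32 * L - 2.2
    return 300.0 / L ** 0.5

def dose_equivalent_sv(absorbed_dose_gy, let_kev_um):
    """Equivalent dose H (Sv) from absorbed dose D (Gy) at a given LET."""
    return quality_factor(let_kev_um) * absorbed_dose_gy

print(round(dose_equivalent_sv(2.0, 50.0), 2))  # 27.6
```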

  7. The DOPEX code: An application of the method of steepest descent to laminated-shield-weight optimization with several constraints

    NASA Technical Reports Server (NTRS)

    Lahti, G. P.

    1972-01-01

A two- or three-constraint, two-dimensional radiation shield weight optimization procedure and a computer program, DOPEX, are described. The DOPEX code uses the steepest descent method to alter a set of initial (input) thicknesses for a shield configuration to achieve a minimum weight while simultaneously satisfying dose constraints. The code assumes an exponential dose-shield thickness relation with parameters specified by the user. The code also assumes that dose rates in each principal direction are dependent only on thicknesses in that direction. Code input instructions, a FORTRAN 4 listing, and a sample problem are given. Typical computer time required to optimize a seven-layer shield is about 0.1 minute on an IBM 7094-2.
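
    For a single shield layer the dose constraint is active at the optimum, so the exponential dose-thickness relation assumed by DOPEX can be inverted directly; a sketch with illustrative parameters (not the DOPEX code itself):

```python
import math

# Illustrative single-layer version of the DOPEX assumptions: dose falls
# exponentially with thickness, D(t) = D0 * exp(-t / lam), so with one
# layer the optimal thickness sits exactly on the dose constraint.
D0, lam = 100.0, 2.0       # unshielded dose rate and relaxation length (cm), assumed
rho, area = 11.3, 1.0e4    # lead density (g/cm^3) and shield area (cm^2), assumed
dose_limit = 1.0           # dose constraint, same units as D0

def dose(t):
    return D0 * math.exp(-t / lam)

t_min = lam * math.log(D0 / dose_limit)   # invert the exponential relation
weight_kg = rho * area * t_min / 1000.0   # shield weight at the optimum
print(round(t_min, 2), round(dose(t_min), 2))  # 9.21 1.0
```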

  8. Monte Carlo MCNP-4B-based absorbed dose distribution estimates for patient-specific dosimetry.

    PubMed

    Yoriyaz, H; Stabin, M G; dos Santos, A

    2001-04-01

This study was intended to verify the capability of the Monte Carlo MCNP-4B code to evaluate spatial dose distributions based on information gathered from CT or SPECT. A new three-dimensional (3D) dose calculation approach for internal emitter use in radioimmunotherapy (RIT) was developed using the Monte Carlo MCNP-4B code as the photon and electron transport engine. It was shown that the MCNP-4B computer code can be used with voxel-based anatomic and physiologic data to provide 3D dose distributions. This study showed that the MCNP-4B code can be used to develop a treatment planning system that will provide such information in a timely manner, if dose reporting is suitably optimized. If each organ is divided into small regions where the average energy deposition is calculated, with a typical volume of 0.4 cm(3), regional dose distributions can be provided with reasonable central processing unit times (on the order of 12-24 h on a 200-MHz personal computer or modest workstation). Further efforts to provide semiautomated region identification (segmentation) and improvement of marrow dose calculations are needed to supply a complete system for RIT. It is envisioned that all such efforts will continue to develop and that internal dose calculations may soon be brought to a similar level of accuracy, detail, and robustness as is commonly expected in external dose treatment planning. For this study we developed a code with a user-friendly interface that works on several nuclear medicine imaging platforms and provides timely patient-specific dose information to the physician and medical physicist. Future therapy with internal emitters should use a 3D dose calculation approach, which represents a significant advance over the dose information provided by the standard geometric phantoms used for more than 20 y (which permit reporting of only average organ doses for certain standardized individuals).
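
    The regional averaging described above (grouping voxels into ~0.4 cm(3) regions) can be sketched as a block average over the voxel grid; shapes and doses here are illustrative:

```python
import numpy as np

# Sketch of regional dose reporting: voxel doses are grouped into small
# non-overlapping regions and the average per region is reported instead
# of per voxel. Grid size and dose values are illustrative.
voxel_dose = np.arange(64, dtype=float).reshape(4, 4, 4)  # Gy, toy voxel grid

# Average over non-overlapping 2x2x2 blocks (standing in for ~0.4 cm^3 regions):
regional = voxel_dose.reshape(2, 2, 2, 2, 2, 2).mean(axis=(1, 3, 5))
print(regional.shape)  # (2, 2, 2)
```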

  9. Comet assay in reconstructed 3D human epidermal skin models--investigation of intra- and inter-laboratory reproducibility with coded chemicals.

    PubMed

    Reus, Astrid A; Reisinger, Kerstin; Downs, Thomas R; Carr, Gregory J; Zeller, Andreas; Corvi, Raffaella; Krul, Cyrille A M; Pfuhler, Stefan

    2013-11-01

    Reconstructed 3D human epidermal skin models are being used increasingly for safety testing of chemicals. Based on EpiDerm™ tissues, an assay was developed in which the tissues were topically exposed to test chemicals for 3h followed by cell isolation and assessment of DNA damage using the comet assay. Inter-laboratory reproducibility of the 3D skin comet assay was initially demonstrated using two model genotoxic carcinogens, methyl methane sulfonate (MMS) and 4-nitroquinoline-n-oxide, and the results showed good concordance among three different laboratories and with in vivo data. In Phase 2 of the project, intra- and inter-laboratory reproducibility was investigated with five coded compounds with different genotoxicity liability tested at three different laboratories. For the genotoxic carcinogens MMS and N-ethyl-N-nitrosourea, all laboratories reported a dose-related and statistically significant increase (P < 0.05) in DNA damage in every experiment. For the genotoxic carcinogen, 2,4-diaminotoluene, the overall result from all laboratories showed a smaller, but significant genotoxic response (P < 0.05). For cyclohexanone (CHN) (non-genotoxic in vitro and in vivo, and non-carcinogenic), an increase compared to the solvent control acetone was observed only in one laboratory. However, the response was not dose related and CHN was judged negative overall, as was p-nitrophenol (p-NP) (genotoxic in vitro but not in vivo and non-carcinogenic), which was the only compound showing clear cytotoxic effects. For p-NP, significant DNA damage generally occurred only at doses that were substantially cytotoxic (>30% cell loss), and the overall response was comparable in all laboratories despite some differences in doses tested. The results of the collaborative study for the coded compounds were generally reproducible among the laboratories involved and intra-laboratory reproducibility was also good. 
These data indicate that the comet assay in EpiDerm™ skin models is a promising model for the safety assessment of compounds with a dermal route of exposure.

  10. Comet assay in reconstructed 3D human epidermal skin models—investigation of intra- and inter-laboratory reproducibility with coded chemicals

    PubMed Central

    Pfuhler, Stefan

    2013-01-01

    Reconstructed 3D human epidermal skin models are being used increasingly for safety testing of chemicals. Based on EpiDerm™ tissues, an assay was developed in which the tissues were topically exposed to test chemicals for 3h followed by cell isolation and assessment of DNA damage using the comet assay. Inter-laboratory reproducibility of the 3D skin comet assay was initially demonstrated using two model genotoxic carcinogens, methyl methane sulfonate (MMS) and 4-nitroquinoline-n-oxide, and the results showed good concordance among three different laboratories and with in vivo data. In Phase 2 of the project, intra- and inter-laboratory reproducibility was investigated with five coded compounds with different genotoxicity liability tested at three different laboratories. For the genotoxic carcinogens MMS and N-ethyl-N-nitrosourea, all laboratories reported a dose-related and statistically significant increase (P < 0.05) in DNA damage in every experiment. For the genotoxic carcinogen, 2,4-diaminotoluene, the overall result from all laboratories showed a smaller, but significant genotoxic response (P < 0.05). For cyclohexanone (CHN) (non-genotoxic in vitro and in vivo, and non-carcinogenic), an increase compared to the solvent control acetone was observed only in one laboratory. However, the response was not dose related and CHN was judged negative overall, as was p-nitrophenol (p-NP) (genotoxic in vitro but not in vivo and non-carcinogenic), which was the only compound showing clear cytotoxic effects. For p-NP, significant DNA damage generally occurred only at doses that were substantially cytotoxic (>30% cell loss), and the overall response was comparable in all laboratories despite some differences in doses tested. The results of the collaborative study for the coded compounds were generally reproducible among the laboratories involved and intra-laboratory reproducibility was also good. 
These data indicate that the comet assay in EpiDerm™ skin models is a promising model for the safety assessment of compounds with a dermal route of exposure. PMID:24150594

  11. Development of probabilistic multimedia multipathway computer codes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, C.; LePoire, D.; Gnanapragasam, E.

    2002-01-01

The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils) and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of the RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.
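
    The end result of the six-step procedure is that uncertain inputs are sampled from distributions and propagated to a dose distribution. A toy sketch of that idea (the distributions and dose model are illustrative, not RESRAD defaults):

```python
import numpy as np

# Toy probabilistic propagation: sample uncertain inputs from assigned
# distributions, push each sample through a dose model, and summarize the
# resulting dose distribution. All values and the model are illustrative.
rng = np.random.default_rng(42)
n = 10_000

soil_conc = rng.triangular(0.5, 1.0, 2.0, size=n)           # pCi/g, assumed
intake_factor = rng.lognormal(mean=0.0, sigma=0.3, size=n)  # unitless, assumed

dose = 10.0 * soil_conc * intake_factor  # mrem/yr, toy linear dose model
print(f"median = {np.median(dose):.1f} mrem/yr, "
      f"95th percentile = {np.percentile(dose, 95):.1f} mrem/yr")
```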

  12. Rigorous-two-Steps scheme of TRIPOLI-4® Monte Carlo code validation for shutdown dose rate calculation

    NASA Astrophysics Data System (ADS)

    Jaboulay, Jean-Charles; Brun, Emeric; Hugot, François-Xavier; Huynh, Tan-Dat; Malouch, Fadhel; Mancusi, Davide; Tsilanizara, Aime

    2017-09-01

After fission or fusion reactor shutdown, the activated structure emits decay photons. For maintenance operations, the radiation dose map must be established in the reactor building. Several calculation schemes have been developed to calculate the shutdown dose rate. Such schemes are widely used in fusion applications, most notably for the ITER tokamak. This paper presents the rigorous-two-steps scheme implemented at CEA. It is based on the TRIPOLI-4® Monte Carlo code and the inventory code MENDEL. The ITER shutdown dose rate benchmark has been carried out, and the results are in good agreement with those of the other participants.

  13. CERISE, a French radioprotection code, to assess the radiological impact and acceptance criteria of installations for material handling, and recycling or disposal of very low-level radioactive waste

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santucci, P.; Guetat, P.

    1993-12-31

This document describes the code CERISE (Code d'Evaluations Radiologiques Individuelles pour des Situations en Entreprise et dans l'Environnement). This code has been developed within the framework of European studies to establish acceptance criteria for very low-level radioactive waste and materials. The code is written in Fortran and runs on a PC. It calculates doses received via the different pathways: external exposure, ingestion, inhalation and skin contamination. Twenty basic scenarios, determined from previous studies, are already elaborated. Calculations establish the relation between surface, specific and/or total activities, and doses. Results can be expressed as doses for an average activity unit, or as average activity limits for a set of reference doses (defined for each scenario analyzed). In the latter case, the minimal activity values and the corresponding limiting scenarios are selected and summarized in a final table.
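
    The final selection step, taking the minimal activity limit over scenarios for a set of reference doses, can be sketched as below with purely illustrative scenario values:

```python
# Sketch of the limiting-scenario selection: for each scenario the activity
# limit is the reference dose divided by the dose per unit activity, and the
# minimal limit (with its scenario) is reported. All values are illustrative.
scenarios = {  # scenario -> (reference dose in mSv, dose per unit activity in mSv per Bq/g)
    "landfill worker": (0.30, 1.0e-3),
    "road construction": (0.10, 2.0e-4),
    "residence on site": (0.30, 2.5e-3),
}

limits = {name: ref / per_unit for name, (ref, per_unit) in scenarios.items()}
limiting = min(limits, key=limits.get)
print(limiting, round(limits[limiting], 1))  # residence on site 120.0
```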

  14. MO-F-CAMPUS-I-01: A System for Automatically Calculating Organ and Effective Dose for Fluoroscopically-Guided Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiong, Z; Vijayan, S; Rana, V

    2015-06-15

Purpose: A system was developed that automatically calculates the organ and effective dose for individual fluoroscopically-guided procedures using a log of the clinical exposure parameters. Methods: We have previously developed a dose tracking system (DTS) to provide a real-time color-coded 3D mapping of skin dose. This software produces a log file of all geometry and exposure parameters for every x-ray pulse during a procedure. The data in the log files are input into PCXMC, a Monte Carlo program that calculates organ and effective dose for projections and exposure parameters set by the user. We developed a MATLAB program to read data from the log files produced by the DTS and to automatically generate the definition files in the format used by PCXMC. The processing is done at the end of a procedure after all exposures are completed. Since there are thousands of exposure pulses with various parameters for fluoroscopy, DA and DSA and at various projections, the data for exposures with similar parameters are grouped prior to entry into PCXMC to reduce the number of Monte Carlo calculations that need to be performed. Results: The software developed automatically transfers data from the DTS log file to PCXMC and runs the program for each grouping of exposure pulses. When the doses from all exposure events are calculated, the doses for each organ and all effective doses are summed to obtain procedure totals. For a complicated interventional procedure, the calculations can be completed on a PC without manual intervention in less than 30 minutes, depending on the level of data grouping. Conclusion: This system allows organ dose to be calculated for individual procedures for every patient without tedious calculations or data entry, so that estimates of stochastic risk can be obtained in addition to the deterministic risk estimate provided by the DTS. Partial support from NIH grant R01EB002873 and Toshiba Medical Systems Corp.
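
    The grouping step can be sketched as a simple bin-by-parameters pass over the pulse log; the log fields and values here are hypothetical:

```python
from collections import Counter

# Sketch of the grouping step: pulses with similar acquisition parameters
# are binned so one Monte Carlo run per group suffices, with the resulting
# dose scaled by the group's pulse count. Fields and values are hypothetical.
pulses = [
    {"kvp": 80, "projection": "PA", "mode": "fluoro"},
    {"kvp": 80, "projection": "PA", "mode": "fluoro"},
    {"kvp": 80, "projection": "LAO30", "mode": "fluoro"},
    {"kvp": 70, "projection": "PA", "mode": "DSA"},
]

groups = Counter((p["kvp"], p["projection"], p["mode"]) for p in pulses)
for params, count in groups.items():
    print(params, "x", count)  # one Monte Carlo run per group, weighted by count
```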

  15. Simulations of MATROSHKA experiments at ISS using PHITS

    NASA Astrophysics Data System (ADS)

    Puchalska, Monika; Sihver, L.; Sato, T.; Berger, T.; Reitz, G.

    Concerns about the biological effects of space radiation are increasing rapidly due to the per-spective of long-duration manned missions, both in relation to the International Space Station (ISS) and to manned interplanetary missions to Moon and Mars in the future. As a prepara-tion for these long duration space missions it is important to ensure an excellent capability to evaluate the impact of space radiation on human health in order to secure the safety of the astronauts/cosmonauts and minimize their risks. It is therefore necessary to measure the radi-ation load on the personnel both inside and outside the space vehicles and certify that organ and tissue equivalent doses can be simulated as accurate as possible. In this paper we will present simulations using the three-dimensional Monte Carlo Particle and Heavy Ion Transport code System (PHITS) of long term dose measurements performed with the ESA supported ex-periment MATROSHKA (MTR), which is an anthropomorphic phantom containing over 6000 radiation detectors, mimicking a human head and torso. The MTR experiment, led by the German Aerospace Center (DLR), was launched in January 2004 and has measured the ab-sorbed dose from space radiation both inside and outside the ISS. In this paper preliminary comparisons of measured and calculated dose and organ doses in the MTR located outside the ISS will be presented. The results confirm previous calculations and measurements which indicate that PHITS is a suitable tool for estimations of dose received from cosmic radiation and when performing shielding design studies of spacecraft. Acknowledgement: The research leading to these results has received funding from the Euro-pean Commission in the frame of the FP7 HAMLET project (Project 218817).

  16. Sci—Thur AM: YIS - 03: irtGPUMCD: a new GPU-calculated dosimetry code for {sup 177}Lu-octreotate radionuclide therapy of neuroendocrine tumors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Montégiani, Jean-François; Gaudin, Émilie; Després, Philippe

    2014-08-15

In peptide receptor radionuclide therapy (PRRT), large inter-patient variability in absorbed radiation dose per administered activity mandates the use of individualized dosimetry to evaluate therapeutic efficacy and toxicity. We created a reliable GPU-calculated dosimetry code (irtGPUMCD) and assessed {sup 177}Lu-octreotate renal dosimetry in eight patients (4 cycles of approximately 7.4 GBq). irtGPUMCD was derived from a brachytherapy dosimetry code (bGPUMCD), which was adapted to {sup 177}Lu PRRT dosimetry. Serial quantitative single-photon emission computed tomography (SPECT) images were obtained from three SPECT/CT acquisitions performed at 4, 24 and 72 hours after {sup 177}Lu-octreotate administration, and registered with non-rigid deformation of CT volumes, to obtain the {sup 177}Lu-octreotate 4D quantitative biodistribution. Local energy deposition from the β disintegrations was assumed. Using Monte Carlo gamma photon transport, irtGPUMCD computed the dose rate at each time point. The average kidney absorbed dose was obtained from 1-cm{sup 3} VOI dose rate samples on each cortex, subjected to a biexponential curve fit. Integration of this time-dose rate curve yielded the renal absorbed dose. The mean renal dose per administered activity was 0.48 ± 0.13 Gy/GBq (range: 0.30–0.71 Gy/GBq). Comparison to another PRRT dosimetry code (VRAK: Voxelized Registration and Kinetics) showed fair agreement with irtGPUMCD (11.4 ± 6.8%, range: 3.3–26.2%). These results suggest the possibility of using the irtGPUMCD code to personalize administered activity in PRRT. This could improve clinical outcomes by maximizing per-cycle tumor doses without exceeding the tolerable renal dose.
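
    The last step, integrating the fitted biexponential time-dose rate curve to infinity, has a closed form: for a1·exp(-l1·t) + a2·exp(-l2·t) the integral is a1/l1 + a2/l2. A sketch with illustrative fitted parameters (not patient values):

```python
# Closed-form integral of a fitted biexponential time-dose rate curve.
# The "fitted" parameters below are illustrative assumptions.
a1, l1 = 10.0, 0.30   # mGy/h and 1/h, fast component (assumed)
a2, l2 = 15.0, 0.03   # mGy/h and 1/h, slow component (assumed)

# integral_0^inf (a1*exp(-l1*t) + a2*exp(-l2*t)) dt = a1/l1 + a2/l2
absorbed_dose_mgy = a1 / l1 + a2 / l2
print(round(absorbed_dose_mgy, 1))  # 533.3
```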

  17. NSR&D Program Fiscal Year 2015 Funded Research Stochastic Modeling of Radioactive Material Releases Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrus, Jason P.; Pope, Chad; Toston, Mary

    2016-12-01

    Nonreactor nuclear facilities operating under the approval authority of the U.S. Department of Energy use unmitigated hazard evaluations to determine whether potential radiological doses associated with design basis events challenge or exceed dose evaluation guidelines. Unmitigated design basis events that sufficiently challenge dose evaluation guidelines, or exceed the guidelines for members of the public or workers, merit selection of safety structures, systems, or components or other controls to prevent or mitigate the hazard. Idaho State University, in collaboration with Idaho National Laboratory, has developed a portable and simple-to-use software application called SODA (Stochastic Objective Decision-Aide) that stochastically calculates the radiation dose distribution associated with hypothetical radiological material release scenarios. Rather than producing a point estimate of the dose, SODA produces a dose distribution result to allow a deeper understanding of the dose potential. SODA allows users to select the distribution type and parameter values for all of the input variables used to perform the dose calculation. Users can also specify custom distributions through a user-defined distribution option. SODA then randomly samples each distribution input variable and calculates the overall resulting dose distribution. In cases where an input variable distribution is unknown, a traditional single point value can be used. SODA, developed using the MATLAB coding framework, has a graphical user interface and can be installed on both Windows and Mac computers. SODA is a standalone software application and does not require MATLAB to function. SODA provides improved risk understanding, leading to better informed decision making associated with establishing nuclear facility material-at-risk limits and safety structure, system, or component selection. 
It is important to note that SODA does not replace or compete with codes such as MACCS or RSAC; rather, it is viewed as an easy-to-use supplemental tool to help improve risk understanding and support better informed decisions. The SODA development project was funded through a grant from the DOE Nuclear Safety Research and Development Program.
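The sampling scheme the abstract describes can be sketched as a Monte Carlo loop over a source-term dose formula. The sketch below uses the conventional five-factor source term (MAR × ARF × RF × LPF) combined with dispersion, breathing rate, and a dose coefficient; the specific variables, distribution choices, and parameter values are illustrative assumptions, not SODA's actual defaults.

```python
import random

random.seed(1)

def sample_dose():
    # Five-factor source-term style calculation; every distribution and
    # constant here is an invented placeholder for demonstration.
    mar = 2.0                                  # material at risk, g (point value)
    arf = random.uniform(1e-4, 1e-3)           # airborne release fraction
    rf = random.uniform(0.1, 1.0)              # respirable fraction
    lpf = 1.0                                  # leak path factor (point value)
    chi_q = random.lognormvariate(-13.0, 0.8)  # atmospheric dispersion, s/m^3
    br = 3.3e-4                                # breathing rate, m^3/s
    dcf = 1.1e2                                # dose per unit inhaled, Sv/g (illustrative)
    return mar * arf * rf * lpf * chi_q * br * dcf

# Sample the inputs many times and report the resulting dose *distribution*
# rather than a single point estimate.
doses = sorted(sample_dose() for _ in range(10000))
median = doses[len(doses) // 2]
p95 = doses[int(0.95 * len(doses))]
print(f"median {median:.2e} Sv, 95th percentile {p95:.2e} Sv")
```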

  19. Poster — Thur Eve — 14: Improving Tissue Segmentation for Monte Carlo Dose Calculation using DECT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Di Salvio, A.; Bedwani, S.; Carrier, J-F.

    2014-08-15

    Purpose: To improve Monte Carlo dose calculation accuracy through a new tissue segmentation technique with dual-energy CT (DECT). Methods: Electron density (ED) and effective atomic number (EAN) can be extracted directly from DECT data with a stoichiometric calibration method. Images are acquired from Monte Carlo CT projections using the user code egs-cbct and reconstructed using an FDK backprojection algorithm. Calibration is performed using projections of a numerical RMI phantom. A weighted-parameter algorithm then uses both EAN and ED to assign materials to voxels from the DECT simulated images. This new method is compared to a standard tissue characterization from single-energy CT (SECT) data using a segmented calibrated Hounsfield unit (HU) to ED curve. Both methods are compared to the reference numerical head phantom. Monte Carlo simulations on uniform phantoms of different tissues using dosxyz-nrc show discrepancies in depth-dose distributions. Results: Both SECT and DECT segmentation methods show similar performance assigning soft tissues. Performance is, however, improved with DECT in regions of higher density, such as bone, where it assigns materials correctly 8% more often than segmentation with SECT, considering the same set of tissues and simulated clinical CT images, i.e., including noise and reconstruction artifacts. Furthermore, Monte Carlo results indicate that kV photon beam depth-dose distributions can double between two tissues of density higher than muscle. Conclusions: Direct acquisition of ED and the added information of EAN with DECT data improve tissue segmentation and increase the accuracy of Monte Carlo dose calculation in kV photon beams.
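The material-assignment step can be illustrated as a nearest-reference-tissue search in (ED, EAN) space. This is a sketch under stated assumptions: the reference tissue values, the weighted Euclidean distance, and the weights are invented for illustration and do not reproduce the paper's weighted-parameter algorithm.

```python
# Reference tissues: (relative electron density, effective atomic number).
# Values are approximate textbook-style numbers used purely for illustration.
TISSUES = {
    "lung": (0.26, 7.6),
    "adipose": (0.95, 6.4),
    "muscle": (1.04, 7.6),
    "cortical bone": (1.78, 13.6),
}

def assign_material(ed, ean, w_ed=1.0, w_ean=0.2):
    # Weighted squared distance; w_ean down-weights EAN because its numerical
    # range is wider than ED's (an assumed, not published, choice).
    def dist(ref):
        ref_ed, ref_ean = ref
        return w_ed * (ed - ref_ed) ** 2 + w_ean * (ean - ref_ean) ** 2
    return min(TISSUES, key=lambda name: dist(TISSUES[name]))

print(assign_material(1.70, 12.9))  # a dense, high-EAN voxel -> cortical bone
print(assign_material(1.00, 7.5))   # a soft-tissue voxel -> muscle
```

The point of the DECT approach is visible here: with ED alone, adipose and muscle voxels are nearly indistinguishable, while the EAN coordinate separates them.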

  20. Monte Carlo calculation of dose rate conversion factors for external exposure to photon emitters in soil

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clovas, A.; Zanthos, S.; Antonopoulos-Domis, M.

    2000-03-01

    The dose rate conversion factors DCF (absorbed dose rate in air per unit activity per unit of soil mass, nGy h⁻¹ per Bq kg⁻¹) are calculated 1 m above ground for photon emitters of natural radionuclides uniformly distributed in the soil. Three Monte Carlo codes are used: (1) the MCNP code of Los Alamos; (2) the GEANT code of CERN; and (3) a Monte Carlo code developed in the Nuclear Technology Laboratory of the Aristotle University of Thessaloniki. The accuracy of the Monte Carlo results is tested by comparing the unscattered flux obtained by the three Monte Carlo codes with an independent straightforward calculation. All codes, and particularly MCNP, accurately calculate the absorbed dose rate in air due to the unscattered radiation. For the total radiation (unscattered plus scattered), the DCF values calculated by the three codes are in very good agreement with one another. Comparison between these results and results deduced previously by other authors indicates good agreement (less than 15% difference) for photon energies above 1,500 keV. In contrast, the agreement is not as good (differences of 20–30%) for low-energy photons.

  1. Development of probabilistic internal dosimetry computer code

    NASA Astrophysics Data System (ADS)

    Noh, Siwan; Kwon, Tae-Eun; Lee, Jai-Ki

    2017-02-01

    Internal radiation dose assessment involves biokinetic models, the corresponding parameters, measured data, and many assumptions. Every component considered in the internal dose assessment has its own uncertainty, which is propagated into the intake activity and internal dose estimates. For research or scientific purposes, and for retrospective dose reconstruction for accidents in workplaces handling large quantities of unsealed radionuclides, such as nuclear power plants, nuclear fuel cycle facilities, and facilities in which nuclear medicine is practiced, a quantitative uncertainty assessment of the internal dose is often required. However, no calculation tools or computer codes that incorporate all the relevant processes and their corresponding uncertainties, i.e., from the measured data to the committed dose, are available. Thus, the objective of the present study is to develop an integrated probabilistic internal-dose-assessment computer code. First, the uncertainty components in internal dosimetry are identified, and quantitative uncertainty data are collected. Then, an uncertainty database is established for each component. To propagate these uncertainties in an internal dose assessment, a probabilistic internal-dose-assessment system employing Bayesian and Monte Carlo methods was designed. Based on this system, we developed a probabilistic internal-dose-assessment code in MATLAB to estimate dose distributions from measured data with uncertainty. Using the developed code, we calculated the internal dose distribution and statistical values (e.g., the 2.5th, 5th, median, 95th, and 97.5th percentiles) for three sample scenarios. On the basis of the distributions, we performed a sensitivity analysis to determine the influence of each component on the resulting dose, in order to identify the major contributor to the uncertainty in a bioassay. The results of this study can be applied to various situations. 
In cases of severe internal exposure, the probability of causation of a deterministic health effect can be derived from the dose distribution, and a high statistical value (e.g., the 95th percentile of the distribution) can be used to determine the appropriate intervention. The distribution-based sensitivity analysis can also be used to quantify the contribution of each factor to the dose uncertainty, which is essential information for reducing and optimizing the uncertainty in the internal dose assessment. Therefore, the present study can contribute to retrospective dose assessment for accidental internal exposure scenarios, as well as to internal dose monitoring optimization and uncertainty reduction.
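The measured-data-to-dose propagation described above can be sketched as a Monte Carlo loop: perturb the measurement and the biokinetic retention function, back-calculate the intake, and apply a dose coefficient. Everything numerical below (the one-compartment retention model, the effective half-life, the dose coefficient, and the lognormal uncertainty magnitudes) is an illustrative stand-in for the paper's uncertainty database.

```python
import math
import random

random.seed(7)

# One-compartment retention: m(t) = exp(-lam * t) is the fraction of an acute
# intake remaining in the body at measurement time t (illustrative model).
T_EFF = 50.0                 # effective half-life, d (invented)
LAM = math.log(2) / T_EFF
E_COEFF = 1.0e-8             # committed effective dose per unit intake, Sv/Bq (invented)

def sample_committed_dose(measured_bq=500.0, t_meas=10.0):
    m = math.exp(-LAM * t_meas)
    meas = measured_bq * random.lognormvariate(0.0, 0.2)  # measurement error
    m_pert = m * random.lognormvariate(0.0, 0.3)          # biokinetic uncertainty
    intake = meas / m_pert                                 # back-calculated intake, Bq
    return intake * E_COEFF                                # committed dose, Sv

doses = sorted(sample_committed_dose() for _ in range(20000))
pct = lambda p: doses[int(p / 100 * len(doses))]
print(f"2.5th {pct(2.5):.2e}  median {pct(50):.2e}  97.5th {pct(97.5):.2e} Sv")
```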

  2. Cancer risk coefficient for patient undergoing kyphoplasty surgery using Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Santos, Felipe A.; Santos, William S.; Galeano, Diego C.; Cavalcante, Fernanda R.; Silva, Ademir X.; Souza, Susana O.; Júnior, Albérico B. Carvalho

    2017-11-01

    Kyphoplasty surgery is widely used for pain relief in patients with vertebral compression fracture (VCF). For this surgery, an X-ray emitter that provides real-time imaging is employed to guide the medical instruments and the surgical cement used to fill and strengthen the vertebra. Equivalent and effective doses related to such high-temporal-resolution equipment have been studied to assess the damage and, more recently, the cancer risk. This study aims at estimating the risk of cancer in individual organs and the risk of lethal cancer for patients undergoing kyphoplasty surgery. A virtual scenario was prepared using the MCNPX code and a pair of UF family phantoms. Two projections, with seven tube voltages each, were simulated. The organs in the abdominal region were those with the highest cancer risk, because they receive the primary beam. The risk of lethal cancer is on average 20% higher in the AP projection than in the LL projection.

  3. A new dynamical atmospheric ionizing radiation (AIR) model for epidemiological studies

    NASA Technical Reports Server (NTRS)

    De Angelis, G.; Clem, J. M.; Goldhagen, P. E.; Wilson, J. W.

    2003-01-01

    A new Atmospheric Ionizing Radiation (AIR) model is currently being developed for use in radiation dose evaluation in epidemiological studies targeted at atmospheric flight personnel such as civilian airline crewmembers. The model will allow computing values of biologically relevant parameters, e.g., dose equivalent and effective dose, for individual flights from 1945 onward. Each flight is described by its actual three-dimensional flight profile, i.e., geographic coordinates and altitude varying with time. Solar-modulated primary particles are filtered with a new analytical, fully angular-dependent geomagnetic cutoff rigidity model, as a function of latitude, longitude, arrival direction, altitude and time. The particle transport results have been obtained with a technique based on the three-dimensional Monte Carlo transport code FLUKA, with a special procedure to deal with HZE particles. Particle fluxes are transformed into dose-related quantities and then integrated along the flight path to obtain the overall flight dose. Preliminary validations of the particle transport technique using data from the AIR Project ER-2 flight campaign of measurements are encouraging. Future efforts will deal with modeling of the effects of the aircraft structure as well as inclusion of solar particle events. Published by Elsevier Ltd on behalf of COSPAR.
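The final step described above, integrating dose rate along a time-varying flight profile, can be sketched as a trapezoidal sum over waypoints. The dose-rate function below is a toy stand-in (the AIR model derives it from FLUKA transport), and the waypoints are invented.

```python
def dose_rate_uSv_per_h(alt_km, cutoff_GV):
    # Toy model: rate grows with altitude and shrinks at low geomagnetic
    # latitude (high vertical cutoff rigidity). Purely illustrative.
    return 0.04 * alt_km ** 1.5 / (1.0 + 0.3 * cutoff_GV)

# Flight profile waypoints: (time h, altitude km, vertical cutoff rigidity GV).
# These values are invented, not from an actual flight.
profile = [(0.0, 0.0, 3.0), (0.5, 10.5, 3.5), (7.5, 11.3, 5.0), (8.0, 0.0, 6.0)]

# Trapezoidal rule over each leg of the flight.
dose = 0.0
for (t0, a0, c0), (t1, a1, c1) in zip(profile, profile[1:]):
    r0 = dose_rate_uSv_per_h(a0, c0)
    r1 = dose_rate_uSv_per_h(a1, c1)
    dose += 0.5 * (r0 + r1) * (t1 - t0)

print(f"{dose:.1f} uSv for the flight")
```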

  4. Neutron-Irradiated Samples as Test Materials for MPEX

    DOE PAGES

    Ellis, Ronald James; Rapp, Juergen

    2015-10-09

    Plasma Material Interaction (PMI) is a major concern in fusion reactor design and analysis. The Material-Plasma Exposure eXperiment (MPEX) will explore PMI under fusion reactor plasma conditions. Samples with accumulated displacements-per-atom (DPA) damage produced by fast neutron irradiation in the High Flux Isotope Reactor (HFIR) at Oak Ridge National Laboratory (ORNL) will be studied in the MPEX facility. This paper presents assessments of the calculated induced radioactivity and resulting radiation dose rates for a variety of potential fusion reactor plasma-facing materials (such as tungsten). The scientific code packages MCNP and SCALE were used to simulate irradiation of the samples in HFIR, including the generation and depletion of nuclides in the material and the subsequent composition, activity levels, gamma radiation fields, and resultant dose rates as a function of cooling time. A challenge of the MPEX project is to minimize the radioactive inventory in the preparation of the samples and the sample dose rates for inclusion in the MPEX facility.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorissen, BL; Giantsoudi, D; Unkelbach, J

    Purpose: Cell survival experiments suggest that the relative biological effectiveness (RBE) of proton beams depends on linear energy transfer (LET), leading to higher RBE near the end of range. With intensity-modulated proton therapy (IMPT), multiple treatment plans that differ in the dose contribution per field may yield a similar physical dose distribution, but the RBE-weighted dose distribution may be disparate. RBE models currently do not have the required predictive power to be included in an optimization model, due to the variations in experimental data. We propose an LET-based planning method that guides IMPT optimization models towards plans with reduced RBE-weighted dose in surrounding organs at risk (OARs) compared to inverse planning based on physical dose alone. Methods: Optimization models for physical dose are extended with a term for dose times LET (doseLET). Monte Carlo code is used to generate the physical dose and doseLET distribution of each individual pencil beam. The method is demonstrated for an atypical meningioma patient where the target volume abuts the brainstem and partially overlaps with the optic nerve. Results: A reference plan optimized on physical dose alone yields high doseLET values in parts of the brainstem and optic nerve. Minimizing doseLET in these critical structures as an additional planning goal reduces the risk of high RBE-weighted dose. The resulting treatment plan avoids using the distal fall-off of the Bragg peaks to shape the dose distribution in front of critical structures. The maximum dose in the OARs, evaluated with RBE models from the literature, is reduced by 8–14% with our method compared to conventional planning. Conclusion: LET-based inverse planning for IMPT offers the ability to reduce the RBE-weighted dose in OARs without sacrificing target dose. This project was in part supported by NCI U19 CA 21239.
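The extended optimization model, a physical-dose objective plus a penalty on doseLET in critical structures, can be sketched on a toy problem. The influence matrices below (dose and dose×LET per unit beamlet weight) are invented numbers standing in for the Monte Carlo pencil-beam data; the solver is a plain projected gradient descent, not the paper's optimizer.

```python
# Toy case: 3 voxels (voxels 0-1 are target, voxel 2 is an OAR), 2 beamlets.
D = [[1.0, 0.2],   # dose to voxel i per unit weight of beamlet j (invented)
     [0.3, 1.0],
     [0.1, 0.6]]
DL = [[2.0, 0.5],  # dose*LET per unit weight; beamlet 1's distal edge is
      [0.8, 2.5],  # assumed to sit in the OAR, hence its large entry below
      [0.4, 4.0]]
PRESC = [1.0, 1.0]  # prescribed target dose
W_LET = 0.05        # penalty weight on OAR dose*LET (arbitrary)

def objective(w):
    # Quadratic target-dose misfit plus a linear doseLET penalty in the OAR.
    f = sum((sum(D[i][j] * w[j] for j in range(2)) - PRESC[i]) ** 2
            for i in range(2))
    f += W_LET * sum(DL[2][j] * w[j] for j in range(2))
    return f

# Projected gradient descent with a forward-difference gradient; weights are
# clipped at zero because beamlet intensities cannot be negative.
w = [0.5, 0.5]
for _ in range(2000):
    g = []
    for j in range(2):
        wp = list(w)
        wp[j] += 1e-6
        g.append((objective(wp) - objective(w)) / 1e-6)
    w = [max(0.0, wj - 0.05 * gj) for wj, gj in zip(w, g)]

print([round(x, 3) for x in w], round(objective(w), 4))
```

The penalty steers weight away from the beamlet whose high-LET distal edge falls in the OAR, mirroring the paper's observation that the optimizer stops using Bragg-peak fall-off to shape dose in front of critical structures.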

  6. Comparison of fluence-to-dose conversion coefficients for deuterons, tritons and helions.

    PubMed

    Copeland, Kyle; Friedberg, Wallace; Sato, Tatsuhiko; Niita, Koji

    2012-02-01

    Secondary radiation in aircraft and spacecraft includes deuterons, tritons and helions. Two sets of fluence-to-effective dose conversion coefficients for isotropic exposure to these particles were compared: one used the particle and heavy ion transport code system (PHITS) radiation transport code coupled with the International Commission on Radiological Protection (ICRP) reference phantoms (PHITS-ICRP) and the other the Monte Carlo N-Particle eXtended (MCNPX) radiation transport code coupled with modified BodyBuilder™ phantoms (MCNPX-BB). Also, two sets of fluence-to-effective dose equivalent conversion coefficients calculated using the PHITS-ICRP combination were compared: one used quality factors based on linear energy transfer; the other used quality factors based on lineal energy (y). Finally, PHITS-ICRP effective dose coefficients were compared with PHITS-ICRP effective dose equivalent coefficients. The PHITS-ICRP and MCNPX-BB effective dose coefficients were similar, except at high energies, where MCNPX-BB coefficients were higher. For helions, at most energies effective dose coefficients were much greater than effective dose equivalent coefficients. For deuterons and tritons, coefficients were similar when their radiation weighting factor was set to 2.

  7. An Update of Recent Phits Code

    NASA Astrophysics Data System (ADS)

    Sihver, Lembit; Sato, Tatsuhiko; Niita, Koji; Iwase, Hiroshi; Iwamoto, Yosuke; Matsuda, Norihiro; Nakashima, Hiroshi; Sakamoto, Yukio; Gustafsson, Katarina; Mancusi, Davide

    We will first present the current status of the General-Purpose Particle and Heavy-Ion Transport code System (PHITS). In particular, we will describe benchmarking of calculated cross sections against measurements; we will introduce a relativistically covariant version of JQMD, called R-JQMD, that features an improved ground-state initialization algorithm; and we will show heavy-ion charge-changing cross sections simulated with R-JQMD and compare them to experimental data and to results predicted by the JQMD model. We will also show calculations of the dose received by aircrews and personnel in space from cosmic radiation. In recent years, many countries have issued regulations or recommendations setting annual dose limits for aircrews. Since estimation of cosmic-ray spectra in the atmosphere is an essential issue for the evaluation of aviation doses, we have calculated these spectra using PHITS. The accuracy of the simulation, which has been well verified by experimental data taken under various conditions, will be presented together with software called EXPACS-V, which can visualize the cosmic-ray dose rates at ground level or at a certain altitude on the map of Google Earth, using the PHITS-based Analytical Radiation Model in the Atmosphere (PARMA). PARMA can instantaneously calculate the cosmic-ray spectra anywhere in the world by specifying the atmospheric depth, the vertical cutoff rigidity and the force-field potential. To examine the applicability of PHITS to shielding design in space, the absorbed doses in a tissue-equivalent water phantom inside an imaginary space vessel have been estimated for different shielding materials of different thicknesses. The results confirm previous results indicating that PHITS is a suitable tool for shielding design studies of spacecraft. 
Finally, we have used PHITS for calculations of depth-dose distributions in MATROSHKA, an ESA project dedicated to determining the radiation load on astronauts inside and outside the International Space Station (ISS).
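The force-field potential mentioned above enters analytical cosmic-ray models through the standard force-field approximation, in which the modulated spectrum follows from the local interstellar spectrum (LIS) shifted by the modulation potential. The sketch below shows that relation for protons; the LIS power law is a crude invented stand-in, not PARMA's fitted parameterization.

```python
E0 = 938.0  # proton rest energy, MeV

def j_lis(E):
    # Crude power-law placeholder for the local interstellar proton spectrum.
    return 1.0e4 * (E + E0) ** -2.7

def j_modulated(E, phi):
    # Force-field approximation: flux at kinetic energy E (MeV) for
    # modulation potential phi (MV, ~MeV for protons).
    return j_lis(E + phi) * (E * (E + 2 * E0)) / ((E + phi) * (E + phi + 2 * E0))

# Stronger solar modulation (larger phi, i.e. solar maximum) suppresses the
# low-energy galactic cosmic-ray flux reaching the atmosphere.
for phi in (400.0, 700.0, 1000.0):
    print(phi, f"{j_modulated(100.0, phi):.3e}")
```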

  8. Comparison of Organ Dosimetry for Astronaut Phantoms: Earth-Based vs. Microgravity-Based Anthropometry and Body Positioning

    NASA Technical Reports Server (NTRS)

    VanBaalen, Mary; Bahadon, Amir; Shavers, Mark; Semones, Edward

    2011-01-01

    The purpose of this study is to use NASA radiation transport codes to compare astronaut organ dose equivalents resulting from solar particle events (SPE), geomagnetically trapped protons, and free-space galactic cosmic rays (GCR) using phantom models representing Earth-based and microgravity-based anthropometry and positioning. Methods: The University of Florida hybrid adult phantoms were scaled to represent male and female astronauts with 5th, 50th, and 95th percentile heights and weights as measured on Earth. Another set of scaled phantoms, incorporating microgravity-induced changes such as spinal lengthening, leg volume loss, and the assumption of the neutral body position, was also created. A ray-tracer was created and used to generate body self-shielding distributions for dose points within a voxelized phantom under isotropic irradiation conditions, which closely approximates the free-space radiation environment. Simplified external shielding consisting of an aluminum spherical shell was used to consider the influence of a spacesuit or shielding of a hull. These distributions were combined with depth-dose distributions generated from the NASA radiation transport codes BRYNTRN (SPE and trapped protons) and HZETRN (GCR) to yield dose equivalent. Many points were sampled per organ. Results: The organ dose equivalent rates were on the order of 1.5–2.5 mSv per day for GCR (1977 solar minimum) and 0.4–0.8 mSv per day for trapped proton irradiation with shielding of 2 g cm⁻² aluminum equivalent. The organ dose equivalents for SPE irradiation varied considerably, with the skin and eye lens having the highest organ dose equivalents and deep-seated organs, such as the bladder, liver, and stomach, having the lowest. Conclusions: The greatest differences between the Earth-based and microgravity-based phantoms are observed for smaller ray thicknesses, since the most drastic changes involved limb repositioning and not overall phantom size. 
Improved self-shielding models reduce the overall uncertainty in organ dosimetry for mission-risk projections and assessments for astronauts.
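The core numerical step described above, folding a ray-traced self-shielding distribution with a depth-dose curve to obtain a point dose, can be sketched as a table lookup averaged over rays. The depth-dose table and ray depths below are invented placeholders, not BRYNTRN/HZETRN output.

```python
import bisect

# Illustrative depth-dose table: dose equivalent rate versus areal density.
depths = [0.0, 1.0, 2.0, 5.0, 10.0, 20.0, 50.0]   # g/cm^2
dose_eq = [3.0, 2.4, 2.0, 1.3, 0.8, 0.45, 0.12]   # mSv/day

def depth_dose(x):
    # Linear interpolation in the table; values beyond the last entry clamp.
    if x >= depths[-1]:
        return dose_eq[-1]
    i = bisect.bisect_right(depths, x)
    t = (x - depths[i - 1]) / (depths[i] - depths[i - 1])
    return dose_eq[i - 1] + t * (dose_eq[i] - dose_eq[i - 1])

# Equal-solid-angle ray depths for one dose point (body self-shielding plus an
# aluminum-shell hull contribution); these numbers are invented.
ray_depths = [2.5, 4.0, 7.0, 12.0, 3.0, 18.0, 30.0, 5.5]
point_dose = sum(depth_dose(d) for d in ray_depths) / len(ray_depths)
print(f"{point_dose:.2f} mSv/day")
```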

  9. Inter-comparison of Dose Distributions Calculated by FLUKA, GEANT4, MCNP, and PHITS for Proton Therapy

    NASA Astrophysics Data System (ADS)

    Yang, Zi-Yi; Tsai, Pi-En; Lee, Shao-Chun; Liu, Yen-Chiang; Chen, Chin-Cheng; Sato, Tatsuhiko; Sheu, Rong-Jiun

    2017-09-01

    The dose distributions from proton pencil beam scanning were calculated with FLUKA, GEANT4, MCNP, and PHITS in order to investigate their applicability to proton radiotherapy. The first case studied was the integrated depth dose curves (IDDCs) from a 100-MeV and a 226-MeV proton pencil beam impinging on a water phantom. The calculated IDDCs agree with each other as long as each code employs 75 eV for the ionization potential of water. The second case considered a condition similar to the first, but with proton energies in a Gaussian distribution. Comparison to measurement indicates that the inter-code differences may be due not only to different stopping powers but also to the nuclear physics models. How the physics parameter settings affect the computation time was also discussed. In the third case, the applicability of each code to pencil beam scanning was confirmed by delivering a uniform volumetric dose distribution based on the treatment plan; the results showed general agreement among the codes, the treatment plan, and the measurement, except for some deviations in the penumbra region. This study demonstrates that the selected codes are all capable of performing dose calculations for therapeutic scanning proton beams with proper physics settings.

  10. Comparison of a 3-D multi-group SN particle transport code with Monte Carlo for intracavitary brachytherapy of the cervix uteri.

    PubMed

    Gifford, Kent A; Wareing, Todd A; Failla, Gregory; Horton, John L; Eifel, Patricia J; Mourtada, Firas

    2009-12-03

    A patient dose distribution was calculated by a 3D multi-group SN particle transport code for intracavitary brachytherapy of the cervix uteri and compared to previously published Monte Carlo results. A Cs-137 LDR intracavitary brachytherapy CT data set was chosen from our clinical database. MCNPX version 2.5.c was used to calculate the dose distribution. A 3D multi-group SN particle transport code, Attila version 6.1.1, was used to simulate the same patient. Each patient applicator was built in SolidWorks, a mechanical design package, and then assembled with a coordinate transformation and rotation for the patient. The SolidWorks-exported applicator geometry was imported into Attila for calculation. Dose matrices were overlaid on the patient CT data set. Dose volume histograms and point doses were compared. The MCNPX calculation required 14.8 hours, whereas the Attila calculation required 22.2 minutes on a 1.8 GHz AMD Opteron CPU. Agreement between Attila and MCNPX dose calculations at the ICRU 38 points was within ±3%. Calculated doses to the 2 cc and 5 cc volumes of highest dose differed by no more than ±1.1% between the two codes. Dose and DVH overlays agreed well qualitatively. Attila can calculate dose accurately and efficiently for this Cs-137 CT-based patient geometry. Our data showed that a three-group cross-section set is adequate for Cs-137 computations. Future work is aimed at implementing an optimized version of Attila for radiotherapy calculations.
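The comparison metrics used above, cumulative dose-volume histograms and dose to the hottest fixed volume (e.g. the 2 cc and 5 cc volumes), are straightforward to compute from a voxel dose array. The sketch below uses an invented ten-voxel dose array with 1 cc voxels; it is a generic illustration of the metrics, not the authors' analysis code.

```python
def dvh(doses, bins=10):
    # Cumulative DVH: fraction of volume receiving at least each dose level.
    dmax = max(doses)
    edges = [dmax * i / bins for i in range(bins + 1)]
    return [(d, sum(1 for x in doses if x >= d) / len(doses)) for d in edges]

def dose_to_hottest_volume(doses, voxel_cc, volume_cc):
    # Minimum dose within the hottest N voxels that make up the volume
    # (a simple voxel-count version of metrics like D_2cc).
    n = max(1, int(volume_cc / voxel_cc))
    hottest = sorted(doses, reverse=True)[:n]
    return min(hottest)

doses = [0.5, 1.2, 2.8, 3.1, 4.0, 4.4, 5.2, 6.0, 6.1, 7.3]  # Gy, 1 cc voxels
print(dose_to_hottest_volume(doses, voxel_cc=1.0, volume_cc=2.0))  # -> 6.1
print(dvh(doses)[0])  # -> (0.0, 1.0): the whole volume gets at least 0 Gy
```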

  11. Comparison of a 3D multi‐group SN particle transport code with Monte Carlo for intracavitary brachytherapy of the cervix uteri

    PubMed Central

    Wareing, Todd A.; Failla, Gregory; Horton, John L.; Eifel, Patricia J.; Mourtada, Firas

    2009-01-01

    A patient dose distribution was calculated by a 3D multi-group SN particle transport code for intracavitary brachytherapy of the cervix uteri and compared to previously published Monte Carlo results. A Cs-137 LDR intracavitary brachytherapy CT data set was chosen from our clinical database. MCNPX version 2.5.c was used to calculate the dose distribution. A 3D multi-group SN particle transport code, Attila version 6.1.1, was used to simulate the same patient. Each patient applicator was built in SolidWorks, a mechanical design package, and then assembled with a coordinate transformation and rotation for the patient. The SolidWorks-exported applicator geometry was imported into Attila for calculation. Dose matrices were overlaid on the patient CT data set. Dose volume histograms and point doses were compared. The MCNPX calculation required 14.8 hours, whereas the Attila calculation required 22.2 minutes on a 1.8 GHz AMD Opteron CPU. Agreement between Attila and MCNPX dose calculations at the ICRU 38 points was within ±3%. Calculated doses to the 2 cc and 5 cc volumes of highest dose differed by no more than ±1.1% between the two codes. Dose and DVH overlays agreed well qualitatively. Attila can calculate dose accurately and efficiently for this Cs-137 CT-based patient geometry. Our data showed that a three-group cross-section set is adequate for Cs-137 computations. Future work is aimed at implementing an optimized version of Attila for radiotherapy calculations. PACS number: 87.53.Jw

  12. Error control techniques for satellite and space communications

    NASA Technical Reports Server (NTRS)

    Costello, Daniel J., Jr.

    1995-01-01

    This report focuses on the results obtained during the PI's recent sabbatical leave at the Swiss Federal Institute of Technology (ETH) in Zurich, Switzerland, from January 1, 1995 through June 30, 1995. Two projects investigated various properties of TURBO codes, a new form of concatenated coding that achieves near channel capacity performance at moderate bit error rates. The performance of TURBO codes is explained in terms of the code's distance spectrum. These results explain both the near capacity performance of the TURBO codes and the observed 'error floor' for moderate and high signal-to-noise ratios (SNR's). A semester project, entitled 'The Realization of the Turbo-Coding System,' involved a thorough simulation study of the performance of TURBO codes and verified the results claimed by previous authors. A copy of the final report for this project is included as Appendix A. A diploma project, entitled 'On the Free Distance of Turbo Codes and Related Product Codes,' includes an analysis of TURBO codes and an explanation for their remarkable performance. A copy of the final report for this project is included as Appendix B.
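The report's central point, that TURBO code performance (including the error floor) is explained by the code's distance spectrum, is usually quantified with the union bound on bit error rate over BPSK/AWGN. The sketch below evaluates that bound for an invented toy spectrum; the multiplicities are not those of any actual TURBO code.

```python
import math

def Q(x):
    # Gaussian tail probability via the complementary error function.
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def ber_bound(spectrum, N, rate, ebno_db):
    # Union bound: P_b <= sum over distances d of (information-bit
    # multiplicity at d / N) * Q(sqrt(2 * d * R * Eb/N0)).
    ebno = 10 ** (ebno_db / 10.0)
    return sum(w / N * Q(math.sqrt(2.0 * d * rate * ebno))
               for d, w in spectrum.items())

# Invented toy spectrum {distance: total information-bit weight}. The sparse
# low-distance terms dominate at high SNR, producing the observed error floor.
spectrum = {6: 3.0, 8: 10.0, 10: 40.0}
for snr_db in (1.0, 3.0, 5.0):
    print(snr_db, f"{ber_bound(spectrum, N=1000, rate=0.5, ebno_db=snr_db):.3e}")
```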

  13. Shielding evaluation for solar particle events using MCNPX, PHITS and OLTARIS codes

    NASA Astrophysics Data System (ADS)

    Aghara, S. K.; Sriprisan, S. I.; Singleterry, R. C.; Sato, T.

    2015-01-01

    Detailed analyses of solar particle events (SPEs) were performed to calculate primary and secondary particle spectra behind aluminum at various thicknesses in water. The simulations were based on the Monte Carlo (MC) radiation transport codes MCNPX 2.7.0 and PHITS 2.64, and on the space radiation analysis website OLTARIS (On-Line Tool for the Assessment of Radiation in Space), version 3.4, which uses the deterministic code HZETRN for transport. The study investigates the impact of SPE spectra transported through a 10 or 20 g/cm² Al shield followed by a 30 g/cm² water slab. Four historical SPEs were selected and used as input source spectra; particle differential spectra for protons, neutrons, and photons are presented, along with the total particle fluence as a function of depth. In addition to particle flux, dose and dose equivalent values are calculated and compared between the codes and with other published results. Overall, the particle fluence spectra from all three codes show good agreement, with the MC codes agreeing more closely with each other than with the OLTARIS results. The neutron fluence from OLTARIS is lower than that from the MC codes at lower energies (E < 100 MeV). Based on mean-square-difference analysis, the results from MCNPX and PHITS agree better for fluence, dose and dose equivalent when compared to the OLTARIS results.

  14. Space Radiation Dosimetry to Evaluate the Effect of Polyethylene Shielding in the Russian Segment of the International Space Station

    NASA Astrophysics Data System (ADS)

    Nagamatsu, Aiko; Casolino, Marco; Larsson, Oscar; Ito, Tsuyoshi; Yasuda, Nakahiro; Kitajo, Keiichi; Shimada, Ken; Takeda, Kazuo; Tsuda, Shuichi; Sato, Tatsuhiko

    As a part of the Alteino Long Term Cosmic Ray measurements on board the International Space Station (ALTCRISS) project, the shielding effect of polyethylene (PE) was evaluated in the Russian segment of the ISS, using active and passive dosimeter systems with and without PE shielding. For the passive dosimeter system, PADLES (Passive Dosimeter for Life-Science and Experiments in Space) was used, which consists of thermoluminescent dosimeters (TLDs) and CR-39 plastic nuclear track detectors (PNTDs) attached to a radiator. Not only CR-39 PNTD itself but also a tissue-equivalent material, NAN-JAERI, was employed as the radiator, in order to investigate whether CR-39 PNTD can be used as a surrogate for tissue-equivalent material in space dosimetry. The agreement between the doses measured by PADLES with the CR-39 PNTD and NAN-JAERI radiators was quite satisfactory, indicating that the tissue-equivalent dose can be measured by conventional PADLES even though CR-39 PNTD is not a perfect tissue-equivalent material. It was found that the shielding effect of PE varies with location inside the spacecraft: it became less significant with an increase of the mean thickness of the wall. This tendency was also verified by Monte Carlo simulation using the PHITS code. Throughout the flight experiments, in a series of four phases of the ALTCRISS project from December 2005 to October 2007, we assessed the ability of PE to decrease radiation doses in low Earth orbit (LEO).
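Passive TLD/CR-39 packages like PADLES convert a measured LET spectrum into dose equivalent by weighting absorbed dose with the ICRP 60 quality factor Q(L). The Q(L) function below is the standard ICRP 60 definition; the example LET spectrum is invented for illustration and is not PADLES flight data.

```python
def q_factor(L):
    # ICRP Publication 60 quality factor; L is unrestricted LET in water,
    # in keV/um.
    if L < 10.0:
        return 1.0
    if L <= 100.0:
        return 0.32 * L - 2.2
    return 300.0 / L ** 0.5

# Hypothetical (LET keV/um, absorbed dose mGy) pairs from a track analysis.
spectrum = [(5.0, 0.10), (20.0, 0.02), (150.0, 0.005)]
dose_eq = sum(q_factor(L) * d for L, d in spectrum)  # dose equivalent, mSv
print(round(dose_eq, 3))
```

Note how the high-LET component dominates: the 150 keV/µm tracks carry only 5% of the absorbed dose here but, with Q ≈ 24, contribute about 40% of the dose equivalent.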

  15. Diagnostic x-ray dosimetry using Monte Carlo simulation.

    PubMed

    Ioppolo, J L; Price, R I; Tuchyna, T; Buckley, C E

    2002-05-21

    An Electron Gamma Shower version 4 (EGS4) based user code was developed to simulate the absorbed dose in humans during routine diagnostic radiological procedures. Measurements of absorbed dose using thermoluminescent dosimeters (TLDs) were compared directly with EGS4 simulations of absorbed dose in homogeneous, heterogeneous and anthropomorphic phantoms. Realistic voxel-based models characterizing the geometry of the phantoms were used as input to the EGS4 code. The voxel geometry of the anthropomorphic Rando phantom was derived from a CT scan of Rando. The 100 kVp diagnostic-energy x-ray spectra of the apparatus used to irradiate the phantoms were measured and provided as input to the EGS4 code. The TLDs were placed at evenly spaced points symmetrically about the central beam axis, which was perpendicular to the cathode-anode x-ray axis, at a number of depths. The TLD measurements in the homogeneous and heterogeneous phantoms were on average within 7% of the values calculated by EGS4. Estimates of effective dose with errors less than 10% required fewer photon histories (1 x 10(7)) than the calculation of dose profiles (1 x 10(9)). The EGS4 code was able to satisfactorily predict the patient and staff effective dose imparted during radiological investigations, and thereby provides an instrument for reducing it.
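The "on average within 7%" style of comparison can be reproduced with a short sketch; the measured and simulated doses below are invented for illustration:

```python
# Illustrative check of simulated vs. measured absorbed dose at several
# depths, in the spirit of the TLD-versus-EGS4 comparison. Values are made up.
measured = [1.00, 0.81, 0.66, 0.54, 0.44]   # TLD doses (arbitrary units)
simulated = [1.05, 0.78, 0.69, 0.52, 0.46]  # simulated doses at the same points

deviations = [abs(s - m) / m * 100.0 for m, s in zip(measured, simulated)]
mean_dev = sum(deviations) / len(deviations)
print(f"mean absolute deviation: {mean_dev:.1f}%")
```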

  16. Diagnostic x-ray dosimetry using Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Ioppolo, J. L.; Price, R. I.; Tuchyna, T.; Buckley, C. E.

    2002-05-01

    An Electron Gamma Shower version 4 (EGS4) based user code was developed to simulate the absorbed dose in humans during routine diagnostic radiological procedures. Measurements of absorbed dose using thermoluminescent dosimeters (TLDs) were compared directly with EGS4 simulations of absorbed dose in homogeneous, heterogeneous and anthropomorphic phantoms. Realistic voxel-based models characterizing the geometry of the phantoms were used as input to the EGS4 code. The voxel geometry of the anthropomorphic Rando phantom was derived from a CT scan of Rando. The 100 kVp diagnostic-energy x-ray spectra of the apparatus used to irradiate the phantoms were measured and provided as input to the EGS4 code. The TLDs were placed at evenly spaced points symmetrically about the central beam axis, which was perpendicular to the cathode-anode x-ray axis, at a number of depths. The TLD measurements in the homogeneous and heterogeneous phantoms were on average within 7% of the values calculated by EGS4. Estimates of effective dose with errors less than 10% required fewer photon histories (1 × 10^7) than the calculation of dose profiles (1 × 10^9). The EGS4 code was able to satisfactorily predict the patient and staff effective dose imparted during radiological investigations, and thereby provides an instrument for reducing it.

  17. TH-A-19A-11: Validation of GPU-Based Monte Carlo Code (gPMC) Versus Fully Implemented Monte Carlo Code (TOPAS) for Proton Radiation Therapy: Clinical Cases Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giantsoudi, D; Schuemann, J; Dowdell, S

    Purpose: For proton radiation therapy, Monte Carlo simulation (MCS) methods are recognized as the gold-standard dose calculation approach. Although previously unrealistic due to limitations in available computing power, GPU-based applications now allow MCS of proton treatment fields to be performed in routine clinical use, on time scales comparable to those of conventional pencil-beam algorithms. This study focuses on validating the results of our GPU-based code (gPMC) against a fully implemented proton therapy MCS code (TOPAS) for clinical patient cases. Methods: Two treatment sites were selected to provide clinical cases for this study: head-and-neck cases, whose anatomical geometrical complexity (air cavities and density heterogeneities) makes dose calculation very challenging, and prostate cases, due to the higher proton energies used and the close proximity of the treatment target to sensitive organs at risk. Both gPMC and TOPAS were used to calculate 3-dimensional dose distributions for all patients in this study. Comparisons were performed based on target coverage indices (mean dose, V90 and D90) and gamma index distributions for 2% of the prescription dose and 2 mm. Results: For seven out of eight studied cases, mean target dose, V90 and D90 differed by less than 2% between TOPAS and gPMC dose distributions. Gamma index analysis for all prostate patients resulted in a passing rate of more than 99% of voxels in the target. Four out of five head-and-neck cases showed a gamma index passing rate for the target of more than 99%, the fifth having a passing rate of 93%. Conclusion: Our current work showed excellent agreement between our GPU-based MCS code and a fully implemented proton therapy MC code for a group of dosimetrically challenging patient cases.
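The coverage indices and the dose-difference part of the gamma criterion can be illustrated on synthetic dose arrays; a full gamma index would also need the 2 mm distance-to-agreement search, which is omitted here, and all numbers are invented:

```python
import numpy as np

# Hypothetical target dose values (Gy) from two MC engines for the same voxels.
rng = np.random.default_rng(0)
dose_ref = rng.normal(2.0, 0.02, size=10000)               # "reference" engine
dose_test = dose_ref + rng.normal(0.0, 0.01, size=10000)   # engine under test

prescription = 2.0

def coverage_indices(dose, rx):
    v90 = float(np.mean(dose >= 0.9 * rx) * 100)   # % of voxels >= 90% of Rx
    d90 = float(np.percentile(dose, 10))           # dose covering 90% of voxels
    return v90, d90

v90, d90 = coverage_indices(dose_test, prescription)
# Dose-difference-only pass rate at 2% of prescription (no distance-to-agreement,
# so this is a simplification of the 2%/2 mm gamma criterion).
pass_rate = float(np.mean(np.abs(dose_test - dose_ref) <= 0.02 * prescription) * 100)
print(f"V90 = {v90:.1f}%, D90 = {d90:.3f} Gy, pass rate = {pass_rate:.1f}%")
```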

  18. SU-E-T-493: Accelerated Monte Carlo Methods for Photon Dosimetry Using a Dual-GPU System and CUDA.

    PubMed

    Liu, T; Ding, A; Xu, X

    2012-06-01

    To develop a Graphics Processing Unit (GPU) based Monte Carlo (MC) code that accelerates dose calculations on a dual-GPU system. We simulated a clinical case of prostate cancer treatment. A voxelized abdomen phantom derived from 120 CT slices, containing 218×126×60 voxels, was used, and a GE LightSpeed 16-MDCT scanner was modeled. A CPU version of the MC code was first developed in C++ and tested on an Intel Xeon X5660 2.8 GHz CPU; it was then translated into a GPU version using CUDA C 4.1 and run on a dual Tesla M2090 GPU system. The code featured automatic assignment of simulation tasks to multiple GPUs, as well as accurate calculation of energy- and material-dependent cross-sections. Double-precision floating point format was used for accuracy. Doses to the rectum, prostate, bladder and femoral heads were calculated. When running on a single GPU, the MC GPU code was found to be 19 times faster than the CPU code and 42 times faster than MCNPX. These speedup factors were doubled on the dual-GPU system. The dose results were benchmarked against MCNPX and a maximum difference of 1% was observed when the relative error was kept below 0.1%. A GPU-based MC code was developed for dose calculations using detailed patient and CT scanner models. Efficiency and accuracy were both guaranteed in this code. Scalability of the code was confirmed on the dual-GPU system. © 2012 American Association of Physicists in Medicine.

  19. Use of computer code for dose distribution studies in a 60Co industrial irradiator

    NASA Astrophysics Data System (ADS)

    Piña-Villalpando, G.; Sloan, D. P.

    1995-09-01

    This paper presents a benchmark comparison between calculated and experimental absorbed dose values for a typical product in a 60Co industrial irradiator located at ININ, México. The irradiator is a two-level, two-layer system with an overlapping product configuration and an activity of around 300 kCi. Experimental values were obtained from routine dosimetry, using red acrylic pellets. The typical product was packages of Petri dishes with an apparent density of 0.13 g/cm3; that product was chosen because of its uniform size, large quantity and low density. The minimum dose was fixed at 15 kGy. Calculated values were obtained from the QAD-CGGP code. This code uses a point-kernel technique; build-up factor fitting is done by geometric progression, and combinatorial geometry is used for the system description. The main modifications to the code concerned the source simulation: point sources were used instead of pencil sources, and an energy spectrum and an anisotropic emission spectrum were included. For the maximum dose, the calculated value (18.2 kGy) was 8% higher than the experimental average value (16.8 kGy); for the minimum dose, the calculated value (13.8 kGy) was 3% lower than the experimental average value (14.3 kGy).
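A minimal point-kernel dose-rate sketch in the spirit of QAD-CGGP (point sources, exponential attenuation, a build-up factor) follows; all constants are illustrative, not ININ irradiator data:

```python
import math

# Point-kernel gamma dose rate: uncollided flux x build-up x flux-to-dose
# factor. The linear build-up approximation and the flux-to-dose constant
# below are crude placeholders chosen only to show the structure.
def dose_rate(source_gammas_per_s, mu_cm, r_cm, flux_to_dose=3.0e-12):
    """Dose rate (Gy/s) at distance r_cm from an isotropic point source."""
    flux = source_gammas_per_s * math.exp(-mu_cm * r_cm) / (4 * math.pi * r_cm ** 2)
    buildup = 1.0 + mu_cm * r_cm          # simple linear build-up approximation
    return flux * buildup * flux_to_dose

# Sum contributions from several point sources approximating a 60Co rack.
sources = [(3.7e14, 50.0), (3.7e14, 60.0), (3.7e14, 75.0)]  # (gamma/s, distance cm)
mu_water = 0.0632   # 1/cm, roughly 1.25 MeV photons in water (approximate)
total = sum(dose_rate(s, mu_water, r) for s, r in sources)
print(f"total dose rate ~ {total:.2e} Gy/s")
```

Replacing the pencil sources of the original code with a sum over point sources, as above, is exactly the kind of source-simulation modification the abstract describes.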

  20. Shielding evaluation for solar particle events using MCNPX, PHITS and OLTARIS codes.

    PubMed

    Aghara, S K; Sriprisan, S I; Singleterry, R C; Sato, T

    2015-01-01

    Detailed analyses of Solar Particle Events (SPE) were performed to calculate primary and secondary particle spectra behind aluminum shielding at various depths in water. The simulations were based on the Monte Carlo (MC) radiation transport codes MCNPX 2.7.0 and PHITS 2.64, and on the space radiation analysis website OLTARIS (On-Line Tool for the Assessment of Radiation in Space) version 3.4, which uses the deterministic code HZETRN for transport. The study investigates the impact of SPE spectra transporting through a 10 or 20 g/cm(2) Al shield followed by a 30 g/cm(2) water slab. Four historical SPE events were selected and used as input source spectra; particle differential spectra for protons, neutrons, and photons are presented, along with the total particle fluence as a function of depth. In addition to particle flux, dose and dose equivalent values are calculated and compared between the codes and with other published results. Overall, the particle fluence spectra from all three codes show good agreement, with the MC codes agreeing more closely with each other than with the OLTARIS results. The neutron fluence from OLTARIS is lower than that from the MC codes at lower energies (E<100 MeV). Based on mean square difference analysis, the results from MCNPX and PHITS agree better with each other for fluence, dose and dose equivalent than with the OLTARIS results. Copyright © 2015 The Committee on Space Research (COSPAR). All rights reserved.

  1. Predicted blood glucose from insulin administration based on values from miscoded glucose meters.

    PubMed

    Raine, Charles H; Pardo, Scott; Parkes, Joan Lee

    2008-07-01

    The proper use of many types of self-monitored blood glucose (SMBG) meters requires calibration to match the strip code. Studies have demonstrated the occurrence and impact on insulin dose of coding errors with SMBG meters. This paper reflects additional analyses performed with data from Raine et al. (JDST, 2:205-210, 2007). It attempts to relate potential insulin dose errors to possible adverse blood glucose outcomes when glucose meters are miscoded. Five sets of glucose meters were used. Two sets of meters were autocoded and therefore could not be miscoded, and three sets required manual coding. Two of each set of manually coded meters were deliberately miscoded, and one from each set was properly coded. Subjects (n = 116) had finger-stick blood glucose obtained at fasting, as well as at 1 and 2 hours after a fixed meal (Boost((R)); Novartis Medical Nutrition U.S., Basel, Switzerland). Deviations of meter blood glucose results from the reference method (YSI) were used to predict insulin dose errors and the resultant blood glucose outcomes. Using insulin sensitivity data, it was determined that, given an actual blood glucose of 150-400 mg/dl, an error greater than +40 mg/dl would be required to calculate an insulin dose sufficient to produce a blood glucose of less than 70 mg/dl. Conversely, an error less than or equal to -70 mg/dl would be required to derive an insulin dose insufficient to correct an elevated blood glucose to less than 180 mg/dl. For miscoded meters, the estimated probability of producing a blood glucose reduction to less than or equal to 70 mg/dl was 10.40%. The corresponding probabilities for autocoded and correctly coded manual meters were 2.52% (p < 0.0001) and 1.46% (p < 0.0001), respectively. Furthermore, the errors from miscoded meters were large enough to produce a calculated blood glucose outcome less than or equal to 50 mg/dl in 42 of 833 instances.
Autocoded meters produced zero (0) outcomes less than or equal to 50 mg/dl out of 279 instances, and correctly coded manual meters produced 1 of 416. Improperly coded blood glucose meters present the potential for insulin dose errors and resultant clinically significant hypoglycemia or hyperglycemia. Patients should be instructed and periodically reinstructed in the proper use of blood glucose meters, particularly for meters that require coding.
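A sketch of how a meter reading error propagates into an insulin correction dose and a predicted glucose outcome follows; the insulin sensitivity (50 mg/dl per unit) and correction target (120 mg/dl) are assumed for illustration and are not taken from the study:

```python
# Correction-dose sketch: dose off the (possibly miscoded) meter reading,
# then predict the resulting glucose from the ACTUAL value. The sensitivity
# and target below are assumed placeholders, not the study's parameters.
SENSITIVITY = 50.0   # mg/dl drop per unit of insulin (assumed)
TARGET = 120.0       # correction target, mg/dl (assumed)

def correction_units(measured_bg):
    """Insulin units computed from the meter reading."""
    return max(measured_bg - TARGET, 0.0) / SENSITIVITY

def predicted_outcome(actual_bg, measured_bg):
    """Glucose predicted after dosing off the meter reading."""
    return actual_bg - correction_units(measured_bg) * SENSITIVITY

actual = 150.0
for meter_error in (0.0, +40.0, -70.0):
    outcome = predicted_outcome(actual, actual + meter_error)
    print(f"error {meter_error:+5.0f} mg/dl -> predicted outcome {outcome:.0f} mg/dl")
```

With these assumed parameters, a +40 mg/dl error at an actual glucose of 150 mg/dl drives the predicted outcome down to 80 mg/dl, consistent with the abstract's point that errors above +40 mg/dl are needed to push a patient below 70 mg/dl.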

  2. [Trial of eye drops recognizer for visually disabled persons].

    PubMed

    Okamoto, Norio; Suzuki, Katsuhiko; Mimura, Osamu

    2009-01-01

    To develop a device that enables visually disabled persons to differentiate eye drops and their doses. The new instrument is composed of a voice generator and a two-dimensional bar-code reader (LS9208). We designed voice outputs for the visually disabled stating when (number of times) and where (right eye, left eye, or both) to administer eye drops. We then determined the minimum bar-code size that can be recognized. After attaching bar-codes of the appropriate size to the lateral or bottom surface of the eye drop containers, the readability of the bar-codes was compared. The minimum recognizable bar-code size was 6 mm high x 8.5 mm long. Bar-codes on the bottom surface could be recognized more easily than bar-codes on the side. Our newly developed device using bar-codes enables visually disabled persons to differentiate eye drops and their doses.

  3. Experimental check of bremsstrahlung dosimetry predictions for 0.75 MeV electrons

    NASA Astrophysics Data System (ADS)

    Sanford, T. W. L.; Halbleib, J. A.; Beezhold, W.

    Bremsstrahlung dose in CaF2 TLDs from the radiation produced by 0.75 MeV electrons incident on Ta/C targets is measured and compared with that calculated via the CYLTRAN Monte Carlo code. The comparison was made to validate the code, which is used to predict and analyze radiation environments of flash X-ray simulators measured by TLDs. Over a wide range of Ta target thicknesses and radiation angles, the code is found to agree with the measurements to within their 5% uncertainty. For Ta thicknesses near those that optimize the radiation output, however, the code overestimates the radiation dose at small angles; the maximum overprediction is about 14 ± 5%. The general agreement, nonetheless, gives confidence in using the code at this energy and in the TLD calibration procedure. For the bulk of the measurements, a standard TLD employing a 2.2 mm thick Al equilibrator was used. In this paper we also show that this thickness can significantly attenuate the free-field dose and introduce significant photon buildup in the equilibrator.

  4. Extension of applicable neutron energy of DARWIN up to 1 GeV.

    PubMed

    Satoh, D; Sato, T; Endo, A; Matsufuji, N; Takada, M

    2007-01-01

    The radiation-dose monitor DARWIN needs a set of response functions of the liquid organic scintillator to assess a neutron dose. SCINFUL-QMD is a Monte Carlo-based computer code to evaluate the response functions. In order to improve the accuracy of the code, a new light-output function based on experimental data was developed for the production and transport of protons, deuterons, tritons, (3)He nuclei and alpha particles, and incorporated into the code. The applicable energy of DARWIN was extended to 1 GeV using the response functions calculated by the modified SCINFUL-QMD code.

  5. Analysis of activation and shutdown contact dose rate for EAST neutral beam port

    NASA Astrophysics Data System (ADS)

    Chen, Yuqing; Wang, Ji; Zhong, Guoqiang; Li, Jun; Wang, Jinfang; Xie, Yahong; Wu, Bin; Hu, Chundong

    2017-12-01

    For the safe operation and maintenance of the neutral beam injector (NBI), the specific activity and shutdown contact dose rate of the sample material SS316 are estimated around the experimental advanced superconducting tokamak (EAST) neutral beam port. Firstly, the neutron emission intensity is calculated by the TRANSP code for co-injection of the neutral beam into EAST. Secondly, the neutron activation and shutdown contact dose rates for the neutral beam sample material SS316 are derived with the Monte Carlo code MCNP and the inventory code FISPACT-2007. The simulations indicate that the primary radioactive nuclides of SS316 are 58Co and 54Mn. The peak contact dose rate is 8.52 × 10^-6 Sv/h one second after EAST shutdown, which is below the International Thermonuclear Experimental Reactor (ITER) design value of 1 × 10^-5 Sv/h.

  6. Galactic and solar radiation exposure to aircrew during a solar cycle.

    PubMed

    Lewis, B J; Bennett, L G I; Green, A R; McCall, M J; Ellaschuk, B; Butler, A; Pierre, M

    2002-01-01

    An on-going investigation using a tissue-equivalent proportional counter (TEPC) has been carried out to measure the ambient dose equivalent rate of the cosmic radiation exposure of aircrew during a solar cycle. A semi-empirical model has been derived from these data to allow for the interpolation of the dose rate for any global position. The model has been extended to an altitude of up to 32 km with further measurements made on board aircraft and several balloon flights. The effects of changing solar modulation during the solar cycle are characterised by correlating the dose rate data to different solar potential models. Through integration of the dose-rate function over a great circle flight path or between given waypoints, a Predictive Code for Aircrew Radiation Exposure (PCAIRE) has been further developed for estimation of the route dose from galactic cosmic radiation exposure. This estimate is provided in units of ambient dose equivalent as well as effective dose, based on E/H*(10) scaling functions as determined from transport code calculations with LUIN and FLUKA. This experimentally based treatment has also been compared with the CARI-6 and EPCARD codes, which are derived solely from theoretical transport calculations. Using TEPC measurements taken aboard the International Space Station, ground-based neutron monitoring, GOES satellite data and transport code analysis, an empirical model has been further proposed for estimation of aircrew exposure during solar particle events. This model has been compared to results obtained during recent solar flare events.
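Integrating a dose-rate function over flight segments, as PCAIRE does over a great-circle path, can be sketched as follows; the dose-rate model here is a toy altitude/latitude scaling, not the PCAIRE fit:

```python
# Toy route-dose integration: dose rate grows with altitude and with
# geomagnetic latitude (both scalings are assumed placeholders).
def dose_rate_uSv_per_h(altitude_km, latitude_deg):
    geomag = min(abs(latitude_deg) / 60.0, 1.0)       # crude latitude scaling
    return 0.3 * (altitude_km / 10.0) ** 2 * (1.0 + 2.0 * geomag)

# Flight described as segments: (altitude km, latitude deg, hours flown).
segments = [(10.0, 45.0, 2.0), (11.0, 55.0, 3.0), (10.5, 50.0, 2.5)]
route_dose = sum(dose_rate_uSv_per_h(alt, lat) * hours
                 for alt, lat, hours in segments)
print(f"route dose ~ {route_dose:.1f} uSv")
```

A real implementation would evaluate the fitted dose-rate function at waypoints interpolated along the great-circle track and sum rate × segment time in the same way.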

  7. TU-AB-BRC-12: Optimized Parallel Monte Carlo Dose Calculations for Secondary MU Checks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    French, S; Nazareth, D; Bellor, M

    Purpose: Secondary MU checks are an important tool used during a physics review of a treatment plan. Commercial software packages offer varying degrees of theoretical dose calculation accuracy, depending on the modality involved. Dose calculations of VMAT plans are especially prone to error due to the large approximations involved. Monte Carlo (MC) methods are not commonly used due to their long run times. We investigated two methods to increase the computational efficiency of MC dose simulations with the BEAMnrc code. Distributed computing resources, along with optimized code compilation, allow for accurate and efficient VMAT dose calculations. Methods: The BEAMnrc package was installed on a high performance computing cluster accessible to our clinic. MATLAB and PYTHON scripts were developed to convert a clinical VMAT DICOM plan into BEAMnrc input files. The BEAMnrc installation was optimized by running the VMAT simulations through profiling tools, which indicated the behavior of the constituent routines in the code, e.g. the bremsstrahlung splitting routine and the specified random number generator. This information aided in determining the most efficient parallel compiling configuration for the specific CPUs available on our cluster, resulting in the fastest VMAT simulation times. Our method was evaluated with calculations involving 10^8-10^9 particle histories, which are sufficient to verify patient dose using VMAT. Results: Parallelization allowed the calculation of patient dose on the order of 10-15 hours with 100 parallel jobs. Due to the compiler optimization process, further speed increases of 23% were achieved when compared with the open-source compiler BEAMnrc packages. Conclusion: Analysis of the BEAMnrc code allowed us to optimize the compiler configuration for VMAT dose calculations. In future work, the optimized MC code, in conjunction with the parallel processing capabilities of BEAMnrc, will be applied to provide accurate and efficient secondary MU checks.
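The parallelization step can be sketched as an even split of particle histories across jobs with distinct random seeds; job submission itself is omitted, and the names are hypothetical:

```python
# Split a Monte Carlo run across parallel jobs: each job gets an equal share
# of histories and a distinct random seed, so partial results can later be
# combined without correlated random-number streams.
def split_histories(total_histories, n_jobs, base_seed=1000):
    per_job = total_histories // n_jobs
    jobs = []
    for i in range(n_jobs):
        # Distribute any remainder one history at a time to the first jobs.
        n = per_job + (1 if i < total_histories % n_jobs else 0)
        jobs.append({"job": i, "histories": n, "seed": base_seed + i})
    return jobs

jobs = split_histories(10**8, 100)
assert sum(j["histories"] for j in jobs) == 10**8      # nothing lost in the split
assert len({j["seed"] for j in jobs}) == len(jobs)     # seeds must not repeat
print(jobs[0])
```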

  8. The cost of implementing inpatient bar code medication administration.

    PubMed

    Sakowski, Julie Ann; Ketchel, Alan

    2013-02-01

    To calculate the costs associated with implementing and operating an inpatient bar-code medication administration (BCMA) system in the community hospital setting and to estimate the cost per harmful error prevented. This is a retrospective, observational study. Costs were calculated from the hospital perspective and a cost-consequence analysis was performed to estimate the cost per preventable adverse drug event averted. Costs were collected from financial records and key informant interviews at four not-for-profit community hospitals. Costs included direct expenditures on capital, infrastructure, additional personnel, and the opportunity costs of time for existing personnel working on the project. The number of adverse drug events prevented using BCMA was estimated by multiplying the number of doses administered using BCMA by the rate of harmful errors prevented by interventions in response to system warnings. Our previous work found that BCMA identified and intercepted medication errors in 1.1% of doses administered, 9% of which potentially could have resulted in lasting harm. The cost of implementing and operating BCMA including electronic pharmacy management and drug repackaging over 5 years is $40,000 (range: $35,600 to $54,600) per BCMA-enabled bed and $2000 (range: $1800 to $2600) per harmful error prevented. BCMA can be an effective and potentially cost-saving tool for preventing the harm and costs associated with medication errors.
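The cost-consequence arithmetic can be reproduced in a few lines; the number of doses administered and the BCMA bed count are assumed for illustration, while the 1.1% intercept rate, 9% harmful fraction and $40,000 per bed come from the abstract:

```python
# Back-of-envelope cost-consequence calculation:
# harmful errors prevented = doses x intercept rate x harmful fraction.
doses_administered = 2_000_000      # assumed doses over the study period
intercept_rate = 0.011              # errors intercepted per dose (1.1%)
harmful_fraction = 0.09             # share that could have caused harm (9%)
total_cost = 40_000 * 100           # $40k/bed x assumed 100 BCMA-enabled beds

harmful_errors_prevented = doses_administered * intercept_rate * harmful_fraction
cost_per_error = total_cost / harmful_errors_prevented
print(f"{harmful_errors_prevented:.0f} harmful errors prevented, "
      f"${cost_per_error:.0f} per error")
```

With these assumed volumes the result lands near the reported $2000 per harmful error prevented, which is the point of the sketch.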

  9. Shielding NSLS-II light source: Importance of geometry for calculating radiation levels from beam losses

    NASA Astrophysics Data System (ADS)

    Kramer, S. L.; Ghosh, V. J.; Breitfeller, M.; Wahl, W.

    2016-11-01

    Third generation high brightness light sources are designed to have low emittance and high current beams, which contribute to higher beam loss rates that will be compensated by Top-Off injection. Shielding for these higher loss rates will be critical to protect users, given the projected higher occupancy factors. Top-Off injection requires a full energy injector, which will demand greater consideration of the potential abnormal beam mis-steering and localized losses that could occur. The high energy electron injection beam produces a significantly higher neutron dose component on the experimental floor than a lower energy injection beam with ramped operation. Minimizing this dose requires adequate knowledge of where the mis-steered beam can strike, and sufficient EM shielding close to the loss point to attenuate the energy of the particles in the EM shower below the neutron production threshold (<10 MeV); this spreads the incident energy over the bulk shield walls and thereby reduces the dose penetrating them. Designing supplemental shielding near the loss point using the analytic shielding model is shown to be inadequate because that model lacks a geometry specification for the EM shower process. Predicting the dose rates outside the tunnel requires a detailed description of the geometry and materials that the beam losses will encounter inside the tunnel. Modern radiation shielding Monte-Carlo codes, like FLUKA, can handle this geometric description of the radiation transport process in sufficient detail, allowing accurate predictions of the dose rates expected and the ability to reveal weaknesses in the design before a high radiation incident occurs. The effort required to adequately define the accelerator geometry for these codes has been greatly reduced with the implementation of the graphical interface FLAIR to FLUKA. This made the shielding process for NSLS-II accurate and reliable.
    The principles used to provide supplemental shielding to the NSLS-II accelerators and the lessons learned from this process are presented.

  10. ACDOS2: an improved neutron-induced dose rate code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lagache, J.C.

    1981-06-01

    To calculate the expected dose rate from fusion reactors as a function of geometry, composition, and time after shutdown, a computer code, ACDOS2, was written that utilizes up-to-date libraries of cross-sections and radioisotope decay data. ACDOS2 is written in ANSI FORTRAN IV in order to make it readily adaptable elsewhere.

  11. Calculation of the effective dose from natural radioactivity in soil using MCNP code.

    PubMed

    Krstic, D; Nikezic, D

    2010-01-01

    The effective dose delivered by photons emitted from natural radioactivity in soil was calculated in this work. Calculations were done for the most common natural radionuclides in soil: the (238)U and (232)Th series and (40)K. ORNL human phantoms and the Monte Carlo transport code MCNP-4B were employed to calculate the energy deposited in all organs. The effective dose was calculated according to ICRP 74 recommendations. Conversion factors of effective dose per air kerma were determined. The results obtained here were compared with those of other authors. Copyright 2009 Elsevier Ltd. All rights reserved.
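Folding organ doses into an effective dose is a tissue-weighted sum, E = Σ w_T·H_T; the weights below are a subset of the ICRP 60 values underlying ICRP 74, and the organ doses are illustrative:

```python
# Effective dose as the tissue-weighted sum of organ equivalent doses,
# as done when combining MCNP organ tallies. Only a subset of tissues is
# shown, so this is a partial sum; organ dose values are made up.
tissue_weights = {"gonads": 0.20, "lung": 0.12, "stomach": 0.12,
                  "liver": 0.05, "thyroid": 0.05}       # ICRP 60 (subset)
organ_dose_nSv_per_h = {"gonads": 60.0, "lung": 55.0, "stomach": 58.0,
                        "liver": 57.0, "thyroid": 54.0}  # illustrative

effective = sum(w * organ_dose_nSv_per_h[t] for t, w in tissue_weights.items())
print(f"partial effective dose contribution: {effective:.2f} nSv/h")
```

Dividing such an effective dose rate by the corresponding air kerma rate gives the effective-dose-per-air-kerma conversion factor the abstract refers to.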

  12. [Medical Applications of the PHITS Code I: Recent Improvements and Biological Dose Estimation Model].

    PubMed

    Sato, Tatsuhiko; Furuta, Takuya; Hashimoto, Shintaro; Kuga, Naoya

    2015-01-01

    PHITS is a general-purpose Monte Carlo particle transport simulation code developed through the collaboration of several institutes, mainly in Japan. It can analyze the motion of nearly all radiations over wide energy ranges in three-dimensional matter. It has been used for various applications, including medical physics. This paper reviews recent improvements of the code, together with the biological dose estimation method developed on the basis of the microdosimetric function implemented in PHITS.

  13. Comparison of CREME (cosmic-ray effects on microelectronics) model LET (linear energy transfer) spectra with spaceflight dosimetry data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Letaw, J.R.; Adams, J.H.

    The galactic cosmic radiation (GCR) component of space radiation is the dominant cause of single-event phenomena in microelectronic circuits when Earth's magnetic shielding is low. Spaceflights outside the magnetosphere and in high-inclination orbits are examples of such circumstances. In high-inclination orbits, low-energy (high-LET) particles are transmitted through the field only at extreme latitudes, but can dominate the orbit-averaged dose. GCR is an important part of the radiation dose to astronauts under the same conditions. As a test of the CREME environmental model and the particle transport codes used to estimate single-event upsets, existing measurements of HZE doses were compiled for missions where GCR is expected to be important: Apollo 16 and 17, Skylab, the Apollo-Soyuz Test Project, and Kosmos 782. The LET spectra due to direct ionization from GCR for each of these missions have been estimated. The resulting comparisons with data validate the CREME model predictions of high-LET galactic cosmic-ray fluxes to within a factor of two. Some systematic differences between the model and data are identified.

  14. Comparison of normal tissue dose calculation methods for epidemiological studies of radiotherapy patients.

    PubMed

    Mille, Matthew M; Jung, Jae Won; Lee, Choonik; Kuzmin, Gleb A; Lee, Choonsik

    2018-06-01

    Radiation dosimetry is an essential input for epidemiological studies of radiotherapy patients aimed at quantifying the dose-response relationship of late-term morbidity and mortality. Individualised organ dose must be estimated for all tissues of interest located in-field, near-field, or out-of-field. Whereas conventional measurement approaches are limited to points in water or anthropomorphic phantoms, computational approaches using patient images or human phantoms offer greater flexibility and can provide more detailed three-dimensional dose information. In the current study, we systematically compared four different dose calculation algorithms so that dosimetrists and epidemiologists can better understand the advantages and limitations of the various approaches at their disposal. The four dose calculation algorithms considered were as follows: the (1) Analytical Anisotropic Algorithm (AAA) and (2) Acuros XB algorithm (Acuros XB), as implemented in the Eclipse treatment planning system (TPS); (3) a Monte Carlo radiation transport code, EGSnrc; and (4) an accelerated Monte Carlo code, the x-ray Voxel Monte Carlo (XVMC). The four algorithms were compared in terms of their accuracy and appropriateness in the context of dose reconstruction for epidemiological investigations. Accuracy in peripheral dose was evaluated first by benchmarking the calculated dose profiles against measurements in a homogeneous water phantom. Additional simulations in a heterogeneous cylinder phantom evaluated the performance of the algorithms in the presence of tissue heterogeneity. In general, we found that the algorithms contained within the commercial TPS (AAA and Acuros XB) were fast and accurate in-field or near-field, but not acceptable out-of-field. Therefore, the TPS is best suited for epidemiological studies involving large cohorts and where the organs of interest are located in-field or partially in-field.
The EGSnrc and XVMC codes showed excellent agreement with measurements both in-field and out-of-field. The EGSnrc code was the most accurate dosimetry approach, but was too slow to be used for large-scale epidemiological cohorts. The XVMC code showed similar accuracy to EGSnrc, but was significantly faster, and thus epidemiological applications seem feasible, especially when the organs of interest reside far away from the field edge.
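The in-field versus out-of-field accuracy check can be sketched by splitting measured/calculated comparison points at the field edge; all numbers below are invented to mimic the qualitative finding (good in-field, poor out-of-field for a TPS-type algorithm):

```python
# Compare calculated dose profiles against measured points, split at the
# field edge. Positions, doses and the edge location are illustrative only.
field_edge_cm = 5.0
# (off-axis position cm, measured dose, calculated dose) -- made-up values
points = [(0.0, 1.00, 1.005), (3.0, 0.95, 0.94),
          (7.0, 0.02, 0.012), (12.0, 0.005, 0.002)]

def pct_diff(meas, calc):
    return abs(calc - meas) / meas * 100.0

in_field = [pct_diff(m, c) for x, m, c in points if x <= field_edge_cm]
out_field = [pct_diff(m, c) for x, m, c in points if x > field_edge_cm]
in_mean = sum(in_field) / len(in_field)
out_mean = sum(out_field) / len(out_field)
print(f"in-field mean diff: {in_mean:.1f}%")
print(f"out-of-field mean diff: {out_mean:.1f}%")
```

Because out-of-field doses are small, the same absolute error translates into a much larger relative error there, which is why peripheral accuracy has to be benchmarked separately.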

  15. The FLUKA Monte Carlo code coupled with the NIRS approach for clinical dose calculations in carbon ion therapy

    NASA Astrophysics Data System (ADS)

    Magro, G.; Dahle, T. J.; Molinelli, S.; Ciocca, M.; Fossati, P.; Ferrari, A.; Inaniwa, T.; Matsufuji, N.; Ytre-Hauge, K. S.; Mairani, A.

    2017-05-01

    Particle therapy facilities often require Monte Carlo (MC) simulations to overcome intrinsic limitations of analytical treatment planning systems (TPS) related to the description of the mixed radiation field and beam interaction with tissue inhomogeneities. Some of these uncertainties may affect the computation of effective dose distributions; therefore, particle therapy dedicated MC codes should provide both absorbed and biological doses. Two biophysical models are currently applied clinically in particle therapy: the local effect model (LEM) and the microdosimetric kinetic model (MKM). In this paper, we describe the coupling of the NIRS (National Institute for Radiological Sciences, Japan) clinical dose to the FLUKA MC code. We moved from the implementation of the model itself to its application in clinical cases, according to the NIRS approach, where a scaling factor is introduced to rescale the (carbon-equivalent) biological dose to a clinical dose level. A high level of agreement was found with published data by exploring a range of values for the MKM input parameters, while some differences were registered in forward recalculations of NIRS patient plans, mainly attributable to differences with the analytical TPS dose engine (taken as reference) in describing the mixed radiation field (lateral spread and fragmentation). We presented a tool which is being used at the Italian National Center for Oncological Hadrontherapy to support the comparison study between the NIRS clinical dose level and the LEM dose specification.

  16. Comparison of Calculations and Measurements of the Off-Axis Radiation Dose (Si) in Liquid Nitrogen as a Function of Radiation Length.

    DTIC Science & Technology

    1984-12-01

    For various path lengths out to 2 radiation lengths, the off-axis dose in silicon was calculated using the electron/photon transport code CYLTRAN and measured using thermoluminescent dosimeters (TLDs). Calculations were performed on a CDC-7600 computer at Los Alamos National Laboratory and measurements

  17. Role of cellular communication in the pathways of radiation-induced biological damage

    NASA Astrophysics Data System (ADS)

    Ballarini, Francesca; Facoetti, Angelica; Mariotti, Luca; Nano, Rosanna; Ottolenghi, Andrea

    During the last decade, a large number of experimental studies on the so-called "non-targeted effects", in particular bystander effects, outlined that cellular communication plays a significant role in the pathways leading to radiation-induced biological damage. This might imply a paradigm shift in (low-dose) radiobiology, according to which one has to consider the response of groups of cells behaving like a population rather than single cells behaving as individuals. Furthermore, bystander effects, which are observed both for lethal endpoints (e.g. clonogenic inactivation and apoptosis) and for non-lethal ones (e.g. mutations and neoplastic transformation), tend to show non-linear dose responses characterized by a sharp increase followed by a plateau. This might have significant consequences in terms of low-dose risk, which is generally calculated on the basis of the "Linear No Threshold" hypothesis. Although it is known that two types of cellular communication (i.e. via gap junctions and/or molecular messengers diffusing in the extra-cellular environment, such as cytokines) play a major role, it is of utmost importance to better understand the underlying mechanisms, and how such mechanisms can be modulated by ionizing radiation. Though the "final" goal is to elucidate the in vivo scenario, in the meantime in vitro studies can also provide useful insights. In the present paper we will discuss key issues on the mechanisms underlying non-targeted effects and, more generally, cell communication, with focus on candidate molecular signals. Theoretical models and simulation codes can be of help in elucidating such mechanisms. In this framework, we will present a model and Monte Carlo code, under development at the University of Pavia, simulating the release, diffusion and internalization of candidate signals (typically cytokines) travelling in the extra-cellular environment, both by unirradiated (i.e., control) cells and by irradiated cells. 
    The focus will be on the role of critical parameters such as cell number and density, the amount of culture medium, etc. Comparisons with ad hoc experimental data obtained in our laboratory will be presented, and possible implications in terms of low-dose risk assessment will be discussed. Work supported by the European Community (projects "RISC-RAD" and "NOTE") and the Italian Space Agency (project "MoMa/COUNT").
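
    The diffusion of extracellular signals described in this record can be illustrated with a toy 3D random walk; this is a generic Brownian-motion sketch under a constant-diffusion-coefficient assumption, not the Pavia code itself, and all numerical values are made up:

```python
import math
import random

def diffuse(n_signals, d_coeff_um2_s, dt_s, n_steps, seed=0):
    """Toy 3D random walk for extracellular signal molecules.
    Each time step adds a Gaussian displacement with variance
    2*D*dt per axis (standard Brownian-motion discretization)."""
    rng = random.Random(seed)
    sigma = math.sqrt(2.0 * d_coeff_um2_s * dt_s)
    positions = [(0.0, 0.0, 0.0)] * n_signals
    for _ in range(n_steps):
        positions = [(x + rng.gauss(0, sigma),
                      y + rng.gauss(0, sigma),
                      z + rng.gauss(0, sigma)) for x, y, z in positions]
    return positions

# Sanity check: mean squared displacement approaches 6*D*t in 3D,
# here 6 * 100 um^2/s * 0.5 s = 300 um^2.
pos = diffuse(n_signals=2000, d_coeff_um2_s=100.0, dt_s=0.01, n_steps=50)
msd = sum(x*x + y*y + z*z for x, y, z in pos) / len(pos)
print(msd)
```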

  18. Performance of two commercial electron beam algorithms over regions close to the lung-mediastinum interface, against Monte Carlo simulation and point dosimetry in virtual and anthropomorphic phantoms.

    PubMed

    Ojala, J; Hyödynmaa, S; Barańczyk, R; Góra, E; Waligórski, M P R

    2014-03-01

    Electron radiotherapy is applied to treat the chest wall close to the mediastinum. The performance of the GGPB and eMC algorithms implemented in the Varian Eclipse treatment planning system (TPS) was studied in this region for 9 and 16 MeV beams, against Monte Carlo (MC) simulations, point dosimetry in a water phantom and dose distributions calculated in virtual phantoms. For the 16 MeV beam, the accuracy of these algorithms was also compared over the lung-mediastinum interface region of an anthropomorphic phantom, against MC calculations and thermoluminescence dosimetry (TLD). In the phantom with a lung-equivalent slab the results were generally congruent, the eMC results for the 9 MeV beam slightly overestimating the lung dose, and the GGPB results for the 16 MeV beam underestimating the lung dose. Over the lung-mediastinum interface, for 9 and 16 MeV beams, the GGPB code underestimated the lung dose and overestimated the dose in water close to the lung, compared to the congruent eMC and MC results. In the anthropomorphic phantom, results of TLD measurements and MC and eMC calculations agreed, while the GGPB code underestimated the lung dose. Good agreement between TLD measurements and MC calculations attests to the accuracy of "full" MC simulations as a reference for benchmarking TPS codes. Application of the GGPB code in chest wall radiotherapy may result in significant underestimation of the lung dose and overestimation of dose to the mediastinum, affecting plan optimization over volumes close to the lung-mediastinum interface, such as the lung or heart. Copyright © 2013 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  19. Energetic properties' investigation of removing flattening filter at phantom surface: Monte Carlo study using BEAMnrc code, DOSXYZnrc code and BEAMDP code

    NASA Astrophysics Data System (ADS)

    Bencheikh, Mohamed; Maghnouj, Abdelmajid; Tajmouati, Jaouad

    2017-11-01

    The Monte Carlo method is considered the most accurate method for dose calculation in radiotherapy and for beam characterization. In this study, the Varian Clinac 2100 medical linear accelerator was modelled with and without flattening filter (FF). The objective was to determine the impact of the flattening filter on particle energy properties at the phantom surface in terms of energy fluence, mean energy, and energy fluence distribution. The Monte Carlo codes used were BEAMnrc for simulating the linac head, DOSXYZnrc for simulating the absorbed dose in a water phantom, and BEAMDP for extracting energy properties. The field size was 10 × 10 cm2, the simulated photon beam energy was 6 MV and the SSD was 100 cm. The Monte Carlo geometry was validated by gamma index acceptance rates of 99% in PDD and 98% in dose profiles; the gamma criteria were 3% for dose difference and 3 mm for distance to agreement. Without the FF, the electron contribution increased by more than 300% in energy fluence, almost 14% in mean energy and 1900% in energy fluence distribution, whereas the photon contribution increased by 50% in energy fluence, almost 18% in mean energy and almost 35% in energy fluence distribution. Removing the flattening filter increases the energy contribution of contaminant electrons relative to photons; this study can contribute to the evolution of flattening-filter-free linac configurations.
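
    The spectrum quantities compared in this record (total and energy fluence, fluence-weighted mean energy) can be sketched from a binned spectrum; the two-bin spectrum below is illustrative, not Clinac 2100 data:

```python
def spectrum_summary(bin_energies_mev, fluences):
    """Fluence-weighted mean energy and total energy fluence from a
    binned particle spectrum (the kind of tabulation BEAMDP extracts)."""
    total_fluence = sum(fluences)
    energy_fluence = sum(e * f for e, f in zip(bin_energies_mev, fluences))
    mean_energy = energy_fluence / total_fluence
    return mean_energy, energy_fluence

# Illustrative two-bin photon spectrum: fluence 4 at 1 MeV, 1 at 3 MeV.
mean_e, psi = spectrum_summary([1.0, 3.0], [4.0, 1.0])
print(mean_e, psi)  # → 1.4 7.0
```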

  20. SU-E-T-467: Monte Carlo Dosimetric Study of the New Flexisource Co-60 High Dose Rate Source.

    PubMed

    Vijande, J; Granero, D; Perez-Calatayud, J; Ballester, F

    2012-06-01

    Recently, a new HDR 60Co brachytherapy source, the Flexisource Co-60, has been developed (Nucletron B.V.). This study aims to obtain quality dosimetric data for this source for use in clinical practice, as required by the AAPM and ESTRO. The Penelope2008 and GEANT4 Monte Carlo codes were used to characterize this source dosimetrically. Water composition and mass density were those recommended by the AAPM. Due to the high energy of 60Co, dose at small distances cannot be approximated by collisional kerma; therefore, absorbed dose to water was considered for r<0.75 cm and collisional kerma at larger distances, the two differing by up to 2% closer to the source. Using Penelope2008 and GEANT4, an average dose rate constant of Λ = 1.085±0.003 cGy/(h U) (with k = 1, Type A uncertainties) was obtained. The dose rate constant, radial dose function and anisotropy functions for the Flexisource Co-60 are compared with published data for other Co-60 sources. Dosimetric data are provided for the new Flexisource Co-60 source, not previously studied in the literature. These data can be incorporated into treatment planning systems for clinical use. This project has been funded by Nucletron BV. © 2012 American Association of Physicists in Medicine.
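
    The quantities reported here (dose rate constant, radial dose function, anisotropy function) combine in the standard AAPM TG-43 dose rate formalism; a minimal sketch, using the Λ value from this record and unit correction factors at the reference point (all other inputs illustrative):

```python
def tg43_dose_rate(air_kerma_strength_u, dose_rate_constant,
                   geometry_ratio, g_r, f_r_theta):
    """TG-43 dose rate (cGy/h):
    D(r,theta) = S_K * Lambda * [G(r,theta)/G(r0,theta0)] * g(r) * F(r,theta)."""
    return (air_kerma_strength_u * dose_rate_constant *
            geometry_ratio * g_r * f_r_theta)

# At the reference point (r0 = 1 cm, theta0 = 90 deg) the geometry ratio,
# radial dose function and anisotropy function are all 1 by definition,
# so the dose rate reduces to S_K * Lambda.
print(tg43_dose_rate(1.0, 1.085, 1.0, 1.0, 1.0))  # → 1.085
```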

  1. Predicted Blood Glucose from Insulin Administration Based on Values from Miscoded Glucose Meters

    PubMed Central

    Raine, Charles H.; Pardo, Scott; Parkes, Joan Lee

    2008-01-01

    Objectives The proper use of many types of self-monitored blood glucose (SMBG) meters requires calibration to match strip code. Studies have demonstrated the occurrence and impact on insulin dose of coding errors with SMBG meters. This paper reflects additional analyses performed with data from Raine et al. (JDST, 2:205–210, 2007). It attempts to relate potential insulin dose errors to possible adverse blood glucose outcomes when glucose meters are miscoded. Methods Five sets of glucose meters were used. Two sets of meters were autocoded and therefore could not be miscoded, and three sets required manual coding. Two of each set of manually coded meters were deliberately miscoded, and one from each set was properly coded. Subjects (n = 116) had finger stick blood glucose obtained at fasting, as well as at 1 and 2 hours after a fixed meal (Boost®; Novartis Medical Nutrition U.S., Basel, Switzerland). Deviations of meter blood glucose results from the reference method (YSI) were used to predict insulin dose errors and resultant blood glucose outcomes based on these deviations. Results Using insulin sensitivity data, it was determined that, given an actual blood glucose of 150–400 mg/dl, an error greater than +40 mg/dl would be required to calculate an insulin dose sufficient to produce a blood glucose of less than 70 mg/dl. Conversely, an error less than or equal to -70 mg/dl would be required to derive an insulin dose insufficient to correct an elevated blood glucose to less than 180 mg/dl. For miscoded meters, the estimated probability to produce a blood glucose reduction to less than or equal to 70 mg/dl was 10.40%. The corresponding probabilities for autocoded and correctly coded manual meters were 2.52% (p < 0.0001) and 1.46% (p < 0.0001), respectively. Furthermore, the errors from miscoded meters were large enough to produce a calculated blood glucose outcome less than or equal to 50 mg/dl in 42 of 833 instances. 
Autocoded meters produced zero (0) outcomes less than or equal to 50 mg/dl out of 279 instances, and correctly coded manual meters produced 1 of 416. Conclusions Improperly coded blood glucose meters present the potential for insulin dose errors and resultant clinically significant hypoglycemia or hyperglycemia. Patients should be instructed and periodically reinstructed in the proper use of blood glucose meters, particularly for meters that require coding. PMID:19885229
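
    The threshold arithmetic in this record (a meter error greater than +40 mg/dl driving a true glucose of 150 mg/dl below 70 mg/dl) can be sketched with a standard correction-dose rule; the target glucose and insulin sensitivity factor below are illustrative assumptions, not values from the study:

```python
def predicted_outcome(true_bg, meter_bg, target_bg, isf):
    """Predicted post-correction glucose when the insulin dose is
    computed from the meter reading but acts on the true glucose.
    isf = insulin sensitivity factor (mg/dl drop per unit insulin)."""
    correction_units = max(0.0, (meter_bg - target_bg) / isf)
    return true_bg - correction_units * isf

# A +40 mg/dl meter error on a true glucose of 150 mg/dl
# (target 110 mg/dl, ISF 50 mg/dl per unit -- illustrative values):
print(predicted_outcome(150, 190, 110, 50))  # → 70.0
```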

  2. C++ Coding Standards for the AMP Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Thomas M; Clarno, Kevin T

    2009-09-01

    This document provides an initial starting point to define the C++ coding standards used by the AMP nuclear fuel performance integrated code project and is a part of AMP's software development process. This document draws from the experiences, and documentation [1], of the developers of the Marmot Project at Los Alamos National Laboratory. Much of the software in AMP will be written in C++. The power of C++ can be abused easily, resulting in code that is difficult to understand and maintain. This document gives the practices that should be followed on the AMP project for all new code that is written. The intent is not to be onerous but to ensure that the code can be readily understood by the entire code team and serve as a basis for collectively defining a set of coding standards for use in future development efforts. At the end of the AMP development in fiscal year (FY) 2010, all developers will have experience with the benefits, restrictions, and limitations of the standards described and will collectively define a set of standards for future software development. External libraries that AMP uses do not have to meet these requirements, although we encourage external developers to follow these practices. For any code of which AMP takes ownership, the project will decide on any changes on a case-by-case basis. The practices that we are using in the AMP project have been in use in the Denovo project [2] for several years. The practices build on those given in References [3-5]; the practices given in these references should also be followed. Some of the practices given in this document can also be found in [6].

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smit, C; Plessis, F du

    Purpose: To extract the electron contamination energy spectra for an Elekta Precise linac, based on pure-photon and measured clinical beam percentage depth dose data, and to include this as an additional source in isource 4 in DOSXYZnrc. Methods: A pure photon beam was simulated for the linac using isource 4 in the DOSXYZnrc Monte Carlo (MC) code. Percentage depth dose (PDD) data were then extracted for a range of field sizes (FS). These simulated dose data were compared to actual measured PDD data, with the data normalized at 10 cm depth. The resulting difference in PDD data resembled the electron contamination depth dose. Since the dose fall-off is a strictly decreasing function, a method was adopted to derive the contamination electron spectrum. This spectrum was then used in a DOSXYZnrc MC simulation run to verify that the original electron depth dose could be replicated. Results: Various square-aperture FSs for 6, 8 and 15 megavolt (MV) photon beams were modeled, simulated and compared to their respective measured PDD data. As FS increased, simulated pure-photon depth-dose profiles shifted deeper, thus requiring electron contamination to increase the surface dose. The percentage of electron weight increased with increasing FS. For a FS of 15×15 cm{sup 2}, the percentage electron weight is 0.1%, 0.2% and 0.4% for 6, 8 and 15 MV beams, respectively. Conclusion: From the PDD results obtained, an additional electron contamination source was added to the photon source model so that simulated and measured PDD data match within a 2%/2 mm gamma-index criterion. The improved source model could assure more accurate simulation of surface doses. This research project was funded by the South African Medical Research Council (MRC) with funds from National Treasury under its Economic Competitiveness and Support package.

  4. Dose mapping using MCNP code and experiment for SVST-Co-60/B irradiator in Vietnam.

    PubMed

    Tran, Van Hung; Tran, Khac An

    2010-06-01

    By using the MCNP code and ethanol-chlorobenzene (ECB) dosimeters, simulations and measurements of the absorbed dose distribution in a tote-box of the Cobalt-60 irradiator SVST-Co-60/B at VINAGAMMA have been performed. From the results, the Dose Uniformity Ratios (DUR), the positions and values of the minimum and maximum dose extremes in a tote-box, and the efficiency of the irradiator for different dummy densities were obtained. Simulation and experimental results are in good agreement, and they are valuable for the operation of the irradiator. Copyright 2010 Elsevier Ltd. All rights reserved.
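
    The Dose Uniformity Ratio reported in this record is simply the maximum-to-minimum dose over the mapped positions; a minimal sketch (the dose map values are illustrative):

```python
def dose_uniformity_ratio(doses_kgy):
    """Dose Uniformity Ratio (DUR) of a tote-box dose map:
    max dose / min dose, plus the indices of the extremes."""
    d_min, d_max = min(doses_kgy), max(doses_kgy)
    return d_max / d_min, doses_kgy.index(d_max), doses_kgy.index(d_min)

# Illustrative dose map (kGy) over four monitoring positions.
dur, i_max, i_min = dose_uniformity_ratio([25.0, 28.0, 30.0, 24.0])
print(dur, i_max, i_min)  # → 1.25 2 3
```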

  5. Suitability of point kernel dose calculation techniques in brachytherapy treatment planning

    PubMed Central

    Lakshminarayanan, Thilagam; Subbaiah, K. V.; Thayalan, K.; Kannan, S. E.

    2010-01-01

    A brachytherapy treatment planning system (TPS) is necessary to estimate the dose to the target volume and organs at risk (OAR). A TPS is always recommended to account for the effects of the tissue, applicator and shielding material heterogeneities that exist in applicators. However, most brachytherapy TPS software packages estimate the absorbed dose at a point from only the contributions of the individual sources and the source distribution, neglecting the dose perturbations arising from the applicator design and construction. Some degree of uncertainty therefore remains in dose rate estimates under realistic clinical conditions. In this regard, an attempt is made to explore the suitability of point kernels for brachytherapy dose rate calculations and to develop a new interactive brachytherapy package, named BrachyTPS, to suit clinical conditions. BrachyTPS is an interactive point kernel code package developed to perform independent dose rate calculations taking these heterogeneities into account, using the two-region build-up factors proposed by Kalos. The primary aim of this study is to validate the developed point kernel code package, integrated with treatment planning computational systems, against Monte Carlo (MC) results. In the present work, three brachytherapy applicators commonly used in the treatment of uterine cervical carcinoma, namely (i) the Board of Radiation Isotope and Technology (BRIT) low dose rate (LDR) applicator, (ii) the Fletcher Green type LDR applicator and (iii) the Fletcher-Williamson high dose rate (HDR) applicator, are studied to test the accuracy of the software. Dose rates computed using the developed code are compared with the relevant results of the MC simulations. Further, attempts are also made to study the dose rate distribution around a commercially available shielded vaginal applicator set (Nucletron). 
    The percentage deviations of BrachyTPS-computed dose rate values from the MC results are within ±5.5% for the BRIT LDR applicator, vary from 2.6 to 5.1% for the Fletcher Green type LDR applicator and are up to −4.7% for the Fletcher-Williamson HDR applicator. The isodose distribution plots also show good agreement with previously published results. The isodose distributions around the shielded vaginal cylinder computed using the BrachyTPS code show better agreement with MC results (deviations below 2%) in the unshielded region than in the shielded region, where deviations of up to 5% are observed. The present study implies that accurate and fast validation of complicated treatment planning calculations is possible with the point kernel code package. PMID:20589118

  6. The Monte Carlo code MCPTV--Monte Carlo dose calculation in radiation therapy with carbon ions.

    PubMed

    Karg, Juergen; Speer, Stefan; Schmidt, Manfred; Mueller, Reinhold

    2010-07-07

    The Monte Carlo code MCPTV is presented. MCPTV is designed for dose calculation in treatment planning in radiation therapy with particles and especially carbon ions. MCPTV has a voxel-based concept and can perform a fast calculation of the dose distribution on patient CT data. Material and density information from CT are taken into account. Electromagnetic and nuclear interactions are implemented. Furthermore the algorithm gives information about the particle spectra and the energy deposition in each voxel. This can be used to calculate the relative biological effectiveness (RBE) for each voxel. Depth dose distributions are compared to experimental data giving good agreement. A clinical example is shown to demonstrate the capabilities of the MCPTV dose calculation.

  7. WEC3: Wave Energy Converter Code Comparison Project: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Combourieu, Adrien; Lawson, Michael; Babarit, Aurelien

    This paper describes the recently launched Wave Energy Converter Code Comparison (WEC3) project and presents preliminary results from this effort. The objectives of WEC3 are to verify and validate numerical modelling tools that have been developed specifically to simulate wave energy conversion devices and to inform the upcoming IEA OES Annex VI Ocean Energy Modelling Verification and Validation project. WEC3 is divided into two phases: Phase I consists of code-to-code verification and Phase II entails code-to-experiment validation. WEC3 focuses on mid-fidelity codes that simulate WECs using time-domain multibody dynamics methods to model device motions and hydrodynamic coefficients to model hydrodynamic forces. Consequently, high-fidelity numerical modelling tools, such as Navier-Stokes computational fluid dynamics simulation, and simple frequency-domain modelling tools were not included in the WEC3 project.

  8. Validation of a GPU-based Monte Carlo code (gPMC) for proton radiation therapy: clinical cases study.

    PubMed

    Giantsoudi, Drosoula; Schuemann, Jan; Jia, Xun; Dowdell, Stephen; Jiang, Steve; Paganetti, Harald

    2015-03-21

    Monte Carlo (MC) methods are recognized as the gold standard for dose calculation; however, they have not yet replaced analytical methods due to their lengthy calculation times. GPU-based applications allow MC dose calculations to be performed on time scales comparable to conventional analytical algorithms. This study focuses on validating our GPU-based MC code for proton dose calculation (gPMC) against an experimentally validated multi-purpose MC code (TOPAS) and on comparing their performance for clinical patient cases. Clinical cases from five treatment sites were selected, covering the full range from very homogeneous patient geometries (liver) to patients with high geometrical complexity (air cavities and density heterogeneities in head-and-neck and lung patients), and from short beam range (breast) to large beam range (prostate). Both gPMC and TOPAS were used to calculate 3D dose distributions for all patients. Comparisons were performed based on target coverage indices (mean dose, V95, D98, D50, D02) and gamma index distributions. Dosimetric indices differed by less than 2% between TOPAS and gPMC dose distributions for most cases. Gamma index analysis with a 1%/1 mm criterion resulted in a passing rate of more than 94% of all patient voxels receiving more than 10% of the mean target dose, for all patients except the prostate cases. Although clinically insignificant, gPMC systematically underestimated the target dose for prostate cases by 1-2% compared to TOPAS. Correspondingly, the gamma index analysis with the 1%/1 mm criterion failed for most beams for this site, while for a 2%/1 mm criterion passing rates of more than 94.6% of all patient voxels were observed. For the same initial number of simulated particles, the calculation time for a single beam of a typical head-and-neck patient plan decreased from 4 CPU hours per million particles (2.8-2.9 GHz Intel X5600) for TOPAS to 2.4 s per million particles (NVIDIA TESLA C2075) for gPMC. 
Excellent agreement was demonstrated between our fast GPU-based MC code (gPMC) and a previously extensively validated multi-purpose MC code (TOPAS) for a comprehensive set of clinical patient cases. This shows that MC dose calculations in proton therapy can be performed on time scales comparable to analytical algorithms with accuracy comparable to state-of-the-art CPU-based MC codes.
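
    The gamma index analysis used in this record can be illustrated with a simplified 1D global gamma computation (a brute-force sketch over discrete points, not the gPMC/TOPAS implementation; all data are illustrative):

```python
import math

def gamma_1d(ref_pos_mm, ref_dose, eval_pos_mm, eval_dose,
             dose_crit, dist_crit_mm):
    """Simplified 1D global gamma index: for each reference point,
    minimize the combined dose-difference/distance metric over the
    evaluated points. dose_crit is a fraction of the reference maximum."""
    d_norm = dose_crit * max(ref_dose)
    gammas = []
    for rp, rd in zip(ref_pos_mm, ref_dose):
        g = min(math.sqrt(((ep - rp) / dist_crit_mm) ** 2 +
                          ((ed - rd) / d_norm) ** 2)
                for ep, ed in zip(eval_pos_mm, eval_dose))
        gammas.append(g)
    return gammas

# Identical distributions pass trivially (gamma = 0 everywhere);
# the passing rate is the fraction of points with gamma <= 1.
pos = [0.0, 1.0, 2.0]
dose = [1.0, 0.8, 0.5]
g = gamma_1d(pos, dose, pos, dose, dose_crit=0.01, dist_crit_mm=1.0)
print(max(g))  # → 0.0
```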

  9. Preliminary results of 3D dose calculations with MCNP-4B code from a SPECT image.

    PubMed

    Rodríguez Gual, M; Lima, F F; Sospedra Alfonso, R; González González, J; Calderón Marín, C

    2004-01-01

    Interface software was developed to generate the input file to run the Monte Carlo MCNP-4B code from a medical image in Interfile format version 3.3. The software was tested using a spherical phantom of tomography slices with a known cumulated activity distribution in Interfile format, generated with the IMAGAMMA medical image processing system. The 3D dose calculation obtained with the Monte Carlo MCNP-4B code was compared with the voxel S factor method. The results show a relative error between the two methods of less than 1%.
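
    The voxel S factor method used as the comparison here computes the dose to each target voxel as the cumulated activity in every source voxel weighted by a precomputed S value. A minimal 1D sketch with a shift-invariant kernel (all numbers illustrative, not real S values):

```python
def voxel_s_dose(cumulated_activity, s_kernel):
    """MIRD-style voxel S-value dose: D(k) = sum_h A~(h) * S(k <- h),
    here in 1D with a kernel indexed by the voxel separation |k - h|."""
    n = len(cumulated_activity)
    doses = []
    for k in range(n):
        doses.append(sum(a * s_kernel[min(abs(k - h), len(s_kernel) - 1)]
                         for h, a in enumerate(cumulated_activity)))
    return doses

# Illustrative numbers: cumulated activity in MBq*s in three voxels,
# kernel in mGy/(MBq*s); all activity sits in the first voxel.
d = voxel_s_dose([10.0, 0.0, 0.0], [0.5, 0.1, 0.0])
print(d)  # → [5.0, 1.0, 0.0]
```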

  10. Dosimetric evaluation of nanotargeted (188)Re-liposome with the MIRDOSE3 and OLINDA/EXM programs.

    PubMed

    Chang, Chih-Hsien; Chang, Ya-Jen; Lee, Te-Wei; Ting, Gann; Chang, Kwo-Ping

    2012-06-01

    The OLINDA/EXM computer code was created as a replacement for the widely used MIRDOSE3 code for radiation dosimetry in nuclear medicine. A dosimetric analysis with these codes was performed to evaluate nanoliposomes as carriers of radionuclides ((188)Re-liposomes) in colon carcinoma-bearing mice. Pharmacokinetic data for (188)Re-N,N-bis(2-mercaptoethyl)-N',N'-diethylethylenediamine ((188)Re-BMEDA) and (188)Re-liposome were obtained for estimation of absorbed doses in normal organs. Radiation dose estimates for normal tissues were calculated using the MIRDOSE3 and OLINDA/EXM programs for a colon carcinoma solid tumor mouse model. Mean absorbed doses derived from (188)Re-BMEDA and (188)Re-liposome in normal tissues were generally similar as calculated by the MIRDOSE3 and OLINDA/EXM programs. One notable exception was red marrow, for which MIRDOSE3 yielded higher absorbed doses than OLINDA/EXM (1.53- and 1.60-fold for (188)Re-BMEDA and (188)Re-liposome, respectively). MIRDOSE3 and OLINDA give very similar residence times and organ doses. Bone marrow doses were estimated by designating cortical bone rather than bone marrow as the source organ. The bone marrow doses calculated by MIRDOSE3 are higher than those by OLINDA. If the bone marrow is designated as the source organ, the doses estimated by the MIRDOSE3 and OLINDA programs are very similar.

  11. Dose estimation for astronauts using dose conversion coefficients calculated with the PHITS code and the ICRP/ICRU adult reference computational phantoms.

    PubMed

    Sato, Tatsuhiko; Endo, Akira; Sihver, Lembit; Niita, Koji

    2011-03-01

    Absorbed-dose and dose-equivalent rates for astronauts were estimated by multiplying fluence-to-dose conversion coefficients, in units of Gy.cm(2) and Sv.cm(2), respectively, by cosmic-ray fluxes around spacecraft in units of cm(-2) s(-1). The dose conversion coefficients employed in the calculation were evaluated using the general-purpose particle and heavy ion transport code system PHITS coupled to the male and female adult reference computational phantoms, which were released as a common ICRP/ICRU publication. The cosmic-ray fluxes inside and near spacecraft were also calculated by PHITS, using simplified geometries. The accuracy of the obtained absorbed-dose and dose-equivalent rates was verified against various experimental data measured both inside and outside spacecraft. The calculations quantitatively show that the effective doses for astronauts are significantly greater than their corresponding effective dose equivalents, because of the numerical incompatibility between the radiation quality factors and the radiation weighting factors. These results demonstrate the usefulness of dose conversion coefficients in space dosimetry. © Springer-Verlag 2010
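
    The flux-times-coefficient estimate described in this record reduces to a sum over particle types and energy bins; a minimal sketch (the two-bin spectrum and coefficient values are illustrative, not PHITS output):

```python
def dose_equivalent_rate(fluxes_cm2_s, conv_coeffs_sv_cm2):
    """Dose-equivalent rate (Sv/s) as the sum, over particle types or
    energy bins, of flux (cm^-2 s^-1) times the fluence-to-dose
    conversion coefficient (Sv cm^2)."""
    return sum(phi * c for phi, c in zip(fluxes_cm2_s, conv_coeffs_sv_cm2))

# Illustrative two-bin cosmic-ray spectrum:
# fluxes 2.0 and 0.5 cm^-2 s^-1, coefficients 1e-10 and 4e-10 Sv cm^2.
rate = dose_equivalent_rate([2.0, 0.5], [1.0e-10, 4.0e-10])
print(rate)  # → 4e-10
```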

  12. What to do with a Dead Research Code

    NASA Astrophysics Data System (ADS)

    Nemiroff, Robert J.

    2016-01-01

    The project has ended -- should all of the computer codes that enabled the project be deleted? No. Like research papers, research codes typically carry valuable information past project end dates. Several possible end states to the life of research codes are reviewed. Historically, codes are typically left dormant on an increasingly obscure local disk directory until forgotten. These codes will likely become any or all of: lost, impossible to compile and run, difficult to decipher, and likely deleted when the code's proprietor moves on or dies. It is argued here, though, that it would be better for both code authors and astronomy generally if project codes were archived after use in some way. Archiving is advantageous for code authors because archived codes might increase the author's ADS citable publications, while astronomy as a science gains transparency and reproducibility. Paper-specific codes should be included in the publication of the journal papers they support, just like figures and tables. General codes that support multiple papers, possibly written by multiple authors, including their supporting websites, should be registered with a code registry such as the Astrophysics Source Code Library (ASCL). Codes developed on GitHub can be archived with a third party service such as, currently, BackHub. An important code version might be uploaded to a web archiving service like, currently, Zenodo or Figshare, so that this version receives a Digital Object Identifier (DOI), enabling it to be found at a stable address into the future. Similar archiving services that are not DOI-dependent include perma.cc and the Internet Archive Wayback Machine at archive.org. Perhaps most simply, copies of important codes with lasting value might be kept on a cloud service like, for example, Google Drive, while activating Google's Inactive Account Manager.

  13. Skyshine radiation from a pressurized water reactor containment dome

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peng, W.H.

    1986-06-01

    The radiation dose rates resulting from airborne activities inside a post-accident pressurized water reactor containment are calculated by a combined discrete ordinates/Monte Carlo method. The calculated total dose rates and the skyshine component are presented as a function of distance from the containment at three different elevations for various gamma-ray source energies. The one-dimensional discrete ordinates code ANISN is used to approximate the skyshine dose rates from the hemispherical dome, and the results compare favorably with more rigorous results calculated by a three-dimensional Monte Carlo code.

  14. Benchmarking and validation of a Geant4-SHADOW Monte Carlo simulation for dose calculations in microbeam radiation therapy.

    PubMed

    Cornelius, Iwan; Guatelli, Susanna; Fournier, Pauline; Crosbie, Jeffrey C; Sanchez Del Rio, Manuel; Bräuer-Krisch, Elke; Rosenfeld, Anatoly; Lerch, Michael

    2014-05-01

    Microbeam radiation therapy (MRT) is a synchrotron-based radiotherapy modality that uses high-intensity beams of spatially fractionated radiation to treat tumours. The rapid evolution of MRT towards clinical trials demands accurate treatment planning systems (TPS), as well as independent tools for the verification of TPS calculated dose distributions in order to ensure patient safety and treatment efficacy. Monte Carlo computer simulation represents the most accurate method of dose calculation in patient geometries and is best suited for the purpose of TPS verification. A Monte Carlo model of the ID17 biomedical beamline at the European Synchrotron Radiation Facility has been developed, including recent modifications, using the Geant4 Monte Carlo toolkit interfaced with the SHADOW X-ray optics and ray-tracing libraries. The code was benchmarked by simulating dose profiles in water-equivalent phantoms subject to irradiation by broad-beam (without spatial fractionation) and microbeam (with spatial fractionation) fields, and comparing against those calculated with a previous model of the beamline developed using the PENELOPE code. Validation against additional experimental dose profiles in water-equivalent phantoms subject to broad-beam irradiation was also performed. Good agreement between codes was observed, with the exception of out-of-field doses and toward the field edge for larger field sizes. Microbeam results showed good agreement between both codes and experimental results within uncertainties. Results of the experimental validation showed agreement for different beamline configurations. The asymmetry in the out-of-field dose profiles due to polarization effects was also investigated, yielding important information for the treatment planning process in MRT. This work represents an important step in the development of a Monte Carlo-based independent verification tool for treatment planning in MRT.

  15. Effect of two doses of ginkgo biloba extract (EGb 761) on the dual-coding test in elderly subjects.

    PubMed

    Allain, H; Raoul, P; Lieury, A; LeCoz, F; Gandon, J M; d'Arbigny, P

    1993-01-01

    The subjects of this double-blind study were 18 elderly men and women (mean age, 69.3 years) with slight age-related memory impairment. In a crossover-study design, each subject received placebo or an extract of Ginkgo biloba (EGb 761) (320 mg or 600 mg) 1 hour before performing a dual-coding test that measures the speed of information processing; the test consists of several coding series of drawings and words presented at decreasing times of 1920, 960, 480, 240, and 120 ms. The dual-coding phenomenon (a break point between coding verbal material and images) was demonstrated in all the tests. After placebo, the break point was observed at 960 ms and dual coding beginning at 1920 ms. After each dose of the ginkgo extract, the break point (at 480 ms) and dual coding (at 960 ms) were significantly shifted toward a shorter presentation time, indicating an improvement in the speed of information processing.

  16. Skin dose from radionuclide contamination on clothing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, D.C.; Hussein, E.M.A.; Yuen, P.S.

    1997-06-01

Skin dose due to radionuclide contamination on clothing is calculated by Monte Carlo simulation of electron and photon radiation transport. Contamination by a hot particle on selected cotton garment geometries is simulated. The effect of backscattering in the surrounding air is taken into account. For each source-clothing geometry combination, the dose distribution function in the skin, including the dose at tissue depths of 7 mg cm^-2 and 1000 mg cm^-2, is calculated by simulating monoenergetic photon and electron sources. Skin dose due to contamination by a radionuclide is then determined by proper weighting of the monoenergetic dose distribution functions. The results are compared with the VARSKIN point-kernel code for some radionuclides, indicating that the latter code tends to underestimate the dose for gamma and high-energy beta sources while it overestimates skin dose for low-energy beta sources. 13 refs., 4 figs., 2 tabs.
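The weighting step described above can be sketched in a few lines. This is a minimal illustration, not the authors' code: the kernel values, energy grid, and emission yields below are all hypothetical placeholders for the tabulated monoenergetic dose distribution functions and a nuclide's emission spectrum.

```python
import numpy as np

# Hypothetical monoenergetic dose kernels at a fixed skin depth:
# dose per emitted particle (arbitrary units) on an energy grid (MeV).
energies = np.array([0.1, 0.3, 0.5, 1.0, 2.0])
dose_per_particle = np.array([0.02, 0.15, 0.40, 0.85, 1.10])

# Hypothetical emission spectrum: particle yield per decay at each energy.
yield_per_decay = np.array([0.05, 0.20, 0.50, 0.20, 0.05])

# Dose per decay is the yield-weighted sum of the monoenergetic kernels.
dose_per_decay = float(np.dot(yield_per_decay, dose_per_particle))
```

A real calculation would interpolate the kernels onto the nuclide's full beta spectrum rather than a handful of discrete lines, but the weighting itself is exactly this dot product.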

  17. Continuous Codes and Standards Improvement (CCSI)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rivkin, Carl H; Burgess, Robert M; Buttner, William J

    2015-10-21

As of 2014, the majority of the codes and standards required to initially deploy hydrogen technologies infrastructure in the United States have been promulgated. These codes and standards will be field tested through their application to actual hydrogen technologies projects. Continuous codes and standards improvement (CCSI) is a process of identifying code issues that arise during project deployment and then developing code solutions to these issues. These solutions would typically be proposed amendments to codes and standards. The process is continuous because, as technology and the state of safety knowledge develop, there will be a need to monitor the application of codes and standards and improve them based on information gathered during their application. This paper will discuss code issues that have surfaced through hydrogen technologies infrastructure project deployment and potential code changes that would address these issues. The issues addressed include (1) setback distances for bulk hydrogen storage, (2) code-mandated hazard analyses, (3) sensor placement and communication, (4) the use of approved equipment, and (5) system monitoring and maintenance requirements.

  18. 3DHZETRN: Inhomogeneous Geometry Issues

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Slaba, Tony C.; Badavi, Francis F.

    2017-01-01

    Historical methods for assessing radiation exposure inside complicated geometries for space applications were limited by computational constraints and lack of knowledge associated with nuclear processes occurring over a broad range of particles and energies. Various methods were developed and utilized to simplify geometric representations and enable coupling with simplified but efficient particle transport codes. Recent transport code development efforts, leading to 3DHZETRN, now enable such approximate methods to be carefully assessed to determine if past exposure analyses and validation efforts based on those approximate methods need to be revisited. In this work, historical methods of representing inhomogeneous spacecraft geometry for radiation protection analysis are first reviewed. Two inhomogeneous geometry cases, previously studied with 3DHZETRN and Monte Carlo codes, are considered with various levels of geometric approximation. Fluence, dose, and dose equivalent values are computed in all cases and compared. It is found that although these historical geometry approximations can induce large errors in neutron fluences up to 100 MeV, errors on dose and dose equivalent are modest (<10%) for the cases studied here.

  19. An overview of new video coding tools under consideration for VP10: the successor to VP9

    NASA Astrophysics Data System (ADS)

    Mukherjee, Debargha; Su, Hui; Bankoski, James; Converse, Alex; Han, Jingning; Liu, Zoe; Xu, Yaowu

    2015-09-01

Google started an open-source project, entitled the WebM Project, in 2010 to develop royalty-free video codecs for the web. The present-generation codec developed in the WebM Project, called VP9, was finalized in mid-2013 and is currently being served extensively by YouTube, resulting in billions of views per day. Even though adoption of VP9 outside Google is still in its infancy, the WebM Project has already embarked on an ambitious project to develop a next-edition codec, VP10, that achieves at least a generational bit-rate reduction over the current-generation codec VP9. Although the project is still in its early stages, a set of new experimental coding tools has already been added to baseline VP9 to achieve modest coding gains over a large enough test set. This paper provides a technical overview of these coding tools.

  20. Improved neutron activation prediction code system development

    NASA Technical Reports Server (NTRS)

    Saqui, R. M.

    1971-01-01

    Two integrated neutron activation prediction code systems have been developed by modifying and integrating existing computer programs to perform the necessary computations to determine neutron induced activation gamma ray doses and dose rates in complex geometries. Each of the two systems is comprised of three computational modules. The first program module computes the spatial and energy distribution of the neutron flux from an input source and prepares input data for the second program which performs the reaction rate, decay chain and activation gamma source calculations. A third module then accepts input prepared by the second program to compute the cumulative gamma doses and/or dose rates at specified detector locations in complex, three-dimensional geometries.
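The three-module data flow described above can be sketched as a simple pipeline. This is a toy illustration with hypothetical interfaces and made-up physics constants, not the actual code system: module 1 produces a flux map, module 2 turns it into an activation gamma source, and module 3 folds that source into doses at detector locations.

```python
import math

def neutron_flux(source_strength, detector_radii):
    # Module 1 (toy): 1/r^2 flux from a point source at each detector radius (cm).
    return {name: source_strength / (4.0 * math.pi * r * r)
            for name, r in detector_radii.items()}

def activation_source(flux, cross_section, decay_factor):
    # Module 2 (toy): reaction rate times a decay-chain factor
    # gives an activation gamma source strength at each location.
    return {name: phi * cross_section * decay_factor for name, phi in flux.items()}

def gamma_dose(source, dose_factor):
    # Module 3 (toy): fold the gamma source into a dose at each detector.
    return {name: s * dose_factor for name, s in source.items()}

flux = neutron_flux(1e10, {"det1": 100.0, "det2": 200.0})
dose = gamma_dose(activation_source(flux, 1e-24, 0.5), 3e-7)
```

The point is only the staged hand-off: each module's output is the next module's input, which is how the integrated system chains flux, reaction-rate/decay, and dose calculations.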

  1. Hanford business structure for HANDI 2000 business management system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, D.

The Hanford Business Structure integrates the project's technical, schedule, and cost baselines; implements the use of a standard code of accounts; and streamlines performance reporting and cost collection. Technical requirements drive the technical functions and come from the RDD 100 database. The functions will be identified in the P3 scheduling system and also in the PeopleSoft system. Projects will break their work down from the technical requirements in the P3 schedules. When the level at which they want to track cost via the code of accounts is reached, a Project ID will be generated in the PeopleSoft system. P3 may carry more detailed schedules below the Project ID level. The standard code of accounts will identify discrete work activities done across the site and various projects. They will include direct and overhead type work scopes. Activities in P3 will roll up to this standard code of accounts. The field that will be used to record this in PeopleSoft is "Activity"; in Passport it is a user-defined field. It will have to be added to other feeder systems. Project ID and code of accounts are required fields on all cost records.
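The rollup described above can be illustrated with a small sketch. The record fields and account names below are hypothetical, not Hanford's actual schema: activities carrying a Project ID and a standard code of accounts are aggregated for cost collection.

```python
from collections import defaultdict

# Hypothetical activity records, each tagged with the two required fields:
# a Project ID and a standard code-of-accounts entry.
activities = [
    {"project_id": "PRJ-001", "code_of_accounts": "DIRECT-100", "cost": 1200.0},
    {"project_id": "PRJ-001", "code_of_accounts": "OVERHEAD-200", "cost": 300.0},
    {"project_id": "PRJ-002", "code_of_accounts": "DIRECT-100", "cost": 800.0},
]

# Roll detailed activities up to (Project ID, code of accounts) for reporting.
rollup = defaultdict(float)
for a in activities:
    rollup[(a["project_id"], a["code_of_accounts"])] += a["cost"]
```

Because both fields are required on every cost record, every detailed activity lands unambiguously in exactly one rollup bucket.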

  2. NASA. Lewis Research Center Advanced Modulation and Coding Project: Introduction and overview

    NASA Technical Reports Server (NTRS)

    Budinger, James M.

    1992-01-01

    The Advanced Modulation and Coding Project at LeRC is sponsored by the Office of Space Science and Applications, Communications Division, Code EC, at NASA Headquarters and conducted by the Digital Systems Technology Branch of the Space Electronics Division. Advanced Modulation and Coding is one of three focused technology development projects within the branch's overall Processing and Switching Program. The program consists of industry contracts for developing proof-of-concept (POC) and demonstration model hardware, university grants for analyzing advanced techniques, and in-house integration and testing of performance verification and systems evaluation. The Advanced Modulation and Coding Project is broken into five elements: (1) bandwidth- and power-efficient modems; (2) high-speed codecs; (3) digital modems; (4) multichannel demodulators; and (5) very high-data-rate modems. At least one contract and one grant were awarded for each element.

  3. Monte Carlo dose calculations in homogeneous media and at interfaces: a comparison between GEPTS, EGSnrc, MCNP, and measurements.

    PubMed

    Chibani, Omar; Li, X Allen

    2002-05-01

Three Monte Carlo photon/electron transport codes (GEPTS, EGSnrc, and MCNP) are benchmarked against dose measurements in homogeneous (both low- and high-Z) media as well as at interfaces. A brief overview of the physical models used by each code for photon and electron (positron) transport is given. Absolute calorimetric dose measurements for 0.5 and 1 MeV electron beams incident on homogeneous and multilayer media are compared with the predictions of the three codes. Comparison with dose measurements in two-layer media exposed to a 60Co gamma source is also performed. In addition, comparisons between the codes (including the EGS4 code) are done for (a) 0.05 to 10 MeV electron beams and positron point sources in lead, (b) high-energy photons (10 and 20 MeV) irradiating a multilayer phantom (water/steel/air), and (c) simulation of a 90Sr/90Y brachytherapy source. A good agreement is observed between the calorimetric electron dose measurements and predictions of GEPTS and EGSnrc in both homogeneous and multilayer media. MCNP outputs are found to be dependent on the energy-indexing method (Default/ITS style). This dependence is significant in homogeneous media as well as at interfaces. MCNP(ITS) fits the experimental data more closely than MCNP(DEF), except for the case of Be. At low energy (0.05 and 0.1 MeV), MCNP(ITS) dose distributions in lead show higher maximums in comparison with GEPTS and EGSnrc. EGS4 produces too-penetrating electron dose distributions in high-Z media, especially at low energy (<0.1 MeV). For positrons, differences between GEPTS and EGSnrc are observed in lead because GEPTS distinguishes positrons from electrons in both its elastic multiple scattering and bremsstrahlung emission models. For the 60Co source, quite good agreement between calculations and measurements is observed with regard to the experimental uncertainty. For the other cases (10 and 20 MeV photon sources and the 90Sr/90Y beta source), a good agreement is found between the three codes. In conclusion, differences between GEPTS and EGSnrc results are found to be very small for almost all media and energies studied. MCNP results depend significantly on the electron energy-indexing method.

  4. A GPU OpenCL based cross-platform Monte Carlo dose calculation engine (goMC)

    NASA Astrophysics Data System (ADS)

    Tian, Zhen; Shi, Feng; Folkerts, Michael; Qin, Nan; Jiang, Steve B.; Jia, Xun

    2015-09-01

Monte Carlo (MC) simulation has been recognized as the most accurate dose calculation method for radiotherapy. However, the extremely long computation time impedes its clinical application. Recently, a lot of effort has been made to realize fast MC dose calculation on graphic processing units (GPUs). However, most of the GPU-based MC dose engines have been developed under NVidia's CUDA environment. This limits the code portability to other platforms, hindering the introduction of GPU-based MC simulations to clinical practice. The objective of this paper is to develop a GPU OpenCL based cross-platform MC dose engine named goMC with coupled photon-electron simulation for external photon and electron radiotherapy in the MeV energy range. Compared to our previously developed GPU-based MC code named gDPM (Jia et al 2012 Phys. Med. Biol. 57 7783-97), goMC has two major differences. First, it was developed under the OpenCL environment for high code portability and hence could be run not only on different GPU cards but also on CPU platforms. Second, we adopted the electron transport model used in the EGSnrc MC package and PENELOPE's random hinge method in our new dose engine, instead of the dose planning method employed in gDPM. Dose distributions were calculated for a 15 MeV electron beam and a 6 MV photon beam in a homogeneous water phantom, a water-bone-lung-water slab phantom and a half-slab phantom. Satisfactory agreement between the two MC dose engines goMC and gDPM was observed in all cases. The average dose differences in the regions that received a dose higher than 10% of the maximum dose were 0.48-0.53% for the electron beam cases and 0.15-0.17% for the photon beam cases. In terms of efficiency, goMC was ~4-16% slower than gDPM when running on the same NVidia TITAN card for all the cases we tested, due to both the different electron transport models and the different development environments.
The code portability of our new dose engine goMC was validated by successfully running it on a variety of different computing devices including an NVidia GPU card, two AMD GPU cards and an Intel CPU processor. Computational efficiency among these platforms was compared.
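The comparison metric quoted above (average dose difference restricted to voxels receiving more than 10% of the maximum dose) can be sketched as follows. This is a generic illustration on synthetic dose grids, not the goMC/gDPM comparison code; the function name and threshold default are assumptions.

```python
import numpy as np

def avg_dose_diff_percent(dose_a, dose_b, threshold=0.10):
    """Mean |A - B| over voxels where dose_a exceeds threshold * max(dose_a),
    expressed as a percentage of the maximum dose."""
    dmax = dose_a.max()
    mask = dose_a > threshold * dmax
    return 100.0 * np.mean(np.abs(dose_a[mask] - dose_b[mask])) / dmax

# Synthetic 3D dose grids: engine B differs from engine A by small noise.
rng = np.random.default_rng(0)
dose_a = rng.random((20, 20, 20))
dose_b = dose_a + rng.normal(0.0, 0.002, dose_a.shape)
diff = avg_dose_diff_percent(dose_a, dose_b)
```

Restricting the mask to the high-dose region avoids inflating the metric with near-zero doses far outside the field, which is why sub-percent figures like 0.15-0.53% are meaningful.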

  5. A GPU OpenCL based cross-platform Monte Carlo dose calculation engine (goMC).

    PubMed

    Tian, Zhen; Shi, Feng; Folkerts, Michael; Qin, Nan; Jiang, Steve B; Jia, Xun

    2015-10-07

Monte Carlo (MC) simulation has been recognized as the most accurate dose calculation method for radiotherapy. However, the extremely long computation time impedes its clinical application. Recently, a lot of effort has been made to realize fast MC dose calculation on graphic processing units (GPUs). However, most of the GPU-based MC dose engines have been developed under NVidia's CUDA environment. This limits the code portability to other platforms, hindering the introduction of GPU-based MC simulations to clinical practice. The objective of this paper is to develop a GPU OpenCL based cross-platform MC dose engine named goMC with coupled photon-electron simulation for external photon and electron radiotherapy in the MeV energy range. Compared to our previously developed GPU-based MC code named gDPM (Jia et al 2012 Phys. Med. Biol. 57 7783-97), goMC has two major differences. First, it was developed under the OpenCL environment for high code portability and hence could be run not only on different GPU cards but also on CPU platforms. Second, we adopted the electron transport model used in the EGSnrc MC package and PENELOPE's random hinge method in our new dose engine, instead of the dose planning method employed in gDPM. Dose distributions were calculated for a 15 MeV electron beam and a 6 MV photon beam in a homogeneous water phantom, a water-bone-lung-water slab phantom and a half-slab phantom. Satisfactory agreement between the two MC dose engines goMC and gDPM was observed in all cases. The average dose differences in the regions that received a dose higher than 10% of the maximum dose were 0.48-0.53% for the electron beam cases and 0.15-0.17% for the photon beam cases. In terms of efficiency, goMC was ~4-16% slower than gDPM when running on the same NVidia TITAN card for all the cases we tested, due to both the different electron transport models and the different development environments.
The code portability of our new dose engine goMC was validated by successfully running it on a variety of different computing devices including an NVidia GPU card, two AMD GPU cards and an Intel CPU processor. Computational efficiency among these platforms was compared.

  6. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.

    1997-12-01

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  7. egs_brachy: a versatile and fast Monte Carlo code for brachytherapy

    NASA Astrophysics Data System (ADS)

    Chamberland, Marc J. P.; Taylor, Randle E. P.; Rogers, D. W. O.; Thomson, Rowan M.

    2016-12-01

egs_brachy is a versatile and fast Monte Carlo (MC) code for brachytherapy applications. It is based on the EGSnrc code system, enabling simulation of photons and electrons. Complex geometries are modelled using the EGSnrc C++ class library and egs_brachy includes a library of geometry models for many brachytherapy sources, in addition to eye plaques and applicators. Several simulation efficiency enhancing features are implemented in the code. egs_brachy is benchmarked by comparing TG-43 source parameters of three source models to previously published values. 3D dose distributions calculated with egs_brachy are also compared to ones obtained with the BrachyDose code. Well-defined simulations are used to characterize the effectiveness of many efficiency improving techniques, both as an indication of the usefulness of each technique and to find optimal strategies. Efficiencies and calculation times are characterized through single source simulations and simulations of idealized and typical treatments using various efficiency improving techniques. In general, egs_brachy shows agreement within uncertainties with previously published TG-43 source parameter values. 3D dose distributions from egs_brachy and BrachyDose agree at the sub-percent level. Efficiencies vary with radionuclide and source type, number of sources, phantom media, and voxel size. The combined effects of efficiency-improving techniques in egs_brachy lead to short calculation times: simulations approximating prostate and breast permanent implant (both with (2 mm)³ voxels) and eye plaque (with (1 mm)³ voxels) treatments take between 13 and 39 s, on a single 2.5 GHz Intel Xeon E5-2680 v3 processor core, to achieve 2% average statistical uncertainty on doses within the PTV. egs_brachy will be released as free and open source software to the research community.

  8. egs_brachy: a versatile and fast Monte Carlo code for brachytherapy.

    PubMed

    Chamberland, Marc J P; Taylor, Randle E P; Rogers, D W O; Thomson, Rowan M

    2016-12-07

egs_brachy is a versatile and fast Monte Carlo (MC) code for brachytherapy applications. It is based on the EGSnrc code system, enabling simulation of photons and electrons. Complex geometries are modelled using the EGSnrc C++ class library and egs_brachy includes a library of geometry models for many brachytherapy sources, in addition to eye plaques and applicators. Several simulation efficiency enhancing features are implemented in the code. egs_brachy is benchmarked by comparing TG-43 source parameters of three source models to previously published values. 3D dose distributions calculated with egs_brachy are also compared to ones obtained with the BrachyDose code. Well-defined simulations are used to characterize the effectiveness of many efficiency improving techniques, both as an indication of the usefulness of each technique and to find optimal strategies. Efficiencies and calculation times are characterized through single source simulations and simulations of idealized and typical treatments using various efficiency improving techniques. In general, egs_brachy shows agreement within uncertainties with previously published TG-43 source parameter values. 3D dose distributions from egs_brachy and BrachyDose agree at the sub-percent level. Efficiencies vary with radionuclide and source type, number of sources, phantom media, and voxel size. The combined effects of efficiency-improving techniques in egs_brachy lead to short calculation times: simulations approximating prostate and breast permanent implant (both with (2 mm)³ voxels) and eye plaque (with (1 mm)³ voxels) treatments take between 13 and 39 s, on a single 2.5 GHz Intel Xeon E5-2680 v3 processor core, to achieve 2% average statistical uncertainty on doses within the PTV. egs_brachy will be released as free and open source software to the research community.

  9. Cohort-specific imputation of gene expression improves prediction of warfarin dose for African Americans.

    PubMed

    Gottlieb, Assaf; Daneshjou, Roxana; DeGorter, Marianne; Bourgeois, Stephane; Svensson, Peter J; Wadelius, Mia; Deloukas, Panos; Montgomery, Stephen B; Altman, Russ B

    2017-11-24

Genome-wide association studies are useful for discovering genotype-phenotype associations but are limited because they require large cohorts to identify a signal, which can be population-specific. Mapping genetic variation to genes improves power and allows the effects of both protein-coding variation and variation in expression to be combined into "gene level" effects. Previous work has shown that warfarin dose can be predicted using information from genetic variation that affects protein-coding regions. Here, we introduce a method that improves dose prediction by integrating tissue-specific gene expression. In particular, we use drug pathways and expression quantitative trait loci knowledge to impute gene expression, on the assumption that differential expression of key pathway genes may impact dose requirement. We focus on 116 genes from the pharmacokinetic and pharmacodynamic pathways of warfarin within training and validation sets comprising both European and African-descent individuals. We build gene-tissue signatures associated with warfarin dose in a cohort-specific manner and identify a signature of 11 gene-tissue pairs that significantly augments the International Warfarin Pharmacogenetics Consortium dosage-prediction algorithm in both populations. Our results demonstrate that imputed expression can improve dose prediction and bridge population-specific compositions. MATLAB code is available at https://github.com/assafgo/warfarin-cohort.
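The augmentation idea above (adding imputed expression features to a baseline dose regression and checking whether explained variance improves) can be sketched on synthetic data. This is not the published model or its MATLAB code; the feature counts, coefficients, and noise level below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
genotype = rng.random((n, 3))      # baseline covariates (stand-in for genotypes)
expression = rng.random((n, 2))    # imputed gene-tissue expression features

# Synthetic dose that truly depends on both feature sets, plus noise.
dose = (genotype @ np.array([2.0, -1.0, 0.5])
        + expression @ np.array([1.5, -0.8])
        + rng.normal(0.0, 0.05, n))

def r_squared(X, y):
    # Ordinary least squares with an intercept; return coefficient of determination.
    A = np.c_[np.ones(len(X)), X]
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    return 1.0 - resid.var() / y.var()

baseline_r2 = r_squared(genotype, dose)
augmented_r2 = r_squared(np.c_[genotype, expression], dose)
```

When dose genuinely depends on expression, the augmented model recovers more variance, which is the pattern the gene-tissue signature exploits.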

  10. TU-AB-BRC-10: Modeling of Radiotherapy Linac Source Terms Using ARCHER Monte Carlo Code: Performance Comparison of GPU and MIC Computing Accelerators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, T; Lin, H; Xu, X

Purpose: (1) To perform phase space (PS) based source modeling for Tomotherapy and Varian TrueBeam 6 MV Linacs, (2) to examine the accuracy and performance of the ARCHER Monte Carlo code on a heterogeneous computing platform with Many Integrated Core coprocessors (MIC, aka Xeon Phi) and GPUs, and (3) to explore software micro-optimization methods. Methods: The patient-specific source of the Tomotherapy and Varian TrueBeam Linacs was modeled using the PS approach. For the helical Tomotherapy case, the PS data were calculated in our previous study (Su et al. 2014 41(7) Medical Physics). For the single-view Varian TrueBeam case, we analytically derived them from the raw patient-independent PS data in IAEA's database, partial geometry information of the jaw and MLC, and the fluence map. The phantom was generated from DICOM images. The Monte Carlo simulation was performed by the ARCHER-MIC and GPU codes, which were benchmarked against a modified parallel DPM code. Software micro-optimization was systematically conducted, focused on SIMD vectorization of tight for-loops and data prefetch, with the ultimate goal of increasing 512-bit register utilization and reducing memory access latency. Results: Dose calculation was performed for two clinical cases, a Tomotherapy-based prostate cancer treatment and a TrueBeam-based left breast treatment. ARCHER was verified against the DPM code. The statistical uncertainty of the dose to the PTV was less than 1%. Using double precision, the total wall time of the multithreaded CPU code on an X5650 CPU was 339 seconds for the Tomotherapy case and 131 seconds for the TrueBeam case, while on three 5110P MICs it was reduced to 79 and 59 seconds, respectively. The single-precision GPU code on a K40 GPU took 45 seconds for the Tomotherapy dose calculation. Conclusion: We have extended ARCHER, the MIC and GPU-based Monte Carlo dose engine, to Tomotherapy and TrueBeam dose calculations.

  11. Computational Transport Modeling of High-Energy Neutrons Found in the Space Environment

    NASA Technical Reports Server (NTRS)

    Cox, Brad; Theriot, Corey A.; Rohde, Larry H.; Wu, Honglu

    2012-01-01

The high charge and high energy (HZE) particle radiation environment in space interacts with spacecraft materials and the human body to create a population of neutrons encompassing a broad kinetic energy spectrum. As an HZE ion penetrates matter, there is an increasing chance of fragmentation as penetration depth increases. When an ion fragments, secondary neutrons are released with velocities up to that of the primary ion, giving some neutrons very long penetration ranges. These secondary neutrons have a high relative biological effectiveness, are difficult to shield effectively, and can cause more biological damage than the primary ions in some scenarios. Ground-based irradiation experiments that simulate the space radiation environment must account for this spectrum of neutrons. Using the Particle and Heavy Ion Transport Code System (PHITS), it is possible to simulate a neutron environment that is characteristic of that found in spaceflight. Considering neutron dosimetry, the focus lies on the broad spectrum of recoil protons that are produced in biological targets. In a biological target, dose at a certain penetration depth is primarily dependent upon recoil proton tracks. The PHITS code can be used to simulate a broad-energy neutron spectrum traversing biological targets, and it accounts for the recoil particle population. This project focuses on modeling a neutron beamline irradiation scenario for determining dose at increasing depth in water targets. Energy-deposition events and particle fluence can be simulated by establishing cross-sectional scoring routines at different depths in a target. This type of model is useful for correlating theoretical data with actual beamline radiobiology experiments. Other work exposed human fibroblast cells to a high-energy neutron source to study micronuclei induction in cells at increasing depth behind water shielding. Those findings provide supporting data describing dose vs. depth across a water-equivalent medium.
This poster presents PHITS data suggesting an increase in dose, up to roughly 10 cm depth, followed by a continual decrease as neutrons come to a stop in the target.
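The depth-binned scoring described above can be sketched with a toy model. This is not PHITS: the depth distribution below is an invented stand-in (a gamma distribution whose mode sits near 10 cm, mimicking the reported build-up and fall-off), and the energy depositions are random placeholders for recoil-proton events.

```python
import numpy as np

rng = np.random.default_rng(2)
n_events = 100_000

# Hypothetical energy-deposition events in a water target:
# depths (cm) drawn so the deposition density peaks near 10 cm,
# with an illustrative deposited energy (MeV) per event.
depths = rng.gamma(shape=3.0, scale=5.0, size=n_events)
energy = rng.exponential(scale=1.0, size=n_events)

# Cross-sectional scoring: sum deposited energy in 1 cm depth bins.
bins = np.arange(0.0, 31.0, 1.0)
dose_per_bin, _ = np.histogram(depths, bins=bins, weights=energy)
```

The weighted histogram is the essence of a depth-dose tally: each scoring plane (bin) accumulates the energy deposited there, and the resulting curve rises to a maximum before falling off, as in the poster data.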

  12. FLUKA simulation of TEPC response to cosmic radiation.

    PubMed

    Beck, P; Ferrari, A; Pelliccioni, M; Rollet, S; Villari, R

    2005-01-01

    The aircrew exposure to cosmic radiation can be assessed by calculation with codes validated by measurements. However, the relationship between doses in the free atmosphere, as calculated by the codes and from results of measurements performed within the aircraft, is still unclear. The response of a tissue-equivalent proportional counter (TEPC) has already been simulated successfully by the Monte Carlo transport code FLUKA. Absorbed dose rate and ambient dose equivalent rate distributions as functions of lineal energy have been simulated for several reference sources and mixed radiation fields. The agreement between simulation and measurements has been well demonstrated. In order to evaluate the influence of aircraft structures on aircrew exposure assessment, the response of TEPC in the free atmosphere and on-board is now simulated. The calculated results are discussed and compared with other calculations and measurements.

  13. SU-E-T-37: A GPU-Based Pencil Beam Algorithm for Dose Calculations in Proton Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kalantzis, G; Leventouri, T; Tachibana, H

Purpose: Recent developments in radiation therapy have been focused on applications of charged particles, especially protons. Over the years several dose calculation methods have been proposed in proton therapy. A common characteristic of all these methods is their extensive computational burden. In the current study we present, to our best knowledge for the first time, a GPU-based pencil beam algorithm (PBA) for proton dose calculations in Matlab. Methods: In the current study we employed an analytical expression for the proton depth-dose distribution. The central-axis term is taken from the broad-beam central-axis depth dose in water modified by an inverse-square correction, while the distribution of the off-axis term was considered Gaussian. The serial code was implemented in MATLAB and was launched on a desktop with a quad-core Intel Xeon X5550 at 2.67 GHz with 8 GB of RAM. For the parallelization on the GPU, the parallel computing toolbox was employed and the code was launched on a GTX 770 with Kepler architecture. The performance comparison was established on the speedup factors. Results: The performance of the GPU code was evaluated for three different energies: low (50 MeV), medium (100 MeV) and high (150 MeV). Four square fields were selected for each energy, and the dose calculations were performed with both the serial and parallel codes for a homogeneous water phantom with size 300×300×300 mm³. The resolution of the PBs was set to 1.0 mm. The maximum speedup of ∼127 was achieved for the highest energy and the largest field size. Conclusion: A GPU-based PB algorithm for proton dose calculations in Matlab was presented. A maximum speedup of ∼127 was achieved. Future directions of the current work include extension of our method for dose calculation in heterogeneous phantoms.
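The dose model described in the Methods can be sketched as a product of three terms. This is a hedged illustration, not the authors' MATLAB code: the depth-dose curve, SSD, and Gaussian width below are invented placeholders, and the crude Bragg-peak bump stands in for a measured broad-beam central-axis curve.

```python
import numpy as np

def depth_dose(z, peak=150.0):
    # Toy stand-in for a measured central-axis depth-dose curve in water,
    # with a Bragg-peak-like bump at depth `peak` (mm).
    return 1.0 + 3.0 * np.exp(-((z - peak) ** 2) / (2.0 * 10.0 ** 2))

def pencil_beam_dose(x, y, z, ssd=1000.0, sigma=5.0):
    """Toy pencil-beam dose at (x, y) mm off-axis and depth z mm:
    central-axis term * inverse-square correction * Gaussian off-axis term."""
    central_axis = depth_dose(z)
    inv_square = (ssd / (ssd + z)) ** 2
    r2 = x * x + y * y
    off_axis = np.exp(-r2 / (2.0 * sigma ** 2)) / (2.0 * np.pi * sigma ** 2)
    return central_axis * inv_square * off_axis

dose_axis = pencil_beam_dose(0.0, 0.0, 150.0)   # on the pencil-beam axis
dose_off = pencil_beam_dose(10.0, 0.0, 150.0)   # 10 mm off-axis, same depth
```

A full field dose is then the sum of such pencil beams over the field grid, which is also why the computation parallelizes so well on a GPU: each voxel/beam contribution is independent.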

  14. 77 FR 42654 - Trifloxystrobin; Pesticide Tolerance

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-20

    ... code 112). Food manufacturing (NAICS code 311). Pesticide manufacturing (NAICS code 32532). This... filing. III. Aggregate Risk Assessment and Determination of Safety Section 408(b)(2)(A)(i) of FFDCA... dose at which adverse effects of concern are identified (the LOAEL). Uncertainty/safety factors are...

  15. Physical models, cross sections, and numerical approximations used in MCNP and GEANT4 Monte Carlo codes for photon and electron absorbed fraction calculation.

    PubMed

    Yoriyaz, Hélio; Moralles, Maurício; Siqueira, Paulo de Tarso Dalledone; Guimarães, Carla da Costa; Cintra, Felipe Belonsi; dos Santos, Adimir

    2009-11-01

Radiopharmaceutical applications in nuclear medicine require a detailed dosimetric estimate of the radiation energy delivered to human tissues. Over the past years, several publications have addressed the problem of internal dose estimation in volumes of several sizes, considering photon and electron sources. Most of them used Monte Carlo radiation transport codes. Despite the widespread use of these codes, due to the variety of resources and capabilities they offer for dose calculations, several aspects such as physical models, cross sections, and numerical approximations used in the simulations still remain objects of study. An accurate dose estimate depends on the correct selection of a set of simulation options that should be carefully chosen. This article presents an analysis of several simulation options provided by two of the most widely used codes worldwide: MCNP and GEANT4. For this purpose, comparisons of absorbed fraction estimates obtained with different physical models, cross sections, and numerical approximations are presented for spheres of several sizes composed of five different biological tissues. Considerable discrepancies have been found in some cases, not only between the different codes but also between different cross sections and algorithms in the same code. The maximum differences found between the two codes are 5.0% and 10% for photons and electrons, respectively. Even for problems as simple as spheres and uniform radiation sources, the set of parameters chosen by any Monte Carlo code significantly affects the final results of a simulation, demonstrating the importance of the correct choice of parameters.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Safigholi, Habib; Meigooni, A S.; University of Nevada Las Vegas

Purpose: Recently, different applicators have been designed for the treatment of skin cancers, such as those of the scalp and legs, using Ir-192 HDR brachytherapy sources (IR-HDRS), miniature electronic brachytherapy sources (MEBXS), and external electron beam radiation therapy (EEBRT). Although all of these methodologies may deliver the desired radiation dose to the skin, the dose to the underlying bone may become the limiting factor for selection of the optimum treatment technique. In this project the radiation dose delivered to the underlying bone has been evaluated as a function of the radiation source and the thickness of the underlying bone. Methods: MC simulations were performed using the MCNP5 code. In these simulations, mono-energetic and non-divergent photon beams of 30 keV, 50 keV, and 70 keV for MEBXS, 380 keV photons for IR-HDRS, and a 6 MeV mono-energetic electron beam for EEBRT were modeled. A 0.5 cm thick soft-tissue layer (0.3 cm skin and 0.2 cm adipose) with underlying 0.5 cm cortical bone, followed by 14 cm of soft tissue, was utilized for the simulations. Results: Dose values to bone tissue as a function of beam energy and beam type, for delivery of 5000 cGy to the skin, were compared. These results indicate that for delivery of 5000 cGy to the skin surface with the 30 keV, 50 keV, and 70 keV MEBXS, IR-HDRS, and EEBRT techniques, bone will receive 31750 cGy, 27450 cGy, 18550 cGy, 4875 cGy, and 10450 cGy, respectively. Conclusion: The results of these investigations indicate that, for delivery of the same skin dose, the average doses received by the underlying bone are 5.2 and 2.2 times larger with the 50 keV MEBXS and EEBRT techniques, respectively, than with IR-HDRS.

  17. Simultaneous determination of equivalent dose to organs and tissues of the patient and of the physician in interventional radiology using the Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Bozkurt, A.; Bor, D.

    2007-01-01

This study presents the results of computations of organ equivalent doses and effective doses for the patient and the primary physician during an interventional cardiological examination. The simulations were carried out for seven x-ray spectra (between 60 kVp and 120 kVp) using the Monte Carlo code MCNP. The voxel-based whole-body model VIP-Man was employed to represent both the patient and the physician, the former lying on the operation table and the latter standing 15 cm from the patient, at about waist level, behind a lead apron. The x-rays, generated by a point source positioned around the table and directed with a conical distribution, irradiated the patient's heart under five major projections used in a coronary angiography examination. The mean effective doses under LAO45, PA, RAO30, LAO45/CAUD30 and LLAT irradiation conditions were calculated as 0.092, 0.163, 0.161, 0.133 and 0.118 mSv/(Gy cm²) for the patient and 1.153, 0.159, 0.145, 0.164 and 0.027 μSv/(Gy cm²) for the shielded physician. The effective doses for the patient determined in this study were usually lower than literature data obtained through measurements and/or calculations; the discrepancies could be attributed to the fact that this study computes effective doses specific to the VIP-Man body model, which lacks an ovarian contribution to the gonadal equivalent dose. The effective doses for the physician agreed reasonably well with the literature data.
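The conversion coefficients above are normalized to dose-area product (DAP), so an effective-dose estimate is just coefficient × DAP. A minimal sketch using the patient coefficients from the abstract, with a hypothetical DAP split evenly across the five projections (the 20 Gy·cm² workload is invented for illustration):

```python
# Effective-dose conversion coefficients per projection, from the study:
# patient values in mSv per Gy·cm² of dose-area product.
PATIENT = {"LAO45": 0.092, "PA": 0.163, "RAO30": 0.161,
           "LAO45/CAUD30": 0.133, "LLAT": 0.118}

def effective_dose(dap_gycm2, coefficients):
    """Effective dose per projection = conversion coefficient x DAP."""
    return {proj: c * dap_gycm2 for proj, c in coefficients.items()}

# Hypothetical 20 Gy·cm² of fluoroscopy split evenly over the five projections.
dap_per_projection = 20.0 / len(PATIENT)
doses = effective_dose(dap_per_projection, PATIENT)   # mSv per projection
total_mSv = sum(doses.values())
```

The same pattern applies to the physician coefficients (in µSv per Gy·cm²); only the dictionary of coefficients changes.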

  18. TU-EF-204-09: A Preliminary Method of Risk-Informed Optimization of Tube Current Modulation for Dose Reduction in CT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, Y; Liu, B; Kalra, M

Purpose: X-rays from CT scans can increase cancer risk to patients. The Lifetime Attributable Risk of Cancer Incidence for adult patients has been investigated and shown to decrease with patient age; however, a newer risk model shows an increasing risk trend for several radiosensitive organs in middle-aged patients. This study investigates the feasibility of a general method for optimizing tube current modulation (TCM) functions to minimize risk by reducing radiation dose to radiosensitive organs. Methods: Organ-based TCM has been investigated in the literature for eye lens dose and breast dose. Adopting the concept of organ-based TCM, this study seeks an optimized tube current that minimizes the total risk to breasts and lungs by reducing dose to these organs. The contribution of each CT view to organ dose is determined through view-by-view simulation of the CT scan using a GPU-based fast Monte Carlo code, ARCHER. A linear programming problem is established for tube current optimization, with the Monte Carlo results as weighting factors at each view. A pre-determined dose is used as the upper dose boundary, and the tube current of each view is optimized to minimize the total risk. Results: An optimized tube current was found to minimize the total risk to lungs and breasts: compared to a fixed current, the risk is reduced by 13%, with breast dose reduced by 38% and lung dose reduced by 7%. The average tube current is maintained during optimization to preserve image quality. In addition, dose to other organs in the chest region is only slightly affected, with relative changes in dose smaller than 10%. Conclusion: Optimized tube current plans can be generated to minimize cancer risk to lungs and breasts while maintaining image quality. In the future, various risk models and a greater number of projections per rotation will be simulated on phantoms of different genders and ages. National Institutes of Health R01EB015478.
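The view-by-view optimization can be posed as a small linear program: minimize risk-weighted dose subject to a fixed average tube current. A sketch under invented assumptions (the real per-view weights come from the ARCHER Monte Carlo runs; here a synthetic cosine modulation stands in for the anterior, breast-facing views, and the current limits are arbitrary):

```python
import numpy as np
from scipy.optimize import linprog

n_views = 24  # projections per rotation (illustrative)
angles = np.linspace(0.0, 2.0 * np.pi, n_views, endpoint=False)
# Hypothetical MC-derived weights: organ dose per unit tube current per view.
w_breast = 1.0 + 0.8 * np.cos(angles)   # anterior views dose the breasts most
w_lung = np.ones(n_views)
risk = 0.7 * w_breast + 0.3 * w_lung    # risk-weighted dose per unit current

# Minimize total risk subject to: average current preserved (a proxy for
# image quality) and per-view hardware limits on the tube current.
i_mean, i_min, i_max = 100.0, 10.0, 200.0
res = linprog(c=risk,
              A_eq=np.ones((1, n_views)), b_eq=[n_views * i_mean],
              bounds=[(i_min, i_max)] * n_views)
tcm = res.x  # optimized mA per view
```

The solver drives the highest-risk (anterior) views toward the lower bound and compensates on low-risk views, which is exactly the dose-shifting behavior reported in the abstract.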

  19. MONDO: A tracker for the characterization of secondary fast and ultrafast neutrons emitted in particle therapy

    NASA Astrophysics Data System (ADS)

    Mirabelli, R.; Battistoni, G.; Giacometti, V.; Patera, V.; Pinci, D.; Sarti, A.; Sciubba, A.; Traini, G.; Marafini, M.

    2018-01-01

In Particle Therapy (PT), accelerated charged particles and light ions are used for treating tumors. One of the main limitations on the precision of PT is the emission of secondary particles due to the beam interaction with the patient: secondary neutrons can release a significant dose far from the tumor. Therefore, a precise characterization of their flux, production energy and angular distribution is needed in order to improve Treatment Planning System (TPS) codes. The principal aim of the MONDO (MOnitor for Neutron Dose in hadrOntherapy) project is the development of a tracking device optimized for the detection of fast and ultra-fast secondary neutrons emitted in PT. The detector consists of a matrix of scintillating square fibres coupled with a CMOS-based readout. Here, we present the characterization of the detector tracker prototype and of the CMOS-based digital SPAD (Single Photon Avalanche Diode) array sensor, tested with protons at the Beam Test Facility (Frascati, Italy) and at the Proton Therapy Centre (Trento, Italy), respectively.

  20. Intercoder Reliability of Mapping Between Pharmaceutical Dose Forms in the German Medication Plan and EDQM Standard Terms.

    PubMed

    Sass, Julian; Becker, Kim; Ludmann, Dominik; Pantazoglou, Elisabeth; Dewenter, Heike; Thun, Sylvia

    2018-01-01

A nationally uniform medication plan has recently become part of German legislation. The specification for the German medication plan was developed in cooperation between various stakeholders of the healthcare system. Its goal is to enhance usability and interoperability while also providing patients and physicians with the information they require for safe, high-quality therapy. Within the research and development project named Medication Plan PLUS, the specification of the medication plan was tested and reviewed for semantic interoperability in particular. In this study, the list of pharmaceutical dose forms provided in the specification was mapped to the standard terms of the European Directorate for the Quality of Medicines & HealthCare (EDQM) by different coders. The level of agreement between coders was calculated using Cohen's kappa (κ). Results show that less than half of the dose forms could be coded with EDQM standard terms. In addition, kappa was found to be moderate, indicating rather unconvincing agreement among coders. In conclusion, there is still vast room for improvement in the utilization of standardized international vocabulary, and unused potential for cross-border eHealth implementations in the future.
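Cohen's kappa corrects the observed agreement between two coders for the agreement expected by chance: κ = (p_o − p_e) / (1 − p_e). A self-contained sketch with two invented coders (the dose-form labels below are illustrative, not the study's data):

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders' label sequences."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed agreement: fraction of items both coders labelled identically.
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement: product of each coder's marginal label frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (p_o - p_e) / (1.0 - p_e)

# Two hypothetical coders mapping ten dose forms to EDQM-style terms.
a = ["tablet", "capsule", "tablet", "solution", "tablet",
     "capsule", "solution", "tablet", "capsule", "tablet"]
b = ["tablet", "capsule", "capsule", "solution", "tablet",
     "tablet", "solution", "tablet", "capsule", "solution"]
kappa = cohens_kappa(a, b)
```

With 7/10 raw agreement but uneven marginals, κ lands in the conventional "moderate" band (0.41-0.60), the same band the study reports.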

  1. Monte Carlo modeling of a conventional X-ray computed tomography scanner for gel dosimetry purposes.

    PubMed

    Hayati, Homa; Mesbahi, Asghar; Nazarpoor, Mahmood

    2016-01-01

Our purpose in the current study was to model an X-ray CT scanner with the Monte Carlo (MC) method for gel dosimetry. A conventional CT scanner with a single detector array was modeled using the MCNPX MC code. The MC-calculated photon fluence in the detector arrays was used for image reconstruction of a simple water phantom as well as of a polyacrylamide polymer gel (PAG) used for radiation therapy. Image reconstruction was performed with the filtered back-projection method using a Hann filter and spline interpolation. Using the MC results, we obtained the dose-response curve for images of the irradiated gel at different absorbed doses. A spatial resolution of about 2 mm was found for our simulated MC model. The MC-based CT images of the PAG gel showed a reliable increase in CT number with increasing absorbed dose. Our results also showed that the current MC model of a CT scanner can be used for further studies of the parameters that influence the usability and reliability of results, such as photon energy spectra and exposure techniques, in X-ray CT gel dosimetry.
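The filtering step of the reconstruction (a ramp filter rolled off by a Hann window, applied per projection in the frequency domain) can be sketched as follows. Cutoff and normalization conventions vary between implementations, so treat this as an illustrative form rather than the study's exact filter:

```python
import numpy as np

def hann_ramp_filter(projection):
    """Filter one parallel-beam projection with a Hann-windowed ramp filter
    in the frequency domain (the filtering step of filtered back-projection)."""
    n = projection.shape[-1]
    freqs = np.fft.fftfreq(n)                         # cycles/sample in [-0.5, 0.5)
    ramp = np.abs(freqs)                              # Ram-Lak ramp |f|
    hann = 0.5 * (1.0 + np.cos(2.0 * np.pi * freqs))  # Hann roll-off, 0 at |f|=0.5
    return np.real(np.fft.ifft(np.fft.fft(projection) * ramp * hann))

# After filtering every row of the sinogram, back-projecting the filtered
# rows over their acquisition angles yields the reconstructed slice.
filtered = hann_ramp_filter(np.ones(64))
```

Because the ramp is zero at zero frequency, a flat projection filters to zero and every filtered projection is zero-mean; the Hann window suppresses the high frequencies where CT noise dominates.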

  2. The Los Alamos Supernova Light Curve Project: Current Projects and Future Directions

    NASA Astrophysics Data System (ADS)

    Wiggins, Brandon Kerry; Los Alamos Supernovae Research Group

    2015-01-01

The Los Alamos Supernova Light Curve Project models supernovae in the ancient and modern universe to determine the luminosities and observability of certain supernova events and to explore the physics of supernovae in the local universe. The project utilizes RAGE, Los Alamos' radiation hydrodynamics code, to evolve the explosions of progenitors prepared in well-established stellar evolution codes. RAGE allows us to capture events such as shock breakout and collisions of ejecta with shells of material, which cannot be modeled well in other codes. RAGE's dumps are then ported to LANL's SPECTRUM code, which uses LANL's OPLIB opacity database to calculate light curves and spectra. In this paper, we summarize our recent work in modeling supernovae.

  3. Cellular dosimetry calculations for Strontium-90 using Monte Carlo code PENELOPE.

    PubMed

    Hocine, Nora; Farlay, Delphine; Boivin, Georges; Franck, Didier; Agarande, Michelle

    2014-11-01

To improve risk assessments associated with chronic exposure to Strontium-90 (Sr-90), for both the environment and human health, it is necessary to know the energy distribution in specific cells or tissues. Monte Carlo (MC) simulation codes are extremely useful tools for calculating deposited energy. The present work focused on the validation of the MC code PENetration and Energy LOss of Positrons and Electrons (PENELOPE) and the assessment of the dose distribution to bone marrow cells from a point Sr-90 source localized within the cortical bone. S-values (absorbed dose per unit cumulated activity) were calculated by Monte Carlo simulation using PENELOPE and Monte Carlo N-Particle eXtended (MCNPX). The cytoplasm, nucleus, cell surface, mouse femur bone and Sr-90 radiation source were simulated. Cells were assumed to be spherical, with the radii of the cell and cell nucleus ranging from 2 to 10 μm. The Sr-90 source was assumed to be uniformly distributed in the cell nucleus, cytoplasm or cell surface. The S-values calculated with PENELOPE agreed very well with the MCNPX results and the Medical Internal Radiation Dose (MIRD) values, with relative deviations of less than 4.5%. The dose distribution to mouse bone marrow cells showed that the cells localized near the cortical part received the maximum dose. The MC code PENELOPE may prove useful for cellular dosimetry involving radiation transport through materials other than water, or for complex distributions of radionuclides and geometries.
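The S-value quoted above (absorbed dose per unit cumulated activity) reduces, for a given source-target pair, to S = Σᵢ Δᵢ·φᵢ / m, where Δᵢ is the mean energy emitted per decay for emission i, φᵢ the absorbed fraction in the target, and m the target mass. A sketch with placeholder emission data (the numbers below are illustrative, not evaluated Sr-90 data):

```python
MEV_TO_J = 1.602176634e-13  # exact conversion, J per MeV

def s_value(emissions, mass_kg):
    """S-value in Gy per decay (= Gy per Bq·s of cumulated activity).
    emissions: list of (mean energy per decay in MeV, absorbed fraction)."""
    return sum(e_mev * MEV_TO_J * phi for e_mev, phi in emissions) / mass_kg

# Illustrative target: a 10-um-radius cell nucleus at unit density,
# mass = (4/3)*pi*(10 um)^3 * 1 g/cm^3 ≈ 4.19e-12 kg.
nucleus_mass = 4.19e-12
# One hypothetical beta branch: 0.196 MeV mean energy, 2% absorbed in nucleus.
s = s_value([(0.196, 0.02)], nucleus_mass)
```

Scaling behaviors follow directly from the formula: halving the target mass doubles S, and adding emission branches adds their contributions linearly.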

  4. Comparison of calculated beta- and gamma-ray doses after the Fukushima accident with data from single-grain luminescence retrospective dosimetry of quartz inclusions in a brick sample

    PubMed Central

    Endo, Satoru; Fujii, Keisuke; Kajimoto, Tsuyoshi; Tanaka, Kenichi; Stepanenko, Valeriy; Kolyzhenkov, Timofey; Petukhov, Aleksey; Akhmedova, Umukusum; Bogacheva, Viktoriia

    2018-01-01

To estimate the beta- and gamma-ray doses in a brick sample taken from Odaka, Minami-Soma City, Fukushima Prefecture, Japan, a Monte Carlo calculation was performed with the Particle and Heavy Ion Transport code System (PHITS). The calculated results were compared with data obtained by single-grain retrospective luminescence dosimetry of quartz inclusions in the brick sample. The calculated result agreed well with the measured data. The dose increase measured at the brick surface was explained by the beta-ray contribution, and the slight slope in the dose profile deeper in the brick was due to the gamma-ray contribution. The skin dose was estimated from the calculated result as 164 mGy over 3 years at the sampling site. PMID:29385528

  5. Comparison of calculated beta- and gamma-ray doses after the Fukushima accident with data from single-grain luminescence retrospective dosimetry of quartz inclusions in a brick sample.

    PubMed

    Endo, Satoru; Fujii, Keisuke; Kajimoto, Tsuyoshi; Tanaka, Kenichi; Stepanenko, Valeriy; Kolyzhenkov, Timofey; Petukhov, Aleksey; Akhmedova, Umukusum; Bogacheva, Viktoriia

    2018-05-01

To estimate the beta- and gamma-ray doses in a brick sample taken from Odaka, Minami-Soma City, Fukushima Prefecture, Japan, a Monte Carlo calculation was performed with the Particle and Heavy Ion Transport code System (PHITS). The calculated results were compared with data obtained by single-grain retrospective luminescence dosimetry of quartz inclusions in the brick sample. The calculated result agreed well with the measured data. The dose increase measured at the brick surface was explained by the beta-ray contribution, and the slight slope in the dose profile deeper in the brick was due to the gamma-ray contribution. The skin dose was estimated from the calculated result as 164 mGy over 3 years at the sampling site.

  6. Shielding NSLS-II light source: Importance of geometry for calculating radiation levels from beam losses

    DOE PAGES

    Kramer, S. L.; Ghosh, V. J.; Breitfeller, M.; ...

    2016-08-10

Third-generation high-brightness light sources are designed to have low-emittance, high-current beams, which contribute to higher beam-loss rates that will be compensated by Top-Off injection. Shielding against these higher loss rates will be critical to protect the projected higher occupancy factors for the users. Top-Off injection requires a full-energy injector, which demands greater consideration of the potential abnormal beam mis-steering and localized losses that could occur. The high-energy electron injection beam produces a significantly higher neutron dose component on the experimental floor than lower-energy beam injection and ramped operations. Minimizing this dose requires adequate knowledge of where the mis-steered beam can occur and sufficient EM shielding close to the loss point, in order to attenuate the energy of the particles in the EM shower below the neutron production threshold (<10 MeV); this spreads the incident energy over the bulk shield walls and thereby reduces the dose penetrating them. Designing supplemental shielding near the loss point using the analytic shielding model is shown to be inadequate because of its lack of geometry specification for the EM shower process. Predicting the dose rates outside the tunnel requires a detailed description of the geometry and materials that the beam losses will encounter inside the tunnel. Modern radiation-shielding Monte Carlo codes, like FLUKA, can handle this geometric description of the radiation transport process in sufficient detail, allowing accurate predictions of the expected dose rates and the ability to reveal weaknesses in the design before a high-radiation incident occurs. The effort required to adequately define the accelerator geometry for these codes has been greatly reduced by the FLAIR graphical interface to FLUKA, which made the shielding process for NSLS-II accurate and reliable.
The principles used to provide supplemental shielding to the NSLS-II accelerators and the lessons learned from this process are presented.

  7. 42 CFR 137.328 - Must a construction project proposal incorporate provisions of Federal construction guidelines...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ..., the Self-Governance Tribe and the Secretary must agree upon and specify appropriate building codes and...-Governance Tribe in the preparation of its construction project proposal. If Tribal construction codes and standards (including national, regional, State, or Tribal building codes or construction industry standards...

  8. 42 CFR 137.328 - Must a construction project proposal incorporate provisions of Federal construction guidelines...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., the Self-Governance Tribe and the Secretary must agree upon and specify appropriate building codes and...-Governance Tribe in the preparation of its construction project proposal. If Tribal construction codes and standards (including national, regional, State, or Tribal building codes or construction industry standards...

  9. 42 CFR 137.328 - Must a construction project proposal incorporate provisions of Federal construction guidelines...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ..., the Self-Governance Tribe and the Secretary must agree upon and specify appropriate building codes and...-Governance Tribe in the preparation of its construction project proposal. If Tribal construction codes and standards (including national, regional, State, or Tribal building codes or construction industry standards...

  10. 42 CFR 137.328 - Must a construction project proposal incorporate provisions of Federal construction guidelines...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., the Self-Governance Tribe and the Secretary must agree upon and specify appropriate building codes and...-Governance Tribe in the preparation of its construction project proposal. If Tribal construction codes and standards (including national, regional, State, or Tribal building codes or construction industry standards...

  11. ARC-2001-ACD01-0018

    NASA Image and Video Library

    2001-02-16

New Center Network Deployment ribbon cutting, from left to right: Maryland Edwards, Code JT upgrade project deputy task manager; Ed Murphy, Foundry Networks systems engineer; Bohdan Cmaylo, Code JT upgrade project task manager; Scott Santiago, Division Chief, Code JT; Greg Miller, Raytheon network engineer; and Frank Daras, Raytheon network engineering manager.

  12. 42 CFR 137.328 - Must a construction project proposal incorporate provisions of Federal construction guidelines...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., the Self-Governance Tribe and the Secretary must agree upon and specify appropriate building codes and...-Governance Tribe in the preparation of its construction project proposal. If Tribal construction codes and standards (including national, regional, State, or Tribal building codes or construction industry standards...

  13. Comparison of codes assessing galactic cosmic radiation exposure of aircraft crew.

    PubMed

    Bottollier-Depois, J F; Beck, P; Bennett, B; Bennett, L; Bütikofer, R; Clairand, I; Desorgher, L; Dyer, C; Felsberger, E; Flückiger, E; Hands, A; Kindl, P; Latocha, M; Lewis, B; Leuthold, G; Maczka, T; Mares, V; McCall, M J; O'Brien, K; Rollet, S; Rühm, W; Wissmann, F

    2009-10-01

The assessment of exposure to cosmic radiation onboard aircraft is one of the preoccupations of bodies responsible for radiation protection. The cosmic particle flux is significantly higher onboard aircraft than at ground level, and its intensity depends on the solar activity. The dose is usually estimated using codes validated against experimental data. In this paper, a comparison of various codes, some of which are used routinely, is presented for assessing the dose received by aircraft crew from galactic cosmic radiation. Results are provided for periods close to solar maximum and minimum and for selected flights covering major commercial routes in the world. The overall agreement between the codes, particularly those routinely used for aircraft crew dosimetry, was better than ±20% from the median in all but two cases. The agreement among the codes is considered fully satisfactory for radiation protection purposes.
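The ±20%-from-median criterion used in the comparison is straightforward to reproduce; a minimal sketch with invented per-code route doses (the values below are not from the paper):

```python
import statistics

def within_of_median(values, tolerance=0.20):
    """Flag which code results lie within +/-tolerance of the median result."""
    med = statistics.median(values)
    return [abs(v - med) / med <= tolerance for v in values]

# Hypothetical route doses (uSv) reported by six codes for one flight.
doses = [42.0, 45.5, 39.8, 44.1, 55.0, 43.2]
flags = within_of_median(doses)   # the 55.0 outlier fails the 20% criterion
```

Using the median rather than the mean keeps a single outlying code from shifting the reference value that all codes are judged against.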

  14. Hanford Environmental Dose Reconstruction Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cannon, S.D.; Finch, S.M.

    1992-10-01

The objective of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation doses that individuals and populations could have received from nuclear operations at Hanford since 1944. The independent Technical Steering Panel (TSP) provides technical direction. The project is divided into technical tasks that correspond to the path radionuclides followed from release to impact on humans (dose estimates): Source Terms; Environmental Transport; Environmental Monitoring Data; Demography, Food Consumption, and Agriculture; and Environmental Pathways and Dose Estimates.

  15. Monte Carlo dosimetric characterization of the Flexisource Co-60 high-dose-rate brachytherapy source using PENELOPE.

    PubMed

    Almansa, Julio F; Guerrero, Rafael; Torres, Javier; Lallena, Antonio M

Co-60 sources have been commercialized as an alternative to Ir-192 sources for high-dose-rate (HDR) brachytherapy. One of them is the Flexisource Co-60 HDR source manufactured by Elekta. The only available dosimetric characterization of this source is that of Vijande et al. [J Contemp Brachytherapy 2012; 4:34-44], whose results were not included in the AAPM/ESTRO consensus document. In that work, the dosimetric quantities were calculated as averages of the results obtained with the Geant4 and PENELOPE Monte Carlo (MC) codes, though for other sources significant differences have been reported between the values obtained with these two codes. The aim of this work is to perform the dosimetric characterization of the Flexisource Co-60 HDR source using PENELOPE. The MC simulation code PENELOPE (v. 2014) has been used. Following the recommendations of the AAPM/ESTRO report, the radial dose function, the anisotropy function, the air-kerma strength, the dose rate constant, and the absorbed dose rate in water have been calculated. Our results exceed those of Vijande et al.; in particular, the absorbed dose rate constant is ∼0.85% larger, and similar differences are found in the other dosimetric quantities. The effect of the electrons emitted in the decay of Co-60, usually neglected in this kind of simulation, is significant up to distances of 0.25 cm from the source. The systematic and significant differences we have found between the PENELOPE results and the average values of Vijande et al. indicate that dosimetric characterizations carried out with the various MC codes should be reported independently.
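The quantities listed (air-kerma strength S_K, dose-rate constant Λ, radial dose function g(r), anisotropy) combine in the AAPM TG-43 dose-rate formalism. A simplified one-dimensional point-source sketch, with placeholder values (the g(r) below is invented for illustration, not measured Co-60 data):

```python
import math

def tg43_dose_rate_point(s_k, dose_rate_const, r_cm, g_of_r, phi_an=1.0):
    """Simplified 1-D TG-43 dose rate (cGy/h) at distance r from a point
    source: D = S_K * Lambda * (r0/r)^2 * g(r) * phi_an, with r0 = 1 cm."""
    geometry = (1.0 / r_cm) ** 2   # point-source inverse-square geometry factor
    return s_k * dose_rate_const * geometry * g_of_r(r_cm) * phi_an

# Placeholder radial dose function (NOT measured Co-60 data): mild attenuation.
g = lambda r: math.exp(-0.005 * (r - 1.0))

rate_1cm = tg43_dose_rate_point(s_k=40000.0, dose_rate_const=1.09, r_cm=1.0, g_of_r=g)
rate_5cm = tg43_dose_rate_point(s_k=40000.0, dose_rate_const=1.09, r_cm=5.0, g_of_r=g)
```

At the reference point (r = 1 cm) the dose rate is simply S_K·Λ, which is why a ∼0.85% shift in the dose-rate constant propagates directly into the absorbed dose rate in water.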

  16. Estimation of dose delivered to accelerator devices from stripping of 18.5 MeV/n 238U ions using the FLUKA code

    NASA Astrophysics Data System (ADS)

    Oranj, Leila Mokhtari; Lee, Hee-Seock; Leitner, Mario Santana

    2017-12-01

In Korea, a heavy-ion accelerator facility (RAON) has been designed for the production of rare isotopes. The 90° bending section of this accelerator includes a 1.3-μm carbon stripper followed by two dipole magnets and other devices. The incident beam consists of 18.5 MeV/n 238U33+,34+ ions passing through the carbon stripper at the beginning of the section. The two dipoles are tuned to transport 238U ions with the specific charge states 77+, 78+, 79+, 80+ and 81+; the other ions are deflected at the bends and cause beam losses. These beam losses are a concern for the devices of the transport/beam line. The absorbed dose in the devices and the prompt dose in the tunnel were calculated using the FLUKA code in order to estimate the radiation damage to the devices located at the 90° bending section and for radiation protection. A novel method to transport a multi-charged 238U ion beam was applied in the FLUKA code, using the charge distribution of 238U ions after the stripper obtained from the LISE++ code. The calculated results showed that the absorbed dose in the devices is influenced by the geometrical arrangement. The maximum dose was observed at the coils of the first, second, fourth and fifth quadrupoles placed after the first dipole magnet. The integrated doses for 30 years of operation with 9.5 pμA of 238U ions were about 2 MGy for those quadrupoles. In conclusion, protection of the devices, particularly the quadrupoles, would be necessary to reduce the damage. Moreover, the results showed that the prompt radiation penetrated within the first 60-120 cm of concrete.

  17. Comparisons between MCNP, EGS4 and experiment for clinical electron beams.

    PubMed

    Jeraj, R; Keall, P J; Ostwald, P M

    1999-03-01

    Understanding the limitations of Monte Carlo codes is essential in order to avoid systematic errors in simulations, and to suggest further improvement of the codes. MCNP and EGS4, Monte Carlo codes commonly used in medical physics, were compared and evaluated against electron depth dose data and experimental backscatter results obtained using clinical radiotherapy beams. Different physical models and algorithms used in the codes give significantly different depth dose curves and electron backscattering factors. The default version of MCNP calculates electron depth dose curves which are too penetrating. The MCNP results agree better with experiment if the ITS-style energy-indexing algorithm is used. EGS4 underpredicts electron backscattering for high-Z materials. The results slightly improve if optimal PRESTA-I parameters are used. MCNP simulates backscattering well even for high-Z materials. To conclude the comparison, a timing study was performed. EGS4 is generally faster than MCNP and use of a large number of scoring voxels dramatically slows down the MCNP calculation. However, use of a large number of geometry voxels in MCNP only slightly affects the speed of the calculation.

  18. Use of Fluka to Create Dose Calculations

    NASA Technical Reports Server (NTRS)

    Lee, Kerry T.; Barzilla, Janet; Townsend, Lawrence; Brittingham, John

    2012-01-01

Monte Carlo codes provide an effective means of modeling three-dimensional radiation transport; however, their use is both time- and resource-intensive. The creation of a lookup table or parameterization from Monte Carlo simulation allows users to work with Monte Carlo results without replicating lengthy calculations. The FLUKA Monte Carlo transport code was used to develop lookup tables and parameterizations for data resulting from the penetration of layers of aluminum, polyethylene, and water with areal densities ranging from 0 to 100 g/cm². Heavy charged ions from Z=1 to Z=26, with energies from 0.1 to 10 GeV/nucleon, were simulated. Dose, dose equivalent, and fluence as a function of particle identity, energy, and scattering angle were examined at various depths. Calculations were compared against well-known results and against the results of other deterministic and Monte Carlo codes. Results will be presented.
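The lookup-table idea (tabulate Monte Carlo results once, then interpolate at run time) can be sketched as follows; the depth-dose values below are invented placeholders, not FLUKA output:

```python
import bisect

# Hypothetical lookup table: relative dose behind an aluminum slab as a
# function of areal density (g/cm²), as might be tabulated from FLUKA runs.
DEPTHS = [0.0, 10.0, 25.0, 50.0, 100.0]
DOSES  = [1.00, 0.82, 0.61, 0.38, 0.15]

def dose_at(depth):
    """Linear interpolation into the Monte Carlo lookup table."""
    if not DEPTHS[0] <= depth <= DEPTHS[-1]:
        raise ValueError("areal density outside tabulated range")
    i = min(bisect.bisect_right(DEPTHS, depth) - 1, len(DEPTHS) - 2)
    t = (depth - DEPTHS[i]) / (DEPTHS[i + 1] - DEPTHS[i])
    return DOSES[i] + t * (DOSES[i + 1] - DOSES[i])
```

A real table would be multi-dimensional (ion species, energy, material, angle), but the lookup-then-interpolate pattern is the same: one expensive transport campaign, then cheap evaluations.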

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faillace, E.R.; Cheng, J.J.; Yu, C.

A series of benchmarking runs was conducted so that results obtained with the RESRAD code could be compared against those obtained with six pathway analysis models used to determine the radiation dose to an individual living on a radiologically contaminated site. The RESRAD computer code was benchmarked against five other computer codes - GENII-S, GENII, DECOM, PRESTO-EPA-CPG, and PATHRAE-EPA - and the uncodified methodology presented in the NUREG/CR-5512 report. Estimated doses for the external gamma pathway; the dust inhalation pathway; and the soil, food, and water ingestion pathways were calculated for each methodology by matching, to the extent possible, input parameters such as occupancy, shielding, and consumption factors.

  20. Coding for reliable satellite communications

    NASA Technical Reports Server (NTRS)

    Gaarder, N. T.; Lin, S.

    1986-01-01

This research project was set up to study various kinds of coding techniques for error control in satellite and space communications for NASA Goddard Space Flight Center. During the project period, researchers investigated the following areas: (1) decoding of Reed-Solomon codes in terms of dual basis; (2) concatenated and cascaded error control coding schemes for satellite and space communications; (3) use of hybrid coding schemes (error correction and detection incorporated with retransmission) to improve system reliability and throughput in satellite communications; (4) good codes for simultaneous error correction and error detection; and (5) error control techniques for ring and star networks.
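Item (3), hybrid error control, combines error detection with retransmission (ARQ). A toy sketch using a single even-parity check bit (not one of the codes studied in the project) illustrates the detect-and-retransmit loop:

```python
import random

def add_parity(bits):
    """Append an even-parity bit (toy error-detection code)."""
    return bits + [sum(bits) % 2]

def parity_ok(codeword):
    """A parity check detects any odd number of bit flips (even counts escape)."""
    return sum(codeword) % 2 == 0

def noisy_channel(codeword, flip_prob, rng):
    """Flip each bit independently with probability flip_prob."""
    return [b ^ (rng.random() < flip_prob) for b in codeword]

def send_with_arq(bits, flip_prob=0.2, max_tries=100, seed=1):
    """Retransmit until the parity check passes (detection + ARQ)."""
    rng = random.Random(seed)
    codeword = add_parity(bits)
    for attempt in range(1, max_tries + 1):
        received = noisy_channel(codeword, flip_prob, rng)
        if parity_ok(received):
            return received[:-1], attempt
    raise RuntimeError("no valid frame received")

data, tries = send_with_arq([1, 0, 1, 1])
print(data, tries)
```

Real hybrid schemes use far stronger detection codes (e.g. CRCs) and add forward error correction so most frames need no retransmission; the loop structure is the same.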

  1. Knowledge-based iterative model reconstruction: comparative image quality and radiation dose with a pediatric computed tomography phantom.

    PubMed

    Ryu, Young Jin; Choi, Young Hun; Cheon, Jung-Eun; Ha, Seongmin; Kim, Woo Sun; Kim, In-One

    2016-03-01

CT of pediatric phantoms can provide useful guidance for the optimization of knowledge-based iterative reconstruction CT. To compare radiation dose and image quality of CT images obtained at different radiation doses and reconstructed with knowledge-based iterative reconstruction, hybrid iterative reconstruction and filtered back-projection. We scanned a 5-year-old anthropomorphic phantom at seven levels of radiation. We then reconstructed CT data with knowledge-based iterative reconstruction (iterative model reconstruction [IMR] levels 1, 2 and 3; Philips Healthcare, Andover, MA), hybrid iterative reconstruction (iDose(4), levels 3 and 7; Philips Healthcare, Andover, MA) and filtered back-projection. The noise, signal-to-noise ratio and contrast-to-noise ratio were calculated. We evaluated low-contrast resolution and detectability using low-contrast targets, and subjective and objective spatial resolution using the line pairs and wire. With radiation at 100 kVp and 100 mAs (3.64 mSv), the relative doses ranged from 5% (0.19 mSv) to 150% (5.46 mSv). Lower noise and higher signal-to-noise ratio, contrast-to-noise ratio and objective spatial resolution were generally achieved in ascending order of filtered back-projection, iDose(4) levels 3 and 7, and IMR levels 1, 2 and 3, at all radiation dose levels. Compared with filtered back-projection at 100% dose, similar noise levels were obtained on IMR level 2 images at 24% dose and iDose(4) level 3 images at 50% dose, respectively. Regarding low-contrast resolution, low-contrast detectability and objective spatial resolution, IMR level 2 images at 24% dose showed image quality comparable to filtered back-projection at 100% dose. Subjective spatial resolution was not greatly affected by the reconstruction algorithm. Reduced-dose IMR obtained at 0.92 mSv (24%) showed image quality similar to routine-dose filtered back-projection obtained at 3.64 mSv (100%) and to half-dose iDose(4) obtained at 1.81 mSv.
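The noise, SNR and CNR metrics used in this comparison are simple ROI statistics. A minimal sketch, with made-up HU values rather than the study's data:

```python
import numpy as np

# Illustrative ROI pixel statistics (HU) from a low-contrast insert and the
# adjacent background; the numbers are invented for demonstration.
roi_target = np.array([52.0, 48.0, 51.0, 49.0])
roi_background = np.array([20.0, 22.0, 18.0, 20.0])

noise = roi_background.std(ddof=1)                         # image noise (SD of background)
snr = roi_target.mean() / noise                            # signal-to-noise ratio
cnr = (roi_target.mean() - roi_background.mean()) / noise  # contrast-to-noise ratio

print(round(noise, 3), round(snr, 2), round(cnr, 2))
```

In practice the ROIs are drawn at matched locations on images from each reconstruction algorithm and dose level, and the metrics are compared across the resulting series.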

  2. Hanford Environmental Dose Reconstruction Project. Monthly report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cannon, S.D.; Finch, S.M.

    1992-10-01

The objective of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation doses that individuals and populations could have received from nuclear operations at Hanford since 1944. The independent Technical Steering Panel (TSP) provides technical direction. The project is divided into the following technical tasks, which correspond to the path radionuclides followed from release to impact on humans (dose estimates): Source Terms; Environmental Transport; Environmental Monitoring Data; Demography, Food Consumption, and Agriculture; and Environmental Pathways and Dose Estimates.

  3. Integrating Bar-Code Medication Administration Competencies in the Curriculum: Implications for Nursing Education and Interprofessional Collaboration.

    PubMed

    Angel, Vini M; Friedman, Marvin H; Friedman, Andrea L

    This article describes an innovative project involving the integration of bar-code medication administration technology competencies in the nursing curriculum through interprofessional collaboration among nursing, pharmacy, and computer science disciplines. A description of the bar-code medication administration technology project and lessons learned are presented.

  4. Accuracy Evaluation of Oncentra™ TPS in HDR Brachytherapy of Nasopharynx Cancer Using EGSnrc Monte Carlo Code.

    PubMed

    Hadad, K; Zohrevand, M; Faghihi, R; Sedighi Pashaki, A

    2015-03-01

HDR brachytherapy is one of the commonest methods of nasopharyngeal cancer treatment. In this method, depending on how advanced the tumor is, a dose of 2 to 6 Gy is prescribed as intracavitary brachytherapy. Due to the high dose rate and tumor location, accuracy evaluation of the treatment planning system (TPS) is particularly important. Common methods used in TPS dosimetry are based on computations in a homogeneous phantom. Heterogeneous phantoms, especially patient-specific voxel phantoms, can increase dosimetric accuracy. In this study, using CT images taken from a patient and ctcreate, which is a part of the DOSXYZnrc computational code, a patient-specific phantom was made. The dose distribution was plotted by DOSXYZnrc and compared with that of the TPS. Also, by extracting the absorbed dose of the voxels in the treatment volume, dose-volume histograms (DVHs) were plotted and compared with the Oncentra™ TPS DVHs. The results from the calculations were compared with data from the Oncentra™ treatment planning system, and it was observed that the TPS calculation predicts a lower dose in areas near the source and a higher dose in areas far from the source, relative to the MC code. Absorbed dose values in the voxels also showed that the D90 value reported by the TPS is 40% higher than that of the Monte Carlo method. Today, most treatment planning systems use the TG-43 protocol. This protocol may result in errors such as neglecting tissue heterogeneity, scattered radiation, and applicator attenuation. Due to these errors, the AAPM emphasized departing from the TG-43 protocol and approaching the new brachytherapy protocol TG-186, in which a patient-specific phantom is used and heterogeneities are accounted for in the dosimetry.

  5. Accuracy Evaluation of Oncentra™ TPS in HDR Brachytherapy of Nasopharynx Cancer Using EGSnrc Monte Carlo Code

    PubMed Central

    Hadad, K.; Zohrevand, M.; Faghihi, R.; Sedighi Pashaki, A.

    2015-01-01

Background HDR brachytherapy is one of the commonest methods of nasopharyngeal cancer treatment. In this method, depending on how advanced the tumor is, a dose of 2 to 6 Gy is prescribed as intracavitary brachytherapy. Due to the high dose rate and tumor location, accuracy evaluation of the treatment planning system (TPS) is particularly important. Common methods used in TPS dosimetry are based on computations in a homogeneous phantom. Heterogeneous phantoms, especially patient-specific voxel phantoms, can increase dosimetric accuracy. Materials and Methods In this study, using CT images taken from a patient and ctcreate, which is a part of the DOSXYZnrc computational code, a patient-specific phantom was made. The dose distribution was plotted by DOSXYZnrc and compared with that of the TPS. Also, by extracting the absorbed dose of the voxels in the treatment volume, dose-volume histograms (DVHs) were plotted and compared with the Oncentra™ TPS DVHs. Results The results from the calculations were compared with data from the Oncentra™ treatment planning system, and it was observed that the TPS calculation predicts a lower dose in areas near the source and a higher dose in areas far from the source, relative to the MC code. Absorbed dose values in the voxels also showed that the D90 value reported by the TPS is 40% higher than that of the Monte Carlo method. Conclusion Today, most treatment planning systems use the TG-43 protocol. This protocol may result in errors such as neglecting tissue heterogeneity, scattered radiation, and applicator attenuation. Due to these errors, the AAPM emphasized departing from the TG-43 protocol and approaching the new brachytherapy protocol TG-186, in which a patient-specific phantom is used and heterogeneities are accounted for in the dosimetry. PMID:25973408
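The DVH comparison step described above can be sketched briefly: a cumulative DVH gives the fraction of the volume receiving at least each dose level, and D90 is the minimum dose covering the hottest 90% of the volume. The voxel doses below are invented for illustration; in practice they would come from a DOSXYZnrc 3ddose file or a TPS export.

```python
import numpy as np

# Hypothetical voxel doses (Gy) inside a target volume (illustrative values).
voxel_doses = np.array([1.2, 2.1, 2.5, 2.8, 3.0, 3.2, 3.5, 3.8, 4.0, 4.5])

def cumulative_dvh(doses, dose_grid):
    """Fraction of the volume receiving at least each grid dose."""
    return np.array([(doses >= d).mean() for d in dose_grid])

def d90(doses):
    """Minimum dose received by the hottest 90% of the volume."""
    return float(np.percentile(doses, 10))

grid = np.linspace(0, 5, 11)
print(cumulative_dvh(voxel_doses, grid))
print(d90(voxel_doses))
```

Comparing two plans then reduces to computing both DVH curves on the same dose grid and reading off metrics such as D90 from each.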

  6. A Coded Structured Light System Based on Primary Color Stripe Projection and Monochrome Imaging

    PubMed Central

    Barone, Sandro; Paoli, Alessandro; Razionale, Armando Viviano

    2013-01-01

    Coded Structured Light techniques represent one of the most attractive research areas within the field of optical metrology. The coding procedures are typically based on projecting either a single pattern or a temporal sequence of patterns to provide 3D surface data. In this context, multi-slit or stripe colored patterns may be used with the aim of reducing the number of projected images. However, color imaging sensors require the use of calibration procedures to address crosstalk effects between different channels and to reduce the chromatic aberrations. In this paper, a Coded Structured Light system has been developed by integrating a color stripe projector and a monochrome camera. A discrete coding method, which combines spatial and temporal information, is generated by sequentially projecting and acquiring a small set of fringe patterns. The method allows the concurrent measurement of geometrical and chromatic data by exploiting the benefits of using a monochrome camera. The proposed methodology has been validated by measuring nominal primitive geometries and free-form shapes. The experimental results have been compared with those obtained by using a time-multiplexing gray code strategy. PMID:24129018

  7. A coded structured light system based on primary color stripe projection and monochrome imaging.

    PubMed

    Barone, Sandro; Paoli, Alessandro; Razionale, Armando Viviano

    2013-10-14

    Coded Structured Light techniques represent one of the most attractive research areas within the field of optical metrology. The coding procedures are typically based on projecting either a single pattern or a temporal sequence of patterns to provide 3D surface data. In this context, multi-slit or stripe colored patterns may be used with the aim of reducing the number of projected images. However, color imaging sensors require the use of calibration procedures to address crosstalk effects between different channels and to reduce the chromatic aberrations. In this paper, a Coded Structured Light system has been developed by integrating a color stripe projector and a monochrome camera. A discrete coding method, which combines spatial and temporal information, is generated by sequentially projecting and acquiring a small set of fringe patterns. The method allows the concurrent measurement of geometrical and chromatic data by exploiting the benefits of using a monochrome camera. The proposed methodology has been validated by measuring nominal primitive geometries and free-form shapes. The experimental results have been compared with those obtained by using a time-multiplexing gray code strategy.
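The time-multiplexing Gray-code strategy used as the benchmark in this record encodes each projector column with a sequence of binary stripe patterns; adjacent columns differ in exactly one bit, which makes decoding robust at stripe boundaries. A minimal sketch of pattern generation and decoding:

```python
def gray_code_patterns(n_bits, width):
    """Stripe patterns: pattern k holds bit k (MSB first) of each column's Gray code."""
    patterns = []
    for bit in range(n_bits - 1, -1, -1):
        patterns.append([((col ^ (col >> 1)) >> bit) & 1 for col in range(width)])
    return patterns

def decode_column(bits_msb_first):
    """Recover a column index from its observed Gray-code bit sequence."""
    gray = 0
    for b in bits_msb_first:
        gray = (gray << 1) | b
    binary, mask = gray, gray >> 1  # Gray-to-binary conversion
    while mask:
        binary ^= mask
        mask >>= 1
    return binary

# Adjacent columns differ in exactly one projected pattern:
pats = gray_code_patterns(4, 16)
print(sum(pats[k][6] != pats[k][7] for k in range(4)))  # → 1
```

With n patterns, 2^n columns can be distinguished, so the number of projected images grows only logarithmically with the projector resolution.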

  8. SeisCode: A seismological software repository for discovery and collaboration

    NASA Astrophysics Data System (ADS)

    Trabant, C.; Reyes, C. G.; Clark, A.; Karstens, R.

    2012-12-01

SeisCode is a community repository for software used in seismological and related fields. The repository is intended to increase discoverability of such software and to provide a long-term home for software projects. Other places exist where seismological software may be found, but none meet the requirements necessary for an always current, easy to search, well documented, and citable resource for projects. Organizations such as IRIS, ORFEUS, and the USGS have websites with lists of available or contributed seismological software. Since the authors themselves often do not maintain these lists, the documentation often consists of a sentence or paragraph, and the available software may be outdated. Repositories such as GoogleCode and SourceForge, which are directly maintained by the authors, provide version control and issue tracking but do not provide a unified way of locating geophysical software scattered in and among countless unrelated projects. Additionally, projects are hosted at language-specific sites such as Mathworks and PyPI, in FTP directories, and in websites strewn across the Web. Search engines are only partially effective discovery tools, as the desired software is often hidden deep within the results. SeisCode provides software authors a place to present their software, codes, scripts, tutorials, and examples to the seismological community. Authors can choose their own level of involvement. At one end of the spectrum, the author might simply create a web page that points to an existing site. At the other extreme, an author may choose to leverage the many tools provided by SeisCode, such as a source code management tool with integrated issue tracking, forums, news feeds, downloads, wikis, and more. For software development projects with multiple authors, SeisCode can also be used as a central site for collaboration. 
SeisCode provides the community with an easy way to discover software, while providing authors a way to build a community around their software packages. IRIS invites the seismological community to browse and to submit projects to https://seiscode.iris.washington.edu/

  9. Gene-Auto: Automatic Software Code Generation for Real-Time Embedded Systems

    NASA Astrophysics Data System (ADS)

    Rugina, A.-E.; Thomas, D.; Olive, X.; Veran, G.

    2008-08-01

This paper gives an overview of the Gene-Auto ITEA European project, which aims at building a qualified C code generator from mathematical models under Matlab-Simulink and Scilab-Scicos. The project is driven by major European industry partners, active in the real-time embedded systems domains. The Gene-Auto code generator will significantly improve the current development processes in such domains by shortening the time to market and by guaranteeing the quality of the generated code through the use of formal methods. The first version of the Gene-Auto code generator has already been released and has gone through a validation phase on real-life case studies defined by each project partner. The validation results are taken into account in the implementation of the second version of the code generator. The partners aim at introducing the Gene-Auto results into industrial development by 2010.

  10. Application of the Scoping of Options and Analyzing Risk Model (SOAR) in the Modeling of Radionuclide Movement and Dose Calculation for a Hypothetical Deep Geological Repository for Nuclear Fuel Wastes

    NASA Astrophysics Data System (ADS)

    Lei, S.; Osborne, P.

    2016-12-01

The Scoping of Options and Analyzing Risk (SOAR) model was developed by the U.S. Nuclear Regulatory Commission staff to assist in their evaluation of potential high-level radioactive waste disposal options. It is a 1-D contaminant transport code that contains a biosphere module to calculate mass fluxes and radiation dose to humans. As part of the Canadian Nuclear Safety Commission (CNSC)'s Coordinated Assessment Program to assist with the review of proposals for deep geological repositories (DGRs) for nuclear fuel wastes, CNSC conducted a research project to find out whether SOAR can be used by CNSC staff as an independent scoping tool to assist in the review of proponents' submissions related to safety assessment for DGRs. In the research, SOAR was applied to the post-closure safety assessment for a hypothetical DGR in sedimentary rock, as described in the 5th Case Study report by the Nuclear Waste Management Organization (NWMO) of Canada (2011). The report contains, among other things, modeling of transport and releases of radionuclides at various locations within the geosphere and the radiation dose to humans over a period of one million years. One aspect covered was 1-D modeling of various scenarios and sensitivity cases with both deterministic and probabilistic approaches using SYVAC3-CC4, which stands for Systems Variability Analysis Code (generation 3, Canadian Concept generation 4), developed by Atomic Energy of Canada Limited (Kitson et al., 2000). Radionuclide fluxes and radiation doses to humans calculated using SOAR were compared with those from NWMO's modeling. Overall, the results from the two models were similar, although SOAR gave lower mass fluxes and peak dose, mainly due to differences in modeling the waste package configurations. Sensitivity analyses indicate that both models are most sensitive to the diffusion coefficient of the geological media. 
The research leads to the conclusion that SOAR is a robust, user-friendly, and flexible scoping tool that CNSC staff may use for safety assessments; however, some improvements may be needed, such as including dose contributions from pathways other than drinking water and allowing more flexibility in modeling different waste package configurations.
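To give a flavor of what a 1-D scoping calculation does, consider steady advection of a decaying radionuclide along a single flow path, for which the concentration attenuates by exp(-λL/v). This is a generic screening formula with illustrative parameters, not the actual model in SOAR or SYVAC3-CC4.

```python
import math

def attenuation_factor(path_length_m, velocity_m_per_yr, half_life_yr):
    """Downstream/upstream concentration ratio C(L)/C(0) = exp(-lambda * L / v)."""
    decay_const = math.log(2) / half_life_yr    # lambda (1/yr)
    travel_time = path_length_m / velocity_m_per_yr  # advective travel time (yr)
    return math.exp(-decay_const * travel_time)

# e.g. I-129 (half-life 1.57e7 yr) over a 500 m path at 1e-3 m/yr:
print(attenuation_factor(500.0, 1e-3, 1.57e7))
```

Long-lived mobile nuclides like I-129 survive even very long travel times nearly unattenuated, which is why they typically dominate peak dose in this kind of screening; full codes add dispersion, sorption, and a chain of daughter products.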

  11. WE-DE-201-05: Evaluation of a Windowless Extrapolation Chamber Design and Monte Carlo Based Corrections for the Calibration of Ophthalmic Applicators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, J; Culberson, W; DeWerd, L

Purpose: To test the validity of a windowless extrapolation chamber used to measure surface dose rate from planar ophthalmic applicators and to compare different Monte Carlo based codes for deriving correction factors. Methods: Dose rate measurements were performed using a windowless, planar extrapolation chamber with a {sup 90}Sr/{sup 90}Y Tracerlab RA-1 ophthalmic applicator previously calibrated at the National Institute of Standards and Technology (NIST). Capacitance measurements were performed to estimate the initial air gap width between the source face and collecting electrode. Current was measured as a function of air gap, and Bragg-Gray cavity theory was used to calculate the absorbed dose rate to water. To determine correction factors for backscatter, divergence, and attenuation from the Mylar entrance window found in the NIST extrapolation chamber, both the EGSnrc Monte Carlo user code and the Monte Carlo N-Particle Transport Code (MCNP) were utilized. Simulation results were compared with experimental current readings from the windowless extrapolation chamber as a function of air gap. Additionally, measured dose rate values were compared with the expected result from the NIST source calibration to test the validity of the windowless chamber design. Results: Better agreement was seen between EGSnrc simulated dose results and experimental current readings at very small air gaps (<100 µm) for the windowless extrapolation chamber, while MCNP results demonstrated divergence at these small gap widths. Three separate dose rate measurements were performed with the RA-1 applicator. The average observed difference from the expected result based on the NIST calibration was −1.88% with a statistical standard deviation of 0.39% (k=1). Conclusion: EGSnrc user code will be used during future work to derive correction factors for extrapolation chamber measurements. 
Additionally, experimental results suggest that an entrance window is not needed in order for an extrapolation chamber to provide accurate dose rate measurements for a planar ophthalmic applicator.
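The extrapolation-chamber analysis reduces to fitting ionization current against air-gap width and taking the zero-gap slope, which Bragg-Gray cavity theory converts to a dose rate in water. A sketch with invented readings; the electrode area and stopping-power ratio below are illustrative assumptions, not values from the study.

```python
import numpy as np

# Illustrative current-vs-gap readings (values are made up).
gap_m = np.array([0.5e-3, 1.0e-3, 1.5e-3, 2.0e-3, 2.5e-3])   # air gap (m)
current_A = np.array([10.1, 20.0, 29.8, 40.2, 50.1]) * 1e-12  # current (A)

# Bragg-Gray: the surface dose rate is proportional to the slope dI/dL
# of ionization current vs. gap width in the limit of zero gap.
slope, intercept = np.polyfit(gap_m, current_A, 1)  # slope in A per m

W_OVER_E = 33.97   # J/C, mean energy expended in air per unit charge
RHO_AIR = 1.196    # kg/m^3, air density at chamber conditions (approx.)
AREA = 1.0e-4      # m^2, collecting-electrode area (assumed)
SW_RATIO = 1.11    # water-to-air mass stopping-power ratio (illustrative)

dose_rate_water = W_OVER_E * SW_RATIO * slope / (RHO_AIR * AREA)  # Gy/s
print(dose_rate_water)
```

Correction factors for backscatter, divergence, and (for windowed chambers) entrance-window attenuation multiply this result; deriving them is exactly the Monte Carlo task the record describes.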

  12. Scientific and Technical Publishing at Goddard Space Flight Center in Fiscal Year 1994

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This publication is a compilation of scientific and technical material that was researched, written, prepared, and disseminated by the Center's scientists and engineers during FY94. It is presented in numerical order of the GSFC author's sponsoring technical directorate; i.e., Code 300 is the Office of Flight Assurance, Code 400 is the Flight Projects Directorate, Code 500 is the Mission Operations and Data Systems Directorate, Code 600 is the Space Sciences Directorate, Code 700 is the Engineering Directorate, Code 800 is the Suborbital Projects and Operations Directorate, and Code 900 is the Earth Sciences Directorate. The publication database contains publication or presentation title, author(s), document type, sponsor, and organizational code. This is the second annual compilation for the Center.

  13. Influence of standing positions and beam projections on effective dose and eye lens dose of anaesthetists in interventional procedures.

    PubMed

    Kong, Y; Struelens, L; Vanhavere, F; Vargas, C S; Schoonjans, W; Zhuo, W H

    2015-02-01

More and more anaesthetists are becoming involved in interventional radiology procedures, so it is important to know the radiation dose and to optimise protection for anaesthetists. In this study, based on Monte Carlo simulations and field measurements, both the whole-body doses and the eye lens dose of anaesthetists were studied. The results showed that the radiation exposure to anaesthetists not only depends on their workload, but also varies largely with their standing positions and beam projections during interventional procedures. The simulation results showed that the effective dose to anaesthetists may vary with standing position and beam projection by more than a factor of 10, and the eye lens dose by more than a factor of 200. In general, a position close to the bed and the left lateral (LLAT) beam projection will bring a high exposure to anaesthetists. Good correlations between the eye lens dose and the doses at the neck, chest and waist over the apron were observed from the field measurements. The results indicate that adequate arrangement of the anaesthesia device or other monitoring equipment in the fluoroscopy rooms is a useful measure to reduce the radiation exposure to anaesthetists, and anaesthetists should be aware that they will receive the highest doses under the left lateral beam projection.

  14. Calculations vs. measurements of remnant dose rates for SNS spent structures

    NASA Astrophysics Data System (ADS)

    Popova, I. I.; Gallmeier, F. X.; Trotter, S.; Dayton, M.

    2018-06-01

Residual dose rate measurements were conducted on target vessel #13 and proton beam window #5 after extraction from their service locations. These measurements were used to verify the calculation methods of radionuclide inventory assessment that are typically performed for nuclear waste characterization and transportation of these structures. Neutronics analyses for predicting residual dose rates were carried out using the transport code MCNPX and the transmutation code CINDER90. For the transport analyses, a complex and rigorous geometry model of the structures and their surroundings was applied. The neutronics analyses were carried out using the Bertini and CEM high-energy physics models for simulating particle interactions. The preliminary calculated results were analysed and compared with the measured dose rates, and overall show good agreement to within 40% on average.

  15. Calculations vs. measurements of remnant dose rates for SNS spent structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Popova, Irina I.; Gallmeier, Franz X.; Trotter, Steven M.

Residual dose rate measurements were conducted on target vessel #13 and proton beam window #5 after extraction from their service locations. These measurements were used to verify the calculation methods of radionuclide inventory assessment that are typically performed for nuclear waste characterization and transportation of these structures. Neutronics analyses for predicting residual dose rates were carried out using the transport code MCNPX and the transmutation code CINDER90. For the transport analyses, a complex and rigorous geometry model of the structures and their surroundings was applied. The neutronics analyses were carried out using the Bertini and CEM high-energy physics models for simulating particle interactions. The preliminary calculated results were analysed and compared with the measured dose rates, and overall show good agreement to within 40% on average.

  16. Characterisation of an anthropomorphic chest phantom for dose measurements in radiology beams

    NASA Astrophysics Data System (ADS)

    Henriques, L. M. S.; Cerqueira, R. A. D.; Santos, W. S.; Pereira, A. J. S.; Rodrigues, T. M. A.; Carvalho Júnior, A. B.; Maia, A. F.

    2014-02-01

The objective of this study was to characterise an anthropomorphic chest phantom for dosimetric measurements of conventional radiology beams. This phantom was developed by a previous research project at the Federal University of Sergipe for image quality control tests. As the phantom consists of tissue-equivalent material, it is possible to characterise it for dosimetric studies. For comparison, a geometric chest phantom consisting of PMMA (polymethylmethacrylate) with dimensions of 30×30×15 cm³ was used. Measurements of incident air kerma (Ki) and entrance surface dose (ESD) were performed using ionisation chambers. From the results, backscatter factors (BSFs) of the two phantoms were determined and compared with values estimated by the CALDose_X software, based on a Monte Carlo simulation. For the technical parameters evaluated in this study, the ESD and BSF values obtained experimentally showed a good similarity between the two phantoms, with minimum and maximum differences of 0.2% and 7.0%, respectively, and showed good agreement with the results published in the literature. Organ doses and effective doses for the anthropomorphic phantom were also estimated through the determination of conversion coefficients (CCs) using the Visual Monte Carlo (VMC) code. Therefore, the results of this study show that the proposed anthropomorphic thorax phantom is a good tool for dosimetry and can be used for risk evaluation of X-ray diagnostic procedures.
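The BSF determination above is a simple ratio of the two measured quantities, BSF = ESD / Ki at the beam entrance point. A sketch with invented values (the beam qualities and numbers are placeholders, not the study's measurements):

```python
# Backscatter factor from paired measurements: entrance-surface dose (ESD)
# measured on the phantom divided by incident air kerma (Ki) measured
# free-in-air at the same point. All values below are illustrative.
measurements = [
    {"beam": "80 kVp",  "ki_mGy": 1.00, "esd_mGy": 1.32},
    {"beam": "100 kVp", "ki_mGy": 1.00, "esd_mGy": 1.38},
]

for m in measurements:
    bsf = m["esd_mGy"] / m["ki_mGy"]
    print(m["beam"], round(bsf, 2))
```

Comparing such experimentally derived BSFs against Monte Carlo estimates (e.g. from CALDose_X) is exactly the validation step the record describes.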

  17. Study on radiation production in the charge stripping section of the RISP linear accelerator

    NASA Astrophysics Data System (ADS)

    Oh, Joo-Hee; Oranj, Leila Mokhtari; Lee, Hee-Seock; Ko, Seung-Kook

    2015-02-01

The linear accelerator of the Rare Isotope Science Project (RISP) accelerates 200 MeV/nucleon 238U ions in multiple charge states. Many kinds of radiation are generated while the primary beam is transported along the beam line. The stripping process using a thin carbon foil leads to complicated radiation environments at the 90-degree bending section. The charge distribution of 238U ions after the carbon charge stripper was calculated by using the LISE++ program. Estimates of the radiation environments were carried out by using the well-proven Monte Carlo codes PHITS and FLUKA. The tracks of 238U ions in various charge states were identified using the magnetic field subroutine of the PHITS code. The dose distribution caused by U beam losses along those tracks was obtained over the accelerator tunnel. A modified calculation was applied for tracking the multi-charged U beams because PHITS and FLUKA were fundamentally designed to transport fully ionized ion beams. In this study, the beam loss pattern after the stripping section was observed, and the radiation production by heavy ions was studied. Finally, the performance of the PHITS and FLUKA codes for estimating the radiation production at the stripping section was validated by applying a modified method.
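Why the charge states separate at the 90-degree bend: each state has a different magnetic rigidity, Bρ = p/(qe), so each follows a different track in the same dipole field. A sketch for 200 MeV/nucleon 238U; the specific charge states shown are chosen for illustration.

```python
import math

AMU_MEV = 931.494   # MeV, energy equivalent of one atomic mass unit
C = 2.99792458e8    # m/s, speed of light
A = 238             # mass number of 238U
E_K_PER_U = 200.0   # kinetic energy per nucleon, MeV

def rigidity_tm(charge_state):
    """Magnetic rigidity B*rho in T*m for a given ion charge state."""
    e_total = E_K_PER_U + AMU_MEV                  # total energy per nucleon (MeV)
    pc_per_u = math.sqrt(e_total**2 - AMU_MEV**2)  # relativistic p*c per nucleon (MeV)
    pc_total_ev = pc_per_u * A * 1e6               # whole-ion p*c in eV
    return pc_total_ev / (charge_state * C)        # B*rho = p / (q*e), in T*m

for q in (76, 79, 82):  # a few charge states populated after stripping
    print(q, round(rigidity_tm(q), 3))
```

The few-percent rigidity spread between neighboring charge states is what spatially separates the beams in the bend and drives the localized loss pattern studied in the record.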

  18. Latent uncertainties of the precalculated track Monte Carlo method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Renaud, Marc-André; Seuntjens, Jan; Roberge, David

Purpose: While significant progress has been made in speeding up Monte Carlo (MC) dose calculation methods, they remain too time-consuming for the purpose of inverse planning. To achieve clinically usable calculation speeds, a precalculated Monte Carlo (PMC) algorithm for proton and electron transport was developed to run on graphics processing units (GPUs). The algorithm utilizes pregenerated particle track data from conventional MC codes for different materials such as water, bone, and lung to produce dose distributions in voxelized phantoms. While PMC methods have been described in the past, an explicit quantification of the latent uncertainty arising from the limited number of unique tracks in the pregenerated track bank is missing from the literature. With a proper uncertainty analysis, an optimal number of tracks in the pregenerated track bank can be selected for a desired dose calculation uncertainty. Methods: Particle tracks were pregenerated for electrons and protons using EGSnrc and GEANT4 and saved in a database. The PMC algorithm for track selection, rotation, and transport was implemented on the Compute Unified Device Architecture (CUDA) 4.0 programming framework. PMC dose distributions were calculated in a variety of media and compared to benchmark dose distributions simulated from the corresponding general-purpose MC codes in the same conditions. A latent uncertainty metric was defined and analysis was performed by varying the pregenerated track bank size and the number of simulated primary particle histories and comparing dose values to a “ground truth” benchmark dose distribution calculated to 0.04% average uncertainty in voxels with dose greater than 20% of D{sub max}. Efficiency metrics were calculated against benchmark MC codes on a single CPU core with no variance reduction. 
Results: Dose distributions generated using PMC and benchmark MC codes were compared and found to be within 2% of each other in voxels with dose values greater than 20% of the maximum dose. In proton calculations, a small (≤1 mm) distance-to-agreement error was observed at the Bragg peak. Latent uncertainty was characterized for electrons and found to follow a Poisson distribution with the number of unique tracks per energy. A track bank of 12 energies and 60000 unique tracks per pregenerated energy in water had a size of 2.4 GB and achieved a latent uncertainty of approximately 1% at an optimal efficiency gain over DOSXYZnrc. Larger track banks produced a lower latent uncertainty at the cost of increased memory consumption. Using an NVIDIA GTX 590, efficiency analysis showed an 807× efficiency increase over DOSXYZnrc for 16 MeV electrons in water and 508× for 16 MeV electrons in bone. Conclusions: The PMC method can calculate dose distributions for electrons and protons to a statistical uncertainty of 1% with a large efficiency gain over conventional MC codes. Before performing clinical dose calculations, models to calculate dose contributions from uncharged particles must be implemented. Following the successful implementation of these models, the PMC method will be evaluated as a candidate for inverse planning of modulated electron radiation therapy and scanned proton beams.

  19. Latent uncertainties of the precalculated track Monte Carlo method.

    PubMed

    Renaud, Marc-André; Roberge, David; Seuntjens, Jan

    2015-01-01

While significant progress has been made in speeding up Monte Carlo (MC) dose calculation methods, they remain too time-consuming for the purpose of inverse planning. To achieve clinically usable calculation speeds, a precalculated Monte Carlo (PMC) algorithm for proton and electron transport was developed to run on graphics processing units (GPUs). The algorithm utilizes pregenerated particle track data from conventional MC codes for different materials such as water, bone, and lung to produce dose distributions in voxelized phantoms. While PMC methods have been described in the past, an explicit quantification of the latent uncertainty arising from the limited number of unique tracks in the pregenerated track bank is missing from the literature. With a proper uncertainty analysis, an optimal number of tracks in the pregenerated track bank can be selected for a desired dose calculation uncertainty. Particle tracks were pregenerated for electrons and protons using EGSnrc and geant4 and saved in a database. The PMC algorithm for track selection, rotation, and transport was implemented on the Compute Unified Device Architecture (cuda) 4.0 programming framework. PMC dose distributions were calculated in a variety of media and compared to benchmark dose distributions simulated from the corresponding general-purpose MC codes in the same conditions. A latent uncertainty metric was defined and analysis was performed by varying the pregenerated track bank size and the number of simulated primary particle histories and comparing dose values to a "ground truth" benchmark dose distribution calculated to 0.04% average uncertainty in voxels with dose greater than 20% of Dmax. Efficiency metrics were calculated against benchmark MC codes on a single CPU core with no variance reduction. Dose distributions generated using PMC and benchmark MC codes were compared and found to be within 2% of each other in voxels with dose values greater than 20% of the maximum dose. 
In proton calculations, a small (≤ 1 mm) distance-to-agreement error was observed at the Bragg peak. Latent uncertainty was characterized for electrons and found to follow a Poisson distribution with the number of unique tracks per energy. A track bank of 12 energies and 60,000 unique tracks per pregenerated energy in water had a size of 2.4 GB and achieved a latent uncertainty of approximately 1% at an optimal efficiency gain over DOSXYZnrc. Larger track banks produced a lower latent uncertainty at the cost of increased memory consumption. Using an NVIDIA GTX 590, efficiency analysis showed an 807× efficiency increase over DOSXYZnrc for 16 MeV electrons in water and 508× for 16 MeV electrons in bone. The PMC method can calculate dose distributions for electrons and protons to a statistical uncertainty of 1% with a large efficiency gain over conventional MC codes. Before performing clinical dose calculations, models to calculate dose contributions from uncharged particles must be implemented. Following the successful implementation of these models, the PMC method will be evaluated as a candidate for inverse planning of modulated electron radiation therapy and scanned proton beams.
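    The Poisson-like behaviour reported above implies the familiar 1/√N scaling of statistical uncertainty with the number of unique tracks. A minimal sketch of how a track-bank size could be chosen for a target latent uncertainty, anchored at the abstract's ~1% at 60,000 tracks per energy (function names are illustrative, not the authors' code):

```python
import math

def latent_uncertainty(n_tracks: int, sigma_ref: float = 1.0, n_ref: int = 60000) -> float:
    """Estimate latent uncertainty (%) for a bank of n_tracks unique tracks,
    assuming Poisson-like 1/sqrt(N) scaling anchored at a reference point
    (here, roughly 1% at 60,000 tracks per energy)."""
    return sigma_ref * math.sqrt(n_ref / n_tracks)

def tracks_for_target(sigma_target: float, sigma_ref: float = 1.0, n_ref: int = 60000) -> int:
    """Invert the scaling to find the bank size needed for a target uncertainty."""
    return math.ceil(n_ref * (sigma_ref / sigma_target) ** 2)
```

    Halving the uncertainty thus costs a fourfold larger (and fourfold more memory-hungry) track bank, which is the memory/accuracy trade-off the abstract describes.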

  20. A measurement-based generalized source model for Monte Carlo dose simulations of CT scans

    PubMed Central

    Ming, Xin; Feng, Yuanming; Liu, Ransheng; Yang, Chengwen; Zhou, Li; Zhai, Hezheng; Deng, Jun

    2018-01-01

    The goal of this study is to develop a generalized source model (GSM) for accurate Monte Carlo dose simulations of CT scans based solely on measurement data, without a priori knowledge of scanner specifications. The proposed generalized source model consists of an extended circular source located at the x-ray target level, with its energy spectrum, source distribution and fluence distribution derived from a set of measurement data conveniently available in the clinic. Specifically, the central axis percent depth dose (PDD) curves measured in water and the cone output factors measured in air were used to derive the energy spectrum and the source distribution, respectively, with a Levenberg-Marquardt algorithm. The in-air film measurement of fan-beam dose profiles at fixed gantry was back-projected to generate the fluence distribution of the source model. A benchmarked Monte Carlo user code was used to simulate the dose distributions in water with the developed source model as beam input. The feasibility and accuracy of the proposed source model were tested on GE LightSpeed and Philips Brilliance Big Bore multi-detector CT (MDCT) scanners available in our clinic. In general, the Monte Carlo simulations of the PDDs in water and of the dose profiles along the lateral and longitudinal directions agreed with the measurements within 4%/1 mm for both CT scanners. The absolute dose comparison using two CTDI phantoms (16 cm and 32 cm in diameter) indicated a better than 5% agreement between the Monte Carlo-simulated and the ion chamber-measured doses at a variety of locations for the two scanners. Overall, this study demonstrated that a generalized source model can be constructed based only on a set of measurement data and used for accurate Monte Carlo dose simulations of patients’ CT scans, which would facilitate patient-specific CT organ dose estimation and cancer risk management in diagnostic and therapeutic radiology. PMID:28079526
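    The spectrum-unfolding step described above, recovering spectral weights from a measured PDD and a basis of mono-energetic PDDs, can be sketched with a simple damped, Levenberg-Marquardt style iteration. The function and the basis construction are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def fit_spectrum_lm(pdd_meas, pdd_mono, n_iter=50, lam=1e-3):
    """Fit non-negative spectral weights w so that pdd_mono @ w matches the
    measured PDD, using a damped (LM-style) Gauss-Newton update.
    pdd_mono: (n_depths, n_energies) matrix of mono-energetic basis PDDs."""
    J = np.asarray(pdd_mono, float)
    y = np.asarray(pdd_meas, float)
    w = np.full(J.shape[1], 1.0 / J.shape[1])       # flat initial spectrum
    for _ in range(n_iter):
        r = y - J @ w                               # residual at current weights
        H = J.T @ J + lam * np.eye(J.shape[1])      # damped normal-equation matrix
        w = np.clip(w + np.linalg.solve(H, J.T @ r), 0.0, None)  # keep w >= 0
    return w / w.sum()                              # normalize to a spectrum
```

    Because the model is linear in the weights, the damped iteration converges to the non-negative least-squares spectrum in a few steps.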

  1. A measurement-based generalized source model for Monte Carlo dose simulations of CT scans

    NASA Astrophysics Data System (ADS)

    Ming, Xin; Feng, Yuanming; Liu, Ransheng; Yang, Chengwen; Zhou, Li; Zhai, Hezheng; Deng, Jun

    2017-03-01

    The goal of this study is to develop a generalized source model for accurate Monte Carlo dose simulations of CT scans based solely on measurement data, without a priori knowledge of scanner specifications. The proposed generalized source model consists of an extended circular source located at the x-ray target level, with its energy spectrum, source distribution and fluence distribution derived from a set of measurement data conveniently available in the clinic. Specifically, the central axis percent depth dose (PDD) curves measured in water and the cone output factors measured in air were used to derive the energy spectrum and the source distribution, respectively, with a Levenberg-Marquardt algorithm. The in-air film measurement of fan-beam dose profiles at fixed gantry was back-projected to generate the fluence distribution of the source model. A benchmarked Monte Carlo user code was used to simulate the dose distributions in water with the developed source model as beam input. The feasibility and accuracy of the proposed source model were tested on GE LightSpeed and Philips Brilliance Big Bore multi-detector CT (MDCT) scanners available in our clinic. In general, the Monte Carlo simulations of the PDDs in water and of the dose profiles along the lateral and longitudinal directions agreed with the measurements within 4%/1 mm for both CT scanners. The absolute dose comparison using two CTDI phantoms (16 cm and 32 cm in diameter) indicated a better than 5% agreement between the Monte Carlo-simulated and the ion chamber-measured doses at a variety of locations for the two scanners. Overall, this study demonstrated that a generalized source model can be constructed based only on a set of measurement data and used for accurate Monte Carlo dose simulations of patients’ CT scans, which would facilitate patient-specific CT organ dose estimation and cancer risk management in diagnostic and therapeutic radiology.

  2. SU-E-I-36: A KWIC and Dirty Look at Dose Savings and Perfusion Metrics in Simulated CT Neuro Perfusion Exams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffman, J; Martin, T; Young, S

    Purpose: CT neuro perfusion scans are among the highest-dose CT exams. Methods to reduce dose include decreasing the number of projections acquired per gantry rotation; however, conventional reconstruction of such scans leads to sampling artifacts. In this study we investigated a projection view-sharing reconstruction algorithm used in dynamic MRI – “K-space Weighted Image Contrast” (KWIC) – applied to simulated perfusion exams, and evaluated dose savings and impacts on perfusion metrics. Methods: A FORBILD head phantom containing simulated time-varying objects was developed and a set of parallel-beam CT projection data was created. The simulated scans were 60 seconds long, with 1152 projections per turn and a rotation time of one second. No noise was simulated. Objects of 5 mm, 10 mm, and 50 mm were modeled in the brain. A baseline, “full dose” simulation used all projections, and reduced-dose cases were simulated by downsampling the number of projections per turn from 1152 to 576 (50% dose), 288 (25% dose), and 144 (12.5% dose). KWIC was further evaluated at 72 projections per rotation (6.25%). One image per second was reconstructed using filtered backprojection (FBP) and KWIC. KWIC reconstructions utilized view cores of 36, 72, 144, and 288 views and 16, 8, 4, and 2 subapertures, respectively. From the reconstructed images, time-to-peak (TTP), cerebral blood flow (CBF) and the FWHM of the perfusion curve were calculated and compared against reference values from the full-dose FBP data. Results: TTP, CBF, and the FWHM were unaffected by dose reduction (to 12.5%) and reconstruction method; however, image quality was improved when using KWIC. Conclusion: This pilot study suggests that KWIC preserves image quality and perfusion metrics when under-sampling projections and that the unique contrast weighting of KWIC could provide substantial dose savings for perfusion CT scans. Evaluation of KWIC in clinical CT data will be performed in the near future. 
R01 EB014922, NCI Grant U01 CA181156 (Quantitative Imaging Network), and Tobacco Related Disease Research Project grant 22RT-0131.
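    The perfusion metrics compared in this study, time-to-peak and the FWHM of the perfusion (time-attenuation) curve, can be computed from a reconstructed voxel's time series. A coarse, grid-limited sketch (the helper name is illustrative):

```python
import numpy as np

def perfusion_metrics(t, c):
    """Time-to-peak and FWHM of a tissue time-attenuation curve c sampled at
    times t (seconds). FWHM is measured on the sampling grid, so it is only
    accurate to about one frame interval."""
    t = np.asarray(t, float)
    c = np.asarray(c, float)
    ipk = int(np.argmax(c))                # frame of peak enhancement
    ttp = t[ipk]                           # time-to-peak
    half = c[ipk] / 2.0
    above = np.where(c >= half)[0]         # frames at or above half maximum
    fwhm = t[above[-1]] - t[above[0]]      # grid-limited full width at half max
    return ttp, fwhm
```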

  3. Code Sharing and Collaboration: Experiences from the Scientist's Expert Assistant Project and their Relevance to the Virtual Observatory

    NASA Technical Reports Server (NTRS)

    Jones, Jeremy; Grosvenor, Sandy; Wolf, Karl; Li, Connie; Koratkar, Anuradha; Powers, Edward I. (Technical Monitor)

    2001-01-01

    In the Virtual Observatory (VO), software tools will perform the functions that have traditionally been performed by physical observatories and their instruments. These tools will not be adjuncts to VO functionality but will make up the very core of the VO. Consequently, the tradition of observatory and system independent tools serving a small user base is not valid for the VO. For the VO to succeed, we must improve software collaboration and code sharing between projects and groups. A significant goal of the Scientist's Expert Assistant (SEA) project has been promoting effective collaboration and code sharing between groups. During the past three years, the SEA project has been developing prototypes for new observation planning software tools and strategies. Initially funded by the Next Generation Space Telescope, parts of the SEA code have since been adopted by the Space Telescope Science Institute. SEA has also supplied code for SOFIA, the SIRTF planning tools, and the JSky Open Source Java library. The potential benefits of sharing code are clear. The recipient gains functionality for considerably less cost. The provider gains additional developers working with their code. If enough user groups adopt a set of common code and tools, de facto standards can emerge (as demonstrated by the success of the FITS standard). Code sharing also raises a number of challenges related to the management of the code. In this talk, we will review our experiences with SEA - both successes and failures - and offer some lessons learned that may promote further successes in collaboration and re-use.

  4. Code Sharing and Collaboration: Experiences From the Scientist's Expert Assistant Project and Their Relevance to the Virtual Observatory

    NASA Technical Reports Server (NTRS)

    Korathkar, Anuradha; Grosvenor, Sandy; Jones, Jeremy; Li, Connie; Mackey, Jennifer; Neher, Ken; Obenschain, Arthur F. (Technical Monitor)

    2001-01-01

    In the Virtual Observatory (VO), software tools will perform the functions that have traditionally been performed by physical observatories and their instruments. These tools will not be adjuncts to VO functionality but will make up the very core of the VO. Consequently, the tradition of observatory and system independent tools serving a small user base is not valid for the VO. For the VO to succeed, we must improve software collaboration and code sharing between projects and groups. A significant goal of the Scientist's Expert Assistant (SEA) project has been promoting effective collaboration and code sharing among groups. During the past three years, the SEA project has been developing prototypes for new observation planning software tools and strategies. Initially funded by the Next Generation Space Telescope, parts of the SEA code have since been adopted by the Space Telescope Science Institute. SEA has also supplied code for the SIRTF (Space Infrared Telescope Facility) planning tools, and the JSky Open Source Java library. The potential benefits of sharing code are clear. The recipient gains functionality for considerably less cost. The provider gains additional developers working with their code. If enough user groups adopt a set of common code and tools, de facto standards can emerge (as demonstrated by the success of the FITS standard). Code sharing also raises a number of challenges related to the management of the code. In this talk, we will review our experiences with SEA, both successes and failures, and offer some lessons learned that might promote further successes in collaboration and re-use.

  5. SolTrace | Concentrating Solar Power | NREL

    Science.gov Websites

    SolTrace is available as an NREL packaged distribution or can be built from source code at the SolTrace open source project website. The code uses a Monte Carlo ray-tracing methodology. With the release of the SolTrace open source project, the software has adopted an open development model.

  6. Evaluation of the accuracy of mono-energetic electron and beta-emitting isotope dose-point kernels using particle and heavy ion transport code system: PHITS.

    PubMed

    Shiiba, Takuro; Kuga, Naoya; Kuroiwa, Yasuyoshi; Sato, Tatsuhiko

    2017-10-01

    We assessed the accuracy of mono-energetic electron and beta-emitting isotope dose-point kernels (DPKs) calculated using the particle and heavy ion transport code system (PHITS) for patient-specific dosimetry in targeted radionuclide therapy (TRT) and compared our data with published data. All mono-energetic and beta-emitting isotope DPKs calculated using PHITS, both in water and in compact bone, were in good agreement with those reported in the literature using other MC codes. PHITS provided reliable mono-energetic electron and beta-emitting isotope scaled DPKs for patient-specific dosimetry. Copyright © 2017 Elsevier Ltd. All rights reserved.
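    Scaled dose-point kernels of the kind validated here are typically tabulated as a dimensionless F(x) with x = r/r90 and converted back to an absorbed dose at radius r. A sketch assuming one common scaling convention, F(x) proportional to 4πρr²·D(r)·r90, with normalization to the energy per decay omitted (the exact convention varies between publications):

```python
import numpy as np

def dose_from_scaled_dpk(r, x_grid, F, r90, rho=1.0):
    """Recover the dose D(r) from a tabulated scaled dose-point kernel F(x),
    x = r/r90, under the assumed convention F(x) = 4*pi*rho*r**2 * D(r) * r90.
    Linear interpolation is used between tabulated points."""
    x = r / r90
    Fx = np.interp(x, x_grid, F)                   # interpolate the scaled kernel
    return Fx / (4.0 * np.pi * rho * r**2 * r90)   # undo the scaling
```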

  7. Restoration of low-dose digital breast tomosynthesis

    NASA Astrophysics Data System (ADS)

    Borges, Lucas R.; Azzari, Lucio; Bakic, Predrag R.; Maidment, Andrew D. A.; Vieira, Marcelo A. C.; Foi, Alessandro

    2018-06-01

    In breast cancer screening, the radiation dose must be kept to the minimum necessary to achieve the desired diagnostic objective, thus minimizing risks associated with cancer induction. However, decreasing the radiation dose also degrades the image quality. In this work we restore digital breast tomosynthesis (DBT) projections acquired at low radiation doses with the goal of achieving a quality comparable to that obtained from current standard full-dose imaging protocols. A multiframe denoising algorithm was applied to the low-dose projections, which are filtered jointly. Furthermore, a weighted average was used to inject a varying portion of the noisy signal back into the denoised one, in order to attain a signal-to-noise ratio comparable to that of standard full-dose projections. The entire restoration framework leverages a signal-dependent noise model whose quantum gain varies with both the projection angle and the pixel position. A clinical DBT system and a 3D anthropomorphic breast phantom were used to validate the proposed method, both on DBT projections and on slices from the 3D reconstructed volume. The framework is shown to attain the standard full-dose image quality from data acquired at 50% lower radiation dose, whereas progressive loss of relevant details compromises the image quality if the dose is further decreased.
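    The SNR-matching step described above, blending a fraction of the noisy low-dose signal back into the denoised one, can be sketched as follows, assuming for simplicity that the denoised image is approximately noise-free (the paper's actual model is signal-dependent and spatially varying):

```python
import numpy as np

def inject_noise(denoised, noisy, sigma_low, sigma_full):
    """Blend w*noisy + (1-w)*denoised so the residual noise level matches a
    full-dose acquisition. If the denoised image carries negligible noise,
    the blend has noise std w*sigma_low, so w = sigma_full/sigma_low."""
    w = min(sigma_full / sigma_low, 1.0)   # never amplify noise beyond the input
    return w * noisy + (1.0 - w) * denoised
```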

  8. SOC-DS computer code provides tool for design evaluation of homogeneous two-material nuclear shield

    NASA Technical Reports Server (NTRS)

    Disney, R. K.; Ricks, L. O.

    1967-01-01

    The SOC-DS code (Shield Optimization Code - Direct Search) selects a nuclear shield material of optimum volume, weight, or cost to meet the requirements of a given radiation dose rate or energy transmission constraint. It is applicable to evaluating neutron and gamma-ray shields for all nuclear reactors.
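    A direct-search shield optimization of the SOC-DS kind can be illustrated with a toy two-material model: simple exponential attenuation, a grid search over the volume fraction of one material, and for each mix the minimum thickness meeting the dose constraint. All parameters and names here are hypothetical, not from the SOC-DS code itself:

```python
import math

def optimize_shield(D0, D_limit, muA, rhoA, muB, rhoB, steps=200):
    """Direct search over a homogeneous two-material shield. For each volume
    fraction f of material A, the mixed attenuation coefficient is
    mu = f*muA + (1-f)*muB, and the minimum thickness meeting
    D0*exp(-mu*t) <= D_limit is t = ln(D0/D_limit)/mu. Return the
    (areal weight, fraction, thickness) with the lowest areal weight."""
    best = None
    for i in range(steps + 1):
        f = i / steps
        mu = f * muA + (1 - f) * muB       # mixture attenuation coefficient
        rho = f * rhoA + (1 - f) * rhoB    # mixture density
        t = math.log(D0 / D_limit) / mu    # thinnest shield meeting the dose limit
        weight = rho * t                   # areal weight (mass per unit area)
        if best is None or weight < best[0]:
            best = (weight, f, t)
    return best
```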

  9. Development of a Space Radiation Monte-Carlo Computer Simulation Based on the FLUKA and ROOT Codes

    NASA Technical Reports Server (NTRS)

    Pinsky, L. S.; Wilson, T. L.; Ferrari, A.; Sala, Paola; Carminati, F.; Brun, R.

    2001-01-01

    The radiation environment in space is a complex problem to model. Trying to extrapolate the projections of that environment into all areas of the internal spacecraft geometry is even more daunting. With the support of our CERN colleagues, our research group in Houston is embarking on a project to develop a radiation transport tool that is tailored to the problem of taking the external radiation flux incident on any particular spacecraft and simulating the evolution of that flux through a geometrically accurate model of the spacecraft material. The output will be a prediction of the detailed nature of the resulting internal radiation environment within the spacecraft as well as its secondary albedo. Beyond doing the physics transport of the incident flux, the software tool we are developing will provide a self-contained stand-alone object-oriented analysis and visualization infrastructure. It will also include a graphical user interface and a set of input tools to facilitate the simulation of space missions in terms of nominal radiation models and mission trajectory profiles. The goal of this project is to produce a code that is considerably more accurate and user-friendly than existing Monte-Carlo-based tools for the evaluation of the space radiation environment. Furthermore, the code will be an essential complement to the currently existing analytic codes in the BRYNTRN/HZETRN family for the evaluation of radiation shielding. The code will be directly applicable to the simulation of environments in low earth orbit, on the lunar surface, on planetary surfaces (including the Earth) and in the interplanetary medium such as on a transit to Mars (and even in the interstellar medium). The software will include modules whose underlying physics base can continue to be enhanced and updated for physics content, as future data become available beyond the timeframe of the initial development now foreseen. 
This future maintenance will be available from the authors of FLUKA as part of their continuing efforts to support the users of the FLUKA code within the particle physics community. In keeping with the spirit of developing an evolving physics code, we are planning, as part of this project, to participate in the efforts to validate the core FLUKA physics in ground-based accelerator test runs. The emphasis of these test runs will be the physics of greatest interest in the simulation of the space radiation environment. Such a tool will be of great value to planners, designers and operators of future space missions, as well as for the design of the vehicles and habitats to be used on such missions. It will also be of aid to future experiments of various kinds that may be affected at some level by the ambient radiation environment, or in the analysis of hybrid experiment designs that have been discussed for space-based astronomy and astrophysics. The tool will be of value to the Life Sciences personnel involved in the prediction and measurement of radiation doses experienced by the crewmembers on such missions. In addition, the tool will be of great use to the planners of experiments to measure and evaluate the space radiation environment itself. It can likewise be useful in the analysis of safe havens, hazard mitigation plans, and NASA's call for new research in composites, and to NASA engineers modeling the radiation exposure of electronic circuits. This code will provide an important complementary check on the predictions of analytic codes such as BRYNTRN/HZETRN that are presently used for many similar applications, and which have shortcomings that are more easily overcome with Monte Carlo type simulations. Finally, it is acknowledged that there are similar efforts based around the use of the GEANT4 Monte-Carlo transport code currently under development at CERN. 
It is our intention to make our software modular and sufficiently flexible to allow the parallel use of either FLUKA or GEANT4 as the physics transport engine.

  10. Comparison of depth-dose distributions of proton therapeutic beams calculated by means of logical detectors and ionization chamber modeled in Monte Carlo codes

    NASA Astrophysics Data System (ADS)

    Pietrzak, Robert; Konefał, Adam; Sokół, Maria; Orlef, Andrzej

    2016-08-01

    The success of proton therapy depends strongly on the precision of treatment planning. Dose distribution in biological tissue may be obtained from Monte Carlo simulations using various scientific codes, making very accurate calculations possible. However, there are many factors affecting the accuracy of modeling. One of them is the structure of the scoring objects, called bins, registering the dose. In this work the influence of bin structure on the dose distributions was examined. MCNPX calculations of the Bragg curve for a 60 MeV proton beam were done in two ways: using simple logical detectors, i.e., volumes delineated in water, and using a precise model of an ionization chamber used in clinical dosimetry. The results of the simulations were verified experimentally in a water phantom with a Markus ionization chamber. The average local dose difference between the measured relative doses in the water phantom and those calculated by means of the logical detectors was 1.4% in the first 25 mm, whereas over the full depth range this difference was 1.6%, for a maximum uncertainty in the calculations of less than 2.4% and a maximum measuring error of 1%. In the case of the relative doses calculated with the ionization chamber model this average difference was somewhat greater, being 2.3% at depths up to 25 mm and 2.4% over the full range of depths, for a maximum uncertainty in the calculations of 3%. In the dose calculations the ionization chamber model does not offer any additional advantages over the logical detectors. The results provided by both models are similar and in good agreement with the measurements; however, the logical detector approach is a more time-effective method.
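    The comparison metric quoted above, the average local dose difference between measured and calculated relative depth doses, amounts to a mean of per-depth relative deviations:

```python
import numpy as np

def mean_local_difference(d_meas, d_calc):
    """Average local relative difference (%) between measured and calculated
    relative depth doses sampled at the same depths."""
    d_meas = np.asarray(d_meas, float)
    d_calc = np.asarray(d_calc, float)
    return float(np.mean(np.abs(d_calc - d_meas) / d_meas) * 100.0)
```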

  11. Comparison of different approaches of estimating effective dose from reported exposure data in 3D imaging with interventional fluoroscopy systems

    NASA Astrophysics Data System (ADS)

    Svalkvist, Angelica; Hansson, Jonny; Bâth, Magnus

    2014-03-01

    Three-dimensional (3D) imaging with interventional fluoroscopy systems is today a common examination. The examination includes acquisition of two-dimensional projection images, used to reconstruct section images of the patient. The aim of the present study was to investigate the difference in resulting effective dose obtained using different levels of complexity in calculations of effective doses from these examinations. In the study the Siemens Artis Zeego interventional fluoroscopy system (Siemens Medical Solutions, Erlangen, Germany) was used. Images of anthropomorphic chest and pelvis phantoms were acquired. The exposure values obtained were used to calculate the resulting effective doses from the examinations, using the computer software PCXMC (STUK, Helsinki, Finland). The dose calculations were performed using three different methods: 1. using individual exposure values for each projection image, 2. using the mean tube voltage and the total DAP value, evenly distributed over all projection images, and 3. using the mean tube voltage and the total DAP value, evenly distributed over a smaller selection of projection images. The results revealed that the difference in resulting effective dose between the first two methods was smaller than 5%. When only a selection of projection images was used in the dose calculations the difference increased to over 10%. Given the uncertainties associated with the effective dose concept, the results indicate that dose calculations based on average exposure values distributed over a smaller selection of projection angles can provide reasonably accurate estimations of the radiation doses from 3D imaging using interventional fluoroscopy systems.
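    The first two calculation methods compared in this study can be sketched as follows. The conversion-coefficient lookup e_coeff is a hypothetical stand-in for the PCXMC-derived effective dose per unit DAP at a given tube voltage and projection geometry:

```python
def effective_dose_per_projection(kv_list, dap_list, e_coeff):
    """Method 1: sum per-projection effective doses, using a conversion
    coefficient e_coeff(kv) in effective dose per DAP unit."""
    return sum(e_coeff(kv) * dap for kv, dap in zip(kv_list, dap_list))

def effective_dose_averaged(kv_list, dap_list, e_coeff):
    """Method 2: one conversion coefficient at the mean tube voltage,
    applied to the total DAP spread evenly over the projections."""
    mean_kv = sum(kv_list) / len(kv_list)
    return e_coeff(mean_kv) * sum(dap_list)
```

    The two methods coincide when DAP is uniform across projections and diverge as the DAP distribution becomes skewed, which is consistent with the small (<5%) differences reported here.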

  12. Neutron dose measurements of Varian and Elekta linacs by TLD600 and TLD700 dosimeters and comparison with MCNP calculations

    PubMed Central

    Nedaie, Hassan Ali; Darestani, Hoda; Banaee, Nooshin; Shagholi, Negin; Mohammadi, Kheirollah; Shahvar, Arjang; Bayat, Esmaeel

    2014-01-01

    High-energy linacs produce secondary particles such as neutrons (photoneutron production). These neutrons play an important role during treatment with high-energy photons, both in terms of radiation protection and dose escalation. In this work, neutron dose equivalents of 18 MV Varian and Elekta accelerators were measured by thermoluminescent dosimeter (TLD) 600 and TLD700 detectors and compared with Monte Carlo calculations. For neutron and photon dose discrimination, the TLDs were first calibrated separately with gamma and neutron doses. Gamma calibration was carried out in two procedures: by a standard 60Co source and by the 18 MV linac photon beam. For neutron calibration with a 241Am-Be source, irradiations were performed over several different time intervals. The Varian and Elekta linac heads and the phantom were simulated with the MCNPX code (v. 2.5). Neutron dose equivalent was calculated on the central axis, on the phantom surface and at depths of 1, 2, 3.3, 4, 5, and 6 cm. The maximum photoneutron dose equivalents calculated by the MCNPX code were 7.06 and 2.37 mSv.Gy-1 for the Varian and Elekta accelerators, respectively, in comparison with 50 and 44 mSv.Gy-1 obtained by TLDs. All the results showed more photoneutron production in the Varian accelerator than in the Elekta. According to the results, it seems that TLD600 and TLD700 pairs are not suitable dosimeters for neutron dosimetry inside the linac field due to the high photon flux, while the MCNPX code is an appropriate alternative for studying photoneutron production. PMID:24600167
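    The TLD600/TLD700 pair method relies on 7LiF (TLD700) being nearly insensitive to neutrons while 6LiF (TLD600) responds to both fields. A sketch of the standard two-detector discrimination, with hypothetical calibration factors (readings and sensitivities in arbitrary consistent units):

```python
def tld_pair_doses(m600, m700, s600_g, s600_n, s700_g):
    """Separate gamma and neutron doses from a TLD600/TLD700 pair.
    TLD700 fixes the gamma dose; the TLD600 excess over its own gamma
    response, divided by its neutron sensitivity, gives the neutron dose."""
    d_gamma = m700 / s700_g                          # gamma dose from TLD700
    d_neutron = (m600 - s600_g * d_gamma) / s600_n   # neutron dose from TLD600 excess
    return d_gamma, d_neutron
```

    Inside the treatment field the gamma term dominates both readings, so the small neutron excess is swamped by photon signal, which is consistent with the authors' conclusion that the pair method fails there.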

  13. Neutron dose measurements of Varian and Elekta linacs by TLD600 and TLD700 dosimeters and comparison with MCNP calculations.

    PubMed

    Nedaie, Hassan Ali; Darestani, Hoda; Banaee, Nooshin; Shagholi, Negin; Mohammadi, Kheirollah; Shahvar, Arjang; Bayat, Esmaeel

    2014-01-01

    High-energy linacs produce secondary particles such as neutrons (photoneutron production). These neutrons play an important role during treatment with high-energy photons, both in terms of radiation protection and dose escalation. In this work, neutron dose equivalents of 18 MV Varian and Elekta accelerators were measured by thermoluminescent dosimeter (TLD) 600 and TLD700 detectors and compared with Monte Carlo calculations. For neutron and photon dose discrimination, the TLDs were first calibrated separately with gamma and neutron doses. Gamma calibration was carried out in two procedures: by a standard 60Co source and by the 18 MV linac photon beam. For neutron calibration with a (241)Am-Be source, irradiations were performed over several different time intervals. The Varian and Elekta linac heads and the phantom were simulated with the MCNPX code (v. 2.5). Neutron dose equivalent was calculated on the central axis, on the phantom surface and at depths of 1, 2, 3.3, 4, 5, and 6 cm. The maximum photoneutron dose equivalents calculated by the MCNPX code were 7.06 and 2.37 mSv.Gy(-1) for the Varian and Elekta accelerators, respectively, in comparison with 50 and 44 mSv.Gy(-1) obtained by TLDs. All the results showed more photoneutron production in the Varian accelerator than in the Elekta. According to the results, it seems that TLD600 and TLD700 pairs are not suitable dosimeters for neutron dosimetry inside the linac field due to the high photon flux, while the MCNPX code is an appropriate alternative for studying photoneutron production.

  14. Micromagnetic Code Development of Advanced Magnetic Structures Final Report CRADA No. TC-1561-98

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cerjan, Charles J.; Shi, Xizeng

    The specific goals of this project were to: further develop the previously written micromagnetic code DADIMAG (DOE code release number 980017); validate the code. The resulting code was expected to be more realistic and useful for simulations of magnetic structures of specific interest to Read-Rite programs. We also planned to further develop the code for use in internal LLNL programs. This project complemented LLNL CRADA TC-840-94 between LLNL and Read-Rite, which allowed for simulations of the advanced magnetic head development completed under the CRADA. TC-1561-98 was effective concurrently with LLNL non-exclusive copyright license (TL-1552-98) to Read-Rite for the DADIMAG Version 2 executable code.

  15. Monte Carlo calculations for reporting patient organ doses from interventional radiology

    NASA Astrophysics Data System (ADS)

    Huo, Wanli; Feng, Mang; Pi, Yifei; Chen, Zhi; Gao, Yiming; Xu, X. George

    2017-09-01

    This paper describes a project to generate organ dose data for the purposes of extending the VirtualDose software from CT imaging to interventional radiology (IR) applications. A library of 23 mesh-based anthropometric patient phantoms was used in Monte Carlo simulations for the database calculations. Organ doses and effective doses of IR procedures with specific beam projection, field of view (FOV) and beam quality were obtained for all parts of the body. Comparing organ doses generated by VirtualDose-IR for different beam qualities, beam projections, patient ages and patient body mass indexes (BMIs), significant discrepancies were observed. For the relatively long exposures typical of IR, doses depend on beam quality, beam direction and patient size. Therefore, VirtualDose-IR, which is based on the latest anatomically realistic patient phantoms, can generate accurate doses for IR procedures. The software is suitable for clinical IR dose management as an effective tool to estimate patient doses and optimize IR treatment plans.
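    Effective doses of the kind tabulated by VirtualDose-IR are tissue-weighted sums of organ equivalent doses, E = Σ_T w_T·H_T. A sketch using a few illustrative ICRP-style tissue weighting factors (deliberately not a complete set, and not VirtualDose's internal data):

```python
def effective_dose(organ_doses_mSv, tissue_weights):
    """Effective dose (mSv) as the tissue-weighted sum of organ equivalent
    doses. Only organs present in organ_doses_mSv contribute; a real
    calculation would use the full ICRP tissue-weight set."""
    return sum(tissue_weights[organ] * h for organ, h in organ_doses_mSv.items())
```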

  16. Monitoring Cosmic Radiation Risk: Comparisons between Observations and Predictive Codes for Naval Aviation

    DTIC Science & Technology

    2009-01-01

    proton; PARMA: PHITS-based Analytical Radiation Model in the Atmosphere; PCAIRE: Predictive Code for Aircrew Radiation Exposure; PHITS: Particle and ... radiation transport code utilized is called PARMA (PHITS-based Analytical Radiation Model in the Atmosphere) [36]. The particle fluxes calculated from the ... same dose equivalent coefficients from the ICRP-60 recommendations. As a result, the transport codes utilized by EXPACS (PHITS) and CARI-6 ...

  17. Monitoring Cosmic Radiation Risk: Comparisons Between Observations and Predictive Codes for Naval Aviation

    DTIC Science & Technology

    2009-07-05

    proton; PARMA: PHITS-based Analytical Radiation Model in the Atmosphere; PCAIRE: Predictive Code for Aircrew Radiation Exposure; PHITS: Particle and Heavy ... transport code utilized is called PARMA (PHITS-based Analytical Radiation Model in the Atmosphere) [36]. The particle fluxes calculated from the input ... dose equivalent coefficients from the ICRP-60 recommendations. As a result, the transport codes utilized by EXPACS (PHITS) and CARI-6 (PARMA ...

  18. DataRocket: Interactive Visualisation of Data Structures

    NASA Astrophysics Data System (ADS)

    Parkes, Steve; Ramsay, Craig

    2010-08-01

    CodeRocket is a software engineering tool that provides cognitive support to the software engineer for reasoning about a method or procedure and for documenting the resulting code [1]. DataRocket is a software engineering tool designed to support visualisation of and reasoning about program data structures. DataRocket is part of the CodeRocket family of software tools developed by Rapid Quality Systems [2], a spin-out company from the Space Technology Centre at the University of Dundee. CodeRocket and DataRocket integrate seamlessly with existing architectural design and coding tools and provide extensive documentation with little or no effort on the part of the software engineer. Comprehensive, abstract, detailed design documentation is available early in a project so that it can be used for design reviews with project managers and non-expert stakeholders. Code and documentation remain fully synchronised even when changes are implemented in the code without reference to the existing documentation. At the end of a project the press of a button suffices to produce the detailed design document. Existing legacy code can be easily imported into CodeRocket and DataRocket to reverse-engineer detailed design documentation, making legacy code more manageable and adding substantially to its value. This paper introduces CodeRocket. It then explains the rationale for DataRocket and describes the key features of this new tool. Finally, the major benefits of DataRocket for different stakeholders are considered.

  19. SU-F-I-53: Coded Aperture Coherent Scatter Spectral Imaging of the Breast: A Monte Carlo Evaluation of Absorbed Dose

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morris, R; Lakshmanan, M; Fong, G

Purpose: Coherent scatter based imaging has shown improved contrast and molecular specificity over conventional digital mammography; however, the biological risks have not been quantified due to a lack of accurate information on absorbed dose. This study intends to characterize the dose distribution and average glandular dose from coded aperture coherent scatter spectral imaging of the breast. The dose deposited in the breast from this new diagnostic imaging modality has not yet been quantitatively evaluated. Here, various digitized anthropomorphic phantoms are tested in a Monte Carlo simulation to evaluate the absorbed dose distribution and average glandular dose using clinically feasible scan protocols. Methods: Geant4 Monte Carlo radiation transport simulation software is used to replicate the coded aperture coherent scatter spectral imaging system. Energy sensitive, photon counting detectors are used to characterize the x-ray beam spectra for various imaging protocols. These input spectra are cross-validated with the results from XSPECT, a commercially available application that yields x-ray tube specific spectra for the operating parameters employed. XSPECT is also used to determine the appropriate number of photons emitted per mAs of tube current at a given kVp tube potential. With the implementation of the XCAT digital anthropomorphic breast phantom library, a variety of breast sizes with differing anatomical structure are evaluated. Simulations were performed with and without compression of the breast for dose comparison. Results: Through the Monte Carlo evaluation of a diverse population of breast types imaged under real-world scan conditions, a clinically relevant average glandular dose for this new imaging modality is extrapolated. 
Conclusion: With access to the physical coherent scatter imaging system used in the simulation, the results of this Monte Carlo study may be used to directly influence the future development of the modality to keep breast dose to a minimum while still maintaining clinically viable image quality.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, H; Gao, Y; Liu, T

Purpose: To develop quantitative clinical guidelines for choosing between supine Deep Inspiratory Breath Hold (DIBH) and prone free-breathing treatments for breast patients, we used 3D deformable phantoms in Monte Carlo simulations to predict the corresponding dose to the Organs at Risk (OARs). Methods: The RPI-adult female phantom (two selected cup sizes: A and D) was used to represent the female patient, and it was simulated using the MCNP6 Monte Carlo code. Doses to OARs were investigated for supine DIBH and prone treatments, considering the two breast sizes. The fluence maps of the 6-MV opposed tangential fields were exported. In the Monte Carlo simulation, the fluence maps allow each simulated photon particle to be weighted in the final dose calculation. The relative error of all dose calculations was kept below 5% by simulating 3×10^7 photons for each projection. Results: In terms of dosimetric accuracy, the RPI Adult Female phantom with cup size D in DIBH positioning matched a DIBH treatment plan of the patient. Based on the simulation results, for the cup size D phantom, prone positioning reduced the cardiac dose and the dose to other OARs, while the cup size A phantom benefits more from DIBH positioning. Comparing simulation results for the cup size A and D phantoms, dose to OARs was generally higher for the large breast size due to increased scattering arising from a larger portion of the body being in the primary beam. The lower dose registered for the heart in the large breast phantom in prone positioning was due to the increased distance between the heart and the primary beam when the breast was pendulous. Conclusion: Our 3D deformable phantom appears to be an excellent tool for predicting dose to the OARs in the supine DIBH and prone positions, which may support quantitative clinical decisions. Further investigation will be conducted. National Institutes of Health R01EB015478.

  1. Calculations of the skyshine gamma-ray dose rates from independent spent fuel storage installations (ISFSI) under worst case accident conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pace, J.V. III; Cramer, S.N.; Knight, J.R.

    1980-09-01

Calculations of the skyshine gamma-ray dose rates from three spent fuel storage pools under worst case accident conditions have been made using the discrete ordinates code DOT-IV and the Monte Carlo code MORSE, and have been compared to those of two previous methods. The DNA 37N-21G group cross-section library was utilized in the calculations, together with the Claiborne-Trubey gamma-ray dose factors taken from the same library. Plots of all results are presented. It was found that the dose was a strong function of the iron thickness over the fuel assemblies, the initial angular distribution of the emitted radiation, and the photon source near the top of the assemblies. 16 refs., 11 figs., 7 tabs.

  2. An electron-beam dose deposition experiment: TIGER 1-D simulation code versus thermoluminescent dosimetry

    NASA Astrophysics Data System (ADS)

    Murrill, Steven R.; Tipton, Charles W.; Self, Charles T.

    1991-03-01

The dose absorbed in an integrated circuit (IC) die exposed to a pulse of low-energy electrons is a strong function of both electron energy and surrounding packaging materials. This report describes an experiment designed to measure how well the Integrated TIGER Series one-dimensional (1-D) electron transport simulation program predicts dose correction factors for a state-of-the-art IC package and package/printed circuit board (PCB) combination. These derived factors are compared with data obtained experimentally using thermoluminescent dosimeters (TLDs) and the FX-45 flash x-ray machine (operated in electron-beam (e-beam) mode). The results of this experiment show that the TIGER 1-D simulation code can be used to accurately predict FX-45 e-beam dose deposition correction factors for reasonably complex IC packaging configurations.

  3. Proton irradiation on materials

    NASA Technical Reports Server (NTRS)

    Chang, C. Ken

    1993-01-01

A computer code was developed, utilizing a radiation transport code developed at NASA Langley Research Center, to study proton radiation effects on materials with potential application in NASA's future space missions. The code covers proton energies from 0.01 MeV to 100 GeV and is sufficient for energetic protons encountered in both low Earth and geosynchronous orbits. With some modification, the code can be extended to radiation sources heavier than protons. The code is capable of calculating the range, stopping power, exit energy, energy deposition coefficients, dose, and cumulative dose along the path of the proton in a target material. The target material can be any combination of the elements with atomic numbers ranging from 1 to 92, or any compound with known chemical composition. The generated cross sections for a material are stored and reused later to save computer time. This information can be utilized to calculate the proton dose a material would receive in an orbit when the radiation environment is known. It can also be used to determine, in the laboratory, parameters such as proton beam current and irradiation time needed to attain the desired dosage for accelerated ground testing of any material. It is hoped that the present work will be extended to include polymeric and composite materials, which are prime candidates for use in coatings, electronic components, and structures. It is also desirable to determine the laboratory parameters for ground testing these materials in order to simulate the dose they would receive in space environments. A sample print-out for water subjected to 1.5 MeV protons is included as a reference.
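The range and dose quantities this code computes follow from the stopping power. As a hedged illustration, a continuous-slowing-down-approximation (CSDA) range can be obtained by numerically integrating the reciprocal stopping power; the power-law stopping function below is a toy assumption for illustration only, not the Langley code's actual cross-section data.

```python
# Sketch of a CSDA range integral, R(E) = integral of dE'/S(E'),
# the kind of quantity a proton transport code of this type reports.
# The stopping power S(E) here is an invented power law, NOT real data.

def stopping_power(E_mev):
    """Toy stopping power in MeV/(g/cm^2); decreases with energy like E^-0.8."""
    return 170.0 * E_mev ** -0.8

def csda_range(E_mev, n_steps=10000):
    """Trapezoidal integration of 1/S(E) from the code's 0.01 MeV floor up to E."""
    e_min = 0.01  # MeV, matching the code's lower energy bound
    if E_mev <= e_min:
        return 0.0
    h = (E_mev - e_min) / n_steps
    total = 0.5 * (1.0 / stopping_power(e_min) + 1.0 / stopping_power(E_mev))
    for i in range(1, n_steps):
        total += 1.0 / stopping_power(e_min + i * h)
    return total * h  # areal range in g/cm^2

r10 = csda_range(10.0)
r100 = csda_range(100.0)
```

As expected for protons, the range grows rapidly with energy, so `r100` greatly exceeds `r10`.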

  4. Benchmark of PENELOPE code for low-energy photon transport: dose comparisons with MCNP4 and EGS4.

    PubMed

    Ye, Sung-Joon; Brezovich, Ivan A; Pareek, Prem; Naqvi, Shahid A

    2004-02-07

The expanding clinical use of low-energy photon emitting 125I and 103Pd seeds in recent years has led to renewed interest in their dosimetric properties. Numerous papers pointed out that higher accuracy could be obtained in Monte Carlo simulations by utilizing newer libraries for the low-energy photon cross-sections, such as XCOM and EPDL97. The recently developed PENELOPE 2001 Monte Carlo code is user friendly and incorporates photon cross-section data from the EPDL97. The code has been verified for clinical dosimetry of high-energy electron and photon beams, but has not yet been tested at low energies. In the present work, we have benchmarked the PENELOPE code for 10-150 keV photons. We computed radial dose distributions from 0 to 10 cm in water at photon energies of 10-150 keV using both PENELOPE and MCNP4C with either DLC-146 or DLC-200 cross-section libraries, assuming a point source located at the centre of a cylinder 30 cm in diameter and 20 cm in length. Throughout the energy range of simulated photons (except for 10 keV), PENELOPE agreed within statistical uncertainties (at worst ±5%) with MCNP/DLC-146 in the entire region of 1-10 cm and with published EGS4 data up to 5 cm. The dose at 1 cm (or dose rate constant) of PENELOPE agreed with MCNP/DLC-146 and EGS4 data within approximately ±2% in the range of 20-150 keV, while MCNP/DLC-200 produced values up to 9% lower in the range of 20-100 keV than PENELOPE or the other codes. However, the differences among the four datasets became negligible above 100 keV.

  5. Low-dose exposure to bisphenols A, F and S of human primary adipocyte impacts coding and non-coding RNA profiles

    PubMed Central

    Leloire, Audrey; Dhennin, Véronique; Coumoul, Xavier; Yengo, Loïc; Froguel, Philippe

    2017-01-01

Bisphenol A (BPA) exposure has been suspected to be associated with deleterious effects on health including obesity and metabolically-linked diseases. Although bisphenols F (BPF) and S (BPS) are BPA structural analogs commonly used in many marketed products as a replacement for BPA, only sparse toxicological data are available so far. Our objective was to comprehensively characterize bisphenol gene targets in a human primary adipocyte model, in order to determine whether they may induce cellular dysfunction, using chronic exposure at two concentrations: a “low dose” similar to the dose usually encountered in human biological fluids and a higher dose. Therefore, BPA, BPF and BPS were added at 10 nM or 10 μM during the differentiation of human primary adipocytes from subcutaneous fat of three non-diabetic Caucasian female patients. Gene expression (mRNA/lncRNA) arrays and microRNA arrays were used to assess coding and non-coding RNA changes. We detected significantly deregulated mRNA/lncRNA and miRNA at low and high doses. Enrichment in “cancer” and “organismal injury and abnormalities” related pathways was found in response to the three products. Some long intergenic non-coding RNAs and small nucleolar RNAs were differentially expressed, suggesting that bisphenols may also activate multiple cellular processes and epigenetic modifications. The analysis of upstream regulators of deregulated genes highlighted hormones or hormone-like chemicals, suggesting that BPS and BPF can be suspected to interfere, just like BPA, with hormonal regulation and have to be considered as endocrine disruptors. All these results suggest that, like BPA, its substitutes BPS and BPF should be used with the same restrictions. PMID:28628672

  6. A Dictionary Learning Approach with Overlap for the Low Dose Computed Tomography Reconstruction and Its Vectorial Application to Differential Phase Tomography

    PubMed Central

    Mirone, Alessandro; Brun, Emmanuel; Coan, Paola

    2014-01-01

X-ray based Phase-Contrast Imaging (PCI) techniques have been demonstrated to enhance the visualization of soft tissues in comparison to conventional imaging methods. Nevertheless the delivered dose as reported in the literature of biomedical PCI applications often equals or exceeds the limits prescribed in clinical diagnostics. The optimization of new computed tomography strategies, which includes the development and implementation of advanced image reconstruction procedures, is thus a key aspect. In this scenario, we implemented a dictionary learning method with a new form of convex functional. This functional contains, in addition to the usual sparsity-inducing and fidelity terms, a new term which forces similarity between overlapping patches in the superimposed regions. The functional depends on two free regularization parameters: a coefficient multiplying the sparsity-inducing L1 norm of the patch basis function coefficients, and a coefficient multiplying the L2 norm of the differences between patches in the overlapping regions. The solution is found by applying the iterative proximal gradient descent method with FISTA acceleration. The gradient is computed by calculating the projection of the solution and its error backprojection at each iterative step. We study the quality of the solution, as a function of the regularization parameters and noise, on synthetic data for which the solution is known a priori. We apply the method to experimental data in the case of Differential Phase Tomography. For this case we use an original approach which consists of using vectorial patches, each patch having two components: one for each gradient component. The resulting algorithm, implemented in the European Synchrotron Radiation Facility tomography reconstruction code PyHST, has proven to be efficient and well-adapted to strongly reduce the required dose and the number of projections in medical tomography. PMID:25531987

  7. A dictionary learning approach with overlap for the low dose computed tomography reconstruction and its vectorial application to differential phase tomography.

    PubMed

    Mirone, Alessandro; Brun, Emmanuel; Coan, Paola

    2014-01-01

X-ray based Phase-Contrast Imaging (PCI) techniques have been demonstrated to enhance the visualization of soft tissues in comparison to conventional imaging methods. Nevertheless the delivered dose as reported in the literature of biomedical PCI applications often equals or exceeds the limits prescribed in clinical diagnostics. The optimization of new computed tomography strategies, which includes the development and implementation of advanced image reconstruction procedures, is thus a key aspect. In this scenario, we implemented a dictionary learning method with a new form of convex functional. This functional contains, in addition to the usual sparsity-inducing and fidelity terms, a new term which forces similarity between overlapping patches in the superimposed regions. The functional depends on two free regularization parameters: a coefficient multiplying the sparsity-inducing L1 norm of the patch basis function coefficients, and a coefficient multiplying the L2 norm of the differences between patches in the overlapping regions. The solution is found by applying the iterative proximal gradient descent method with FISTA acceleration. The gradient is computed by calculating the projection of the solution and its error backprojection at each iterative step. We study the quality of the solution, as a function of the regularization parameters and noise, on synthetic data for which the solution is known a priori. We apply the method to experimental data in the case of Differential Phase Tomography. For this case we use an original approach which consists of using vectorial patches, each patch having two components: one for each gradient component. The resulting algorithm, implemented in the European Synchrotron Radiation Facility tomography reconstruction code PyHST, has proven to be efficient and well-adapted to strongly reduce the required dose and the number of projections in medical tomography.
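The FISTA-accelerated proximal gradient scheme described in this record can be sketched on a toy L1-regularized least-squares problem. This is a minimal illustration, not the PyHST implementation: the 2x2 system, step size and regularization weight below are invented, and the patch-overlap L2 term is omitted for brevity.

```python
# Minimal FISTA sketch for min_x 0.5*||A x - b||^2 + lam*||x||_1,
# the same accelerated proximal gradient scheme the abstract describes.
# A, b, lam and the step size are toy values chosen for illustration.

def soft_threshold(v, t):
    """Proximal operator of the L1 (sparsity-inducing) term."""
    return [max(abs(x) - t, 0.0) * (1.0 if x >= 0 else -1.0) for x in v]

def fista(A, b, lam, step, n_iter=200):
    n = len(A[0])
    x = [0.0] * n          # current iterate
    y = list(x)            # extrapolated point
    t = 1.0                # momentum parameter
    for _ in range(n_iter):
        # Gradient of the fidelity term: A^T (A y - b)
        r = [sum(A[i][j] * y[j] for j in range(n)) - b[i] for i in range(len(b))]
        g = [sum(A[i][j] * r[i] for i in range(len(b))) for j in range(n)]
        # Proximal gradient step, then Nesterov-style extrapolation
        x_new = soft_threshold([y[j] - step * g[j] for j in range(n)], step * lam)
        t_new = 0.5 * (1.0 + (1.0 + 4.0 * t * t) ** 0.5)
        y = [x_new[j] + (t - 1.0) / t_new * (x_new[j] - x[j]) for j in range(n)]
        x, t = x_new, t_new
    return x

A = [[1.0, 0.0], [0.0, 1.0]]
b = [1.0, 0.2]
x = fista(A, b, lam=0.5, step=1.0)
# With A = I, the minimizer is soft_threshold(b, lam) = [0.5, 0.0].
```

With an identity system the closed-form solution is known, which makes the toy convenient for checking the iteration; in the tomography setting A would be the projection operator and its transpose the error backprojection.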

  8. Computation of Cosmic Ray Ionization and Dose at Mars: a Comparison of HZETRN and Planetocosmics for Proton and Alpha Particles

    NASA Technical Reports Server (NTRS)

    Gronoff, Guillaume; Norman, Ryan B.; Mertens, Christopher J.

    2014-01-01

The ability to evaluate the cosmic ray environment at Mars is of interest for future manned exploration. To support exploration, tools must be developed to accurately assess the radiation environment in both free space and on planetary surfaces. The primary tool NASA uses to quantify radiation exposure behind shielding materials is the space radiation transport code HZETRN. To build confidence in HZETRN, code benchmarking against Monte Carlo radiation transport codes is often used. This work compares the dose calculations at Mars by HZETRN and the Geant4 application Planetocosmics. The dose at the ground and the energy deposited in the atmosphere by galactic cosmic ray protons and alpha particles have been calculated for the Curiosity landing conditions. In addition, this work has considered Solar Energetic Particle events, allowing for the comparison of varying input radiation environments. The results for protons and alpha particles show very good agreement between HZETRN and Planetocosmics.

  9. Neutron dose rate analysis on HTGR-10 reactor using Monte Carlo code

    NASA Astrophysics Data System (ADS)

    Suwoto; Adrial, H.; Hamzah, A.; Zuhair; Bakhri, S.; Sunaryo, G. R.

    2018-02-01

The HTGR-10 reactor has a cylindrical core fuelled with kernel TRISO coated fuel particles in spherical pebbles and a helium cooling system. The helium coolant outlet temperature from the reactor core is designed to be 700 °C. One advantage of the HTGR-type reactor is its co-generation capability: in addition to generating electricity, the reactor is designed to produce high-temperature heat that can be used for other processes. Each spherical fuel pebble contains 8335 TRISO UO2 kernel coated particles, with enrichments of 10% and 17%, dispersed in a graphite matrix. The main purpose of this study was to analyse the distribution of neutron dose rates generated by the HTGR-10 reactor. The calculation and analysis of the neutron dose rate in the HTGR-10 reactor core were performed using the Monte Carlo MCNP5v1.6 code. The double heterogeneity of the TRISO coated kernel fuel particles and the spherical fuel pebbles in the HTGR-10 core is modelled well with the MCNP5v1.6 code. Neutron flux-to-dose conversion factors taken from the International Commission on Radiological Protection (ICRP-74) were used to determine the dose rate passing through the active core, reflectors, core barrel, reactor pressure vessel (RPV) and biological shield. The neutron dose rates calculated with the MCNP5v1.6 code using the ICRP-74 (2009) conversion factors for radiation workers in the radial direction outside the RPV (radial position = 220 cm from the centre of the HTGR-10 core) are 9.22E-4 μSv/h and 9.58E-4 μSv/h for enrichments of 10% and 17%, respectively. These values comply with BAPETEN Chairman’s Regulation Number 4 Year 2013 on Radiation Protection and Safety in Nuclear Energy Utilization, which sets the limit for the average effective dose for radiation workers at 20 mSv/year, or 10 μSv/h. 
Thus, protection and safety of radiation workers with respect to this radiation source are ensured. From this analysis, it can be concluded that the calculated neutron dose rates for the HTGR-10 core meet the required radiation safety standards.
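The flux-to-dose-rate folding this record describes can be sketched as a sum of group fluxes multiplied by ICRP-74-style conversion coefficients, compared against the 10 μSv/h worker limit cited above. The two-group spectrum and the coefficient values below are illustrative assumptions, not the paper's data.

```python
# Sketch of a flux-to-dose-rate conversion: fold a (toy) two-group neutron
# flux with (toy) fluence-to-effective-dose coefficients, then compare the
# result with the 10 uSv/h limit for radiation workers cited in the record.

# (flux in n/cm^2/s, coefficient in pSv*cm^2) per energy group -- invented
groups = [
    (2.0e-1, 10.0),   # slow/thermal-like group
    (5.0e-2, 350.0),  # fast group (higher dose per unit fluence)
]

dose_rate_psv_per_s = sum(flux * coeff for flux, coeff in groups)
dose_rate_usv_per_h = dose_rate_psv_per_s * 3600.0 * 1e-6  # pSv/s -> uSv/h

WORKER_LIMIT_USV_PER_H = 10.0  # from the regulation cited in the record
compliant = dose_rate_usv_per_h < WORKER_LIMIT_USV_PER_H
```

In a transport calculation the "flux" values would come from tallies at the location of interest (e.g., outside the RPV) rather than being listed by hand.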

  10. Physics Verification Overview

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doebling, Scott William

The purpose of the verification project is to: establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.

  11. Using the Monte Carlo method for assessing the tissue and organ doses of patients in dental radiography

    NASA Astrophysics Data System (ADS)

    Makarevich, K. O.; Minenko, V. F.; Verenich, K. A.; Kuten, S. A.

    2016-05-01

This work is dedicated to modeling dental radiographic examinations to assess the absorbed doses and effective doses of patients. The TASMIP empirical model is used to simulate the X-ray spectra. Doses are assessed with the Monte Carlo method, using the MCNP code applied to the ICRP voxel phantoms. Results of the assessment of doses to individual organs and effective doses for different types of dental examinations and X-ray tube settings are presented.

  12. The QDREC web server: determining dose-response characteristics of complex macroparasites in phenotypic drug screens.

    PubMed

    Asarnow, Daniel; Rojo-Arreola, Liliana; Suzuki, Brian M; Caffrey, Conor R; Singh, Rahul

    2015-05-01

Neglected tropical diseases (NTDs) caused by helminths constitute some of the most common infections of the world's poorest people. The etiological agents are complex and recalcitrant to standard techniques of molecular biology. Drug screening against helminths has often been phenotypic and typically involves manual description of drug effect and efficacy. A key challenge is to develop automated, quantitative approaches to drug screening against helminth diseases. The quantal dose-response calculator (QDREC) constitutes a significant step in this direction. It can be used to automatically determine quantitative dose-response characteristics and half-maximal effective concentration (EC50) values using image-based readouts from phenotypic screens, thereby allowing rigorous comparisons of the efficacies of drug compounds. QDREC has been developed and validated in the context of drug screening for schistosomiasis, one of the most important NTDs. However, it is equally applicable to general phenotypic screening involving helminths and other complex parasites. QDREC is publicly available at: http://haddock4.sfsu.edu/qdrec2/. Source code and datasets are at: http://tintin.sfsu.edu/projects/phenotypicAssays.html. Contact: rahul@sfsu.edu. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
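The EC50 estimation that a tool like QDREC automates can be sketched as log-interpolation on quantal dose-response data: find the two doses bracketing the 50% response and interpolate between them on a log-dose scale. This is a minimal sketch on invented data, not QDREC's actual algorithm, which works from image-based phenotypic readouts.

```python
# Sketch of half-maximal effective concentration (EC50) estimation from
# quantal dose-response data by log-dose interpolation. The table of
# doses and response fractions below is invented for illustration.
import math

doses = [0.1, 1.0, 10.0, 100.0]       # drug concentration, e.g. in uM
responses = [0.05, 0.30, 0.80, 0.98]  # fraction of parasites affected

def ec50(doses, responses, level=0.5):
    """Find the dose giving `level` response by interpolating on log10(dose)."""
    pairs = list(zip(doses, responses))
    for (d0, r0), (d1, r1) in zip(pairs, pairs[1:]):
        if r0 <= level <= r1:
            f = (level - r0) / (r1 - r0)
            return 10 ** (math.log10(d0) + f * (math.log10(d1) - math.log10(d0)))
    raise ValueError("50% response level not bracketed by the data")

value = ec50(doses, responses)  # falls between the 1.0 and 10.0 uM doses
```

A production tool would instead fit a full sigmoidal (Hill) model to all points; bracketing interpolation is the simplest scheme that conveys the idea.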

  13. From pandemic preparedness to biofuel production: Tobacco finds its biotechnology niche in North America

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Powell, Joshua D.

As part of my NSD Innovation awarded funds (95470 Powell Innovation: charge code N38540), one of my deliverables was a review article for journal submission summarizing my work on this project. My NSD Innovation project is expressing Ebola antibodies in tobacco plants. The abstract is attached below. Title: From pandemic preparedness to biofuel production: tobacco finds its biotechnology niche in North America. Abstract: In 2012 scientists funded by the U.S. Defense Advanced Research Projects Agency (DARPA) produced 10 million doses of influenza vaccine in tobacco in a milestone deadline of one month. Recently the experimental antibody cocktail Zmapp™, also produced in tobacco, has shown promise as an emergency intervention therapeutic against Ebola. These two examples showcase how collaborative efforts between government, private industry and academia are applying plant biotechnology to combat pathogenic agents. Opportunities now exist for repurposing tobacco expression systems for exciting new applications in synthetic biology, biofuel production and industrial enzyme production. Lastly, as plant-produced biotherapeutics become more mainstream, government funding agencies need to be cognizant of the idea that many plant-produced biologicals are often safer, cheaper and just as efficacious as their counterparts produced using traditional expression systems.

  14. From pandemic preparedness to biofuel production: Tobacco finds its biotechnology niche in North America

    DOE PAGES

    Powell, Joshua D.

    2015-09-25

As part of my NSD Innovation awarded funds (95470 Powell Innovation: charge code N38540), one of my deliverables was a review article for journal submission summarizing my work on this project. My NSD Innovation project is expressing Ebola antibodies in tobacco plants. The abstract is attached below. Title: From pandemic preparedness to biofuel production: tobacco finds its biotechnology niche in North America. Abstract: In 2012 scientists funded by the U.S. Defense Advanced Research Projects Agency (DARPA) produced 10 million doses of influenza vaccine in tobacco in a milestone deadline of one month. Recently the experimental antibody cocktail Zmapp™, also produced in tobacco, has shown promise as an emergency intervention therapeutic against Ebola. These two examples showcase how collaborative efforts between government, private industry and academia are applying plant biotechnology to combat pathogenic agents. Opportunities now exist for repurposing tobacco expression systems for exciting new applications in synthetic biology, biofuel production and industrial enzyme production. Lastly, as plant-produced biotherapeutics become more mainstream, government funding agencies need to be cognizant of the idea that many plant-produced biologicals are often safer, cheaper and just as efficacious as their counterparts produced using traditional expression systems.

  15. Comparison of IPSM 1990 photon dosimetry code of practice with IAEA TRS‐398 and AAPM TG‐51.

    PubMed Central

    Henríquez, Francisco Cutanda

    2009-01-01

Several codes of practice for photon dosimetry, supported by different organizations, are currently used around the world. A comparison of IPSM 1990 with both IAEA TRS‐398 and AAPM TG‐51 has been performed. All three protocols are based on the calibration of ionization chambers in terms of standards of absorbed dose to water, as is the case with other modern codes of practice. The comparison was carried out for photon beams of nominal energies 4 MV, 6 MV, 8 MV, 10 MV and 18 MV. An NE 2571 graphite ionization chamber was used in this study, cross‐calibrated against an NE 2611A Secondary Standard calibrated at the National Physical Laboratory (NPL). Absolute dose in reference conditions was obtained using each of the three protocols, including beam quality indices, beam quality conversion factors (both theoretical and NPL experimental ones), correction factors for influence quantities, and absolute dose measurements. Each protocol's recommendations were strictly followed. Uncertainties were obtained according to the ISO Guide to the Expression of Uncertainty in Measurement. Absorbed doses obtained according to all three protocols agree within experimental uncertainty. The largest difference between absolute dose results for two protocols was obtained for the highest energy: 0.7% between IPSM 1990 and IAEA TRS‐398 using theoretical beam quality conversion factors. PACS number: 87.55.tm

  16. Space radiation dose estimates on the surface of Mars

    NASA Technical Reports Server (NTRS)

    Simonsen, Lisa C.; Nealy, John E.; Townsend, Lawrence W.; Wilson, John W.

    1990-01-01

    The Langley cosmic ray transport code and the Langley nucleon transport code (BRYNTRN) are used to quantify the transport and attenuation of galactic cosmic rays (GCR) and solar proton flares through the Martian atmosphere. Surface doses are estimated using both a low density and a high density carbon dioxide model of the atmosphere which, in the vertical direction, provides a total of 16 g/sq cm and 22 g/sq cm of protection, respectively. At the Mars surface during the solar minimum cycle, a blood-forming organ (BFO) dose equivalent of 10.5 to 12 rem/yr due to galactic cosmic ray transport and attenuation is calculated. Estimates of the BFO dose equivalents which would have been incurred from the three large solar flare events of August 1972, November 1960, and February 1956 are also calculated at the surface. Results indicate surface BFO dose equivalents of approximately 2 to 5, 5 to 7, and 8 to 10 rem per event, respectively. Doses are also estimated at altitudes up to 12 km above the Martian surface where the atmosphere will provide less total protection.

  17. An investigation of voxel geometries for MCNP-based radiation dose calculations.

    PubMed

    Zhang, Juying; Bednarz, Bryan; Xu, X George

    2006-11-01

Voxelized geometries, such as those obtained from medical images, are increasingly used in Monte Carlo calculations of absorbed dose. One useful application of calculated absorbed dose is the determination of fluence-to-dose conversion factors for different organs. However, confusion still exists about how such a geometry is defined and how the energy deposition is best computed, especially with a popular code, MCNP5. This study investigated two different types of geometry definitions in the MCNP5 code: cell and lattice definitions. A 10 cm x 10 cm x 10 cm test phantom, which contained an embedded 2 cm x 2 cm x 2 cm target at its center, was considered. A planar source emitting parallel photons was also considered in the study. The results revealed that MCNP5 does not calculate the total target volume for multi-voxel geometries. Therefore, tallies that involve the total target volume must be divided by the user by the total number of voxels to obtain a correct dose result. Also, using planar source areas greater than the phantom size results in the same fluence-to-dose conversion factor.
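The normalization pitfall described above can be illustrated numerically. This is a sketch of the bookkeeping only, not of MCNP5 itself; the voxel size and tally value below are invented.

```python
# Numeric sketch of the correction described in the record: a tally summed
# over a multi-voxel lattice target must be divided by the number of voxels
# by the user, because MCNP5 does not compute the total target volume.
# All numbers here are invented for illustration.

voxel_size_cm = 0.5
target_edge_cm = 2.0  # the 2 cm x 2 cm x 2 cm embedded target from the study

# Number of voxels along one edge, cubed: (2.0 / 0.5)^3 = 64
n_voxels = int(round((target_edge_cm / voxel_size_cm) ** 3))

raw_tally_sum = 6.4e-3            # hypothetical summed lattice tally value
corrected = raw_tally_sum / n_voxels  # per-voxel-normalized, correct result
```

The same division would apply to any tally type whose result is implicitly averaged over a single cell volume but accumulated over the whole lattice.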

  18. A test of the IAEA code of practice for absorbed dose determination in photon and electron beams

    NASA Astrophysics Data System (ADS)

    Leitner, Arnold; Tiefenboeck, Wilhelm; Witzani, Josef; Strachotinsky, Christian

    1990-12-01

The IAEA (International Atomic Energy Agency) code of practice TRS 277 gives recommendations for absorbed dose determination in high energy photon and electron beams based on the use of ionization chambers calibrated in terms of exposure or air kerma. The scope of the work was to test the code for cobalt-60 gamma radiation and for several radiation qualities at four different types of electron accelerators, and to compare the ionization chamber dosimetry with ferrous sulphate dosimetry. The results show agreement between the two methods within about one per cent for all the investigated qualities. In addition, the response of the TLD capsules of the IAEA/WHO TL dosimetry service was determined.

  19. Effect of Localizer Radiography Projection on Organ Dose at Chest CT with Automatic Tube Current Modulation.

    PubMed

    Saltybaeva, Natalia; Krauss, Andreas; Alkadhi, Hatem

    2017-03-01

Purpose To calculate the contribution of the localizer radiography projection to the total radiation dose, including both the dose from localizer radiography and that from subsequent chest computed tomography (CT) with tube current modulation (TCM). Materials and Methods An anthropomorphic phantom was scanned with 192-section CT without and with differently sized breast attachments. Chest CT with TCM was performed after one localizer radiographic examination with an anteroposterior (AP) or posteroanterior (PA) projection. Dose distributions were obtained by means of Monte Carlo simulations based on acquired CT data. For Monte Carlo simulations of localizer radiography, the tube position was fixed at 0° and 180°; for chest CT, a spiral trajectory with TCM was used. The effect of tube start angles on dose distribution was investigated with Monte Carlo simulations by using TCM curves with fixed start angles (0°, 90°, and 180°). Total doses for lungs, heart, and breast were calculated as the sum of the dose from localizer radiography and CT. Image noise was defined as the standard deviation of attenuation measured in 14 circular regions of interest. The Wilcoxon signed rank test, paired t test, and Friedman analysis of variance were conducted to evaluate differences in noise, TCM curves, and organ doses, respectively. Results Organ doses from localizer radiography were lower when using a PA instead of an AP projection (P = .005). The use of a PA projection resulted in higher TCM values for chest CT (P < .001) owing to the higher attenuation (P < .001) and thus resulted in higher total organ doses for all investigated phantoms and protocols (P < .001). Noise in CT images was lower with PA localizer radiography than with AP localizer radiography (P = .03). The use of an AP projection allowed for total dose reductions of 16%, 15%, and 12% for lungs, breast, and heart, respectively. Differences in organ doses were not related to tube start angles (P = .17). 
Conclusion The total organ doses are higher when using PA projection localizer radiography owing to higher TCM values, whereas the organ doses from PA localizer radiography alone are lower. Thus, PA localizer radiography should be used in combination with reduced reference tube current at subsequent chest CT. © RSNA, 2016 Online supplemental material is available for this article.

  20. Neutron irradiation and damage assessment of plastic scintillators of the Tile Calorimeter

    NASA Astrophysics Data System (ADS)

    Mdhluli, J. E.; Mellado, B.; Sideras-Haddad, E.

    2017-01-01

    Following the comparative study of proton-induced radiation damage on various plastic scintillator samples from the ATLAS-CERN detector, a study of neutron irradiation and damage assessment on the same type of samples will be conducted. The samples will be irradiated with different dose rates of neutrons produced in favourable nuclear reactions using a radiofrequency linear particle accelerator, as well as from the SAFARI nuclear reactor at NECSA. The MCNP 5 code will be utilized to simulate the neutron transport for determining the dose rate. Light transmission and light yield tests will be performed in order to assess the radiation damage to the scintillators. In addition, Raman spectroscopy and Electron Paramagnetic Resonance (EPR) analysis will be used to characterize the samples after irradiation. The project aims to extend these studies to include radiation damage assessment of any component that processes the scintillation light and affects the quantum efficiency of the Tilecal detector, namely, the photomultiplier tubes, wavelength-shifting optical fibres and the readout electronics. These will also be exposed to neutron irradiation and the damage assessed in the same manner.

  1. TH-A-18C-03: Noise Correlation in CBCT Projection Data and Its Application for Noise Reduction in Low-Dose CBCT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    ZHANG, H; Huang, J; Ma, J

    2014-06-15

    Purpose: To study the noise correlation properties of cone-beam CT (CBCT) projection data and to incorporate the noise correlation information into a statistics-based projection restoration algorithm for noise reduction in low-dose CBCT. Methods: In this study, we systematically investigated the noise correlation properties among detector bins of CBCT projection data by analyzing repeated projection measurements. The measurements were performed on a TrueBeam on-board CBCT imaging system with a 4030CB flat panel detector. An anthropomorphic male pelvis phantom was used to acquire 500 repeated projections at six different dose levels from 0.1 mAs to 1.6 mAs per projection at three fixed angles. To minimize the influence of the lag effect, lag correction was performed on the consecutively acquired projection data. The noise correlation coefficient between detector bin pairs was calculated from the corrected projection data. The noise correlation among CBCT projection data was then incorporated into the covariance matrix of the penalized weighted least-squares (PWLS) criterion for noise reduction of low-dose CBCT. Results: The analyses of the repeated measurements show that noise correlation coefficients are non-zero between the nearest neighboring bins of CBCT projection data. The average noise correlation coefficients for the first- and second-order neighbors are about 0.20 and 0.06, respectively. The noise correlation coefficients are independent of the dose level. Reconstruction of the pelvis phantom shows that the PWLS criterion with consideration of noise correlation (PWLS-Cor) results in a lower noise level than the PWLS criterion without consideration of noise correlation (PWLS-Dia) at matched resolution. Conclusion: Noise is correlated among the nearest neighboring detector bins of CBCT projection data. An accurate noise model of CBCT projection data can improve the performance of the statistics-based projection restoration algorithm for low-dose CBCT.
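
    The correlation estimate at the heart of this study is simple to state. Below is a minimal sketch of estimating the neighbour-to-neighbour noise correlation from repeated readings of one detector row at a fixed angle; the array names and shapes are illustrative assumptions, not the authors' code.

```python
import numpy as np

def neighbor_noise_correlation(projections, order=1):
    """Mean noise correlation coefficient between detector bins that are
    `order` bins apart.

    projections: (n_repeats, n_bins) array of repeated readings of one
    detector row acquired at a fixed gantry angle.
    """
    noise = projections - projections.mean(axis=0)   # remove the mean signal
    a, b = noise[:, :-order], noise[:, order:]       # bin pairs `order` apart
    corr = (a * b).mean(axis=0) / (a.std(axis=0) * b.std(axis=0))
    return float(corr.mean())
```

    On synthetic data in which adjacent bins share a noise term, the first-order coefficient comes out near its theoretical value while the second-order coefficient stays near zero, mirroring the 0.20 versus 0.06 pattern reported above.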

  2. Fluence-to-dose conversion coefficients for neutrons and protons calculated using the PHITS code and ICRP/ICRU adult reference computational phantoms.

    PubMed

    Sato, Tatsuhiko; Endo, Akira; Zankl, Maria; Petoussi-Henss, Nina; Niita, Koji

    2009-04-07

    The fluence to organ-dose and effective-dose conversion coefficients for neutrons and protons with energies up to 100 GeV were calculated using the PHITS code coupled to male and female adult reference computational phantoms, which are to be released as a common ICRP/ICRU publication. For the calculation, the radiation and tissue weighting factors, w(R) and w(T), respectively, as revised in ICRP Publication 103 were employed. The conversion coefficients for effective dose equivalents derived using the radiation quality factors of both the Q(L) and Q(y) relationships were also estimated, utilizing the functions implemented in PHITS for calculating the probability densities of the absorbed dose in terms of LET (L) and lineal energy (y), respectively. By comparing these data with the corresponding data for the effective dose, we found that the numerical compatibility of the revised w(R) with the Q(L) and Q(y) relationships is fairly well established. The calculated dose conversion coefficients are indispensable for constructing radiation protection systems based on the new recommendations given in ICRP Publication 103 for aircrews and astronauts, as well as for workers at accelerators and nuclear facilities.
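
    In application, such coefficients convert a computed fluence spectrum into a dose by folding. A hedged sketch of that step (grids, units and names here are illustrative; real tabulations are first interpolated, typically log-log, onto the fluence grid):

```python
import numpy as np

def fold_spectrum(energies, fluence, coeff):
    """Fold a fluence spectrum with fluence-to-dose conversion coefficients.

    energies : energy grid (MeV), ascending
    fluence  : particles per (cm^2 MeV) at each grid energy
    coeff    : conversion coefficient (pSv cm^2) at each grid energy
    Returns the dose in pSv via trapezoidal integration.
    """
    y = np.asarray(fluence) * np.asarray(coeff)
    return float(np.sum((y[1:] + y[:-1]) * np.diff(energies)) / 2.0)
```

    For tabulated coefficients this amounts to one quadrature per organ or dose quantity of interest.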

  3. SU-F-18C-09: Assessment of OSL Dosimeter Technology in the Validation of a Monte Carlo Radiation Transport Code for CT Dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carver, D; Kost, S; Pickens, D

    Purpose: To assess the utility of optically stimulated luminescent (OSL) dosimeter technology in calibrating and validating a Monte Carlo radiation transport code for computed tomography (CT). Methods: Exposure data were taken using both a standard CT 100-mm pencil ionization chamber and a series of 150-mm OSL CT dosimeters. Measurements were made at system isocenter in air as well as in standard 16-cm (head) and 32-cm (body) CTDI phantoms at isocenter and at the 12 o'clock positions. Scans were performed on a Philips Brilliance 64 CT scanner for 100 and 120 kVp at 300 mAs with a nominal beam width of 40 mm. A radiation transport code to simulate the CT scanner conditions was developed using the GEANT4 physics toolkit. The imaging geometry and associated parameters were simulated for each ionization chamber and phantom combination. Simulated absorbed doses were compared to both CTDI100 values determined from the ion chamber and to CTDI100 values reported from the OSLs. The dose profiles from each simulation were also compared to the physical OSL dose profiles. Results: CTDI100 values reported by the ion chamber and OSLs are generally in good agreement (average percent difference of 9%), and provide a suitable way to calibrate doses obtained from simulation to real absorbed doses. Simulated and real CTDI100 values agree to within 10% or less, and the simulated dose profiles also predict the physical profiles reported by the OSLs. Conclusion: Ionization chambers are generally considered the standard for absolute dose measurements. However, OSL dosimeters may also serve as a useful tool with the significant benefit of also assessing the radiation dose profile. This may offer an advantage to those developing simulations for assessing radiation dosimetry such as verification of spatial dose distribution and beam width.
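
    The quantity both detector types report can be computed directly from a longitudinal dose profile, which is exactly what the 150-mm OSL strip provides. A sketch (the grid and profile are hypothetical; nT is the total nominal collimated beam width):

```python
import numpy as np

def ctdi100(z_mm, dose_profile, beam_width_mm):
    """CTDI100: integral of the dose profile D(z) over the central
    100 mm, divided by the total nominal beam width nT (in mm)."""
    central = np.abs(z_mm) <= 50.0                  # the central 100 mm
    z, d = z_mm[central], dose_profile[central]
    integral = float(np.sum((d[1:] + d[:-1]) * np.diff(z)) / 2.0)
    return integral / beam_width_mm
```

    For an idealized rectangular profile exactly as wide as the collimation, this returns the peak dose, as expected.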

  4. Method for the prediction of the effective dose equivalent to the crew of the International Space Station

    NASA Astrophysics Data System (ADS)

    El-Jaby, Samy; Tomi, Leena; Sihver, Lembit; Sato, Tatsuhiko; Richardson, Richard B.; Lewis, Brent J.

    2014-03-01

    This paper describes a methodology for assessing the pre-mission exposure of space crew aboard the International Space Station (ISS) in terms of an effective dose equivalent. In this approach, the PHITS Monte Carlo code was used to assess the particle transport of galactic cosmic radiation (GCR) and trapped radiation for solar maximum and minimum conditions through aluminum shielding. From the predicted spectra, and using fluence-to-dose conversion factors, a scaling ratio of the effective dose equivalent rate to the ICRU ambient dose equivalent rate at a 10 mm depth was determined. Only contributions from secondary neutrons, protons, and alpha particles were considered in this analysis. Measurements made with a tissue equivalent proportional counter (TEPC) located at Service Module panel 327, as captured through a semi-empirical correlation in the ISSCREM code, were then scaled using this conversion factor to predict the effective dose equivalent. This analysis shows that at this location within the service module, the total effective dose equivalent is 10-30% less than the total TEPC dose equivalent. Approximately 75-85% of the effective dose equivalent is derived from the GCR. This methodology provides an opportunity for pre-flight predictions of the effective dose equivalent and therefore offers a means to assess the health risks of radiation exposure to ISS flight crew.
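
    The final scaling step can be sketched in a few lines; the split into GCR and trapped components and every number below are hypothetical, not the paper's values.

```python
def effective_dose_equivalent_rate(tepc_h10_rate, ratio_gcr, ratio_trapped,
                                   gcr_fraction):
    """Scale a TEPC ambient dose equivalent rate H*(10) to an effective
    dose equivalent rate, applying separate simulated E/H*(10) ratios to
    the GCR and trapped-radiation components of the reading."""
    return tepc_h10_rate * (gcr_fraction * ratio_gcr
                            + (1.0 - gcr_fraction) * ratio_trapped)
```

    With ratios below unity, the scaled result comes out 10-30% below the raw TEPC dose equivalent, consistent with the finding quoted above.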

  5. Fluence-to-dose conversion coefficients for heavy ions calculated using the PHITS code and the ICRP/ICRU adult reference computational phantoms.

    PubMed

    Sato, Tatsuhiko; Endo, Akira; Niita, Koji

    2010-04-21

    The fluence to organ-absorbed-dose and effective-dose conversion coefficients for heavy ions with atomic numbers up to 28 and energies from 1 MeV/nucleon to 100 GeV/nucleon were calculated using the PHITS code coupled to the ICRP/ICRU adult reference computational phantoms, following the instructions given in ICRP Publication 103 (2007 (Oxford: Pergamon)). The conversion coefficients for effective dose equivalents derived using the radiation quality factors of both the Q(L) and Q(y) relationships were also estimated, utilizing the functions implemented in PHITS for calculating the probability densities of absorbed dose in terms of LET (L) and lineal energy (y), respectively. The calculation results indicate that the effective dose can generally give a conservative estimate of the effective dose equivalent for heavy-ion exposure, although it is occasionally too conservative, especially for high-energy lighter-ion irradiations. It is also found that the conversion coefficients for the Q(y)-based effective dose equivalents are generally smaller than the corresponding Q(L)-based values because of the conceptual difference between LET and y as well as the numerical incompatibility between the Q(L) and Q(y) relationships. The calculated dose conversion coefficients are very useful for estimating the doses received by astronauts from cosmic-ray exposure.

  6. Evaluation and comparison of absorbed dose for electron beams by LiF and diamond dosimeters

    NASA Astrophysics Data System (ADS)

    Mosia, G. J.; Chamberlain, A. C.

    2007-09-01

    The absorbed dose response of LiF and diamond thermoluminescent dosimeters (TLDs), calibrated in 60Co γ-rays, has been determined using the MCNP4B Monte Carlo code system in mono-energetic megavoltage electron beams from 5 to 20 MeV. The dose responses were evaluated against the published dose responses of other investigators. The dose responses of both dosimeters were compared to establish whether any relation exists between them. The dosimeters were irradiated in a water phantom with the centre of their top surfaces (0.32 × 0.32 cm^2) placed at dmax, perpendicular to the radiation beam on the central axis. For the LiF TLD, dose responses ranged from 0.945±0.017 to 0.997±0.011. For the diamond TLD, dose responses ranged from 0.940±0.017 to 1.018±0.011. To correct the dose responses of both dosimeters, energy correction factors were generated from the dose response results of both TLDs. For the LiF TLD, these correction factors ranged from 1.003 to 1.058, and for the diamond TLD the factors ranged from 0.982 to 1.064. The results show that diamond TLDs can be used in place of the well-established LiF TLDs and that Monte Carlo code systems can be used in dose determinations for radiotherapy treatment planning.

  7. Monte Carlo N Particle code - Dose distribution of clinical electron beams in inhomogeneous phantoms

    PubMed Central

    Nedaie, H. A.; Mosleh-Shirazi, M. A.; Allahverdi, M.

    2013-01-01

    Electron dose distributions calculated using the currently available analytical methods can be associated with large uncertainties. The Monte Carlo method is the most accurate method for dose calculation in electron beams. Most clinical electron beam simulation studies have been performed using non-MCNP [Monte Carlo N Particle] codes. Given the differences between Monte Carlo codes, this work aims to evaluate the accuracy of MCNP4C-simulated electron dose distributions in a homogeneous phantom and around inhomogeneities. Phantoms of varying complexity were used; namely, a homogeneous water phantom and phantoms made of polymethyl methacrylate slabs containing different-sized, low- and high-density inserts of heterogeneous materials. Electron beams with 8 and 15 MeV nominal energy generated by an Elekta Synergy linear accelerator were investigated. Measurements were performed for a 10 cm × 10 cm applicator at a source-to-surface distance of 100 cm. Individual parts of the beam-defining system were introduced into the simulation one at a time in order to show their effect on depth doses. In contrast to the first scattering foil, the secondary scattering foil, X and Y jaws and applicator provide up to 5% of the dose. A 2%/2 mm agreement between MCNP and measurements was found in the homogeneous phantom; in the presence of heterogeneities, agreement was in the range of 1-3%, generally within 2% of the measurements for both energies in a "complex" phantom. A full-component simulation is necessary in order to obtain a realistic model of the beam. The MCNP4C results agree well with the measured electron dose distributions. PMID:23533162

  8. Analysis of localised dose distribution in human body by Monte Carlo code system for photon irradiation.

    PubMed

    Ohnishi, S; Odano, N; Nariyama, N; Saito, K

    2004-01-01

    In routine personal dosimetry, whole-body irradiation is assumed. However, partial irradiation is increasingly common, and the protection quantities behave differently under such irradiation conditions. A code system has been developed, and the effective dose and organ absorbed doses have been calculated for a narrow horizontal photon beam incident from various directions at three representative body sections, 40, 50 and 60 cm from the top of the head. This work covers 24 beam directions at 15° intervals ranging from 0° to 345°, three energy levels, 45 keV, 90 keV and 1.25 MeV, and three beam diameters of 1, 2 and 4 cm. The results show that, in the case of partial irradiation, a beam incident from a diagonally frontal or other specific direction causes the peak dose.

  9. Bremsstrahlung Dose Yield for High-Intensity Short-Pulse Laser–Solid Experiments

    DOE PAGES

    Liang, Taiee; Bauer, Johannes M.; Liu, James C.; ...

    2016-12-01

    A bremsstrahlung source term has been developed by the Radiation Protection (RP) group at SLAC National Accelerator Laboratory for high-intensity short-pulse laser–solid experiments between 10^17 and 10^22 W cm^-2. This source term couples the particle-in-cell plasma code EPOCH and the radiation transport code FLUKA to estimate the bremsstrahlung dose yield from laser–solid interactions. EPOCH characterizes the energy distribution, angular distribution, and laser-to-electron conversion efficiency of the hot electrons from laser–solid interactions, and FLUKA utilizes this hot electron source term to calculate a bremsstrahlung dose yield (mSv per J of laser energy on target). The goal of this paper is to provide RP guidelines and hazard analysis for high-intensity laser facilities. A comparison of the calculated bremsstrahlung dose yields with radiation measurement data is also made.

  11. ARCHERRT – A GPU-based and photon-electron coupled Monte Carlo dose computing engine for radiation therapy: Software development and application to helical tomotherapy

    PubMed Central

    Su, Lin; Yang, Youming; Bednarz, Bryan; Sterpin, Edmond; Du, Xining; Liu, Tianyu; Ji, Wei; Xu, X. George

    2014-01-01

    Purpose: Using graphical processing unit (GPU) hardware technology, an extremely fast Monte Carlo (MC) code, ARCHERRT, is developed for radiation dose calculations in radiation therapy. This paper describes the detailed software development and testing for three clinical TomoTherapy® cases: the prostate, lung, and head & neck. Methods: To obtain clinically relevant dose distributions, phase space files (PSFs) created from optimized radiation therapy treatment plan fluence maps were used as the input to ARCHERRT. Patient-specific phantoms were constructed from patient CT images. Batch simulations were employed to facilitate the time-consuming task of loading large PSFs, and to improve the estimation of statistical uncertainty. Furthermore, two different Woodcock tracking algorithms were implemented and their relative performance was compared. The dose curves of an Elekta accelerator PSF incident on a homogeneous water phantom were benchmarked against DOSXYZnrc. For each of the treatment cases, dose volume histograms and isodose maps were produced from ARCHERRT and the general-purpose code GEANT4. A gamma index analysis was performed to evaluate the similarity of voxel doses obtained from the two codes. The hardware accelerators used in this study are one NVIDIA K20 GPU, one NVIDIA K40 GPU, and six NVIDIA M2090 GPUs. In addition, to make a fairer comparison of CPU and GPU performance, a multithreaded CPU code was developed using OpenMP and tested on an Intel E5-2620 CPU. Results: For the water phantom, the depth dose curve and dose profiles from ARCHERRT agree well with DOSXYZnrc. For the clinical cases, results from ARCHERRT are compared with those from GEANT4 and good agreement is observed. The gamma index test was performed for voxels whose dose is greater than 10% of the maximum dose. For the 2%/2 mm criterion, the passing rates for the prostate, lung, and head & neck cases are 99.7%, 98.5%, and 97.2%, respectively. Due to the specific architecture of the GPU, the modified Woodcock tracking algorithm performed worse than the original one. ARCHERRT achieves fast speeds for PSF-based dose calculations. With a single M2090 card, the simulations cost about 60, 50, and 80 s for the three cases, respectively, with 1% statistical error in the PTV. Using the latest K40 card, the simulations are 1.7–1.8 times faster. More impressively, six M2090 cards could finish the simulations in 8.9–13.4 s. For comparison, the same simulations on an Intel E5-2620 (12 hyperthreads) cost about 500–800 s. Conclusions: ARCHERRT was developed successfully to perform fast and accurate MC dose calculation for radiotherapy using PSFs and patient CT phantoms. PMID:24989378
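
    The gamma test used above combines a dose-difference and a distance-to-agreement criterion. A minimal 1-D sketch (the authors' analysis is 3-D; the function names and test profiles are illustrative):

```python
import numpy as np

def gamma_index(x_mm, dose_ref, dose_eval, dd=0.02, dta_mm=2.0):
    """1-D gamma index of an evaluated dose profile against a reference,
    with dose-difference criterion dd (fraction of the maximum reference
    dose) and distance-to-agreement criterion dta_mm (mm)."""
    d_norm = dd * dose_ref.max()
    gammas = np.empty_like(dose_ref)
    for i, (xr, dr) in enumerate(zip(x_mm, dose_ref)):
        cap = ((x_mm - xr) / dta_mm) ** 2 + ((dose_eval - dr) / d_norm) ** 2
        gammas[i] = np.sqrt(cap.min())     # best agreement over all eval points
    return gammas

def passing_rate(gammas, dose_ref, cutoff=0.1):
    """Fraction of points with gamma <= 1, restricted to points above
    the dose cutoff (10% of the maximum reference dose, as above)."""
    mask = dose_ref > cutoff * dose_ref.max()
    return float((gammas[mask] <= 1.0).mean())
```

    A point passes if some nearby evaluated point agrees within 2% in dose or 2 mm in position; identical profiles give gamma of zero everywhere, and a sub-millimetre shift still passes.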

  13. From Earth to Mars, Radiation Intensities in Interplanetary Space

    NASA Astrophysics Data System (ADS)

    O'Brien, Keran

    2007-10-01

    The radiation field in interplanetary space between Earth and Mars is rather intense. Using a modified version of the ATROPOS Monte Carlo code combined with a modified version of the deterministic code PLOTINUS, the effective dose rate to crew members in a spacecraft hull shielded with a shell of 2 g/cm^2 of aluminum and 20 g/cm^2 of polyethylene was calculated to be 51 rem/y. The total dose during the solar-particle event of September 29, 1989, GLE 42, was calculated to be 50 rem. The dose in a "storm cellar" of 100 g/cm^2 of polyethylene equivalent during this event was calculated to be 5 rem. The calculations were for conditions corresponding to a recent solar minimum.

  14. Hanford Environmental Dose Reconstruction Project Monthly Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finch, S.M.

    1991-02-01

    The objective of the Hanford Environmental Dose Reconstruction Project is to estimate the radiation doses that populations could have received from nuclear operations at Hanford since 1944. The project is being managed and conducted by the Pacific Northwest Laboratory (PNL) under the direction of an independent Technical Steering Panel (TSP). The TSP consists of experts in environmental pathways, epidemiology, surface-water transport, ground-water transport, statistics, demography, agriculture, meteorology, nuclear engineering, radiation dosimetry, and cultural anthropology. Included are appointed technical members representing the states of Oregon and Washington, cultural and technical experts nominated by the regional Native American tribes, and an individual representing the public. The project is divided into the following technical tasks. These tasks correspond to the path radionuclides followed, from release to impact on humans (dose estimates): source terms; environmental transport; environmental monitoring data; demographics, agriculture, food habits; and environmental pathways and dose estimates. Project reports and references used in the reports are made available to the public in a public reading room. Project progress is documented in this monthly report, which is available to the public. 3 figs., 3 tabs.

  15. The internal dosimetry code PLEIADES.

    PubMed

    Fell, T P; Phipps, A W; Smith, T J

    2007-01-01

    The International Commission on Radiological Protection (ICRP) has published dose coefficients for the ingestion or inhalation of radionuclides in a series of reports covering intakes by workers and members of the public, including children and pregnant or lactating women. The calculation of these coefficients divides naturally into two distinct parts: the biokinetic and the dosimetric. This paper describes in detail the methods used to solve the biokinetic problem in the generation of dose coefficients on behalf of the ICRP, as implemented in the Health Protection Agency's internal dosimetry code PLEIADES. A summary of the dosimetric treatment is included.
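
    The biokinetic problem reduces to a linear system of first-order transfer equations between compartments. A toy sketch follows; the two-compartment model and every rate constant in it are hypothetical, not ICRP systemic-model values.

```python
import numpy as np

def integrate_compartments(M, A0, t_end, dt=1e-3):
    """Integrate dA/dt = M @ A with small explicit Euler steps, where
    M encodes first-order transfer rates plus radioactive decay and
    A holds the activity in each compartment."""
    A = np.array(A0, dtype=float)
    for _ in range(int(round(t_end / dt))):
        A = A + dt * (M @ A)
    return A

# Hypothetical model: blood -> organ, with organ clearance and decay.
lam = 0.01               # radioactive decay constant (1/d), illustrative
k_blood_organ = 0.5      # transfer rate, blood to organ (1/d)
k_organ_out = 0.1        # organ clearance rate (1/d)
M = np.array([[-(k_blood_organ + lam), 0.0],
              [k_blood_organ, -(k_organ_out + lam)]])
```

    Production codes such as PLEIADES solve far larger ICRP compartment systems, and with stiff-ODE or matrix-exponential methods rather than Euler stepping; the retained activities are then folded with dosimetric factors in the second part of the calculation.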

  16. SU-F-T-12: Monte Carlo Dosimetry of the 60Co Bebig High Dose Rate Source for Brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campos, L T; Almeida, C E V de

    Purpose: The purpose of this work is to obtain the dosimetry parameters in accordance with the AAPM TG-43U1 formalism with Monte Carlo calculations for the BEBIG 60Co high-dose-rate brachytherapy source. The geometric design and material details of the source were provided by the manufacturer and were used to define the Monte Carlo geometry. Methods: The dosimetry studies included the calculation of the air kerma strength Sk, the collision kerma in water along the transverse axis with an unbounded phantom, the dose rate constant and the radial dose function. The Monte Carlo code system that was used was EGSnrc with a new cavity code, which is a part of EGS++ that allows calculating the radial dose function around the source. The XCOM photon cross-section library was used. Variance reduction techniques were used to speed up the calculation and to considerably reduce the computer time. To obtain the dose rate distributions of the source in an unbounded liquid water phantom, the source was immersed at the center of a cube phantom of 100 cm3. Results: The obtained dose rate constant for the BEBIG 60Co source was 1.108±0.001 cGy h^-1 U^-1, which is consistent with the values in the literature. The radial dose functions were compared with the values of the consensus data set in the literature, and they are consistent with the published data for this energy range. Conclusion: The dose rate constant is consistent with the results of Granero et al. and Selvam and Bhola within 1%. Dose rate data are compared to the GEANT4 and DOSRZnrc Monte Carlo codes. However, the radial dose function differs by up to 10% for points notably near the source on the transverse axis because the high-energy photons from 60Co cause an electronic disequilibrium at the interface between the source capsule and the liquid water for distances up to 1 cm.

  17. 12 CFR 1807.503 - Project completion.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... applicable: One of three model codes (Uniform Building Code (ICBO), National Building Code (BOCA), Standard (Southern) Building Code (SBCCI)); or the Council of American Building Officials (CABO) one or two family... must meet the current edition of the Model Energy Code published by the Council of American Building...

  18. 12 CFR 1807.503 - Project completion.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... applicable: One of three model codes (Uniform Building Code (ICBO), National Building Code (BOCA), Standard (Southern) Building Code (SBCCI)); or the Council of American Building Officials (CABO) one or two family... must meet the current edition of the Model Energy Code published by the Council of American Building...

  19. 12 CFR 1807.503 - Project completion.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... applicable: One of three model codes (Uniform Building Code (ICBO), National Building Code (BOCA), Standard (Southern) Building Code (SBCCI)); or the Council of American Building Officials (CABO) one or two family... must meet the current edition of the Model Energy Code published by the Council of American Building...

  20. 12 CFR 1807.503 - Project completion.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... applicable: One of three model codes (Uniform Building Code (ICBO), National Building Code (BOCA), Standard (Southern) Building Code (SBCCI)); or the Council of American Building Officials (CABO) one or two family... must meet the current edition of the Model Energy Code published by the Council of American Building...

  1. Low dose CT reconstruction via L1 norm dictionary learning using alternating minimization algorithm and balancing principle.

    PubMed

    Wu, Junfeng; Dai, Fang; Hu, Gang; Mou, Xuanqin

    2018-04-18

    Excessive radiation exposure in computed tomography (CT) scans increases the chance of developing cancer and has become a major clinical concern. Recently, statistical iterative reconstruction (SIR) with an l0-norm dictionary learning regularization has been developed to reconstruct CT images from low-dose and few-view datasets in order to reduce the radiation dose. Nonetheless, the sparse regularization term adopted in this approach is the l0-norm, which cannot guarantee the global convergence of the algorithm. To address this problem, in this study we introduced the l1-norm dictionary learning penalty into the SIR framework for low-dose CT image reconstruction, and developed an alternating minimization algorithm to minimize the associated objective function, which transforms the CT image reconstruction problem into a sparse coding subproblem and an image updating subproblem. During the image updating process, an efficient model function approach based on the balancing principle is applied to choose the regularization parameters. The proposed alternating minimization algorithm was evaluated first using real projection data of a sheep lung CT perfusion study and then using numerical simulations based on a sheep lung CT image and a chest image. Both visual assessment and quantitative comparison in terms of root mean square error (RMSE) and the structural similarity (SSIM) index demonstrated that the new image reconstruction algorithm yielded performance similar to the l0-norm dictionary learning penalty and outperformed the conventional filtered backprojection (FBP) and total variation (TV) minimization algorithms.
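
    The sparse coding subproblem is where the l1 norm pays off: unlike the l0 case it is convex, and for an orthonormal dictionary it even has a closed-form solution by soft thresholding. A sketch of that single step (not the authors' full SIR algorithm; the identity dictionary in the test is a placeholder):

```python
import numpy as np

def soft_threshold(x, lam):
    """Closed-form solution of argmin_a 0.5*(a - x)^2 + lam*|a|,
    applied elementwise."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def sparse_code(patch, dictionary, lam):
    """argmin_a 0.5*||patch - D a||_2^2 + lam*||a||_1, exact when the
    columns of D are orthonormal."""
    return soft_threshold(dictionary.T @ patch, lam)
```

    For a learned, overcomplete dictionary the subproblem has no closed form, and iterative schemes built from this same thresholding operator (e.g. proximal gradient steps) are used instead.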

  2. The weight hierarchies and chain condition of a class of codes from varieties over finite fields

    NASA Technical Reports Server (NTRS)

    Wu, Xinen; Feng, Gui-Liang; Rao, T. R. N.

    1996-01-01

    The generalized Hamming weights of linear codes were first introduced by Wei. These are fundamental parameters related to the minimal overlap structures of the subcodes and very useful in several fields. It was found that the chain condition of a linear code is convenient in studying the generalized Hamming weights of the product codes. In this paper we consider a class of codes defined over some varieties in projective spaces over finite fields, whose generalized Hamming weights can be determined by studying the orbits of subspaces of the projective spaces under the actions of classical groups over finite fields, i.e., the symplectic groups, the unitary groups and orthogonal groups. We give the weight hierarchies and generalized weight spectra of the codes from Hermitian varieties and prove that the codes satisfy the chain condition.
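For intuition, the r-th generalized Hamming weight d_r is the minimum support size over all r-dimensional subcodes, and for a small binary code the whole weight hierarchy can be found by brute force. A sketch (unrelated to the varieties studied in the paper), using the [7,4] Hamming code, whose hierarchy is 3, 5, 6, 7:

```python
from itertools import combinations, product

def ghw(G):
    """Weight hierarchy of a binary linear code with generator matrix G,
    by brute force over r-subsets of codewords spanning r-dimensional subcodes."""
    k, n = len(G), len(G[0])
    # enumerate all nonzero codewords
    words = set()
    for coeffs in product([0, 1], repeat=k):
        w = tuple(sum(c * g[j] for c, g in zip(coeffs, G)) % 2 for j in range(n))
        if any(w):
            words.add(w)
    words = sorted(words)

    def rank(vs):
        # Gaussian elimination over GF(2)
        rows, r = [list(v) for v in vs], 0
        for col in range(n):
            piv = next((i for i in range(r, len(rows)) if rows[i][col]), None)
            if piv is None:
                continue
            rows[r], rows[piv] = rows[piv], rows[r]
            for i in range(len(rows)):
                if i != r and rows[i][col]:
                    rows[i] = [(a + b) % 2 for a, b in zip(rows[i], rows[r])]
            r += 1
        return r

    hierarchy = []
    for r in range(1, k + 1):
        best = n
        for sub in combinations(words, r):
            if rank(sub) == r:
                # support of a subcode = union of supports of any spanning set
                supp = sum(1 for j in range(n) if any(w[j] for w in sub))
                best = min(best, supp)
        hierarchy.append(best)
    return hierarchy
```

d_1 is the ordinary minimum distance, and the hierarchy is strictly increasing up to d_k = n when the code has no identically zero coordinate.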

  3. Leadership Class Configuration Interaction Code - Status and Opportunities

    NASA Astrophysics Data System (ADS)

    Vary, James

    2011-10-01

With support from SciDAC-UNEDF (www.unedf.org) nuclear theorists have developed and are continuously improving a Leadership Class Configuration Interaction Code (LCCI) for forefront nuclear structure calculations. The aim of this project is to make state-of-the-art nuclear structure tools available to the entire community of researchers including graduate students. The project includes codes such as NuShellX, MFDn and BIGSTICK that run on a range of computers from laptops to leadership class supercomputers. Codes, scripts, test cases and documentation have been assembled, are under continuous development and are scheduled for release to the entire research community in November 2011. A covering script that accesses the appropriate code and supporting files is under development. In addition, a Data Base Management System (DBMS) that records key information from large production runs and archives results of those runs has been developed (http://nuclear.physics.iastate.edu/info/) and will be released. Following an outline of the project, the code structure, capabilities, the DBMS and current efforts, I will suggest a path forward that would benefit greatly from a significant partnership between researchers who use the codes, code developers and the National Nuclear Data efforts. This research is supported in part by DOE under grant DE-FG02-87ER40371 and grant DE-FC02-09ER41582 (SciDAC-UNEDF).

  4. Evaluation of low-dose limits in 3D-2D rigid registration for surgical guidance

    NASA Astrophysics Data System (ADS)

    Uneri, A.; Wang, A. S.; Otake, Y.; Kleinszig, G.; Vogt, S.; Khanna, A. J.; Gallia, G. L.; Gokaslan, Z. L.; Siewerdsen, J. H.

    2014-09-01

    An algorithm for intensity-based 3D-2D registration of CT and C-arm fluoroscopy is evaluated for use in surgical guidance, specifically considering the low-dose limits of the fluoroscopic x-ray projections. The registration method is based on a framework using the covariance matrix adaptation evolution strategy (CMA-ES) to identify the 3D patient pose that maximizes the gradient information similarity metric. Registration performance was evaluated in an anthropomorphic head phantom emulating intracranial neurosurgery, using target registration error (TRE) to characterize accuracy and robustness in terms of 95% confidence upper bound in comparison to that of an infrared surgical tracking system. Three clinical scenarios were considered: (1) single-view image + guidance, wherein a single x-ray projection is used for visualization and 3D-2D guidance; (2) dual-view image + guidance, wherein one projection is acquired for visualization, combined with a second (lower-dose) projection acquired at a different C-arm angle for 3D-2D guidance; and (3) dual-view guidance, wherein both projections are acquired at low dose for the purpose of 3D-2D guidance alone (not visualization). In each case, registration accuracy was evaluated as a function of the entrance surface dose associated with the projection view(s). Results indicate that images acquired at a dose as low as 4 μGy (approximately one-tenth the dose of a typical fluoroscopic frame) were sufficient to provide TRE comparable or superior to that of conventional surgical tracking, allowing 3D-2D guidance at a level of dose that is at most 10% greater than conventional fluoroscopy (scenario #2) and potentially reducing the dose to approximately 20% of the level in a conventional fluoroscopically guided procedure (scenario #3).
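As a rough illustration of the similarity term being maximized, one can correlate image gradients between the measured projection and a simulated one; gradient-based costs are comparatively robust to the intensity offsets and noise of low-dose frames. This is a simplified stand-in for the gradient information metric, not the paper's exact formulation, and CMA-ES would search the six pose parameters for its maximum.

```python
import numpy as np

def gradient_similarity(fixed, moving):
    """Normalized gradient correlation between a fluoroscopic projection and a
    digitally reconstructed radiograph rendered at a candidate 3D pose."""
    gx_f, gy_f = np.gradient(fixed.astype(float))
    gx_m, gy_m = np.gradient(moving.astype(float))
    num = (gx_f * gx_m + gy_f * gy_m).sum()
    den = np.sqrt((gx_f**2 + gy_f**2).sum() * (gx_m**2 + gy_m**2).sum())
    return num / den if den > 0 else 0.0
```

The score is 1 for identical images and falls toward 0 as the candidate pose drifts, which is what makes it usable as an objective for a black-box optimizer such as CMA-ES.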

  5. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR CODING AND CODING VERIFICATION (HAND ENTRY) (UA-D-14.0)

    EPA Science Inventory

    The purpose of this SOP is to define the coding strategy for coding and coding verification of hand-entered data. It applies to the coding of all physical forms, especially those coded by hand. The strategy was developed for use in the Arizona NHEXAS project and the "Border" st...

  6. Intercomparison of Monte Carlo radiation transport codes to model TEPC response in low-energy neutron and gamma-ray fields.

    PubMed

    Ali, F; Waker, A J; Waller, E J

    2014-10-01

Tissue-equivalent proportional counters (TEPC) can potentially be used as a portable and personal dosemeter in mixed neutron and gamma-ray fields, but what hinders this use is their typically large physical size. To formulate compact TEPC designs, the use of a Monte Carlo transport code is necessary to predict the performance of compact designs in these fields. To perform this modelling, three candidate codes were assessed: MCNPX 2.7.E, FLUKA 2011.2 and PHITS 2.24. In each code, benchmark simulations were performed involving the irradiation of a 5-in. TEPC with monoenergetic neutron fields and a 4-in. wall-less TEPC with monoenergetic gamma-ray fields. The frequency and dose mean lineal energies and dose distributions calculated from each code were compared with experimentally determined data. For the neutron benchmark simulations, PHITS produces data closest to the experimental values and for the gamma-ray benchmark simulations, FLUKA yields data closest to the experimentally determined quantities.
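The quantities compared in the study, the frequency-mean and dose-mean lineal energies, follow directly from a measured or simulated distribution f(y); a minimal sketch:

```python
import numpy as np

def lineal_energy_means(y, f):
    """Frequency-mean y_F = int y f(y) dy / int f(y) dy and
    dose-mean     y_D = int y^2 f(y) dy / int y f(y) dy,
    by trapezoidal integration over the sampled grid; f need not be normalized."""
    def trapz(g):
        return float(np.sum((g[1:] + g[:-1]) * np.diff(y)) / 2.0)
    y_f = trapz(y * f) / trapz(f)
    y_d = trapz(y**2 * f) / trapz(y * f)
    return y_f, y_d
```

For example, a uniform f(y) on [0, 2] keV/μm gives y_F = 1 and y_D = 4/3, and y_D ≥ y_F always holds since the dose distribution weights the higher lineal energies.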

  7. Comparison of Transport Codes, HZETRN, HETC and FLUKA, Using 1977 GCR Solar Minimum Spectra

    NASA Technical Reports Server (NTRS)

Heinbockel, John H.; Slaba, Tony C.; Tripathi, Ram K.; Blattnig, Steve R.; Norbury, John W.; Badavi, Francis F.; Townsend, Lawrence W.; Handler, Thomas; Gabriel, Tony A.; Pinsky, Lawrence S.; et al.

    2009-01-01

The HZETRN deterministic radiation transport code is one of several tools developed to analyze the effects of harmful galactic cosmic rays (GCR) and solar particle events (SPE) on mission planning, astronaut shielding and instrumentation. This paper is a comparison study involving the two Monte Carlo transport codes, HETC-HEDS and FLUKA, and the deterministic transport code, HZETRN. Each code is used to transport ions from the 1977 solar minimum GCR spectrum impinging upon a 20 g/cm² aluminum slab followed by a 30 g/cm² water slab. This research is part of a systematic effort of verification and validation to quantify the accuracy of HZETRN and determine areas where it can be improved. Comparisons of dose and dose equivalent values at various depths in the water slab are presented in this report. This is followed by a comparison of the proton fluxes, and the forward, backward and total neutron fluxes at various depths in the water slab. Comparisons of the secondary light ion ²H, ³H, ³He and ⁴He fluxes are also examined.

  8. GRAYSKY-A new gamma-ray skyshine code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Witts, D.J.; Twardowski, T.; Watmough, M.H.

    1993-01-01

This paper describes a new prototype gamma-ray skyshine code GRAYSKY (Gamma-RAY SKYshine) that has been developed at BNFL, as part of an industrially based master of science course, to overcome the problems encountered with SKYSHINEII and RANKERN. GRAYSKY is a point kernel code based on the use of a skyshine response function. The scattering within source or shield materials is accounted for by the use of buildup factors. This is an approximate method of solution but one that has been shown to produce results that are acceptable for dose rate predictions on operating plants. The novel features of GRAYSKY are as follows: 1. The code is fully integrated with a semianalytical point kernel shielding code, currently under development at BNFL, which offers powerful solid-body modeling capabilities. 2. The geometry modeling also allows the skyshine response function to be used in a manner that accounts for the shielding of air-scattered radiation. 3. Skyshine buildup factors calculated using the skyshine response function have been used as well as dose buildup factors.
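The point kernel with buildup that codes of this type rest on is compact enough to state directly. A sketch, where the linear buildup model is a crude illustrative placeholder for the tabulated or fitted buildup factors a real code would interpolate:

```python
import math

def point_kernel_dose_rate(S, mu, r, buildup):
    """Point kernel estimate: uncollided flux attenuated through the shield,
    scaled by a buildup factor to account for in-shield scatter.
    phi = S * B(mu*r) * exp(-mu*r) / (4*pi*r^2)
    S: source emission rate (photons/s), mu: linear attenuation coefficient (1/cm),
    r: source-to-detector distance through the shield (cm), buildup: B(mu*r)."""
    return S * buildup * math.exp(-mu * r) / (4.0 * math.pi * r**2)

def linear_buildup(mu_r):
    # crude illustrative model; real codes interpolate tabulated B(mu*r)
    return 1.0 + mu_r
```

With mu = 0 and buildup = 1 the expression reduces to the bare inverse-square flux S / (4πr²), which is a convenient sanity check.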

  9. Space Radiation Organ Doses for Astronauts on Past and Future Missions

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.

    2007-01-01

We review methods and data used for determining astronaut organ dose equivalents on past space missions including Apollo, Skylab, Space Shuttle, NASA-Mir, and International Space Station (ISS). Expectations for future lunar missions are also described. Physical measurements of space radiation include the absorbed dose, dose equivalent, and linear energy transfer (LET) spectra, or a related quantity, the lineal energy (y) spectrum that is measured by a tissue equivalent proportional counter (TEPC). These data are used in conjunction with space radiation transport models to project organ-specific doses used in cancer and other risk projection models. Biodosimetry data from Mir, STS, and ISS missions provide an alternative estimate of organ dose equivalents based on chromosome aberrations. The physical environments inside spacecraft are currently well understood, with errors in organ dose projections estimated as less than plus or minus 15%; however, understanding the biological risks from space radiation remains a difficult problem because of the many radiation types, including protons, heavy ions, and secondary neutrons, for which there are no human data to estimate risks. The accuracy of the projections of organ dose equivalents described here must be supplemented with research on the health risks of space exposure to properly assess crew safety for exploration missions.

  10. Analysis and evaluation for consumer goods containing NORM in Korea.

    PubMed

    Jang, Mee; Chung, Kun Ho; Lim, Jong Myoung; Ji, Young Yong; Kim, Chang Jong; Kang, Mun Ja

    2017-08-01

We analyzed consumer goods containing NORM by ICP-MS and evaluated the external dose. To evaluate the external dose, we assumed a small-room model as the irradiation scenario and calculated the specific effective dose rate using the MCNPX code. The external doses for twenty goods are less than 1 mSv, considering the specific effective dose rates and usage quantities. However, some of the goods yield relatively high doses, and activity concentration limits are necessary as a screening tool.
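The screening arithmetic behind such an assessment is simple once a room-model Monte Carlo run has supplied a specific effective dose rate coefficient. A sketch with purely illustrative numbers; the coefficient name and values here are assumptions, not the paper's data:

```python
def annual_external_dose_mSv(conc_Bq_per_g, coeff_mSv_per_h_per_Bq_g, hours):
    """Screening-level external dose estimate:
    effective dose = activity concentration
                     x specific effective dose rate coefficient (from the room model)
                     x annual exposure time."""
    return conc_Bq_per_g * coeff_mSv_per_h_per_Bq_g * hours
```

For example, 0.1 Bq/g with a hypothetical coefficient of 1e-5 mSv/h per Bq/g and 2000 h of use per year gives 0.002 mSv, well below a 1 mSv screening level.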

  11. Radiative transfer codes for atmospheric correction and aerosol retrieval: intercomparison study.

    PubMed

    Kotchenova, Svetlana Y; Vermote, Eric F; Levy, Robert; Lyapustin, Alexei

    2008-05-01

    Results are summarized for a scientific project devoted to the comparison of four atmospheric radiative transfer codes incorporated into different satellite data processing algorithms, namely, 6SV1.1 (second simulation of a satellite signal in the solar spectrum, vector, version 1.1), RT3 (radiative transfer), MODTRAN (moderate resolution atmospheric transmittance and radiance code), and SHARM (spherical harmonics). The performance of the codes is tested against well-known benchmarks, such as Coulson's tabulated values and a Monte Carlo code. The influence of revealed differences on aerosol optical thickness and surface reflectance retrieval is estimated theoretically by using a simple mathematical approach. All information about the project can be found at http://rtcodes.ltdri.org.

  12. Radiative transfer codes for atmospheric correction and aerosol retrieval: intercomparison study

    NASA Astrophysics Data System (ADS)

    Kotchenova, Svetlana Y.; Vermote, Eric F.; Levy, Robert; Lyapustin, Alexei

    2008-05-01

    Results are summarized for a scientific project devoted to the comparison of four atmospheric radiative transfer codes incorporated into different satellite data processing algorithms, namely, 6SV1.1 (second simulation of a satellite signal in the solar spectrum, vector, version 1.1), RT3 (radiative transfer), MODTRAN (moderate resolution atmospheric transmittance and radiance code), and SHARM (spherical harmonics). The performance of the codes is tested against well-known benchmarks, such as Coulson's tabulated values and a Monte Carlo code. The influence of revealed differences on aerosol optical thickness and surface reflectance retrieval is estimated theoretically by using a simple mathematical approach. All information about the project can be found at http://rtcodes.ltdri.org.

  13. ROS Hexapod

    NASA Technical Reports Server (NTRS)

    Davis, Kirsch; Bankieris, Derek

    2016-01-01

As an intern project for NASA Johnson Space Center (JSC), my job was to familiarize myself with and operate the Robot Operating System (ROS). The project converted existing software assets into ROS nodes, enabling a robotic Hexapod to communicate and be controlled by an existing PlayStation 3 (PS3) controller. When the internship started, the existing control algorithms and libraries in the Hexapod C++ source code had no ROS capabilities; that changed over the course of the internship. Converting the C++ code made it compatible with ROS, and the Hexapod is now controlled by an existing PS3 controller. My job was also to design ROS messages and script programs that enabled assets to participate in the ROS ecosystem by subscribing to and publishing messages. The source code is written in C++ and organized in directories. Testing of the software assets included compiling the code within a Linux environment and running it from a directory in a terminal. Several compilation problems occurred, so the code was modified until it compiled cleanly. Once the code compiled and ran, it was uploaded to the Hexapod, which was then controlled by a PS3 controller. The project outcome is a Hexapod that is fully functional, compatible with ROS, and operated with the PlayStation 3 controller. In addition, an open-source Arduino board and IDE will be integrated into the ecosystem, with circuitry designed on a breadboard to add behaviors through push buttons, potentiometers, and other simple electrical elements. Other Arduino projects will include a GPS module and a digital clock that uses signals from 22 satellites, received through an internal patch antenna, to show accurate real time.
In addition, this internship experience has led me to pursue coding more efficiently and effectively, so that I can write, subscribe to, and publish my own source code in different programming languages. Familiarity with software programming will enhance my skills in the electrical engineering field. My experience at JSC with the Simulation and Graphics Branch (ER7) has pushed me to become more proficient at coding, to increase my knowledge of software programming, and to enhance my skills in ROS. I will take this knowledge back to my university and apply it in a school project on the PR2 robot, which is controlled by ROS software. The skills learned here will be used to subscribe to and publish ROS messages to the PR2 robot, which will be controlled by an existing PS3 controller by adapting the C++ code to subscribe to and publish ROS messages. Overall, the skills obtained here will not be lost, but increased.

  14. Assessing patient dose in interventional fluoroscopy using patient-dependent hybrid phantoms

    NASA Astrophysics Data System (ADS)

    Johnson, Perry Barnett

Interventional fluoroscopy uses ionizing radiation to guide small instruments through blood vessels or other body pathways to sites of clinical interest. The technique represents a tremendous advantage over invasive surgical procedures, as it requires only a small incision, thus reducing the risk of infection and providing for shorter recovery times. The growing use and increasing complexity of interventional procedures, however, have resulted in public health concerns regarding radiation exposures, particularly with respect to localized skin dose. Tracking and documenting patient-specific skin and internal organ dose has been specifically identified for interventional fluoroscopy, where extended irradiation times, multiple projections, and repeat procedures can lead to some of the largest doses encountered in radiology. Furthermore, in-procedure knowledge of localized skin doses can be of significant clinical importance to managing patient risk and in training radiology residents. In this dissertation, a framework is presented for monitoring the radiation dose delivered to patients undergoing interventional procedures. The framework is built around two key points: developing better anthropomorphic models, and designing clinically relevant software systems for dose estimation. To begin, a library of 50 hybrid patient-dependent computational phantoms was developed based on the UF hybrid male and female reference phantoms. These phantoms represent a different type of anthropomorphic model whereby anthropometric parameters from an individual patient are used during phantom selection. The patient-dependent library was first validated and then used in two patient-phantom matching studies focused on cumulative organ and local skin dose. In terms of organ dose, patient-phantom matching was shown most beneficial for estimating the dose to large patients, where error associated with soft tissue attenuation differences could be minimized.
For small patients, inherent differences in organ size and location limited the effectiveness of matching. For skin dose, patient-phantom matching was found most beneficial for estimating the dose during lateral and anterior-posterior projections. Sculpting of the patient's outer body contour was also investigated for use during skin dose estimation and highlighted as a substantial step towards better patient-specificity. In order to utilize the models for actual patient dosimetry, two programs were developed based on the newly released Radiation Dose Structured Report (RDSR). The first program allows for the visualization of skin dose by translating the reference point air kerma to the location of the patient's skin as characterized by a computational model. The program represents an innovative tool that can be used by the interventional physician to modify behavior when clinically appropriate. The second program operates by automatically generating an input file from the RDSR, which can then be run within a Monte Carlo based radiation transport code. The program has great potential for initiating and promoting the concept of 'cloud dosimetry', where patient-specific radiation transport is performed off-site and returned via the internet. Both programs are non-proprietary and transferable, and also incorporate the most advanced computational phantoms developed to date. Using the tools developed in this work, there exists a tangible opportunity to improve patient care, with the end goal being a better understanding of the risk/benefit relationship that accompanies the medical use of ionizing radiation.
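The core operation of an RDSR-driven skin dose tool of this kind, translating reference-point air kerma to a dose at the patient's skin, reduces to an inverse-square correction plus conversion factors. A sketch, where the backscatter and air-to-tissue factor values are illustrative assumptions rather than the dissertation's:

```python
def skin_dose_estimate(ref_air_kerma_mGy, d_ref_cm, d_skin_cm,
                       backscatter=1.3, f_air_to_tissue=1.06):
    """Entrance skin dose estimate from reference-point air kerma:
    inverse-square correction from the interventional reference point
    to the actual skin location (e.g. from a body-contour model),
    then backscatter and air-to-tissue conversion."""
    inverse_square = (d_ref_cm / d_skin_cm) ** 2
    return ref_air_kerma_mGy * inverse_square * backscatter * f_air_to_tissue
```

Accumulating this estimate per irradiation event over the RDSR event list, with each event mapped onto the skin of a matched phantom, yields the peak-skin-dose map described above.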

  15. Update on the Code Intercomparison and Benchmark for Muon Fluence and Absorbed Dose Induced by an 18 GeV Electron Beam After Massive Iron Shielding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fasso, A.; Ferrari, A.; Ferrari, A.

In 1974, Nelson, Kase and Svensson published an experimental investigation on muon shielding around SLAC high-energy electron accelerators [1]. They measured muon fluence and absorbed dose induced by 14 and 18 GeV electron beams hitting a copper/water beamdump and attenuated in a thick steel shielding. In their paper, they compared the results with the theoretical models available at that time. In order to compare their experimental results with present model calculations, we use the modern transport Monte Carlo codes MARS15, FLUKA2011 and GEANT4 to model the experimental setup and run simulations. The results are then compared between the codes, and with the SLAC data.

  16. MCNP-based computational model for the Leksell gamma knife.

    PubMed

    Trnka, Jiri; Novotny, Josef; Kluson, Jaroslav

    2007-01-01

We have focused on the use of the MCNP code for calculation of Gamma Knife radiation field parameters with a homogeneous polystyrene phantom. We have investigated several parameters of the Leksell Gamma Knife radiation field and compared the results with other studies based on the EGS4 and PENELOPE codes, as well as the Leksell Gamma Knife treatment planning system Leksell GammaPlan (LGP). The current model describes all 201 radiation beams together and simulates all the sources at the same time. Within each beam, it considers the technical construction of the source, the source holder, the collimator system, the spherical phantom, and surrounding material. We have calculated output factors for various sizes of scoring volumes, relative dose distributions along basic planes including linear dose profiles, integral doses in various volumes, and differential dose volume histograms. All the parameters have been calculated for each collimator size and for the isocentric configuration of the phantom. We have found the calculated output factors to be in agreement with other authors' works except in the case of the 4 mm collimator size, where averaging over the scoring volume and statistical uncertainties strongly influence the calculated results. In general, all the results are dependent on the choice of the scoring volume. The calculated linear dose profiles and relative dose distributions also match independent studies and the Leksell GammaPlan, but care must be taken about the fluctuations within the plateau, which can influence the normalization, and about accuracy in determining the isocenter position, which is important for comparing different dose profiles. The calculated differential dose volume histograms and integral doses have been compared with data provided by the Leksell GammaPlan. The dose volume histograms are in good agreement, as are the integral doses calculated in small calculation matrix volumes.
However, deviations in integral doses up to 50% can be observed for large volumes such as for the total skull volume. The differences observed in treatment of scattered radiation between the MC method and the LGP may be important in this case. We have also studied the influence of differential direction sampling of primary photons and have found that, due to the anisotropic sampling, doses around the isocenter deviate from each other by up to 6%. With caution about the details of the calculation settings, it is possible to employ the MCNP Monte Carlo code for independent verification of the Leksell Gamma Knife radiation field properties.
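Two of the compared quantities, the differential dose-volume histogram and the integral dose, are straightforward to form once a dose grid has been scored; a minimal sketch (the binning choice here is arbitrary):

```python
import numpy as np

def differential_dvh(dose_grid, voxel_volume, bins=50):
    """Differential DVH: the volume falling in each dose bin of a 3D dose grid.
    Returns bin-center doses and the corresponding volumes."""
    counts, edges = np.histogram(np.ravel(dose_grid), bins=bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, counts * voxel_volume

def integral_dose(dose_grid, voxel_mass):
    # integral dose (total energy imparted) = sum over voxels of dose x mass
    return float(np.sum(dose_grid)) * voxel_mass
```

The volumes across all bins sum to the scored region's total volume, which makes it easy to cross-check a DVH exported from a planning system against one built from a Monte Carlo dose grid.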

  17. Review of Hybrid (Deterministic/Monte Carlo) Radiation Transport Methods, Codes, and Applications at Oak Ridge National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wagner, John C; Peplow, Douglas E.; Mosher, Scott W

    2011-01-01

This paper provides a review of the hybrid (Monte Carlo/deterministic) radiation transport methods and codes used at the Oak Ridge National Laboratory and examples of their application for increasing the efficiency of real-world, fixed-source Monte Carlo analyses. The two principal hybrid methods are (1) Consistent Adjoint Driven Importance Sampling (CADIS) for optimization of a localized detector (tally) region (e.g., flux, dose, or reaction rate at a particular location) and (2) Forward Weighted CADIS (FW-CADIS) for optimizing distributions (e.g., mesh tallies over all or part of the problem space) or multiple localized detector regions (e.g., simultaneous optimization of two or more localized tally regions). The two methods have been implemented and automated in both the MAVRIC sequence of SCALE 6 and ADVANTG, a code that works with the MCNP code. As implemented, the methods utilize the results of approximate, fast-running 3-D discrete ordinates transport calculations (with the Denovo code) to generate consistent space- and energy-dependent source and transport (weight windows) biasing parameters. These methods and codes have been applied to many relevant and challenging problems, including calculations of PWR ex-core thermal detector response, dose rates throughout an entire PWR facility, site boundary dose from arrays of commercial spent fuel storage casks, radiation fields for criticality accident alarm system placement, and detector response for special nuclear material detection scenarios and nuclear well-logging tools. Substantial computational speed-ups, generally O(10^2-10^4), have been realized for all applications to date. This paper provides a brief review of the methods, their implementation, results of their application, and current development activities, as well as a considerable list of references for readers seeking more information about the methods and/or their applications.

  18. Hanford Environmental Dose Reconstruction Project. Monthly report, December 1991

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finch, S.M.; McMakin, A.H.

    1991-12-31

The objective of the Hanford Environmental Dose Reconstruction Project is to estimate the radiation doses that individuals and populations could have received from nuclear operations at Hanford since 1944. The project is being managed and conducted by the Pacific Northwest Laboratory (PNL) under the direction of an independent Technical Steering Panel (TSP). The TSP consists of experts in environmental pathways, epidemiology, surface-water transport, ground-water transport, statistics, demography, agriculture, meteorology, nuclear engineering, radiation dosimetry, and cultural anthropology. Included are appointed technical members representing the states of Oregon and Washington, a representative of Native American tribes, and an individual representing the public. The project is divided into the following technical tasks, which correspond to the path radionuclides followed, from release to impact on humans (dose estimates): Source Terms; Environmental Transport; Environmental Monitoring Data; Demographics, Agriculture, and Food Habits; and Environmental Pathways and Dose Estimates.

  19. Hanford Environmental Dose Reconstruction Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finch, S.M.; McMakin, A.H.

    1991-01-01

The objective of the Hanford Environmental Dose Reconstruction Project is to estimate the radiation doses that individuals and populations could have received from nuclear operations at Hanford since 1944. The project is being managed and conducted by the Pacific Northwest Laboratory (PNL) under the direction of an independent Technical Steering Panel (TSP). The TSP consists of experts in environmental pathways, epidemiology, surface-water transport, ground-water transport, statistics, demography, agriculture, meteorology, nuclear engineering, radiation dosimetry, and cultural anthropology. Included are appointed technical members representing the states of Oregon and Washington, a representative of Native American tribes, and an individual representing the public. The project is divided into the following technical tasks, which correspond to the path radionuclides followed, from release to impact on humans (dose estimates): Source Terms; Environmental Transport; Environmental Monitoring Data; Demographics, Agriculture, and Food Habits; and Environmental Pathways and Dose Estimates.

  20. LM193 Dual Differential Comparator Total Ionizing Dose Test Report

    NASA Technical Reports Server (NTRS)

    Topper, Alyson; Forney, James; Campola, Michael

    2017-01-01

The purpose of this test was to characterize the flight lot of Texas Instruments' LM193 (flight part number 5962-9452601Q2A) for total dose response. This test served as the radiation lot acceptance test (RLAT) for the lot date code (LDC) tested. Low dose rate (LDR) irradiations were performed in this test so that the device's susceptibility to enhanced low dose rate sensitivity (ELDRS) could be determined.

  1. Modelling of aircrew radiation exposure from galactic cosmic rays and solar particle events.

    PubMed

    Takada, M; Lewis, B J; Boudreau, M; Al Anid, H; Bennett, L G I

    2007-01-01

    Correlations have been developed for implementation into the semi-empirical Predictive Code for Aircrew Radiation Exposure (PCAIRE) to account for effects of extremum conditions of solar modulation and low altitude based on transport code calculations. An improved solar modulation model, as proposed by NASA, has been further adopted to interpolate between the bounding correlations for solar modulation. The conversion ratio of effective dose to ambient dose equivalent, as applied to the PCAIRE calculation (based on measurements) for the legal regulation of aircrew exposure, was re-evaluated in this work to take into consideration new ICRP-92 radiation-weighting factors and different possible irradiation geometries of the source cosmic-radiation field. A computational analysis with Monte Carlo N-Particle eXtended Code was further used to estimate additional aircrew exposure that may result from sporadic solar energetic particle events considering real-time monitoring by the Geosynchronous Operational Environmental Satellite. These predictions were compared with the ambient dose equivalent rates measured on-board an aircraft and to count rate data observed at various ground-level neutron monitors.

  2. Analysis of neutron and gamma-ray streaming along the maze of NRCAM thallium production target room.

    PubMed

    Raisali, G; Hajiloo, N; Hamidi, S; Aslani, G

    2006-08-01

The shield performance of a thallium-203 production target room has been investigated in this work. Neutron and gamma-ray equivalent dose rates at various points of the maze are calculated by simulating the transport of streaming neutrons and photons using the Monte Carlo method. For determination of the neutron and gamma-ray source intensities and their energy spectra, we applied the SRIM 2003 and ALICE91 computer codes to the Tl target and its Cu substrate for a 145 microA beam of 28.5 MeV protons. The MCNP/4C code was first applied with the neutron source term in mode n p to consider both prompt neutrons and secondary gamma-rays, and then applied with the prompt gamma-rays as the source term. The neutron-flux energy spectrum and the equivalent dose rates for neutrons and gamma-rays at various positions in the maze have been calculated. It has been found that the deviation between calculated and measured dose values along the maze is less than 20%.

  3. Maxdose-SR and popdose-SR routine release atmospheric dose models used at SRS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jannik, G. T.; Trimor, P. P.

    MAXDOSE-SR and POPDOSE-SR are used to calculate dose to the offsite Reference Person and to the surrounding Savannah River Site (SRS) population, respectively, following routine releases of atmospheric radioactivity. These models are currently accessed through the Dose Model Version 2014 graphical user interface (GUI). MAXDOSE-SR and POPDOSE-SR are personal computer (PC) versions of MAXIGASP and POPGASP, which both resided on the SRS IBM mainframe. These two codes follow U.S. Nuclear Regulatory Commission (USNRC) Regulatory Guides 1.109 and 1.111 (1977a, 1977b). The basis for MAXDOSE-SR and POPDOSE-SR is the USNRC-developed codes XOQDOQ (Sagendorf et al. 1982) and GASPAR (Eckerman et al. 1980). Both of these codes have previously been verified for use at SRS (Simpkins 1999 and 2000). The revisions incorporated into MAXDOSE-SR and POPDOSE-SR Version 2014 (hereafter referred to as MAXDOSE-SR and POPDOSE-SR unless otherwise noted) were made per Computer Program Modification Tracker (CPMT) number Q-CMT-A-00016 (Appendix D). Version 2014 was verified for use at SRS in Dixon (2014).

  4. A novel three-dimensional image reconstruction method for near-field coded aperture single photon emission computerized tomography

    PubMed Central

    Mu, Zhiping; Hong, Baoming; Li, Shimin; Liu, Yi-Hwa

    2009-01-01

    Coded aperture imaging for two-dimensional (2D) planar objects has been investigated extensively in the past, whereas little success has been achieved in imaging 3D objects using this technique. In this article, the authors present a novel method of 3D single photon emission computerized tomography (SPECT) reconstruction for near-field coded aperture imaging. Multiangular coded aperture projections are acquired and a stack of 2D images is reconstructed separately from each of the projections. Secondary projections are subsequently generated from the reconstructed image stacks based on the geometry of parallel-hole collimation and the variable magnification of near-field coded aperture imaging. Sinograms of cross-sectional slices of 3D objects are assembled from the secondary projections, and the ordered subset expectation and maximization algorithm is employed to reconstruct the cross-sectional image slices from the sinograms. Experiments were conducted using a customized capillary tube phantom and a micro hot rod phantom. Imaged at approximately 50 cm from the detector, hot rods in the phantom with diameters as small as 2.4 mm could be discerned in the reconstructed SPECT images. These results have demonstrated the feasibility of the authors’ 3D coded aperture image reconstruction algorithm for SPECT, representing an important step in their effort to develop a high sensitivity and high resolution SPECT imaging system. PMID:19544769
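The final reconstruction step above uses ordered-subset expectation maximization; with a single subset this reduces to the classic MLEM update, sketched here on a toy 3x3 system matrix with noiseless data (all values invented, not from the study).

```python
import numpy as np

def mlem(A, y, n_iter=500):
    """MLEM update: x <- x * A^T(y / Ax) / A^T 1 (single-subset OSEM)."""
    x = np.ones(A.shape[1])                    # flat initial estimate
    sens = A.T @ np.ones(A.shape[0])           # sensitivity image A^T 1
    for _ in range(n_iter):
        ratio = y / np.maximum(A @ x, 1e-12)   # measured / forward projection
        x *= (A.T @ ratio) / sens
    return x

# toy system matrix (3 sinogram bins, 3 image voxels) and phantom
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
x_true = np.array([2.0, 1.0, 3.0])
y = A @ x_true                                 # noiseless "sinogram"
x_hat = mlem(A, y)
print(np.round(x_hat, 2))
```

In the paper's pipeline, OSEM accelerates this update by cycling through subsets of the assembled sinogram rather than using all projections per iteration.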

  5. Probabilistic dose assessment of normal operations and accident conditions for an assured isolation facility in Texas

    NASA Astrophysics Data System (ADS)

    Arno, Matthew Gordon

    Texas is investigating building a long-term waste storage facility, also known as an Assured Isolation Facility. This is an above-ground low-level radioactive waste storage facility that is actively maintained and from which waste may be retrieved. A preliminary, scoping-level analysis has been extended to consider more complex scenarios of radiation streaming and skyshine by using the computer code Monte Carlo N-Particle (MCNP) to model the facility in greater detail. Accidental release scenarios have been studied in more depth to better assess the potential dose to off-site individuals. Using bounding source-term assumptions, the projected radiation doses and dose rates are estimated to exceed applicable limits by an order of magnitude. Altering the facility design to fill in the hollow cores of the prefabricated concrete slabs used in the roof over the "high-gamma rooms," where the waste with the highest concentration of gamma-emitting radioactive material is stored, decreases dose rates outside the facility by an order of magnitude. With the modified design, the annual dose at the site fenceline is estimated at 86 mrem, below the 100 mrem annual limit for exposure of the public. Within the site perimeter, the dose rates are lowered sufficiently that many workers and contractor personnel need not be categorized as radiation workers, which saves costs and is advisable under ALARA principles. A detailed analysis of bounding accidents incorporating information on the local meteorological conditions indicates that the maximum committed effective dose equivalent from the passage of a plume of material released in an accident, at any of the cities near the facility, is 59 μrem in the city of Eunice, NM, based on the combined day and night meteorological conditions. Using the daytime meteorological conditions, the maximum dose at any city is 7 μrem, also in the city of Eunice. The maximum dose at the site boundary was determined to be 230 mrem using the combined day and night meteorological conditions and 33 mrem using the daytime conditions.

  6. A new code for Galileo

    NASA Technical Reports Server (NTRS)

    Dolinar, S.

    1988-01-01

    Over the past six to eight years, an extensive research effort was conducted to investigate advanced coding techniques which promised to yield more coding gain than is available with current NASA standard codes. The delay in Galileo's launch due to the temporary suspension of the shuttle program provided the Galileo project with an opportunity to evaluate the possibility of including some version of the advanced codes as a mission enhancement option. A study was initiated last summer to determine if substantial coding gain was feasible for Galileo and, if so, to recommend a suitable experimental code for use as a switchable alternative to the current NASA-standard code. The Galileo experimental code study resulted in the selection of a code with constraint length 15 and rate 1/4. The code parameters were chosen to optimize performance within cost and risk constraints consistent with retrofitting the new code into the existing Galileo system design and launch schedule. The particular code was recommended after a very limited search among good codes with the chosen parameters. It will theoretically yield about 1.5 dB enhancement under idealizing assumptions relative to the current NASA-standard code at Galileo's desired bit error rates. This ideal predicted gain includes enough cushion to meet the project's target of at least 1 dB enhancement under real, non-ideal conditions.
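The structure of the selected code can be illustrated with a short encoder sketch: a rate-1/4 convolutional code with constraint length K = 15 emits four parity bits per input bit from a 15-stage shift register. The four generator polynomials below are arbitrary placeholders for illustration; the abstract does not give the generators actually chosen for Galileo.

```python
# Rate-1/4, constraint-length-15 convolutional encoder (sketch).
# Generator polynomials are hypothetical, NOT the Galileo experimental code's.
K = 15
GENS = [0b100101010111011, 0b110110110101001,
        0b101011100110111, 0b111111010110101]

def conv_encode(bits):
    """Rate-1/4 encoding: four output bits per input bit."""
    reg = 0
    out = []
    for b in bits:
        reg = ((reg << 1) | b) & ((1 << K) - 1)      # 15-stage shift register
        for g in GENS:
            out.append(bin(reg & g).count("1") & 1)  # parity of tapped stages
    return out

print(len(conv_encode([1, 0, 1, 1])))   # 16 coded bits for 4 input bits
```

The rate-1/4 expansion (four channel bits per data bit) is where the extra coding gain over the standard rate-1/2 code comes from, at the cost of bandwidth.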

  7. Reducing statistical uncertainties in simulated organ doses of phantoms immersed in water

    DOE PAGES

    Hiller, Mauritius M.; Veinot, Kenneth G.; Easterly, Clay E.; ...

    2016-08-13

    In this study, methods are addressed to reduce the computational time required to compute organ-dose rate coefficients using Monte Carlo techniques. Several variance reduction techniques are compared, including the reciprocity method, importance sampling, weight windows and the use of the ADVANTG software package. For low-energy photons, the runtime was reduced by a factor of 10^5 when using the reciprocity method for kerma computation for immersion of a phantom in contaminated water. This is particularly significant since impractically long simulation times are required to achieve reasonable statistical uncertainties in organ dose for low-energy photons in this source medium and geometry. Although the MCNP Monte Carlo code is used in this paper, the reciprocity technique can be used equally well with other Monte Carlo codes.
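The reciprocity idea can be demonstrated numerically with a toy symmetric point kernel: the mean dose to a small detector from a uniformly contaminated sphere equals the same integral computed with source and detector swapped. The kernel, sphere size and attenuation coefficient below are invented for illustration and are not the paper's model (a simplified 1/r kernel is used to keep the toy estimators well behaved); this is not an MCNP calculation.

```python
import math
import numpy as np

rng = np.random.default_rng(0)
mu, R = 0.1, 5.0                     # toy attenuation coefficient, sphere radius
V = 4.0 / 3.0 * math.pi * R**3       # volume of the contaminated sphere

def kernel(r):
    """Simplified symmetric point kernel (illustrative, not the photon kernel)."""
    return np.exp(-mu * r) / (4.0 * math.pi * r)

def forward(n):
    """Sample source points uniformly in the sphere, tally at the origin."""
    r = R * rng.random(n) ** (1.0 / 3.0)   # uniform-in-volume radii
    return kernel(r).mean()

def reciprocal(n):
    """Emit from the detector instead; integrate the same kernel over the
    source region (radius sampled uniformly, with the 4*pi*r^2 Jacobian)."""
    r = R * rng.random(n)
    return (kernel(r) * 4.0 * math.pi * r**2 * R / V).mean()

exact = (1.0 - math.exp(-mu * R) * (1.0 + mu * R)) / (mu**2 * V)
print(forward(200_000), reciprocal(200_000), exact)
```

Both estimators target the same expectation because the kernel is symmetric in source and detector position; in the real problem the reciprocal formulation concentrates the sampling where it matters, which is the origin of the large speedup.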

  8. 78 FR 23497 - Propiconazole; Pesticide Tolerances

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-19

    ...). Animal production (NAICS code 112). Food manufacturing (NAICS code 311). Pesticide manufacturing (NAICS.... Aggregate Risk Assessment and Determination of Safety Section 408(b)(2)(A)(i) of FFDCA allows EPA to... dose at which adverse effects of concern are identified (the LOAEL). Uncertainty/safety factors are...

  9. FIIDOS--A Computer Code for the Computation of Fallout Inhalation and Ingestion Dose to Organs Computer User’s Guide (Revision 4)

    DTIC Science & Technology

    2007-05-01

    Actinide product radionuclides... actinides, and fission products in fallout. Doses from low-linear energy transfer (LET) radiation (beta particles and gamma rays) are reported separately... assumptions about the critical parameters used in calculating internal doses – resuspension factor, breathing rate, fractionation, and scenario elements – to

  10. Orthovoltage radiation therapy treatment planning using Monte Carlo simulation: treatment of neuroendocrine carcinoma of the maxillary sinus

    NASA Astrophysics Data System (ADS)

    Gao, Wanbao; Raeside, David E.

    1997-12-01

    Dose distributions that result from treating a patient with orthovoltage beams are best determined with a treatment planning system that uses the Monte Carlo method, and such systems are not readily available. In the present work, the Monte Carlo method was used to develop a computer code for determining absorbed dose distributions in orthovoltage radiation therapy. The code was used in planning the treatment of a patient with a neuroendocrine carcinoma of the maxillary sinus. Two lateral high-energy photon beams supplemented by an anterior orthovoltage photon beam were utilized in the treatment plan. For the clinical case and radiation beams considered, a reasonably uniform dose distribution is achieved within the target volume, while the dose to the lens of each eye is 4-8% of the prescribed dose. Therefore, an orthovoltage photon beam, when properly filtered and optimally combined with megavoltage beams, can be effective in the treatment of cancers below the skin, provided that accurate treatment planning is carried out to establish with accuracy and precision the doses to critical structures.

  11. An Approach in Radiation Therapy Treatment Planning: A Fast, GPU-Based Monte Carlo Method.

    PubMed

    Karbalaee, Mojtaba; Shahbazi-Gahrouei, Daryoush; Tavakoli, Mohammad B

    2017-01-01

    An accurate and fast radiation dose calculation is essential for successful radiotherapy. The aim of this study was to implement a new graphics processing unit (GPU)-based radiation therapy treatment planning system for accurate and fast dose calculation in radiotherapy centers. A program was written for parallel execution on the GPU. The code was validated against EGSnrc/DOSXYZnrc. Moreover, a semi-automatic, rotary, asymmetric phantom was designed and produced using bone, lung, and soft-tissue equivalent materials. All measurements were performed using a MapCHECK dosimeter. The accuracy of the code was further validated using the experimental data obtained from the anthropomorphic phantom as the gold standard. Compared with DOSXYZnrc in the virtual phantom, most of the voxels (>95%) met a <3% dose-difference or 3 mm distance-to-agreement (DTA) criterion. Moreover, for the anthropomorphic phantom, compared to the MapCHECK dose measurements, <5% dose-difference or 5 mm DTA was observed. The fast calculation speed and high accuracy of the GPU-based Monte Carlo method may make it useful in routine radiation therapy centers as the core component of a treatment planning verification system.
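The composite acceptance criterion quoted in the abstract ("<3% dose-difference or 3 mm DTA") can be sketched for a 1D profile. The Gaussian test profile and grid spacing below are made up, and the discrete nearest-sample DTA search is a simplification of a full gamma analysis.

```python
import numpy as np

def pass_rate(ref, evl, spacing_mm, dd=0.03, dta_mm=3.0):
    """Fraction of points passing |dose diff| <= dd*max(ref) OR DTA <= dta_mm."""
    x = np.arange(len(ref)) * spacing_mm
    tol = dd * ref.max()                     # global dose-difference tolerance
    ok = 0
    for i in range(len(ref)):
        if abs(evl[i] - ref[i]) <= tol:      # dose-difference criterion
            ok += 1
            continue
        same = np.abs(ref - evl[i]) <= tol   # reference samples with "same" dose
        if same.any() and np.min(np.abs(x[same] - x[i])) <= dta_mm:
            ok += 1                          # distance-to-agreement criterion
    return ok / len(ref)

ref = np.exp(-((np.arange(50) - 25.0) ** 2) / 200.0)  # toy Gaussian dose profile
shifted = np.roll(ref, 1)                             # 1 mm spatial shift
print(pass_rate(ref, shifted, spacing_mm=1.0))
```

A small spatial shift fails the dose-difference test in steep-gradient regions but passes via DTA, which is exactly why the two criteria are combined.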

  12. Noise correlation in CBCT projection data and its application for noise reduction in low-dose CBCT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Hua; Ouyang, Luo; Wang, Jing, E-mail: jhma@smu.edu.cn, E-mail: jing.wang@utsouthwestern.edu

    2014-03-15

    Purpose: To study the noise correlation properties of cone-beam CT (CBCT) projection data and to incorporate the noise correlation information into a statistics-based projection restoration algorithm for noise reduction in low-dose CBCT. Methods: In this study, the authors systematically investigated the noise correlation properties among detector bins of CBCT projection data by analyzing repeated projection measurements. The measurements were performed on a TrueBeam onboard CBCT imaging system with a 4030CB flat panel detector. An anthropomorphic male pelvis phantom was used to acquire 500 repeated projections at six different dose levels from 0.1 to 1.6 mAs per projection at three fixed angles. To minimize the influence of the lag effect, lag correction was performed on the consecutively acquired projection data. The noise correlation coefficient between detector bin pairs was calculated from the corrected projection data. The noise correlation among CBCT projection data was then incorporated into the covariance matrix of the penalized weighted least-squares (PWLS) criterion for noise reduction of low-dose CBCT. Results: The analyses of the repeated measurements show that noise correlation coefficients are nonzero between the nearest neighboring bins of CBCT projection data. The average noise correlation coefficients for the first- and second-order neighbors are 0.20 and 0.06, respectively. The noise correlation coefficients are independent of the dose level. Reconstruction of the pelvis phantom shows that the PWLS criterion with consideration of noise correlation (PWLS-Cor) results in a lower noise level than the PWLS criterion without noise correlation (PWLS-Dia) at matched resolution. At the 2.0 mm resolution level in the axial-plane noise-resolution tradeoff analysis, the noise level of the PWLS-Cor reconstruction is 6.3% lower than that of the PWLS-Dia reconstruction.
    Conclusions: Noise is correlated among nearest neighboring detector bins of CBCT projection data. An accurate noise model of CBCT projection data can improve the performance of the statistics-based projection restoration algorithm for low-dose CBCT.
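The repeated-measurement analysis can be sketched on synthetic data: simulate correlated detector noise, then estimate the lag-1 and lag-2 correlation coefficients across the repeats. The neighbor-mixing weight and all numbers below are illustrative, not the paper's measured values (which were 0.20 and 0.06).

```python
import numpy as np

rng = np.random.default_rng(1)
n_repeats, n_bins = 500, 64

# toy correlated detector noise: mix each bin with its immediate neighbors
white = rng.normal(size=(n_repeats, n_bins + 2))
proj = white[:, 1:-1] + 0.25 * (white[:, :-2] + white[:, 2:])

def neighbor_corr(data, lag):
    """Average Pearson correlation between detector bins `lag` apart."""
    rs = [np.corrcoef(data[:, i], data[:, i + lag])[0, 1]
          for i in range(data.shape[1] - lag)]
    return float(np.mean(rs))

r1, r2 = neighbor_corr(proj, 1), neighbor_corr(proj, 2)
print(round(r1, 3), round(r2, 3))   # theory for this toy model: 0.444 and 0.056
```

In the paper these off-diagonal coefficients populate the covariance matrix of the PWLS criterion; with uncorrelated noise that matrix would be diagonal (the PWLS-Dia case).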

  13. Automated Coding Software: Development and Use to Enhance Anti-Fraud Activities*

    PubMed Central

    Garvin, Jennifer H.; Watzlaf, Valerie; Moeini, Sohrab

    2006-01-01

    This descriptive research project identified characteristics of automated coding systems that have the potential to detect improper coding and to minimize improper or fraudulent coding practices in the setting of automated coding used with the electronic health record (EHR). Recommendations were also developed for software developers and users of coding products to maximize anti-fraud practices. PMID:17238546

  14. Restored low-dose digital breast tomosynthesis: a perception study

    NASA Astrophysics Data System (ADS)

    Borges, Lucas R.; Bakic, Predrag R.; Maidment, Andrew D. A.; Vieira, Marcelo A. C.

    2018-03-01

    This work investigates the perception of noise in restored low-dose digital breast tomosynthesis (DBT) images. First, low-dose DBT projections were generated using a dose reduction simulation algorithm; a dataset of clinical images from the Hospital of the University of Pennsylvania was used for this purpose. The low-dose projections were then denoised with a denoising pipeline developed specifically for DBT images. Denoised and noisy projections were combined to generate images with signal-to-noise ratio comparable to the full-dose images. The quality of the restored low-dose and full-dose projections was first compared in terms of an objective no-reference image quality metric previously validated for mammography. In the second analysis, regions of interest (ROIs) were selected from reconstructed full-dose and restored low-dose slices and displayed side-by-side on a high-resolution medical display. Five medical physics specialists were asked to choose the image containing less noise and less blur in a two-alternative forced-choice (2-AFC) experiment. The objective metric shows that, after the proposed image restoration framework was applied, images with as little as 60% of the AEC dose yielded quality indices similar to images acquired with the full dose. The 2-AFC experiments showed that, when the denoising framework was used, a 30% reduction in dose was possible without any perceived difference in noise or blur. Note that this study evaluated the observers' perception of noise and blur and does not claim that the dose of DBT examinations can be reduced with no harm to the detection of cancer. Future work is necessary to make any claims regarding detection, localization and characterization of lesions.
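The restoration strategy described above, blending denoised and noisy projections to reach a full-dose noise level, can be sketched in an idealized form. This assumes additive Gaussian noise and a perfect denoiser; the noise levels and profile are invented, not the study's pipeline.

```python
import numpy as np

rng = np.random.default_rng(2)
signal = np.sin(np.linspace(0, np.pi, 1000)) * 100   # toy projection profile
sigma_full, sigma_low = 2.0, 5.0                     # noise std at full/low dose

low_dose = signal + rng.normal(0, sigma_low, signal.size)
denoised = signal.copy()                             # idealized (perfect) denoiser
w = 1 - sigma_full / sigma_low                       # blend weight
restored = w * denoised + (1 - w) * low_dose         # (1-w)*sigma_low == sigma_full

print(round(float(np.std(restored - signal)), 2))    # close to sigma_full
```

Blending back a controlled amount of the original noise is what keeps the restored images looking like full-dose acquisitions rather than over-smoothed ones.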

  15. Radiation exposure for manned Mars surface missions

    NASA Technical Reports Server (NTRS)

    Simonsen, Lisa C.; Nealy, John E.; Townsend, Lawrence W.; Wilson, John W.

    1990-01-01

    The Langley cosmic ray transport code and the Langley nucleon transport code (BRYNTRN) are used to quantify the transport and attenuation of galactic cosmic rays (GCR) and solar proton flares through the Martian atmosphere. Surface doses are estimated using both a low density and a high density carbon dioxide model of the atmosphere which, in the vertical direction, provides a total of 16 g/sq cm and 22 g/sq cm of protection, respectively. At the Mars surface during the solar minimum cycle, a blood-forming organ (BFO) dose equivalent of 10.5 to 12 rem/yr due to galactic cosmic ray transport and attenuation is calculated. Estimates of the BFO dose equivalents which would have been incurred from the three large solar flare events of August 1972, November 1960, and February 1956 are also calculated at the surface. Results indicate surface BFO dose equivalents of approximately 2 to 5, 5 to 7, and 8 to 10 rem per event, respectively. Doses are also estimated at altitudes up to 12 km above the Martian surface where the atmosphere will provide less total protection.

  16. Comparison of cosmic rays radiation detectors on-board commercial jet aircraft.

    PubMed

    Kubančák, Ján; Ambrožová, Iva; Brabcová, Kateřina Pachnerová; Jakůbek, Jan; Kyselová, Dagmar; Ploc, Ondřej; Bemš, Július; Štěpán, Václav; Uchihori, Yukio

    2015-06-01

    Aircrew members and passengers are exposed to increased rates of cosmic radiation on-board commercial jet aircraft. The annual effective doses of crew members often exceed limits for the public, thus it is recommended to monitor them. In general, the doses are estimated via various computer codes and in some countries also verified by measurements. This paper describes a comparison of three cosmic-ray detectors, namely the (a) HAWK Tissue Equivalent Proportional Counter, (b) Liulin semiconductor energy deposit spectrometer and (c) TIMEPIX silicon semiconductor pixel detector, exposed to radiation fields on-board commercial Czech Airlines jet aircraft. Measurements were performed during passenger flights from Prague to Madrid, Oslo, Tbilisi, Yekaterinburg and Almaty, and back, in July and August 2011. For all flights, energy deposit spectra and absorbed doses are presented. Measured absorbed dose and dose equivalent are compared with EPCARD code calculations. Finally, the advantages and disadvantages of all detectors are discussed.

  17. Neutron emission and dose distribution from natural carbon irradiated with a 12 MeV amu-1 12C5+ ion beam.

    PubMed

    Nandy, Maitreyee; Sarkar, P K; Sanami, T; Takada, M; Shibata, T

    2016-09-01

    Measured neutron energy distributions emitted from a thick stopping target of natural carbon at 0°, 30°, 60° and 90°, from nuclear reactions caused by 12 MeV amu-1 incident 12C5+ ions, were converted to energy differential and total neutron absorbed dose as well as ambient dose equivalent H*(10) using the fluence-to-dose conversion coefficients provided by the ICRP. Theoretical estimates were obtained using the Monte Carlo nuclear reaction model code PACE and a few existing empirical formulations for comparison. Results from the PACE code showed an underestimation of the high-energy part of the energy differential dose distributions at forward angles, whereas the empirical formulation by Clapier and Zaidins (1983 Nucl. Instrum. Methods 217 489-94) approximated the energy-integrated angular distribution of H*(10) satisfactorily. Using the measured data, the neutron doses received by some vital human organs were estimated for anterior-posterior exposure. The estimated energy-averaged quality factors were found to vary for different organs from about 7 to about 13. Emitted neutrons having energies above 20 MeV were found to contribute about 20% of the total dose at 0°, while at 90° the contribution was reduced to about 2%.
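The conversion step described above amounts to folding the measured fluence spectrum with fluence-to-dose coefficients, H*(10) = Σ_i φ(E_i)·h(E_i). The binned spectrum and coefficients below are invented placeholders with plausible magnitudes, not ICRP values or the measured data.

```python
import numpy as np

E = np.array([1.0, 5.0, 10.0, 20.0, 50.0])        # MeV, bin centers (toy binning)
phi = np.array([4e4, 2e4, 1e4, 5e3, 1e3])         # neutron fluence per bin, cm^-2
h = np.array([416.0, 405.0, 440.0, 600.0, 510.0]) # pSv cm^2, illustrative only

H10_pSv = float(np.sum(phi * h))                  # fold spectrum with coefficients
H10_uSv = H10_pSv * 1e-6
frac_above_20 = float(np.sum((phi * h)[E > 20.0]) / H10_pSv)
print(round(H10_uSv, 2), round(frac_above_20, 3))
```

The same fold, restricted to bins above 20 MeV, gives the high-energy dose fraction the abstract quotes per angle.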

  18. Estimation of dose rates at the entrance surface for exposure scenarios of total body irradiation using MCNPX code

    NASA Astrophysics Data System (ADS)

    Cunha, J. S.; Cavalcante, F. R.; Souza, S. O.; Souza, D. N.; Santos, W. S.; Carvalho Júnior, A. B.

    2017-11-01

    One of the main criteria that must be met in Total Body Irradiation (TBI) is the uniformity of dose in the body. In TBI procedures, the certification that the prescribed doses are absorbed in organs is made with dosimeters positioned on the patient's skin. In this work, we modelled TBI scenarios in the MCNPX code to estimate the entrance dose rate in the skin, for comparison and validation of the simulations against experimental measurements from the literature. Dose rates were estimated by simulating an ionization chamber laterally positioned on the thorax, abdomen, leg and thigh. Four exposure scenarios were simulated: ionization chamber (S1), TBI room (S2), and patient represented by a hybrid phantom (S3) and by a water stylized phantom (S4) in sitting posture. The posture of the patient in the experimental work was better represented by S4 than by the hybrid phantom, which led to minimum and maximum percentage differences from the experimental measurements of 1.31% and 6.25% for the thorax and thigh regions, respectively. Since the percentage differences in the estimated dose rates were less than 10% for all simulations reported here, we consider the obtained results consistent with the experimental measurements and the modelled scenarios suitable for estimating the absorbed dose in organs during TBI procedures.

  19. Application of the MCNP5 code to the Modeling of vaginal and intra-uterine applicators used in intracavitary brachytherapy: a first approach

    NASA Astrophysics Data System (ADS)

    Gerardy, I.; Rodenas, J.; Van Dycke, M.; Gallardo, S.; Tondeur, F.

    2008-02-01

    Brachytherapy is a radiotherapy treatment where encapsulated radioactive sources are introduced within a patient. Depending on the technique used, such sources can produce high, medium or low local dose rates. The Monte Carlo method is a powerful tool to simulate sources and devices in order to help physicists in treatment planning. In multiple types of gynaecological cancer, intracavitary brachytherapy (HDR Ir-192 source) is used in combination with other treatments to deliver an additional local dose to the tumour. Different types of applicators are used in order to increase the dose imparted to the tumour and to limit the effect on healthy surrounding tissues. The aim of this work is to model both the applicator and the HDR source in order to evaluate the dose at a reference point as well as the effect of the materials constituting the applicators on the near-field dose. The MCNP5 code, based on the Monte Carlo method, has been used for the simulation. Dose calculations have been performed with the *F8 energy deposition tally, taking into account photons and electrons. Results from the simulation have been compared with experimental in-phantom dose measurements. Differences between calculations and measurements are lower than 5%. The importance of the source position has been underlined.

  20. SU-E-I-15: Comparison of Radiation Dose for Radiography and EOS in Adolescent Scoliosis Patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schueler, B; Walz-Flannigan, A

    Purpose: To estimate patient radiation dose for whole spine imaging using EOS, a new biplanar slot-scanning radiographic system, and to compare with standard scoliosis radiography. Methods: The EOS imaging system (EOS Imaging, Paris, France) consists of two orthogonal x-ray fan beams which simultaneously acquire frontal and lateral projection images of a standing patient. The patient entrance skin air kerma was measured for each projection image using manufacturer-recommended exposure parameters for spine imaging. Organ and effective doses were estimated using a commercially available Monte Carlo simulation program (PCXMC, STUK, Radiation and Nuclear Safety Authority, Helsinki, Finland) for a 15-year-old mathematical phantom model. These results were compared to organ and effective doses estimated for scoliosis radiography using computed radiography (CR) with standard exposure parameters obtained from a survey of pediatric radiographic projections. Results: The entrance skin air kerma for EOS was found to be 0.18 mGy and 0.33 mGy for the posterior-anterior (PA) and lateral projections, respectively. This compares to 0.76 mGy and 1.4 mGy for the CR PA and lateral projections. The effective dose for EOS (PA and lateral projections combined) is 0.19 mSv, compared to 0.51 mSv for CR. Conclusion: The EOS slot-scanning radiographic system allows for reduced patient radiation dose in scoliosis patients as compared to standard CR radiography.

  1. Hanford Environmental Dose Reconstruction Project monthly report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McMakin, A.H., Cannon, S.D.; Finch, S.M.

    1992-09-01

    The objective of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation doses that individuals and populations could have received from nuclear operations at Hanford since 1944. The TSP consists of experts in environmental pathways, epidemiology, surface-water transport, ground-water transport, statistics, demography, agriculture, meteorology, nuclear engineering, radiation dosimetry, and cultural anthropology. Included are appointed members representing the states of Oregon, Washington, and Idaho, a representative of Native American tribes, and an individual representing the public. The project is divided into the following technical tasks, which correspond to the path radionuclides followed from release to impact on humans (dose estimates): Source Terms; Environmental Transport; Environmental Monitoring Data; Demography, Food Consumption, and Agriculture; and Environmental Pathways and Dose Estimates.

  2. MO-E-18C-04: Advanced Computer Simulation and Visualization Tools for Enhanced Understanding of Core Medical Physics Concepts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naqvi, S

    2014-06-15

    Purpose: Most medical physics programs emphasize proficiency in routine clinical calculations and QA. The formulaic aspect of these calculations and the prescriptive nature of measurement protocols obviate the need to frequently apply basic physical principles, which therefore gradually decay away from memory. For example, few students appreciate the role of electron transport in photon dose, making it difficult to understand key concepts such as dose buildup, electronic disequilibrium effects and Bragg-Gray theory. These conceptual deficiencies manifest when the physicist encounters a new system requiring knowledge beyond routine activities. Methods: Two interactive computer simulation tools were developed to facilitate deeper learning of physical principles. One is a Monte Carlo code written with a strong educational aspect. The code can "label" regions and interactions to highlight specific aspects of the physics, e.g., certain regions can be designated as "starters" or "crossers," and any interaction type can be turned on and off. Full 3D tracks with specific portions highlighted further enhance the visualization of radiation transport problems. The second code calculates and displays the trajectories of a collection of electrons under an arbitrary space- and time-dependent Lorentz force using relativistic kinematics. Results: Using the Monte Carlo code, the student can interactively study photon and electron transport through visualization of dose components, particle tracks, and interaction types. The code can, for instance, be used to study the kerma-dose relationship, explore electronic disequilibrium near interfaces, or visualize kernels by using interaction forcing. The electromagnetic simulator enables the student to explore accelerating mechanisms and particle optics in devices such as cyclotrons and linacs. Conclusion: The proposed tools are designed to enhance understanding of abstract concepts by highlighting various aspects of the physics.
    The simulations serve as virtual experiments that give a deeper and longer-lasting understanding of core principles, so that the student can make sound judgements in novel situations encountered beyond routine clinical activities.

  3. A dosimetry study comparing NCS report-5, IAEA TRS-381, AAPM TG-51 and IAEA TRS-398 in three clinical electron beam energies

    NASA Astrophysics Data System (ADS)

    Palmans, Hugo; Nafaa, Laila; de Patoul, Nathalie; Denis, Jean-Marc; Tomsej, Milan; Vynckier, Stefaan

    2003-05-01

    New codes of practice for reference dosimetry in clinical high-energy photon and electron beams have been published recently, to replace the air kerma based codes of practice that have determined the dosimetry of these beams for the past twenty years. In the present work, we compared dosimetry based on the two most widespread absorbed dose based recommendations (AAPM TG-51 and IAEA TRS-398) with two air kerma based recommendations (NCS report-5 and IAEA TRS-381). Measurements were performed in three clinical electron beam energies using two NE2571-type cylindrical chambers, two Markus-type plane-parallel chambers and two NACP-02-type plane-parallel chambers. Dosimetry based on direct calibrations of all chambers in 60Co was investigated, as well as dosimetry based on cross-calibrations of plane-parallel chambers against a cylindrical chamber in a high-energy electron beam. Furthermore, 60Co perturbation factors for plane-parallel chambers were derived. It is shown that the use of 60Co calibration factors could result in deviations of more than 2% for plane-parallel chambers between the old and new codes of practice, whereas the use of cross-calibration factors, which is the first recommendation in the new codes, reduces the differences to less than 0.8% for all situations investigated here. The results thus show that neither the chamber-to-chamber variations nor the obtained absolute dose values are significantly altered by changing from air kerma based to absorbed dose based dosimetry when using calibration factors obtained from the Laboratory for Standard Dosimetry, Ghent, Belgium. The values of the 60Co perturbation factor for plane-parallel chambers (katt·km for the air kerma based and pwall for the absorbed dose based codes of practice) obtained by comparing the results based on 60Co calibrations and cross-calibrations are, within the experimental uncertainties, in agreement with the results of other investigators.
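The cross-calibration route recommended by the new codes of practice can be written out explicitly: both chambers are irradiated to the same dose to water in a high-energy electron beam, so the plane-parallel chamber's calibration factor follows from the reading ratio. All readings and factors below are invented round numbers, not measured data.

```python
# TRS-398-style cross-calibration sketch: N_Dw_pp = M_cyl * N_Dw_cyl * kQ / M_pp.
# All numeric inputs are hypothetical, for illustration only.
def cross_cal_factor(M_cyl, N_Dw_cyl, k_Q_cyl, M_pp):
    """Plane-parallel N_D,w from a side-by-side comparison at the same dose."""
    dose_to_water = M_cyl * N_Dw_cyl * k_Q_cyl   # Gy, from the reference chamber
    return dose_to_water / M_pp                  # same dose / pp-chamber reading

# hypothetical corrected readings (nC) and factors
N_pp = cross_cal_factor(M_cyl=12.50, N_Dw_cyl=0.0450, k_Q_cyl=0.905, M_pp=10.20)
print(round(N_pp, 5))   # Gy/nC for the plane-parallel chamber
```

Because both chambers see the same beam quality, the poorly known 60Co perturbation factors of the plane-parallel chamber drop out, which is why this route reduced the spread to below 0.8% in the study.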

  4. Coded mask telescopes for X-ray astronomy

    NASA Astrophysics Data System (ADS)

    Skinner, G. K.; Ponman, T. J.

    1987-04-01

    The principles of the coded mask technique are discussed together with the methods of image reconstruction. The coded mask telescopes built at the University of Birmingham are described, including the SL 1501 coded mask X-ray telescope flown on the Skylark rocket and the Coded Mask Imaging Spectrometer (COMIS) projected for the Soviet space station Mir. A diagram of a coded mask telescope and some designs for coded masks are included.

  5. A comprehensive study on the relationship between the image quality and imaging dose in low-dose cone beam CT

    NASA Astrophysics Data System (ADS)

    Yan, Hao; Cervino, Laura; Jia, Xun; Jiang, Steve B.

    2012-04-01

    While compressed sensing (CS)-based algorithms have been developed for low-dose cone beam CT (CBCT) reconstruction, a clear understanding of the relationship between image quality and imaging dose at low-dose levels is needed. In this paper, we quantitatively investigate this subject in a comprehensive manner with extensive experimental and simulation studies. The basic idea is to plot both the image quality and the imaging dose together as functions of the number of projections and the mAs per projection over the whole clinically relevant range. On this basis, a clear understanding of the tradeoff between image quality and imaging dose can be achieved, and optimal low-dose CBCT scan protocols can be developed that maximize the dose reduction while minimizing the image quality loss for various imaging tasks in image-guided radiation therapy (IGRT). The main findings of this work are: (1) under the CS-based reconstruction framework, image quality degrades little over a large range of dose variation. Degradation becomes evident when the imaging dose (approximated by the x-ray tube load) is decreased below 100 total mAs; an imaging dose lower than 40 total mAs leads to dramatic image degradation, and thus should be used cautiously. Optimal low-dose CBCT scan protocols likely fall in the range of 40-100 total mAs, depending on the specific IGRT application. (2) Among different scan protocols at a constant low-dose level, super sparse-view reconstruction with fewer than 50 projections is the most challenging case, even with strong regularization; better image quality can be acquired with low-mAs protocols. (3) The optimal scan protocol is the combination of a medium number of projections and a medium level of mAs per view. This is more evident when the dose is around 72.8 total mAs or below and when the ROI is a low-contrast or high-resolution object. Based on our results, the optimal number of projections is around 90 to 120. (4) The clinically acceptable lowest imaging dose level is task dependent. In our study, 72.8 total mAs is a safe dose level for visualizing low-contrast objects, while 12.2 total mAs is sufficient for detecting high-contrast objects of diameter greater than 3 mm.
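
    The tube-load bookkeeping behind these protocols is simply the product of projection count and mAs per view. The sketch below (protocol numbers are hypothetical, chosen to sit near the ~72 total mAs regime discussed above) shows three ways to spend the same total dose:

```python
# Total imaging dose proxy from the study: total mAs = projections x mAs/view.
def total_mas(n_projections, mas_per_view):
    return n_projections * mas_per_view

# Three hypothetical protocols spending roughly the same total tube load:
protocols = {
    "sparse-view": (45, 1.6),   # few projections, high mAs per view
    "medium":      (90, 0.8),   # the regime found optimal above
    "low-mAs":     (360, 0.2),  # many projections, low mAs per view
}
for name, (views, mas) in protocols.items():
    print(name, round(total_mas(views, mas), 3))   # each ≈ 72 total mAs
```

    Per finding (3), at a fixed total mAs the medium-projections/medium-mAs combination tends to preserve image quality best, while the sparse-view extreme is the hardest case to reconstruct.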

  6. Dose calculations at high altitudes and in deep space with GEANT4 using BIC and JQMD models for nucleus-nucleus reactions

    NASA Astrophysics Data System (ADS)

    Sihver, L.; Matthiä, D.; Koi, T.; Mancusi, D.

    2008-10-01

    Radiation exposure of aircrew is increasingly recognized as an occupational hazard. The ionizing environment at standard commercial aircraft flight altitudes consists mainly of secondary particles, of which the neutrons give a major contribution to the dose equivalent. Accurate estimations of neutron spectra in the atmosphere are therefore essential for correct calculations of aircrew doses. Energetic solar particle events (SPE) can also lead to significantly increased dose rates, especially on routes close to the North Pole, e.g. flights between Europe and the USA. It is also well known that the radiation environment encountered by personnel aboard low Earth orbit (LEO) spacecraft, or aboard a spacecraft traveling outside the Earth's protective magnetosphere, is much harsher than that within the atmosphere, since the personnel are exposed to radiation from both galactic cosmic rays (GCR) and SPE. The relative contribution to the dose from GCR when traveling outside the Earth's magnetosphere, e.g. to the Moon or Mars, is even greater, and reliable and accurate particle and heavy ion transport codes are essential to calculate the radiation risks for both aircrew and personnel on spacecraft. We have therefore performed calculations of neutron distributions in the atmosphere, and of total dose equivalents and quality factors at different depths in a water sphere in an imaginary spacecraft during solar minimum in a geosynchronous orbit. The calculations were performed with the GEANT4 Monte Carlo (MC) code using both the binary cascade (BIC) model, which is part of the standard GEANT4 package, and the JQMD model, which is used in the particle and heavy ion transport code PHITS.

  7. ARCHER-RT – A GPU-based and photon-electron coupled Monte Carlo dose computing engine for radiation therapy: Software development and application to helical tomotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Su, Lin; Du, Xining; Liu, Tianyu

    Purpose: Using graphics processing unit (GPU) hardware, an extremely fast Monte Carlo (MC) code, ARCHER-RT, is developed for radiation dose calculations in radiation therapy. This paper describes the detailed software development and testing for three clinical TomoTherapy® cases: prostate, lung, and head and neck. Methods: To obtain clinically relevant dose distributions, phase space files (PSFs) created from optimized radiation therapy treatment plan fluence maps were used as the input to ARCHER-RT. Patient-specific phantoms were constructed from patient CT images. Batch simulations were employed to facilitate the time-consuming task of loading large PSFs, and to improve the estimation of statistical uncertainty. Furthermore, two different Woodcock tracking algorithms were implemented and their relative performance was compared. The dose curves of an Elekta accelerator PSF incident on a homogeneous water phantom were benchmarked against DOSXYZnrc. For each of the treatment cases, dose volume histograms and isodose maps were produced from ARCHER-RT and the general-purpose code GEANT4. A gamma index analysis was performed to evaluate the similarity of voxel doses obtained from the two codes. The hardware accelerators used in this study are one NVIDIA K20 GPU, one NVIDIA K40 GPU, and six NVIDIA M2090 GPUs. In addition, to make a fairer comparison of CPU and GPU performance, a multithreaded CPU code was developed using OpenMP and tested on an Intel E5-2620 CPU. Results: For the water phantom, the depth dose curve and dose profiles from ARCHER-RT agree well with DOSXYZnrc. For the clinical cases, results from ARCHER-RT are compared with those from GEANT4 and good agreement is observed. The gamma index test is performed for voxels whose dose is greater than 10% of the maximum dose. For the 2%/2 mm criterion, the passing rates for the prostate, lung, and head and neck cases are 99.7%, 98.5%, and 97.2%, respectively.
    Due to the specific architecture of the GPU, the modified Woodcock tracking algorithm performed worse than the original one. ARCHER-RT achieves fast speeds for PSF-based dose calculations. With a single M2090 card, the simulations took about 60, 50, and 80 s for the three cases, respectively, with 1% statistical error in the PTV. Using the latest K40 card, the simulations are 1.7–1.8 times faster. More impressively, six M2090 cards could finish the simulations in 8.9–13.4 s. For comparison, the same simulations on the Intel E5-2620 (12 hyperthreads) took about 500–800 s. Conclusions: ARCHER-RT was developed successfully to perform fast and accurate MC dose calculation for radiotherapy using PSFs and patient CT phantoms.
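
    The gamma index analysis used here combines a dose-difference tolerance with a distance-to-agreement tolerance; a voxel passes when the minimum combined metric over the reference distribution is at most 1. A minimal 1-D sketch of the 2%/2 mm test (the profiles and normalization below are illustrative, not the paper's data):

```python
import math

# 1-D gamma index: for each evaluated point, search the reference profile for
# the best combination of dose difference and spatial offset.
def gamma_1d(x_eval, d_eval, x_ref, d_ref, dose_tol, dist_tol):
    gammas = []
    for xe, de in zip(x_eval, d_eval):
        g2 = min(((de - dr) / dose_tol) ** 2 + ((xe - xr) / dist_tol) ** 2
                 for xr, dr in zip(x_ref, d_ref))
        gammas.append(math.sqrt(g2))
    return gammas

x = [float(i) for i in range(11)]          # positions in mm
ref = [100.0 - 2 * i for i in range(11)]   # reference dose (arbitrary units)
ev = [d + 0.5 for d in ref]                # evaluated dose, uniformly 0.5 high

g = gamma_1d(x, ev, x, ref, dose_tol=2.0, dist_tol=2.0)  # 2% of 100, 2 mm
passing = sum(1 for v in g if v <= 1.0) / len(g)
print(passing)   # → 1.0: every point passes, each with gamma = 0.25
```

    A clinical passing rate such as the 99.7% quoted above is this fraction computed over all voxels above the 10% dose threshold.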

  8. A GPU-accelerated Monte Carlo dose calculation platform and its application toward validating an MRI-guided radiation therapy beam model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yuhe; Mazur, Thomas R.; Green, Olga

    Purpose: The clinical commissioning of IMRT subject to a magnetic field is challenging. The purpose of this work is to develop a GPU-accelerated Monte Carlo dose calculation platform based on PENELOPE and then use the platform to validate a vendor-provided MRIdian head model toward quality assurance of clinical IMRT treatment plans subject to a 0.35 T magnetic field. Methods: PENELOPE was first translated from FORTRAN to C++ and the result was confirmed to produce equivalent results to the original code. The C++ code was then adapted to CUDA in a workflow optimized for GPU architecture. The original code was expanded to include voxelized transport with Woodcock tracking, faster electron/positron propagation in a magnetic field, and several features that make gPENELOPE highly user-friendly. Moreover, the vendor-provided MRIdian head model was incorporated into the code in an effort to apply gPENELOPE as both an accurate and rapid dose validation system. A set of experimental measurements was performed on the MRIdian system to examine the accuracy of both the head model and gPENELOPE. Ultimately, gPENELOPE was applied toward independent validation of patient doses calculated by MRIdian’s KMC. Results: An acceleration factor of 152 was achieved in comparison to the original single-thread FORTRAN implementation, with the original accuracy being preserved. For 16 treatment plans including stomach (4), lung (2), liver (3), adrenal gland (2), pancreas (2), spleen (1), mediastinum (1), and breast (1), the MRIdian dose calculation engine agrees with gPENELOPE with a mean gamma passing rate of 99.1% ± 0.6% (2%/2 mm). Conclusions: A Monte Carlo simulation platform was developed based on a GPU-accelerated version of PENELOPE. This platform was used to validate that both the vendor-provided head model and the fast Monte Carlo engine used by the MRIdian system are accurate in modeling radiation transport in a patient using 2%/2 mm gamma criteria. Future applications of this platform will include dose validation and accumulation, IMRT optimization, and dosimetry system modeling for next generation MR-IGRT systems.

  9. A GPU-accelerated Monte Carlo dose calculation platform and its application toward validating an MRI-guided radiation therapy beam model

    PubMed Central

    Wang, Yuhe; Mazur, Thomas R.; Green, Olga; Hu, Yanle; Li, Hua; Rodriguez, Vivian; Wooten, H. Omar; Yang, Deshan; Zhao, Tianyu; Mutic, Sasa; Li, H. Harold

    2016-01-01

    Purpose: The clinical commissioning of IMRT subject to a magnetic field is challenging. The purpose of this work is to develop a GPU-accelerated Monte Carlo dose calculation platform based on PENELOPE and then use the platform to validate a vendor-provided MRIdian head model toward quality assurance of clinical IMRT treatment plans subject to a 0.35 T magnetic field. Methods: PENELOPE was first translated from FORTRAN to C++ and the result was confirmed to produce equivalent results to the original code. The C++ code was then adapted to CUDA in a workflow optimized for GPU architecture. The original code was expanded to include voxelized transport with Woodcock tracking, faster electron/positron propagation in a magnetic field, and several features that make gPENELOPE highly user-friendly. Moreover, the vendor-provided MRIdian head model was incorporated into the code in an effort to apply gPENELOPE as both an accurate and rapid dose validation system. A set of experimental measurements was performed on the MRIdian system to examine the accuracy of both the head model and gPENELOPE. Ultimately, gPENELOPE was applied toward independent validation of patient doses calculated by MRIdian’s KMC. Results: An acceleration factor of 152 was achieved in comparison to the original single-thread FORTRAN implementation, with the original accuracy being preserved. For 16 treatment plans including stomach (4), lung (2), liver (3), adrenal gland (2), pancreas (2), spleen (1), mediastinum (1), and breast (1), the MRIdian dose calculation engine agrees with gPENELOPE with a mean gamma passing rate of 99.1% ± 0.6% (2%/2 mm). Conclusions: A Monte Carlo simulation platform was developed based on a GPU-accelerated version of PENELOPE. This platform was used to validate that both the vendor-provided head model and the fast Monte Carlo engine used by the MRIdian system are accurate in modeling radiation transport in a patient using 2%/2 mm gamma criteria. Future applications of this platform will include dose validation and accumulation, IMRT optimization, and dosimetry system modeling for next generation MR-IGRT systems. PMID:27370123

  10. A GPU-accelerated Monte Carlo dose calculation platform and its application toward validating an MRI-guided radiation therapy beam model.

    PubMed

    Wang, Yuhe; Mazur, Thomas R; Green, Olga; Hu, Yanle; Li, Hua; Rodriguez, Vivian; Wooten, H Omar; Yang, Deshan; Zhao, Tianyu; Mutic, Sasa; Li, H Harold

    2016-07-01

    The clinical commissioning of IMRT subject to a magnetic field is challenging. The purpose of this work is to develop a GPU-accelerated Monte Carlo dose calculation platform based on PENELOPE and then use the platform to validate a vendor-provided MRIdian head model toward quality assurance of clinical IMRT treatment plans subject to a 0.35 T magnetic field. PENELOPE was first translated from FORTRAN to C++ and the result was confirmed to produce equivalent results to the original code. The C++ code was then adapted to CUDA in a workflow optimized for GPU architecture. The original code was expanded to include voxelized transport with Woodcock tracking, faster electron/positron propagation in a magnetic field, and several features that make gPENELOPE highly user-friendly. Moreover, the vendor-provided MRIdian head model was incorporated into the code in an effort to apply gPENELOPE as both an accurate and rapid dose validation system. A set of experimental measurements was performed on the MRIdian system to examine the accuracy of both the head model and gPENELOPE. Ultimately, gPENELOPE was applied toward independent validation of patient doses calculated by MRIdian's KMC. An acceleration factor of 152 was achieved in comparison to the original single-thread FORTRAN implementation, with the original accuracy being preserved. For 16 treatment plans including stomach (4), lung (2), liver (3), adrenal gland (2), pancreas (2), spleen (1), mediastinum (1), and breast (1), the MRIdian dose calculation engine agrees with gPENELOPE with a mean gamma passing rate of 99.1% ± 0.6% (2%/2 mm). A Monte Carlo simulation platform was developed based on a GPU-accelerated version of PENELOPE. This platform was used to validate that both the vendor-provided head model and the fast Monte Carlo engine used by the MRIdian system are accurate in modeling radiation transport in a patient using 2%/2 mm gamma criteria. Future applications of this platform will include dose validation and accumulation, IMRT optimization, and dosimetry system modeling for next generation MR-IGRT systems.

  11. SU-E-T-561: Monte Carlo-Based Organ Dose Reconstruction Using Pre-Contoured Human Model for Hodgkin's Lymphoma Patients Treated by Cobalt-60 External Beam Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jung, J; Pelletier, C; Lee, C

    Purpose: Organ doses for Hodgkin’s lymphoma patients treated with cobalt-60 radiation were estimated using an anthropomorphic model and Monte Carlo modeling. Methods: A cobalt-60 treatment unit modeled in the BEAMnrc Monte Carlo code was used to produce phase space data. The Monte Carlo simulation was verified against percent depth dose measurements in water at various field sizes. Radiation transport through the lung blocks was modeled by adjusting the weights of the phase space data. We imported a pre-contoured adult female hybrid model and generated a treatment plan. The adjusted phase space data and the human model were imported to the XVMC Monte Carlo code for dose calculation. The organ mean doses were estimated and dose volume histograms were plotted. Results: The percent depth dose agreement between measurement and calculation in the water phantom was within 2% for all field sizes. The mean organ doses of the heart, left breast, right breast, and spleen for the selected case were 44.3, 24.1, 14.6 and 3.4 Gy, respectively, with a midline prescription dose of 40.0 Gy. Conclusion: Organ doses were estimated for a patient group whose three-dimensional images are not available. This development may open the door to more accurate dose reconstruction and estimates of uncertainties in secondary cancer risk for Hodgkin’s lymphoma patients. This work was partially supported by the intramural research program of the National Institutes of Health, National Cancer Institute, Division of Cancer Epidemiology and Genetics.

  12. IMPROVEMENT OF EXPOSURE-DOSE MODELS: APPLICATION OF CONTINUOUS BREATH SAMPLING TO DETERMINE VOC DOSE AND BODY BURDEN

    EPA Science Inventory

    This is a continuation of an Internal Grant research project with the focus on completing the research due to initial funding delays and then analyzing and reporting the research results. This project will employ a new continuous breath sampling methodology to investigate dose a...

  13. Estimation of Effective Doses for Radiation Cancer Risks on ISS, Lunar, and Mars Missions with Space Radiation Measurement

    NASA Technical Reports Server (NTRS)

    Kim, M.Y.; Cucinotta, F.A.

    2005-01-01

    Radiation protection practices define the effective dose as a weighted sum of equivalent doses over the major sites for radiation cancer risks. Since a crew personnel dosimeter does not directly measure effective dose, it has been estimated with skin-dose measurements and radiation transport codes for ISS and STS missions. The Phantom Torso Experiment (PTE) of NASA's Operational Radiation Protection Program has provided actual flight measurements from active and passive dosimeters placed throughout the phantom on the 10-day STS-91 mission and on the ISS Increment 2 mission. For the PTE, the variation in organ doses, which results from absorption and from changes in radiation quality with tissue shielding, was accounted for by measuring doses at many tissue sites and at several critical body organs including the brain, colon, heart, stomach, thyroid, and skin. These measurements have been compared with organ dose calculations obtained from the transport models. Active TEPC measurements of lineal energy spectra at the surface of the PTE also provided a direct comparison of galactic cosmic ray (GCR) and trapped proton dose and dose equivalent. It is shown that orienting the phantom body as it actually was on ISS is needed for direct comparison of the transport models to the ISS data. One of the most important observations for organ dose equivalent and effective dose estimates on ISS is the fractional contribution from trapped protons and GCR. We show that for most organs over 80% is from GCR. Improved estimates of effective doses for radiation cancer risks will be made with the resulting tissue weighting factors and the modified codes.
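
    The weighted-sum definition used above can be made concrete as E = Σ_T w_T H_T. In the sketch below the weights follow the ICRP 103 scheme (listed for illustration only; the mission analyses rely on NASA's own risk models), and the organ equivalent doses are invented numbers:

```python
# Effective dose as a weighted sum of organ equivalent doses, E = sum w_T * H_T.
# Weights below follow the ICRP 103 scheme (illustrative); H_T values are made up.
tissue_weights = {          # w_T (dimensionless; sums to 1.0 over all tissues)
    "lung": 0.12, "stomach": 0.12, "colon": 0.12,
    "red_bone_marrow": 0.12, "breast": 0.12, "remainder": 0.12,
    "gonads": 0.08,
    "bladder": 0.04, "liver": 0.04, "oesophagus": 0.04, "thyroid": 0.04,
    "skin": 0.01, "bone_surface": 0.01, "brain": 0.01, "salivary_glands": 0.01,
}

def effective_dose(equivalent_doses_msv):
    """Weighted sum of organ equivalent doses (mSv); missing organs count as 0."""
    return sum(w * equivalent_doses_msv.get(t, 0.0)
               for t, w in tissue_weights.items())

# Hypothetical equivalent doses (mSv) for a few organs:
h_t = {"lung": 2.0, "stomach": 1.5, "thyroid": 1.0, "skin": 3.0}
print(round(effective_dose(h_t), 3))   # → 0.49
```

    Because the weights sum to 1, a uniform whole-body equivalent dose of H mSv yields an effective dose of exactly H mSv.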

  14. PATIENT RADIATION DOSE FROM CHEST X-RAY EXAMINATIONS IN THE WEST BANK-PALESTINE.

    PubMed

    Lahham, Adnan; Issa, Ahlam; ALMasri, Hussein

    2018-02-01

    Radiation doses to patients resulting from chest X-ray examinations were evaluated in four medical centers in the West Bank and East Jerusalem, Palestine. Absorbed organ and effective doses were calculated for a total of 428 adult male and female patients by using the commercially available Monte Carlo based software packages CALDOSE-X5 and PCXMC-2.0 with hermaphrodite mathematical adult phantoms. Patients were selected randomly from medical records in the period from November 2014 to February 2015. A database of surveyed patients and exposure factors was established, including: patient's height, weight, age, gender, X-ray tube voltage, tube current-time product (mAs), examination projection (anterior-posterior (AP), posterior-anterior (PA), lateral), X-ray tube filtration thickness for each X-ray unit, anode angle, focus-to-skin distance and X-ray beam size. The average absorbed doses in the whole body were 0.06, 0.07 and 0.11 mGy for the AP, PA and lateral projections, respectively. The average effective dose for all surveyed patients was 0.14 mSv across all chest X-ray examinations and projections in the four investigated medical centers. The effect of projection geometry was also investigated: the average effective doses for the AP, PA and lateral projections were 0.14, 0.07 and 0.22 mSv, respectively. The collective effective dose estimated for the exposed population was ~60 man-mSv.
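
    The collective-dose figure follows directly from the cohort size and the mean effective dose; a one-line arithmetic check:

```python
# Collective effective dose = number of exposed patients x mean effective dose.
n_patients = 428
mean_effective_dose_msv = 0.14
collective_man_msv = n_patients * mean_effective_dose_msv
print(round(collective_man_msv, 1))   # → 59.9, consistent with the reported ~60 man-mSv
```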

  15. The grout/glass performance assessment code system (GPACS) with verification and benchmarking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piepho, M.G.; Sutherland, W.H.; Rittmann, P.D.

    1994-12-01

    GPACS is a computer code system for calculating water flow (unsaturated or saturated), solute transport, and human doses due to the slow release of contaminants from a waste form (in particular grout or glass) through an engineered system and through a vadose zone to an aquifer, well and river. This dual-purpose document is intended to serve as a user's guide and verification/benchmark document for the Grout/Glass Performance Assessment Code system (GPACS). GPACS can be used for low-level-waste (LLW) glass performance assessment and many other applications, including other low-level-waste performance assessments and risk assessments. Based on all the cases presented, GPACS is adequate (verified) for calculating water flow and contaminant transport in unsaturated-zone sediments and for calculating human doses via the groundwater pathway.

  16. Simulations of neutron transport at low energy: a comparison between GEANT and MCNP.

    PubMed

    Colonna, N; Altieri, S

    2002-06-01

    The use of the simulation tool GEANT for neutron transport at energies below 20 MeV is discussed, in particular with regard to shielding and dose calculations. The reliability of the GEANT/MICAP package for neutron transport has been verified by comparing the results of simulations performed with this package over a wide energy range with the predictions of MCNP-4B, a code commonly used for neutron transport at low energy. A reasonable agreement between the results of the two codes is found for the neutron flux through a slab of material (iron and ordinary concrete), as well as for the dose released in soft tissue by neutrons. These results justify the use of the GEANT/MICAP code for neutron transport in a wide range of applications, including health physics problems.

  17. Inverse determination of the penalty parameter in penalized weighted least-squares algorithm for noise reduction of low-dose CBCT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jing; Guan, Huaiqun; Solberg, Timothy

    2011-07-15

    Purpose: A statistical projection restoration algorithm based on the penalized weighted least-squares (PWLS) criterion can substantially improve the image quality of low-dose CBCT images. The performance of PWLS is largely dependent on the choice of the penalty parameter. Previously, the penalty parameter was chosen empirically by trial and error. In this work, the authors developed an inverse technique to calculate the penalty parameter in PWLS for noise suppression of low-dose CBCT in image guided radiotherapy (IGRT). Methods: In IGRT, a daily CBCT is acquired for the same patient during a treatment course. In this work, the authors acquired the CBCT with a high-mAs protocol for the first session and then a lower-mAs protocol for the subsequent sessions. The high-mAs projections served as the goal (ideal) toward which the low-mAs projections were to be smoothed by minimizing the PWLS objective function. The penalty parameter was determined through an inverse calculation of the derivative of the objective function incorporating both the high- and low-mAs projections. The parameter thus obtained can then be used in PWLS to smooth the noise in low-dose projections. CBCT projections for a CatPhan 600 and an anthropomorphic head phantom, as well as for a brain patient, were used to evaluate the performance of the proposed technique. Results: The penalty parameter in PWLS was obtained for each CBCT projection using the proposed strategy. The noise in the low-dose CBCT images reconstructed from the smoothed projections was greatly suppressed. Image quality in PWLS-processed low-dose CBCT was comparable to that of the corresponding high-dose CBCT. Conclusions: A technique was proposed to estimate the penalty parameter for the PWLS algorithm. It provides an objective and efficient way to obtain the penalty parameter for image restoration algorithms that require predefined smoothing parameters.
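
    The PWLS objective referred to above weights a data-fidelity term by the projection statistics and adds a roughness penalty scaled by the penalty parameter. Below is a minimal 1-D sketch of that idea; the quadratic penalty, the gradient-descent solver, and all numbers are assumptions for illustration, and the paper's inverse estimation of the penalty parameter is not reproduced here:

```python
# Sketch of PWLS smoothing of one 1-D projection profile, minimizing
#   Phi(x) = sum_i w_i (y_i - x_i)^2 + beta * sum_i (x_{i+1} - x_i)^2
# by plain gradient descent; beta is the penalty parameter.
def pwls_smooth(y, w, beta, n_iter=500, step=0.01):
    x = list(y)
    for _ in range(n_iter):
        grad = [2.0 * w[i] * (x[i] - y[i]) for i in range(len(x))]
        for i in range(len(x) - 1):          # roughness-penalty gradient
            d = 2.0 * beta * (x[i] - x[i + 1])
            grad[i] += d
            grad[i + 1] -= d
        x = [x[i] - step * grad[i] for i in range(len(x))]
    return x

noisy = [1.0, 1.4, 0.8, 1.2, 0.9, 1.1]    # hypothetical noisy line integrals
weights = [1.0] * len(noisy)              # statistical weights, e.g. 1/variance
smooth = pwls_smooth(noisy, weights, beta=5.0)
print([round(v, 3) for v in smooth])      # smoother profile, same total signal
```

    Larger beta smooths more aggressively; the paper's contribution is choosing beta objectively, per projection, by matching against the high-mAs data rather than by trial and error.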

  18. Radionuclide production and dose rate estimation during the commissioning of the W-Ta spallation target

    NASA Astrophysics Data System (ADS)

    Yu, Q. Z.; Liang, T. J.

    2018-06-01

    The China Spallation Neutron Source (CSNS) is intended to begin operation in 2018. CSNS is an accelerator-based multidisciplinary user facility. The pulsed neutrons are produced by a 1.6 GeV short-pulsed proton beam impinging on a W-Ta spallation target, at a beam power of 100 kW and a repetition rate of 25 Hz. Twenty neutron beam lines are extracted for neutron scattering and neutron irradiation research. During commissioning and maintenance scenarios, the gamma rays induced in the W-Ta target can pose a dose threat to personnel and the environment. In this paper, the gamma dose rate distributions for the W-Ta spallation target are calculated, based on the engineering model of the target-moderator-reflector system. The shipping cask is analyzed to satisfy the dose rate limit of less than 2 mSv/h at its surface. All calculations are performed with the Monte Carlo code MCNPX2.5 and the activation code CINDER’90.

  19. Monte Carlo simulation of electron beams from an accelerator head using PENELOPE.

    PubMed

    Sempau, J; Sánchez-Reyes, A; Salvat, F; ben Tahar, H O; Jiang, S B; Fernández-Varea, J M

    2001-04-01

    The Monte Carlo code PENELOPE has been used to simulate electron beams from a Siemens Mevatron KDS linac with nominal energies of 6, 12 and 18 MeV. Owing to its accuracy, which stems from that of the underlying physical interaction models, PENELOPE is suitable for simulating problems of interest to the medical physics community. It includes a geometry package that allows the definition of complex quadric geometries, such as those of irradiation instruments, in a straightforward manner. Dose distributions in water simulated with PENELOPE agree well with experimental measurements using a silicon detector and a monitoring ionization chamber. Insertion of a lead slab in the incident beam at the surface of the water phantom produces sharp variations in the dose distributions, which are correctly reproduced by the simulation code. Results from PENELOPE are also compared with those of equivalent simulations with the EGS4-based user codes BEAM and DOSXYZ. Angular and energy distributions of electrons and photons in the phase-space plane (at the downstream end of the applicator) obtained from both simulation codes are similar, although significant differences do appear in some cases. These differences, however, are shown to have a negligible effect on the calculated dose distributions. Various practical aspects of the simulations, such as the calculation of statistical uncertainties and the effect of the 'latent' variance in the phase-space file, are discussed in detail.

  20. Best practice in the management of clinical coding services: Insights from a project in the Republic of Ireland, Part 2.

    PubMed

    Reid, Beth A; Ridoutt, Lee; O'Connor, Paul; Murphy, Deirdre

    2017-09-01

    This is the second of two articles about best practice in the management of coding services. The best practice project was part of a year-long project conducted in the Republic of Ireland to review the quality of the Hospital Inpatient Enquiry data for its use in activity-based funding. The four methods used to address the best practice aspect of the project were described in detail in Part 1. The results included in this article are those relating to the coding manager's background, preparation and style, clinical coder (CC) workforce adequacy, the CC workforce structure and career pathway, and the physical and psychological work environment for the clinical coding service. Examples of best practice were found in the study hospitals but there were also areas for improvement. Coding managers would benefit from greater support in the form of increased opportunities for management training and a better method for calculating CC workforce numbers. A career pathway is needed for CCs to progress from entry to expert CC, mentor, manager and quality controller. Most hospitals could benefit from investment in infrastructure that places CCs in a physical environment that tells them they are an important part of the hospital and their work is valued.

  1. 42 CFR 137.368 - Is the Secretary responsible for oversight and compliance of health and safety codes during...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... compliance of health and safety codes during construction projects being performed by a Self-Governance Tribe... SERVICES TRIBAL SELF-GOVERNANCE Construction Roles of the Secretary in Establishing and Implementing Construction Project Agreements § 137.368 Is the Secretary responsible for oversight and compliance of health...

  2. Fred: a GPU-accelerated fast-Monte Carlo code for rapid treatment plan recalculation in ion beam therapy

    NASA Astrophysics Data System (ADS)

    Schiavi, A.; Senzacqua, M.; Pioli, S.; Mairani, A.; Magro, G.; Molinelli, S.; Ciocca, M.; Battistoni, G.; Patera, V.

    2017-09-01

    Ion beam therapy is a rapidly growing technique for tumor radiation therapy. Ions allow for a high dose deposition in the tumor region, while sparing the surrounding healthy tissue. For this reason, the highest possible accuracy in the calculation of dose and its spatial distribution is required in treatment planning. On one hand, commonly used treatment planning software solutions adopt a simplified beam-body interaction model by remapping pre-calculated dose distributions onto a 3D water-equivalent representation of the patient morphology. On the other hand, Monte Carlo (MC) simulations, which explicitly take into account all the details of the interaction of particles with human tissues, are considered the most reliable tool to address the complexity of mixed-field irradiation in a heterogeneous environment. However, full MC calculations are not routinely used in clinical practice because they typically demand substantial computational resources. Therefore MC simulations are usually only used to check treatment plans for a restricted number of difficult cases. The advent of general-purpose programmable GPU cards prompted the development of trimmed-down MC-based dose engines which can significantly reduce the time needed to recalculate a treatment plan with respect to standard MC codes on CPU hardware. In this work, we report on the development of Fred, a new MC simulation platform for treatment planning in ion beam therapy. The code can transport particles through a 3D voxel grid using a class II MC algorithm. Both primary and secondary particles are tracked, and their energy deposition is scored along the trajectory. Effective models for particle-medium interaction have been implemented, balancing accuracy in dose deposition with computational cost. Currently, the most refined module is the transport of proton beams in water: single pencil beam dose-depth distributions obtained with Fred agree with those produced by standard MC codes within 1-2% of the Bragg peak in the therapeutic energy range. A comparison with measurements taken at the CNAO treatment center shows that the lateral dose tails are reproduced within 2% in the field size factor test up to 20 cm. The tracing kernel can run on GPU hardware, achieving 10 million primaries per second on a single card. This performance allows one to recalculate a proton treatment plan, with 1% of the total particles, in just a few minutes.

  3. Estimation of the influence of radical effect in the proton beams using a combined approach with physical data and gel data

    NASA Astrophysics Data System (ADS)

    Haneda, K.

    2016-04-01

    The purpose of this study was to estimate the impact of the radical effect in proton beams using a combined approach with physical data and gel data. The study used two dosimeters: ionization chambers and polymer gel dosimeters. Polymer gel dosimeters have specific advantages over other dosimeters: they measure a chemical reaction, and at the same time they serve as a phantom that can map dose in three dimensions continuously and easily. First, a depth-dose curve for a 210 MeV proton beam was measured using an ionization chamber and a gel dosimeter. Second, the spatial distribution of the physical dose was calculated with the Monte Carlo code system PHITS; to verify the accuracy of the Monte Carlo calculation, the results were compared with the experimental data from the ionization chamber. Last, the rate of the radical effect relative to the physical dose was evaluated. The simulation results were compared with the measured depth-dose distribution and showed good agreement. The spatial distribution of the gel dose with a threshold LET value for the proton beam was calculated with the same simulation code. The relative distribution of the radical effect was then calculated at each depth as the quotient of the relative doses obtained from the physical and gel data. The agreement between the relative distributions of the gel dosimeter and the radical effect was good for the proton beams.
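The depth-by-depth quotient described in the abstract is simple arithmetic and can be sketched as follows. The depth-dose values here are hypothetical placeholders (each curve normalized to its own maximum), not the study's measured data.

```python
import numpy as np

# Hypothetical relative depth-dose values: "physical" from the
# ionization chamber / PHITS calculation, "gel" from the polymer gel
# dosimeter readout, each normalized to its own maximum.
depth = np.array([0.0, 5.0, 10.0, 15.0])       # depth in water (cm)
physical = np.array([0.40, 0.50, 0.70, 1.00])  # relative physical dose
gel = np.array([0.40, 0.48, 0.63, 0.80])       # relative gel dose

# Relative radical effect at each depth: quotient of the gel and
# physical relative doses, as described in the abstract.
radical_effect = gel / physical
```

A quotient below 1 near the Bragg peak would indicate a gel response falling behind the physical dose, which is the LET-dependent effect the combined approach is designed to expose.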

  4. Comparative Dosimetric Estimates of a 25 keV Electron Micro-beam with three Monte Carlo Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mainardi, Enrico; Donahue, Richard J.; Blakely, Eleanor A.

    2002-09-11

    The calculations presented compare the performance of three Monte Carlo codes, PENELOPE-1999, MCNP-4C and PITS, for the evaluation of dose profiles from a 25 keV electron micro-beam traversing individual cells. The overall model of a cell is a water cylinder equivalent for the three codes, but with a different internal scoring geometry: hollow cylinders for PENELOPE and MCNP, whereas spheres are used for the PITS code. A cylindrical cell geometry with scoring volumes shaped as hollow cylinders was initially selected for PENELOPE and MCNP because it better represents the actual shape and dimensions of a cell and improves computer-time efficiency compared to spherical internal volumes. Some of the transfer points and energy transfers that constitute a radiation track may actually fall in the space between spheres, i.e. outside the spherical scoring volumes. This internal geometry, along with the PENELOPE algorithm, drastically reduced the computer time when using this code compared with event-by-event Monte Carlo codes like PITS. This preliminary work has been important to address dosimetric estimates at low electron energies. It demonstrates that codes like PENELOPE can be used for dose evaluation even with such small geometries and low energies, which are far below the normal use for which the code was created. Further work (initiated in Summer 2002) is still needed, however, to create a user-code for PENELOPE that allows uniform comparison of exact cell geometries, integral volumes and also microdosimetric scoring quantities, a field where track-structure codes like PITS, written for this purpose, are believed to be superior.

  5. Design Analysis of SNS Target Station Biological Shielding Monolith with Proton Power Uprate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bekar, Kursat B.; Ibrahim, Ahmad M.

    2017-05-01

    This report documents the analysis of the dose rate in the experiment area outside the Spallation Neutron Source (SNS) target station shielding monolith with a proton beam energy of 1.3 GeV. The analysis implemented a coupled three-dimensional (3D)/two-dimensional (2D) approach that used both the Monte Carlo N-Particle Extended (MCNPX) 3D Monte Carlo code and the Discrete Ordinates Transport (DORT) 2D deterministic code. The analysis with a proton beam energy of 1.3 GeV showed that the dose rate in continuously occupied areas on the lateral surface outside the SNS target station shielding monolith is less than 0.25 mrem/h, which complies with the SNS facility design objective. However, the methods and codes used in this analysis are out of date and unsupported, and the 2D approximation of the target shielding monolith does not accurately represent the geometry. We recommend that this analysis be updated with modern codes and libraries such as ADVANTG or SHIFT. These codes have demonstrated very high efficiency in performing full 3D radiation shielding analyses of similar and even more difficult problems.

  6. Program/Project Management Resources: A collection of 50 bibliographies focusing on continual improvement, reinventing government, and successful project management

    NASA Technical Reports Server (NTRS)

    Michaels, Jeffrey

    1994-01-01

    These Program/Project Management Resource Lists were originally written for the NASA project management community. Their purpose was to promote the use of the NASA Headquarters Library Program/Project Management Collection, funded by NASA Headquarters Code FT, Training & Development Division, by offering introductions to the management topics studied by today's managers. Lists were also written at the request of NASA Headquarters Code T, Office of Continual Improvement, and at the request of NASA members of the National Performance Review. This is the second edition of the compilation of these bibliographies; the first edition was printed in March 1994.

  7. Development of Monte Carlo based real-time treatment planning system with fast calculation algorithm for boron neutron capture therapy.

    PubMed

    Takada, Kenta; Kumada, Hiroaki; Liem, Peng Hong; Sakurai, Hideyuki; Sakae, Takeji

    2016-12-01

    We simulated the effect of patient displacement on organ doses in boron neutron capture therapy (BNCT). In addition, we developed a faster calculation algorithm (NCT high-speed) to simulate irradiation more efficiently. We simulated dose evaluation for the standard irradiation position (reference position) using a head phantom. Cases were assumed where the patient body is shifted in the lateral directions compared to the reference position, as well as in the direction away from the irradiation aperture. Flux distributions for three neutron groups (thermal, epithermal, and fast) were calculated using NCT high-speed with a voxelized homogeneous phantom. The three groups of neutron fluxes were also calculated for the same conditions with a Monte Carlo code, and the results were compared. In the evaluation of body movements, there were no significant differences even with shifts of up to 9 mm in the lateral directions. However, the dose decreased by about 10% with shifts of 9 mm in the direction away from the irradiation aperture. When comparing both calculations in the phantom surface region up to 3 cm, the maximum differences between the fluxes calculated by NCT high-speed and those calculated by the Monte Carlo code for thermal and epithermal neutrons were 10% and 18%, respectively. The time required for the NCT high-speed code was about one-tenth of that for the Monte Carlo calculation. In the evaluation, the longitudinal displacement has a considerable effect on the organ doses. We also achieved faster calculation of the depth distribution of thermal neutron flux using the NCT high-speed calculation code. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  8. Methodology comparison for gamma-heating calculations in material-testing reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lemaire, M.; Vaglio-Gaudard, C.; Lyoussi, A.

    2015-07-01

    The Jules Horowitz Reactor (JHR) is a Material-Testing Reactor (MTR) under construction in the south of France at CEA Cadarache (French Alternative Energies and Atomic Energy Commission). It will typically host about 20 simultaneous irradiation experiments in the core and in the beryllium reflector. These experiments will help us better understand the complex phenomena occurring during the accelerated ageing of materials and the irradiation of nuclear fuels. Gamma heating, i.e. photon energy deposition, is mainly responsible for temperature rise in non-fuelled zones of nuclear reactors, including JHR internal structures and irradiation devices. As temperature is a key parameter for physical models describing the behavior of materials, accurate control of temperature, and hence gamma heating, is required in irradiation devices and samples in order to perform a suitable advanced analysis of future experimental results. From a broader point of view, JHR's global attractiveness as an MTR depends on its ability to monitor experimental parameters with high accuracy, including gamma heating. Strict control of temperature levels is also necessary in terms of safety. As JHR structures are warmed up by gamma heating, they must be appropriately cooled down to prevent creep deformation or melting. Cooling-power sizing is based on calculated levels of gamma heating in the JHR. Due to these safety concerns, accurate calculation of gamma heating, with well-controlled bias and an associated uncertainty as low as possible, is all the more important. There are two main kinds of calculation bias: bias coming from nuclear data on the one hand, and bias coming from physical approximations assumed by computer codes and by the general calculation route on the other hand. The former must be determined by comparison between calculation and experimental data; the latter by calculation comparisons between codes and between methodologies. 
In this presentation, we focus on the latter kind of bias. Nuclear heating is represented by the physical quantity called absorbed dose (energy deposition induced by particle-matter interactions, divided by mass). Its calculation with Monte Carlo codes is possible but computationally expensive, as it requires transport simulation of charged particles along with neutrons and photons. For that reason, the calculation of another physical quantity, called KERMA, is often preferred, as KERMA calculation with Monte Carlo codes only requires transport of neutral particles. However, KERMA is only an estimator of the absorbed dose, and many conditions must be fulfilled for KERMA to be equal to absorbed dose, including the so-called condition of electronic equilibrium. Also, Monte Carlo computations of absorbed dose still present some physical approximations, even though there is only a limited number of them. Some of these approximations are linked to the way Monte Carlo codes handle the transport simulation of charged particles and the productive and destructive interactions between photons, electrons and positrons. There exists a wide variety of electromagnetic shower models which tackle this topic. Differences in the implementation of these models can lead to discrepancies in calculated values of absorbed dose between different Monte Carlo codes. The order of magnitude of such potential discrepancies should be quantified for JHR gamma-heating calculations. We consequently present a two-pronged plan. In a first phase, we intend to perform compared absorbed dose / KERMA Monte Carlo calculations in the JHR. This way, we will study the presence or absence of electronic equilibrium in the different JHR structures and experimental devices, and we will give recommendations for the choice of KERMA or absorbed dose when calculating gamma heating in the JHR. 
In a second phase, we intend to perform compared TRIPOLI4 / MCNP absorbed dose calculations in a simplified JHR-representative geometry. For this comparison, we will use the same nuclear data library for both codes (the European library JEFF3.1.1 and the photon library EPDL97) so as to isolate the effects of electromagnetic shower models on absorbed dose calculation. This way, we hope to get insightful feedback on these models and their implementation in Monte Carlo codes. (authors)
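The KERMA estimator discussed in the abstract can be written down in a few lines: in a multigroup picture, KERMA is the sum over groups of fluence times energy times the mass energy-transfer coefficient, and it equals absorbed dose only under electronic equilibrium. The sketch below is illustrative; the two-group spectrum and coefficients are made-up numbers, not JHR data.

```python
import numpy as np

def kerma_mev_per_g(fluence, energy, mu_tr_over_rho):
    """Multigroup photon KERMA estimate:
        K = sum_i  phi_i * E_i * (mu_tr / rho)_i
    with group fluences phi in 1/cm^2, energies E in MeV and mass
    energy-transfer coefficients in cm^2/g, so K comes out in MeV/g.
    K equals the absorbed dose only where charged-particle
    (electronic) equilibrium holds -- the condition the study sets
    out to verify region by region."""
    return float(np.sum(np.asarray(fluence) * np.asarray(energy)
                        * np.asarray(mu_tr_over_rho)))

# Illustrative two-group spectrum (hypothetical values):
K = kerma_mev_per_g(fluence=[1.0e10, 5.0e9],
                    energy=[1.0, 2.0],
                    mu_tr_over_rho=[0.031, 0.026])
```

The computational appeal is visible here: no charged-particle transport enters the estimate, only the neutral-particle fluence, which is why KERMA tallies are so much cheaper than absorbed-dose tallies.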

  9. Hanford Environmental Dose Reconstruction Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McMakin, A.H.; Cannon, S.D.; Finch, S.M.

    1992-07-01

    The objective of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation doses that individuals and populations could have received from nuclear operations at Hanford since 1944. The Technical Steering Panel (TSP) consists of experts in environmental pathways, epidemiology, surface-water transport, ground-water transport, statistics, demography, agriculture, meteorology, nuclear engineering, radiation dosimetry, and cultural anthropology. Included are appointed technical members representing the states of Oregon, Washington, and Idaho, a representative of Native American tribes, and an individual representing the public. The project is divided into the following technical tasks, which correspond to the path radionuclides followed from release to impact on humans (dose estimates): source terms; environmental transport; environmental monitoring data; demography, food consumption, and agriculture; and environmental pathways and dose estimates. Progress is discussed.

  10. Hanford Environmental Dose Reconstruction Project. Monthly report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McMakin, A.H.; Cannon, S.D.; Finch, S.M.

    1992-07-01

    The objective of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation doses that individuals and populations could have received from nuclear operations at Hanford since 1944. The Technical Steering Panel (TSP) consists of experts in environmental pathways, epidemiology, surface-water transport, ground-water transport, statistics, demography, agriculture, meteorology, nuclear engineering, radiation dosimetry, and cultural anthropology. Included are appointed technical members representing the states of Oregon, Washington, and Idaho, a representative of Native American tribes, and an individual representing the public. The project is divided into the following technical tasks, which correspond to the path radionuclides followed from release to impact on humans (dose estimates): source terms; environmental transport; environmental monitoring data; demography, food consumption, and agriculture; and environmental pathways and dose estimates. Progress is discussed.

  11. Hanford Internal Dosimetry Project manual. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carbaugh, E.H.; Bihl, D.E.; MacLellan, J.A.

    1994-07-01

    This document describes the Hanford Internal Dosimetry Project, as it is administered by Pacific Northwest Laboratory (PNL) in support of the US Department of Energy and its Hanford contractors. Project services include administering the bioassay monitoring program; evaluating and documenting assessments of potential intakes and internal dose; ensuring that analytical laboratories conform to requirements; selecting and applying appropriate models and procedures for evaluating radionuclide deposition and the resulting dose; and technically guiding and supporting Hanford contractors in matters regarding internal dosimetry. Specific chapters deal with the following subjects: practices of the project, including interpretation of applicable DOE Orders, regulations, and guidance into criteria for assessment, documentation, and reporting of doses; assessment of internal dose, including summary explanations of when and how assessments are performed; recording and reporting practices for internal dose; selection of workers for bioassay monitoring and establishment of the type and frequency of bioassay measurements; capability and scheduling of bioassay monitoring services; recommended dosimetry response to potential internal exposure incidents; and quality control and quality assurance provisions of the program.

  12. A preliminary Monte Carlo study for the treatment head of a carbon-ion radiotherapy facility using TOPAS

    NASA Astrophysics Data System (ADS)

    Liu, Hongdong; Zhang, Lian; Chen, Zhi; Liu, Xinguo; Dai, Zhongying; Li, Qiang; Xu, Xie George

    2017-09-01

    In medical physics it is desirable to have a Monte Carlo code that is less complex and reliable, yet flexible for dose verification, optimization, and component design. TOPAS is a newly developed Monte Carlo simulation tool which combines the extensive radiation physics libraries available in the Geant4 code with easy-to-use geometry and support for visualization. Although TOPAS has been widely tested and verified in simulations of proton therapy, there has been no reported application for carbon ion therapy. To evaluate the feasibility and accuracy of TOPAS simulations for carbon ion therapy, a licensed TOPAS code (version 3_0_p1) was used to carry out a dosimetric study of therapeutic carbon ions. Depth dose profiles based on different physics models were obtained and compared with measurements. It is found that the G4QMD model is at least as accurate as the TOPAS default BIC physics model for carbon ions, but when the energy is increased to relatively high levels such as 400 MeV/u, the G4QMD model shows preferable performance. Also, simulations of special components used in the treatment head at the Institute of Modern Physics facility were conducted to investigate the spread-out Bragg peak (SOBP) dose distribution in water. The physical dose in water of the SOBP was found to be consistent with the design aim of the 6 cm ridge filter.
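The ridge-filter principle behind the SOBP mentioned above is a superposition: range-shifted pristine Bragg peaks are summed with weights that increase toward the distal edge. The toy sketch below illustrates only that superposition; the peak shape, ranges, and weights are arbitrary stand-ins, not a physical beam model or the Institute of Modern Physics filter design.

```python
import numpy as np

def toy_bragg_curve(depth, r, width=0.4):
    """Toy pristine Bragg curve: a flat entrance plateau plus a
    Gaussian peak at range r (cm).  Purely illustrative."""
    return 0.3 * (depth < r) + np.exp(-0.5 * ((depth - r) / width) ** 2)

# Superpose range-shifted peaks with weights growing toward the
# distal edge (arbitrary values, chosen only to show the principle).
depth = np.linspace(0.0, 20.0, 401)
ranges = np.linspace(10.0, 16.0, 13)
weights = np.linspace(0.4, 1.0, 13)
sobp = sum(w * toy_bragg_curve(depth, r) for w, r in zip(weights, ranges))
```

In a real filter design, the weights are optimized so the summed curve is flat across the target depth interval and falls off sharply beyond the deepest range.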

  13. Monte Carlo Shielding Comparative Analysis Applied to TRIGA HEU and LEU Spent Fuel Transport

    NASA Astrophysics Data System (ADS)

    Margeanu, C. A.; Margeanu, S.; Barbos, D.; Iorgulis, C.

    2010-12-01

    The paper is a comparative study of the effects of LEU and HEU fuel utilization on the shielding analysis for spent fuel transport. A comparison against the measured data for HEU spent fuel, available from the last stage of spent fuel repatriation fulfilled in the summer of 2008, is also presented. All geometrical and material data for the shipping cask were considered according to the approved NAC-LWT cask model. The shielding analysis estimates radiation doses at the shipping cask wall surface, and in air at 1 m and 2 m, respectively, from the cask, by means of the 3D Monte Carlo MORSE-SGC code. Before loading into the shipping cask, TRIGA spent fuel source terms and spent fuel parameters were obtained by means of the ORIGEN-S code. Both codes are included in ORNL's SCALE 5 program package. The actinide contribution to total fuel radioactivity is very low in the HEU spent fuel case, becoming 10 times greater in the LEU spent fuel case. Dose rates for both HEU and LEU fuel contents are below regulatory limits, with LEU spent fuel photon dose rates being greater than the HEU ones. Comparison between HEU spent fuel theoretical and measured dose rates at selected measuring points shows good agreement, with calculated values being greater than the measured ones both at the cask wall surface (about 34% relative difference) and in air at 1 m distance from the cask surface (about 15% relative difference).

  14. Lithographically encoded polymer microtaggant using high-capacity and error-correctable QR code for anti-counterfeiting of drugs.

    PubMed

    Han, Sangkwon; Bae, Hyung Jong; Kim, Junhoi; Shin, Sunghwan; Choi, Sung-Eun; Lee, Sung Hoon; Kwon, Sunghoon; Park, Wook

    2012-11-20

    A QR-coded microtaggant for the anti-counterfeiting of drugs is proposed that can provide high capacity and error-correction capability. It is fabricated lithographically in a microfluidic channel with special consideration of the island patterns in the QR Code. The microtaggant is incorporated in the drug capsule ("on-dose authentication") and can be read by a simple smartphone QR Code reader application when removed from the capsule and washed free of drug. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. SU-F-19A-10: Recalculation and Reporting Clinical HDR 192-Ir Head and Neck Dose Distributions Using Model Based Dose Calculation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlsson Tedgren, A; Persson, M; Nilsson, J

    Purpose: To retrospectively re-calculate dose distributions for selected head and neck cancer patients, earlier treated with HDR 192Ir brachytherapy, using Monte Carlo (MC) simulations and compare results to distributions from the planning system derived using the TG43 formalism. To study differences between dose to medium (as obtained with the MC code) and dose to water in medium as obtained through (1) ratios of stopping powers and (2) ratios of mass energy absorption coefficients between water and medium. Methods: The MC code Algebra was used to calculate dose distributions according to earlier actual treatment plans using anonymized plan data and CT images in DICOM format. Ratios of stopping powers and mass energy absorption coefficients for water with various media, obtained from 192-Ir spectra, were used in toggling between dose to water and dose to media. Results: Differences between the initial planned TG43 dose distributions and the doses to media calculated by MC are insignificant in the target volume. Differences are moderate (within 4–5% at distances of 3–4 cm) but increase with distance and are most notable in bone and at the patient surface. Differences between dose to water and dose to medium are within 1–2% when using mass energy absorption coefficients to toggle between the two quantities, but increase to above 10% for bone using stopping power ratios. Conclusion: MC predicts target doses for head and neck cancer patients in close agreement with TG43. MC yields improved dose estimations outside the target, where a larger fraction of the dose is from scattered photons. Awareness and clear reporting of which absorbed dose quantity is used are important when applying model based algorithms. Differences in bone media can exceed 10% depending on how dose to water in medium is defined.
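The "toggling" between dose quantities described in this abstract is a single multiplication by a spectrum-averaged water/medium ratio. The sketch below shows that conversion; the 0.96 ratio and 2.00 Gy value are hypothetical illustrations, not tabulated coefficients or values from the study.

```python
def dose_to_water_in_medium(dose_to_medium, water_over_medium_ratio):
    """Convert dose-to-medium into dose-to-water-in-medium by
    multiplying with a water/medium ratio averaged over the local
    192-Ir spectrum: a ratio of mass energy absorption coefficients
    (photon-dominated picture) or of stopping powers (secondary
    electron picture), per the two definitions in the abstract."""
    return dose_to_medium * water_over_medium_ratio

# Hypothetical numbers: 2.00 Gy scored in bone, with an illustrative
# water/bone coefficient ratio of 0.96 (not a tabulated value).
d_w = dose_to_water_in_medium(2.00, 0.96)
```

The abstract's point is precisely that the two choices of ratio disagree, by more than 10% in bone, so reporting which conversion was applied matters as much as the number itself.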

  16. Solutions for Digital Video Transmission Technology Final Report CRADA No. TC02068.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, A. T.; Rivers, W.

    This Project aimed at the development of software for seismic data processing based on the Geotool code developed by the American company Multimax, Inc. The Geotool code was written in the early 1990s for the UNIX platform. Under Project #2821, functions of the old Geotool code were transferred into a commercial version for the Microsoft XP and Vista platforms, with the addition of new capabilities for visualization and data processing. The new version, Geotool+, was implemented using the up-to-date tool Microsoft Visual Studio 2005 and uses capabilities of the .NET platform. C++ was selected as the main programming language for the Geotool+. The two-year Project was extended by six months and the funding level increased from $600,000 to $670,000. All tasks were successfully completed and all deliverables were met for the project, even though both the industrial partner and the LLNL principal investigator left the project before its final report.

  17. Total Ionizing Dose Test of Microsemi's Silicon Switching Transistors JANTXV2N2222AUB and 2N2907AUB

    NASA Technical Reports Server (NTRS)

    Campola, M.; Freeman, B.; Yau, K.

    2017-01-01

    Microsemi's silicon switching transistors, JANTXV2N2222AUB and 2N2907AUB, were tested for total ionizing dose (TID) response beginning on July 11, 2016. This test served as the radiation lot acceptance test (RLAT) for the lot date code (LDC) tested. Low dose rate (LDR) irradiations were performed in this test so that the device susceptibility to enhanced low dose rate sensitivity (ELDRS) could be determined.

  18. Turbine Internal and Film Cooling Modeling For 3D Navier-Stokes Codes

    NASA Technical Reports Server (NTRS)

    DeWitt, Kenneth; Garg, Vijay; Ameri, Ali

    2005-01-01

    The aim of this research project is to make use of NASA Glenn on-site computational facilities in order to develop, validate and apply aerodynamic, heat transfer, and turbine cooling models for use in advanced 3D Navier-Stokes Computational Fluid Dynamics (CFD) codes such as the Glenn-HT code. Specific areas of effort include: application of the Glenn-HT code to specific configurations made available under the Turbine Based Combined Cycle (TBCC) and Ultra Efficient Engine Technology (UEET) projects, and validating the use of a multi-block code for the time-accurate computation of the detailed flow and heat transfer of cooled turbine airfoils. The goal of the current research is to improve the predictive ability of the Glenn-HT code. This will enable one to design more efficient turbine components for both aviation and power generation. The models will be tested against specific configurations provided by NASA Glenn.

  19. SU-E-T-424: Dosimetric Verification of Modulated Electron Radiation Therapy Delivered Using An Electron Specific Multileaf Collimator for Treatment of Scalp Cases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eldib, A; Al-Azhar University Cairo; Jin, L

    2014-06-01

    Purpose: Modulated electron radiotherapy (MERT) has the potential to achieve better treatment outcomes for shallow tumors such as those of the breast and scalp. In a separate study of scalp lesions, MERT was compared to volumetric modulated arc therapy; our results showed a reduction in the dose reaching the brain with MERT. However, dose calculation accuracy and delivery efficiency challenges remain. Thus, in the current study we proceed to add more cases to demonstrate MERT's beneficial outcome and its delivery accuracy using an electron-specific multileaf collimator (eMLC). Methods: We used the MCBEAM code for treatment head simulation and for generating phase space files to be used as the radiation source input for our Monte Carlo based treatment planning system (MC TPS). The MCPLAN code is used for calculation of patient-specific dose deposition coefficients and for final MERT plan dose calculation. An in-house developed optimization code is used for the optimization process. MERT plans were generated for real patients and a head and neck phantom. Film was used for dosimetric verification. The film was cut following the contour of the curved phantom surface and then sealed with black masking tape. In the measurement, the sealed film packet was sandwiched between two adjacent slabs of the head and neck phantom. The measured 2D dose distribution was then compared with calculations. Results: The eMLC allows effective treatment of scalp cases with multiple lesions spread around the patient's head, which were usually difficult to plan or very time consuming with conventional applicators. MERT continues to show better reduction in the brain dose. The dosimetric measurements showed slight discrepancies, which were attributed to the film setup. Conclusion: MERT can improve treatment plan quality for patients with scalp cancers. Our in-house MC TPS is capable of performing treatment planning and accurate dose calculation for MERT using the eMLC.

  20. Helium ions at the heidelberg ion beam therapy center: comparisons between FLUKA Monte Carlo code predictions and dosimetric measurements

    NASA Astrophysics Data System (ADS)

    Tessonnier, T.; Mairani, A.; Brons, S.; Sala, P.; Cerutti, F.; Ferrari, A.; Haberer, T.; Debus, J.; Parodi, K.

    2017-08-01

    In the field of particle therapy helium ion beams could offer an alternative for radiotherapy treatments, owing to their interesting physical and biological properties intermediate between protons and carbon ions. We present in this work the comparisons and validations of the Monte Carlo FLUKA code against in-depth dosimetric measurements acquired at the Heidelberg Ion Beam Therapy Center (HIT). Depth dose distributions in water with and without ripple filter, lateral profiles at different depths in water and a spread-out Bragg peak were investigated. After experimentally-driven tuning of the less known initial beam characteristics in vacuum (beam lateral size and momentum spread) and simulation parameters (water ionization potential), comparisons of depth dose distributions were performed between simulations and measurements, which showed overall good agreement with range differences below 0.1 mm and dose-weighted average dose-differences below 2.3% throughout the entire energy range. Comparisons of lateral dose profiles showed differences in full-width-half-maximum lower than 0.7 mm. Measurements of the spread-out Bragg peak indicated differences with simulations below 1% in the high dose regions and 3% in all other regions, with a range difference less than 0.5 mm. Despite the promising results, some discrepancies between simulations and measurements were observed, particularly at high energies. These differences were attributed to an underestimation of dose contributions from secondary particles at large angles, as seen in a triple Gaussian parametrization of the lateral profiles along the depth. However, the results allowed us to validate FLUKA simulations against measurements, confirming its suitability for 4He ion beam modeling in preparation of clinical establishment at HIT. 
Future activities building on this work will include treatment plan comparisons using validated biological models between proton and helium ions, either within a Monte Carlo treatment planning engine based on the same FLUKA code, or an independent analytical planning system fed with a validated database of inputs calculated with FLUKA.
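The triple-Gaussian parametrization of lateral profiles mentioned in the abstract can be sketched directly: a narrow core for primaries plus two wider components carrying the dose from secondaries scattered to large angles. The weights and sigmas below are illustrative placeholders, not fitted HIT data.

```python
import math

def triple_gaussian(r, weights, sigmas):
    """Lateral dose profile as a weighted sum of three Gaussians of
    the off-axis distance r: a narrow primary core plus two wider
    halo components from large-angle secondaries."""
    return sum(w * math.exp(-0.5 * (r / s) ** 2)
               for w, s in zip(weights, sigmas))

w, s = (1.0, 0.05, 0.01), (0.4, 1.5, 4.0)   # illustrative weights, sigmas (cm)
core_value = triple_gaussian(0.0, w, s)     # on-axis dose
tail_value = triple_gaussian(5.0, w, s)     # far off-axis: halo-dominated
```

Far from the axis the core Gaussian is negligible and the wide components dominate, which is exactly where the abstract reports the underestimated secondary-particle contribution.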

  1. Brachytherapy dosimetry of 125I and 103Pd sources using an updated cross section library for the MCNP Monte Carlo transport code.

    PubMed

    Bohm, Tim D; DeLuca, Paul M; DeWerd, Larry A

    2003-04-01

    Permanent implantation of low energy (20-40 keV) photon emitting radioactive seeds to treat prostate cancer is an important treatment option for patients. In order to produce accurate implant brachytherapy treatment plans, the dosimetry of a single source must be well characterized. Monte Carlo based transport calculations can be used for source characterization, but must have up to date cross section libraries to produce accurate dosimetry results. This work benchmarks the MCNP code and its photon cross section library for low energy photon brachytherapy applications. In particular, we calculate the emitted photon spectrum, air kerma, depth dose in water, and radial dose function for both 125I and 103Pd based seeds and compare to other published results. Our results show that MCNP's cross section library differs from recent data primarily in the photoelectric cross section for low energies and low atomic number materials. In water, differences as large as 10% in the photoelectric cross section and 6% in the total cross section occur at 125I and 103Pd photon energies. This leads to differences in the dose rate constant of 3% and 5%, and differences as large as 18% and 20% in the radial dose function for the 125I and 103Pd based seeds, respectively. Using a partially updated photon library, calculations of the dose rate constant and radial dose function agree with other published results. Further, the use of the updated photon library allows us to verify air kerma and depth dose in water calculations performed using MCNP's perturbation feature to simulate updated cross sections. We conclude that in order to most effectively use MCNP for low energy photon brachytherapy applications, we must update its cross section library. Following this update, the MCNP code system will be a very effective tool for low energy photon brachytherapy dosimetry applications.
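The radial dose function whose sensitivity to cross sections the abstract quantifies is defined by the TG-43 formalism; in the point-source approximation it is the depth-dose with the inverse-square fall-off divided out, normalized at the 1 cm reference distance. The sketch below states that definition; the dose values are hypothetical, not data from this study.

```python
def g_point(dose_r, r, dose_r0, r0=1.0):
    """TG-43 radial dose function, point-source approximation:
        g(r) = [D(r) / G(r)] / [D(r0) / G(r0)],  with G(r) = 1 / r**2,
    i.e. the dose fall-off with inverse square divided out,
    normalized at the reference distance r0 = 1 cm."""
    return (dose_r * r ** 2) / (dose_r0 * r0 ** 2)

# Hypothetical dose values for a low-energy seed: the rapid
# attenuation of 20-40 keV photons in water shows up as g(r) < 1.
g_at_3cm = g_point(dose_r=0.08, r=3.0, dose_r0=1.0)
```

Because g(r) divides out geometry, it isolates attenuation and scatter, which is why errors in the low-energy photoelectric cross section propagate so visibly (up to 18-20% here) into this quantity.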

  2. Helium ions at the Heidelberg Ion Beam Therapy Center: comparisons between FLUKA Monte Carlo code predictions and dosimetric measurements.

    PubMed

    Tessonnier, T; Mairani, A; Brons, S; Sala, P; Cerutti, F; Ferrari, A; Haberer, T; Debus, J; Parodi, K

    2017-08-01

    In the field of particle therapy, helium ion beams could offer an alternative for radiotherapy treatments, owing to their interesting physical and biological properties intermediate between those of protons and carbon ions. We present in this work the comparisons and validations of the Monte Carlo FLUKA code against in-depth dosimetric measurements acquired at the Heidelberg Ion Beam Therapy Center (HIT). Depth dose distributions in water with and without ripple filter, lateral profiles at different depths in water, and a spread-out Bragg peak were investigated. After experimentally driven tuning of the less known initial beam characteristics in vacuum (beam lateral size and momentum spread) and simulation parameters (water ionization potential), comparisons of depth dose distributions were performed between simulations and measurements, which showed overall good agreement, with range differences below 0.1 mm and dose-weighted average dose differences below 2.3% throughout the entire energy range. Comparisons of lateral dose profiles showed differences in full-width-half-maximum lower than 0.7 mm. Measurements of the spread-out Bragg peak indicated differences with simulations below 1% in the high dose regions and 3% in all other regions, with a range difference of less than 0.5 mm. Despite the promising results, some discrepancies between simulations and measurements were observed, particularly at high energies. These differences were attributed to an underestimation of dose contributions from secondary particles at large angles, as seen in a triple Gaussian parametrization of the lateral profiles along the depth. However, the results allowed us to validate FLUKA simulations against measurements, confirming its suitability for 4He ion beam modeling in preparation for clinical establishment at HIT.
Future activities building on this work will include treatment plan comparisons using validated biological models between proton and helium ions, either within a Monte Carlo treatment planning engine based on the same FLUKA code, or an independent analytical planning system fed with a validated database of inputs calculated with FLUKA.
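The triple Gaussian parametrization mentioned above models a lateral dose profile as a narrow primary-beam core plus two wider components for large-angle secondaries. A minimal sketch, with component weights and widths chosen for illustration rather than taken from HIT beam data:

```python
import math

# Triple-Gaussian lateral dose profile, a minimal sketch. The narrow core
# carries the primary beam; the two wider components approximate dose from
# secondary particles scattered to large angles. Parameters are illustrative.

def triple_gaussian(x_mm, weights, sigmas_mm):
    """Sum of three normalized Gaussians with fractional weights."""
    return sum(w / (s * math.sqrt(2 * math.pi)) * math.exp(-x_mm**2 / (2 * s**2))
               for w, s in zip(weights, sigmas_mm))

def fwhm(profile, weights, sigmas, step=0.001):
    """Numerically find the full width at half maximum of a symmetric peak."""
    peak = profile(0.0, weights, sigmas)
    x = 0.0
    while profile(x, weights, sigmas) > peak / 2:
        x += step
    return 2 * x

w = (0.90, 0.08, 0.02)   # core, first halo, second halo fractions (assumed)
s = (3.0, 8.0, 20.0)     # component sigmas in mm (assumed)
width = fwhm(triple_gaussian, w, s)
```

Because the halo components are broad and low, the FWHM is dominated by the core; underestimating the halo weights mostly affects the low-dose tails rather than the FWHM, which matches the paper's observation that FWHM agreed well even where large-angle dose was underestimated.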

  3. QUANTUM ESPRESSO: a modular and open-source software project for quantum simulations of materials.

    PubMed

    Giannozzi, Paolo; Baroni, Stefano; Bonini, Nicola; Calandra, Matteo; Car, Roberto; Cavazzoni, Carlo; Ceresoli, Davide; Chiarotti, Guido L; Cococcioni, Matteo; Dabo, Ismaila; Dal Corso, Andrea; de Gironcoli, Stefano; Fabris, Stefano; Fratesi, Guido; Gebauer, Ralph; Gerstmann, Uwe; Gougoussis, Christos; Kokalj, Anton; Lazzeri, Michele; Martin-Samos, Layla; Marzari, Nicola; Mauri, Francesco; Mazzarello, Riccardo; Paolini, Stefano; Pasquarello, Alfredo; Paulatto, Lorenzo; Sbraccia, Carlo; Scandolo, Sandro; Sclauzero, Gabriele; Seitsonen, Ari P; Smogunov, Alexander; Umari, Paolo; Wentzcovitch, Renata M

    2009-09-30

    QUANTUM ESPRESSO is an integrated suite of computer codes for electronic-structure calculations and materials modeling, based on density-functional theory, plane waves, and pseudopotentials (norm-conserving, ultrasoft, and projector-augmented wave). The acronym ESPRESSO stands for opEn Source Package for Research in Electronic Structure, Simulation, and Optimization. It is freely available to researchers around the world under the terms of the GNU General Public License. QUANTUM ESPRESSO builds upon newly-restructured electronic-structure codes that have been developed and tested by some of the original authors of novel electronic-structure algorithms and applied in the last twenty years by some of the leading materials modeling groups worldwide. Innovation and efficiency are still its main focus, with special attention paid to massively parallel architectures, and a great effort being devoted to user friendliness. QUANTUM ESPRESSO is evolving towards a distribution of independent and interoperable codes in the spirit of an open-source project, where researchers active in the field of electronic-structure calculations are encouraged to participate in the project by contributing their own codes or by implementing their own ideas into existing codes.

  4. [Absorbed doses to critical organs from full mouth dental radiography].

    PubMed

    Zhang, G; Yasuhiko, O; Hidegiko, Y

    1999-01-01

    A few studies have been reported in China on the radiological risk of dental radiography. The aim of this study was to evaluate the absorbed doses to patients from full mouth radiographs, and to determine the contribution of each projection to the total absorbed dose of the organs. Absorbed doses to critical organs were measured for 14-film complete dental radiography. The organs studied included the pituitary, eye lenses, parotid glands, submandibular glands, sublingual glands, thyroid, breasts, ovaries, testes, and the skin at the center of each projection field. A radiation analog dosimetry (RANDO) phantom with thermoluminescent dosimeters (TLD-200) was used for the study. All of the exposure parameters were fixed. The total filtration was 2 mm Al equivalent. The column collimator was 6 cm in diameter and 20 cm in length. The absorbed doses of the organs were measured three times in each projection of the full-mouth series (FMS) exposures. The absorbed dose to the lenses in the FMS (249 microGy) in the present study was much lower (about 10%) than the dose (2,630 microGy) reported in 1976. The absorbed doses of the other organs in the present study were: thyroid gland (125 microGy), pituitary gland (112 microGy), parotid gland (153 microGy), submandibular gland (629 microGy), sublingual gland (1,900 microGy), and breast (12 microGy). The doses to the ovary and testis were too small for further analysis. All of the results show that the radiation risk to patients from intraoral radiography has been reduced significantly. In the pituitary, half of the dose comes from the two maxillary molar projections. For the lenses, the largest contribution of radiation (60%) comes from the ipsilateral maxillary molar and premolar projections. In the parotid gland, up to 57% of the dose comes from the contralateral maxillary molar, premolar, and canine projections.
It can be estimated that about 90% of the absorbed dose could be avoided in the FMS if the column collimator is 20 cm long and the filter is 2.0 mm thick. If a 10-film complete mouth series is used instead of the 14-film series, a further 20% of the dose would be avoided.

  5. Analysis of a Distributed Pulse Power System Using a Circuit Analysis Code

    DTIC Science & Technology

    1979-06-01

    A sophisticated computer code (SCEPTRE), used to analyze electronic circuits, was used to evaluate the performance of a large flash X-ray machine. The calculated dose rate was then integrated to give a number that could be compared with measurements made using thermoluminescent dosimeters (TLDs).

  6. Final Radiological Assessment of External Exposure for CLEAR-Line Americium Recovery Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Adam C.; Belooussova, Olga N.; Hetrick, Lucas Duane

    2014-11-12

    Los Alamos National Laboratory is currently planning to implement an americium recovery program. The americium, ordinarily isotopically pure 241Am, would be extracted from existing Pu materials, converted to an oxide, and shipped to support fabrication of americium oxide-beryllium neutron sources. These operations would occur in the currently proposed Chloride Extraction and Actinide Recovery (CLEAR) line of glove boxes. This glove box line would be collocated with the currently-operational Experimental Chloride Extraction Line (EXCEL). The focus of this document is to provide an in-depth assessment of the currently planned radiation protection measures and to determine whether or not further design work is required to satisfy design-goal and ALARA requirements. Further, this document presents a history of americium recovery operations in the Department of Energy and high-level descriptions of the CLEAR line operations to provide a basis of comparison. Under the working assumptions adopted by this study, it was found that the evaluated design appears to mitigate doses to a level that satisfies the ALARA-in-design requirements of 10 CFR 835 as implemented by the Los Alamos National Laboratory procedure P121. The analyses indicate that extremity doses would also meet design requirements. Dose-rate calculations were performed using the radiation transport code MCNP5, and doses were estimated using a time-motion study developed in concert with the subject matter expert. A copy of this report and all supporting documentation are located on the Radiological Engineering server at Y:\\Rad Engineering\\2013 PROJECTS\\TA-55 Clear Line.
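The time-motion approach described (transport-calculated dose rates combined with task durations) reduces to a weighted sum. A minimal sketch; the task names, dose rates, and durations are hypothetical, not values from the assessment:

```python
# Time-motion dose estimate, a minimal sketch: total dose is the sum over
# tasks of (dose rate at the worker location) x (time spent there).
# All task names and numbers below are hypothetical, not from this report.

tasks = [
    # (task, dose rate in mrem/h at the work location, hours per year)
    ("load feed material",  5.0, 40.0),
    ("chloride extraction", 2.0, 120.0),
    ("oxide packaging",     8.0, 25.0),
]

annual_dose_mrem = sum(rate * hours for _, rate, hours in tasks)
```

In practice each dose rate would come from an MCNP5 tally at the corresponding worker position, and the result would be compared against the ALARA design goal.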

  7. A spatially encoded dose difference maximal intensity projection map for patient dose evaluation: A new first line patient quality assurance tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu Weigang; Graff, Pierre; Boettger, Thomas

    2011-04-15

    Purpose: To develop a spatially encoded dose difference maximal intensity projection (DD-MIP) as an online patient dose evaluation tool for visualizing the dose differences between the planning dose and the dose on the treatment day. Methods: Megavoltage cone-beam CT (MVCBCT) images acquired on the treatment day are used for generating the dose difference index. Each index is represented by a different color for the underdose, acceptable, and overdose regions. A maximal intensity projection (MIP) algorithm is developed to compress all the information of an arbitrary 3D dose difference index into a 2D DD-MIP image. In such an algorithm, a distance transformation is generated based on the planning CT. Then, two new volumes representing the overdose and underdose regions of the dose difference index are encoded with the distance transformation map. The distance-encoded indices of each volume are normalized using the skin distance obtained on the planning CT. After that, two MIPs are generated based on the underdose and overdose volumes with green-to-blue and green-to-red lookup tables, respectively. Finally, the two MIPs are merged with an appropriate transparency level and rendered in planning CT images. Results: The spatially encoded DD-MIP was implemented in a dose-guided radiotherapy prototype and tested on 33 MVCBCT images from six patients. The user can easily establish the thresholds for overdose and underdose. A 3% difference between the treatment and planning dose was used as the threshold in the study; hence, the DD-MIP shows red or blue, respectively, where the treatment dose exceeds or falls below the planning dose by more than 3%. With such a method, the overdose and underdose regions can be visualized and distinguished without being overshadowed by superficial dose differences.
Conclusions: A DD-MIP algorithm was developed that compresses information from 3D into a single projection or two orthogonal projections while indicating to the user whether the dose difference is on the skin surface or deeper.
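A much-simplified version of the DD-MIP pipeline can be sketched as follows: threshold the 3D dose-difference index at the ±3% level, weight each voxel by a depth value standing in for the distance transformation, and collapse along one axis with a maximum-intensity projection. The arrays and the flat depth encoding are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

# Simplified DD-MIP sketch (illustrative, not the authors' implementation).
# dd: relative dose difference (treatment - planning) / planning, per voxel.
rng = np.random.default_rng(0)
dd = rng.normal(0.0, 0.02, size=(8, 16, 16))   # mostly within +/-2%
dd[4, 5:8, 5:8] = 0.10                          # synthetic overdose region

threshold = 0.03
over = np.where(dd > threshold, dd, 0.0)        # overdose index
under = np.where(dd < -threshold, -dd, 0.0)     # underdose index

# Stand-in for the distance transformation: normalized depth along axis 0,
# so deep differences project more brightly than superficial ones.
depth = np.linspace(0.1, 1.0, dd.shape[0])[:, None, None]

over_mip = (over * depth).max(axis=0)           # red channel in the paper
under_mip = (under * depth).max(axis=0)         # blue channel in the paper
```

The depth weighting is the point of the method: without it, a shallow setup artifact and a deep target miss would project identically.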

  8. Final report for project "Effects of Low-Dose Irradiation on NFkB Signaling Networks and Mitochondria"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woloschak, Gayle E; Grdina, David; Li, Jian-Jian

    Low dose ionizing radiation effects are difficult to study in human populations because of numerous confounding factors such as genetic and lifestyle differences. Research in mammalian model systems and in vitro is generally used in order to overcome this difficulty. In this program project, three projects joined together to investigate the effects of low doses of ionizing radiation: doses at and below 10 cGy of low linear energy transfer ionizing radiation such as X rays and gamma rays. This project was focused on cellular signaling associated with nuclear factor kappa B (NFkB) and mitochondria, the subcellular organelles critical for cell aging and the aging-like changes induced by ionizing radiation. In addition to cells in culture, this project utilized animal tissues accumulated in a radiation biology tissue archive housed at Northwestern University (http://janus.northwestern.edu/janus2/index.php). The major thrust of Project 1 was to gather all of the DOE-sponsored irradiated animal (mouse, rat, and dog) data and tissues under one roof and investigate mitochondrial DNA changes and microRNA changes in these samples. Through comparison of different samples, we sought to delineate mitochondrial DNA quantity alterations and microRNA expression differences associated with different doses and dose rates of radiation. Historic animal irradiation experiments sponsored by DOE were done in several national laboratories and universities between the 1950s and 1990s; as these experiments were closed, data and tissues were released to Project 1. Project 2 used cells in culture to investigate the effects that low doses of radiation have on NFκB and its target genes manganese superoxide dismutase (MnSOD) and genes involved in the cell cycle: cyclins (B1 and D1) and cyclin-dependent kinases (CDKs).
Project 3 used cells in culture, such as "normal" human cells (the breast epithelial cell line MCF10A and skin keratinocyte HK18 cells) and mouse embryo fibroblast (MEF) cells, to focus on the role of the NFkB protein and several other proteins, such as survivin (BIRC5), in the radiation-dependent regulation of tumor necrosis factor alpha (TNFα) and its downstream signaling.

  9. Radiological Studies for the LCLS Beam Abort System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santana Leitner, M.; Vollaire, J.; Mao, X.S.

    2008-03-25

    The Linac Coherent Light Source (LCLS), a pioneering hard x-ray free electron laser, is currently under construction at the Stanford Linear Accelerator Center. It is expected that by 2009 LCLS will deliver laser pulses of unprecedented brightness and short length, which will be used in several forefront research applications. This ambitious project encompasses major radiation protection design challenges, such as the numerous loss sources and the large number of surveyed objects. In order to sort these out, the showers from various loss sources have been tracked along a detailed model covering 1/2 mile of the LCLS accelerator by means of the Monte Carlo intranuclear cascade codes FLUKA and MARS15. This article covers the FLUKA studies of heat load, prompt and residual dose, and environmental impact for the LCLS beam abort system.

  10. Albany v. 3.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salinger, Andrew; Phipps, Eric; Ostien, Jakob

    2016-01-13

    The Albany code is a general-purpose finite element code for solving partial differential equations (PDEs). Albany is a research code that demonstrates how a PDE code can be built by interfacing many of the open-source software libraries that are released under Sandia's Trilinos project. Part of the mission of Albany is to be a testbed for new Trilinos libraries, to refine their methods, usability, and interfaces. Albany includes hooks to optimization and uncertainty quantification algorithms, including those in Trilinos as well as those in the Dakota toolkit. Because of this, Albany is a desirable starting point for new code development efforts that wish to make heavy use of Trilinos. Albany is both a framework and the host for specific finite element applications. These applications have project names and can be enabled by configuration options when the code is compiled, but are all developed and released as part of the single Albany code base. These include the LCM, QCAD, FELIX, Aeras, and ATO applications.

  11. Calculation of out-of-field dose distribution in carbon-ion radiotherapy by Monte Carlo simulation.

    PubMed

    Yonai, Shunsuke; Matsufuji, Naruhiro; Namba, Masao

    2012-08-01

    Recent radiotherapy technologies including carbon-ion radiotherapy can improve the dose concentration in the target volume, thereby not only reducing side effects in organs at risk but also the secondary cancer risk within or near the irradiation field. However, secondary cancer risk in the low-dose region is considered to be non-negligible, especially for younger patients. To achieve a dose estimation of the whole body of each patient receiving carbon-ion radiotherapy, which is essential for risk assessment and epidemiological studies, Monte Carlo simulation plays an important role because the treatment planning system can provide dose distribution only in/near the irradiation field and the measured data are limited. However, validation of Monte Carlo simulations is necessary. The primary purpose of this study was to establish a calculation method using the Monte Carlo code to estimate the dose and quality factor in the body and to validate the proposed method by comparison with experimental data. Furthermore, we show the distributions of dose equivalent in a phantom and identify the partial contribution of each radiation type. We proposed a calculation method based on a Monte Carlo simulation using the PHITS code to estimate absorbed dose, dose equivalent, and dose-averaged quality factor by using the Q(L)-L relationship based on the ICRP 60 recommendation. The values obtained by this method in modeling the passive beam line at the Heavy-Ion Medical Accelerator in Chiba were compared with our previously measured data. It was shown that our calculation model can estimate the measured value within a factor of 2, which included not only the uncertainty of this calculation method but also those regarding the assumptions of the geometrical modeling and the PHITS code. Also, we showed the differences in the doses and the partial contributions of each radiation type between passive and active carbon-ion beams using this calculation method.
These results indicated that it is essential to include the dose from secondary neutrons in the assessment of the secondary cancer risk of patients receiving carbon-ion radiotherapy with active as well as passive beams. We established a calculation method with a Monte Carlo simulation to estimate the distribution of dose equivalent in the body as a first step toward routine risk assessment and an epidemiological study of carbon-ion radiotherapy at NIRS. This method has the advantage of being verifiable by measurement.
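The Q(L)-L relationship used above is the piecewise quality factor of ICRP Publication 60, a function of the unrestricted linear energy transfer L in water. A direct transcription of the published definition, with a trivial single-LET dose-equivalent helper:

```python
import math

# Q(L) relationship from ICRP Publication 60, with L the unrestricted
# linear energy transfer in water in keV/um.
def quality_factor(L):
    if L < 10.0:
        return 1.0
    if L <= 100.0:
        return 0.32 * L - 2.2
    return 300.0 / math.sqrt(L)

# Dose equivalent (Sv) from absorbed dose (Gy) for a single-LET component;
# a full calculation integrates Q(L) over the dose distribution in L.
def dose_equivalent(absorbed_dose_gy, L):
    return absorbed_dose_gy * quality_factor(L)
```

In a transport code such as PHITS, the dose-averaged quality factor comes from folding this Q(L) with the tallied dose distribution in LET rather than from a single L value.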

  12. Optimized image acquisition for breast tomosynthesis in projection and reconstruction space.

    PubMed

    Chawla, Amarpreet S; Lo, Joseph Y; Baker, Jay A; Samei, Ehsan

    2009-11-01

    Breast tomosynthesis has been an exciting new development in the field of breast imaging. While the diagnostic improvement via tomosynthesis is notable, the full potential of tomosynthesis has not yet been realized. This may be attributed to the dependency of the diagnostic quality of tomosynthesis on multiple variables, each of which needs to be optimized. Those include dose, number of angular projections, and the total angular span of those projections. In this study, the authors investigated the effects of these acquisition parameters on the overall diagnostic image quality of breast tomosynthesis in both the projection and reconstruction space. Five mastectomy specimens were imaged using a prototype tomosynthesis system. 25 angular projections of each specimen were acquired at 6.2 times typical single-view clinical dose level. Images at lower dose levels were then simulated using a noise modification routine. Each projection image was supplemented with 84 simulated 3 mm 3D lesions embedded at the center of 84 nonoverlapping ROIs. The projection images were then reconstructed using a filtered backprojection algorithm at different combinations of acquisition parameters to investigate which of the many possible combinations maximizes the performance. Performance was evaluated in terms of a Laguerre-Gauss channelized Hotelling observer model-based measure of lesion detectability. The analysis was also performed without reconstruction by combining the model results from projection images using Bayesian decision fusion algorithm. The effect of acquisition parameters on projection images and reconstructed slices were then compared to derive an optimization rule for tomosynthesis. The results indicated that projection images yield comparable but higher performance than reconstructed images. Both modes, however, offered similar trends: Performance improved with an increase in the total acquisition dose level and the angular span. 
Using a constant dose level and angular span, the performance rolled off beyond a certain number of projections, indicating that simply increasing the number of projections in tomosynthesis may not necessarily improve its performance. The best performance for both projection images and tomosynthesis slices was obtained with 15-17 projections spanning an angular arc of approximately 45 degrees, the maximum tested in our study, and an acquisition dose equal to that of single-view mammography. The optimization framework developed in this study is applicable to other reconstruction techniques and other multiprojection systems.
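The Laguerre-Gauss channelized Hotelling observer used as the figure of merit projects each image onto a small set of rotationally symmetric channels and computes a Hotelling detectability index in channel space. A minimal sketch on synthetic white-noise images; the channel count, Gaussian width, lesion model, and image statistics are illustrative assumptions, not the study's data:

```python
import numpy as np
from numpy.polynomial.laguerre import lagval

rng = np.random.default_rng(1)
n, a, n_ch = 32, 10.0, 5            # image size, channel width, channel count

y, x = np.mgrid[:n, :n] - (n - 1) / 2
r2 = x**2 + y**2

# Laguerre-Gauss channels: u_j(r) ~ exp(-pi r^2 / a^2) L_j(2 pi r^2 / a^2)
channels = np.stack([
    np.exp(-np.pi * r2 / a**2) * lagval(2 * np.pi * r2 / a**2,
                                        np.eye(n_ch)[j])
    for j in range(n_ch)
]).reshape(n_ch, -1)                # shape (n_ch, n*n)

signal = 2.0 * np.exp(-r2 / (2 * 2.0**2)).ravel()   # Gaussian "lesion"

# Synthetic white-noise backgrounds, with and without the signal
bg = rng.normal(0.0, 1.0, size=(500, n * n))
v_absent = bg @ channels.T                     # channel outputs (500, n_ch)
v_present = (bg + signal) @ channels.T

dmu = v_present.mean(0) - v_absent.mean(0)
S = 0.5 * (np.cov(v_present.T) + np.cov(v_absent.T))
snr = float(np.sqrt(dmu @ np.linalg.solve(S, dmu)))   # detectability index
```

The same index, computed per acquisition setting, is the kind of scalar that can be compared across dose levels, projection counts, and angular spans.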

  13. 78 FR 54635 - Notice of Intent To Prepare an Environmental Impact Statement for EA-18G Growler Airfield...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-05

    ... the potential environmental effects associated with the introduction of two additional EA-18G Growler... CONTACT: EA-18G EIS Project Manager (Code EV21/ SS); Naval Facilities Engineering Command (NAVFAC... request should be submitted to: EA-18G EIS Project Manager (Code EV21/SS); Naval Facilities Engineering...

  14. 77 FR 60687 - Record of Decision for the U.S. Marine Corps Basewide Water Infrastructure Project at Marine...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-04

    ... Water Infrastructure Project at Marine Corps Base Camp Pendleton, California AGENCY: Department of the... Environmental Policy Act (NEPA) of 1969, 42 United States Code (U.S.C.) Section 4332(2)(c), the regulations of the Council on Environmental Quality for Implementing the Procedural Provisions of NEPA (40 Code of...

  15. Hanford Environmental Dose Reconstruction Project monthly report, November 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cannon, S.D.; Finch, S.M.

    1992-12-31

    The objective of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation doses that individuals and populations could have received from nuclear operations at Hanford since 1944. The TSP consists of experts in environmental pathways, epidemiology, surface-water transport, ground-water transport, statistics, demography, agriculture, meteorology, nuclear engineering, radiation dosimetry, and cultural anthropology. Included are appointed members representing the states of Oregon, Washington, and Idaho, a representative of Native American tribes, and an individual representing the public. The project is divided into the following technical tasks: source terms; environmental transport; environmental monitoring data; demography, food consumption, and agriculture; and environmental pathways and dose estimates.

  16. Hanford Environmental Dose Reconstruction Project monthly report, November 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cannon, S.D.; Finch, S.M.

    1992-01-01

    The objective of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation doses that individuals and populations could have received from nuclear operations at Hanford since 1944. The TSP consists of experts in environmental pathways, epidemiology, surface-water transport, ground-water transport, statistics, demography, agriculture, meteorology, nuclear engineering, radiation dosimetry, and cultural anthropology. Included are appointed members representing the states of Oregon, Washington, and Idaho, a representative of Native American tribes, and an individual representing the public. The project is divided into the following technical tasks: source terms; environmental transport; environmental monitoring data; demography, food consumption, and agriculture; and environmental pathways and dose estimates.

  17. Conversion and correction factors for historical measurements of iodine-131 in Hanford-area vegetation, 1945--1947. Hanford Environmental Dose Reconstruction Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mart, E.I.; Denham, D.H.; Thiede, M.E.

    1993-12-01

    This report is a result of the Hanford Environmental Dose Reconstruction (HEDR) Project whose goal is to estimate the radiation dose that individuals could have received from emissions since 1944 at the U.S. Department of Energy's (DOE) Hanford Site near Richland, Washington. The HEDR Project is conducted by Battelle, Pacific Northwest Laboratories (BNW). One of the radionuclides emitted that would affect the radiation dose was iodine-131. This report describes in detail the reconstructed conversion and correction factors for historical measurements of iodine-131 in Hanford-area vegetation which was collected from the beginning of October 1945 through the end of December 1947.
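Converting a historical vegetation reading to an iodine-131 activity involves multiplying by a reconstructed calibration factor and correcting for radioactive decay between collection and counting. A minimal sketch using the standard I-131 half-life of about 8.02 days; the reading and calibration factor are hypothetical, not the report's reconstructed values:

```python
import math

I131_HALF_LIFE_DAYS = 8.02          # physical half-life of iodine-131

def decay_correct(measured_activity, days_between_collection_and_count):
    """Back-correct a counted activity to the collection time."""
    lam = math.log(2) / I131_HALF_LIFE_DAYS
    return measured_activity * math.exp(lam * days_between_collection_and_count)

# Hypothetical conversion: instrument reading -> activity, then decay-correct
reading = 120.0                     # counts per minute (hypothetical)
cpm_to_pci = 0.5                    # hypothetical reconstructed factor
activity_at_count = reading * cpm_to_pci
activity_at_collection = decay_correct(activity_at_count, 4.0)
```

With an 8-day half-life, even a few days' delay between collection and counting changes the inferred activity substantially, which is why the reconstructed correction factors matter.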

  18. Roadway contributing factors in traffic crashes.

    DOT National Transportation Integrated Search

    2014-09-01

    This project involved an evaluation of the codes which relate to roadway contributing factors. This included a review of relevant codes used in other states. Crashes with related codes were summarized and analyzed. A sample of crash sites was ins...

  19. MOCCA code for star cluster simulation: comparison with optical observations using COCOA

    NASA Astrophysics Data System (ADS)

    Askar, Abbas; Giersz, Mirek; Pych, Wojciech; Olech, Arkadiusz; Hypki, Arkadiusz

    2016-02-01

    We introduce and present preliminary results from COCOA (Cluster simulatiOn Comparison with ObservAtions) code for a star cluster after 12 Gyr of evolution simulated using the MOCCA code. The COCOA code is being developed to quickly compare results of numerical simulations of star clusters with observational data. We use COCOA to obtain parameters of the projected cluster model. For comparison, a FITS file of the projected cluster was provided to observers so that they could use their observational methods and techniques to obtain cluster parameters. The results show that the similarity of cluster parameters obtained through numerical simulations and observations depends significantly on the quality of observational data and photometric accuracy.

  20. The detection and extraction of interleaved code segments

    NASA Technical Reports Server (NTRS)

    Rugaber, Spencer; Stirewalt, Kurt; Wills, Linda M.

    1995-01-01

    This project is concerned with a specific difficulty that arises when trying to understand and modify computer programs. In particular, it is concerned with the phenomenon of 'interleaving', in which one section of a program accomplishes several purposes, and disentangling the code responsible for each purpose is difficult. Unraveling interleaved code involves discovering the purpose of each strand of computation, as well as understanding why the programmer decided to interleave the strands. Increased understanding improves the productivity and quality of software maintenance, enhancement, and documentation activities. It is the goal of the project to characterize the phenomenon of interleaving as a prerequisite for building tools to detect and extract interleaved code fragments.
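A minimal illustration of interleaving (constructed here, not drawn from the project's examples): one loop serving two purposes, followed by the disentangled strands an extraction tool would aim to recover.

```python
# Interleaved: one loop serves two purposes at once, entangling the
# strands of computation that compute the total and the maximum.
def summarize_interleaved(values):
    total, largest = 0, float("-inf")
    for v in values:
        total += v            # purpose 1: accumulate the sum
        if v > largest:       # purpose 2: track the running maximum
            largest = v
    return total, largest

# Disentangled: each purpose extracted into its own strand.
def summarize_separated(values):
    return sum(values), max(values)
```

Both return (14, 5) for [3, 1, 4, 1, 5]; the interleaved form is often written for efficiency (one pass), which is part of why the programmer's intent must be recovered before the strands can be safely separated.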

  1. User systems guidelines for software projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abrahamson, L.

    1986-04-01

    This manual presents guidelines for software standards which were developed so that software project-development teams and management involved in approving the software could have a generalized view of all phases in the software production procedure and the steps involved in completing each phase. Guidelines are presented for six phases of software development: project definition, building a user interface, designing software, writing code, testing code, and preparing software documentation. The discussions for each phase include examples illustrating the recommended guidelines. 45 refs. (DWL)

  2. Software testing

    NASA Astrophysics Data System (ADS)

    Price-Whelan, Adrian M.

    2016-01-01

    Now more than ever, scientific results are dependent on sophisticated software and analysis. Why should we trust code written by others? How do you ensure your own code produces sensible results? How do you make sure it continues to do so as you update, modify, and add functionality? Software testing is an integral part of code validation and writing tests should be a requirement for any software project. I will talk about Python-based tools that make managing and running tests much easier and explore some statistics for projects hosted on GitHub that contain tests.
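As a minimal illustration of the Python-based testing advocated here (the function is hypothetical, not from the talk): pytest discovers functions named `test_*` and reports any failing assertion, so a test is just ordinary code asserting expected behavior.

```python
# A function under test and a pytest-style test for it. Running `pytest`
# in the containing directory discovers and executes test_moving_average.
def moving_average(values, window):
    """Simple moving average over a sliding window."""
    if window < 1 or window > len(values):
        raise ValueError("window must be between 1 and len(values)")
    return [sum(values[i:i + window]) / window
            for i in range(len(values) - window + 1)]

def test_moving_average():
    assert moving_average([1, 2, 3, 4], 2) == [1.5, 2.5, 3.5]
    assert moving_average([5], 1) == [5.0]
```

Rerunning such tests after every modification is what catches the regressions introduced when code is updated or extended.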

  3. Radiation transport calculations for cosmic radiation.

    PubMed

    Endo, A; Sato, T

    2012-01-01

    The radiation environment inside and near spacecraft consists of various components of primary radiation in space and secondary radiation produced by the interaction of the primary radiation with the walls and equipment of the spacecraft. Radiation fields inside astronauts are different from those outside them, because of the body's self-shielding as well as the nuclear fragmentation reactions occurring in the human body. Several computer codes have been developed to simulate the physical processes of the coupled transport of protons, high-charge and high-energy nuclei, and the secondary radiation produced in atomic and nuclear collision processes in matter. These computer codes have been used in various space radiation protection applications: shielding design for spacecraft and planetary habitats, simulation of instrument and detector responses, analysis of absorbed doses and quality factors in organs and tissues, and study of biological effects. This paper focuses on the methods and computer codes used for radiation transport calculations on cosmic radiation, and their application to the analysis of radiation fields inside spacecraft, evaluation of organ doses in the human body, and calculation of dose conversion coefficients using the reference phantoms defined in ICRP Publication 110.

  4. Organ dose conversion coefficients based on a voxel mouse model and MCNP code for external photon irradiation.

    PubMed

    Zhang, Xiaomin; Xie, Xiangdong; Cheng, Jie; Ning, Jing; Yuan, Yong; Pan, Jie; Yang, Guoshan

    2012-01-01

    A set of conversion coefficients from kerma free-in-air to organ absorbed dose for external photon beams from 10 keV to 10 MeV is presented, based on a newly developed voxel mouse model, for the purpose of radiation effect evaluation. The voxel mouse model was developed from colour images of successive cryosections of a normal nude male mouse, in which 14 organs or tissues were segmented manually and filled with different colours, while each colour was tagged with a specific ID number for implementation of the mouse model in the Monte Carlo N-particle code (MCNP). Monte Carlo simulation with MCNP was carried out to obtain organ dose conversion coefficients for 22 external monoenergetic photon beams between 10 keV and 10 MeV under five different irradiation geometries (left lateral, right lateral, dorsal-ventral, ventral-dorsal, and isotropic). Organ dose conversion coefficients were presented in tables and compared with published data based on a rat model to investigate the effect of body size and weight on the organ dose. The calculation and comparison results show that the organ dose conversion coefficients as a function of photon energy exhibit similar trends for most organs except for the bone and skin, and that the organ dose is sensitive to body size and weight at photon energies below approximately 0.1 MeV.
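Applying such coefficients is a table lookup with interpolation in energy: organ dose = conversion coefficient x kerma free-in-air. A minimal sketch; the coefficient values are hypothetical placeholders, not the paper's tabulated data:

```python
import bisect

# Applying organ dose conversion coefficients, a minimal sketch. The values
# below are hypothetical placeholders; the real tables cover 22 energies
# from 10 keV to 10 MeV per organ and irradiation geometry.
energies_mev = [0.01, 0.05, 0.1, 0.5, 1.0]
coefficients = [0.05, 0.60, 0.85, 1.05, 1.10]   # organ dose per air kerma

def interp(x, xs, ys):
    """Piecewise-linear interpolation with flat extrapolation at the ends."""
    i = bisect.bisect_left(xs, x)
    if i == 0:
        return ys[0]
    if i == len(xs):
        return ys[-1]
    t = (x - xs[i - 1]) / (xs[i] - xs[i - 1])
    return ys[i - 1] + t * (ys[i] - ys[i - 1])

def organ_dose_gy(air_kerma_gy, energy_mev):
    """Organ absorbed dose = conversion coefficient x kerma free-in-air."""
    return air_kerma_gy * interp(energy_mev, energies_mev, coefficients)
```

For a real spectrum, the product would be integrated over the fluence-weighted energy distribution rather than evaluated at a single energy.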

  5. A boundary-representation method for designing whole-body radiation dosimetry models: pregnant females at the ends of three gestational periods—RPI-P3, -P6 and -P9

    NASA Astrophysics Data System (ADS)

    Xu, X. George; Taranenko, Valery; Zhang, Juying; Shi, Chengyu

    2007-12-01

    Fetuses are extremely radiosensitive, and the protection of pregnant females against ionizing radiation is of particular interest in many health and medical physics applications. Existing models of pregnant females have relied on simplified anatomical shapes or low-resolution partial-body images. This paper reviews two general types of solid geometry modeling: constructive solid geometry (CSG) and boundary representation (BREP). It presents in detail a project to adopt the BREP modeling approach to systematically design whole-body radiation dosimetry models: a pregnant female and her fetus at the ends of three gestational periods of 3, 6 and 9 months. Based on previously published CT images of a 7-month pregnant female, the VIP-Man model and mesh organ models, this new set of pregnant female models was constructed using 3D surface modeling technologies instead of voxels. The organ masses were adjusted to agree within 0.5% with the reference data provided by the International Commission on Radiological Protection (ICRP) and previously published papers. The models were then voxelized for the purpose of performing dose calculations in identically implemented EGS4 and MCNPX Monte Carlo codes. The fetal doses obtained from these two codes for this set of models agreed within 2% for the majority of the external photon irradiation geometries of AP, PA, LAT, ROT and ISO at various energies. It is concluded that the so-called RPI-P3, RPI-P6 and RPI-P9 models have been reliably defined for Monte Carlo calculations. The paper also discusses the need for future research and the possibility of the BREP method becoming a major tool in anatomical modeling for radiation dosimetry.

  6. Methods of space radiation dose analysis with applications to manned space systems

    NASA Technical Reports Server (NTRS)

    Langley, R. W.; Billings, M. P.

    1972-01-01

    The full potential of state-of-the-art space radiation dose analysis for manned missions has not been exploited. Point doses have been overemphasized, and the critical dose to the bone marrow has been only crudely approximated, despite the existence of detailed man models and computer codes for dose integration in complex geometries. The method presented makes it practical to account for the geometrical detail of the astronaut as well as the vehicle. Discussed are the major assumptions involved and the concept of applying the results of detailed proton dose analysis to the real-time interpretation of on-board dosimetric measurements.

  7. Antipyretic potential of herbal coded formulation (Pyrexol).

    PubMed

    Khan, Muhammad Sajid; Hamid, Abdul; Akram, Muhammad; Mustafa, Sodah Bint; Sami, Abdul; Shah, Syed Muhammad Ali; Usmanghani, Khan

    2017-01-01

    The antipyretic effect of the aqueous extract of a herbal coded formulation containing equal amounts of Salix alba, Emblica officinalis, Glycyrrhiza glabra, Adhatoda vasica, Viola odorata, Thea sinensis, Valeriana officinalis, Foeniculum vulgare, Sisymbrium irio and Achillea millefolium was investigated using the yeast-induced pyrexia model in rabbits. Paracetamol served as the standard control. Rectal temperatures of all rabbits were recorded with a digital thermometer immediately before administration of the extract or paracetamol, and then at 1 h intervals for 5 h. At a dose of 240 mg/kg, the extract showed a significant reduction in yeast-induced elevated temperature compared with the standard drug paracetamol (150 mg/kg). It is concluded that the herbal coded medicine at a dose of 240 mg/kg has marked antipyretic activity in animal models, which strongly supports the ethnopharmacological uses of the medicinal plants in this formulation.

  8. Neutron skyshine from intense 14-MeV neutron source facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakamura, T.; Hayashi, K.; Takahashi, A.

    1985-07-01

    The dose distribution and the spectrum variation of neutrons due to the skyshine effect have been measured with a high-efficiency rem counter, a multisphere spectrometer, and an NE-213 scintillator in the environment surrounding an intense 14-MeV neutron source facility. The dose distribution and the energy spectra of neutrons around the facility used as a skyshine source have also been measured to enable absolute evaluation of the skyshine effect. The skyshine effect was analyzed by two multigroup Monte Carlo codes, NIMSAC and MMCR-2, by two discrete ordinates Sn codes, ANISN and DOT3.5, and by the shield structure design code for skyshine, SKYSHINE-II. The calculated results show good agreement with the measured results in absolute values. These experimental results should be useful as benchmark data for skyshine analysis and for the shielding design of fusion facilities.

  9. SU-F-T-193: Evaluation of a GPU-Based Fast Monte Carlo Code for Proton Therapy Biological Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taleei, R; Qin, N; Jiang, S

    2016-06-15

    Purpose: Biological treatment plan optimization is of great interest for proton therapy. It requires extensive Monte Carlo (MC) simulations to compute physical dose and biological quantities. Recently, a gPMC package was developed for rapid MC dose calculations on a GPU platform. This work investigated its suitability for proton therapy biological optimization in terms of accuracy and efficiency. Methods: We performed simulations of a proton pencil beam with energies of 75, 150 and 225 MeV in a homogeneous water phantom using gPMC and FLUKA. Physical dose and energy spectra for each ion type on the central beam axis were scored. Relative Biological Effectiveness (RBE) was calculated using the repair-misrepair-fixation model. Microdosimetry calculations were performed using the Monte Carlo Damage Simulation (MCDS) code. Results: Ranges computed by the two codes agreed within 1 mm. The physical dose difference was less than 2.5% at the Bragg peak. RBE-weighted dose agreed within 5% at the Bragg peak. Differences in microdosimetric quantities such as dose-average lineal energy transfer and specific energy were < 10%. The simulation time per source particle with FLUKA was 0.0018 s, while gPMC was ∼600 times faster. Conclusion: Physical doses computed by FLUKA and gPMC were in good agreement. The RBE differences along the central axis were small, and the RBE-weighted dose difference was found to be acceptable. The combined accuracy and efficiency makes gPMC suitable for proton therapy biological optimization.

  10. AlleleCoder: a PERL script for coding codominant polymorphism data for PCA analysis

    USDA-ARS?s Scientific Manuscript database

    A useful biological interpretation of diploid heterozygotes is in terms of the dose of the common allele (0, 1 or 2 copies). We have developed a PERL script that converts FASTA files into coded spreadsheets suitable for Principal Component Analysis (PCA). In combination with R and R Commander, two- ...
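    The dose-coding idea can be sketched in a few lines. The snippet below is an illustrative Python rendition (the actual AlleleCoder tool is a PERL script operating on FASTA files); the genotype strings are invented examples.

```python
# Sketch of common-allele dose coding for PCA: each diploid genotype at a
# locus is coded as the number of copies (0, 1 or 2) of the most common
# allele in the sample.
from collections import Counter

def dose_code(genotypes):
    """genotypes: list of 2-character strings, e.g. ["AA", "AG", "GG"].

    Returns one integer dose per individual."""
    # Identify the most common allele at this locus.
    allele_counts = Counter(a for g in genotypes for a in g)
    common = allele_counts.most_common(1)[0][0]
    return [g.count(common) for g in genotypes]

doses = dose_code(["AA", "AG", "GG", "AG", "AA"])
print(doses)  # dose of the common allele 'A': [2, 1, 0, 1, 2]
```

    A matrix of such columns, one per locus, is exactly the numeric spreadsheet a PCA routine expects.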

  11. Development and evaluation of a model-based downscatter compensation method for quantitative I-131 SPECT

    PubMed Central

    Song, Na; Du, Yong; He, Bin; Frey, Eric C.

    2011-01-01

    Purpose: The radionuclide 131I has found widespread use in targeted radionuclide therapy (TRT), partly due to the fact that it emits photons that can be imaged to perform treatment planning or posttherapy dose verification as well as beta rays that are suitable for therapy. In both the treatment planning and dose verification applications, it is necessary to estimate the activity distribution in organs or tumors at several time points. In vivo estimates of the 131I activity distribution at each time point can be obtained from quantitative single-photon emission computed tomography (QSPECT) images and organ activity estimates can be obtained either from QSPECT images or quantification of planar projection data. However, in addition to the photon used for imaging, 131I decay results in emission of a number of other higher-energy photons with significant abundances. These higher-energy photons can scatter in the body, collimator, or detector and be counted in the 364 keV photopeak energy window, resulting in reduced image contrast and degraded quantitative accuracy; these photons are referred to as downscatter. The goal of this study was to develop and evaluate a model-based downscatter compensation method specifically designed for the compensation of high-energy photons emitted by 131I and detected in the imaging energy window. Methods: In the evaluation study, we used a Monte Carlo simulation (MCS) code that had previously been validated for other radionuclides. Thus, in preparation for the evaluation study, we first validated the code for 131I imaging simulation by comparison with experimental data. Next, we assessed the accuracy of the downscatter model by comparing downscatter estimates with MCS results. Finally, we combined the downscatter model with iterative reconstruction-based compensation for attenuation (A) and scatter (S) and the full (D) collimator-detector response of the 364 keV photons to form a comprehensive compensation method. 
We evaluated this combined method in terms of quantitative accuracy using the realistic 3D NCAT phantom and an activity distribution obtained from patient studies. We compared the accuracy of organ activity estimates in images reconstructed with and without addition of downscatter compensation from projections with and without downscatter contamination. Results: We observed that the proposed method provided substantial improvements in accuracy compared to no downscatter compensation and had accuracies comparable to reconstructions from projections without downscatter contamination. Conclusions: The results demonstrate that the proposed model-based downscatter compensation method is effective and may have a role in quantitative 131I imaging. PMID:21815394

  12. Multi-Constraint Multi-Variable Optimization of Source-Driven Nuclear Systems

    NASA Astrophysics Data System (ADS)

    Watkins, Edward Francis

    1995-01-01

    A novel approach to the search for optimal designs of source-driven nuclear systems is investigated. Such systems include radiation shields, fusion reactor blankets and various neutron spectrum-shaping assemblies. The novel approach involves replacing the steepest-descent optimization algorithm incorporated in the code SWAN with a significantly more general and efficient sequential quadratic programming optimization algorithm provided by the code NPSOL. The resulting SWAN/NPSOL code system can be applied to more general, multi-variable, multi-constraint shield optimization problems. The constraints it accounts for may include simple bounds on variables, linear constraints, and smooth nonlinear constraints. It may also be applied to unconstrained, bound-constrained and linearly constrained optimization. The shield optimization capabilities of the SWAN/NPSOL code system are tested and verified on a variety of optimization problems: dose minimization at constant cost, cost minimization at constant dose, and multiple-nonlinear-constraint optimization. Replacing the optimization part of SWAN with NPSOL is found to be feasible and substantially expands the complexity of optimization problems that can be handled efficiently.
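    The flavour of a "dose minimization at constant cost" problem can be sketched with any sequential quadratic programming solver. The toy below uses SciPy's SLSQP method; the two-layer dose and cost models are invented for illustration and have nothing to do with the actual SWAN/NPSOL physics.

```python
# Toy analogue of dose minimization at constant cost, solved with an SQP
# method (SciPy's SLSQP).  The exponential dose model and linear cost model
# are placeholders, not SWAN/NPSOL's.
import numpy as np
from scipy.optimize import minimize

# x = thicknesses of two shield layers (cm)
def dose(x):   # transmitted dose falls off exponentially with thickness
    return np.exp(-0.5 * x[0]) + np.exp(-0.8 * x[1])

def cost(x):   # cost grows linearly with material used
    return 2.0 * x[0] + 3.0 * x[1]

res = minimize(
    dose, x0=[1.0, 1.0], method="SLSQP",
    bounds=[(0, 10), (0, 10)],                                      # simple bounds
    constraints=[{"type": "eq", "fun": lambda x: cost(x) - 12.0}],  # fixed budget
)
print(res.x, dose(res.x))
```

    Swapping the objective and the constraint gives the dual problem, cost minimization at constant dose.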

  13. Human exposure to large solar particle events in space

    NASA Technical Reports Server (NTRS)

    Townsend, L. W.; Wilson, J. W.; Shinn, J. L.; Curtis, S. B.

    1992-01-01

    Whenever energetic solar protons produced by solar particle events traverse bulk matter, they undergo various nuclear and atomic collision processes which significantly alter the physical characteristics and biologically important properties of their transported radiation fields. These physical interactions and their effect on the resulting radiation field within matter are described within the context of a recently developed deterministic, coupled neutron-proton space radiation transport computer code (BRYNTRN). Using this computer code, estimates of human exposure in interplanetary space, behind nominal (2 g/sq cm) and storm shelter (20 g/sq cm) thicknesses of aluminum shielding, are made for the large solar proton event of August 1972. Included in these calculations are estimates of cumulative exposures to the skin, ocular lens, and bone marrow as a function of time during the event. Risk assessment in terms of absorbed dose and dose equivalent is discussed for these organs. Also presented are estimates of organ exposures for hypothetical, worst-case flare scenarios. The rate of dose equivalent accumulation places this situation in an interesting region of dose rate between the very low values of usual concern in terrestrial radiation environments and the high-dose-rate values prevalent in radiation therapy.

  14. A dose assessment method for arbitrary geometries with virtual reality in the nuclear facilities decommissioning

    NASA Astrophysics Data System (ADS)

    Chao, Nan; Liu, Yong-kuo; Xia, Hong; Ayodeji, Abiodun; Bai, Lu

    2018-03-01

    During the decommissioning of nuclear facilities, a large number of cutting and demolition activities are performed, which result in frequent structural changes and produce many irregular objects. In order to assess dose rates during the cutting and demolition process, a flexible dose assessment method for arbitrary geometries and radiation sources was proposed based on virtual reality technology and the Point-Kernel method. The initial geometry is designed with three-dimensional computer-aided design tools. An approximate model is built automatically in the process of geometric modeling via three procedures, namely space division, rough modeling of the body, and fine modeling of the surface, in combination with the collision detection of virtual reality technology. Point kernels are then generated by sampling within the approximate model, and once the material and radiometric attributes are provided as input, dose rates can be calculated with the Point-Kernel method. To account for radiation scattering effects, buildup factors are calculated with the Geometric-Progression fitting formula. The effectiveness and accuracy of the proposed method were verified by simulations using different geometries, and the dose rate results were compared with those derived from the CIDEC code, the MCNP code and experimental measurements.
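    The Point-Kernel summation at the heart of the method can be sketched minimally: each sampled kernel contributes an attenuated, buildup-corrected inverse-square term at the detector. The attenuation coefficient and the crude linear buildup term below are placeholders; the paper's method uses tabulated Geometric-Progression buildup factors.

```python
# Minimal point-kernel dose-rate sketch.  The mu value, flux-to-dose factor
# and the "1 + mfp" buildup stand-in are illustrative only.
import math

def dose_rate(kernels, detector, mu=0.06, k_per_photon=1.0e-6):
    """kernels: list of (x, y, z, source_strength) tuples; detector: (x, y, z)."""
    total = 0.0
    for (x, y, z, s) in kernels:
        r = math.dist((x, y, z), detector)
        mfp = mu * r                 # optical thickness in mean free paths
        buildup = 1.0 + mfp          # crude stand-in for a GP buildup factor
        total += s * k_per_photon * buildup * math.exp(-mfp) / (4 * math.pi * r**2)
    return total

print(dose_rate([(0, 0, 0, 1e9), (0, 0, 10, 1e9)], (100, 0, 0)))
```

    Sampling more kernels inside the approximate CAD-derived volume simply adds more terms to this sum.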

  15. MCNPX simulation of proton dose distribution in homogeneous and CT phantoms

    NASA Astrophysics Data System (ADS)

    Lee, C. C.; Lee, Y. J.; Tung, C. J.; Cheng, H. W.; Chao, T. C.

    2014-02-01

    A dose simulation system was constructed based on the MCNPX Monte Carlo package to simulate proton dose distribution in homogeneous and CT phantoms. Conversion from the Hounsfield units of a patient CT image set to the material information necessary for Monte Carlo simulation is based on Schneider's approach. In order to validate this simulation system, depth dose distributions obtained from the MCNPX, GEANT4 and FLUKA codes were inter-compared for a 160 MeV monoenergetic proton beam incident normally on the surface of a homogeneous water phantom. For dose validation within the CT phantom, direct comparison with measurement is infeasible. Instead, this study indirectly compared the 50% ranges (R50%) along the central axis computed by our system with the NIST CSDA ranges for beams with 160 and 115 MeV energies. Comparison results within the homogeneous phantom show good agreement: differences in simulated R50% among the three codes are less than 1 mm. For results within the CT phantom, the MCNPX-simulated water-equivalent Req,50% are compatible with the CSDA water-equivalent ranges from the NIST database, with differences of 0.7 and 4.1 mm for the 160 and 115 MeV beams, respectively.
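    The Hounsfield-unit-to-material conversion can be sketched as a simple binning in the spirit of Schneider's approach: HU intervals map to a material label and a mass density used to build the Monte Carlo phantom. The bin edges and densities below are rough placeholders, not Schneider's published tables.

```python
# Illustrative HU -> (material, density) binning.  Real implementations use
# many more bins and interpolate density piecewise within each bin.
HU_BINS = [
    (-1000, -950, "air",         0.0012),
    (-950,  -120, "lung",        0.26),
    (-120,   120, "soft tissue", 1.03),
    (120,   3000, "bone",        1.60),
]

def hu_to_material(hu):
    for lo, hi, name, density in HU_BINS:
        if lo <= hu < hi:
            return name, density
    raise ValueError(f"HU {hu} outside supported range")

print(hu_to_material(-980))  # ('air', 0.0012)
print(hu_to_material(40))    # ('soft tissue', 1.03)
```

    Applying this lookup voxel-by-voxel converts the CT image set into the material and density arrays the transport code needs.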

  16. Enabling Handicapped Nonreaders to Independently Obtain Information: Initial Development of an Inexpensive Bar Code Reader System.

    ERIC Educational Resources Information Center

    VanBiervliet, Alan

    A project to develop and evaluate a bar code reader system as a self-directed information and instructional aid for handicapped nonreaders is described. The bar code technology involves passing a light sensitive pen or laser over a printed code with bars which correspond to coded numbers. A system would consist of a portable device which could…

  17. Structured Light Based 3d Scanning for Specular Surface by the Combination of Gray Code and Phase Shifting

    NASA Astrophysics Data System (ADS)

    Zhang, Yujia; Yilmaz, Alper

    2016-06-01

    Surface reconstruction using coded structured light is considered one of the most reliable techniques for high-quality 3D scanning. With a calibrated projector-camera stereo system, a light pattern is projected onto the scene and imaged by the camera. Correspondences between projected and recovered patterns are computed in the decoding process, which is used to generate the 3D point cloud of the surface. However, indirect illumination effects on the surface, such as subsurface scattering and interreflections, raise difficulties in reconstruction. In this paper, we apply the maximum min-SW gray code to reduce the indirect illumination effects of the specular surface. We also analyze the errors of the maximum min-SW gray code compared with the conventional gray code, showing that the maximum min-SW gray code is significantly superior in reducing indirect illumination effects. To achieve sub-pixel accuracy, we simultaneously project high-frequency sinusoidal patterns onto the scene. For a specular surface, however, high-frequency patterns are susceptible to decoding errors, and incorrect decoding of high-frequency patterns results in a loss of depth resolution. Our method resolves this problem by combining the low-frequency maximum min-SW gray code with the high-frequency phase shifting code, which achieves dense 3D reconstruction of the specular surface. Our contributions include: (i) a complete setup of the structured light based 3D scanning system; (ii) a novel combination technique of the maximum min-SW gray code and the phase shifting code, in which phase-shift decoding first provides sub-pixel accuracy and the maximum min-SW gray code then resolves the period ambiguity. According to the experimental results and data analysis, our structured light based 3D scanning system enables high-quality dense reconstruction of scenes with a small number of images. Qualitative and quantitative comparisons are performed to demonstrate the advantages of our new combined coding method.

  18. On transform coding tools under development for VP10

    NASA Astrophysics Data System (ADS)

    Parker, Sarah; Chen, Yue; Han, Jingning; Liu, Zoe; Mukherjee, Debargha; Su, Hui; Wang, Yongzhe; Bankoski, Jim; Li, Shunyao

    2016-09-01

    Google started the WebM Project in 2010 to develop open source, royalty-free video codecs designed specifically for media on the Web. The second-generation codec released by the WebM project, VP9, is currently served by YouTube and enjoys billions of views per day. Realizing the need for even greater compression efficiency to cope with the growing demand for video on the web, the WebM team embarked on an ambitious project to develop a next-edition codec, VP10, that achieves at least a generational improvement in coding efficiency over VP9. Starting from VP9, a set of new experimental coding tools has already been added to VP10 to achieve decent coding gains. Subsequently, Google joined a consortium of major tech companies called the Alliance for Open Media to jointly develop a new codec, AV1. As a result, the VP10 effort is largely expected to merge with AV1. In this paper, we focus primarily on new tools in VP10 that improve coding of the prediction residue using transform coding techniques. Specifically, we describe tools that increase the flexibility of available transforms, allowing the codec to handle a more diverse range of residue structures. Results are presented on a standard test set.
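    Why transform flexibility matters can be shown with a toy example: a residue block is coded with whichever transform (here an orthonormal 1-D DCT-II versus the identity) concentrates its energy into fewer significant coefficients. This is a conceptual illustration only, not VP10's actual transform set or selection logic.

```python
# Toy "flexible transform" choice: count coefficients surviving a dead-zone
# quantizer under a DCT-II vs. no transform, and pick the cheaper one.
import math

def dct2(block):
    """Orthonormal 1-D DCT-II of a residue block."""
    n = len(block)
    out = []
    for k in range(n):
        s = sum(block[j] * math.cos(math.pi * k * (2 * j + 1) / (2 * n))
                for j in range(n))
        out.append(s * (math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)))
    return out

def significant(coeffs, step=4.0):
    """Count coefficients that survive a dead-zone quantizer of width `step`."""
    return sum(1 for c in coeffs if abs(c) >= step / 2)

residue = [10.0, 11.0, 12.0, 13.0, 14.0, 15.0, 16.0, 17.0]  # smooth ramp
print(significant(dct2(residue)), significant(residue))
```

    A smooth residue compacts well under the DCT, while a sparse spiky residue can be cheaper to code untransformed, which is exactly why offering several transforms helps.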

  19. Space Radiation Transport Codes: A Comparative Study for Galactic Cosmic Rays Environment

    NASA Astrophysics Data System (ADS)

    Tripathi, Ram; Wilson, John W.; Townsend, Lawrence W.; Gabriel, Tony; Pinsky, Lawrence S.; Slaba, Tony

    For long-duration and/or deep space human missions, protection from severe space radiation exposure is a challenging design constraint and may be a potential limiting factor. The space radiation environment consists of galactic cosmic rays (GCR), solar particle events (SPE), and trapped radiation, and includes ions of all the known elements over a very broad energy range. These ions penetrate spacecraft materials, producing nuclear fragments and secondary particles that damage biological tissues, microelectronic devices, and materials. In deep space missions, where the Earth's magnetic field does not provide protection from space radiation, the GCR environment is significantly enhanced due to the absence of geomagnetic cut-off and is a major component of radiation exposure. Accurate risk assessments critically depend on the accuracy of the input information as well as on the radiation transport codes used, so systematic verification of codes is necessary. In this study, comparisons are made between the deterministic code HZETRN2006 and the Monte Carlo codes HETC-HEDS and FLUKA for an aluminum shield followed by a water target exposed to the 1977 solar minimum GCR spectrum. The interaction and transport of the high-charge ions present in the GCR environment provide a particularly stringent constraint for comparing the codes. Dose, dose equivalent and flux spectra are compared; details of the comparisons will be discussed, and conclusions will be drawn for future directions.

  20. Preliminary Analysis of the Multisphere Neutron Spectrometer

    NASA Technical Reports Server (NTRS)

    Goldhagen, P.; Kniss, T.; Wilson, J. W.; Singleterry, R. C.; Jones, I. W.; VanSteveninck, W.

    2003-01-01

    Crews working on present-day jet aircraft are a large occupationally exposed group with a relatively high average effective dose from galactic cosmic radiation. Crews of future high-speed commercial aircraft flying at higher altitudes would be even more exposed. To help reduce the significant uncertainties in calculations of such exposures, the Atmospheric Ionizing Radiation (AIR) Project, an international collaboration of 15 laboratories, made simultaneous radiation measurements with 14 instruments on five flights of a NASA ER-2 high-altitude aircraft. The primary AIR instrument was a highly sensitive extended-energy multisphere neutron spectrometer with lead and steel shells placed within the moderators of two of its 14 detectors to enhance response at high energies. Detector responses were calculated for neutrons and charged hadrons at energies up to 100 GeV using MCNPX. Neutron spectra were unfolded from the measured count rates using the new MAXED code. We have measured the cosmic-ray neutron spectrum (thermal to greater than 10 GeV), total neutron fluence rate, and neutron effective dose and dose equivalent rates and their dependence on altitude and geomagnetic cutoff. The measured cosmic-ray neutron spectra have almost no thermal neutrons, a large "evaporation" peak near 1 MeV and a second broad peak near 100 MeV which contributes about 69% of the neutron effective dose. At high altitude, geomagnetic latitude has very little effect on the shape of the spectrum, but it is the dominant variable affecting neutron fluence rate, which was 8 times higher at the northernmost measurement location than it was at the southernmost. The shape of the spectrum varied only slightly with altitude from 21 km down to 12 km (56 - 201 grams per square centimeter atmospheric depth), but was significantly different on the ground. In all cases, ambient dose equivalent was greater than effective dose for cosmic-ray neutrons.

  1. MO-FG-BRA-01: 4D Monte Carlo Simulations for Verification of Dose Delivered to a Moving Anatomy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gholampourkashi, S; Cygler, J E.; The Ottawa Hospital Cancer Centre, Ottawa, ON

    Purpose: To validate 4D Monte Carlo (MC) simulations of dose delivery by an Elekta Agility linear accelerator to a moving phantom. Methods: Monte Carlo simulations were performed using the 4DdefDOSXYZnrc/EGSnrc user code which samples a new geometry for each incident particle and calculates the dose in a continuously moving anatomy. A Quasar respiratory motion phantom with a lung insert containing a 3 cm diameter tumor was used for dose measurements on an Elekta Agility linac with the phantom in stationary and moving states. Dose to the center of tumor was measured using calibrated EBT3 film and the RADPOS 4D dosimetry system. A VMAT plan covering the tumor was created on the static CT scan of the phantom using Monaco V.5.10.02. A validated BEAMnrc model of our Elekta Agility linac was used for Monte Carlo simulations on stationary and moving anatomies. To compare the planned and delivered doses, linac log files recorded during measurements were used for the simulations. For 4D simulations, deformation vectors that modeled the rigid translation of the lung insert were generated as input to the 4DdefDOSXYZnrc code as well as the phantom motion trace recorded with RADPOS during the measurements. Results: Monte Carlo simulations and film measurements were found to agree within 2mm/2% for 97.7% of points in the film in the static phantom and 95.5% in the moving phantom. Dose values based on film and RADPOS measurements are within 2% of each other and within 2σ of experimental uncertainties with respect to simulations. Conclusion: Our 4D Monte Carlo simulation using the defDOSXYZnrc code accurately calculates dose delivered to a moving anatomy. Future work will focus on more investigation of VMAT delivery on a moving phantom to improve the agreement between simulation and measurements, as well as establishing the accuracy of our method in a deforming anatomy. 
This work was supported by the Ontario Consortium of Adaptive Interventions in Radiation Oncology (OCAIRO), funded by the Ontario Research Fund Research Excellence program.

  2. SU-F-J-08: Quantitative SPECT Imaging of Ra-223 in a Phantom

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yue, J; Hobbs, R; Sgouros, G

    Purpose: Ra-223 therapy of prostate cancer bone metastases is being used to treat patients routinely. However, the absorbed dose distribution at the macroscopic and microscopic scales remains elusive, due to the inability to image the small activities injected. Accurate activity quantification through imaging is essential to calculate the absorbed dose in organs and sub-units in radiopharmaceutical therapy, enabling personalized absorbed dose-based treatment planning methodologies and more effective and optimal treatments. Methods: A 22 cm diameter by 20 cm long cylindrical phantom, containing a 3.52 cm diameter sphere, was used. A total of 2.01 MBq of Ra-223 was placed in the phantom with 177.6 kBq in the sphere. Images were acquired on a dual-head Siemens Symbia T16 gamma camera using three 20% full-width energy windows and centered at 84, 154, and 269 keV (120 projections, 360° rotation, 45 s per view). We have implemented reconstruction of Ra-223 SPECT projections using OS-EM (up to 20 iterations of 10 subsets) with compensation for attenuation using CT-based attenuation maps, collimator-detector response (CDR) (including septal penetration, scatter and Pb x-ray modeling), and scatter in the patient using the effective source scatter estimation (ESSE) method. The CDR functions and scatter kernels required for ESSE were computed using the SIMIND MC simulation code. All Ra-223 photon emissions as well as gamma rays from the daughters Rn-219 and Bi-211 were modeled. Results: The sensitivity of the camera in the three combined windows was 107.3 cps/MBq. The visual quality of the SPECT images was reasonably good and the activity in the sphere was 27% smaller than the true activity. This underestimation is likely due to partial volume effect. Conclusion: Absolute quantitative Ra-223 SPECT imaging is achievable with careful attention to compensate for image degrading factors and system calibration.

  3. Analysis of dose-LET distribution in the human body irradiated by high energy hadrons.

    PubMed

    Sato, T; Tsuda, S; Sakamoto, Y; Yamaguchi, Y; Niita, K

    2003-01-01

    For the purposes of radiological protection, it is important to analyse the profiles of the particle field inside a human body irradiated by high energy hadrons, since such hadrons can produce a variety of secondary particles which play an important role in the energy deposition process, and to characterise their radiation qualities. Therefore, Monte Carlo calculations were performed to evaluate dose distributions in terms of the linear energy transfer of ionising particles (dose-LET distribution) using a newly developed particle transport code (Particle and Heavy Ion Transport code System, PHITS) for incident neutrons, protons and pions with energies from 100 MeV to 200 GeV. Based on these calculations, it was found that more than 80% and 90% of the total deposited energy is attributed to ionisation by particles with LET below 10 keV/μm for irradiation by neutrons and by the charged particles, respectively.
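    The dose-LET distribution reported above is, in essence, a tally of energy-deposition events binned by the LET of the depositing particle. The sketch below shows the bookkeeping with a synthetic event list; it is not PHITS output.

```python
# Sketch of a dose-LET tally: sum deposited energy below a LET threshold
# and report it as a fraction of the total dose.
def dose_fraction_below(events, let_cut=10.0):
    """events: list of (let_keV_per_um, deposited_energy) pairs."""
    total = sum(e for _, e in events)
    below = sum(e for let, e in events if let < let_cut)
    return below / total

# Synthetic events: mostly low-LET depositions plus a high-LET tail.
events = [(0.5, 40.0), (2.0, 30.0), (8.0, 15.0), (50.0, 10.0), (120.0, 5.0)]
print(dose_fraction_below(events))  # 0.85: 85% of dose from LET < 10 keV/um
```

    Repeating the sum over a grid of LET cuts yields the full cumulative dose-LET distribution.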

  4. Shutdown Dose Rate Analysis for the long-pulse D-D Operation Phase in KSTAR

    NASA Astrophysics Data System (ADS)

    Park, Jin Hun; Han, Jung-Hoon; Kim, D. H.; Joo, K. S.; Hwang, Y. S.

    2017-09-01

    KSTAR is a medium-size, fully superconducting tokamak. The deuterium-deuterium (D-D) reaction in the KSTAR tokamak generates neutrons with a peak yield of 3.5×10^16 per second through a pulse operation of 100 seconds. The effects of neutron generation in the full D-D high-power KSTAR operation mode on the machine, such as activation, shutdown dose rate, and nuclear heating, are estimated to assure safety during operation, maintenance, and machine upgrades. The nuclear heating of the in-vessel components and the neutron activation of the surrounding materials have been investigated. The dose rates during operation and after shutdown of KSTAR have been calculated with a 3D CAD model of KSTAR using the Monte Carlo code MCNP5 (neutron flux and decay photons), the inventory code FISPACT (activation and decay photons) and the FENDL 2.1 nuclear data library.

  5. Review of Hybrid (Deterministic/Monte Carlo) Radiation Transport Methods, Codes, and Applications at Oak Ridge National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wagner, John C; Peplow, Douglas E.; Mosher, Scott W

    2010-01-01

    This paper provides a review of the hybrid (Monte Carlo/deterministic) radiation transport methods and codes used at the Oak Ridge National Laboratory and examples of their application for increasing the efficiency of real-world, fixed-source Monte Carlo analyses. The two principal hybrid methods are (1) Consistent Adjoint Driven Importance Sampling (CADIS) for optimization of a localized detector (tally) region (e.g., flux, dose, or reaction rate at a particular location) and (2) Forward Weighted CADIS (FW-CADIS) for optimizing distributions (e.g., mesh tallies over all or part of the problem space) or multiple localized detector regions (e.g., simultaneous optimization of two or more localized tally regions). The two methods have been implemented and automated in both the MAVRIC sequence of SCALE 6 and ADVANTG, a code that works with the MCNP code. As implemented, the methods utilize the results of approximate, fast-running 3-D discrete ordinates transport calculations (with the Denovo code) to generate consistent space- and energy-dependent source and transport (weight windows) biasing parameters. These methods and codes have been applied to many relevant and challenging problems, including calculations of PWR ex-core thermal detector response, dose rates throughout an entire PWR facility, site boundary dose from arrays of commercial spent fuel storage casks, radiation fields for criticality accident alarm system placement, and detector response for special nuclear material detection scenarios and nuclear well-logging tools. Substantial computational speed-ups, generally O(10^2-10^4), have been realized for all applications to date. This paper provides a brief review of the methods, their implementation, results of their application, and current development activities, as well as a considerable list of references for readers seeking more information about the methods and/or their applications.
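    The unbiasedness that makes CADIS-style source biasing work can be shown in miniature: the source PDF is reweighted by an (assumed known) adjoint importance, and each sampled particle carries a statistical weight that preserves the expected tally. The importance values below are invented; a real CADIS workflow obtains them from an adjoint discrete-ordinates calculation such as Denovo.

```python
# Toy CADIS-flavoured source biasing: sample source cells proportionally to
# (probability x importance), and assign weight = true prob / biased prob.
import random

def biased_source(cells, importances, rng):
    """cells: list of (cell_id, source_prob); returns (cell_id, weight)."""
    biased = [p * importances[c] for c, p in cells]
    norm = sum(biased)
    r, acc = rng.random() * norm, 0.0
    for (c, p), b in zip(cells, biased):
        acc += b
        if r <= acc:
            return c, p * norm / b   # weight = true prob / biased prob
    c, p = cells[-1]                  # guard against float round-off
    return c, p * norm / biased[-1]

# Average weighted tally over many samples reproduces the unbiased source.
cells = [("shielded", 0.7), ("near_detector", 0.3)]
importances = {"shielded": 1.0, "near_detector": 10.0}
rng = random.Random(0)
samples = [biased_source(cells, importances, rng) for _ in range(20000)]
est = sum(w for c, w in samples if c == "shielded") / len(samples)
print(round(est, 2))  # close to the true source probability 0.7
```

    The important cell is sampled far more often, yet each of its particles carries a proportionally smaller weight, so the estimate stays unbiased while its variance in the region of interest drops.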

  6. Particle-in-cell code library for numerical simulation of the ECR source plasma

    NASA Astrophysics Data System (ADS)

    Shirkov, G.; Alexandrov, V.; Preisendorf, V.; Shevtsov, V.; Filippov, A.; Komissarov, R.; Mironov, V.; Shirkova, E.; Strekalovsky, O.; Tokareva, N.; Tuzikov, A.; Vatulin, V.; Vasina, E.; Fomin, V.; Anisimov, A.; Veselov, R.; Golubev, A.; Grushin, S.; Povyshev, V.; Sadovoi, A.; Donskoi, E.; Nakagawa, T.; Yano, Y.

    2003-05-01

    The project "Numerical simulation and optimization of ion accumulation and production in multicharged ion sources" is funded by the International Science and Technology Center (ISTC). A summary of recent project development and the first version of a computer code library for simulation of electron-cyclotron resonance (ECR) source plasmas based on the particle-in-cell method are presented.

  7. For the Love of Statistics: Appreciating and Learning to Apply Experimental Analysis and Statistics through Computer Programming Activities

    ERIC Educational Resources Information Center

    Mascaró, Maite; Sacristán, Ana Isabel; Rufino, Marta M.

    2016-01-01

    For the past 4 years, we have been involved in a project that aims to enhance the teaching and learning of experimental analysis and statistics for environmental and biological sciences students through computational programming activities (using R code). In this project, through an iterative design, we have developed sequences of R-code-based…

  8. Investigating the Simulink Auto-Coding Process

    NASA Technical Reports Server (NTRS)

    Gualdoni, Matthew J.

    2016-01-01

    Model-based program design is the most clear and direct way to develop algorithms and programs for interfacing with hardware. While coding "by hand" results in a more tailored product, the ever-growing size and complexity of modern-day applications can cause the project workload to quickly become unreasonable for one programmer. This has generally been addressed by splitting the product into separate modules to allow multiple developers to work in parallel on the same project; however, this introduces new potential for errors in the process. The fluidity, reliability, and robustness of the code rely on the abilities of the programmers to communicate their methods to one another; furthermore, multiple programmers bring multiple, potentially differing coding styles into the same product, which can cause a loss of readability or even module incompatibility. Fortunately, MathWorks has implemented an auto-coding feature that allows programmers to design their algorithms through the use of models and diagrams in the graphical programming environment Simulink, allowing the designer to visually determine what the hardware is to do. From here, the auto-coding feature handles converting the project into another programming language. This type of approach allows the designer to clearly see how the software will be directing the hardware without the need to interpret large amounts of code. In addition, it speeds up the programming process, minimizing the number of man-hours spent on a single project, thus reducing the chance of human error as well as project turnover time. One such project that has benefited from the auto-coding procedure is Ramses, a portion of the GNC flight software on board Orion that has been implemented primarily in Simulink. Currently, however, auto-coding Ramses into C++ requires 5 hours of code generation time. 
This causes issues if the tool ever needs to be debugged, as this code generation will need to occur with each edit to any part of the program; additionally, this is lost time that could be spent testing and analyzing the code. This is one of the more prominent issues with the auto-coding process, and while much information is available with regard to optimizing Simulink designs to produce efficient and reliable C++ code, not much research has been made public on how to reduce the code generation time. It is of interest to develop some insight as to what causes code generation times to be so significant, and determine if there are architecture guidelines or a desirable auto-coding configuration set to assist in streamlining this step of the design process for particular applications. To address the issue at hand, the Simulink coder was studied at a foundational level. For each different component type made available by the software, the features, auto-code generation time, and the format of the generated code were analyzed and documented. Tools were developed and documented to expedite these studies, particularly in the area of automating sequential builds to ensure accurate data was obtained. Next, the Ramses model was examined in an attempt to determine the composition and the types of technologies used in the model. This enabled the development of a model that uses similar technologies, but takes a fraction of the time to auto-code to reduce the turnaround time for experimentation. Lastly, the model was used to run a wide array of experiments and collect data to obtain knowledge about where to search for bottlenecks in the Ramses model. 
The resulting contributions of the overall effort consist of an experimental model for further investigation into the subject, as well as several automation tools to assist in analyzing the model, and a reference document offering insight into the auto-coding process, including documentation of the tools used in the model analysis, data illustrating some potential problem areas in the auto-coding process, and recommendations on areas or practices in the current Ramses model that should be further investigated. Several skills had to be built up over the course of the internship project. First and foremost, my Simulink skills have improved drastically, as much of my experience had been modeling electronic circuits as opposed to software models. Furthermore, I am now comfortable working with the Simulink Auto-coder, a tool I had never used until this summer; this tool also tested my critical thinking and C++ knowledge, as I had to interpret the C++ code it was generating and attempt to understand how the Simulink model affected the generated code. I had come into the internship with a solid understanding of Matlab code, but had done very little in using it to automate tasks, particularly Simulink tasks; along the same lines, I had rarely used shell scripts to automate and interface with programs, which I gained a fair amount of experience with this summer, including how to use regular expressions. Lastly, soft skills are an area everyone can continuously improve on; having never worked with NASA engineers, who to me seem to be a completely different breed than what I am used to (commercial electronics engineers), I learned to utilize the wealth of knowledge present at JSC. I wish I had come into the internship knowing exactly how helpful everyone in my branch would be, as I would have picked up on this sooner. 
I hope that having gained such a strong foundation in Simulink over this summer will open the opportunity to return to work on this project, or potentially other opportunities within the division. The idea of leaving a project I devoted ten weeks to is a hard one to cope with, so having the chance to pick up where I left off sounds appealing; alternatively, I am interested to see if there are any openings in the future that would allow me to work on a project that is more in line with my research in estimation algorithms. Regardless, this summer has been a milestone in my professional career, and I hope it has started a long-term relationship between JSC and myself. I really enjoy the thought of building on my experience here over future summers while I work to complete my PhD at Missouri University of Science and Technology.

  9. RITRACKS: A Software for Simulation of Stochastic Radiation Track Structure, Micro and Nanodosimetry, Radiation Chemistry and DNA Damage for Heavy Ions

    NASA Technical Reports Server (NTRS)

    Plante, I; Wu, H

    2014-01-01

    The code RITRACKS (Relativistic Ion Tracks) has been developed over the last few years at the NASA Johnson Space Center to simulate the effects of ionizing radiation at the microscopic scale, in order to understand the effects of space radiation at the biological level. The fundamental part of this code is the stochastic simulation of the radiation track structure of heavy ions, an important component of space radiation. The code can calculate many relevant quantities such as the radial dose and voxel dose, and may also be used to calculate the dose in spherical and cylindrical targets of various sizes. Recently, we have incorporated DNA structure and damage simulations at the molecular scale into RITRACKS. The direct effect of radiation is simulated by introducing a slight modification of the existing particle transport algorithms, using the Binary-Encounter-Bethe model of ionization cross sections for each molecular orbital of DNA. The simulation of radiation chemistry is done by a step-by-step diffusion-reaction program based on the Green's functions of the diffusion equation. This approach is also used to simulate the indirect effect of ionizing radiation on DNA. The software can be installed independently on PCs and tablets using the Windows operating system and does not require any coding from the user. It includes a Graphical User Interface (GUI) and a 3D OpenGL visualization interface. The calculations are executed simultaneously (in parallel) on multiple CPUs. The main features of the software will be presented.

  10. 3D delivered dose assessment using a 4DCT-based motion model

    PubMed Central

    Cai, Weixing; Hurwitz, Martina H.; Williams, Christopher L.; Dhou, Salam; Berbeco, Ross I.; Seco, Joao; Mishra, Pankaj; Lewis, John H.

    2015-01-01

    Purpose: The purpose of this work is to develop a clinically feasible method of calculating actual delivered dose distributions for patients who have significant respiratory motion during the course of stereotactic body radiation therapy (SBRT). Methods: A novel approach was proposed to calculate the actual delivered dose distribution for SBRT lung treatment. This approach can be specified in three steps. (1) At the treatment planning stage, a patient-specific motion model is created from planning 4DCT data. This model assumes that the displacement vector field (DVF) of any respiratory motion deformation can be described as a linear combination of some basis DVFs. (2) During the treatment procedure, 2D time-varying projection images (either kV or MV projections) are acquired, from which time-varying “fluoroscopic” 3D images of the patient are reconstructed using the motion model. The DVF of each timepoint in the time-varying reconstruction is an optimized linear combination of basis DVFs such that the 2D projection of the 3D volume at this timepoint matches the projection image. (3) 3D dose distribution is computed for each timepoint in the set of 3D reconstructed fluoroscopic images, from which the total effective 3D delivered dose is calculated by accumulating deformed dose distributions. This approach was first validated using two modified digital extended cardio-torso (XCAT) phantoms with lung tumors and different respiratory motions. The estimated doses were compared to the dose that would be calculated for routine 4DCT-based planning and to the actual delivered dose that was calculated using “ground truth” XCAT phantoms at all timepoints. The approach was also tested using one set of patient data, which demonstrated the application of our method in a clinical scenario. 
Results: For the first XCAT phantom, which has a mostly regular breathing pattern, the errors in 95% volume dose (D95) are 0.11% and 0.83%, respectively, for 3D fluoroscopic images reconstructed from kV and MV projections compared to the ground truth, which is clinically comparable to 4DCT (0.093%). For the second XCAT phantom, which has an irregular breathing pattern, the errors are 0.81% and 1.75% for kV and MV reconstructions, both of which are better than that of 4DCT (4.01%). In the case of the real patient, although it is impossible to obtain the actual delivered dose, the dose estimation is clinically reasonable and demonstrates differences between 4DCT and MV reconstruction-based dose estimates. Conclusions: With the availability of kV or MV projection images, the proposed approach is able to assess delivered doses for all respiratory phases during treatment. Compared to the planning dose based on 4DCT, the dose estimation using reconstructed 3D fluoroscopic images was as accurate as 4DCT for the regular respiratory pattern and more accurate for the irregular respiratory pattern. PMID:26127043
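    The core of step (2) can be sketched compactly: if every candidate DVF is a weighted sum of basis DVFs, and the 2D projections of the basis-deformed volumes have been precomputed, then matching a measured projection reduces to linear least squares for the weights. The Python sketch below illustrates this under those simplifying assumptions; all arrays are synthetic placeholders, and the actual method optimizes against a full forward projector rather than a precomputed linearization.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical basis DVFs on a small 3-D grid: (n_basis, z, y, x, 3).
basis_dvfs = rng.normal(size=(3, 4, 4, 4, 3))

def dvf_from_weights(weights, basis):
    """Step 1: any respiratory DVF is a linear combination of basis DVFs."""
    return np.tensordot(weights, basis, axes=1)

# Step 2 (simplified): with precomputed projections p_k of the volume
# deformed by each basis DVF, matching a measured 2-D projection becomes
# a linear least-squares problem for the weights.
basis_projs = rng.normal(size=(3, 16))      # hypothetical, flattened images
true_w = np.array([0.5, -0.2, 1.0])
measured_proj = basis_projs.T @ true_w      # synthetic "kV image"

w, *_ = np.linalg.lstsq(basis_projs.T, measured_proj, rcond=None)
dvf = dvf_from_weights(w, basis_dvfs)       # DVF for this timepoint
```

In step (3), the per-timepoint DVFs recovered this way would be used to deform and accumulate the per-timepoint dose distributions.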

  11. 3D delivered dose assessment using a 4DCT-based motion model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cai, Weixing; Hurwitz, Martina H.; Williams, Christopher L.

    Purpose: The purpose of this work is to develop a clinically feasible method of calculating actual delivered dose distributions for patients who have significant respiratory motion during the course of stereotactic body radiation therapy (SBRT). Methods: A novel approach was proposed to calculate the actual delivered dose distribution for SBRT lung treatment. This approach can be specified in three steps. (1) At the treatment planning stage, a patient-specific motion model is created from planning 4DCT data. This model assumes that the displacement vector field (DVF) of any respiratory motion deformation can be described as a linear combination of some basis DVFs. (2) During the treatment procedure, 2D time-varying projection images (either kV or MV projections) are acquired, from which time-varying “fluoroscopic” 3D images of the patient are reconstructed using the motion model. The DVF of each timepoint in the time-varying reconstruction is an optimized linear combination of basis DVFs such that the 2D projection of the 3D volume at this timepoint matches the projection image. (3) 3D dose distribution is computed for each timepoint in the set of 3D reconstructed fluoroscopic images, from which the total effective 3D delivered dose is calculated by accumulating deformed dose distributions. This approach was first validated using two modified digital extended cardio-torso (XCAT) phantoms with lung tumors and different respiratory motions. The estimated doses were compared to the dose that would be calculated for routine 4DCT-based planning and to the actual delivered dose that was calculated using “ground truth” XCAT phantoms at all timepoints. The approach was also tested using one set of patient data, which demonstrated the application of our method in a clinical scenario. 
Results: For the first XCAT phantom, which has a mostly regular breathing pattern, the errors in 95% volume dose (D95) are 0.11% and 0.83%, respectively, for 3D fluoroscopic images reconstructed from kV and MV projections compared to the ground truth, which is clinically comparable to 4DCT (0.093%). For the second XCAT phantom, which has an irregular breathing pattern, the errors are 0.81% and 1.75% for kV and MV reconstructions, both of which are better than that of 4DCT (4.01%). In the case of the real patient, although it is impossible to obtain the actual delivered dose, the dose estimation is clinically reasonable and demonstrates differences between 4DCT and MV reconstruction-based dose estimates. Conclusions: With the availability of kV or MV projection images, the proposed approach is able to assess delivered doses for all respiratory phases during treatment. Compared to the planning dose based on 4DCT, the dose estimation using reconstructed 3D fluoroscopic images was as accurate as 4DCT for the regular respiratory pattern and more accurate for the irregular respiratory pattern.

  12. Low-delay predictive audio coding for the HIVITS HDTV codec

    NASA Astrophysics Data System (ADS)

    McParland, A. K.; Gilchrist, N. H. C.

    1995-01-01

    The status of work relating to predictive audio coding, as part of the European project on High Quality Video Telephone and HD(TV) Systems (HIVITS), is reported. The predictive coding algorithm is developed, along with six-channel audio coding and decoding hardware. Demonstrations of the audio codec operating in conjunction with the video codec are given.

  13. Code CUGEL: A code to unfold Ge(Li) spectrometer polyenergetic gamma photon experimental distributions

    NASA Technical Reports Server (NTRS)

    Steyn, J. J.; Born, U.

    1970-01-01

    A FORTRAN code was developed for the Univac 1108 digital computer to unfold polyenergetic gamma photon experimental distributions from lithium-drifted germanium [Ge(Li)] semiconductor spectrometers. It was designed to analyze the combined continuous and monoenergetic gamma radiation field of radioisotope volumetric sources. The code generates the detector system response matrix function and applies it to monoenergetic spectral components discretely and to the continuum iteratively. It corrects for system drift, source decay, background, and detection efficiency. Results are presented in digital form for differential and integrated photon number and energy distributions, and for exposure dose.
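    The unfolding strategy (apply the response matrix to monoenergetic components discretely and to the continuum iteratively) can be illustrated with a generic fixed-point iteration. This is a minimal sketch with an invented 4-bin response matrix, not CUGEL's actual response function or algorithm.

```python
import numpy as np

# Hypothetical 4-bin detector response matrix (rows: measured channel,
# columns: true photon energy); NOT the Ge(Li) response used by CUGEL.
R = np.array([[0.8, 0.1, 0.0, 0.0],
              [0.2, 0.7, 0.1, 0.0],
              [0.0, 0.2, 0.8, 0.1],
              [0.0, 0.0, 0.1, 0.9]])
true_spectrum = np.array([100.0, 50.0, 20.0, 5.0])
measured = R @ true_spectrum            # folded (detected) distribution

# Generic fixed-point (van Cittert style) unfolding iteration: repeatedly
# add the residual between the measurement and the re-folded estimate.
estimate = measured.copy()
for _ in range(200):
    estimate = estimate + (measured - R @ estimate)
    estimate = np.clip(estimate, 0.0, None)   # keep counts non-negative
```

For this diagonally dominant toy matrix the iteration converges to the true spectrum; a real unfolding must also handle noise and the ill-conditioning of measured responses.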

  14. Total Ionizing Dose Test Report BFR92A NPN 5 GHz Wide Band Transistor from NXP

    NASA Technical Reports Server (NTRS)

    Phan, Anthony M.; Oldham, Timothy R.

    2011-01-01

    The purpose of this test was to characterize the Philips/NXP BFR92A NPN 5 gigahertz wide band silicon transistor for total dose response. This test serves as the radiation lot acceptance test (RLAT) for lot date code (LDC) 1027. The BFR92A is packaged in a 3-pin plastic SOT23 package. Low dose rate (LDR/ELDRS) irradiations were performed.

  15. Maxine: A spreadsheet for estimating dose from chronic atmospheric radioactive releases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jannik, Tim; Bell, Evaleigh; Dixon, Kenneth

    MAXINE is an EXCEL© spreadsheet that is used to estimate dose to individuals for routine and accidental atmospheric releases of radioactive materials. MAXINE does not contain an atmospheric dispersion model; rather, doses are estimated using air and ground concentrations as input. Minimal input is required to run the program, and site-specific parameters are used when possible. A complete code description, verification of models, and a user’s manual are included.
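    The kind of calculation such a spreadsheet automates can be sketched as a product of concentrations, exposure parameters, and dose coefficients. All values below are illustrative placeholders, not MAXINE's site-specific data or actual dose coefficients.

```python
# Hedged sketch: dose from time-integrated air concentration (inhalation)
# plus ground deposition (groundshine).  Every number is a placeholder.
air_conc = 2.0e3        # Bq s / m^3, time-integrated air concentration
ground_conc = 5.0e2     # Bq / m^2, ground deposition
breathing_rate = 2.6e-4        # m^3 / s, assumed adult breathing rate
dcf_inhalation = 7.3e-9        # Sv / Bq inhaled (placeholder coefficient)
dcf_groundshine = 1.0e-15      # (Sv/s) per (Bq/m^2) (placeholder)
exposure_time = 3.15e7         # s, roughly one year

dose = (air_conc * breathing_rate * dcf_inhalation
        + ground_conc * dcf_groundshine * exposure_time)   # Sv
```

A real assessment would sum such terms over every radionuclide and pathway, using vetted dose conversion factors.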

  16. Comparison of forward- and back-projection in vivo EPID dosimetry for VMAT treatment of the prostate

    NASA Astrophysics Data System (ADS)

    Bedford, James L.; Hanson, Ian M.; Hansen, Vibeke N.

    2018-01-01

    In the forward-projection method of portal dosimetry for volumetric modulated arc therapy (VMAT), the integrated signal at the electronic portal imaging device (EPID) is predicted at the time of treatment planning, against which the measured integrated image is compared. In the back-projection method, the measured signal at each gantry angle is back-projected through the patient CT scan to give a measure of total dose to the patient. This study aims to investigate the practical agreement between the two types of EPID dosimetry for prostate radiotherapy. The AutoBeam treatment planning system produced VMAT plans together with corresponding predicted portal images, and a total of 46 sets of gantry-resolved portal images were acquired in 13 patients using an iViewGT portal imager. For the forward-projection method, each acquisition of gantry-resolved images was combined into a single integrated image and compared with the predicted image. For the back-projection method, iViewDose was used to calculate the dose distribution in the patient for comparison with the planned dose. A gamma index for 3% and 3 mm was used for both methods. The results were investigated by delivering the same plans to a phantom and repeating some of the deliveries with deliberately introduced errors. The strongest agreement between forward- and back-projection methods is seen in the isocentric intensity/dose difference, with moderate agreement in the mean gamma. The strongest correlation is observed within a given patient, with less correlation between patients, the latter representing the accuracy of prediction of the two methods. The error study shows that each of the two methods has its own distinct sensitivity to errors, but that overall the response is similar. The forward- and back-projection EPID dosimetry methods show moderate agreement in this series of prostate VMAT patients, indicating that both methods can contribute to the verification of dose delivered to the patient.
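    Both methods are scored with a gamma index using 3% dose-difference and 3 mm distance-to-agreement criteria. A minimal 1D global-gamma sketch (not the iViewDose or AutoBeam implementation) shows how the two criteria combine:

```python
import numpy as np

def gamma_1d(ref, eval_, dx, dose_tol=0.03, dist_tol=3.0):
    """Simplified 1-D global gamma index (3%/3 mm by default).

    ref, eval_: dose profiles on the same grid with spacing dx (mm).
    Returns the gamma value at each reference point; gamma <= 1 passes.
    """
    x = np.arange(len(ref)) * dx
    d_norm = dose_tol * ref.max()            # global dose normalization
    gammas = np.empty(len(ref))
    for i, (xi, di) in enumerate(zip(x, ref)):
        dist2 = ((x - xi) / dist_tol) ** 2   # squared distance criterion
        dose2 = ((eval_ - di) / d_norm) ** 2 # squared dose criterion
        gammas[i] = np.sqrt((dist2 + dose2).min())
    return gammas

# A ~1 mm spatial shift of a smooth profile passes a 3%/3 mm test.
ref = np.exp(-((np.arange(50) - 25.0) ** 2) / 100.0)
shifted = np.roll(ref, 1)
g = gamma_1d(ref, shifted, dx=1.0)
```

Clinical implementations add interpolation between grid points and 2D/3D search, but the pass/fail logic is the same minimum over combined criteria.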

  17. 76 FR 80907 - TRICARE Prime Urgent Care Demonstration Project

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-27

    ... DEPARTMENT OF DEFENSE Office of the Secretary TRICARE Prime Urgent Care Demonstration Project....S. Code, section 1092, entitled Department Of Defense TRICARE Prime Urgent Care Demonstration Project. The demonstration project is intended to test whether allowing four visits to an urgent care...

  18. Digital tomosynthesis mammography using a parallel maximum-likelihood reconstruction method

    NASA Astrophysics Data System (ADS)

    Wu, Tao; Zhang, Juemin; Moore, Richard; Rafferty, Elizabeth; Kopans, Daniel; Meleis, Waleed; Kaeli, David

    2004-05-01

    A parallel reconstruction method, based on an iterative maximum likelihood (ML) algorithm, is developed to provide fast reconstruction for digital tomosynthesis mammography. Tomosynthesis mammography acquires 11 low-dose projections of a breast by moving an x-ray tube over a 50° angular range. In parallel reconstruction, each projection is divided into multiple segments along the chest-to-nipple direction. Using the 11 projections, segments located at the same distance from the chest wall are combined to compute a partial reconstruction of the total breast volume. The shape of the partial reconstruction forms a thin slab, angled toward the x-ray source at a projection angle of 0°. The reconstruction of the total breast volume is obtained by merging the partial reconstructions. The overlap region between neighboring partial reconstructions and neighboring projection segments is utilized to compensate for the incomplete data at the boundary locations present in the partial reconstructions. A serial execution of the reconstruction is compared to a parallel implementation, using clinical data. The serial code was run on a PC with a single Pentium IV 2.2 GHz CPU. The parallel implementation was developed using MPI and run on a 64-node Linux cluster using 800 MHz Itanium CPUs. The serial reconstruction for a medium-sized breast (5 cm thickness, 11 cm chest-to-nipple distance) takes 115 minutes, while a parallel implementation takes only 3.5 minutes. The reconstruction time for a larger breast using a serial implementation takes 187 minutes, while a parallel implementation takes 6.5 minutes. No significant differences were observed between the reconstructions produced by the serial and parallel implementations.
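    The iterative ML step that underlies the reconstruction (independent of the parallel segmentation scheme) can be sketched with the classic MLEM multiplicative update. The system matrix below is a random toy stand-in for the 11-projection tomosynthesis geometry, not clinical data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy nonnegative system matrix: 11 projection measurements, 8 voxels.
A = rng.uniform(0.1, 1.0, size=(11, 8))
truth = rng.uniform(1.0, 5.0, size=8)
data = A @ truth                        # noiseless synthetic projections

# Classic MLEM multiplicative update:  x <- x * A^T(d / Ax) / (A^T 1)
x = np.ones(8)                          # nonnegative starting estimate
sensitivity = A.T @ np.ones(11)
for _ in range(2000):
    x *= (A.T @ (data / (A @ x))) / sensitivity
```

The multiplicative form keeps the estimate nonnegative at every iteration, which is one reason MLEM-style updates are popular for emission and tomosynthesis reconstruction.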

  19. Heart Pump Design for Cleveland Clinic Foundation

    NASA Technical Reports Server (NTRS)

    2005-01-01

    Through a Lewis CommTech Program project with the Cleveland Clinic Foundation, the NASA Lewis Research Center is playing a key role in the design and development of a permanently implantable, artificial heart pump assist device. Known as the Innovative Ventricular Assist System (IVAS), this device will take on the pumping role of the damaged left ventricle of the heart. The key part of the IVAS is a nonpulsatile (continuous flow) artificial heart pump with centrifugal impeller blades, driven by an electric motor. Lewis is part of an industry and academia team, led by the Ohio Aerospace Institute (OAI), that is working with the Cleveland Clinic Foundation to make IVAS a reality. This device has the potential to save tens of thousands of lives each year, since 80 percent of heart attack victims suffer irreversible damage to the left ventricle, the part of the heart that does most of the pumping. Impeller blade design codes and flow-modeling analytical codes will be used in the project. These codes were developed at Lewis for the aerospace industry but will be applicable to the IVAS design project. The analytical codes, which currently simulate the flow through the compressor and pump systems, will be used to simulate the flow within the blood pump in the artificial heart assist device. The Interdisciplinary Technology Office heads up Lewis' efforts in the IVAS project. With the aid of numerical modeling, the blood pump will address many design issues, including some fluid-dynamic design considerations that are unique to the properties of blood. Some of the issues that will be addressed in the design process include hemolysis, deposition, recirculation, pump efficiency, rotor thrust balance, and bearing lubrication. Optimum pumping system performance will be achieved by modeling all the interactions between the pump components. 
The interactions can be multidisciplinary and, therefore, are influenced not only by the fluid dynamics of adjacent components but also by thermal and structural effects. Lewis-developed flow-modeling codes to be used in the pump simulations will include a one-dimensional code and an incompressible three-dimensional Navier-Stokes flow code. These codes will analyze the prototype pump designed by the Cleveland Clinic Foundation. With an improved understanding of the flow phenomena within the prototype pump, design changes to improve the performance of the pump system can be verified by computer prior to fabrication in order to reduce risks. The use of Lewis flow modeling codes during the design and development process will improve pump system performance and reduce the number of prototypes built in the development phase. The first phase of the IVAS project is to fully develop the prototype in a laboratory environment that uses a water/glycerin mixture as the surrogate fluid to simulate blood. A later phase of the project will include testing in animals for final validation. Lewis will be involved in the IVAS project for 3 to 5 years.

  20. Micron MT29F128G08AJAAA 128GB Asynchronous Flash Memory Total Ionizing Dose Characterization Test Report

    NASA Technical Reports Server (NTRS)

    Campola, Michael; Wyrwas, Edward

    2017-01-01

    The purpose of this test was to characterize the Micron MT29F128G08AJAAA's parameter degradation for total dose response and to evaluate and compare lot date codes for sensitivity. In the test, the device was exposed to both low dose rate and high dose rate (HDR) irradiations using gamma radiation. Device parameters such as leakage currents, the quantity of upset bits, and overall chip and die health were investigated to determine which lot is more robust.

  1. SU-E-T-169: Evaluation of Oncentra TPS for Nasopharynx Brachy Using Patient Specific Voxel Phantom and EGSnrc

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hadad, K; Zoherhvand, M; Faghihi, R

    2014-06-01

    Purpose: Nasopharyngeal carcinoma (NPC) treatment is being carried out using Ir-192 HDR seeds in Mehdieh Hospital in Hamadan, Iran. The Oncentra™ TPS is based on the optimized TG-43 formalism, which disregards heterogeneity in the treatment area. Due to the abundant heterogeneity in the head and neck, comparison of the Oncentra™ TPS dose evaluation against an accurate dose calculation method in NPC brachytherapy is the objective of this study. Methods: CT DICOMs of a patient with NPC obtained from Mehdieh Hospital were used to create a 3D voxel phantom with the CTCREATE utility of the EGSnrc code package. The voxel phantom together with the Ir-192 HDR brachytherapy source were the input to DOSXYZnrc to calculate the 3D dose distribution. The sources were incorporated as source type 6 in DOSXYZnrc, and their dwell times were taken into account in the final dose calculations. Results: Direct comparisons between the isodoses as well as the DVHs for the GTV, PTV and CTV obtained by Oncentra™ and the EGSnrc Monte Carlo code are made. EGSnrc results are obtained using 5×10^9 histories to reduce the statistical error below 1% in the GTV and 5% in the 5% dose areas. The standard ICRP700 cross section library is employed in the DOSXYZnrc dose calculation. Conclusion: A direct relationship between increased dose differences and increased material density (hence heterogeneity) is observed when the isodose contours of the TPS and DOSXYZnrc are compared. Regarding the point dose calculations, the differences range from 1.2% in the PTV to 5.6% for the cavity region and 7.8% for bone regions. While the Oncentra™ TPS overestimates the dose in cavities, it tends to underestimate dose depositions within bones.

  2. Monte Carlo investigation of backscatter point spread function for x-ray imaging examinations

    NASA Astrophysics Data System (ADS)

    Xiong, Zhenyu; Vijayan, Sarath; Rudin, Stephen; Bednarek, Daniel R.

    2017-03-01

    X-ray imaging examinations, especially complex interventions, may result in relatively high doses to the patient's skin, inducing skin injuries. A method was developed to determine the skin-dose distribution for non-uniform x-ray beams by convolving the backscatter point-spread-function (PSF) with the primary-dose distribution to generate the backscatter distribution that, when added to the primary dose, gives the total-dose distribution. This technique was incorporated in the dose-tracking system (DTS), which provides a real-time color-coded 3D mapping of skin dose during fluoroscopic procedures. The aim of this work is to investigate the variation of the backscatter PSF with different parameters. A backscatter PSF of a 1-mm x-ray beam was generated with the EGSnrc Monte Carlo code for different x-ray beam energies, different soft-tissue thicknesses above bone, different bone thicknesses, and different entrance-beam angles, as well as for different locations on the SK-150 anthropomorphic head phantom. The results show a reduction of the peak scatter-to-primary dose ratio of 48% when the x-ray beam voltage is increased from 40 kV to 120 kV. The backscatter dose was reduced when bone was beneath the soft tissue layer, and this reduction increased with thinner soft tissue and thicker bone layers. The backscatter factor increased about 21% as the angle of incidence of the beam with the entrance surface decreased from 90° (perpendicular) to 30°. The backscatter PSF differed for different locations on the SK-150 phantom by up to 15%. The results of this study can be used to improve the accuracy of dose calculation when using PSF convolution in the DTS.
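    The convolution step described above (total dose equals primary dose plus the PSF convolved with the primary dose) can be sketched directly. The field size, PSF shape, and scatter-to-primary ratio below are illustrative assumptions, not the paper's Monte Carlo PSFs.

```python
import numpy as np
from scipy.signal import fftconvolve

# Hypothetical primary entrance-dose map (arbitrary units, 1 mm grid).
primary = np.zeros((64, 64))
primary[20:44, 20:44] = 1.0                 # uniform 24 mm x 24 mm field

# Hypothetical radially symmetric backscatter PSF for a 1 mm beam,
# normalized so its total equals an assumed scatter-to-primary ratio.
y, x = np.mgrid[-15:16, -15:16]
psf = np.exp(-np.hypot(x, y) / 5.0)
psf *= 0.3 / psf.sum()                      # placeholder ratio of 0.3

# Total dose = primary + backscatter (PSF convolved with primary).
backscatter = fftconvolve(primary, psf, mode="same")
total = primary + backscatter
```

Because the PSF integrates to the scatter-to-primary ratio, the backscatter contribution scales with the field size and shape, which is exactly why a single backscatter factor is inadequate for non-uniform beams.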

  3. Preliminary estimates of nucleon fluxes in a water target exposed to solar-flare protons: BRYNTRN versus Monte Carlo code

    NASA Technical Reports Server (NTRS)

    Shinn, Judy L.; Wilson, John W.; Lone, M. A.; Wong, P. Y.; Costen, Robert C.

    1994-01-01

    A baryon transport code (BRYNTRN) has previously been verified using available Monte Carlo results for a solar-flare spectrum as the reference. Excellent results were obtained, but the comparisons were limited to the available data on dose and dose equivalent for moderate penetration studies that involve minor contributions from secondary neutrons. To further verify the code, the secondary energy spectra of protons and neutrons are calculated using BRYNTRN and LAHET (Los Alamos High-Energy Transport code, which is a Monte Carlo code). These calculations are compared for three locations within a water slab exposed to the February 1956 solar-proton spectrum. Reasonable agreement was obtained when various considerations related to the calculational techniques and their limitations were taken into account. Although the Monte Carlo results are preliminary, it appears that the neutron albedo, which is not currently treated in BRYNTRN, might be a cause for the large discrepancy seen at small penetration depths. It also appears that the nonelastic neutron production cross sections in BRYNTRN may underestimate the number of neutrons produced in proton collisions with energies below 200 MeV. The notion that the poor energy resolution in BRYNTRN may cause a large truncation error in neutron elastic scattering requires further study.

  4. MO-F-CAMPUS-I-02: Occupational Conceptus Doses From Fluoroscopically-Guided Interventional Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Damilakis, J; Perisinakis, K; Solomou, G

    Purpose: The aim of this study was to provide dosimetric data on conceptus dose for pregnant employees who participate in fluoroscopically-guided interventional procedures. Methods: Scattered air-kerma dose rates were obtained for 17 fluoroscopic projections involved in interventional procedures. These projections were simulated on an anthropomorphic phantom placed supine on the examination table. The operating theater was divided into two grids relative to the long table sides. Each grid consisted of 33 cells spaced 0.50 m apart. During the simulated exposures, the scatter air-kerma rate was measured at each cell at 110 cm from the floor, i.e. at the height of the waist of a pregnant worker. Air-kerma rates were divided by the dose-area product (DAP) rate of each exposure to obtain normalized data. For each projection, measurements were performed for 3 kVp and 3 filtration values, i.e. for 9 different x-ray spectra. All measurements were performed using a modern C-arm angiographic system (Siemens Axiom Artis, Siemens, Germany) and a radiation meter equipped with an ionization chamber. Results: The results consist of 153 iso-dose maps, which show the spatial distribution of DAP-normalized scattered air-kerma doses at the waist level of a pregnant worker. Conceptus dose estimation is possible using air-kerma to embryo/fetal dose conversion coefficients published in a previous study (J Cardiovasc Electrophysiol, Vol. 16, pp. 1–8, July 2005). Using these maps, occupationally exposed pregnant personnel may select a working position for a certain projection that keeps abdominal dose as low as reasonably achievable. Taking into consideration the regulatory conceptus dose limit for occupational exposure, determination of the maximum workload allowed for pregnant personnel is also possible. 
Conclusion: Data produced in this work allow for the anticipation of conceptus dose and the determination of the maximum workload for a pregnant worker for any fluoroscopically-guided interventional procedure. This study was supported by the Greek Ministry of Education and Religious Affairs, General Secretariat for Research and Technology, Operational Program ‘Education and Lifelong Learning’, ARISTIA (Research project: CONCERT).
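The way such iso-dose maps would be used can be illustrated with a small calculation; all numeric values here are placeholders for illustration, not data from the study:

```python
def conceptus_dose_per_procedure(norm_kerma_uGy_per_Gycm2, dap_Gycm2,
                                 kerma_to_conceptus_dose):
    """Conceptus dose (uGy) for one procedure: the DAP-normalized
    scattered air-kerma at the worker's waist (read off an iso-dose map
    for the chosen standing position), scaled by the procedure DAP and
    by an air-kerma-to-conceptus-dose conversion coefficient."""
    return norm_kerma_uGy_per_Gycm2 * dap_Gycm2 * kerma_to_conceptus_dose

def max_workload(dose_limit_uGy, dose_per_procedure_uGy):
    """Maximum number of procedures keeping the cumulative conceptus
    dose under the regulatory limit."""
    return int(dose_limit_uGy // dose_per_procedure_uGy)

# Hypothetical example: 0.5 uGy per Gy*cm2 at the chosen position,
# 40 Gy*cm2 per procedure, conversion coefficient 0.8
dose = conceptus_dose_per_procedure(0.5, 40.0, 0.8)
n_max = max_workload(1000.0, dose)
```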

  5. A Monte Carlo calculation model of electronic portal imaging device for transit dosimetry through heterogeneous media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoon, Jihyung; Jung, Jae Won, E-mail: jungj@ecu.edu; Kim, Jong Oh

    2016-05-15

    Purpose: To develop and evaluate a fast Monte Carlo (MC) dose calculation model of an electronic portal imaging device (EPID) based on effective atomic number modeling in the XVMC code. Methods: A previously developed EPID model, based on density scaling of EPID structures in the XVMC code, was modified by additionally considering the effective atomic number (Z_eff) of each structure and adopting a phase space file from the EGSnrc code. The model was tested with various homogeneous and heterogeneous phantoms and field sizes by comparing calculations from the model with measurements in the EPID. In order to better evaluate the model, the performance of the XVMC code was separately tested by comparing calculated dose to water with ion chamber (IC) array measurements in the plane of the EPID. Results: In the EPID plane, calculated dose to water from the code agreed with IC measurements within 1.8%, averaged across the in-field regions of the acquired profiles for all field sizes and phantoms. The maximum point difference was 2.8%, affected by the proximity of the maximum points to the penumbra and by MC noise. The EPID model agreed with measured EPID images within 1.3%, with a maximum point difference of 1.9%. This difference is lower than that of the bare code because a calibration, dependent on field size and thickness, was employed for the conversion of calculated images to measured images. Thanks to the Z_eff correction, the EPID model showed a linear trend in the calibration factors, unlike the density-only-scaled model. The phase space file from the EGSnrc code sharpened penumbra profiles significantly, improving agreement of calculated profiles with measured profiles. Conclusions: Demonstrating high accuracy, the EPID model with the associated calibration system may be used for in vivo dosimetry of radiation therapy. 
Through this study, a MC model of the EPID has been developed, and its performance has been rigorously investigated for transit dosimetry.
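The reported linear trend of the calibration factors suggests a simple fitting scheme like the following sketch; the factor values and field sizes below are invented for illustration, not the paper's calibration data:

```python
import numpy as np

def fit_calibration(field_sizes, factors):
    """Fit a linear calibration-factor trend CF(field size), mirroring
    the linear behavior reported for the Z_eff-corrected EPID model.
    Returns (slope, intercept) from a least-squares fit."""
    slope, intercept = np.polyfit(field_sizes, factors, 1)
    return slope, intercept

def calibrate_image(calc_image, field_size, slope, intercept):
    """Convert a calculated EPID image to the measured-image scale
    using the fitted field-size-dependent calibration factor."""
    return calc_image * (slope * field_size + intercept)

# Hypothetical calibration factors measured at 5, 10 and 15 cm fields
slope, intercept = fit_calibration([5, 10, 15], [1.02, 1.04, 1.06])
calibrated = calibrate_image(np.ones((2, 2)), 10, slope, intercept)
```

A linear trend is convenient precisely because it lets the model interpolate calibration factors for field sizes that were never measured.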

  6. Development of a patient-specific dosimetry estimation system in nuclear medicine examination

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, H. H.; Dong, S. L.; Yang, H. J.

    2011-07-01

    The purpose of this study is to develop a patient-specific dosimetry estimation system for nuclear medicine examinations using a SimSET-based Monte Carlo code. We added a dose deposition routine to store the energy deposited by photons during their flights in SimSET and developed a user-friendly interface for reading PET and CT images. Dose calculated on the ORNL phantom was used to validate the accuracy of the system. The S values for 99mTc, 18F and 131I obtained by the system were compared to those from the MCNP4C code and OLINDA. The ratios of S values computed by this system to those obtained with OLINDA for various organs ranged from 0.93 to 1.18, comparable to the range obtained with the MCNP4C code (0.94 to 1.20). The average S-value ratios were 0.99±0.04, 1.03±0.05, and 1.00±0.07 for the isotopes 131I, 18F, and 99mTc, respectively. SimSET ran about two times faster than MCNP4C for the various isotopes. A 3D dose calculation was also performed on a patient data set from a PET/CT examination using this system. Results from the patient data showed that the estimated S values from this system differed only slightly from those of OLINDA for the ORNL phantom. In conclusion, this system can generate patient-specific dose distributions and display the isodose curves on top of the anatomic structure through a friendly graphical user interface. It may also provide a useful tool to establish an appropriate dose-reduction strategy for patients in nuclear medicine environments. (authors)
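The S-value comparison above follows the MIRD formalism, in which organ dose is the product of cumulated activity and the S value; a minimal sketch with made-up numbers:

```python
def absorbed_dose(cumulated_activity_MBq_s, s_value_mGy_per_MBq_s):
    """MIRD-style organ dose: D = cumulated activity x S value.
    Units: MBq*s times mGy/(MBq*s) gives mGy."""
    return cumulated_activity_MBq_s * s_value_mGy_per_MBq_s

def s_value_ratio(s_system, s_reference):
    """Ratio used above to benchmark the SimSET-based system against
    OLINDA (or MCNP4C) S values; 1.0 means perfect agreement."""
    return s_system / s_reference

# Hypothetical: 2000 MBq*s cumulated activity, S = 1.5e-4 mGy/(MBq*s)
dose_mGy = absorbed_dose(2000.0, 1.5e-4)
ratio = s_value_ratio(1.18, 1.0)
```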

  7. Parallel Scaling Characteristics of Selected NERSC User Project Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skinner, David; Verdier, Francesca; Anand, Harsh

    This report documents parallel scaling characteristics of NERSC user project codes between Fiscal Year 2003 and the first half of Fiscal Year 2004 (Oct 2002-March 2004). The codes analyzed cover 60% of all the CPU hours delivered during that time frame on seaborg, a 6080-CPU IBM SP and the largest parallel computer at NERSC. The scale of the workload, in terms of concurrency and problem size, is analyzed. Drawing on batch queue logs, performance data and feedback from researchers, we detail the motivations, benefits, and challenges of implementing highly parallel scientific codes on current NERSC High Performance Computing systems. An evaluation and outlook of the NERSC workload for Allocation Year 2005 is presented.
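Parallel scaling of the kind analyzed in such reports is usually summarized by speedup and efficiency; a minimal sketch (these are the standard definitions, not code from the report):

```python
def speedup(t_serial, t_parallel):
    """Classic speedup: serial runtime over parallel runtime."""
    return t_serial / t_parallel

def parallel_efficiency(t_serial, t_parallel, n_procs):
    """Fraction of ideal scaling achieved on n_procs CPUs:
    1.0 means perfect linear scaling."""
    return speedup(t_serial, t_parallel) / n_procs

# Hypothetical: a job taking 100 h serially finishes in 10 h on 16 CPUs
s = speedup(100.0, 10.0)          # 10x
e = parallel_efficiency(100.0, 10.0, 16)  # 0.625
```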

  8. Study of the impact of artificial articulations on the dose distribution under medical irradiation

    NASA Astrophysics Data System (ADS)

    Buffard, E.; Gschwind, R.; Makovicka, L.; Martin, E.; Meunier, C.; David, C.

    2005-02-01

    Perturbations due to the presence of high density heterogeneities in the body are not correctly taken into account in the Treatment Planning Systems currently available for external radiotherapy. For this reason, the accuracy of the dose distribution calculations has to be improved by using Monte Carlo simulations. In a previous study, we established a theoretical model by using the Monte Carlo code EGSnrc [I. Kawrakow, D.W.O. Rogers, The EGSnrc code system: MC simulation of electron and photon transport. Technical Report PIRS-701, NRCC, Ottawa, Canada, 2000] in order to obtain the dose distributions around simple heterogeneities. These simulations were then validated by experimental results obtained with thermoluminescent dosemeters and an ionisation chamber. The influence of samples composed of hip prostheses materials (titanium alloy and steel) and a substitute of bone were notably studied. A more complex model was then developed with the Monte Carlo code BEAMnrc [D.W.O. Rogers, C.M. MA, G.X. Ding, B. Walters, D. Sheikh-Bagheri, G.G. Zhang, BEAMnrc Users Manual. NRC Report PPIRS 509(a) rev F, 2001] in order to take into account the hip prosthesis geometry. The simulation results were compared to experimental measurements performed in a water phantom, in the case of a standard treatment of a pelvic cancer for one of the beams passing through the implant. These results have shown the great influence of the prostheses on the dose distribution.

  9. Simulation of the Formation of DNA Double Strand Breaks and Chromosome Aberrations in Irradiated Cells

    NASA Technical Reports Server (NTRS)

    Plante, Ianik; Ponomarev, Artem L.; Wu, Honglu; Blattnig, Steve; George, Kerry

    2014-01-01

    The formation of DNA double-strand breaks (DSBs) and chromosome aberrations is an important consequence of ionizing radiation. To simulate DNA double-strand breaks and the formation of chromosome aberrations, we have recently merged the codes RITRACKS (Relativistic Ion Tracks) and NASARTI (NASA Radiation Track Image). The program RITRACKS is a stochastic code developed to simulate detailed event-by-event radiation track structure: [1] This code is used to calculate the dose in voxels of 20 nm, in a volume containing simulated chromosomes, [2] The number of tracks in the volume is calculated for each simulation by sampling a Poisson distribution, with the distribution parameter obtained from the irradiation dose, ion type and energy. The program NASARTI generates the chromosomes present in a cell nucleus by random walks of 20 nm, corresponding to the size of the dose voxels, [3] The generated chromosomes are located within domains which may intertwine, and [4] Each segment of the random walks corresponds to approx. 2,000 DNA base pairs. NASARTI uses the pre-calculated dose at each voxel to calculate the probability of DNA damage at each random-walk segment. Using the locations of the double-strand breaks, possible rejoining between damaged segments is evaluated. This yields various types of chromosome aberrations, including deletions, inversions, exchanges, etc. By performing the calculations for various types of radiation, it will be possible to obtain relative biological effectiveness (RBE) values for several types of chromosome aberrations.
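The Poisson sampling step can be sketched as follows, using the standard fluence-dose-LET relation D[Gy] = 1.602e-9 × LET[keV/µm] × Φ[cm⁻²] / ρ[g/cm³]; the LET, area and seeding below are illustrative, not RITRACKS/NASARTI internals:

```python
import numpy as np

def mean_track_count(dose_Gy, let_keV_um, area_cm2, density_g_cm3=1.0):
    """Mean number of tracks crossing an area for a given dose, from
    the standard relation D = 1.602e-9 * LET * fluence / density."""
    fluence_cm2 = dose_Gy * density_g_cm3 / (1.602e-9 * let_keV_um)
    return fluence_cm2 * area_cm2

def sample_track_count(dose_Gy, let_keV_um, area_cm2, rng=None):
    """Sample the per-simulation track number from a Poisson
    distribution, as done when merging RITRACKS and NASARTI."""
    rng = rng or np.random.default_rng(0)
    return rng.poisson(mean_track_count(dose_Gy, let_keV_um, area_cm2))

# 1 Gy of 100 keV/um ions through a (10 um)^2 = 1e-6 cm^2 cross-section
mean_n = mean_track_count(1.0, 100.0, 1e-6)   # about 6 tracks on average
n_tracks = sample_track_count(1.0, 100.0, 1e-6)
```

Sampling (rather than fixing) the track number reproduces the shot-noise fluctuations in energy deposition that drive the stochastic nature of aberration yields.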

  10. ZEPrompt: An Algorithm for Rapid Estimation of Building Attenuation for Prompt Radiation from a Nuclear Detonation

    DTIC Science & Technology

    2014-01-01

    …and 50 kT, to within 30% of the first-principles code (MCNP) for complicated cities and 10% for simpler cities. [Remainder of record is report-form and table-of-contents residue; recoverable section titles: Use of MCNP for Dose Calculations; MCNP Open-Field Absorbed Dose Calculations; The MCNP Urban Model. Subject terms: radiation transport.]

  11. Challenges in validating the sterilisation dose for processed human amniotic membranes

    NASA Astrophysics Data System (ADS)

    Yusof, Norimah; Hassan, Asnah; Firdaus Abd Rahman, M. N.; Hamid, Suzina A.

    2007-11-01

    Most of the tissue banks in the Asia Pacific region have been using ionising radiation at 25 kGy to sterilise human tissues for safe clinical usage. Under the tissue banking quality system, any dose employed for sterilisation has to be validated, and the validation exercise has to be part of the quality documentation. Tissue grafts, unlike medical products, are not produced in large numbers per processing batch, and each batch of tissues has a relatively different microbial population. A Code of Practice established by the International Atomic Energy Agency (IAEA) in 2004 offers several validation methods using a smaller number of samples than ISO 11137 (1995), which is meant for medical products. The methods emphasise bioburden determination, followed by a sterility test on samples after they have been exposed to a verification dose corresponding to a sterility assurance level (SAL) of 10^-1. This paper describes our experience in using the IAEA Code of Practice to conduct the validation exercise substantiating 25 kGy as the sterilisation dose for both air-dried amnion and amnion preserved in 99% glycerol.
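For intuition only, a single-D10 inactivation model relates bioburden, dose and SAL; note that the IAEA Code of Practice and ISO 11137 use tabulated verification doses derived from a standard distribution of microbial resistances, not this simplified formula:

```python
import math

def dose_for_sal(bioburden, d10_kGy, sal):
    """Simplified single-species inactivation model (NOT the IAEA table
    method): dose = D10 * log10(N0 / SAL), where D10 is the dose that
    reduces the surviving population tenfold and N0 is the bioburden."""
    return d10_kGy * math.log10(bioburden / sal)

# Hypothetical: bioburden of 100 CFU, D10 = 2 kGy
verification_dose = dose_for_sal(100.0, 2.0, 0.1)   # dose to reach SAL 10^-1
sterilisation_dose = dose_for_sal(100.0, 2.0, 1e-6) # dose to reach SAL 10^-6
```

The model makes the logic of the verification-dose exercise visible: passing the sterility test at the small dose for SAL 10^-1 supports extrapolating the same inactivation behavior to the full sterilisation dose.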

  12. Mars surface radiation exposure for solar maximum conditions and 1989 solar proton events

    NASA Technical Reports Server (NTRS)

    Simonsen, Lisa C.; Nealy, John E.

    1992-01-01

    The Langley heavy-ion/nucleon transport code, HZETRN, and the high-energy nucleon transport code, BRYNTRN, are used to predict the propagation of galactic cosmic rays (GCR's) and solar flare protons through the carbon dioxide atmosphere of Mars. Particle fluences and the resulting doses are estimated on the surface of Mars for GCR's during solar maximum conditions and the Aug., Sep., and Oct. 1989 solar proton events. These results extend previously calculated surface estimates for GCR's at solar minimum conditions and the Feb. 1956, Nov. 1960, and Aug. 1972 solar proton events. Surface doses are estimated with both a low-density and a high-density carbon dioxide model of the atmosphere for altitudes of 0, 4, 8, and 12 km above the surface. A solar modulation function is incorporated to estimate the GCR dose variation between solar minimum and maximum conditions over the 11-year solar cycle. By using current Mars mission scenarios, doses to the skin, eye, and blood-forming organs are predicted for short- and long-duration stay times on the Martian surface throughout the solar cycle.

  13. Multi-resolution voxel phantom modeling: a high-resolution eye model for computational dosimetry

    NASA Astrophysics Data System (ADS)

    Caracappa, Peter F.; Rhodes, Ashley; Fiedler, Derek

    2014-09-01

    Voxel models of the human body are commonly used for simulating radiation dose with Monte Carlo radiation transport codes. Due to memory limitations, the voxel resolution of these computational phantoms is typically too coarse to accurately represent the dimensions of small features such as the eye. Recently reduced recommended dose limits for the lens of the eye, a radiosensitive tissue with significant concern for cataract formation, have lent increased importance to understanding the dose to this tissue. A high-resolution eye model is constructed using physiological data for the dimensions of the radiosensitive tissues, and combined with an existing set of whole-body models to form a multi-resolution voxel phantom, which is used with the MCNPX code to calculate radiation dose from various exposure types. This phantom provides an accurate representation of the radiation transport through the structures of the eye. Two alternate methods of including a high-resolution eye model within an existing whole-body model are developed. The accuracy and performance of each method is compared against existing computational phantoms.
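One of several possible ways to combine a coarse whole-body grid with a high-resolution eye sub-grid is a two-level lookup; all names, shapes and pitches below are illustrative, not the paper's method:

```python
import numpy as np

def make_phantom_lookup(coarse, coarse_mm, fine, fine_mm, fine_origin_mm):
    """Sketch of a two-resolution phantom lookup: map a physical point
    (mm) to a tissue ID, preferring the high-resolution sub-grid (e.g.
    the eye region) wherever it exists, else falling back to the
    coarse whole-body grid."""
    fine_extent = np.array(fine.shape) * fine_mm
    origin = np.asarray(fine_origin_mm, dtype=float)

    def tissue_at(point_mm):
        p = np.asarray(point_mm, dtype=float)
        local = p - origin
        if np.all(local >= 0) and np.all(local < fine_extent):
            idx = (local // fine_mm).astype(int)   # inside refined region
            return fine[tuple(idx)]
        idx = (p // coarse_mm).astype(int)         # fall back to coarse grid
        return coarse[tuple(idx)]

    return tissue_at

# 100 mm cube of 10 mm voxels, with a 5 mm "eye" refined at 1 mm pitch
coarse = np.zeros((10, 10, 10), dtype=int)
fine = np.ones((5, 5, 5), dtype=int)
lookup = make_phantom_lookup(coarse, 10.0, fine, 1.0, (20.0, 20.0, 20.0))
```

This keeps the memory cost of fine voxels confined to the small region that needs them, which is the core trade-off motivating multi-resolution phantoms.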

  14. Microscale synthesis and characterization of polystyrene: NSF-POLYED scholars project

    NASA Technical Reports Server (NTRS)

    Quaal, Karen S.; Wu, Chang-Ning

    1994-01-01

    Polystyrene is a familiar polymer with many commercial uses. Its applications range from the clear, high-index-of-refraction, brittle plastic used to form audio cassette and CD cases to the foamed material used in insulated drink cups and packaging material. Polystyrene constitutes 11 percent of the plastics used in packaging, with only High Density Polyethylene (HDPE) and Low Density Polyethylene (LDPE) contributing a larger share. So much polystyrene is used today that it is one of the six common plastics to which manufacturers have assigned a recycling identification code; polystyrene's code is 6 (PS). During the summer and fall of 1992, several new polymeric experiments were developed by the NSF POLYED Scholars for introduction into the chemistry core curriculum. In this presentation, one such project is discussed. This laboratory project is recommended for a first- or second-year laboratory course, allowing the introduction of polymer science to undergraduates at the earliest opportunity. The reliability of the experiments that make up this project, and the recognition factor of polystyrene, a material we come in contact with every day, make the synthesis and characterization of polystyrene a good choice for introducing polymerization to undergraduates. This laboratory project appeals to the varied interests of students enrolled in the typical first-year chemistry course and is an ideal way to introduce polymers to a wide variety of science and engineering students.

  15. In Vivo Differentiation of Uric Acid Versus Non-Uric Acid Urinary Calculi With Third-Generation Dual-Source Dual-Energy CT at Reduced Radiation Dose.

    PubMed

    Franken, Axelle; Gevenois, Pierre Alain; Muylem, Alain Van; Howarth, Nigel; Keyzer, Caroline

    2018-02-01

    The objective of our study was to evaluate in vivo urinary calculus characterization with third-generation dual-source dual-energy CT (DECT) at reduced versus standard radiation dose. One hundred fifty-three patients requiring unenhanced CT for suspected or known urolithiasis were prospectively included in our study. They underwent two acquisitions: reduced-dose CT (90 kV and 50 mAs_ref; Sn150 kV and 31 mAs_ref, where Sn denotes the interposition of a tin filter in the high-energy beam) and standard-dose CT (90 kV and 50 mAs_ref; Sn150 kV and 94 mAs_ref). One radiologist interpreted the reduced-dose examinations before the standard-dose examinations during the same session. Among 103 patients (23 women, 80 men; mean age ± SD, 50 ± 15 years; age range, 18-82 years) with urolithiasis, dedicated DECT software measured the maximal diameter and CT numbers, calculated the DECT number ratio, and labeled with a color code each calculus visualized by the radiologist as uric acid (UA) or non-UA. The volume CT dose index (CTDI_vol) and dose-length product (DLP) were recorded. The radiologist visualized 279 calculi on standard-dose CT and 262 on reduced-dose CT; the 17 calculi missed on reduced-dose CT were all ≤ 3 mm. Among the 262 calculi visualized at both doses, the CT number ratio was obtained with the software for 227 calculi and was not different between the doses (p = 0.093). Among these 262 calculi, 197 were labeled at both doses; 194 of the 197 labeled calculi were labeled with the same color code. Among the 65 remaining calculi, 48 and 61 (all ≤ 5 mm) were not labeled at standard-dose and reduced-dose CT, respectively (p = 0.005). At reduced-dose CT, the mean CTDI_vol was 2.67 mGy and the mean DLP was 102.2 mGy × cm. With third-generation dual-source DECT, a larger proportion of calculi ≤ 5 mm are not characterized as UA or non-UA at reduced dose.
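The DECT number ratio and color labeling can be mimicked by a toy classifier; the ratio definition is the generic low-over-high-energy HU quotient, and the threshold below is an assumption for illustration, not the vendor software's criterion:

```python
def dect_number_ratio(hu_low, hu_high):
    """Dual-energy CT number ratio of a calculus: attenuation at the
    low-energy spectrum divided by attenuation at the high-energy
    spectrum (both in HU, assumed positive for stones)."""
    return hu_low / hu_high

def classify_calculus(hu_low, hu_high, threshold=1.2):
    """Illustrative rule: uric acid (UA) stones show a ratio near 1
    because their attenuation changes little with energy, whereas
    calcified stones attenuate much more at low energy. The threshold
    value is hypothetical."""
    return "UA" if dect_number_ratio(hu_low, hu_high) < threshold else "non-UA"

# Hypothetical stone measurements (HU at 90 kV, HU at Sn150 kV)
label_a = classify_calculus(400.0, 380.0)  # ratio ~1.05
label_b = classify_calculus(900.0, 600.0)  # ratio 1.5
```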

  16. Transformation of two and three-dimensional regions by elliptic systems

    NASA Technical Reports Server (NTRS)

    Mastin, C. Wayne

    1993-01-01

    During this contract period, our work has focused on improvements to elliptic grid generation methods. There are two principal objectives in this project. One objective is to make the elliptic methods more reliable and efficient, and the other is to construct a modular code that can be incorporated into the National Grid Project (NGP), or any other grid generation code. Progress has been made in meeting both of these objectives. The two objectives are actually complementary: as the code development for the NGP progresses, we see many areas where improvements in algorithms can be made.

  17. Scientific Programming Using Java and C: A Remote Sensing Example

    NASA Technical Reports Server (NTRS)

    Prados, Donald; Johnson, Michael; Mohamed, Mohamed A.; Cao, Chang-Yong; Gasser, Jerry; Powell, Don; McGregor, Lloyd

    1999-01-01

    This paper presents results of a project to port code for processing remotely sensed data from the UNIX environment to Windows. Factors considered during this process include time schedule, cost, resource availability, reuse of existing code, rapid interface development, ease of integration, and platform independence. The approach selected for this project used both Java and C. By using Java for the graphical user interface and C for the domain model, the strengths of both languages were utilized and the resulting code can easily be ported to other platforms. The advantages of this approach are discussed in this paper.

  18. Guide to Permitting Hydrogen Motor Fuel Dispensing Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rivkin, Carl; Buttner, William; Burgess, Robert

    2016-03-28

    The purpose of this guide is to assist project developers, permitting officials, code enforcement officials, and other parties involved in developing permit applications and approving the implementation of hydrogen motor fuel dispensing facilities. The guide facilitates the identification of the elements to be addressed in the permitting of a project as it progresses through the approval process; the specific requirements associated with those elements; and the applicable (or potentially applicable) codes and standards by which to determine whether the specific requirements have been met. The guide attempts to identify all applicable codes and standards relevant to the permitting requirements.

  19. Monte Carlo simulations and benchmark measurements on the response of TE(TE) and Mg(Ar) ionization chambers in photon, electron and neutron beams

    NASA Astrophysics Data System (ADS)

    Lin, Yi-Chun; Huang, Tseng-Te; Liu, Yuan-Hao; Chen, Wei-Lin; Chen, Yen-Fu; Wu, Shu-Wei; Nievaart, Sander; Jiang, Shiang-Huei

    2015-06-01

    The paired ionization chamber (IC) technique is commonly employed to determine neutron and photon doses in radiology or radiotherapy neutron beams, where the neutron dose depends very strongly on the accuracy of the accompanying high-energy photon dose. During the dose derivation, an important issue is to evaluate the photon and electron response functions of the two commercially available ionization chambers, denoted TE(TE) and Mg(Ar), used in our reactor-based epithermal neutron beam. Nowadays, most perturbation corrections for accurate dose determination and many treatment planning systems are based on the Monte Carlo technique. We used the general-purpose Monte Carlo codes MCNP5, EGSnrc, FLUKA and GEANT4 for benchmark verification among the codes and against carefully measured values, for a precise estimation of the chamber current from the absorbed dose rate of the cavity gas. Energy-dependent response functions of the two chambers were also calculated in a parallel beam with mono-energies from 20 keV to 20 MeV for photons and electrons, using both the optimal simple spherical and the detailed IC models. The measurements were performed in well-defined (a) four primary M-80, M-100, M-120 and M-150 X-ray calibration fields, (b) a primary 60Co calibration beam, (c) 6 MV and 10 MV photon and (d) 6 MeV and 18 MeV electron hospital LINAC beams, and (e) a BNCT clinical-trial neutron beam. For the TE(TE) chamber, all codes were almost identical over the whole photon energy range. For the Mg(Ar) chamber, MCNP5 showed a lower response than the other codes for photon energies below 0.1 MeV and a similar response above 0.2 MeV (agreeing within 5% in the simple spherical model). With increasing electron energy, the response difference between MCNP5 and the other codes became larger in both chambers. Compared with the measured currents, MCNP5 agreed with the measurement data within 5% for the 60Co, 6 MV, 10 MV, 6 MeV and 18 MeV LINAC beams. 
For the Mg(Ar) chamber, however, the deviations reached 7.8-16.5% for the X-ray beams below 120 kVp. In this study we were especially interested in BNCT doses, where the low-energy photon contribution is small enough to ignore; the MCNP model is therefore recognized as the most suitable for simulating the broadly distributed photon-electron and neutron energy responses of the paired ICs. MCNP also provides the best prediction of BNCT source adjustment from the detector's neutron and photon responses.
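The paired-chamber dose derivation mentioned above amounts to solving two linear response equations, one per chamber; a sketch with invented response coefficients (in practice they come from calculations and calibrations like those described):

```python
import numpy as np

def separate_doses(m_te, m_mg, h_te, k_te, h_mg, k_mg):
    """Solve the paired-ionization-chamber equations
        M_TE = h_TE * D_gamma + k_TE * D_n
        M_Mg = h_Mg * D_gamma + k_Mg * D_n
    for the photon dose D_gamma and neutron dose D_n, where M are the
    chamber readings and (h, k) the photon/neutron response
    coefficients of each chamber."""
    response = np.array([[h_te, k_te], [h_mg, k_mg]], dtype=float)
    d_gamma, d_n = np.linalg.solve(response, [m_te, m_mg])
    return d_gamma, d_n

# Hypothetical: TE(TE) responds equally to both, Mg(Ar) is nearly
# neutron-insensitive (k_mg = 0.05)
d_gamma, d_n = separate_doses(5.0, 2.15, 1.0, 1.0, 1.0, 0.05)
```

The separation works only because the two chambers have very different neutron sensitivities, which is why errors in the Mg(Ar) photon response propagate strongly into the derived neutron dose.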

  20. SU-E-T-556: Monte Carlo Generated Dose Distributions for Orbital Irradiation Using a Single Anterior-Posterior Electron Beam and a Hanging Lens Shield

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duwel, D; Lamba, M; Elson, H

    Purpose: Various cancers of the eye are successfully treated with radiotherapy utilizing one anterior-posterior (A/P) beam that encompasses the entire content of the orbit. In such cases, a hanging lens shield can be used to spare dose to the radiosensitive lens of the eye to prevent cataracts. Methods: This research focused on Monte Carlo characterization of dose distributions resulting from a single A/P field to the orbit with a hanging shield in place. Monte Carlo codes were developed that calculated dose distributions for various electron beam energies, hanging lens shield radii, shield heights above the eye, and beam spoiler configurations. Film dosimetry was used to benchmark the coding to ensure it was calculating relative dose accurately. Results: The Monte Carlo dose calculations indicated that lateral and depth dose profiles are insensitive to changes in shield height and electron beam energy. Dose deposition was sensitive to shield radius and to beam spoiler composition and height above the eye. Conclusion: The use of a single A/P electron beam to treat cancers of the eye while maintaining adequate lens sparing is feasible. The shield radius should be customized to match the radius of the patient’s lens. A beam spoiler should be used if it is desired to substantially dose the eye tissues lying posterior to the lens in the shadow of the lens shield. The compromise between lens sparing and dose to diseased tissues surrounding the lens can be modulated by varying the beam spoiler thickness, spoiler material composition, and spoiler height above the eye. The sparing ratio is a metric that can be used to evaluate this compromise: the higher the ratio, the more dose received by the tissues immediately posterior to the lens relative to the dose received by the lens.
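The sparing-ratio metric described in the conclusion is a simple quotient; a sketch with hypothetical dose values:

```python
def sparing_ratio(dose_posterior_to_lens, dose_lens):
    """Sparing ratio as described above: dose to the tissues immediately
    posterior to the lens (in the shield's shadow) relative to the lens
    dose. Higher values mean better target coverage per unit of lens
    dose, so spoiler settings can be compared by this single number."""
    return dose_posterior_to_lens / dose_lens

# Hypothetical: 80% of prescription behind the lens, 10% to the lens
ratio = sparing_ratio(80.0, 10.0)
```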

  1. Optimization of GATE and PHITS Monte Carlo code parameters for spot scanning proton beam based on simulation with FLUKA general-purpose code

    NASA Astrophysics Data System (ADS)

    Kurosu, Keita; Das, Indra J.; Moskvin, Vadim P.

    2016-01-01

    Spot scanning, owing to its superior dose-shaping capability, provides unsurpassed dose conformity, in particular for complex targets. However, the robustness of the delivered dose distribution and prescription has to be verified. Monte Carlo (MC) simulation has the potential to generate significant advantages for high-precision particle therapy, especially for media containing inhomogeneities. However, the computational parameters inherent in the MC simulation codes GATE, PHITS and FLUKA, previously evaluated for uniform scanning proton beams, need to be re-evaluated; that is, the relationship between the input parameters and the calculation results should be carefully scrutinized. The objective of this study was, therefore, to determine the optimal parameters for the spot scanning proton beam for both the GATE and PHITS codes, using data from a FLUKA simulation as a reference. The proton beam scanning system of the Indiana University Health Proton Therapy Center was modeled in FLUKA, and the geometry was subsequently and identically transferred to GATE and PHITS. Although beam transport is managed by the spot scanning system, the spot location is always set at the center of a 600 × 600 × 300 mm³ water phantom placed after the treatment nozzle. The percentage depth dose (PDD) is computed along the central axis using 0.5 × 0.5 × 0.5 mm³ voxels in the water phantom. The PDDs and the proton ranges obtained with several computational parameters are then compared to those of FLUKA, and optimal parameters are determined from the accuracy of the proton range, suppression of dose deviation, and minimization of computational time. Our results indicate that the optimized parameters differ from those for uniform scanning, suggesting that a single gold standard for setting computational parameters across proton therapy applications cannot be determined, since the impact of the parameter settings depends on the proton irradiation technique. 
We therefore conclude that computational parameters must be set with reference to the optimized parameters of the corresponding irradiation technique in order to achieve artifact-free MC simulation for use in computational experiments and clinical treatments.
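Range agreement between codes is typically checked with a distal-falloff metric such as R80; the following sketch assumes that metric (the abstract does not specify which range definition was used):

```python
import numpy as np

def distal_range(depths_mm, pdd_percent, level=80.0):
    """Proton range metric: the depth on the distal falloff where the
    PDD crosses `level` percent (R80 by default), found by linear
    interpolation between the sampled depths. Range agreement between
    codes was one of the optimization criteria above."""
    pdd = np.asarray(pdd_percent, dtype=float)
    depths = np.asarray(depths_mm, dtype=float)
    i_peak = int(np.argmax(pdd))
    d, p = depths[i_peak:], pdd[i_peak:]   # distal side only
    for i in range(len(p) - 1):
        if p[i] >= level >= p[i + 1]:
            frac = (p[i] - level) / (p[i] - p[i + 1])
            return d[i] + frac * (d[i + 1] - d[i])
    raise ValueError("PDD never crosses the requested level distally")

# Toy PDD sampled every 10 mm with a peak at 20 mm depth
r80 = distal_range([0, 10, 20, 30, 40], [60, 80, 100, 70, 20])
```

Comparing R80 values (rather than full curves) gives a single number per parameter set, which is convenient when scanning many step-size/cutoff combinations.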

  2. LANL LDRD-funded project: Test particle simulations of energetic ions in natural and artificial radiation belts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cowee, Misa; Liu, Kaijun; Friedel, Reinhard H.

    2012-07-17

    We summarize the scientific problem and work plan for the LANL LDRD-funded project to use a test particle code to study the sudden de-trapping of inner belt protons and possible cross-L transport of debris ions after a high altitude nuclear explosion (HANE). We also discuss future application of the code for other HANE-related problems.

  3. Development of a Top-View Numeric Coding Teaching-Learning Trajectory within an Elementary Grades 3-D Visualization Design Research Project

    ERIC Educational Resources Information Center

    Sack, Jacqueline J.

    2013-01-01

    This article explicates the development of top-view numeric coding of 3-D cube structures within a design research project focused on 3-D visualization skills for elementary grades children. It describes children's conceptual development of 3-D cube structures using concrete models, conventional 2-D pictures and abstract top-view numeric…

  4. The use of tetrahedral mesh geometries in Monte Carlo simulation of applicator based brachytherapy dose distributions

    NASA Astrophysics Data System (ADS)

    Paiva Fonseca, Gabriel; Landry, Guillaume; White, Shane; D'Amours, Michel; Yoriyaz, Hélio; Beaulieu, Luc; Reniers, Brigitte; Verhaegen, Frank

    2014-10-01

    Accounting for brachytherapy applicator attenuation is part of the recommendations from the recent report of AAPM Task Group 186. To do so, model-based dose calculation algorithms require accurate modelling of the applicator geometry. This can be non-trivial in the case of irregularly shaped applicators such as the Fletcher Williamson gynaecological applicator, or balloon applicators with possibly irregular shapes employed in accelerated partial breast irradiation (APBI) performed using electronic brachytherapy sources (EBS). While many of these applicators can be modelled using constructive solid geometry (CSG), the latter may be difficult and time-consuming. Alternatively, these complex geometries can be modelled using tessellated geometries such as tetrahedral meshes (mesh geometries, MG). Recent versions of the Monte Carlo (MC) codes Geant4 and MCNP6 allow for the use of MG. The goal of this work was to model a series of applicators relevant to brachytherapy using MG. Applicators designed for 192Ir sources and 50 kV EBS were studied: a shielded vaginal applicator, a shielded Fletcher Williamson applicator and an APBI balloon applicator. All applicators were modelled in Geant4 and MCNP6 using MG and CSG for dose calculations. CSG-derived dose distributions were considered as reference and used to validate the MG models by comparing dose distribution ratios. In general, agreement within 1% was observed for the dose calculations for all applicators, between MG and CSG and between codes, when considering volumes inside the 25% isodose surface. When compared to CSG, MG required longer computation times by a factor of at least 2 for MC simulations using the same code. MCNP6 calculation times were more than ten times shorter than Geant4's in some cases. In conclusion, we presented methods allowing for high-fidelity modelling with results equivalent to CSG. To the best of our knowledge, MG offers the most accurate representation of an irregular APBI balloon applicator.

  5. Preliminary estimates of radiation exposures for manned interplanetary missions from anomalously large solar flare events

    NASA Technical Reports Server (NTRS)

    Townsend, Lawrence W.; Nealy, John E.; Wilson, John W.

    1988-01-01

    Preliminary estimates of radiation exposures for manned interplanetary missions resulting from anomalously large solar flare events are presented. The calculations use integral particle fluences for the February 1956, November 1960, and August 1972 events as inputs into the Langley Research Center nucleon transport code BRYNTRN. This deterministic code transports primary and secondary nucleons (protons and neutrons) through any number of layers of target material of arbitrary thickness and composition. Contributions from target nucleus fragmentation and recoil are also included. Estimates of 5 cm depth doses and dose equivalents in tissue are presented behind various thicknesses of aluminum, water, and composite aluminum/water shields for each of the three solar flare events.

  6. WE-AB-204-11: Development of a Nuclear Medicine Dosimetry Module for the GPU-Based Monte Carlo Code ARCHER

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, T; Lin, H; Xu, X

    Purpose: To develop a nuclear medicine dosimetry module for the GPU-based Monte Carlo code ARCHER. Methods: We have developed a nuclear medicine dosimetry module for the fast Monte Carlo code ARCHER. The coupled electron-photon Monte Carlo transport kernel included in ARCHER is built upon the Dose Planning Method code (DPM). The developed module manages the radioactive decay simulation by consecutively tracking several types of radiation on a per-disintegration basis using the statistical sampling method. Optimization techniques such as persistent threads and prefetching are studied and implemented. The developed module is verified against the VIDA code, which is based on the Geant4 toolkit and has previously been verified against OLINDA/EXM. A voxelized geometry is used in the preliminary test: a sphere made of ICRP soft tissue is surrounded by a box filled with water. A uniform activity distribution of I-131 is assumed in the sphere. Results: The self-absorption dose factors (mGy/(MBq·s)) of the sphere with varying diameters are calculated by ARCHER and VIDA, respectively. ARCHER's results are in agreement with VIDA's, which were obtained from a previous publication. VIDA takes hours of CPU time to finish the computation, while ARCHER takes 4.31 seconds for the 12.4-cm uniform-activity sphere case. For a fairer CPU-GPU comparison, more effort will be made to eliminate the algorithmic differences. Conclusion: The coupled electron-photon Monte Carlo code ARCHER has been extended to radioactive decay simulation for nuclear medicine dosimetry. The developed code exhibits good performance in our preliminary test. The GPU-based Monte Carlo code is developed with grant support from the National Institute of Biomedical Imaging and Bioengineering through an R01 grant (R01EB015478).
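    The quantity being cross-checked here, a self-absorption dose factor, is the mean dose to the source region per unit cumulated activity. The toy estimator below only illustrates that quantity, not ARCHER's or VIDA's algorithm: all names are hypothetical, only photons are tracked, energy is deposited at the first interaction point, and scatter transport is ignored entirely:

```python
import numpy as np

def self_dose_factor(radius_cm, mu_cm, energy_mev,
                     density_g_cm3=1.0, n=200_000, seed=1):
    """Crude MC estimate of a photon self-absorption dose factor (Gy per
    decay) for a uniform source in a homogeneous sphere. A photon is scored
    as locally absorbed if its sampled free path (attenuation coefficient
    mu_cm, 1/cm) ends inside the sphere."""
    rng = np.random.default_rng(seed)
    r = radius_cm * rng.uniform(0.0, 1.0, n) ** (1.0 / 3.0)  # uniform in sphere
    c = rng.uniform(-1.0, 1.0, n)          # cos(angle between direction, radius)
    # Chord length from emission point to the sphere surface
    d = -r * c + np.sqrt((r * c) ** 2 + radius_cm ** 2 - r ** 2)
    path = rng.exponential(1.0 / mu_cm, n)  # sampled free path
    absorbed_fraction = np.mean(path < d)
    mass_kg = density_g_cm3 * (4.0 / 3.0) * np.pi * radius_cm ** 3 * 1e-3
    mev_to_joule = 1.602176634e-13
    return absorbed_fraction * energy_mev * mev_to_joule / mass_kg
```

    Multiplying a dose factor of this kind by the cumulated activity (MBq·s) in the source region gives the self-dose, which is the comparison VIDA and ARCHER tabulate far more rigorously.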

  7. Measurements and simulations of the radiation exposure to aircraft crew workplaces due to cosmic radiation in the atmosphere.

    PubMed

    Beck, P; Latocha, M; Dorman, L; Pelliccioni, M; Rollet, S

    2007-01-01

    As required by the European Directive 96/29/Euratom, radiation exposure due to natural ionizing radiation has to be taken into account at workplaces where the effective dose could exceed 1 mSv per year. An example of workers concerned by this directive is aircraft crew, due to cosmic radiation exposure in the atmosphere. Extensive measurement campaigns on board aircraft have been carried out to assess the ambient dose equivalent. A consortium of European dosimetry institutes within EURADOS WG5 summarized experimental data and results of calculations, together with detailed descriptions of the methods for measurements and calculations. The radiation protection quantity of interest is the effective dose, E (ISO). The comparison of measured and calculated results is done in terms of the operational quantity ambient dose equivalent, H*(10). This paper gives an overview of the EURADOS Aircraft Crew In-Flight Database and presents a new empirical model describing fitting functions for these data. Furthermore, it describes numerical simulations performed with the Monte Carlo code FLUKA-2005 using an updated version of the cosmic radiation primary spectra. The ratio between ambient dose equivalent and effective dose at commercial flight altitudes, calculated with FLUKA-2005, is discussed. Finally, it presents the aviation dosimetry model AVIDOS, based on FLUKA-2005 simulations, for routine dose assessment. The code has been developed by Austrian Research Centers (ARC) for public use (http://avidos.healthphysics.at).
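    The abstract mentions an empirical model with fitting functions for the in-flight database. Purely as an illustration of such a fit — the numbers below are invented, not EURADOS data, and the exponential-in-altitude form is an assumption — dose-rate measurements can be regressed against altitude:

```python
import numpy as np

# Hypothetical in-flight measurements: altitude (km) vs. H*(10) rate (uSv/h)
alt = np.array([8.0, 9.0, 10.0, 11.0, 12.0])
rate = np.array([2.1, 2.9, 3.8, 4.9, 6.0])

# Fit log(rate) linearly in altitude, i.e. rate ≈ a * exp(b * alt)
b, log_a = np.polyfit(alt, np.log(rate), 1)

def predict(altitude_km):
    """Fitted ambient dose equivalent rate (uSv/h) at a given altitude."""
    return float(np.exp(log_a) * np.exp(b * altitude_km))
```

    A real model of this kind would also include solar modulation and geomagnetic cutoff as predictors, which is what makes the EURADOS database valuable for fitting.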

  8. TU-F-17A-08: The Relative Accuracy of 4D Dose Accumulation for Lung Radiotherapy Using Rigid Dose Projection Versus Dose Recalculation On Every Breathing Phase

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lamb, J; Lee, C; Tee, S

    2014-06-15

    Purpose: To investigate the accuracy of 4D dose accumulation using projection of dose calculated on the end-exhalation, mid-ventilation, or average-intensity breathing-phase CT scan, versus dose accumulation performed using full Monte Carlo dose recalculation on every breathing phase. Methods: Radiotherapy plans were analyzed for 10 patients with stage I-II lung cancer planned using 4D-CT. SBRT plans were optimized using the dose calculated by a commercially available Monte Carlo algorithm on the end-exhalation 4D-CT phase. 4D dose accumulations using deformable registration were performed with a commercially available tool that projected the planned dose onto every breathing phase without recalculation, as well as with a Monte Carlo recalculation of the dose on all breathing phases. The 3D planned dose (3D-EX), the 3D dose calculated on the average-intensity image (3D-AVE), and the 4D accumulations of the dose calculated on the end-exhalation phase CT (4D-PR-EX), the mid-ventilation phase CT (4D-PR-MID), and the average-intensity image (4D-PR-AVE), respectively, were compared against the accumulation of the Monte Carlo dose recalculated on every phase. Plan evaluation metrics relating to target volumes and critical structures relevant for lung SBRT were analyzed. Results: Plan evaluation metrics tabulated using 4D-PR-EX, 4D-PR-MID, and 4D-PR-AVE differed from those tabulated using Monte Carlo recalculation on every phase by an average of 0.14±0.70 Gy, -0.11±0.51 Gy, and 0.00±0.62 Gy, respectively. Deviations of between 8 and 13 Gy were observed between the 4D-MC calculations and both 3D methods for the proximal bronchial trees of 3 patients. Conclusions: 4D dose accumulation using projection without recalculation may be sufficiently accurate compared to 4D dose accumulated from Monte Carlo recalculation on every phase, depending on institutional protocols. Use of 4D dose accumulation should be considered when evaluating normal tissue complication probabilities, as well as in clinical situations where target volumes are directly inferior to mobile critical structures.
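    The projection method amounts to warping each phase's dose to a reference phase and forming a weighted sum. A toy 1-D sketch, in which integer voxel shifts stand in for the deformable vector fields of a real registration and all names are hypothetical:

```python
import numpy as np

def accumulate_4d(phase_doses, shifts, weights=None):
    """Map each breathing phase's dose to the reference phase, then form a
    weighted sum. np.roll with an integer shift is a toy stand-in for
    resampling a dose grid through a deformable vector field."""
    n = len(phase_doses)
    w = np.full(n, 1.0 / n) if weights is None else np.asarray(weights, float)
    acc = np.zeros_like(phase_doses[0], dtype=float)
    for dose, shift, wi in zip(phase_doses, shifts, w):
        acc += wi * np.roll(dose, shift)  # warp to reference, then accumulate
    return acc
```

    The question the paper asks is whether `phase_doses` may all be copies of one projected dose (no recalculation) rather than per-phase Monte Carlo recalculations; the warping-and-summing step itself is the same either way.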

  9. Recent skyshine calculations at Jefferson Lab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Degtyarenko, P.

    1997-12-01

    New calculations of the skyshine dose distribution of neutrons and secondary photons have been performed at Jefferson Lab using the Monte Carlo method. The dose dependence on neutron energy, distance to the neutron source, polar angle of a source neutron, and azimuthal angle between the observation point and the momentum direction of a source neutron has been studied. The azimuthally asymmetric term in the skyshine dose distribution is shown to be important in dose calculations around high-energy accelerator facilities. A parameterization formula and a corresponding computer code have been developed which can be used for detailed calculations of skyshine dose maps.
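    A generic shape for such a parameterization, with an explicit azimuthally asymmetric term, might look like the following. The functional form and all coefficients here are placeholders for illustration only, not Degtyarenko's fitted formula:

```python
import numpy as np

def skyshine_dose(r_m, phi_rad, a0=1.0, lam=600.0, asym=0.3):
    """Hypothetical skyshine fitting form: inverse-square falloff times an
    air-attenuation exponential, modulated by an azimuthally asymmetric
    (cos phi) term of the kind the abstract highlights. In a real fit the
    coefficients would depend on source-neutron energy and polar angle."""
    return a0 / r_m ** 2 * np.exp(-r_m / lam) * (1.0 + asym * np.cos(phi_rad))
```

    The asymmetry term makes the dose larger in the direction the source neutron was travelling (phi = 0) than behind it (phi = pi), which is why omitting it can bias shielding estimates around an accelerator.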

  10. Studying the laws of software evolution in a long-lived FLOSS project.

    PubMed

    Gonzalez-Barahona, Jesus M; Robles, Gregorio; Herraiz, Israel; Ortega, Felipe

    2014-07-01

    Some free, open-source software projects have been around for quite a long time, the longest living ones dating from the early 1980s. For some of them, detailed information about their evolution is available in source code management systems tracking all their code changes for periods of more than 15 years. This paper examines in detail the evolution of one such project, glibc, with the main aim of understanding how it evolved and how it matched Lehman's laws of software evolution. As a result, we have developed a methodology for studying the evolution of such long-lived projects based on the information in their source code management repository, described in detail several aspects of the history of glibc, including some activity and size metrics, and found how some of the laws of software evolution may not hold in this case. © 2013 The Authors. Journal of Software: Evolution and Process published by John Wiley & Sons Ltd.
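    Activity metrics of the kind this methodology uses can be extracted from the repository history (e.g. the Unix timestamps printed by `git log --format=%ct`) and bucketed by year. A small sketch of that bucketing step, with hypothetical names:

```python
from collections import Counter
from datetime import datetime, timezone

def yearly_activity(commit_timestamps):
    """Commits per calendar year from a list of Unix timestamps — the kind
    of activity metric used when testing Lehman's laws (e.g. continuing
    growth) against a repository's history."""
    years = [datetime.fromtimestamp(t, tz=timezone.utc).year
             for t in commit_timestamps]
    return dict(sorted(Counter(years).items()))

print(yearly_activity([0, 86_400, 31_600_000]))  # → {1970: 2, 1971: 1}
```

    Size metrics (lines of code per release) are gathered analogously by checking out each tagged revision and counting, and it is the trends in these series that are compared against the laws.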

  11. Studying the laws of software evolution in a long-lived FLOSS project

    PubMed Central

    Gonzalez-Barahona, Jesus M; Robles, Gregorio; Herraiz, Israel; Ortega, Felipe

    2014-01-01

    Some free, open-source software projects have been around for quite a long time, the longest living ones dating from the early 1980s. For some of them, detailed information about their evolution is available in source code management systems tracking all their code changes for periods of more than 15 years. This paper examines in detail the evolution of one such project, glibc, with the main aim of understanding how it evolved and how it matched Lehman's laws of software evolution. As a result, we have developed a methodology for studying the evolution of such long-lived projects based on the information in their source code management repository, described in detail several aspects of the history of glibc, including some activity and size metrics, and found how some of the laws of software evolution may not hold in this case. © 2013 The Authors. Journal of Software: Evolution and Process published by John Wiley & Sons Ltd. PMID:25893093

  12. Accumulate-Repeat-Accumulate-Accumulate Codes

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush; Dolinar, Samuel; Thorpe, Jeremy

    2007-01-01

    Accumulate-repeat-accumulate-accumulate (ARAA) codes have been proposed, inspired by the recently proposed accumulate-repeat-accumulate (ARA) codes. These are error-correcting codes suitable for use in a variety of wireless data-communication systems that include noisy channels. ARAA codes can be regarded as serial turbo-like codes or as a subclass of low-density parity-check (LDPC) codes, and, like ARA codes, they have projected-graph or protograph representations; these characteristics make it possible to design high-speed iterative decoders that utilize belief-propagation algorithms. The objective in proposing ARAA codes as a subclass of ARA codes was to enhance the error-floor performance of ARA codes while maintaining simple encoding structures and a low maximum variable-node degree.
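    Each "accumulate" stage in these codes is a rate-1 accumulator, i.e. a running XOR of the input bits (the 1/(1+D) convolutional code). The sketch below shows only the serial accumulate-repeat-accumulate-accumulate structure named by the acronym; real ARAA codes insert protograph-defined interleavers and puncturing between the stages, which are omitted here:

```python
from itertools import accumulate
from operator import xor

def accumulate_bits(bits):
    """Rate-1 accumulator (1/(1+D)): running XOR of the input bits —
    the 'A' building block of ARA/ARAA codes."""
    return list(accumulate(bits, xor))

def repeat(bits, q):
    """Repetition stage: each bit copied q times."""
    return [b for b in bits for _ in range(q)]

def araa_encode_sketch(bits, q=3):
    """Toy serial chain: accumulate -> repeat -> accumulate -> accumulate
    (interleavers/puncturing omitted)."""
    return accumulate_bits(accumulate_bits(repeat(accumulate_bits(bits), q)))
```

    The simplicity of each stage (a prefix XOR and a copy) is exactly the "simple encoding structure" the abstract refers to; the decoding gains come from the graph structure, not from encoder complexity.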

  13. Comparison of GATE/GEANT4 with EGSnrc and MCNP for electron dose calculations at energies between 15 keV and 20 MeV.

    PubMed

    Maigne, L; Perrot, Y; Schaart, D R; Donnarieix, D; Breton, V

    2011-02-07

    The GATE Monte Carlo simulation platform based on the GEANT4 toolkit has come into widespread use for simulating positron emission tomography (PET) and single photon emission computed tomography (SPECT) imaging devices. Here, we explore its use for calculating electron dose distributions in water. Mono-energetic electron dose point kernels and pencil beam kernels in water are calculated for different energies between 15 keV and 20 MeV by means of GATE 6.0, which makes use of the GEANT4 version 9.2 Standard Electromagnetic Physics Package. The results are compared to the well-validated codes EGSnrc and MCNP4C. It is shown that recent improvements made to the GEANT4/GATE software result in significantly better agreement with the other codes. We furthermore illustrate several issues of general interest to GATE and GEANT4 users who wish to perform accurate simulations involving electrons. Provided that the electron step size is sufficiently restricted, GATE 6.0 and EGSnrc dose point kernels are shown to agree to within less than 3% of the maximum dose between 50 keV and 4 MeV, while pencil beam kernels are found to agree to within less than 4% of the maximum dose between 15 keV and 20 MeV.

  14. Low-dose caffeine discrimination and self-reported mood effects in normal volunteers.

    PubMed Central

    Silverman, K; Griffiths, R R

    1992-01-01

    A caffeine versus placebo discrimination procedure was used to determine the lowest caffeine dose that could produce discrimination and self-reported mood effects in normal volunteers. During daily sessions under double-blind conditions, caffeine-abstinent subjects orally ingested a capsule containing 178 mg caffeine or placebo. Before beginning discrimination training, the compounds were identified to subjects by letter codes. Fifteen, 30, and 45 min after capsule ingestion, subjects guessed the capsule's letter code. Correct guesses at 45 min earned money. After each session, subjects received a supplementary capsule containing caffeine or placebo to ensure that, within each phase of the study, subjects received the same daily dose of caffeine equal to the training dose. Five of the 15 subjects acquired the caffeine versus placebo discrimination within the first 20 sessions (greater than or equal to 75% correct); 6 other subjects acquired the discrimination with additional training. Nine subjects who acquired the discrimination were subsequently trained at progressively lower caffeine doses. In general, the lowest dose to produce discrimination (greater than or equal to 75% correct) was also the lowest dose to produce self-reported mood effects: 4 subjects showed discrimination and self-reported mood effects at 100 mg caffeine, 2 at 56 mg, 1 at 32 mg, and 1 at 18 mg. One of these subjects also showed self-reported mood effects at 10 mg. The present study documents discriminative stimulus and self-reported mood effects of caffeine at doses below those previously shown to affect any behavior in normal volunteers. PMID:1548451

  15. Secondary Neutron Doses to Pediatric Patients During Intracranial Proton Therapy: Monte Carlo Simulation of the Neutron Energy Spectrum and its Organ Doses.

    PubMed

    Matsumoto, Shinnosuke; Koba, Yusuke; Kohno, Ryosuke; Lee, Choonsik; Bolch, Wesley E; Kai, Michiaki

    2016-04-01

    Proton therapy has the physical advantage of a Bragg peak that can provide a better dose distribution than conventional x-ray therapy. However, radiation exposure of normal tissues cannot be ignored, because it is likely to increase the risk of secondary cancer. Evaluating the secondary neutrons generated by the interaction of the proton beam with the treatment beam-line structure is necessary; thus, performing the optimization of radiation protection in proton therapy is required. In this research, the organ doses and energy spectra of secondary neutrons were calculated using Monte Carlo simulations. The Monte Carlo code known as the Particle and Heavy Ion Transport code System (PHITS) was used to simulate proton transport and the protons' interaction with the treatment beam-line structure, which modeled the double-scattering body of the treatment nozzle at the National Cancer Center Hospital East. The doses to the organs of a hybrid computational phantom simulating a 5-y-old boy were calculated. In general, secondary neutron doses were found to decrease with increasing distance to the treatment field. Secondary neutron energy spectra were characterized by incident neutrons with three energy peaks: 1×10, 1, and 100 MeV. A block collimator and a patient collimator contributed significantly to organ doses. In particular, the secondary neutrons from the patient collimator were 30 times higher than those from the first scatterer. These results suggest that proactive protection will be required in the design of treatment beam-line structures and that organ doses from secondary neutrons could be reduced.

  16. Radiological characterization of the pressure vessel internals of the BNL High Flux Beam Reactor.

    PubMed

    Holden, Norman E; Reciniello, Richard N; Hu, Jih-Perng

    2004-08-01

    In preparation for the eventual decommissioning of the High Flux Beam Reactor after the permanent removal of its fuel elements from the Brookhaven National Laboratory, measurements and calculations of the decay gamma-ray dose rate were performed in the reactor pressure vessel and on vessel internal structures such as the upper and lower thermal shields, the Transition Plate, and the Control Rod blades. Measurements of gamma-ray dose rates were made using Red Perspex polymethyl methacrylate high-dose film, a Radcal "peanut" ion chamber, and Eberline's RO-7 high-range ion chamber. As a comparison, the Monte Carlo MCNP code and the MicroShield code were used to model the gamma-ray transport and dose buildup. The gamma-ray dose rate at 8 cm above the center of the Transition Plate was measured to be 160 Gy h⁻¹ (using an RO-7) and 88 Gy h⁻¹ at 8 cm above and about 5 cm lateral to the Transition Plate (using Red Perspex film). This compares with a calculated dose rate of 172 Gy h⁻¹ using MicroShield. The gamma-ray dose rate was 16.2 Gy h⁻¹ measured at 76 cm from the reactor core (using the "peanut" ion chamber) and 16.3 Gy h⁻¹ at 87 cm from the core (using Red Perspex film). The similarity of dose rates measured with different instruments indicates that using different methods and instruments is acceptable if the measurement (and calculation) parameters are well defined. Different measurement techniques may be necessary due to constraints such as size restrictions.

  17. An integrated development workflow for community-driven FOSS-projects using continuous integration tools

    NASA Astrophysics Data System (ADS)

    Bilke, Lars; Watanabe, Norihiro; Naumov, Dmitri; Kolditz, Olaf

    2016-04-01

    A complex software project with high standards of code quality requires automated tools that help developers with repetitive and tedious tasks such as compilation on different platforms and configurations, unit testing as well as end-to-end tests, and generating distributable binaries and documentation. This is known as continuous integration (CI). A community-driven FOSS project within the Earth Sciences benefits even more from CI, as time and resources for software development are often limited. Testing developed code on more than the developer's PC is therefore a task which is often neglected, and where CI can be the solution. We developed an integrated workflow based on GitHub, Travis and Jenkins for the community project OpenGeoSys - a coupled multiphysics modeling and simulation package - allowing developers to concentrate on implementing new features in a tight feedback loop. Every interested developer/user can create a pull request containing source code modifications on the online collaboration platform GitHub. The modifications are checked (compilation, compiler warnings, memory leaks, undefined behavior, unit tests, end-to-end tests, analyzing differences in simulation results between changes, etc.) by the CI system, which automatically responds to the pull request, or by email, with detailed reports on success or failure, possibly requesting improvements to the modifications. Core-team developers review the modifications and merge them into the main development line once they satisfy agreed standards. We aim for efficient data structures and algorithms, self-explaining code, comprehensive documentation and high test code coverage. This workflow keeps the entry barriers for getting involved in the project low and permits an agile development process concentrating on feature additions rather than software maintenance procedures.

  18. [Absorbed dose and the effective dose of panoramic temporo mandibular joint radiography].

    PubMed

    Matsuo, Ayae; Okano, Tsuneichi; Gotoh, Kenichi; Yokoi, Midori; Hirukawa, Akiko; Okumura, Shinji; Koyama, Syuji

    2011-01-01

    This study measured the radiation doses absorbed by the patient during panoramic temporomandibular joint radiography (Panoramic TMJ), Schüller's method, and the Orbitoramus projection. The dose of the frontal view in Panoramic TMJ was compared to that with the Orbitoramus projection, and the lateral view in Panoramic TMJ was compared to that with Schüller's method. We measured the doses received by various organs and calculated the effective doses using the guidelines of the International Commission on Radiological Protection in Publication 103. Organ absorbed doses were measured using an anthropomorphic phantom loaded with thermoluminescent dosimeters (TLD) located at 160 sensitive sites. The reported dose is the sum for irradiation of both the right and left sides. In addition, we set a few different exposure field sizes. The effective dose for a frontal view in Panoramic TMJ was 11 µSv, and that for the lateral view was 14 µSv. The lens dose with the Orbitoramus projection was 40 times higher than with the frontal view in Panoramic TMJ. Although the effective dose of the lateral view in Panoramic TMJ was 3 times higher than that of the small exposure field (10×10 cm on film) in Schüller's method, it was the same as that of a mid-sized exposure field. When the inferior third of the exposure field was excluded during Panoramic TMJ, the effective doses could be decreased. We therefore recommend that the size of the exposure field in Panoramic TMJ be decreased.
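    The effective dose reported here is the ICRP Publication 103 tissue-weighted sum E = Σ_T w_T H_T over organ equivalent doses. A sketch of that sum with the ICRP 103 weighting factors (the per-organ dose inputs are hypothetical; in the study they come from the TLD measurements):

```python
# ICRP Publication 103 tissue weighting factors w_T; the full set sums to 1.0.
W_T = {
    "red_bone_marrow": 0.12, "colon": 0.12, "lung": 0.12, "stomach": 0.12,
    "breast": 0.12, "remainder": 0.12, "gonads": 0.08,
    "bladder": 0.04, "oesophagus": 0.04, "liver": 0.04, "thyroid": 0.04,
    "bone_surface": 0.01, "brain": 0.01, "salivary_glands": 0.01, "skin": 0.01,
}

def effective_dose(organ_dose_usv):
    """Effective dose (µSv) as the ICRP 103 weighted sum of per-organ
    equivalent doses (µSv); organs absent from the input contribute zero."""
    return sum(W_T[t] * h for t, h in organ_dose_usv.items())

# e.g. a thyroid equivalent dose of 100 µSv alone contributes 4 µSv of E
```

    Because the eye lens has no w_T of its own in this scheme, the large lens dose the abstract reports shows up only weakly in E, which is why the authors quote it separately.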

  19. A sublethal dose of a neonicotinoid insecticide disrupts visual processing and collision avoidance behaviour in Locusta migratoria.

    PubMed

    Parkinson, Rachel H; Little, Jacelyn M; Gray, John R

    2017-04-20

    Neonicotinoids are known to affect insect navigation and vision; however, the mechanisms of these effects are not fully understood. A visual motion-sensitive neuron in the locust, the Descending Contralateral Movement Detector (DCMD), integrates visual information and is involved in eliciting escape behaviours. The DCMD receives coded input from the compound eyes and monosynaptically excites motorneurons involved in flight and jumping. We show that imidacloprid (IMD) impairs neural responses to visual stimuli at sublethal concentrations, and these effects are sustained two and twenty-four hours after treatment. Most significantly, IMD disrupted bursting, a coding property important for motion detection. Specifically, IMD reduced the DCMD peak firing rate within bursts at ecologically relevant doses of 10 ng/g (ng IMD per g locust body weight). Effects on DCMD firing translate to deficits in collision avoidance behaviours: exposure to 10 ng/g IMD attenuates escape manoeuvres, while 100 ng/g IMD prevents the ability to fly and walk. We show that, at ecologically relevant doses, IMD causes significant and lasting impairment of an important pathway involved with visual sensory coding and escape behaviours. These results show, for the first time, that a neonicotinoid pesticide directly impairs an important, taxonomically conserved, motion-sensitive visual network.

  20. Experiences using IAEA Code of practice for radiation sterilization of tissue allografts: Validation and routine control

    NASA Astrophysics Data System (ADS)

    Hilmy, N.; Febrida, A.; Basril, A.

    2007-11-01

    The problems in using International Standard (ISO) 11137 to validate the radiation sterilization dose (RSD) of tissue allografts are the limited and low numbers of uniform samples per production batch, i.e., products obtained from one donor. An allograft is a graft transplanted between two different individuals of the same species. The minimum number of uniform samples needed for the verification dose (VD) experiment at the selected sterility assurance level (SAL) per production batch according to the IAEA Code is 20, i.e., 10 for bioburden determination and the remaining 10 for the sterility test. Three methods of the IAEA Code have been used for validation of the RSD: method A1, which is a modification of method 1 of ISO 11137:1995; method B (ISO 13409:1996); and method C (AAMI TIR 27:2001). This paper describes VD experiments using uniform products obtained from one cadaver donor, i.e., cancellous bones and demineralized bone powders, and amnion grafts from one living donor. Results of the verification dose experiments show that the RSD is 15.4 kGy for cancellous and demineralized bone grafts and 19.2 kGy for amnion grafts according to method A1, and 25 kGy according to methods B and C.
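    The relation underlying all such dose-setting methods is log-linear microbial inactivation: each D10 of dose reduces the surviving population tenfold, so the dose to reach a target SAL from an average bioburden N0 is D10·(log10 N0 − log10 SAL). The sketch below shows only this arithmetic under a single-D10 assumption; the IAEA/ISO methods actually work from tabulated distributions of microbial resistances:

```python
import math

def sterilization_dose_kgy(bioburden_cfu, sal, d10_kgy):
    """Dose (kGy) to reduce an average bioburden N0 to a target sterility
    assurance level under a single-D10 log-linear inactivation model:
    D = D10 * (log10(N0) - log10(SAL))."""
    return d10_kgy * (math.log10(bioburden_cfu) - math.log10(sal))

# 9 log reductions (10^3 CFU down to SAL 10^-6) at an assumed D10 of 2.8 kGy:
print(round(sterilization_dose_kgy(1000, 1e-6, 2.8), 1))  # → 25.2
```

    The verification-dose experiment probes this curve at a less stringent SAL (e.g. 10⁻¹ or 10⁻²) with few samples, which is what makes validation feasible for single-donor batches.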

  1. Multiple scattering of 13 and 20 MeV electrons by thin foils: a Monte Carlo study with GEANT, Geant4, and PENELOPE.

    PubMed

    Vilches, M; García-Pareja, S; Guerrero, R; Anguiano, M; Lallena, A M

    2009-09-01

    In this work, recent results from experiments and simulations (with EGSnrc) performed by Ross et al. [Med. Phys. 35, 4121-4131 (2008)] on electron scattering by foils of different materials and thicknesses are compared to those obtained using several Monte Carlo codes. Three codes have been used: GEANT (version 3.21), Geant4 (version 9.1, patch 03), and PENELOPE (version 2006). In the case of PENELOPE, both mixed and fully detailed simulations have been carried out. Transverse dose distributions in air have been obtained in order to compare with measurements. The detailed PENELOPE simulations show excellent agreement with experiment. The calculations performed with GEANT and PENELOPE (mixed) agree with experiment within 3%, except for the Be foil. In the case of Geant4, the distributions are 5% narrower than the experimental ones, though the agreement is very good for the Be foil. The transverse dose distribution in water obtained with PENELOPE (mixed) is 4% wider than that calculated by Ross et al. using EGSnrc, and 1% narrower than the transverse dose distribution in air, as considered in the experiment. All the codes give reasonable agreement (within 5%) with the experimental results for all the materials and thicknesses studied.

  2. PRESTO-II: a low-level waste environmental transport and risk assessment code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fields, D.E.; Emerson, C.J.; Chester, R.O.

    PRESTO-II (Prediction of Radiation Effects from Shallow Trench Operations) is a computer code designed for the evaluation of possible health effects from shallow-land waste-disposal trenches. The model is intended to serve as a non-site-specific screening model for assessing radionuclide transport, ensuing exposure, and health impacts to a static local population for a 1000-year period following the end of disposal operations. Human exposure scenarios considered include normal releases (including leaching and operational spillage), human intrusion, and limited site farming or reclamation. Pathways and processes of transit from the trench to an individual or population include ground-water transport, overland flow, erosion, surface-water dilution, suspension, atmospheric transport, deposition, inhalation, external exposure, and ingestion of contaminated beef, milk, crops, and water. Both population doses and individual doses, as well as doses to the intruder and farmer, may be calculated. Cumulative health effects, in terms of cancer deaths, are calculated for the population over the 1000-year period using a life-table approach. Data are included for three example sites: Barnwell, South Carolina; Beatty, Nevada; and West Valley, New York. A code listing and example input for each of the three sites are included in the appendices to this report.

  3. Optimization of beam shaping assembly based on D-T neutron generator and dose evaluation for BNCT

    NASA Astrophysics Data System (ADS)

    Naeem, Hamza; Chen, Chaobin; Zheng, Huaqing; Song, Jing

    2017-04-01

    The feasibility of developing an epithermal neutron beam for a boron neutron capture therapy (BNCT) facility based on a high-intensity D-T fusion neutron generator (HINEG), using the Monte Carlo code SuperMC (Super Monte Carlo simulation program for nuclear and radiation processes), is proposed in this study. SuperMC is used to determine and optimize the final configuration of the beam shaping assembly (BSA). The optimal BSA design is a cylindrical geometry consisting of a natural uranium sphere (14 cm) as a neutron multiplier, AlF3 and TiF3 as moderators (20 cm each), Cd (1 mm) as a thermal neutron filter, Bi (5 cm) as a gamma shield, and Pb as a reflector and collimator to guide neutrons towards the exit window. The epithermal neutron beam flux of the proposed model is 5.73 × 10⁹ n/(cm²·s), and the other dosimetric parameters for BNCT reported in IAEA-TECDOC-1223 have been verified. The phantom dose analysis shows that the designed BSA is accurate, efficient and suitable for BNCT applications. Thus, the Monte Carlo code SuperMC is concluded to be capable of simulating the BSA and the dose calculation for BNCT, and a high epithermal flux can be achieved using the proposed BSA.

  4. Consistency evaluation between EGSnrc and Geant4 charged particle transport in an equilibrium magnetic field.

    PubMed

    Yang, Y M; Bednarz, B

    2013-02-21

    Following the proposal by several groups to integrate magnetic resonance imaging (MRI) with radiation therapy, much attention has been afforded to examining the impact of strong (on the order of a Tesla) transverse magnetic fields on photon dose distributions. The effect of the magnetic field on dose distributions must be considered in order to take full advantage of the benefits of real-time intra-fraction imaging. In this investigation, we compared the handling of particle transport in magnetic fields between two Monte Carlo codes, EGSnrc and Geant4, to analyze various aspects of their electromagnetic transport algorithms; both codes are well-benchmarked for medical physics applications in the absence of magnetic fields. A water-air-water slab phantom and a water-lung-water slab phantom were used to highlight dose perturbations near high- and low-density interfaces. We have implemented a method of calculating the Lorentz force in EGSnrc based on theoretical models in literature, and show very good consistency between the two Monte Carlo codes. This investigation further demonstrates the importance of accurate dosimetry for MRI-guided radiation therapy (MRIgRT), and facilitates the integration of a ViewRay MRIgRT system in the University of Wisconsin-Madison's Radiation Oncology Department.
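    Both codes apply the magnetic part of the Lorentz force, q(v × B), between transport steps. A common, energy-conserving way to integrate that force is the Boris rotation, sketched below nonrelativistically; this illustrates the force-integration idea only, not the specific condensed-history schemes implemented in EGSnrc or Geant4:

```python
import numpy as np

Q_OVER_M = -1.758820e11  # electron charge-to-mass ratio (C/kg)

def boris_push(v, b_field, dt):
    """One Boris velocity rotation for a charge in magnetic field B (tesla),
    with no electric field. The update is a pure rotation, so it conserves
    |v| exactly — the reason Boris-type pushes are a standard way to apply
    the Lorentz force between transport steps."""
    v = np.asarray(v, dtype=float)
    t = 0.5 * dt * Q_OVER_M * np.asarray(b_field, dtype=float)
    s = 2.0 * t / (1.0 + t @ t)
    v_prime = v + np.cross(v, t)   # half rotation
    return v + np.cross(v_prime, s)  # complete the rotation
```

    In a dose calculation this rotation is interleaved with the usual energy-loss and scattering sampling, which is exactly the coupling whose consistency the paper checks between the two codes.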

  5. Consistency evaluation between EGSnrc and Geant4 charged particle transport in an equilibrium magnetic field

    NASA Astrophysics Data System (ADS)

    Yang, Y. M.; Bednarz, B.

    2013-02-01

    Following the proposal by several groups to integrate magnetic resonance imaging (MRI) with radiation therapy, much attention has been afforded to examining the impact of strong (on the order of a Tesla) transverse magnetic fields on photon dose distributions. The effect of the magnetic field on dose distributions must be considered in order to take full advantage of the benefits of real-time intra-fraction imaging. In this investigation, we compared the handling of particle transport in magnetic fields between two Monte Carlo codes, EGSnrc and Geant4, to analyze various aspects of their electromagnetic transport algorithms; both codes are well-benchmarked for medical physics applications in the absence of magnetic fields. A water-air-water slab phantom and a water-lung-water slab phantom were used to highlight dose perturbations near high- and low-density interfaces. We have implemented a method of calculating the Lorentz force in EGSnrc based on theoretical models in literature, and show very good consistency between the two Monte Carlo codes. This investigation further demonstrates the importance of accurate dosimetry for MRI-guided radiation therapy (MRIgRT), and facilitates the integration of a ViewRay MRIgRT system in the University of Wisconsin-Madison's Radiation Oncology Department.

  6. Investigation of amorphization energies for heavy ion implants into silicon carbide at depths far beyond the projected ranges

    NASA Astrophysics Data System (ADS)

    Friedland, E.

    2017-01-01

    At ion energies with inelastic stopping powers less than a few keV/nm, radiation damage is thought to be due to atomic displacements by elastic collisions only. However, it is well known that inelastic processes and non-linear effects due to defect interaction within collision cascades can significantly increase or decrease damage efficiencies. The importance of these processes changes significantly along the ion trajectory and becomes negligible at some distance beyond the projected range, where damage is mainly caused by slowly moving secondary recoils. Hence, in this region amorphization energies should become independent of the ion type and only reflect the properties of the target lattice. To investigate this, damage profiles were obtained from α-particle channeling spectra of 6H-SiC wafers implanted at room temperature with ions in the mass range 84 ⩽ M ⩽ 133, employing the computer code DICADA. An average amorphization dose of (0.7 ± 0.2) dpa and critical damage energy of (17 ± 6) eV/atom are obtained from TRIM simulations at the experimentally observed boundary positions of the amorphous zones.
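
    The quoted amorphization dose in displacements per atom (dpa) is conventionally obtained by combining a TRIM-style vacancy production profile with the implanted fluence and the target atomic density. A minimal sketch of that conversion, with illustrative numbers rather than the study's values:

```python
def dpa(vac_per_ion_per_angstrom: float, fluence_cm2: float,
        atomic_density_cm3: float) -> float:
    """Displacements per atom from a TRIM-style vacancy profile.

    vac_per_ion_per_angstrom : vacancies/(ion*Angstrom) at the depth of interest
    fluence_cm2              : implanted fluence in ions/cm^2
    atomic_density_cm3       : target atomic density in atoms/cm^3
    """
    vac_per_ion_per_cm = vac_per_ion_per_angstrom * 1.0e8  # Angstrom -> cm
    return vac_per_ion_per_cm * fluence_cm2 / atomic_density_cm3

# Illustrative values (NOT from the abstract): 0.1 vac/ion/Angstrom,
# 5e14 ions/cm^2, and the SiC atomic density of ~9.64e22 atoms/cm^3.
print(round(dpa(0.1, 5e14, 9.64e22), 3))  # 0.052
```

    The measured critical damage energy then follows by weighting the displacement profile with the energy deposited per displacement, which is how a dose in dpa maps onto eV/atom.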

  7. Bandwidth Efficient Modulation and Coding Techniques for NASA's Existing Ku/Ka-Band 225 MHz Wide Service

    NASA Technical Reports Server (NTRS)

    Gioannini, Bryan; Wong, Yen; Wesdock, John

    2005-01-01

The National Aeronautics and Space Administration (NASA) has recently established the Tracking and Data Relay Satellite System (TDRSS) K-band Upgrade Project (TKUP), a project intended to enhance the TDRSS Ku-band and Ka-band Single Access Return 225 MHz (Ku/KaSAR-225) data service by adding the capability to process bandwidth-efficient signal designs and to replace the White Sands Complex (WSC) KSAR high-data-rate ground equipment and high-rate switches, which are nearing obsolescence. As a precursor to this project, a modulation and coding study was performed to identify signal structures that maximize the data rate through the Ku/KaSAR-225 channel, minimize the required customer EIRP, and ensure acceptable hardware complexity on the customer platform. This paper presents the results and conclusions of the TKUP modulation and coding study.

  8. Praseodymium-142 glass seeds for the brachytherapy of prostate cancer

    NASA Astrophysics Data System (ADS)

    Jung, Jae Won

A beta-emitting glass seed was proposed for the brachytherapy treatment of prostate cancer. Criteria for seed design were derived and several beta-emitting nuclides were examined for suitability; 142Pr was selected as the isotope of choice. Seeds 0.08 cm in diameter and 0.9 cm long were manufactured for testing and activated in the Texas A&M University research reactor. The activity produced was as expected when the metastable state and the epithermal neutron flux were taken into account. The MCNP5 Monte Carlo code was used to calculate the quantitative dosimetric parameters suggested in the American Association of Physicists in Medicine (AAPM) TG-43/60. The Monte Carlo results were compared with those from a dose point kernel code, and the dose profiles agree well with each other. The gamma dose of 142Pr was evaluated; it is 0.3 Gy at 1.0 cm for an initial activity of 5.95 mCi and is insignificant to other organs. Measurements of the 2-dimensional axial dose distributions were performed using Gafchromic radiochromic film. The film was calibrated using an X-ray machine calibrated against a National Institute of Standards and Technology (NIST) traceable ion chamber, with a calibration curve derived from a least-squares fit of a second-order polynomial. The measured dose distribution agrees well with results from the Monte Carlo simulation. The dose was 130.8 Gy at 6 mm from the seed center with an initial activity of 5.95 mCi. AAPM TG-43/60 parameters were determined: the reference dose rates at 2 mm and 6 mm were 0.67 and 0.02 cGy/s/mCi, respectively, and the geometry function, radial dose function, and anisotropy function were generated.
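
    The geometry function mentioned above is one of the standard TG-43 dosimetric parameters. As a minimal illustration (the textbook line-source expression, not code from the study), G_L(r, θ) = β / (L r sin θ), where β is the angle subtended by the active length at the calculation point:

```python
import math

def geometry_function_line(r: float, theta_deg: float, L: float) -> float:
    """TG-43 line-source geometry function G_L(r, theta) in cm^-2.

    r         : distance from the seed centre (cm)
    theta_deg : polar angle from the seed long axis (degrees)
    L         : active source length (cm)
    """
    theta = math.radians(theta_deg)
    if math.isclose(math.sin(theta), 0.0, abs_tol=1e-12):
        return 1.0 / (r * r - L * L / 4.0)  # on-axis limiting form
    t = r * math.cos(theta)                 # coordinate along the seed axis
    d = r * math.sin(theta)                 # perpendicular distance to the axis
    # beta: angle subtended by the active length at the calculation point
    beta = math.atan2(t + L / 2.0, d) - math.atan2(t - L / 2.0, d)
    return beta / (L * r * math.sin(theta))

# For a 0.9 cm seed at r = 1 cm, theta = 90 deg, G_L is slightly below the
# point-source value 1/r^2 = 1 cm^-2; far from the seed the two converge.
print(round(geometry_function_line(1.0, 90.0, 0.9), 4))
```

    At distances much larger than L the function approaches the point-source form 1/r², which is why the line-source correction matters most close to the seed.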

  9. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR CODING: DESCRIPTIVE QUESTIONNAIRE (UA-D-6.0)

    EPA Science Inventory

The purpose of this SOP is to define the coding strategy for the Descriptive Questionnaire. This questionnaire was developed for use in the Arizona NHEXAS project and the "Border" study. Keywords: data; coding; descriptive questionnaire.

    The National Human Exposure Assessment...

  10. Detection of calcification clusters in digital breast tomosynthesis slices at different dose levels utilizing a SRSAR reconstruction and JAFROC

    NASA Astrophysics Data System (ADS)

    Timberg, P.; Dustler, M.; Petersson, H.; Tingberg, A.; Zackrisson, S.

    2015-03-01

Purpose: To investigate detection performance for calcification clusters in reconstructed digital breast tomosynthesis (DBT) slices at different dose levels using a Super Resolution and Statistical Artifact Reduction (SRSAR) reconstruction method. Method: Simulated calcifications with irregular profile (0.2 mm diameter) were combined to form clusters that were added to projection images (1-3 per abnormal image) acquired on a DBT system (Mammomat Inspiration, Siemens). The projection images were dose-reduced by software to form 35 abnormal cases and 25 normal cases as if acquired at the 100%, 75%, and 50% dose levels (AGD of approximately 1.6 mGy for a 53 mm standard breast, measured according to EUREF v0.15). A standard FBP and an SRSAR reconstruction method (utilizing IRIS iterative reconstruction filters, and outlier detection using Maximum-Intensity Projections and Average-Intensity Projections) were used to reconstruct single central slices for a free-response task (60 images per observer and dose level). Six observers participated; their task was to detect the clusters and assign a confidence rating in randomly presented images from the whole image set (balanced by dose level). Trials were separated by one week to reduce possible memory bias. The outcome was analyzed for statistical differences using Jackknifed Alternative Free-response Receiver Operating Characteristics. Results: The results indicate that it is possible to reduce the dose by 50% with SRSAR without jeopardizing cluster detection. Conclusions: The detection performance for clusters can be maintained at a lower dose level by using SRSAR reconstruction.

  11. Overview of Edge Simulation Laboratory (ESL)

    NASA Astrophysics Data System (ADS)

    Cohen, R. H.; Dorr, M.; Hittinger, J.; Rognlien, T.; Umansky, M.; Xiong, A.; Xu, X.; Belli, E.; Candy, J.; Snyder, P.; Colella, P.; Martin, D.; Sternberg, T.; van Straalen, B.; Bodi, K.; Krasheninnikov, S.

    2006-10-01

The ESL is a new collaboration to build a full-f electromagnetic gyrokinetic code for tokamak edge plasmas using continuum methods. Target applications are edge turbulence and transport (neoclassical and anomalous), and edge-localized modes. Initially the project has three major threads: (i) verification and validation of TEMPEST, the project's initial (electrostatic) edge code which can be run in 4D (neoclassical and transport-timescale applications) or 5D (turbulence); (ii) design of the next generation code, which will include more complete physics (electromagnetics, fluid equation option, improved collisions) and advanced numerics (fully conservative, high-order discretization, mapped multiblock grids, adaptivity); and (iii) rapid-prototype codes to explore the issues attached to solving fully nonlinear gyrokinetics with steep radial gradients. We present a brief summary of the status of each of these activities.

  12. SU-E-CAMPUS-T-03: Four-Dimensional Dose Distribution Measurement Using Plastic Scintillator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hashimoto, M; Kozuka, T; Oguchi, M

    2014-06-15

Purpose: To develop a detector for four-dimensional dose distribution measurement. Methods: We made a prototype detector for four-dimensional dose distribution measurement using a cylindrical plastic scintillator (5 cm diameter) and a conical reflection glass. The plastic scintillator is used as a phantom: when it is irradiated, scintillation light is emitted according to the absorbed dose distribution. The conical reflection glass was arranged to surround the plastic scintillator and projects images of the scintillation light downstream. The projection image was then reflected 45 degrees by a flat reflection glass and recorded by a camcorder. By reconstructing the three-dimensional dose distribution from the projection image recorded in each frame, we could obtain the four-dimensional dose distribution. First, we tested the detector response as a function of the amount of emitted light. Then we compared the light profile with the dose profile calculated by the radiotherapy treatment planning system. Results: The amount of light was linear with dose. Pixels detecting smaller amounts of light had higher sensitivity than pixels detecting larger amounts, but this difference in sensitivity could be corrected from the amount of light detected in each pixel. Both the depth light profile through the conical reflection glass and the depth dose profile showed the same attenuation in the region deeper than the peak depth. In the lateral direction, the two profiles differed outside the field and in the penumbra region; we consider that this difference is caused by scatter of the scintillation light in the plastic scintillator block. Conclusion: It was possible to obtain the amount of light corresponding to the absorbed dose distribution from the prototype detector. Four-dimensional dose distributions can be reconstructed with high accuracy by correcting for the scattered light.
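
    The per-pixel sensitivity correction described in the results can be sketched as a flat-field style correction, dividing the recorded image by a relative sensitivity map. A minimal illustration with a hypothetical 2x2 map, not the authors' actual calibration:

```python
import numpy as np

def correct_projection(raw: np.ndarray, sensitivity: np.ndarray) -> np.ndarray:
    """Divide out a per-pixel sensitivity map (flat-field style correction).

    raw         : recorded light image (arbitrary units)
    sensitivity : relative pixel sensitivity measured under uniform irradiation
    """
    return raw / sensitivity

# Hypothetical 2x2 example: a uniform light field recorded through a
# non-uniform detector becomes flat again after correction.
sens = np.array([[1.1, 0.9], [1.0, 1.2]])
raw = 100.0 * sens          # uniform 100-unit field seen through the detector
flat = correct_projection(raw, sens)
print(np.allclose(flat, 100.0))  # True
```

    In practice the sensitivity map would be measured by recording a known uniform light field, after which each corrected frame can feed the three-dimensional reconstruction.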

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snyder, Sandra F.; Barnett, J. Matthew; Bisping, Lynn E.

This report documents the radionuclide air emissions that result in the 2014 highest effective dose equivalent (EDE) to an offsite member of the public, referred to as the maximally exposed individual (MEI). The report has been prepared in compliance with the Code of Federal Regulations (CFR), Title 40, Protection of the Environment, Part 61, National Emission Standards for Hazardous Air Pollutants (NESHAP), Subpart H, "National Emission Standards for Emissions of Radionuclides Other than Radon from Department of Energy Facilities," and Washington Administrative Code (WAC) Chapter 246-247, "Radiation Protection–Air Emissions." The dose to the PNNL Campus MEI due to routine major and minor point source emissions in 2014 from PNNL Campus sources is 2E-05 mrem (2E-07 mSv) EDE. The dose from all fugitive sources is 3E-6 mrem (3E-8 mSv) EDE. The dose from radon emissions is 1E-6 mrem (1E-8 mSv) EDE. No nonroutine emissions occurred in 2014. The total radiological dose for 2014 to the MEI from all PNNL Campus radionuclide emissions, including fugitive emissions and radon, is 3E-5 mrem (3E-7 mSv) EDE, more than 100,000 times smaller than the federal and state standard of 10 mrem/yr, with which the PNNL Campus is in compliance.
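
    Using the doses reported above, the stated margin to the 10 mrem/yr standard can be checked directly (values as reported, carried to one significant figure):

```python
# 2014 doses as reported (mrem effective dose equivalent)
point_sources = 2e-5   # routine major and minor point sources
fugitive = 3e-6        # all fugitive sources
radon = 1e-6           # radon emissions
total_reported = 3e-5  # reported total over all emission categories

standard = 10.0        # mrem/yr federal and state NESHAP standard

margin = standard / total_reported
print(margin > 1e5)  # True: more than 100,000 times below the limit
```

    The category doses are each rounded to one significant figure, so they need not sum exactly to the reported total; the compliance margin is insensitive to that rounding.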

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snyder, Sandra F.; Barnett, J. Matthew; Bisping, Lynn E.

This report documents the radionuclide air emissions that result in the highest effective dose equivalent (EDE) to a member of the public, referred to as the maximally exposed individual (MEI). The report has been prepared in compliance with the Code of Federal Regulations (CFR), Title 40, Protection of the Environment, Part 61, National Emission Standards for Hazardous Air Pollutants (NESHAP), Subpart H, "National Emission Standards for Emissions of Radionuclides Other than Radon from Department of Energy Facilities," and Washington Administrative Code (WAC) Chapter 246-247, "Radiation Protection–Air Emissions." The dose to the PNNL Site MEI due to routine major and minor point source emissions in 2013 from PNNL Site sources is 2E-05 mrem (2E-07 mSv) EDE. The dose from fugitive emissions (i.e., unmonitored sources) is 2E-6 mrem (2E-8 mSv) EDE. The dose from radon emissions is 1E-11 mrem (1E-13 mSv) EDE. No nonroutine emissions occurred in 2013. The total radiological dose for 2013 to the MEI from all PNNL Site radionuclide emissions, including fugitive emissions and radon, is 2E-5 mrem (2E-7 mSv) EDE, or 100,000 times smaller than the federal and state standard of 10 mrem/yr, with which the PNNL Site is in compliance.

  15. Biological dose estimation for charged-particle therapy using an improved PHITS code coupled with a microdosimetric kinetic model.

    PubMed

    Sato, Tatsuhiko; Kase, Yuki; Watanabe, Ritsuko; Niita, Koji; Sihver, Lembit

    2009-01-01

    Microdosimetric quantities such as lineal energy, y, are better indexes for expressing the RBE of HZE particles in comparison to LET. However, the use of microdosimetric quantities in computational dosimetry is severely limited because of the difficulty in calculating their probability densities in macroscopic matter. We therefore improved the particle transport simulation code PHITS, providing it with the capability of estimating the microdosimetric probability densities in a macroscopic framework by incorporating a mathematical function that can instantaneously calculate the probability densities around the trajectory of HZE particles with a precision equivalent to that of a microscopic track-structure simulation. A new method for estimating biological dose, the product of physical dose and RBE, from charged-particle therapy was established using the improved PHITS coupled with a microdosimetric kinetic model. The accuracy of the biological dose estimated by this method was tested by comparing the calculated physical doses and RBE values with the corresponding data measured in a slab phantom irradiated with several kinds of HZE particles. The simulation technique established in this study will help to optimize the treatment planning of charged-particle therapy, thereby maximizing the therapeutic effect on tumors while minimizing unintended harmful effects on surrounding normal tissues.
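
    The microdosimetric probability density f(y) discussed above is typically summarized by its frequency-mean and dose-mean lineal energies, which feed RBE models such as the microdosimetric kinetic model. A minimal numerical sketch with a hypothetical density, not PHITS output:

```python
import numpy as np

def _trapz(g: np.ndarray, y: np.ndarray) -> float:
    """Trapezoidal integration (written out to avoid NumPy-version differences)."""
    return float(np.sum((g[1:] + g[:-1]) * np.diff(y)) / 2.0)

def mean_lineal_energies(y: np.ndarray, f: np.ndarray):
    """Frequency-mean y_F and dose-mean y_D lineal energy from a sampled
    probability density f(y) on a grid y (keV/um):
        y_F = int y f(y) dy,   y_D = int y^2 f(y) dy / y_F
    """
    f = f / _trapz(f, y)            # normalise the density
    y_f = _trapz(y * f, y)
    y_d = _trapz(y * y * f, y) / y_f
    return y_f, y_d

# Hypothetical density (NOT a PHITS result): exponential with mean 2 keV/um,
# for which analytically y_F = 2 and y_D = 4.
y = np.linspace(0.0, 60.0, 6001)
f = np.exp(-y / 2.0)
y_f, y_d = mean_lineal_energies(y, f)
print(round(y_f, 2), round(y_d, 2))  # 2.0 4.0
```

    The dose-mean y_D weights rare high-y events heavily, which is why microdosimetric quantities track RBE better than LET for HZE particles.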

  16. Characterization of Filters Loaded With Reactor Strontium Carbonate - 13203

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Josephson, Walter S.; Steen, Franciska H.

A collection of three highly radioactive filters containing reactor strontium carbonate was being prepared for disposal. All three filters were approximately characterized at the time of manufacture by gravimetric methods. The first filter had been partially emptied, and the quantity of residual activity was uncertain. Dose-rate-to-activity modeling using the Monte Carlo N-Particle (MCNP) code was selected to confirm the gravimetric characterization of the full filters and to fully characterize the partially emptied filter. Although dose-rate-to-activity modeling using MCNP is a common technique, it is not often used for bremsstrahlung-dominant materials such as reactor strontium. As a result, different MCNP modeling options were compared to determine the optimum approach. This comparison indicated that the accuracy of the results was heavily dependent on the MCNP modeling details and the location of the dose rate measurement point. The optimum model utilized a photon spectrum generated by the Oak Ridge Isotope Generation and Depletion (ORIGEN) code and dose rates measured at 30 cm. Results from the optimum model agreed with the gravimetric estimates within 15%. It was demonstrated that dose-rate-to-activity modeling can be successful for bremsstrahlung-dominant radioactive materials; however, the degree of success is heavily dependent on the choice of modeling techniques. (authors)
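
    Dose-rate-to-activity modeling as described above amounts to scaling a modeled dose rate per unit activity up to the measured rate. A minimal sketch with hypothetical numbers; the 15% criterion mirrors the agreement level quoted in the abstract:

```python
def activity_from_dose_rate(measured_rate_mSv_h: float,
                            modeled_rate_per_GBq_mSv_h: float) -> float:
    """Infer source activity (GBq) from a measured dose rate and an
    MCNP-style modeled dose rate per unit activity at the same point."""
    return measured_rate_mSv_h / modeled_rate_per_GBq_mSv_h

def within(a: float, b: float, tol: float = 0.15) -> bool:
    """Relative agreement check, e.g. model-derived vs gravimetric estimate."""
    return abs(a - b) / b <= tol

# Hypothetical: 12 mSv/h measured at 30 cm; the model predicts
# 0.004 mSv/h per GBq at the same measurement point.
act = activity_from_dose_rate(12.0, 0.004)
print(round(act))                 # 3000 GBq
print(within(act, 2700.0))        # True: ~11% from a gravimetric 2700 GBq
```

    The sensitivity noted in the abstract enters through the modeled rate per unit activity: an inaccurate source spectrum or measurement-point geometry biases that denominator and hence the inferred activity.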

  17. Effective dose reduction in spine radiographic imaging by choosing the less radiation-sensitive side of the body.

    PubMed

    Ben-Shlomo, Avi; Bartal, Gabriel; Mosseri, Morris; Avraham, Boaz; Leitner, Yosef; Shabat, Shay

    2016-04-01

X-ray absorption is highest in the organs and tissues located closest to the radiation source: the photon flux that crosses the body decreases from the entry surface toward the image receptor, and the internal organs absorb x-rays and shield each other during irradiation. Therefore, changing the x-ray projection angle relative to the patient for specific spine procedures changes the radiation dose that each organ receives. Because every organ has a different radiation sensitivity, irradiating from different sides of the body changes the biological effect and the radiation risk to the whole body, that is, the effective dose (ED). The study aimed to determine the less radiation-sensitive sides of the body for lateral and anterior-posterior (AP) or posterior-anterior (PA) projections, using exposure of patient phantoms and Monte Carlo simulation of the effective doses. Calculations for adults and 10-year-old children were included because the pediatric population has a greater lifetime radiation risk than adults. Pediatric and adult tissue and organ doses and ED from cervical, thoracic, and lumbar spine x-ray examinations were calculated for different projections using standard mathematical phantoms for adults and 10-year-old children and PCXMC 2.0 software, which is based on Monte Carlo simulations. The study was not funded, and the authors have no conflicts of interest to declare. Spine x-ray exposure from various right lateral (RT LAT) projection angles was associated with lower ED than the same left lateral (LT LAT) projections (up to 28% and 27% less for children aged 10 and adults, respectively). The PA spine projections showed up to 64% lower ED for children aged 10 and 65% for adults compared with AP projections. The AP projection at the thoracic spine causes an excess breast dose of 543.3% and 597.0% for children aged 10 and adults, respectively. Radiation ED in spine procedures can be significantly reduced by performing x-ray exposures through the less radiation-sensitive sides of the body, which are PA in the frontal position and right lateral in the lateral position. Copyright © 2015 Elsevier Inc. All rights reserved.
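
    The effective dose discussed above is a weighted sum of organ equivalent doses, ED = Σ_T w_T H_T. A minimal sketch using a few ICRP 103 tissue weighting factors and hypothetical organ doses; a full ED calculation sums over all tissues, as PCXMC does internally:

```python
# A subset of the ICRP 103 tissue weighting factors w_T
# (a complete effective dose requires all tissues, summing w_T to 1).
W_T = {"breast": 0.12, "lung": 0.12, "stomach": 0.12,
       "thyroid": 0.04, "liver": 0.04, "skin": 0.01}

def partial_effective_dose(organ_dose_mSv: dict) -> float:
    """Sum w_T * H_T over the organs provided (a partial ED contribution, mSv)."""
    return sum(W_T[organ] * h for organ, h in organ_dose_mSv.items())

# Hypothetical equivalent doses (mSv) for an AP thoracic-spine exposure;
# the breast term dominates because the breast faces the entrance surface.
doses = {"breast": 0.50, "lung": 0.40, "thyroid": 0.10}
print(round(partial_effective_dose(doses), 4))  # 0.112
```

    This weighting is exactly why a PA projection lowers ED: the radiosensitive anterior organs sit on the exit side, where the photon flux is attenuated.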

  18. Documentation for the Southeast Asia seismic hazard maps

    USGS Publications Warehouse

    Petersen, Mark; Harmsen, Stephen; Mueller, Charles; Haller, Kathleen; Dewey, James; Luco, Nicolas; Crone, Anthony; Lidke, David; Rukstales, Kenneth

    2007-01-01

    The U.S. Geological Survey (USGS) Southeast Asia Seismic Hazard Project originated in response to the 26 December 2004 Sumatra earthquake (M9.2) and the resulting tsunami that caused significant casualties and economic losses in Indonesia, Thailand, Malaysia, India, Sri Lanka, and the Maldives. During the course of this project, several great earthquakes ruptured subduction zones along the southern coast of Indonesia (fig. 1) causing additional structural damage and casualties in nearby communities. Future structural damage and societal losses from large earthquakes can be mitigated by providing an advance warning of tsunamis and introducing seismic hazard provisions in building codes that allow buildings and structures to withstand strong ground shaking associated with anticipated earthquakes. The Southeast Asia Seismic Hazard Project was funded through a United States Agency for International Development (USAID)—Indian Ocean Tsunami Warning System to develop seismic hazard maps that would assist engineers in designing buildings that will resist earthquake strong ground shaking. An important objective of this project was to discuss regional hazard issues with building code officials, scientists, and engineers in Thailand, Malaysia, and Indonesia. The code communities have been receptive to these discussions and are considering updating the Thailand and Indonesia building codes to incorporate new information (for example, see notes from Professor Panitan Lukkunaprasit, Chulalongkorn University in Appendix A).

  19. LArSoft: toolkit for simulation, reconstruction and analysis of liquid argon TPC neutrino detectors

    NASA Astrophysics Data System (ADS)

    Snider, E. L.; Petrillo, G.

    2017-10-01

LArSoft is a set of detector-independent software tools for the simulation, reconstruction and analysis of data from liquid argon (LAr) neutrino experiments. The common features of LAr time projection chambers (TPCs) enable sharing of algorithm code across detectors of very different size and configuration. LArSoft is currently used in production simulation and reconstruction by the ArgoNeuT, DUNE, LArIAT, MicroBooNE, and SBND experiments. The software suite offers a wide selection of algorithms and utilities, including those for associated photo-detectors and the handling of auxiliary detectors outside the TPCs. Available algorithms cover the full range of simulation and reconstruction, from raw waveforms to high-level reconstructed objects, event topologies and classification. The common code within LArSoft is contributed by adopting experiments, which also provide detector-specific geometry descriptions and code for the treatment of electronic signals. LArSoft is also a collaboration of experiments, Fermilab and associated software projects which cooperate in setting requirements, priorities, and schedules. In this talk, we outline the general architecture of the software and the interaction with external libraries and detector-specific code. We also describe the dynamics of LArSoft software development between the contributing experiments, the projects supporting the software infrastructure LArSoft relies on, and the core LArSoft support project.

  20. Differences in effective dose and energy imparted estimation from PA, AP, RLAT and LLAT projections in pediatric full spine x-ray examination using the Monte Carlo technique

    NASA Astrophysics Data System (ADS)

    Gialousis, George I.; Yakoumakis, Emmanouel N.; Papadopoulou, Despina I.; Makri, Triantafillia K.; Yakoumakis, Nikolaos E.; Dimitriou, Panayiotis A.; Georgiou, Evangelos K.

    2006-01-01

Effective dose (E) and energy imparted (ɛ) can be used to quantify the risk of radiation-induced carcinogenesis or hereditary effects arising from radiographic exposures. When children are examined or treated for idiopathic scoliokyphosis, it is important to estimate the E and ɛ they receive from the full spine x-ray examination. The aim of this study is to calculate E and ɛ for children aged 5 and 10 years who undergo full spine x-ray examination, using a Monte Carlo approach. Dose-area product (DAP) and entrance surface dose (ESD) were also used. AP, PA, RLAT, and LLAT projections were simulated using appropriate energy spectra. According to the results, the effective dose (E) and the energy imparted (ɛ) are smaller for the PA projection than for AP, although for the spine itself the opposite occurs, in agreement with previous studies. On the other hand, E and ɛ do not differ statistically between the RLAT and LLAT projections. Moreover, the role of lung and bone as tissue inhomogeneities in ɛ is shown to be very important.

  1. Metallic artifact mitigation and organ-constrained tissue assignment for Monte Carlo calculations of permanent implant lung brachytherapy.

    PubMed

    Sutherland, J G H; Miksys, N; Furutani, K M; Thomson, R M

    2014-01-01

    To investigate methods of generating accurate patient-specific computational phantoms for the Monte Carlo calculation of lung brachytherapy patient dose distributions. Four metallic artifact mitigation methods are applied to six lung brachytherapy patient computed tomography (CT) images: simple threshold replacement (STR) identifies high CT values in the vicinity of the seeds and replaces them with estimated true values; fan beam virtual sinogram replaces artifact-affected values in a virtual sinogram and performs a filtered back-projection to generate a corrected image; 3D median filter replaces voxel values that differ from the median value in a region of interest surrounding the voxel and then applies a second filter to reduce noise; and a combination of fan beam virtual sinogram and STR. Computational phantoms are generated from artifact-corrected and uncorrected images using several tissue assignment schemes: both lung-contour constrained and unconstrained global schemes are considered. Voxel mass densities are assigned based on voxel CT number or using the nominal tissue mass densities. Dose distributions are calculated using the EGSnrc user-code BrachyDose for (125)I, (103)Pd, and (131)Cs seeds and are compared directly as well as through dose volume histograms and dose metrics for target volumes surrounding surgical sutures. Metallic artifact mitigation techniques vary in ability to reduce artifacts while preserving tissue detail. Notably, images corrected with the fan beam virtual sinogram have reduced artifacts but residual artifacts near sources remain requiring additional use of STR; the 3D median filter removes artifacts but simultaneously removes detail in lung and bone. Doses vary considerably between computational phantoms with the largest differences arising from artifact-affected voxels assigned to bone in the vicinity of the seeds. 
Consequently, when metallic artifact reduction and constrained tissue assignment within lung contours are employed in generated phantoms, this erroneous assignment is reduced, generally resulting in higher doses. Lung-constrained tissue assignment also results in increased doses in regions of interest due to a reduction in the erroneous assignment of adipose to voxels within lung contours. Differences in dose metrics calculated for different computational phantoms are sensitive to radionuclide photon spectra with the largest differences for (103)Pd seeds and smallest but still considerable differences for (131)Cs seeds. Despite producing differences in CT images, dose metrics calculated using the STR, fan beam + STR, and 3D median filter techniques produce similar dose metrics. Results suggest that the accuracy of dose distributions for permanent implant lung brachytherapy is improved by applying lung-constrained tissue assignment schemes to metallic artifact corrected images.
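
    The comparisons above rely on dose-volume histograms and dose metrics for the target volumes. As a minimal illustration using the generic definitions (not BrachyDose output), D_x and a cumulative DVH can be computed from a voxel dose array:

```python
import numpy as np

def d_x(dose_voxels: np.ndarray, x_percent: float) -> float:
    """D_x: minimum dose received by the hottest x% of the volume,
    a standard dose-volume metric (e.g. D90 for x_percent=90)."""
    d = np.sort(np.asarray(dose_voxels, dtype=float))[::-1]  # descending
    n = max(1, int(np.ceil(x_percent / 100.0 * d.size)))
    return float(d[n - 1])

def cumulative_dvh(dose_voxels, bins):
    """Fraction of the volume receiving at least each bin dose."""
    d = np.asarray(dose_voxels, dtype=float)
    return np.array([(d >= b).mean() for b in bins])

# Toy 10-voxel "target" with doses 1..10 Gy (illustrative only)
doses = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0])
print(d_x(doses, 90))                 # 2.0: 90% of voxels receive >= 2 Gy
print(cumulative_dvh(doses, [5.0]))   # [0.6]: 60% of voxels receive >= 5 Gy
```

    Metrics of this kind make phantom-to-phantom differences comparable in a single number, which is how the sensitivity to tissue assignment and radionuclide spectrum is quantified above.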

  2. Downtown Waterfront Form-Based Code Workshop

    EPA Pesticide Factsheets

    This document is a description of a Smart Growth Implementation Assistance for Coastal Communities project in Marquette, Michigan, to develop a form-based code that would attract and support vibrant development.

  3. Sci-Sat AM: Radiation Dosimetry and Practical Therapy Solutions - 05: Not all geometries are equivalent for magnetic field Fano cavity tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malkov, Victor N.; Rogers, David W.O.

The coupling of MRI and radiation treatment systems for magnetic resonance guided radiation therapy necessitates a reliable magnetic-field-capable Monte Carlo (MC) code. In addition to the influence of the magnetic field on dose distributions, the question of proper calibration has arisen due to the several percent variation of ion chamber and solid-state detector responses in magnetic fields when compared to the 0 T case (Reynolds et al., Med Phys, 2013). In the absence of a magnetic field, EGSnrc has been shown to pass the Fano cavity test (a rigorous benchmarking tool for MC codes) at the 0.1% level (Kawrakow, Med Phys, 2000), and similar results should be required of magnetic-field-capable MC algorithms. To properly test such developing MC codes, the Fano cavity theorem has been adapted to function in a magnetic field (Bouchard et al., PMB, 2015). In this work, the Fano cavity test is applied in slab and ion-chamber-like geometries to test the transport options of a magnetic field algorithm implemented in EGSnrc. Results show that the deviation of the MC dose from the expected Fano cavity theory value is highly sensitive to the choice of geometry, and the ion chamber geometry appears to pass the test more easily than larger slab geometries. As magnetic field MC codes begin to be used for dose simulations and correction factor calculations, care must be taken to apply the most rigorous Fano test geometries to ensure the reliability of such algorithms.

  4. Quality improvement of International Classification of Diseases, 9th revision, diagnosis coding in radiation oncology: single-institution prospective study at University of California, San Francisco.

    PubMed

    Chen, Chien P; Braunstein, Steve; Mourad, Michelle; Hsu, I-Chow J; Haas-Kogan, Daphne; Roach, Mack; Fogh, Shannon E

    2015-01-01

Accurate International Classification of Diseases (ICD) diagnosis coding is critical for patient care, billing purposes, and research endeavors. In this single-institution study, we evaluated our baseline ICD-9 (9th revision) diagnosis coding accuracy, identified the most common errors contributing to inaccurate coding, and implemented a multimodality strategy to improve radiation oncology coding. We prospectively studied ICD-9 coding accuracy in our radiation therapy-specific electronic medical record system. Baseline accuracy was obtained from chart review of all patients treated at our institution between March and June of 2010. To improve performance, an educational session highlighted common coding errors, and a user-friendly software tool, RadOnc ICD Search, version 1.0, for coding radiation oncology-specific diagnoses was implemented. We then prospectively analyzed ICD-9 coding accuracy for all patients treated from July 2010 to June 2011, with the goal of maintaining 80% or higher coding accuracy. Data on coding accuracy were analyzed and fed back monthly to individual providers. Baseline coding accuracy for physicians was 463 of 661 (70%) cases, and only 46% of physicians had coding accuracy above 80%. The most common errors involved metastatic cases, in which primary or secondary site ICD-9 codes were either incorrect or missing, and special procedures such as stereotactic radiosurgery cases. After implementing our project, overall coding accuracy rose to 92% (range, 86%-96%). The median accuracy for all physicians was 93% (range, 77%-100%), with only 1 attending having accuracy below 80%. Incorrect primary and secondary ICD-9 codes in metastatic cases showed the most significant improvement (10% vs 2% after intervention). Identifying common coding errors and implementing both education and systems changes led to significantly improved coding accuracy. This quality assurance project highlights the potential problem of ICD-9 coding accuracy by physicians and offers an approach to effectively address this shortcoming. Copyright © 2015. Published by Elsevier Inc.
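
    The accuracy figures above are simple proportions tracked monthly against the 80% goal; a trivial sketch of the metric, using the counts quoted in the abstract:

```python
def coding_accuracy(correct: int, total: int) -> float:
    """Fraction of charts with correct ICD-9 codes, as a percentage."""
    return 100.0 * correct / total

def meets_goal(accuracy_percent: float, goal: float = 80.0) -> bool:
    """The study's maintenance goal: 80% or higher coding accuracy."""
    return accuracy_percent >= goal

baseline = coding_accuracy(463, 661)       # baseline figures from the abstract
print(round(baseline))                     # 70
print(meets_goal(baseline), meets_goal(92.0))  # False True
```

    Feeding a per-provider version of this number back each month is the feedback loop the study credits for the rise from 70% to 92%.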

  5. Radiation dose of digital tomosynthesis for sinonasal examination: comparison with multi-detector CT.

    PubMed

    Machida, Haruhiko; Yuhara, Toshiyuki; Tamura, Mieko; Numano, Tomokazu; Abe, Shinji; Sabol, John M; Suzuki, Shigeru; Ueno, Eiko

    2012-06-01

    Using an anthropomorphic phantom, we have investigated the feasibility of digital tomosynthesis (DT) of flat-panel detector (FPD) radiography to reduce radiation dose for sinonasal examination compared to multi-detector computed tomography (MDCT). A female Rando phantom was scanned covering frontal to maxillary sinus using the clinically routine protocol by both 64-detector CT (120 kV, 200 mAs, and 1.375-pitch) and DT radiography (80 kV, 1.0 mAs per projection, 60 projections, 40° sweep, and posterior-anterior projections). Glass dosimeters were used to measure the radiation dose to internal organs including the thyroid gland, brain, submandibular gland, and the surface dose at various sites including the eyes during those scans. We compared the radiation dose to those anatomies between both modalities. In DT radiography, the doses of the thyroid gland, brain, submandibular gland, skin, and eyes were 230 ± 90 μGy, 1770 ± 560 μGy, 1400 ± 80 μGy, 1160 ± 2100 μGy, and 112 ± 6 μGy, respectively. These doses were reduced to approximately 1/5, 1/8, 1/12, 1/17, and 1/290 of the respective MDCT dose. For sinonasal examinations, DT radiography enables dramatic reduction in radiation exposure and dose to the head and neck region, particularly to the lens of the eye. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  6. Evaluation of the effective dose during BNCT at TRR thermal column epithermal facility.

    PubMed

    Jarahi, Hossein; Kasesaz, Yaser; Saleh-Koutahi, Seyed Mohsen

    2016-04-01

    An epithermal neutron beam has recently been designed for Boron Neutron Capture Therapy (BNCT) at the thermal column of the Tehran Research Reactor (TRR). In this paper, the whole-body effective dose and the equivalent doses to several organs in this facility have been calculated using the MCNP4C Monte Carlo code. The effective dose was calculated from the absorbed doses determined for each individual organ, taking into account the radiation and tissue weighting factors. The ICRP 110 whole-body male phantom was used as the patient model. It was found that the effective dose during BNCT of a brain tumor is 0.90 Sv. This effective dose may induce a 4% secondary cancer risk. Copyright © 2016 Elsevier Ltd. All rights reserved.
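
    The effective-dose calculation described above, a weighted sum of organ equivalent doses, can be sketched as follows. The tissue weighting factors are the published ICRP 103 values, but the per-organ equivalent doses are hypothetical illustrative numbers, not the TRR/MCNP4C results.

```python
# Effective dose E = sum over tissues T of w_T * H_T, where H_T is the
# equivalent dose to tissue T. Weights follow ICRP Publication 103 and sum to 1.
ICRP103_WT = {
    "red_bone_marrow": 0.12, "colon": 0.12, "lung": 0.12, "stomach": 0.12,
    "breast": 0.12, "remainder": 0.12, "gonads": 0.08,
    "bladder": 0.04, "liver": 0.04, "oesophagus": 0.04, "thyroid": 0.04,
    "skin": 0.01, "bone_surface": 0.01, "brain": 0.01, "salivary_glands": 0.01,
}

def effective_dose(equivalent_dose_sv):
    """Weighted sum over tissues; organs missing from the input contribute 0."""
    return sum(w * equivalent_dose_sv.get(t, 0.0) for t, w in ICRP103_WT.items())

# Hypothetical per-organ equivalent doses (Sv) for a head irradiation:
doses = {"brain": 12.0, "thyroid": 1.5, "salivary_glands": 3.0, "skin": 0.8}
print(round(effective_dose(doses), 3))  # -> 0.218
```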

  7. Scattered radiation doses absorbed by technicians at different distances from X-ray exposure: Experiments on prosthesis.

    PubMed

    Chiang, Hsien-Wen; Liu, Ya-Ling; Chen, Tou-Rong; Chen, Chun-Lon; Chiang, Hsien-Jen; Chao, Shin-Yu

    2015-01-01

    This work investigated the spatial distribution of scattered radiation doses induced by exposures from a portable X-ray (PX) unit and a C-arm machine, and estimated the doses absorbed at 2 m from the central exposure point by medical staff not shielded by lead clothing. Using the Rando phantom, several frequently X-rayed body parts were exposed, and the scattered radiation doses were measured with ionization chamber dosimeters at various angles from the patient. Assuming that the central point of the X-ray beam was located at the navel, five detection points were distributed in the operating room at 1 m above the ground and 1-2 m from the central point horizontally. Fluke Biomedical model 660-5DE (400 cc) and 660-3DE (4 cc) ion chambers were used to detect the air dose at a distance of approximately 2 m from the central point. The radiation dose measured at point B was the lowest: the scattered dose absorbed by the prosthesis from the X-ray's vertical projection was 0.07 ± 0.03 μGy, which was less than background radiation levels. For the AP projection, the dose at point B was the lowest (0.07 ± 0.03 μGy) and the dose at point D was the highest (0.26 ± 0.08 μGy). Considering only the vertical projection, the dose at point B was the lowest (0.52 μGy) and the dose at point E was the highest (4 μGy). For the PA projection, the dose at point B was the lowest (0.36 μGy) and the dose at point E was the highest (2.77 μGy), amounting to 10-32% of the maximum doses. The maximum dose among the five directions was nine times the minimum dose. When the PX and C-arm machines were used, the radiation doses at a distance of 2 m were attenuated to the background radiation level. A radiologist without a lead shield should therefore stand at point B, at the patient's feet. 
Accordingly, teaching materials on radiation safety for radiological interns and clinical technicians were formulated.

  8. Noncoding sequence classification based on wavelet transform analysis: part I

    NASA Astrophysics Data System (ADS)

    Paredes, O.; Strojnik, M.; Romo-Vázquez, R.; Vélez Pérez, H.; Ranta, R.; Garcia-Torales, G.; Scholl, M. K.; Morales, J. A.

    2017-09-01

    DNA sequences in the human genome can be divided into coding and noncoding sequences. Coding sequences are those that are read during transcription. The identification of coding sequences has been widely reported in the literature owing to their much-studied periodicity. Noncoding sequences represent the majority of the human genome. They play an important role in gene regulation and in differentiation among cells. However, noncoding sequences do not exhibit periodicities that correlate with their functions. The ENCODE (Encyclopedia of DNA Elements) and Roadmap Epigenomics projects have cataloged human noncoding sequences into specific functions. We study characteristics of noncoding sequences with wavelet analysis of genomic signals.

  9. Country Report on Building Energy Codes in Australia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shui, Bin; Evans, Meredydd; Somasundaram, Sriram

    2009-04-02

    This report is part of a series on building energy efficiency codes in countries associated with the Asia-Pacific Partnership (APP): Australia, South Korea, Japan, China, India, and the United States of America (U.S.). This report gives an overview of the development of building energy codes in Australia, including national energy policies related to building energy codes, the history of the codes, and recent national projects and activities to promote them. The report also reviews current building energy code provisions (such as building envelope, HVAC, and lighting) for commercial and residential buildings in Australia.

  10. Final Report for ALCC Allocation: Predictive Simulation of Complex Flow in Wind Farms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barone, Matthew F.; Ananthan, Shreyas; Churchfield, Matt

    This report documents work performed using ALCC computing resources granted under a proposal submitted in February 2016, with the resource allocation period spanning July 2016 through June 2017. The award allocation was 10.7 million processor-hours at the National Energy Research Scientific Computing Center. The simulations performed were in support of two projects: the Atmosphere to Electrons (A2e) project, supported by the DOE EERE office; and the Exascale Computing Project (ECP), supported by the DOE Office of Science. The project team for both efforts consists of staff scientists and postdocs from Sandia National Laboratories and the National Renewable Energy Laboratory. At the heart of these projects is the open-source computational-fluid-dynamics (CFD) code, Nalu. Nalu solves the low-Mach-number Navier-Stokes equations using an unstructured-grid discretization. Nalu leverages the open-source Trilinos solver library and the Sierra Toolkit (STK) for parallelization and I/O. This report documents baseline computational performance of the Nalu code on problems of direct relevance to the wind plant physics application, namely Large Eddy Simulation (LES) of an atmospheric boundary layer (ABL) flow and wall-modeled LES of flow past a static wind turbine rotor blade. Parallel performance of Nalu and its constituent solver routines residing in the Trilinos library has been assessed previously under various campaigns. However, both Nalu and Trilinos have been, and remain, in active development, and resources have not previously been available to rigorously track code performance over time. With the initiation of the ECP, it is important to establish and document baseline code performance on the problems of interest. This will allow the project team to identify and target any deficiencies in performance, as well as highlight any performance bottlenecks as we exercise the code on a greater variety of platforms and at larger scales. The current study is rather modest in scale, examining performance on problem sizes of O(100 million) elements and core counts up to 8k cores. This will be expanded as more computational resources become available to the projects.

  11. SU-E-T-238: Monte Carlo Estimation of Cerenkov Dose for Photo-Dynamic Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chibani, O; Price, R; Ma, C

    Purpose: Estimation of the Cerenkov dose from high-energy megavoltage photon and electron beams in tissue and its impact on radiosensitization using protoporphyrin IX (PpIX) for tumor targeting enhancement in radiotherapy. Methods: The GEPTS Monte Carlo code is used to generate dose distributions from an 18-MV Varian photon beam and generic high-energy (45-MV) photon and (45-MeV) electron beams in a voxel-based tissue-equivalent phantom. In addition to calculating the ionization dose, the code scores the Cerenkov energy released in the wavelength range 375–425 nm, corresponding to the peak of the PpIX absorption spectrum (Fig. 1), using the Frank-Tamm formula. Results: The simulations show that the Cerenkov dose suitable for activating PpIX is 4000 to 5500 times lower than the overall radiation dose for all considered beams (18 MV, 45 MV, and 45 MeV). These results contradict the recent experimental studies by Axelsson et al. (Med. Phys. 38 (2011) p 4127), where the Cerenkov dose was reported to be only two orders of magnitude lower than the radiation dose. Note that our simulation results can be corroborated by a simple model in which the Frank-Tamm formula is applied to electrons with a 2 MeV/cm stopping power generating Cerenkov photons in the 375–425 nm range, assuming these photons have less than 1 mm penetration in tissue. Conclusion: The Cerenkov dose generated by high-energy photon and electron beams may produce minimal clinical effect in comparison with the photon fluence (or dose) commonly used for photo-dynamic therapy. At the present time, it is unclear whether Cerenkov radiation is a significant contributor to the recently observed tumor regression in patients receiving radiotherapy and PpIX versus patients receiving radiotherapy only. The ongoing study will include animal experimentation and investigation of dose-rate effects on PpIX response.
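
    The Frank-Tamm formula mentioned in the Methods gives the Cerenkov photon yield per unit path length within a wavelength band; a minimal sketch, assuming a soft-tissue refractive index of n = 1.4 (an assumed value, not taken from the abstract).

```python
import math

ALPHA = 1 / 137.036          # fine-structure constant
ME_MEV = 0.511               # electron rest energy (MeV)

def beta(kinetic_mev):
    """Electron speed v/c from its kinetic energy."""
    gamma = 1.0 + kinetic_mev / ME_MEV
    return math.sqrt(1.0 - 1.0 / gamma**2)

def cerenkov_photons_per_cm(kinetic_mev, n=1.4, lam1_nm=375.0, lam2_nm=425.0):
    """Frank-Tamm yield dN/dx (photons/cm) in the band lam1..lam2 for an
    electron in a medium of refractive index n."""
    b = beta(kinetic_mev)
    if b * n <= 1.0:
        return 0.0  # below the Cerenkov threshold: no emission
    band = 1.0 / (lam1_nm * 1e-7) - 1.0 / (lam2_nm * 1e-7)  # cm^-1
    return 2.0 * math.pi * ALPHA * band * (1.0 - 1.0 / (b * n) ** 2)

print(cerenkov_photons_per_cm(10.0))   # ~70 photons/cm in the 375-425 nm band
print(cerenkov_photons_per_cm(0.05))   # 0.0: a 50 keV electron is sub-threshold
```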

  12. Reconstruction of organ dose for external radiotherapy patients in retrospective epidemiologic studies

    NASA Astrophysics Data System (ADS)

    Lee, Choonik; Jung, Jae Won; Pelletier, Christopher; Pyakuryal, Anil; Lamart, Stephanie; Kim, Jong Oh; Lee, Choonsik

    2015-03-01

    Organ dose estimation for retrospective epidemiological studies of late effects in radiotherapy patients involves two challenges: radiological images representing patient anatomy are not usually available for patient cohorts treated years ago, and efficient dose reconstruction methods for large-scale patient cohorts are not well established. In the current study, we developed methods to reconstruct organ doses for radiotherapy patients by using a series of computational human phantoms coupled with a commercial treatment planning system (TPS) and a radiotherapy-dedicated Monte Carlo transport code, and performed illustrative dose calculations. First, we developed methods to convert the anatomy and organ contours of the pediatric and adult hybrid computational phantom series to Digital Imaging and Communications in Medicine (DICOM)-image and DICOM-structure files, respectively. The resulting DICOM files were imported into a commercial TPS for simulating radiotherapy and calculating dose to in-field organs. The conversion process was validated by comparing electron densities relative to water and organ volumes between the hybrid phantoms and the DICOM files imported into the TPS, which agreed within 0.1 and 2%, respectively. Second, we developed a procedure to transfer DICOM-RT files generated from the TPS directly to a Monte Carlo transport code, X-ray Voxel Monte Carlo (XVMC), for more accurate dose calculations. Third, to illustrate the performance of the established methods, we simulated a whole-brain treatment for the 10-year-old male phantom and a prostate treatment for the adult male phantom. Radiation doses to selected organs were calculated using the TPS and XVMC and compared to each other. Organ average doses from the two methods matched within 7%, whereas maximum and minimum point doses differed by up to 45%. 
The dosimetry methods and procedures established in this study will be useful for the reconstruction of organ dose to support retrospective epidemiological studies of late effects in radiotherapy patients.

  13. Side information in coded aperture compressive spectral imaging

    NASA Astrophysics Data System (ADS)

    Galvis, Laura; Arguello, Henry; Lau, Daniel; Arce, Gonzalo R.

    2017-02-01

    Coded aperture compressive spectral imagers sense a three-dimensional data cube by using two-dimensional projections of the coded and spectrally dispersed source. These imaging systems often rely on focal plane array (FPA) detectors, spatial light modulators (SLMs), digital micromirror devices (DMDs), and dispersive elements. Using DMDs to implement the coded apertures facilitates the capture of multiple projections, each admitting a different coded aperture pattern. The DMD makes it possible not only to collect a sufficient number of measurements for spectrally rich or spatially detailed scenes but also to design the spatial structure of the coded apertures to maximize the information content of the compressive measurements. Although sparsity is the only signal characteristic usually assumed for reconstruction in compressive sensing, other forms of prior information, such as side information, have been included as a way to improve the quality of the reconstructions. This paper presents coded aperture design in a compressive spectral imager with side information in the form of RGB images of the scene. The use of RGB images as side information in the compressive sensing architecture has two main advantages: the RGB image is used not only to improve the reconstruction quality but also to optimally design the coded apertures for the sensing process. The coded aperture design is based on the RGB scene, and thus the coded aperture structure exploits key features such as scene edges. Reconstructions from real, noisy compressed measurements demonstrate the benefit of the designed coded apertures in addition to the improvement in reconstruction quality obtained by the use of side information.
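
    The single-shot measurement process described above can be sketched with a toy forward model: a binary DMD code masks the data cube, and dispersion shears each spectral band before integration on the FPA. The cube size and the random code below are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
H, W, L = 8, 8, 4                      # spatial size and number of spectral bands
cube = rng.random((H, W, L))           # toy spectral data cube

# Coded aperture: each DMD mirror either passes (1) or blocks (0) a pixel.
code = rng.integers(0, 2, size=(H, W)).astype(float)

def cassi_measurement(cube, code):
    """Code the cube, shear each band by its index along the columns
    (modeling the dispersive element), then integrate over wavelength."""
    H, W, L = cube.shape
    y = np.zeros((H, W + L - 1))
    for k in range(L):
        y[:, k:k + W] += code * cube[:, :, k]
    return y

y = cassi_measurement(cube, code)
print(y.shape)   # (8, 11): the dispersion widens the detector by L - 1 columns
```

The model is linear in the cube, which is what lets sparsity or side-information priors drive the reconstruction.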

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hiller, Mauritius M.; Veinot, Kenneth G.; Easterly, Clay E.

    In this study, methods are addressed to reduce the computational time needed to compute organ-dose-rate coefficients using Monte Carlo techniques. Several variance reduction techniques are compared, including the reciprocity method, importance sampling, weight windows, and the use of the ADVANTG software package. For low-energy photons, the runtime was reduced by a factor of 10⁵ when using the reciprocity method for kerma computation for immersion of a phantom in contaminated water. This is particularly significant since impractically long simulation times are required to achieve reasonable statistical uncertainties in organ dose for low-energy photons in this source medium and geometry. Although the MCNP Monte Carlo code is used in this paper, the reciprocity technique can be used equally well with other Monte Carlo codes.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seitz, R.R.; Rittmann, P.D.; Wood, M.I.

    The US Department of Energy Headquarters established a performance assessment task team (PATT) to integrate the activities of DOE sites that are preparing performance assessments for the disposal of newly generated low-level waste. The PATT chartered a subteam with the task of comparing the computer codes and exposure scenarios used for dose calculations in performance assessments. This report documents the efforts of the subteam. Computer codes considered in the comparison include GENII, PATHRAE-EPA, MICROSHIELD, and ISOSHLD. Calculations were also conducted using spreadsheets to provide a comparison at the most fundamental level. Calculations and modeling approaches are compared for unit radionuclide concentrations in water and soil for the ingestion, inhalation, and external dose pathways. Over 30 tables comparing inputs and results are provided.

  16. Quantum image coding with a reference-frame-independent scheme

    NASA Astrophysics Data System (ADS)

    Chapeau-Blondeau, François; Belin, Etienne

    2016-07-01

    For binary images, or bit planes of non-binary images, we investigate the possibility of a quantum coding decodable by a receiver in the absence of reference frames shared with the emitter. Direct image coding with one qubit per pixel and non-aligned frames leads to decoding errors equivalent to a quantum bit-flip noise that increases with the misalignment. We show the feasibility of frame-invariant coding by using for each pixel a qubit pair prepared in one of two controlled entangled states. With just one common axis shared between the emitter and receiver, exact decoding for each pixel can be obtained by means of two two-outcome projective measurements operating separately on each qubit of the pair. With strictly no alignment information between the emitter and receiver, exact decoding can be obtained by means of a two-outcome projective measurement operating jointly on the qubit pair. In addition, the frame-invariant coding is shown to be much more resistant to quantum bit-flip noise than the direct non-invariant coding. For a cost per pixel of two (entangled) qubits instead of one, complete frame-invariant image coding and enhanced noise resistance are thus obtained.

  17. Comparison of LEWICE 1.6 and LEWICE/NS with IRT experimental data from modern air foil tests

    DOT National Transportation Integrated Search

    1998-01-01

    A research project is underway at NASA Lewis to produce a computer code which can accurately predict ice growth under any meteorological conditions for any aircraft surface. The most recent release of this code is LEWICE 1.6. This code is modular in ...

  18. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR CODING: QUESTIONNAIRE FEEDBACK FORM (UA-D-46.0)

    EPA Science Inventory

    The purpose of this SOP is to describe the coding strategy for the Questionnaire Feedback form. This Questionnaire Feedback form was developed for use during the Arizona NHEXAS project and the "Border" study. Keywords: data; coding; questionnaire feedback form.

    The National Hu...

  19. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR CODING: DIET DIARY QUESTIONNAIRE (UA-D-43.0)

    EPA Science Inventory

    The purpose of this SOP is to define the coding strategy for the Diet Diary Questionnaire. This questionnaire was developed for use during the Arizona NHEXAS project and the "Border" study. Keywords: data; coding; diet diary questionnaire.

    The National Human Exposure Assessme...

  20. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR CODING: TECHNICIAN WALK-THROUGH QUESTIONNAIRE (UA-D-35.0)

    EPA Science Inventory

    The purpose of this SOP is to define the coding strategy for the Technician Walk-Through Questionnaire. This questionnaire was developed for use during the Arizona NHEXAS project and the "Border" study. Keywords: data; coding; technician walk-through questionnaire.

    The Nationa...

  1. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR GLOBAL CODING FOR SCANNED FORMS (UA-D-31.1)

    EPA Science Inventory

    The purpose of this SOP is to define the strategy for the Global Coding of Scanned Forms. This procedure applies to the Arizona NHEXAS project and the "Border" study. Keywords: Coding; scannable forms.

    The National Human Exposure Assessment Survey (NHEXAS) is a federal interag...

  2. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR CODING: DESCRIPTIVE QUESTIONNAIRE (UA-D-6.0)

    EPA Science Inventory

    This purpose of this SOP is to define the coding strategy for the Descriptive Questionnaire. This questionnaire was developed for use in the Arizona NHEXAS project and the Border study. Keywords: data; coding; descriptive questionnaire.

    The U.S.-Mexico Border Program is sponso...

  3. Description of Transport Codes for Space Radiation Shielding

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Wilson, John W.; Cucinotta, Francis A.

    2011-01-01

    This slide presentation describes transport codes and their use for studying and designing space radiation shielding. When combined with risk projection models, radiation transport codes serve as the main tool for studying radiation and designing shielding. There are three criteria for assessing the accuracy of transport codes: (1) ground-based studies with defined beams and material layouts, (2) inter-comparison of transport code results for matched boundary conditions, and (3) comparisons to flight measurements. By these three criteria, NASA's HZETRN/QMSFRG codes show a very high degree of accuracy.

  4. The Impact of Manual Segmentation of CT Images on Monte Carlo Based Skeletal Dosimetry

    NASA Astrophysics Data System (ADS)

    Frederick, Steve; Jokisch, Derek; Bolch, Wesley; Shah, Amish; Brindle, Jim; Patton, Phillip; Wyler, J. S.

    2004-11-01

    Radiation doses to the skeleton from internal emitters are of importance in both protection of radiation workers and patients undergoing radionuclide therapies. Improved dose estimates involve obtaining two sets of medical images. The first image provides the macroscopic boundaries (spongiosa volume and cortical shell) of the individual skeletal sites. A second, higher resolution image of the spongiosa microstructure is also obtained. These image sets then provide the geometry for a Monte Carlo radiation transport code. Manual segmentation of the first image is required in order to provide the macrostructural data. For this study, multiple segmentations of the same CT image were performed by multiple individuals. The segmentations were then used in the transport code and the results compared in order to determine the impact of differing segmentations on the skeletal doses. This work has provided guidance on the extent of training required of the manual segmenters. (This work was supported by a grant from the National Institute of Health.)

  5. Methodology for worker neutron exposure evaluation in the PDCF facility design.

    PubMed

    Scherpelz, R I; Traub, R J; Pryor, K H

    2004-01-01

    A project headed by Washington Group International aims to design the Pit Disassembly and Conversion Facility (PDCF) to convert the plutonium pits from excessed nuclear weapons into plutonium oxide for ultimate disposition. Battelle staff are performing the shielding calculations that will determine appropriate shielding so that the facility workers will not exceed target exposure levels. The target exposure levels for workers in the facility are 5 mSv y-1 for the whole body and 100 mSv y-1 for the extremities, which presents a significant challenge to the designers of a facility that will process tons of radioactive material. The design effort depended on shielding calculations to determine appropriate thickness and composition for glove box walls and concrete wall thicknesses for storage vaults. Pacific Northwest National Laboratory (PNNL) staff used ORIGEN-S and SOURCES to generate gamma and neutron source terms, and the Monte Carlo N-Particle transport code (MCNP-4C) to calculate the radiation transport in the facility. The shielding calculations were performed by a team of four scientists, so it was necessary to develop a consistent methodology. There was also a requirement for the study to be cost-effective, so efficient methods of evaluation were required. The calculations were subject to rigorous scrutiny by internal and external reviewers, so acceptability was a major feature of the methodology. Some of the issues addressed in the development of the methodology included selecting appropriate dose factors, developing a method for handling extremity doses, adopting an efficient method for evaluating effective dose equivalent in a non-uniform radiation field, modelling the reinforcing steel in concrete, and modularising the geometry descriptions for efficiency. 
The relative importance of the neutron dose equivalent compared with the gamma dose equivalent varied substantially depending on the specific shielding conditions and lessons were learned from this effect. This paper addresses these issues and the resulting methodology.

  6. Radiological Modeling for Determination of Derived Concentration Levels of an Area with Uranium Residual Material - 13533

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perez-Sanchez, Danyl

    As a result of a pilot project developed at the old Spanish 'Junta de Energia Nuclear' to extract uranium from ores, tailings materials were generated. Most of these residual materials were sent back to different uranium mines, but a small amount was mixed with conventional building materials and deposited near the old plant until the surrounding ground was flattened. The affected land is included in an area under institutional control and is used as a recreational area. At the time of processing, uranium isotopes were separated, but other radionuclides of the uranium decay series, such as Th-230, Ra-226, and their daughters, remain in the residue. Recently, the analyses of samples taken at different depths in the ground confirmed their presence. This paper presents the methodology used to calculate the derived concentration level that ensures the reference dose level of 0.1 mSv y-1 used as the radiological criterion is not exceeded. In this study, a radiological impact assessment was performed modeling the area as a recreational scenario. The modeling study was carried out with the RESRAD code, considering as exposure pathways external irradiation, inadvertent ingestion of soil, inhalation of resuspended particles, and inhalation of radon (Rn-222). It was concluded that, if the concentration of Ra-226 in the first 15 cm of soil is lower than 0.34 Bq g-1, the dose will not exceed the reference dose. Applying this value as a derived concentration level and comparing it with the results of measurements on the ground, some areas with an activity concentration slightly higher than this value were found. In these zones, the proposed remediation has been to cover them with a 15 cm layer of clean material. This action represents an 85% reduction of the dose and ensures compliance with the reference dose. (authors)
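
    The derived-concentration-level logic described above can be sketched under the assumption that dose scales linearly with activity concentration in a RESRAD-type pathway model; the per-pathway dose coefficients below are hypothetical placeholders, not RESRAD output.

```python
# The derived level is the concentration at which the modeled all-pathway dose
# equals the 0.1 mSv/y reference, assuming dose is linear in concentration.
REFERENCE_DOSE_MSV = 0.1

def derived_concentration(dose_per_unit, reference=REFERENCE_DOSE_MSV):
    """Concentration (Bq/g) whose summed-pathway dose equals the reference,
    given the total dose (mSv/y) per unit concentration (Bq/g)."""
    return reference / dose_per_unit

# Hypothetical pathway coefficients (mSv/y per Bq/g of Ra-226 in the top 15 cm):
pathways = {"external": 0.20, "soil_ingestion": 0.02,
            "dust_inhalation": 0.01, "radon": 0.06}
total = sum(pathways.values())                    # 0.29 mSv/y per Bq/g
print(round(derived_concentration(total), 2))     # -> 0.34 Bq/g
```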

  7. UCLA-LANL Reanalysis Project

    NASA Astrophysics Data System (ADS)

    Shprits, Y.; Chen, Y.; Friedel, R.; Kondrashov, D.; Ni, B.; Subbotin, D.; Reeves, G.; Ghil, M.

    2009-04-01

    We present first results of the UCLA-LANL Reanalysis Project. Radiation-belt relativistic electron phase space density is obtained using the data-assimilative VERB code combined with observations from GEO, CRRES, and Akebono. Reanalysis of the data shows pronounced peaks in the phase space density and pronounced dropouts of fluxes during the main phase of a storm. The results of the reanalysis are discussed and compared to simulations with the recently developed VERB 3D code.

  8. LATIS3D: The Gold Standard for Laser-Tissue-Interaction Modeling

    NASA Astrophysics Data System (ADS)

    London, R. A.; Makarewicz, A. M.; Kim, B. M.; Gentile, N. A.; Yang, T. Y. B.

    2000-03-01

    The goal of this LDRD project has been to create LATIS3D, the world's premier computer program for laser-tissue-interaction modeling. The development was based on recent experience with the 2D LATIS code and the ASCI code KULL. With LATIS3D, important applications in laser medical therapy were researched, including dynamical calculations of tissue emulsification and ablation, photothermal therapy, and photon transport for photodynamic therapy. This project also enhanced LLNL's core competency in laser-matter interactions and high-energy-density physics by pushing simulation codes into new parameter regimes and by attracting external expertise. This will benefit both existing LLNL programs, such as ICF and SBSS, and emerging programs in medical technology and other laser applications. The purpose of the project was to develop and apply a computer program for laser-tissue-interaction modeling to aid in the development of new instruments and procedures in laser medicine.

  9. Hybrid petacomputing meets cosmology: The Roadrunner Universe project

    NASA Astrophysics Data System (ADS)

    Habib, Salman; Pope, Adrian; Lukić, Zarija; Daniel, David; Fasel, Patricia; Desai, Nehal; Heitmann, Katrin; Hsu, Chung-Hsing; Ankeny, Lee; Mark, Graham; Bhattacharya, Suman; Ahrens, James

    2009-07-01

    The target of the Roadrunner Universe project at Los Alamos National Laboratory is a set of very large cosmological N-body simulation runs on the hybrid supercomputer Roadrunner, the world's first petaflop platform. Roadrunner's architecture presents opportunities and difficulties characteristic of next-generation supercomputing. We describe a new code designed to optimize performance and scalability by explicitly matching the underlying algorithms to the machine architecture, and by using the physics of the problem as an essential aid in this process. While applications will differ in specific exploits, we believe that such a design process will become increasingly important in the future. The Roadrunner Universe project code, MC3 (Mesh-based Cosmology Code on the Cell), uses grid and direct particle methods to balance the capabilities of Roadrunner's conventional (Opteron) and accelerator (Cell BE) layers. Mirrored particle caches and spectral techniques are used to overcome communication bandwidth limitations and possible difficulties with complicated particle-grid interaction templates.

  10. Argonne National Laboratory-East site environmental report for calendar year 1998.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Golchert, N.W.; Kolzow, R.G.

    1999-08-26

    This report discusses the results of the environmental protection program at Argonne National Laboratory-East (ANL-E) for 1998. To evaluate the effects of ANL-E operations on the environment, samples of environmental media collected on the site, at the site boundary, and off the ANL-E site were analyzed and compared with applicable guidelines and standards. A variety of radionuclides were measured in air, surface water, on-site groundwater, and bottom sediment samples. In addition, chemical constituents in surface water, groundwater, and ANL-E effluent water were analyzed. External penetrating radiation doses were measured, and the potential for radiation exposure to off-site population groups was estimated. Results are interpreted in terms of the origin of the radioactive and chemical substances (i.e., natural, fallout, ANL-E, and other) and are compared with applicable environmental quality standards. A US Department of Energy dose calculation methodology, based on International Commission on Radiological Protection recommendations and the US Environmental Protection Agency's CAP-88 (Clean Air Act Assessment Package-1988) computer code, was used in preparing this report. The status of ANL-E environmental protection activities with respect to the various laws and regulations that govern waste handling and disposal is discussed, along with the progress of environmental corrective actions and restoration projects.

  11. Occupational dose constraints in interventional cardiology procedures: the DIMOND approach

    NASA Astrophysics Data System (ADS)

    Tsapaki, Virginia; Kottou, Sophia; Vano, Eliseo; Komppa, Tuomo; Padovani, Renato; Dowling, Annita; Molfetas, Michael; Neofotistou, Vassiliki

    2004-03-01

    Radiation fields in angiographic suites are highly non-uniform, with intensity and gradient varying widely with projection geometry. The European Commission DIMOND III project addressed, among other issues, the optimization of staff doses, with an attempt to propose preliminary occupational dose constraints. Two thermoluminescent dosemeters (TLDs) were used to assess operators' extremity doses (left shoulder and left foot) during 20 coronary angiographies (CAs) and 20 percutaneous transluminal coronary angioplasties (PTCAs) in five European centres. The x-ray equipment, the radiation protection measures used, and the dose delivered to the patient in terms of dose-area product (DAP) were recorded so as to subsequently associate them with the operator's dose. The range of staff doses noted for the same TLD position, centre, and procedure type emphasizes the importance of protective measures and of the technical characteristics of the x-ray equipment. Correlation of the patient's DAP with the staff shoulder dose is moderate, whereas correlation of the patient's DAP with the staff foot dose is poor, in both CA and PTCA. Therefore, it is difficult to predict the operator's dose from the patient's DAP, mainly owing to the differing use of protective measures. A preliminary occupational dose constraint was defined by calculating cardiologists' annual effective dose and was found to be 0.6 mSv.

  12. SU-A-BRA-02: Making the Most of a One Hour Lecture with Alternative Teaching Methodologies: Implementing Project-Based and Flipped Learning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Howell, R.

    Vic Montemayor - No one has been more passionate about improving the quality and effectiveness of the teaching of Medical Physics than Bill Hendee. It was in August of 2008 that the first AAPM Workshop on Becoming a Better Teacher of Medical Physics was held, organized and run by Bill Hendee. This was followed in July of 2010 by a summer school on the same topic, again organized by Bill. There has been continued interest in alternative approaches to teaching medical physics since those initial gatherings. The momentum established by these workshops is made clear each year in the annual Innovation in Medical Physics Education session, which highlights work being done in all forms of medical physics education, from one-on-one residencies or classroom presentations to large-scale program revisions and on-line resources for international audiences. This symposium, presented on behalf of the Education Council, highlights the work of three finalists from past Innovation in Education sessions. Each will present their approaches to and innovations in teaching medical physics. It is hoped that audience members interested in trying something new in their teaching of medical physics will find some of these ideas and approaches readily applicable to their own classrooms.

    Rebecca Howell - The presentation will discuss ways to maximize classroom learning, i.e., increasing the amount of material covered while also enhancing students' understanding of the broader implications of the course topics. Specifically, the presentation will focus on two teaching methodologies, project-based learning and flipped learning. These teaching methods will be illustrated using the example of a graduate medical physics course where both are used in conjunction with traditional lectures. Additionally, the presentation will cover our experience implementing these methods, including challenges that were overcome.

    Jay Burmeister - My presentation will discuss the incorporation of active learning techniques into a traditional medical physics classroom course. I will describe these techniques and how they were implemented, as well as student performance before and after implementation. Student feedback indicated that these course changes improved their ability to actively assimilate the course content, thus improving their understanding of the material.

    Shahid Naqvi - My talk will focus on ways to help students visualize crucial concepts that lie at the core of radiation physics. Although particle tracks generated by Monte Carlo simulations have served as an indispensable visualization tool, students often struggle to resolve the underlying physics from a simultaneous jumble of tracks. We can clarify the physics by "coding" the tracks, e.g., by coloring the tracks according to their "starting" or "crossing" regions. The regionally coded tracks, when overlaid with dose distributions, help the students see the elusive connection between dose, kerma, and electronic disequilibrium. Tracks coded according to local energy or energy-loss rate can illustrate the need for stopping power corrections in electron beams and explain the Bragg peak in a proton beam. Coding tracks according to parent interaction type and order can clarify the often misunderstood distinction between primary and scatter dose. The students can thus see the "whole" simultaneously with the "sum of the parts," which enhances their physical insight and creates a sustainable foundation for further learning. After the presentations, the speakers and moderator will be open to questions and discussion with the audience members.

    Learning Objectives: Be able to explain Project-Based Learning and how it can be incorporated into a Medical Physics classroom. Be able to explain Flipped Learning and how it can be incorporated into a Medical Physics classroom. Be able to explain active-learning strategies for the teaching of Medical Physics. Be able to explain how Monte Carlo simulations can be used to deepen a student's understanding of radiation physics and dosimetry.

  13. Ophthalmologist-patient communication, self-efficacy, and glaucoma medication adherence

    PubMed Central

    Sleath, Betsy; Blalock, Susan J.; Carpenter, Delesha M.; Sayner, Robyn; Muir, Kelly W.; Slota, Catherine; Lawrence, Scott D.; Giangiacomo, Annette L.; Hartnett, Mary Elizabeth; Tudor, Gail; Goldsmith, Jason A.; Robin, Alan L.

    2015-01-01

    Objective The objective of the study was to examine the association between provider-patient communication, glaucoma medication adherence self-efficacy, outcome expectations, and glaucoma medication adherence. Design Prospective observational cohort study. Participants 279 patients with glaucoma who were newly prescribed or on glaucoma medications were recruited at six ophthalmology clinics. Methods Patients' visits were video-recorded, and communication variables were coded using a detailed coding tool developed by the authors. Adherence was measured using Medication Event Monitoring Systems for 60 days after the visits. Main outcome measures The following adherence variables were measured for the 60-day period after the visits: whether the patient took 80% or more of the prescribed doses, the percent of days with the correct number of prescribed doses taken, and the percent of prescribed doses taken on time. Results Higher glaucoma medication adherence self-efficacy was positively associated with better adherence on all three measures. African American race was negatively associated with the percent of days with the correct number of doses taken (beta = −0.16, p < 0.05) and with whether the patient took 80% or more of the prescribed doses (odds ratio = 0.37, 95% confidence interval 0.16, 0.86). Physician education about how to administer drops was positively associated with the percent of days with the correct number of doses taken (beta = 0.18, p < 0.01) and the percent of prescribed doses taken on time (beta = 0.15, p < 0.05). Conclusions These findings indicate that provider education about how to administer glaucoma drops and patient glaucoma medication adherence self-efficacy are positively associated with adherence. PMID:25542521
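    The three adherence measures above can be sketched from electronic-cap event counts. The function and field names below are illustrative, not the study's coding scheme, and the input format (one dose count per monitored day) is an assumption:

```python
# Sketch of the three adherence measures: >=80% of prescribed doses taken,
# percent of days with the correct number of doses, and percent taken on time.
# Inputs are hypothetical MEMS-style counts, not the study's data format.
def adherence_measures(doses_taken_per_day, prescribed_per_day, taken_on_time):
    """doses_taken_per_day: list of ints, one entry per monitored day."""
    days = len(doses_taken_per_day)
    total_prescribed = prescribed_per_day * days
    # Cap daily counts so extra presses don't inflate the total.
    total_taken = sum(min(t, prescribed_per_day) for t in doses_taken_per_day)
    pct_correct_days = 100.0 * sum(
        1 for t in doses_taken_per_day if t == prescribed_per_day) / days
    return {
        "took_80pct_or_more": 100.0 * total_taken / total_prescribed >= 80.0,
        "pct_correct_doses_per_day": pct_correct_days,
        "pct_doses_on_time": 100.0 * taken_on_time / total_prescribed,
    }

# 10 monitored days, one drop per day prescribed, 8 days dosed, 7 on time.
m = adherence_measures([1, 1, 0, 1, 1, 1, 0, 1, 1, 1], 1, 7)
```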

  14. In-situ recording of ionic currents in projection neurons and Kenyon cells in the olfactory pathway of the honeybee

    PubMed Central

    Rössler, Wolfgang

    2018-01-01

    The honeybee olfactory pathway comprises an intriguing pattern of convergence and divergence: ~60,000 olfactory sensory neurons (OSNs) convey olfactory information onto ~900 projection neurons (PNs) in the antennal lobe (AL). To transmit this information reliably, PNs employ relatively high spiking frequencies with complex patterns. PNs project via a dual olfactory pathway to the mushroom bodies (MBs). This pathway comprises the medial (m-ALT) and the lateral antennal lobe tract (l-ALT). PNs from both tracts transmit information from a wide range of similar odors, but with distinct differences in coding properties. In the MBs, PNs form synapses with many Kenyon cells (KCs) that encode odors in a spatially and temporally sparse way. The transformation from complex information coding to sparse coding is a well-known phenomenon in insect olfactory coding. Intrinsic neuronal properties as well as GABAergic inhibition are thought to contribute to this change in odor representation. In the present study, we identified intrinsic neuronal properties promoting coding differences between PNs and KCs using in-situ patch-clamp recordings in the intact brain. We found very prominent K+ currents in KCs that clearly differ from the PN currents. This suggests that odor coding differences between PNs and KCs may be caused by differences in their specific ion channel properties. Comparison of ionic currents of m- and l-ALT PNs did not reveal any differences at a qualitative level. PMID:29351552

  15. SU-G-BRA-12: Development of An Intra-Fractional Motion Tracking and Dose Reconstruction System for Adaptive Stereotactic Body Radiation Therapy in High-Risk Prostate Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rezaeian, N Hassan; Chi, Y; Tian, Z

    Purpose: A clinical trial on stereotactic body radiation therapy (SBRT) for high-risk prostate cancer is under way at our institution. In addition to escalating dose to the prostate, we have increased dose to intra-prostatic lesions. Intra-fractional prostate motion degrades the planned radiation dose, especially for the small intra-prostatic lesions. To solve this problem, we have developed a motion tracking and 4D dose-reconstruction system to facilitate adaptive re-planning. Methods: Patients in the clinical trial were treated with VMAT using four arcs and a 10 FFF beam. kV triggered x-ray projections were taken every 3 s during delivery to acquire 2D projections of the 3D anatomy in the direction orthogonal to the therapeutic beam. Each patient had three implanted prostate markers. Our system first determined the 2D projection locations of these markers and then the 3D prostate translation and rotation via 2D/3D registration of the markers. Using delivery log files, our GPU-based Monte Carlo tool (goMC) reconstructed the dose corresponding to each triggered image. The calculated 4D dose distributions were then aggregated to yield the delivered dose. Results: We first tested each module in our system. The MC dose engine was commissioned against our treatment planning system with a dose difference of <0.5%. For motion tracking, 1789 kV projections from 7 patients were acquired. The 2D marker location error was <1 mm. For 3D motion tracking, root-mean-square (RMS) errors along the LR, AP, and CC directions were 0.26 mm, 0.36 mm, and 0.01 mm, respectively, in simulation studies and 1.99 mm, 1.37 mm, and 0.22 mm in phantom studies. We also tested the entire system workflow, and the system was able to reconstruct the delivered dose. Conclusion: We have developed a functional intra-fractional motion tracking and 4D dose-reconstruction system to support our clinical trial on adaptive high-risk prostate cancer SBRT. Comprehensive evaluations have shown the capability and accuracy of our system.
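    The per-axis RMS tracking errors reported above can be sketched as follows. The sample trajectories are invented for illustration; only the metric itself is taken from the abstract:

```python
import math

# Sketch: per-axis root-mean-square (RMS) error between tracked and ground-
# truth 3D marker positions, the metric behind the LR/AP/CC errors above.
def rms_error_per_axis(tracked, truth):
    """tracked, truth: lists of (x, y, z) tuples in mm; returns (rms_x, rms_y, rms_z)."""
    n = len(tracked)
    sq = [0.0, 0.0, 0.0]
    for p, q in zip(tracked, truth):
        for axis in range(3):
            sq[axis] += (p[axis] - q[axis]) ** 2
    return tuple(math.sqrt(s / n) for s in sq)

# Two hypothetical tracked positions against a static ground truth.
tracked = [(0.3, 0.0, 0.0), (-0.3, 0.4, 0.0)]
truth = [(0.0, 0.0, 0.0), (0.0, 0.0, 0.0)]
r = rms_error_per_axis(tracked, truth)
```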

  16. Radiation transport codes for potential applications related to radiobiology and radiotherapy using protons, neutrons, and negatively charged pions

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.

    1972-01-01

    Several Monte Carlo radiation transport computer codes are used to predict quantities of interest in the fields of radiotherapy and radiobiology. The calculational methods are described, and comparisons of calculated and experimental results are presented for dose distributions produced by protons, neutrons, and negatively charged pions. Comparisons of calculated and experimental cell survival probabilities are also presented.

  17. Legacy Code Modernization

    NASA Technical Reports Server (NTRS)

    Hribar, Michelle R.; Frumkin, Michael; Jin, Haoqiang; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    Over the past decade, high performance computing has evolved rapidly; systems based on commodity microprocessors have been introduced in quick succession from at least seven vendors/families. Porting codes to every new architecture is a difficult problem; in particular, here at NASA, there are many large CFD applications that are very costly to port to new machines by hand. The LCM ("Legacy Code Modernization") Project is the development of an integrated parallelization environment (IPE) which performs the automated mapping of legacy CFD (Fortran) applications to state-of-the-art high performance computers. While most porting projects focus on the parallelization of the code, we consider porting to be an iterative process consisting of several steps: 1) code cleanup, 2) serial optimization, 3) parallelization, 4) performance monitoring and visualization, 5) intelligent tools for automated tuning using performance prediction, and 6) machine-specific optimization. The approach for building this parallelization environment is to build the components for each of the steps simultaneously and then integrate them together. The demonstration will exhibit our latest research in building this environment: 1. parallelizing tools and compiler evaluation; 2. code cleanup and serial optimization using automated scripts; 3. development of a code generator for performance prediction; 4. automated partitioning; 5. automated insertion of directives. These demonstrations will exhibit the effectiveness of an automated approach for all the steps involved in porting and tuning a legacy code application for a new architecture.
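    One of the listed steps, "automated insertion of directives", can be sketched as a toy source-to-source pass. This is not the LCM/IPE tooling; it is a hypothetical illustration that only pattern-matches on DO loops, whereas a real tool would first run dependence analysis:

```python
import re

# Toy sketch of automated directive insertion: prepend an OpenMP directive
# to every Fortran DO loop found by a simple pattern match. A production
# tool would insert directives only on loops proven parallelizable.
def insert_omp_directives(fortran_src):
    out = []
    for line in fortran_src.splitlines():
        if re.match(r"\s*do\s+\w+\s*=", line, re.IGNORECASE):
            indent = line[: len(line) - len(line.lstrip())]
            out.append(indent + "!$omp parallel do")
        out.append(line)
    return "\n".join(out)

src = "      do i = 1, n\n        a(i) = b(i) + c(i)\n      end do"
annotated = insert_omp_directives(src)
```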

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kathy Held; Kevin Prise; Barry Michael

    The management of the risks of exposure of people to ionizing radiation is important in relation to its uses in industry and medicine, and also to natural and man-made radiation in the environment. The vast majority of exposures are at a very low level of radiation dose. The risks are of inducing cancer in the exposed individuals, with a smaller risk of inducing genetic damage that can be inherited; available estimates indicate that these risks are low. As a result, the risks are impossible to detect in population studies with any accuracy above the normal levels of cancer and genetic defects unless the dose levels are high. In practice, this means that our knowledge depends very largely on the information gained from the follow-up of the survivors of the atomic bombs dropped on Japanese cities. The risks calculated from these high-dose, short-duration exposures then have to be projected down to the low-dose, long-term exposures that apply generally. Recent research using cells in culture has revealed that the relationship between high- and low-dose biological damage may be much more complex than had previously been thought. The aims of this and other projects in the DOE's Low-Dose Program are to gain an understanding of the biological actions of low-dose radiation, ultimately to provide information that will lead to more accurate quantification of low-dose risk. Our project is based on the concept that the processes by which radiation induces cancer start where the individual tracks of radiation impact on cells and tissues. At the dose levels of most low-dose exposures, these events are rare, and any individual cell only "sees" radiation tracks at intervals averaging from weeks to years apart. This contrasts with the atomic bomb exposures where, on average, each cell was hit by hundreds of tracks instantaneously. We have therefore developed microbeam techniques that enable us to target cells in culture with any number of tracks, from one upwards.
    This approach enables us to study the biological basis of the relationship between high- and low-dose exposures. The targeting approach also allows us to study very clearly a newly recognized effect of radiation, the "bystander effect", which appears to dominate some low-dose responses and therefore may have a significant role in low-dose risk mechanisms. Our project also addresses the concept that the background of naturally occurring oxidative damage that takes place continually in cells due to byproducts of metabolism may play a role in low-dose radiation risk. This project therefore also examines how cells are damaged by treatments that modify the levels of oxidative damage, either alone or in combination with low-dose irradiation. In this project, we have used human and rodent cell lines, and each set of experiments has been carried out on a single cell type. However, low-dose research has to extend into tissues because signaling between cells of different types is likely to influence the responses. Our studies have therefore also included microbeam experiments using a model tissue system that consists of an explant of a small piece of pig ureter grown in culture. The structure of this tissue is similar to that of epithelium, and therefore it relates to the tissues in which carcinoma arises. Our studies have been able to measure bystander-induced changes in the cells growing out from the tissue fragment after it has been targeted with a few radiation tracks to mimic a low-dose exposure.
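    The "rare tracks per cell" argument above is Poisson statistics, and can be sketched numerically. The tracks-per-cell-per-mGy figure below is a rough illustrative value for sparsely ionizing radiation, not a number from this project:

```python
import math

# Sketch of the track statistics behind low-dose arguments: at low doses a
# cell nucleus sees few, widely separated tracks; at high acute doses it is
# hit by many at once. Assumes ~1 track per nucleus per mGy (illustrative).
TRACKS_PER_CELL_PER_MGY = 1.0

def mean_tracks(dose_mGy):
    """Expected number of tracks through one nucleus for a given dose."""
    return dose_mGy * TRACKS_PER_CELL_PER_MGY

def prob_n_tracks(dose_mGy, n):
    """Poisson probability that a nucleus sees exactly n tracks."""
    lam = mean_tracks(dose_mGy)
    return math.exp(-lam) * lam ** n / math.factorial(n)

# At ~2 mGy spread over a year, most nuclei see 0-2 tracks in total;
# a 1000 mGy acute exposure means a mean of ~1000 tracks per nucleus.
p_zero_at_2mGy = prob_n_tracks(2.0, 0)
```

    This is why microbeam targeting (delivering an exact, small number of tracks per cell) mimics the low-dose regime far better than lowering the dose of a broad-field exposure.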

  19. Final Report for "Implementation and Evaluation of Multigrid Linear Solvers into Extended Magnetohydrodynamic Codes for Petascale Computing"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Srinath Vadlamani; Scott Kruger; Travis Austin

    Extended magnetohydrodynamic (MHD) codes are used to model the large, slow-growing instabilities that are projected to limit the performance of the International Thermonuclear Experimental Reactor (ITER). The multiscale nature of the extended MHD equations requires an implicit approach. The current linear solvers needed for the implicit algorithm scale poorly because the resultant matrices are so ill-conditioned. A new solver is needed, especially one that scales to the petascale. The most successful scalable parallel solvers to date are multigrid solvers. Applying multigrid techniques to a set of equations whose fundamental modes are dispersive waves is a promising solution to CEMM problems. For Phase 1, we implemented multigrid preconditioners from the HYPRE project of the Center for Applied Scientific Computing at LLNL, via PETSc of the DOE SciDAC TOPS project, for the real matrix systems of the extended MHD code NIMROD, one of the primary modeling codes of the OFES-funded Center for Extended Magnetohydrodynamic Modeling (CEMM) SciDAC. We successfully implemented the multigrid solvers on the fusion test problem that allows for real matrix systems, and in the process learned about the details of NIMROD data structures and the difficulties of inverting NIMROD operators. The further success of this project will allow for efficient usage of future petascale computers at the National Leadership Facilities: Oak Ridge National Laboratory, Argonne National Laboratory, and the National Energy Research Scientific Computing Center. The project will be a collaborative effort between computational plasma physicists and applied mathematicians at Tech-X Corporation, applied mathematicians at Front Range Scientific Computations, Inc. (who are collaborators on the HYPRE project), and other computational plasma physicists involved with the CEMM project.
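    The multigrid idea behind these preconditioners (damp high-frequency error with a smoother on the fine grid, then solve for the remaining smooth error on a coarser grid) can be sketched on a textbook problem. This is a minimal two-grid cycle for a 1D Poisson equation and assumes nothing about NIMROD, HYPRE, or PETSc internals:

```python
import numpy as np

# Two-grid sketch for -u'' = f on [0,1] with u(0) = u(1) = 0:
# weighted-Jacobi smoothing on the fine grid, direct solve on the coarse grid.
def jacobi(u, f, h, sweeps, w=2.0 / 3.0):
    """Weighted-Jacobi smoothing; damps oscillatory error components."""
    for _ in range(sweeps):
        u[1:-1] = (1 - w) * u[1:-1] + w * 0.5 * (u[:-2] + u[2:] + h * h * f[1:-1])
    return u

def two_grid_cycle(u, f, h):
    u = jacobi(u, f, h, sweeps=3)                                  # pre-smooth
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] + (u[:-2] - 2 * u[1:-1] + u[2:]) / (h * h)   # residual
    rc, hc = r[::2].copy(), 2 * h                                  # restrict (injection)
    m = len(rc) - 2                                                # coarse direct solve
    A = (2 * np.eye(m) - np.eye(m, k=1) - np.eye(m, k=-1)) / (hc * hc)
    ec = np.zeros_like(rc)
    ec[1:-1] = np.linalg.solve(A, rc[1:-1])
    u += np.interp(np.arange(len(u)), np.arange(len(u))[::2], ec)  # prolong + correct
    return jacobi(u, f, h, sweeps=3)                               # post-smooth

n = 65
h = 1.0 / (n - 1)
x = np.linspace(0.0, 1.0, n)
f = np.pi ** 2 * np.sin(np.pi * x)   # exact solution is sin(pi * x)
u = np.zeros(n)
for _ in range(10):
    u = two_grid_cycle(u, f, h)
err = float(np.max(np.abs(u - np.sin(np.pi * x))))
```

    Recursing on the coarse solve instead of inverting it directly gives the classical V-cycle; used inside a Krylov iteration, that cycle plays the same preconditioning role the project delegates to HYPRE.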

  20. Technical Note: Defining cyclotron-based clinical scanning proton machines in a FLUKA Monte Carlo system.

    PubMed

    Fiorini, Francesca; Schreuder, Niek; Van den Heuvel, Frank

    2018-02-01

    Cyclotron-based pencil beam scanning (PBS) proton machines nowadays represent the majority and most affordable choice for proton therapy facilities; however, their representation in Monte Carlo (MC) codes is more complex than for passively scattered proton systems or synchrotron-based PBS machines. This is because degraders are used to decrease the energy from the cyclotron maximum energy to the desired energy, resulting in a unique spot size, divergence, and energy spread depending on the amount of degradation. This manuscript outlines a generalized methodology to characterize a cyclotron-based PBS machine in a general-purpose MC code. The code can then be used to generate clinically relevant plans starting from commercial TPS plans. The described beam is produced at the Provision Proton Therapy Center (Knoxville, TN, USA) using cyclotron-based IBA Proteus Plus equipment. We characterized the Provision beam in the MC code FLUKA using the experimental commissioning data. The code was then validated against experimental data in water phantoms for single pencil beams and larger irregular fields. Comparisons with RayStation TPS plans are also presented. Comparisons of experimental, simulated, and planned dose depositions in water show that the same doses are calculated by both programs inside the target areas, while penumbra differences are found at the field edges. These differences are smaller for the MC, with a γ(3%-3 mm) index never below 95%. Extensive explanations of how MC codes can be adapted to simulate cyclotron-based scanning proton machines are given, with the aim of using the MC as a TPS verification tool to check and improve clinical plans. For all the tested cases, we showed that dose differences with experimental data are smaller for the MC than for the TPS, implying that the created FLUKA beam model is better able to describe the experimental beam. © 2017 The Authors. Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
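    The γ(3%-3 mm) comparison used above can be sketched in one dimension. The dose profiles below are invented for illustration; real evaluations run on 2D/3D measured and simulated dose grids:

```python
import math

# Sketch of a 1D gamma evaluation with 3% dose-difference and 3 mm
# distance-to-agreement criteria. Positions in mm, doses normalized to 1.
def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose, dd=0.03, dta=3.0):
    """Return the gamma value for each reference point (a point passes if gamma <= 1)."""
    gammas = []
    for rp, rd in zip(ref_pos, ref_dose):
        g = min(math.sqrt(((ep - rp) / dta) ** 2 + ((ed - rd) / dd) ** 2)
                for ep, ed in zip(eval_pos, eval_dose))
        gammas.append(g)
    return gammas

# Hypothetical measured (ref) vs. simulated (eval) profile samples.
pos = [0.0, 1.0, 2.0, 3.0]
ref = [1.00, 0.98, 0.60, 0.10]
ev = [1.01, 0.97, 0.62, 0.11]
g = gamma_1d(pos, ref, pos, ev)
pass_rate = 100.0 * sum(1 for v in g if v <= 1.0) / len(g)
```

    A "γ index never below 95%" statement then means that at least 95% of evaluated points have gamma ≤ 1 under these criteria.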

Top