Sample records for improved PHITS code

  1. [Medical Applications of the PHITS Code I: Recent Improvements and Biological Dose Estimation Model].

    PubMed

    Sato, Tatsuhiko; Furuta, Takuya; Hashimoto, Shintaro; Kuga, Naoya

    2015-01-01

    PHITS is a general-purpose Monte Carlo particle transport simulation code developed through the collaboration of several institutes, mainly in Japan. It can analyze the motion of nearly all radiation types over wide energy ranges in three-dimensional matter, and has been used for various applications including medical physics. This paper reviews recent improvements of the code, together with the biological dose estimation method developed on the basis of the microdosimetric function implemented in PHITS.

  2. Applications of the microdosimetric function implemented in the macroscopic particle transport simulation code PHITS.

    PubMed

    Sato, Tatsuhiko; Watanabe, Ritsuko; Sihver, Lembit; Niita, Koji

    2012-01-01

    Microdosimetric quantities such as lineal energy are generally considered to be better indices than linear energy transfer (LET) for expressing the relative biological effectiveness (RBE) of high charge and energy particles. To calculate their probability densities (PD) in macroscopic matter, it is necessary to integrate microdosimetric tools such as track-structure simulation codes with macroscopic particle transport simulation codes. As an integration approach, the mathematical model for calculating the PD of microdosimetric quantities developed based on track-structure simulations was incorporated into the macroscopic particle transport simulation code PHITS (Particle and Heavy Ion Transport code System). The improved PHITS enables the PD in macroscopic matter to be calculated within a reasonable computation time, while taking their stochastic nature into account. The microdosimetric function of PHITS was applied to biological dose estimation for charged-particle therapy and risk estimation for astronauts. The former application was performed in combination with the microdosimetric kinetic model, while the latter employed the radiation quality factor expressed as a function of lineal energy. Owing to the unique features of the microdosimetric function, the improved PHITS has the potential to establish more sophisticated systems for radiological protection in space as well as for the treatment planning of charged-particle therapy.
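As a concrete illustration of the microdosimetric quantities discussed above, the frequency-mean and dose-mean lineal energies can be computed from a discretized probability density f(y). The four-bin distribution below is purely illustrative, not PHITS output:

```python
import numpy as np

# Illustrative discretized probability density f(y) of lineal energy y
# (keV/um); values are assumed for demonstration, not PHITS output.
y = np.array([1.0, 5.0, 20.0, 100.0])   # bin-centre lineal energies
f = np.array([0.5, 0.3, 0.15, 0.05])    # f(y), normalised so sum(f) = 1

y_f = np.sum(y * f)          # frequency-mean lineal energy y_F
d = y * f / y_f              # dose probability density d(y) = y f(y) / y_F
y_d = np.sum(y * d)          # dose-mean lineal energy y_D
print(y_f, y_d)              # y_D >> y_F for this skewed distribution
```

Because d(y) weights each event by its lineal energy, the dose-mean y_D is dominated by the rare high-y events, which is exactly why it is the quantity of interest for RBE estimation.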

  3. Monitoring Cosmic Radiation Risk: Comparisons between Observations and Predictive Codes for Naval Aviation

    DTIC Science & Technology

    2009-01-01

    proton… PARMA, PHITS-based Analytical Radiation Model in the Atmosphere; PCAIRE, Predictive Code for Aircrew Radiation Exposure; PHITS, Particle and...radiation transport code utilized is called PARMA (PHITS-based Analytical Radiation Model in the Atmosphere) [36]. The particle fluxes calculated from the...same dose equivalent coefficients from the ICRP-60 recommendations. As a result, the transport codes utilized by EXPACS (PHITS) and CARI-6

  4. Monitoring Cosmic Radiation Risk: Comparisons Between Observations and Predictive Codes for Naval Aviation

    DTIC Science & Technology

    2009-07-05

    proton… PARMA, PHITS-based Analytical Radiation Model in the Atmosphere; PCAIRE, Predictive Code for Aircrew Radiation Exposure; PHITS, Particle and Heavy...transport code utilized is called PARMA (PHITS-based Analytical Radiation Model in the Atmosphere) [36]. The particle fluxes calculated from the input...dose equivalent coefficients from the ICRP-60 recommendations. As a result, the transport codes utilized by EXPACS (PHITS) and CARI-6 (PARMA

  5. PHITS Overview

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Niita, K.; Matsuda, N.; Iwamoto, Y.

    The paper presents a brief description of the models incorporated in PHITS and the present status of the code, showing some benchmarking tests of the PHITS code for accelerator facilities and space radiation.

  6. Recent Improvements of Particle and Heavy Ion Transport code System: PHITS

    NASA Astrophysics Data System (ADS)

    Sato, Tatsuhiko; Niita, Koji; Iwamoto, Yosuke; Hashimoto, Shintaro; Ogawa, Tatsuhiko; Furuta, Takuya; Abe, Shin-ichiro; Kai, Takeshi; Matsuda, Norihiro; Okumura, Keisuke; Kai, Tetsuya; Iwase, Hiroshi; Sihver, Lembit

    2017-09-01

    The Particle and Heavy Ion Transport code System, PHITS, has been developed under the collaboration of several research institutes in Japan and Europe. This system can simulate the transport of most particles, with energies up to 1 TeV (per nucleon for ions), using various nuclear reaction models and data libraries. More than 2,500 registered researchers and technicians have used this system for various applications such as accelerator design, radiation shielding and protection, medical physics, and space and geosciences. This paper summarizes the physics models and functions recently implemented in PHITS, between versions 2.52 and 2.88, especially those related to source generation useful for simulating brachytherapy and internal exposure to radioisotopes.

  7. Standardizing Methods for Weapons Accuracy and Effectiveness Evaluation

    DTIC Science & Technology

    2014-06-01

    37  B. MONTE CARLO APPROACH...37  C. EXPECTED VALUE THEOREM...38  D. PHIT/PNM METHODOLOGY...MATLAB CODE – SR_CDF_DATA...96  F. MATLAB CODE – GE_EXTRACT...98  G. MATLAB CODE – PHIT/PNM...Normal fit to test data...18  Figure 11. Double Normal fit to test data...19  Figure 12. PHIT/PNM Methodology (from

  8. Incorporation of the statistical multi-fragmentation model in PHITS and its application for simulation of fragmentation by heavy ions and protons

    NASA Astrophysics Data System (ADS)

    Ogawa, Tatsuhiko; Sato, Tatsuhiko; Hashimoto, Shintaro; Niita, Koji

    2014-06-01

    The fragmentation reactions of relativistic-energy nucleus-nucleus and proton-nucleus collisions were simulated using the Statistical Multi-fragmentation Model (SMM) incorporated into the Particle and Heavy Ion Transport code System (PHITS). Comparisons of calculated cross-sections with literature data showed that PHITS-SMM predicts the fragmentation cross-sections of heavy nuclei up to two orders of magnitude more accurately than standalone PHITS for heavy-ion-induced reactions. For proton-induced reactions, noticeable improvements were observed for interactions of heavy targets with protons at energies greater than 1 GeV. Therefore, accounting for multi-fragmentation reactions is necessary for the accurate simulation of energetic fragmentation reactions of heavy nuclei.

  9. Microdosimetric investigation of the spectra from YAYOI by use of the Monte Carlo code PHITS.

    PubMed

    Nakao, Minoru; Baba, Hiromi; Oishi, Ayumu; Onizuka, Yoshihiko

    2010-07-01

    The purpose of this study was to obtain the neutron energy spectrum on the surface of the moderator of the University of Tokyo reactor YAYOI, and to investigate the origins of peaks observed in that spectrum by use of the Monte Carlo code PHITS, in support of biological studies. The moderator system was modeled using details from an article that reported calculated and measured neutron spectra on the surface of the moderator. Our PHITS results were compared with those obtained with the discrete-ordinates code ANISN described in the article. In addition, the changes in the neutron spectrum at material boundaries in the moderator system were examined with PHITS. Microdosimetric energy distributions of secondary charged particles from neutron recoils or reactions were also calculated with PHITS and compared with a microdosimetric experiment. Our PHITS calculations of the neutron energy spectrum showed good agreement with the ANISN results in terms of the energy and structure of the peaks. However, the microdosimetric dose distribution calculated with PHITS showed a remarkable discrepancy with the experimental one: the experimental spectrum could not be reproduced by PHITS when we used neutron beams of two mono-energies.

  10. Evaluation of the accuracy of mono-energetic electron and beta-emitting isotope dose-point kernels using particle and heavy ion transport code system: PHITS.

    PubMed

    Shiiba, Takuro; Kuga, Naoya; Kuroiwa, Yasuyoshi; Sato, Tatsuhiko

    2017-10-01

    We assessed the accuracy of mono-energetic electron and beta-emitting isotope dose-point kernels (DPKs) calculated using the particle and heavy ion transport code system (PHITS) for patient-specific dosimetry in targeted radionuclide treatment (TRT), and compared our data with published data. All mono-energetic and beta-emitting isotope DPKs calculated using PHITS, both in water and in compact bone, were in good agreement with those reported in the literature using other Monte Carlo codes. PHITS therefore provides reliable mono-energetic electron and beta-emitting isotope scaled DPKs for patient-specific dosimetry.

  11. Comparison of fluence-to-dose conversion coefficients for deuterons, tritons and helions.

    PubMed

    Copeland, Kyle; Friedberg, Wallace; Sato, Tatsuhiko; Niita, Koji

    2012-02-01

    Secondary radiation in aircraft and spacecraft includes deuterons, tritons and helions. Two sets of fluence-to-effective dose conversion coefficients for isotropic exposure to these particles were compared: one used the particle and heavy ion transport code system (PHITS) radiation transport code coupled with the International Commission on Radiological Protection (ICRP) reference phantoms (PHITS-ICRP) and the other the Monte Carlo N-Particle eXtended (MCNPX) radiation transport code coupled with modified BodyBuilder™ phantoms (MCNPX-BB). Also, two sets of fluence-to-effective dose equivalent conversion coefficients calculated using the PHITS-ICRP combination were compared: one used quality factors based on linear energy transfer; the other used quality factors based on lineal energy (y). Finally, PHITS-ICRP effective dose coefficients were compared with PHITS-ICRP effective dose equivalent coefficients. The PHITS-ICRP and MCNPX-BB effective dose coefficients were similar, except at high energies, where MCNPX-BB coefficients were higher. For helions, at most energies effective dose coefficients were much greater than effective dose equivalent coefficients. For deuterons and tritons, coefficients were similar when their radiation weighting factor was set to 2.
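The fluence-to-effective-dose coefficients compared above all rest on the ICRP definition of effective dose as a tissue-weighted sum of organ equivalent doses, E = Σ_T w_T H_T. A minimal sketch of that final weighting step, with a deliberately truncated weight set and invented organ doses:

```python
# ICRP-style effective dose: E = sum over tissues T of w_T * H_T.
# The weights below are only a subset of the ICRP set, and the organ
# equivalent doses (Sv per unit fluence) are invented for illustration.
tissue_weights = {"lung": 0.12, "stomach": 0.12, "skin": 0.01}
organ_dose_sv = {"lung": 1.0e-9, "stomach": 0.8e-9, "skin": 2.0e-9}

effective_dose = sum(w * organ_dose_sv[t] for t, w in tissue_weights.items())
print(effective_dose)  # partial effective dose from the tissues listed
```

In a full calculation the transport code (PHITS or MCNPX) supplies H_T per organ of the reference phantom, and the complete ICRP weight set (summing to 1) is used.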

  12. Multi-threading performance of Geant4, MCNP6, and PHITS Monte Carlo codes for tetrahedral-mesh geometry.

    PubMed

    Han, Min Cheol; Yeom, Yeon Soo; Lee, Hyun Su; Shin, Bangho; Kim, Chan Hyeong; Furuta, Takuya

    2018-05-04

    In this study, the multi-threading performance of the Geant4, MCNP6, and PHITS codes was evaluated as a function of the number of threads (N) and the complexity of the tetrahedral-mesh phantom. For this, three tetrahedral-mesh phantoms of varying complexity (simple, moderately complex, and highly complex) were prepared and implemented in the three Monte Carlo codes, in photon and neutron transport simulations. Subsequently, for each case, the initialization time, calculation time, and memory usage were measured as a function of the number of threads used in the simulation. It was found that for all codes the initialization time increased significantly with the complexity of the phantom, but not with the number of threads. Geant4 exhibited much longer initialization times than the other codes, especially for the complex phantom (MRCP). The improvement in computation speed due to multi-threading was quantified as the speed-up factor: the ratio of the computation speed of a multi-threaded run to that of a single-threaded run. Geant4 showed the best multi-threading performance among the codes considered, with the speed-up factor increasing almost linearly with the number of threads, reaching ~30 at N = 40. PHITS and MCNP6 showed a much smaller increase of the speed-up factor with the number of threads: for PHITS the speed-up factors remained low even at N = 40, and for MCNP6 the increase was better but still less than ~10 at N = 40. As for memory usage, Geant4 was found to use more memory than the other codes; moreover, its memory usage increased more rapidly with the number of threads, reaching as high as ~74 GB at N = 40 for the complex phantom (MRCP). Notably, the memory usage of PHITS was much lower than that of the other codes, regardless of both the phantom complexity and the number of threads, hardly increasing with the number of threads even for the MRCP.
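The speed-up factor used in the benchmark above is simply the single-thread calculation time divided by the N-thread calculation time; the timings in this sketch are placeholders, not the measured benchmark data:

```python
def speed_up_factor(t_single_s: float, t_multi_s: float) -> float:
    """S(N) = single-thread calculation time / N-thread calculation time."""
    return t_single_s / t_multi_s

# Hypothetical timings (seconds) for a near-linearly scaling code at N = 40
t1, t40 = 4000.0, 135.0
print(round(speed_up_factor(t1, t40), 1))  # 29.6
```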

  13. Medical Applications of the PHITS Code (3): User Assistance Program for Medical Physics Computation.

    PubMed

    Furuta, Takuya; Hashimoto, Shintaro; Sato, Tatsuhiko

    2016-01-01

    DICOM2PHITS and PSFC4PHITS are user-assistance programs for medical physics applications of PHITS. DICOM2PHITS constructs a voxelized PHITS simulation geometry from patient CT DICOM image data using a conversion table from CT number to material composition. PSFC4PHITS converts IAEA phase-space file data to the PHITS format so that it can be used as a simulation source in PHITS. Both programs are useful for users who want to apply PHITS simulations to the verification of radiation therapy treatment plans. We are now developing a program to convert dose distributions obtained with PHITS to the DICOM RT-dose format, and, as a future plan, we intend to develop a program that can incorporate treatment information contained in other DICOM files (RT-plan and RT-structure).

  14. Microdosimetric evaluation of the neutron field for BNCT at Kyoto University reactor by using the PHITS code.

    PubMed

    Baba, H; Onizuka, Y; Nakao, M; Fukahori, M; Sato, T; Sakurai, Y; Tanaka, H; Endo, S

    2011-02-01

    In this study, microdosimetric energy distributions of secondary charged particles from the (10)B(n,α)(7)Li reaction in a boron neutron capture therapy (BNCT) field were calculated using the Particle and Heavy Ion Transport code System (PHITS). The PHITS simulation reproduced the geometrical set-up of an experiment that measured the microdosimetric energy distributions at the Kyoto University Reactor, where two types of tissue-equivalent proportional counters were used: one with an A-150 wall alone, and another with a 50-ppm-boron-loaded A-150 wall. Based on comparisons with the experimental results, the PHITS code was found to be a useful tool for simulating the energy deposited in tissue in BNCT.
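For reference, the kinetic-energy split between the two 10B(n,α)7Li fragments follows from momentum conservation for (near-)thermal neutron capture: each fragment carries a share of the reaction Q-value inversely proportional to its mass. The sketch below uses Q = 2.31 MeV for the dominant branch (the one accompanied by the 0.478 MeV gamma ray) and mass numbers as proxies for the fragment masses:

```python
# Two-body kinematics for 10B(n,alpha)7Li at thermal neutron energies:
# E_alpha / E_Li = m_Li / m_alpha, with E_alpha + E_Li = Q.
m_alpha, m_li = 4.0, 7.0   # mass numbers used as approximate masses
q_mev = 2.31               # Q-value of the dominant (excited-state) branch

e_alpha = q_mev * m_li / (m_alpha + m_li)
e_li = q_mev * m_alpha / (m_alpha + m_li)
print(f"E_alpha = {e_alpha:.2f} MeV, E_7Li = {e_li:.2f} MeV")
```

These short-range (~micron-scale) fragments are what the boron-loaded proportional counter registers and what PHITS must score in the microdosimetric tally.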

  15. Overview of Particle and Heavy Ion Transport Code System PHITS

    NASA Astrophysics Data System (ADS)

    Sato, Tatsuhiko; Niita, Koji; Matsuda, Norihiro; Hashimoto, Shintaro; Iwamoto, Yosuke; Furuta, Takuya; Noda, Shusaku; Ogawa, Tatsuhiko; Iwase, Hiroshi; Nakashima, Hiroshi; Fukahori, Tokio; Okumura, Keisuke; Kai, Tetsuya; Chiba, Satoshi; Sihver, Lembit

    2014-06-01

    A general-purpose Monte Carlo Particle and Heavy Ion Transport code System, PHITS, is being developed through the collaboration of several institutes in Japan and Europe. The Japan Atomic Energy Agency is responsible for managing the entire project. PHITS can deal with the transport of nearly all particles, including neutrons, protons, heavy ions, photons, and electrons, over wide energy ranges, using various nuclear reaction models and data libraries. It is written in Fortran and can be executed on almost all computers. All components of PHITS, such as its source, executable, and data-library files, are assembled in one package and distributed to many countries via the Research Organization for Information Science and Technology, the Data Bank of the OECD Nuclear Energy Agency, and the Radiation Safety Information Computational Center. More than 1,000 researchers have registered as PHITS users, and they apply the code to various research and development fields such as nuclear technology, accelerator design, medical physics, and cosmic-ray research. This paper briefly summarizes the physics models implemented in PHITS, and introduces some important functions useful for specific applications, such as an event-generator mode and beam transport functions.

  16. Implementing displacement damage calculations for electrons and gamma rays in the Particle and Heavy-Ion Transport code System

    NASA Astrophysics Data System (ADS)

    Iwamoto, Yosuke

    2018-03-01

    In this study, the Monte Carlo displacement damage calculation method in the Particle and Heavy-Ion Transport code System (PHITS) was improved to calculate displacements-per-atom (DPA) values due to irradiation by electrons (or positrons) and gamma rays. For damage due to electrons and gamma rays, PHITS simulates electromagnetic cascades using the Electron Gamma Shower version 5 (EGS5) algorithm and calculates DPA values using the recoil energies and the McKinley-Feshbach cross section. A comparison of DPA values calculated by PHITS and by the Monte Carlo assisted Classical Method (MCCM) shows good agreement for gamma-ray irradiation of silicon and iron at energies below 10 MeV. Above 10 MeV, PHITS can calculate DPA values not only for electrons but also for charged particles produced by photonuclear reactions. In DPA depth distributions under electron and gamma-ray irradiation, build-up effects can be observed near the target surface. For irradiation of 90-cm-thick carbon by protons with energies above 30 GeV, the ratio of the secondary-electron DPA values to the total DPA values exceeds 10% and increases with incident energy. In summary, PHITS can calculate DPA values for all particles and materials over a wide energy range: between 1 keV and 1 TeV for electrons, gamma rays, and charged particles, and between 10^-5 eV and 1 TeV for neutrons.
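The DPA bookkeeping described above ultimately converts a recoil (damage) energy into a number of displaced atoms. A minimal sketch using the standard NRT model, the conventional choice for such conversions (not necessarily the exact expression inside PHITS); the 40 eV threshold for iron is a commonly quoted value, used here only as an example:

```python
def nrt_displacements(t_dam_ev: float, e_d_ev: float) -> float:
    """NRT number of displaced atoms for damage energy T_dam and
    displacement threshold energy E_d (both in eV)."""
    if t_dam_ev < e_d_ev:
        return 0.0                          # sub-threshold: no displacement
    if t_dam_ev < 2.0 * e_d_ev / 0.8:
        return 1.0                          # single Frenkel-pair regime
    return 0.8 * t_dam_ev / (2.0 * e_d_ev)  # cascade regime

# Example: a 5 keV damage-energy recoil in iron (E_d ~ 40 eV)
print(nrt_displacements(5000.0, 40.0))  # 50.0
```

A transport code accumulates such per-recoil counts over all simulated events and normalizes by the number of atoms in the scoring region to obtain a DPA value.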

  17. Biological dose estimation for charged-particle therapy using an improved PHITS code coupled with a microdosimetric kinetic model.

    PubMed

    Sato, Tatsuhiko; Kase, Yuki; Watanabe, Ritsuko; Niita, Koji; Sihver, Lembit

    2009-01-01

    Microdosimetric quantities such as lineal energy, y, are better indices than LET for expressing the RBE of HZE particles. However, the use of microdosimetric quantities in computational dosimetry has been severely limited by the difficulty of calculating their probability densities in macroscopic matter. We therefore improved the particle transport simulation code PHITS, giving it the capability of estimating microdosimetric probability densities in a macroscopic framework by incorporating a mathematical function that can instantaneously calculate the probability densities around the trajectory of HZE particles with a precision equivalent to that of a microscopic track-structure simulation. A new method for estimating the biological dose, the product of the physical dose and RBE, from charged-particle therapy was established using the improved PHITS coupled with a microdosimetric kinetic model. The accuracy of the biological dose estimated by this method was tested by comparing the calculated physical doses and RBE values with the corresponding data measured in a slab phantom irradiated with several kinds of HZE particles. The simulation technique established in this study will help optimize the treatment planning of charged-particle therapy, maximizing the therapeutic effect on tumors while minimizing unintended harm to surrounding normal tissue.

  18. An Update of Recent Phits Code

    NASA Astrophysics Data System (ADS)

    Sihver, Lembit; Sato, Tatsuhiko; Niita, Koji; Iwase, Hiroshi; Iwamoto, Yosuke; Matsuda, Norihiro; Nakashima, Hiroshi; Sakamoto, Yukio; Gustafsson, Katarina; Mancusi, Davide

    We first present the current status of the general-purpose Particle and Heavy Ion Transport code System (PHITS). In particular, we describe benchmarking of calculated cross sections against measurements; we introduce a relativistically covariant version of JQMD, called R-JQMD, that features an improved ground-state initialization algorithm; and we show heavy-ion charge-changing cross sections simulated with R-JQMD, compared with experimental data and with results predicted by the original JQMD model. We also show calculations of the dose received from cosmic radiation by aircrews and by personnel in space. In recent years, many countries have issued regulations or recommendations setting annual dose limits for aircrews. Since estimation of cosmic-ray spectra in the atmosphere is essential for the evaluation of aviation doses, we have calculated these spectra using PHITS. The accuracy of the simulation, which has been well verified by experimental data taken under various conditions, is presented together with software called EXPACS-V, which can visualize cosmic-ray dose rates at ground level or at a given altitude on the map of Google Earth using the PHITS-based Analytical Radiation Model in the Atmosphere (PARMA). PARMA can instantaneously calculate the cosmic-ray spectra anywhere in the world by specifying the atmospheric depth, the vertical cut-off rigidity, and the force-field potential. To examine the applicability of PHITS to shielding design in space, the absorbed doses in a tissue-equivalent water phantom inside an imaginary space vessel have been estimated for different shielding materials of different thicknesses. The results confirm previous findings that PHITS is a suitable tool for shielding design studies of spacecraft.
    Finally, we have used PHITS to calculate depth-dose distributions in MATROSHKA, an ESA project dedicated to determining the radiation load on astronauts inside and outside the International Space Station (ISS).

  19. SU-E-T-254: Optimization of GATE and PHITS Monte Carlo Code Parameters for Uniform Scanning Proton Beam Based On Simulation with FLUKA General-Purpose Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurosu, K; Department of Medical Physics ' Engineering, Osaka University Graduate School of Medicine, Osaka; Takashina, M

    Purpose: Monte Carlo codes are becoming important tools for proton beam dosimetry. However, the relationships between the customizing parameters and the percentage depth dose (PDD) of the GATE and PHITS codes have not been reported; here they are studied, for PDD and proton range, in comparison with the FLUKA code and experimental data. Methods: The beam delivery system of the Indiana University Health Proton Therapy Center was modeled for the uniform scanning beam in FLUKA and transferred identically into GATE and PHITS. This computational model was built from the blueprint and validated with the commissioning data. The three parameters evaluated are the maximum step size, the cut-off energy, and the physical and transport model. The dependence of the PDDs on the customizing parameters was compared with the published results of previous studies. Results: The optimal parameters for simulating the whole beam delivery system were defined by referring to the calculation results obtained with each parameter. Although the PDDs from FLUKA and the experimental data show good agreement, those of GATE and PHITS obtained with our optimal parameters show a minor discrepancy. The measured proton range R90 was 269.37 mm, compared to calculated ranges of 269.63 mm, 268.96 mm, and 270.85 mm with FLUKA, GATE, and PHITS, respectively. Conclusion: We evaluated the dependence of the PDDs obtained with the GATE and PHITS general-purpose Monte Carlo codes on the customizing parameters, using the whole computational model of the treatment nozzle, and then defined the optimal parameters for the simulation. The physical model, the particle transport mechanics, and the different geometry-based descriptions need accurate customization in all three simulation codes to agree with experimental data for artifact-free Monte Carlo simulation.
    This study was supported by Grants-in-Aid for Cancer Research (H22-3rd Term Cancer Control-General-043) from the Ministry of Health, Labour and Welfare of Japan, Grants-in-Aid for Scientific Research (No. 23791419), and the JSPS Core-to-Core program (No. 23003). The authors have no conflict of interest.
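The R90 values quoted above are read off the distal falloff of the PDD curve: the depth at which the dose drops to 90% of its maximum. A sketch of that extraction by linear interpolation, using invented depth-dose points rather than the measured commissioning data:

```python
import numpy as np

# Illustrative distal-edge PDD samples (depth in mm, dose in % of max).
depth_mm = np.array([250.0, 260.0, 265.0, 268.0, 270.0, 272.0])
pdd_pct = np.array([100.0, 99.0, 97.0, 93.0, 85.0, 40.0])

# Find the first sample below 90% and interpolate the 90% crossing.
i = int(np.argmax(pdd_pct < 90.0))
x0, x1 = depth_mm[i - 1], depth_mm[i]
y0, y1 = pdd_pct[i - 1], pdd_pct[i]
r90 = x0 + (90.0 - y0) * (x1 - x0) / (y1 - y0)
print(f"R90 = {r90:.2f} mm")  # 268.75 mm for these sample points
```

Sub-millimeter range agreement between codes, as reported above, therefore depends on both the PDD sampling grid and the interpolation scheme.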

  20. [Features of PHITS and its application to medical physics].

    PubMed

    Hashimoto, Shintaro; Niita, Koji; Matsuda, Norihiro; Iwamoto, Yosuke; Iwase, Hiroshi; Sato, Tatsuhiko; Noda, Shusaku; Ogawa, Tatsuhiko; Nakashima, Hiroshi; Fukahori, Tokio; Furuta, Takuya; Chiba, Satoshi

    2013-01-01

    PHITS is a general-purpose Monte Carlo particle transport simulation code that analyzes the transport in three-dimensional phase space, and the collisions, of nearly all particles, including heavy ions, over a wide energy range up to 100 GeV/u. Various quantities, such as particle fluence and energy deposited in materials, can be deduced using estimator functions called "tallies". Recently, a microdosimetric tally function was developed to apply PHITS to medical physics. Owing to these features, PHITS has been used for medical applications such as radiation therapy and radiation protection.

  1. Simulations of an accelerator-based shielding experiment using the particle and heavy-ion transport code system PHITS.

    PubMed

    Sato, T; Sihver, L; Iwase, H; Nakashima, H; Niita, K

    2005-01-01

    In order to estimate the biological effects of HZE particles, accurate knowledge of the physics of interaction of HZE particles is necessary. Since the heavy-ion transport problem is a complex one, both experimental and theoretical studies are needed to develop accurate transport models. RIST and JAERI (Japan), GSI (Germany), and Chalmers (Sweden) are therefore currently developing and benchmarking the general-purpose Particle and Heavy-Ion Transport code System (PHITS), which is based on the NMTC and MCNP codes for nucleon/meson and neutron transport, respectively, and the JAM hadron cascade model. PHITS uses JAERI Quantum Molecular Dynamics (JQMD) and the Generalized Evaporation Model (GEM) for calculations of fission and evaporation processes, a model developed at NASA Langley for calculation of total reaction cross sections, and the SPAR model for stopping-power calculations. Future development of PHITS includes better parameterization of the JQMD model used for nucleus-nucleus reactions, improvement of the models used for calculating total reaction cross sections, addition of routines for calculating elastic scattering of heavy ions, and inclusion of radioactivity and burn-up processes. As part of an extensive benchmarking of PHITS, we have compared energy spectra of secondary neutrons created by reactions of HZE particles with different targets, with thicknesses ranging from <1 to 200 cm. We have also compared simulated and measured spatial, fluence, and depth-dose distributions from different high-energy heavy-ion reactions. In this paper, we report simulations of an accelerator-based shielding experiment in which a beam of 1 GeV/n Fe-ions passed through thin slabs of polyethylene, Al, and Pb at an acceptance angle of up to 4 degrees.

  2. Development of a new multi-modal Monte-Carlo radiotherapy planning system.

    PubMed

    Kumada, H; Nakamura, T; Komeda, M; Matsumura, A

    2009-07-01

    A new multi-modal Monte-Carlo radiotherapy planning system (development code: JCDS-FX) is under development at the Japan Atomic Energy Agency. This system builds on fundamental technologies of JCDS, which has been applied to actual boron neutron capture therapy (BNCT) trials at JRR-4. One of the features of JCDS-FX is that PHITS has been applied to the particle transport calculation. PHITS is a multi-purpose particle Monte-Carlo transport code; hence, application of PHITS enables evaluation of the total dose given to a patient by a combined-modality therapy. Moreover, JCDS-FX with PHITS can be used for studies of accelerator-based BNCT. To verify the calculation accuracy of JCDS-FX, dose evaluations for neutron irradiation of a cylindrical water phantom and for an actual clinical trial were performed, and the results were compared with calculations by JCDS with MCNP. The verification results demonstrated that JCDS-FX is applicable to BNCT treatment planning in practical use.

  3. Development of a multi-modal Monte-Carlo radiation treatment planning system combined with PHITS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumada, Hiroaki; Nakamura, Takemi; Komeda, Masao

    A new multi-modal Monte-Carlo radiation treatment planning system is under development at the Japan Atomic Energy Agency. This system (development code: JCDS-FX) builds on fundamental technologies of JCDS. JCDS was developed by JAEA to perform treatment planning for boron neutron capture therapy (BNCT), which is being conducted at JRR-4 in JAEA. JCDS has many advantages based on practical accomplishments in actual clinical BNCT trials at JRR-4, and these advantages have been carried over to JCDS-FX. One of the features of JCDS-FX is that PHITS has been applied to the particle transport calculation. PHITS is a multipurpose particle Monte-Carlo transport code; thus, application of PHITS enables evaluation of doses not only for BNCT but also for several radiotherapies such as proton therapy. To verify the calculation accuracy of JCDS-FX with PHITS for BNCT, treatment planning of an actual BNCT session conducted at JRR-4 was performed retrospectively. The verification results demonstrated that the new system is applicable to BNCT clinical trials in practical use. In the framework of R&D for laser-driven proton therapy, we have begun studying the application of JCDS-FX combined with PHITS to proton therapy in addition to BNCT. Several features and performance aspects of the new multi-modal Monte-Carlo radiotherapy planning system are presented.

  4. The PHITS code for space applications: status and recent developments

    NASA Astrophysics Data System (ADS)

    Sihver, Lembit; Ploc, Ondrej; Sato, Tatsuhiko; Niita, Koji; Hashimoto, Shintaro; El-Jaby, Samy

    Since COSPAR 2012, the Particle and Heavy Ion Transport code System, PHITS, has been upgraded and released to the public [1]. The code has been improved, as has the content of its package, such as the attached data libraries. In the new version, the intra-nuclear cascade models INCL4.6 and INC-ELF have been implemented, as well as the Kurotama model for the total reaction cross sections. The accuracies of the new reaction models for transporting galactic cosmic rays (GCR) were investigated by comparison with experimental data. The incorporation of these models has improved the capability of PHITS to perform particle transport simulations for different space applications. A methodology for assessing the pre-mission exposure of space crew aboard the ISS has been developed in terms of an effective dose equivalent [2]. PHITS was used to calculate the particle transport of the GCR and trapped radiation through the hull of the ISS. Using the predicted spectra and fluence-to-dose conversion factors, the semi-empirical ISSCREM code [3,4,5] was then scaled to predict the effective dose equivalent. This methodology provides an opportunity for pre-flight predictions of the effective dose equivalent, which can be compared to post-flight estimates, and therefore offers a means to assess the impact of radiation exposure on ISS flight crew. We have also simulated [6] the protective curtain experiment, which was performed to test the efficiency of water-soaked hygienic tissue wipes and towels as simple and cost-effective additional spacecraft shielding. The dose from trapped particles and low-energy GCR was significantly reduced, showing that the protective curtains are efficient when applied on spacecraft in LEO. The results of these benchmark calculations, as well as the mentioned applications of PHITS to space dosimetry, will be presented.
    [1] T. Sato et al., J. Nucl. Sci. Technol. 50, 913-923 (2013). [2] S. El-Jaby et al., Adv. Space Res., doi: http://dx.doi.org/10.1016/j.asr.2013.12.022 (2013). [3] S. El-Jaby et al., Adv. Space Res., doi: http://dx.doi.org/10.1016/j.asr.2013.10.006 (2013). [4] S. El-Jaby et al., in Proc. IEEE Aerospace Conference, Big Sky, MT, USA (2013). [5] S. El-Jaby, PhD thesis, Royal Military College of Canada (2012). [6] O. Ploc et al., Adv. Space Res. 52, 1911-1918 (2013).

  5. Comparative study of Monte Carlo particle transport code PHITS and nuclear data processing code NJOY for recoil cross section spectra under neutron irradiation

    NASA Astrophysics Data System (ADS)

    Iwamoto, Yosuke; Ogawa, Tatsuhiko

    2017-04-01

    Because primary knock-on atoms (PKAs) create point defects and clusters in materials that are irradiated with neutrons, it is important to validate the calculations of recoil cross section spectra that are used to estimate radiation damage in materials. Here, the recoil cross section spectra of fission- and fusion-relevant materials were calculated using the Event Generator Mode (EGM) of the Particle and Heavy Ion Transport code System (PHITS) and also using the data processing code NJOY2012 with the nuclear data libraries TENDL2015, ENDF/B-VII.1, and JEFF3.2. The heating number, which is the integral of the recoil cross section spectra, was also calculated using PHITS-EGM and compared with data extracted from the ACE files of TENDL2015, ENDF/B-VII.1, and JENDL4.0. In general, only a small difference was found between the PKA spectra of PHITS + TENDL2015 and NJOY + TENDL2015. From analyzing the recoil cross section spectra extracted from the nuclear data libraries using NJOY2012, we found that the recoil cross section spectra were incorrect for 72Ge, 75As, 89Y, and 109Ag in the ENDF/B-VII.1 library, and for 90Zr and 55Mn in the JEFF3.2 library. From analyzing the heating number, we found that the data extracted from the ACE file of TENDL2015 were problematic for all nuclides in the neutron capture region, because of incorrect data regarding the emitted gamma energy. However, PHITS + TENDL2015 can calculate PKA spectra and heating numbers correctly.
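    As a side note on the quantity used in this record: the heating number is the integral of the energy-weighted recoil cross-section spectrum. A minimal sketch of such an integration with the trapezoidal rule, using purely illustrative grid and spectrum values (hypothetical numbers, not data from TENDL, ENDF/B, or JENDL):

```python
# Hypothetical recoil-energy grid (eV) and differential cross section
# d(sigma)/dE_R (barn/eV); the values are illustrative only.
energies = [1e3, 1e4, 1e5, 1e6]
spectrum = [2e-4, 1e-4, 2e-5, 1e-6]

def heating_number(grid, dsigma_dE):
    """Trapezoidal integral of E_R * d(sigma)/dE_R over the recoil-energy
    grid, i.e. the energy-weighted recoil spectrum (eV*barn)."""
    total = 0.0
    for i in range(len(grid) - 1):
        f0 = grid[i] * dsigma_dE[i]          # integrand at left bin edge
        f1 = grid[i + 1] * dsigma_dE[i + 1]  # integrand at right bin edge
        total += 0.5 * (f0 + f1) * (grid[i + 1] - grid[i])
    return total

h = heating_number(energies, spectrum)  # eV*barn
```

    In practice the spectra would come from a processed library or a PHITS tally on a much finer grid; the sketch only shows the energy weighting and integration step.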

  6. Energy deposition calculated by PHITS code in Pb spallation target

    NASA Astrophysics Data System (ADS)

    Yu, Quanzhi

    2016-01-01

    Energy deposition in a Pb spallation target irradiated by high-energy protons was calculated with the PHITS2.52 code. The energy deposition and neutron production calculated by PHITS were validated, and the simulation results agree well with the experimental data. A detailed comparison shows that PHITS overestimated the total energy deposition by about 15% relative to the experimental data; for the energy deposition along the length of the Pb target, the discrepancy appeared mainly at the front part of the target. The calculation indicates that most of the energy deposition comes from ionization by the primary protons and the secondary particles they produce. With the event generator mode of PHITS, the deposited energy distribution for the particles and light nuclei is presented for the first time. It indicates that primary protons with energies above 100 MeV are the main contributors to the total energy deposition. The energy depositions peaking at 10 MeV and 0.1 MeV are mainly caused by electrons, pions, d, t, 3He, and α particles during the cascade and evaporation processes, respectively. The energy deposition densities caused by different proton beam profiles were also calculated and compared. Such calculations and analyses are helpful for better understanding the physical mechanism of energy deposition in a spallation target, and useful for its thermal-hydraulic design.

  7. Depth dependency of neutron density produced by cosmic rays in the lunar subsurface

    NASA Astrophysics Data System (ADS)

    Ota, S.; Sihver, L.; Kobayashi, S.; Hasebe, N.

    2014-11-01

    The depth dependency of neutrons produced by cosmic rays (CRs) in the lunar subsurface was estimated using the three-dimensional Monte Carlo particle and heavy ion transport simulation code PHITS, incorporating the latest high-energy nuclear data library, JENDL/HE-2007. The PHITS simulations of equilibrium neutron density profiles in the lunar subsurface were compared with the measurements of the Apollo 17 Lunar Neutron Probe Experiment (LNPE). Our calculations reproduced the LNPE data except for the 350-400 mg/cm2 region, under improved conditions that used a CR spectra model based on the latest observations, well-tested nuclear interaction models with systematic cross-section data, and JENDL/HE-2007.

  8. Analysis of multi-fragmentation reactions induced by relativistic heavy ions using the statistical multi-fragmentation model

    NASA Astrophysics Data System (ADS)

    Ogawa, T.; Sato, T.; Hashimoto, S.; Niita, K.

    2013-09-01

    The fragmentation cross sections of relativistic-energy nucleus-nucleus collisions were analyzed using the statistical multi-fragmentation model (SMM) incorporated into the Monte Carlo radiation transport simulation code Particle and Heavy Ion Transport code System (PHITS). Comparison with literature data showed that PHITS-SMM reproduces the fragmentation cross sections of heavy nuclei at relativistic energies better than the original PHITS, by up to two orders of magnitude. It was also found that SMM does not degrade the neutron production cross sections in heavy-ion collisions or the fragmentation cross sections of light nuclei, for which SMM has not been benchmarked. Therefore, SMM is a robust model that can supplement conventional nucleus-nucleus reaction models, enabling more accurate prediction of fragmentation cross sections.

  9. GEANT4 benchmark with MCNPX and PHITS for activation of concrete

    NASA Astrophysics Data System (ADS)

    Tesse, Robin; Stichelbaut, Frédéric; Pauly, Nicolas; Dubus, Alain; Derrien, Jonathan

    2018-02-01

    The activation of concrete is a real problem from the point of view of waste management. Because of the complexity of the issue, Monte Carlo (MC) codes have become an essential tool for its study, but several MC codes, and several nuclear models within them, are available. MCNPX and PHITS have already been validated for shielding studies, and GEANT4 is also a suitable option. In these codes, different models can be considered for a concrete activation study. The Bertini model is not the best model for spallation, while the BIC and INCL models agree well with previous results in the literature.

  10. Study on radiation production in the charge stripping section of the RISP linear accelerator

    NASA Astrophysics Data System (ADS)

    Oh, Joo-Hee; Oranj, Leila Mokhtari; Lee, Hee-Seock; Ko, Seung-Kook

    2015-02-01

    The linear accelerator of the Rare Isotope Science Project (RISP) accelerates 200 MeV/nucleon 238U ions in multiple charge states. Many kinds of radiation are generated while the primary beam is transported along the beam line. The stripping process using a thin carbon foil leads to a complicated radiation environment at the 90-degree bending section. The charge distribution of 238U ions after the carbon charge stripper was calculated using the LISE++ program. The radiation environments were estimated using the well-proven Monte Carlo codes PHITS and FLUKA. The tracks of 238U ions in various charge states were identified using the magnetic field subroutine of the PHITS code. The dose distribution caused by U beam losses along those tracks was obtained over the accelerator tunnel. A modified calculation was applied for tracking the multi-charged U beams, because PHITS and FLUKA were fundamentally designed to transport fully ionized ion beams. In this study, the beam loss pattern after the stripping section was observed, and the radiation production by heavy ions was studied. Finally, the performance of the PHITS and FLUKA codes in estimating the radiation production at the stripping section was validated by applying the modified method.

  11. Benchmark of neutron production cross sections with Monte Carlo codes

    NASA Astrophysics Data System (ADS)

    Tsai, Pi-En; Lai, Bo-Lun; Heilbronn, Lawrence H.; Sheu, Rong-Jiun

    2018-02-01

    Aiming to provide critical information for heavy-ion therapy, radiation shielding in space, and facility design for heavy-ion research accelerators, the physics models in three Monte Carlo simulation codes, PHITS, FLUKA, and MCNP6, were systematically benchmarked against fifteen sets of experimental data for neutron production cross sections, covering various combinations of 12C, 20Ne, 40Ar, 84Kr, and 132Xe projectiles and natLi, natC, natAl, natCu, and natPb target nuclides at incident energies between 135 MeV/nucleon and 600 MeV/nucleon. For neutron energies above 60% of the projectile energy per nucleon, LAQGSM03.03 in MCNP6, JQMD/JQMD-2.0 in PHITS, and RQMD-2.4 in FLUKA all show better agreement with data for heavy-projectile systems than for light-projectile systems, suggesting that the collective properties of projectile nuclei and nucleon interactions in the nucleus should be considered for light projectiles. For intermediate-energy neutrons, whose energies are below 60% of the projectile energy per nucleon and above 20 MeV, FLUKA is likely to overestimate the secondary neutron production, while MCNP6 tends towards underestimation. PHITS with JQMD shows a mild tendency for underestimation, but the JQMD-2.0 model, with a modified physics description for central collisions, generally improves the agreement between data and calculations. For low-energy neutrons (below 20 MeV), which are dominated by the evaporation mechanism, PHITS (which uses GEM linked with JQMD and JQMD-2.0) and FLUKA both tend to overestimate the production cross section, whereas MCNP6 underestimates more systems than it overestimates. For total neutron production cross sections, the trends of the benchmark results over the entire energy range are similar to those seen in the dominant energy region. Also, the comparison of GEM coupled with either JQMD or JQMD-2.0 in the PHITS code indicates that the model used to describe the first stage of a nucleus-nucleus collision also affects low-energy neutron production; a proper combination of the two physics models is therefore needed to reproduce the measured results. In addition, code users should be aware that certain models consistently produce secondary neutrons within a constant fraction of another model in certain energy regions, which may be correlated with different physics treatments in the models.

  12. JASMIN: Japanese-American study of muon interactions and neutron detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakashima, Hiroshi; /JAEA, Ibaraki; Mokhov, N.V.

    Experimental studies of shielding and radiation effects at Fermi National Accelerator Laboratory (FNAL) have been carried out under a collaboration between FNAL and Japan, aiming at benchmarking of simulation codes and the study of irradiation effects for the upgrade and design of new high-energy accelerator facilities. The purposes of this collaboration are (1) acquisition of shielding data in a proton beam energy domain above 100 GeV; (2) further evaluation of the predictive accuracy of the PHITS and MARS codes; (3) modification of physics models and data in these codes if needed; (4) establishment of an irradiation field for radiation effect tests; and (5) development of a code module for improved description of radiation effects. A series of experiments has been performed at the Pbar target station and the NuMI facility, using irradiation of targets with 120 GeV protons for antiproton and neutrino production, as well as the M-test beam line for measuring nuclear data and detector responses. Various nuclear and shielding data have been measured by activation methods with chemical separation techniques, as well as by other detectors such as a Bonner ball counter. Analyses of the experimental data are in progress for benchmarking the PHITS and MARS15 codes. In this presentation, recent activities and results are reviewed.

  13. An intervention for enhancing public health crisis response willingness among local health department workers: a qualitative programmatic analysis.

    PubMed

    Harrison, Krista L; Errett, Nicole A; Rutkow, Lainie; Thompson, Carol B; Anderson, Marilyn K; Ferrell, Justin L; Freiheit, Jennifer M; Hudson, Robert; Koch, Michelle M; McKee, Mary; Mejia-Echeverry, Alvaro; Spitzer, James B; Storey, Doug; Barnett, Daniel J

    2014-01-01

    This study evaluated the impact of a novel multimethod curricular intervention using a train-the-trainer model: the Public Health Infrastructure Training (PHIT). PHIT was designed to 1) modify perceptions of self-efficacy, response efficacy, and threat related to specific hazards and 2) improve the willingness of local health department (LHD) workers to report to duty when called upon. Between June 2009 and October 2010, eight clusters of US LHDs (n = 49) received PHIT. Two rounds of focus groups at each intervention site were used to evaluate PHIT. The first round of focus groups included separate sessions for trainers and trainees, 3 weeks after PHIT. The second round of focus groups combined trainers and trainees in a single group at each site 6 months following PHIT. During the second focus group round, participants were asked to self-assess their preparedness before and after PHIT implementation. Focus groups were conducted at eight geographically representative clusters of LHDs. Focus group participants included PHIT trainers and PHIT trainees within each LHD cluster. Focus groups were used to assess attitudes toward the curricular intervention and modifications of willingness to respond (WTR) to an emergency; self-efficacy; and response efficacy. Participants reported that despite challenges in administering the training, PHIT was well designed and appropriate for multiple management levels and disciplines. Positive mean changes were observed for all nine self-rated preparedness factors (p < 0.001). The findings show PHIT's benefit in improving self-efficacy and WTR among participants. The PHIT has the potential to enhance emergency response willingness and related self-efficacy among LHD workers.

  14. Shielding evaluation for solar particle events using MCNPX, PHITS and OLTARIS codes

    NASA Astrophysics Data System (ADS)

    Aghara, S. K.; Sriprisan, S. I.; Singleterry, R. C.; Sato, T.

    2015-01-01

    Detailed analyses of Solar Particle Events (SPE) were performed to calculate primary and secondary particle spectra behind aluminum, at various thicknesses in water. The simulations were based on the Monte Carlo (MC) radiation transport codes MCNPX 2.7.0 and PHITS 2.64, and the space radiation analysis website OLTARIS (On-Line Tool for the Assessment of Radiation in Space) version 3.4, which uses the deterministic code HZETRN for transport. The study investigates the impact of SPE spectra transported through a 10 or 20 g/cm2 Al shield followed by a 30 g/cm2 water slab. Four historical SPE events were selected and used as input source spectra; particle differential spectra for protons, neutrons, and photons are presented, along with the total particle fluence as a function of depth. In addition to particle flux, dose and dose equivalent values were calculated and compared between the codes and with other published results. Overall, the particle fluence spectra from all three codes show good agreement, with the MC codes agreeing more closely with each other than with the OLTARIS results. The neutron fluence from OLTARIS is lower than that from the MC codes at lower energies (E < 100 MeV). Based on a mean-square-difference analysis, the results from MCNPX and PHITS agree better with each other for fluence, dose, and dose equivalent than with the OLTARIS results.

  15. PHITS-2.76, Particle and Heavy Ion Transport code System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-08-01

    Version 03 PHITS can deal with the transport of almost all particles (nucleons, nuclei, mesons, photons, and electrons) over wide energy ranges, using several nuclear reaction models and nuclear data libraries. The geometrical configuration of the simulation can be set with GG (General Geometry) or CG (Combinatorial Geometry). Various quantities such as heat deposition, track length, and production yields can be deduced from the simulation using implemented estimator functions called "tallies". The code also has a function to draw 2D and 3D figures of the calculated results, as well as of the setup geometries, using the code ANGEL. The physical processes included in PHITS can be divided into two categories: transport processes and collision processes. In the transport process, PHITS can simulate the motion of particles under external fields such as magnetic and gravitational fields. Without external fields, neutral particles move along a straight trajectory with constant energy up to the next collision point. Charged particles, however, interact many times with the electrons in the material, losing energy and changing direction. PHITS treats ionization not as a collision but as a transport process, using the continuous-slowing-down approximation. The average stopping power is given by the charge density of the material and the momentum of the particle, taking into account the fluctuations of the energy loss and the angular deviation. In the collision process, PHITS can simulate elastic and inelastic interactions as well as the decay of particles. The total reaction cross section, or the lifetime of the particle, is an essential quantity in the determination of the mean free path of the transported particle. According to the mean free path, PHITS chooses the next collision point using the Monte Carlo method. To generate the secondary particles of a collision, the information on the final states of the collision is needed.
    For neutron-induced reactions in the low-energy region, PHITS employs cross sections from the evaluated nuclear data library JENDL-4.0 (Shibata et al. 2011). For high-energy neutrons and other particles, several models such as JAM (Nara et al. 1999), INCL (Cugnon et al. 2011), INC-ELF (Sawada et al. 2012), and JQMD (Niita et al. 1995) have been incorporated to simulate nuclear reactions up to 100 GeV/u. The special features of PHITS are the event generator mode (Iwamoto et al. 2007) and the microdosimetric function (Sato et al. 2009). Owing to the event generator mode, PHITS can determine the profiles of all secondary particles generated from a single nuclear interaction, even when using nuclear data libraries, taking momentum and energy conservation into account. The microdosimetric function gives the probability densities of deposition energy in microscopic sites, such as the lineal energy y and the specific energy z, using a mathematical model developed from the results of track structure simulations. These features are very important for purposes such as estimating the soft-error rates of semiconductor devices induced by neutrons, and the relative biological effectiveness of charged particles. From version 2.64, prompt gamma spectra and isomer production rates can be precisely estimated, owing to the implementation of EBITEM (ENSDF-Based Isomeric Transition and isomEr production Model), and the photo-nuclear reaction model was improved up to 140 MeV. From version 2.76, an electron and photon transport algorithm based on EGS5 (Hirayama et al. 2005) was incorporated, models describing photo-nuclear reactions above 140 MeV and muon-nuclear reactions were implemented, version 2 of the event generator mode was developed, and the relativistic treatment can be considered in the JQMD model.
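    The collision-point selection described in this record (choosing the next collision point from the mean free path) can be illustrated with a short sketch. This is the generic Monte Carlo path-length sampling technique, not PHITS source code, and the cross section and atom density below are illustrative assumptions:

```python
import math
import random

def sample_flight_distance(sigma_cm2, atom_density_cm3, rng=random.random):
    """Sample the straight-flight distance to the next collision.
    The macroscopic cross section Sigma = n * sigma (1/cm) gives the mean
    free path lambda = 1/Sigma; free paths are exponentially distributed,
    so inverse-CDF sampling gives d = -lambda * ln(1 - u), u uniform in [0,1)."""
    macroscopic = atom_density_cm3 * sigma_cm2   # Sigma (1/cm)
    mean_free_path = 1.0 / macroscopic           # lambda (cm)
    return -mean_free_path * math.log(1.0 - rng())

# Illustrative material: 3 barn total cross section, 8.5e22 atoms/cm^3,
# giving a mean free path of roughly 4 cm.
random.seed(0)
distances = [sample_flight_distance(3e-24, 8.5e22) for _ in range(100000)]
```

    A transport code repeats this sampling segment by segment, re-evaluating Sigma whenever the particle crosses into a new material or its energy changes.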

  16. Study on detection geometry and detector shielding for portable PGNAA system using PHITS

    NASA Astrophysics Data System (ADS)

    Ithnin, H.; Dahing, L. N. S.; Lip, N. M.; Rashid, I. Q. Abd; Mohamad, E. J.

    2018-01-01

    Prompt gamma-ray neutron activation analysis (PGNAA) measurements require efficient detectors for gamma-ray detection. Apart from experimental studies, the Monte Carlo (MC) method has become one of the most popular tools in detector studies. The absolute efficiency of a 2 × 2 inch cylindrical sodium iodide (NaI) detector was modelled using the PHITS software and compared with previous studies in the literature. In the present work, the PHITS code is used to optimize a portable PGNAA system using the validated NaI detector. The detection geometry is optimized by moving the detector along the sample to find the highest intensity of the prompt gammas generated from the sample. Shielding materials for the validated NaI detector are also studied to find the best option for the PGNAA system setup. The results show that the optimum position for the detector is on the surface of the sample, about 15 cm from the source, and indicate that this procedure can be followed to determine the best setup of a PGNAA system for different sample sizes and detector types. It can be concluded that the PHITS code is a strong tool not only for efficiency studies but also for the optimization of PGNAA systems.

  17. Shielding evaluation for solar particle events using MCNPX, PHITS and OLTARIS codes.

    PubMed

    Aghara, S K; Sriprisan, S I; Singleterry, R C; Sato, T

    2015-01-01

    Detailed analyses of Solar Particle Events (SPE) were performed to calculate primary and secondary particle spectra behind aluminum, at various thicknesses in water. The simulations were based on the Monte Carlo (MC) radiation transport codes MCNPX 2.7.0 and PHITS 2.64, and the space radiation analysis website OLTARIS (On-Line Tool for the Assessment of Radiation in Space) version 3.4, which uses the deterministic code HZETRN for transport. The study investigates the impact of SPE spectra transported through a 10 or 20 g/cm(2) Al shield followed by a 30 g/cm(2) water slab. Four historical SPE events were selected and used as input source spectra; particle differential spectra for protons, neutrons, and photons are presented, along with the total particle fluence as a function of depth. In addition to particle flux, dose and dose equivalent values were calculated and compared between the codes and with other published results. Overall, the particle fluence spectra from all three codes show good agreement, with the MC codes agreeing more closely with each other than with the OLTARIS results. The neutron fluence from OLTARIS is lower than that from the MC codes at lower energies (E<100 MeV). Based on a mean-square-difference analysis, the results from MCNPX and PHITS agree better with each other for fluence, dose, and dose equivalent than with the OLTARIS results.

  18. Distributions of neutron yields and doses around a water phantom bombarded with 290-MeV/nucleon and 430-MeV/nucleon carbon ions

    NASA Astrophysics Data System (ADS)

    Satoh, D.; Kajimoto, T.; Shigyo, N.; Itashiki, Y.; Imabayashi, Y.; Koba, Y.; Matsufuji, N.; Sanami, T.; Nakao, N.; Uozumi, Y.

    2016-11-01

    Double-differential neutron yields from a water phantom bombarded with 290-MeV/nucleon and 430-MeV/nucleon carbon ions were measured at emission angles of 15°, 30°, 45°, 60°, 75°, and 90°, and angular distributions of neutron yields and doses around the phantom were obtained. The experimental data were compared with results of the Monte-Carlo simulation code PHITS. The PHITS results showed good agreement with the measured data. On the basis of the PHITS simulation, we estimated the angular distributions of neutron yields and doses from 0° to 180° including thermal neutrons.

  19. Radial dependence of lineal energy distribution of 290-MeV/u carbon and 500-MeV/u iron ion beams using a wall-less tissue-equivalent proportional counter

    PubMed Central

    Tsuda, Shuichi; Sato, Tatsuhiko; Watanabe, Ritsuko; Takada, Masashi

    2015-01-01

    Using a wall-less tissue-equivalent proportional counter for a 0.72-μm site in tissue, we measured the radial dependence of the lineal energy distribution, yf(y), of 290-MeV/u carbon and 500-MeV/u iron ion beams. The measured yf(y) distributions and the dose-mean lineal energy, ȳD, were compared with calculations performed with the track structure simulation code TRACION and the microdosimetric function of the Particle and Heavy Ion Transport code System (PHITS). The measured values of ȳD were consistent with the calculated results within an error of 2%, but differences in the shape of yf(y) were observed for iron ion irradiation. This result indicates that further improvement of the PHITS calculation model for the yf(y) distribution is needed in the analytical function that describes energy deposition by delta rays, particularly for primary ions with linear energy transfer in excess of a few hundred keV μm−1. PMID:25210053
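    For reference, the dose-mean lineal energy compared in this record is defined, for a discrete frequency distribution f(y), as ȳD = Σ y²f(y) / Σ yf(y). A minimal sketch with purely illustrative bin values (not the measured spectra of this study):

```python
def dose_mean_lineal_energy(y_bins, f_vals):
    """Dose-mean lineal energy of a discrete frequency distribution f(y):
    yD = sum(y^2 * f) / sum(y * f)."""
    num = sum(y * y * f for y, f in zip(y_bins, f_vals))
    den = sum(y * f for y, f in zip(y_bins, f_vals))
    return num / den

# Illustrative lineal-energy bins (keV/um) and frequencies; not measured data.
y = [1.0, 5.0, 20.0, 100.0]
f = [0.5, 0.3, 0.15, 0.05]
y_dose_mean = dose_mean_lineal_energy(y, f)  # keV/um
```

    The dose weighting (the extra factor of y in the numerator) is why rare high-y events dominate ȳD even when they are infrequent in f(y).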

  20. Comparison between calculation and measured data on secondary neutron energy spectra by heavy ion reactions from different thick targets.

    PubMed

    Iwase, H; Wiegel, B; Fehrenbacher, G; Schardt, D; Nakamura, T; Niita, K; Radon, T

    2005-01-01

    Measured neutron energy fluences from high-energy heavy-ion reactions on targets several centimeters to several hundred centimeters thick were compared with calculations made using the recently developed general-purpose Particle and Heavy Ion Transport code System (PHITS). It was confirmed that PHITS represents neutron production by heavy-ion reactions and neutron transport in thick shielding with good overall accuracy.

  1. Benchmarking of Neutron Production of Heavy-Ion Transport Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Remec, Igor; Ronningen, Reginald M.; Heilbronn, Lawrence

    Accurate prediction of radiation fields generated by heavy ion interactions is important in medical applications, space missions, and in design and operation of rare isotope research facilities. In recent years, several well-established computer codes in widespread use for particle and radiation transport calculations have been equipped with the capability to simulate heavy ion transport and interactions. To assess and validate these capabilities, we performed simulations of a series of benchmark-quality heavy ion experiments with the computer codes FLUKA, MARS15, MCNPX, and PHITS. We focus on the comparisons of secondary neutron production. Results are encouraging; however, further improvements in models and codes and additional benchmarking are required.

  2. Benchmarking of Heavy Ion Transport Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Remec, Igor; Ronningen, Reginald M.; Heilbronn, Lawrence

    Accurate prediction of radiation fields generated by heavy ion interactions is important in medical applications, space missions, and in design and operation of rare isotope research facilities. In recent years, several well-established computer codes in widespread use for particle and radiation transport calculations have been equipped with the capability to simulate heavy ion transport and interactions. To assess and validate these capabilities, we performed simulations of a series of benchmark-quality heavy ion experiments with the computer codes FLUKA, MARS15, MCNPX, and PHITS. We focus on the comparisons of secondary neutron production. Results are encouraging; however, further improvements in models and codes and additional benchmarking are required.

  3. Annual Systems Engineering Conference: Focusing on Improving Performance of Defense Systems Programs (10th). Volume 3. Thursday Presentations

    DTIC Science & Technology

    2007-10-25

    the Phit < .0001 requirement) restricts tactical delivery conditions, the probability of a fragment hit may be further qualified by considering only...Pkill – UK uses "self damage" metric • Risk Analysis: "If the above procedures (Phit or Pkill < .0001) still result in restricting tactical delivery...10 (From NAWCWD Briefing) 4 Safe Escape Analysis Requirements: Calculate Phit, Pkill, and Pdet; Is Phit <= .0001 for all launch conditions? Done NO YES

  4. Study of an External Neutron Source for an Accelerator-Driven System using the PHITS Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sugawara, Takanori; Iwasaki, Tomohiko; Chiba, Takashi

    A code system for the Accelerator Driven System (ADS) has been under development for analyzing the dynamic behavior of a subcritical core coupled with an accelerator. This code system, named DSE (Dynamics calculation code system for a Subcritical system with an External neutron source), consists of an accelerator part and a reactor part. The accelerator part employs a database, calculated using PHITS, for investigating accelerator-related effects such as changes of beam energy, beam diameter, void generation, and target level. This analysis method using the database may introduce some errors into the dynamics calculations, since the neutron source data derived from the database carry some errors from the fitting or interpolation procedures. In this study, the effects of various events are investigated to confirm that the method based on the database is appropriate.

  5. Estimating neutron dose equivalent rates from heavy ion reactions around 10 MeV amu(-1) using the PHITS code.

    PubMed

    Iwamoto, Yosuke; Ronningen, R M; Niita, Koji

    2010-04-01

    It has sometimes been necessary for personnel to work in areas where low-energy heavy ions interact with targets or with beam transport equipment and thereby produce significant levels of radiation. Methods to predict doses and to assist shielding design are therefore desirable. The Particle and Heavy Ion Transport code System (PHITS) has typically been used to predict radiation levels around high-energy (above 100 MeV amu(-1)) heavy-ion accelerator facilities. However, predictions by PHITS of radiation levels around low-energy (around 10 MeV amu(-1)) heavy-ion facilities have, to our knowledge, not yet been investigated. The influence of the "switching time" in PHITS calculations of low-energy heavy-ion reactions, defined as the time when the JAERI Quantum Molecular Dynamics model (JQMD) calculation stops and the Generalized Evaporation Model (GEM) calculation begins, was studied using neutron energy spectra from 6.25 MeV amu(-1) and 10 MeV amu(-1) (12)C ions and 10 MeV amu(-1) (16)O ions incident on a copper target. Using a value of 100 fm c(-1) for the switching time, the calculated neutron energy spectra agree well with the experimental data. PHITS was then used with this switching time to simulate an experimental study by Ohnesorge et al., calculating the neutron dose equivalent rates produced by 3 MeV amu(-1) to 16 MeV amu(-1) (12)C, (14)N, (16)O, and (20)Ne beams incident on iron, nickel, and copper targets. The calculated neutron dose equivalent rates agree very well with the data and follow a general pattern that appears to be insensitive to the heavy-ion species but sensitive to the target material.

  6. Nuclear Reaction Models Responsible for Simulation of Neutron-induced Soft Errors in Microelectronics

    NASA Astrophysics Data System (ADS)

    Watanabe, Y.; Abe, S.

    2014-06-01

    Terrestrial neutron-induced soft errors in MOSFETs, from a 65 nm down to a 25 nm design rule, are analyzed by means of multi-scale Monte Carlo simulation using the PHITS-HyENEXSS code system. The nuclear reaction models implemented in the PHITS code are validated by comparisons with experimental data. From the analysis of calculated soft error rates, it is clarified that secondary He and H ions have a major impact on soft errors with decreasing critical charge. It is also found that the high-energy component, from 10 MeV up to several hundred MeV, of secondary cosmic-ray neutrons is the most significant source of soft errors regardless of design rule.

  7. Dose estimation for astronauts using dose conversion coefficients calculated with the PHITS code and the ICRP/ICRU adult reference computational phantoms.

    PubMed

    Sato, Tatsuhiko; Endo, Akira; Sihver, Lembit; Niita, Koji

    2011-03-01

    Absorbed-dose and dose-equivalent rates for astronauts were estimated by multiplying fluence-to-dose conversion coefficients, in units of Gy.cm(2) and Sv.cm(2), respectively, by cosmic-ray fluxes around spacecraft in units of cm(-2) s(-1). The dose conversion coefficients employed in the calculation were evaluated using the general-purpose particle and heavy ion transport code system PHITS coupled to the male and female adult reference computational phantoms, which were released as a common ICRP/ICRU publication. The cosmic-ray fluxes inside and near spacecraft were also calculated by PHITS, using simplified geometries. The accuracy of the obtained absorbed-dose and dose-equivalent rates was verified against various experimental data measured both inside and outside spacecraft. The calculations quantitatively show that the effective doses for astronauts are significantly greater than the corresponding effective dose equivalents, because of the numerical incompatibility between the radiation quality factors and the radiation weighting factors. These results demonstrate the usefulness of dose conversion coefficients in space dosimetry. © Springer-Verlag 2010
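
    The folding procedure this abstract describes — multiplying a binned cosmic-ray flux by fluence-to-dose conversion coefficients — can be sketched as follows; the three-bin spectrum and coefficient values are made-up placeholders, not PHITS or ICRP data:

```python
# Fold a binned fluence rate [cm^-2 s^-1] with fluence-to-dose
# conversion coefficients [pSv cm^2] to get a dose rate.
def dose_rate_uSv_per_h(flux_per_bin, coeff_pSv_cm2):
    assert len(flux_per_bin) == len(coeff_pSv_cm2)
    psv_per_s = sum(f * c for f, c in zip(flux_per_bin, coeff_pSv_cm2))
    return psv_per_s * 3600 * 1e-6  # pSv/s -> uSv/h

# placeholder 3-bin neutron spectrum and coefficients (illustrative only)
flux = [0.5, 0.2, 0.05]        # cm^-2 s^-1 per energy bin
coeff = [100.0, 300.0, 500.0]  # pSv cm^2 per energy bin
print(round(dose_rate_uSv_per_h(flux, coeff), 3))  # -> 0.486
```

    In a real evaluation the sum runs over all particle types and a fine energy grid, with the coefficients taken from the phantom calculations described above.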

  8. Comparison of PHITS, GEANT4, and HIBRAC simulations of depth-dependent yields of β(+)-emitting nuclei during therapeutic particle irradiation to measured data.

    PubMed

    Rohling, Heide; Sihver, Lembit; Priegnitz, Marlen; Enghardt, Wolfgang; Fiedler, Fine

    2013-09-21

    For quality assurance in particle therapy, a non-invasive, in vivo range verification is highly desired. Particle therapy positron-emission-tomography (PT-PET) is the only clinically proven method for this purpose up to now. It makes use of the β(+)-activity produced during the irradiation by the nuclear fragmentation processes between the therapeutic beam and the irradiated tissue. Since a direct comparison of β(+)-activity and dose is not feasible, a simulation of the expected β(+)-activity distribution is required. For this reason it is essential to have a quantitatively reliable code for the simulation of the yields of the β(+)-emitting nuclei at every position of the beam path. In this paper results of the three-dimensional Monte-Carlo simulation codes PHITS, GEANT4, and the one-dimensional deterministic simulation code HIBRAC are compared to measurements of the yields of the most abundant β(+)-emitting nuclei for carbon, lithium, helium, and proton beams. In general, PHITS underestimates the yields of positron-emitters. With GEANT4 the overall most accurate results are obtained. HIBRAC and GEANT4 provide comparable results for carbon and proton beams. HIBRAC is considered a good candidate for implementation into routine clinical PT-PET.

  9. Comparison of PHITS, GEANT4, and HIBRAC simulations of depth-dependent yields of β+-emitting nuclei during therapeutic particle irradiation to measured data

    NASA Astrophysics Data System (ADS)

    Rohling, Heide; Sihver, Lembit; Priegnitz, Marlen; Enghardt, Wolfgang; Fiedler, Fine

    2013-09-01

    For quality assurance in particle therapy, a non-invasive, in vivo range verification is highly desired. Particle therapy positron-emission-tomography (PT-PET) is the only clinically proven method for this purpose up to now. It makes use of the β+-activity produced during the irradiation by the nuclear fragmentation processes between the therapeutic beam and the irradiated tissue. Since a direct comparison of β+-activity and dose is not feasible, a simulation of the expected β+-activity distribution is required. For this reason it is essential to have a quantitatively reliable code for the simulation of the yields of the β+-emitting nuclei at every position of the beam path. In this paper results of the three-dimensional Monte-Carlo simulation codes PHITS, GEANT4, and the one-dimensional deterministic simulation code HIBRAC are compared to measurements of the yields of the most abundant β+-emitting nuclei for carbon, lithium, helium, and proton beams. In general, PHITS underestimates the yields of positron-emitters. With GEANT4 the overall most accurate results are obtained. HIBRAC and GEANT4 provide comparable results for carbon and proton beams. HIBRAC is considered a good candidate for implementation into routine clinical PT-PET.

  10. Benchmarking of neutron production of heavy-ion transport codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Remec, I.; Ronningen, R. M.; Heilbronn, L.

    Document available in abstract form only; full text of document follows: Accurate prediction of radiation fields generated by heavy ion interactions is important in medical applications, space missions, and in design and operation of rare isotope research facilities. In recent years, several well-established computer codes in widespread use for particle and radiation transport calculations have been equipped with the capability to simulate heavy ion transport and interactions. To assess and validate these capabilities, we performed simulations of a series of benchmark-quality heavy ion experiments with the computer codes FLUKA, MARS15, MCNPX, and PHITS. We focus on the comparisons of secondary neutron production. Results are encouraging; however, further improvements in models and codes and additional benchmarking are required. (authors)

  11. Radial dependence of lineal energy distribution of 290-MeV/u carbon and 500-MeV/u iron ion beams using a wall-less tissue-equivalent proportional counter.

    PubMed

    Tsuda, Shuichi; Sato, Tatsuhiko; Watanabe, Ritsuko; Takada, Masashi

    2015-01-01

    Using a wall-less tissue-equivalent proportional counter for a 0.72-μm site in tissue, we measured the radial dependence of the lineal energy distribution, yf(y), of 290-MeV/u carbon and 500-MeV/u iron ion beams. The measured yf(y) distributions and the dose-mean lineal energy, y(D), were compared with calculations performed with the track structure simulation code TRACION and the microdosimetric function of the Particle and Heavy Ion Transport code System (PHITS). The measured values of y(D) were consistent with the calculated results within an error of 2%, but differences in the shape of yf(y) were observed for iron ion irradiation. This result indicates that further improvement of the PHITS calculation model for the yf(y) distribution is needed in the analytical function that describes energy deposition by delta rays, particularly for primary ions having linear energy transfer in excess of a few hundred keV μm(-1). © The Author 2014. Published by Oxford University Press on behalf of The Japan Radiation Research Society and Japanese Society for Radiation Oncology.
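
    For a discrete lineal-energy frequency distribution f(y), the dose-mean of y compared in this study is the ratio sum(y^2 f)/sum(y f). A minimal sketch, with a placeholder y grid and frequencies rather than measured data:

```python
# Dose-mean lineal energy y_D from a discrete distribution f(y):
#   y_D = sum(y^2 * f(y)) / sum(y * f(y))
def dose_mean_lineal_energy(y, f):
    num = sum(yi * yi * fi for yi, fi in zip(y, f))
    den = sum(yi * fi for yi, fi in zip(y, f))
    return num / den

y = [1.0, 10.0, 100.0]   # lineal energy grid [keV/um] (illustrative)
f = [0.7, 0.25, 0.05]    # relative frequencies (illustrative)
print(round(dose_mean_lineal_energy(y, f), 2))  # -> 64.11
```

    The quadratic weighting explains why rare high-y events (e.g. from delta rays) dominate the dose-mean even when their frequency is small.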

  12. Feasibility study of nuclear transmutation by negative muon capture reaction using the PHITS code

    NASA Astrophysics Data System (ADS)

    Abe, Shin-ichiro; Sato, Tatsuhiko

    2016-06-01

    The feasibility of nuclear transmutation of fission products in high-level radioactive waste by the negative muon capture reaction is investigated using the Particle and Heavy Ion Transport code System (PHITS). It is found that about 80% of stopped negative muons contribute to transmuting the target nuclide into stable or short-lived nuclides in the case of 135Cs, one of the most important nuclides for transmutation. The simulation also indicates that the position of transmutation can be controlled by changing the energy of the incident negative muons. Based on our simulation, it would take approximately 8.5 × 10^8 years to transmute 500 g of 135Cs with the highest-intensity negative muon beam currently available.
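
    The quoted timescale can be checked with a back-of-the-envelope calculation; the intensity assumed below (~1e8 stopped muons per second) is our own stand-in for "the highest intensity currently available", not a figure taken from the paper:

```python
# Rough check: one stopped muon transmutes at most one nucleus, so the
# time is (number of atoms) / (muon rate x transmutation efficiency).
AVOGADRO = 6.02214076e23
SECONDS_PER_YEAR = 3.156e7

atoms = 500.0 / 135.0 * AVOGADRO   # atoms in 500 g of 135Cs
muon_rate = 1e8                    # stopped mu- per second (assumed)
efficiency = 0.8                   # fraction that transmutes 135Cs (from the abstract)
years = atoms / (muon_rate * efficiency) / SECONDS_PER_YEAR
print(f"{years:.1e}")              # prints 8.8e+08
```

    The result lands within a few tens of percent of the 8.5 × 10^8 years quoted above, consistent with an assumed beam intensity of that order.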

  13. Nuclear Reaction Models Responsible for Simulation of Neutron-induced Soft Errors in Microelectronics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watanabe, Y., E-mail: watanabe@aees.kyushu-u.ac.jp; Abe, S.

    Terrestrial neutron-induced soft errors in MOSFETs from a 65 nm down to a 25 nm design rule are analyzed by means of multi-scale Monte Carlo simulation using the PHITS-HyENEXSS code system. Nuclear reaction models implemented in PHITS code are validated by comparisons with experimental data. From the analysis of calculated soft error rates, it is clarified that secondary He and H ions provide a major impact on soft errors with decreasing critical charge. It is also found that the high energy component from 10 MeV up to several hundreds of MeV in secondary cosmic-ray neutrons is the most significant source of soft errors regardless of design rule.

  14. [Series: Medical Applications of the PHITS Code (2): Acceleration by Parallel Computing].

    PubMed

    Furuta, Takuya; Sato, Tatsuhiko

    2015-01-01

    Time-consuming Monte Carlo dose calculations have become feasible owing to advances in computer technology. Because these recent advances stem from the emergence of multi-core high-performance computers, parallel computing is key to achieving good software performance. The Monte Carlo simulation code PHITS contains two parallel computing functions: distributed-memory parallelization using the message passing interface (MPI) protocol and shared-memory parallelization using open multi-processing (OpenMP) directives. Users can choose between the two functions according to their needs. This paper explains the two functions, together with their advantages and disadvantages. Test applications are also provided to show their performance on a typical multi-core high-performance workstation.
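
    The history-splitting idea behind both parallel modes can be illustrated with a toy map-and-reduce: independent particle histories are divided among workers (MPI ranks or OpenMP threads) and their tallies are summed at the end. Python threads stand in for the workers here; none of this is PHITS code:

```python
from concurrent.futures import ThreadPoolExecutor
import random

def run_batch(seed, n_histories):
    rng = random.Random(seed)    # independent random stream per worker
    tally = 0.0
    for _ in range(n_histories):
        tally += rng.random()    # stand-in for scoring one transport history
    return tally

def parallel_tally(n_workers, histories_per_worker):
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        futures = [pool.submit(run_batch, seed, histories_per_worker)
                   for seed in range(n_workers)]
        return sum(f.result() for f in futures)  # the "reduce" step

total = parallel_tally(4, 25000)
print(total / 100000)            # mean score, close to 0.5
```

    The essential property, shared by the MPI and OpenMP modes, is that each worker uses an independent random stream so the combined tally is statistically equivalent to a single long run.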

  15. Measurements and parameterization of neutron energy spectra from targets bombarded with 120 GeV protons

    NASA Astrophysics Data System (ADS)

    Kajimoto, T.; Shigyo, N.; Sanami, T.; Iwamoto, Y.; Hagiwara, M.; Lee, H. S.; Soha, A.; Ramberg, E.; Coleman, R.; Jensen, D.; Leveling, A.; Mokhov, N. V.; Boehnlein, D.; Vaziri, K.; Sakamoto, Y.; Ishibashi, K.; Nakashima, H.

    2014-10-01

    The energy spectra of neutrons were measured by a time-of-flight method for 120 GeV protons on thick graphite, aluminum, copper, and tungsten targets with an NE213 scintillator at the Fermilab Test Beam Facility. Neutron energy spectra were obtained between 25 and 3000 MeV at emission angles of 30°, 45°, 120°, and 150°. The spectra were parameterized as neutron emissions from three moving sources and then compared with theoretical spectra calculated by the PHITS and FLUKA codes. The calculated spectra substantially underestimated the measured yields: the neutron yields integrated from 25 to 3000 MeV calculated with the PHITS code were 16-36% of the experimental yields, and those calculated with the FLUKA code were 26-57%, for all targets and emission angles.
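
    The time-of-flight method mentioned above converts a measured flight time into neutron kinetic energy via relativistic kinematics; a sketch, with a made-up flight path rather than the actual beamline geometry:

```python
import math

M_N_MEV = 939.565    # neutron rest energy [MeV]
C = 299792458.0      # speed of light [m/s]

def tof_energy_mev(flight_path_m, flight_time_s):
    """Kinetic energy from flight path L and time t: beta = L/(c t),
    E = m_n c^2 (gamma - 1) with gamma = 1/sqrt(1 - beta^2)."""
    beta = flight_path_m / (C * flight_time_s)
    gamma = 1.0 / math.sqrt(1.0 - beta * beta)
    return M_N_MEV * (gamma - 1.0)

# e.g. a neutron covering a 5 m flight path in 100 ns
print(round(tof_energy_mev(5.0, 100e-9), 2))  # about 13 MeV
```

    Longer flight paths improve the energy resolution at the high-energy end, where small timing errors translate into large energy errors.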

  16. PHITS simulations of the Matroshka experiment

    NASA Astrophysics Data System (ADS)

    Gustafsson, Katarina; Sihver, Lembit; Mancusi, Davide; Sato, Tatsuhiko

    In order to design safer space exploration, radiation exposure estimations are necessary; the radiation environment in space is very different from that on Earth and is harmful to humans and electronic equipment. The threat originates from two sources: Galactic Cosmic Rays and Solar Particle Events. It is important to understand what happens when these particles strike matter such as space vehicle walls, human organs and electronics. We are therefore developing a tool able to estimate the radiation exposure of both humans and electronics. The tool will be based on PHITS, the Particle and Heavy-Ion Transport code System, a three-dimensional Monte Carlo code which can calculate the interactions and transport of particles and heavy ions in matter. PHITS is developed through a collaboration between RIST (Research Organization for Information Science & Technology), JAEA (Japan Atomic Energy Agency) and KEK (High Energy Accelerator Research Organization) in Japan, and Chalmers University of Technology in Sweden. One method for benchmarking and developing the code is to simulate experiments performed in space or on Earth. We have carried out simulations of the Matroshka experiment, which focuses on determining the radiation load on astronauts inside and outside the International Space Station by using the torso of a tissue-equivalent human phantom filled with active and passive detectors located at the positions of critical tissues and organs. We will present the status and results of our simulations.

  17. New Parallel computing framework for radiation transport codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kostin, M.A.; /Michigan State U., NSCL; Mokhov, N.V.

    A new parallel computing framework has been developed for use with general-purpose radiation transport codes. The framework was implemented as a C++ module that uses MPI for message passing. The module is largely independent of the radiation transport codes it can be used with, and is connected to the codes by means of a number of interface functions. The framework was integrated with the MARS15 code, and an effort is under way to deploy it in PHITS. Besides the parallel computing functionality, the framework offers a checkpoint facility that allows restarting calculations from a saved checkpoint file. The checkpoint facility can be used in single-process calculations as well as in the parallel regime. Several checkpoint files can be merged into one, thus combining the results of several calculations. The framework also corrects some of the known problems with the scheduling and load balancing found in the original implementations of the parallel computing functionality in MARS15 and PHITS. The framework can be used efficiently on homogeneous systems and on networks of workstations, where interference from other users is possible.

  18. Intercomparison of Monte Carlo radiation transport codes to model TEPC response in low-energy neutron and gamma-ray fields.

    PubMed

    Ali, F; Waker, A J; Waller, E J

    2014-10-01

    Tissue-equivalent proportional counters (TEPC) can potentially be used as a portable and personal dosemeter in mixed neutron and gamma-ray fields, but what hinders this use is their typically large physical size. To formulate compact TEPC designs, the use of a Monte Carlo transport code is necessary to predict the performance of compact designs in these fields. To perform this modelling, three candidate codes were assessed: MCNPX 2.7.E, FLUKA 2011.2 and PHITS 2.24. In each code, benchmark simulations were performed involving the irradiation of a 5-in. TEPC with monoenergetic neutron fields and a 4-in. wall-less TEPC with monoenergetic gamma-ray fields. The frequency and dose mean lineal energies and dose distributions calculated from each code were compared with experimentally determined data. For the neutron benchmark simulations, PHITS produces data closest to the experimental values and for the gamma-ray benchmark simulations, FLUKA yields data closest to the experimentally determined quantities. © The Author 2013. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  19. Fluence-to-dose conversion coefficients for neutrons and protons calculated using the PHITS code and ICRP/ICRU adult reference computational phantoms.

    PubMed

    Sato, Tatsuhiko; Endo, Akira; Zankl, Maria; Petoussi-Henss, Nina; Niita, Koji

    2009-04-07

    The fluence to organ-dose and effective-dose conversion coefficients for neutrons and protons with energies up to 100 GeV were calculated using the PHITS code coupled to the male and female adult reference computational phantoms, which are to be released as a common ICRP/ICRU publication. For the calculation, the radiation and tissue weighting factors, w(R) and w(T), respectively, as revised in ICRP Publication 103, were employed. The conversion coefficients for effective dose equivalents derived using the radiation quality factors of both the Q(L) and Q(y) relationships were also estimated, utilizing the functions implemented in PHITS for calculating the probability densities of the absorbed dose in terms of LET (L) and lineal energy (y), respectively. By comparing these data with the corresponding data for the effective dose, we found that the numerical compatibility of the revised w(R) with the Q(L) and Q(y) relationships is fairly well established. The calculated dose conversion coefficients are indispensable for constructing radiation protection systems based on the new recommendations given in ICRP103 for aircrews and astronauts, as well as for workers at accelerators and nuclear facilities.

  20. International Infantry and Joint Services Small Arms Systems Symposium: Exhibition and Firing Demonstration

    DTIC Science & Technology

    2008-05-22

    [Fragmented slide text; recoverable points:] The hit probability (PHit) of a weapon system is influenced most by the operator, the distance to the target, and weapon-system dispersion. Ballistic correction is suitable for a variety of weapons where improved range performance and PHit/PKill are essential. Prototypes have been delivered to FMV (Swedish Defence Materiel Administration) and demonstrated for NATO in Toledo on 2007-02-15, achieving >65% PHit.

  1. PHIT for Duty, a Mobile Application for Stress Reduction, Sleep Improvement, and Alcohol Moderation.

    PubMed

    Kizakevich, Paul N; Eckhoff, Randall; Brown, Janice; Tueller, Stephen J; Weimer, Belinda; Bell, Stacey; Weeks, Adam; Hourani, Laurel L; Spira, James L; King, Laurel A

    2018-03-01

    Post-traumatic stress and other problems often occur after combat, deployment, and other military operations. Because techniques such as mindfulness meditation show efficacy in improving mental health, our team developed a mobile application (app) for individuals in the armed forces with subclinical psychological problems as secondary prevention of more significant disease. Based on the Personal Health Intervention Toolkit (PHIT), a mobile app framework for personalized health intervention studies, PHIT for Duty integrates mindfulness-based relaxation, behavioral education in sleep quality and alcohol use, and psychometric and psychophysiological data capture. We evaluated PHIT for Duty in usability and health assessment studies to establish app quality for use in health research. Participants (N = 31) rated usability on a 1 (very hard) to 5 (very easy) scale and also completed the System Usability Scale (SUS) questionnaire (N = 9). Results were (mean ± SD) overall (4.5 ± 0.6), self-report instruments (4.5 ± 0.7), pulse sensor (3.7 ± 1.2), sleep monitor (4.4 ± 0.7), sleep monitor comfort (3.7 ± 1.1), and wrist actigraphy comfort (2.7 ± 0.9). The average SUS score was 85 ± 12, corresponding to a percentile rank of 95%. A comparison of PHIT-based assessments to traditional paper forms demonstrated a high overall correlation (r = 0.87). These evaluations of usability, health assessment accuracy, physiological sensing, system acceptability, and overall functionality have shown positive results and affirmation for using the PHIT framework and PHIT for Duty application in mobile health research.
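
    The SUS score reported above follows the standard ten-item scoring rule: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is scaled by 2.5. A sketch with placeholder responses (not study data):

```python
# Standard System Usability Scale scoring for ten 1-5 Likert responses.
def sus_score(responses):
    assert len(responses) == 10
    total = sum((r - 1) if i % 2 == 0 else (5 - r)   # i=0 is item 1 (odd)
                for i, r in enumerate(responses))
    return total * 2.5                               # scale to 0-100

print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # best possible -> 100.0
```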

  2. Development of PARMA: PHITS-based analytical radiation model in the atmosphere.

    PubMed

    Sato, Tatsuhiko; Yasuda, Hiroshi; Niita, Koji; Endo, Akira; Sihver, Lembit

    2008-08-01

    Estimation of cosmic-ray spectra in the atmosphere has been essential for the evaluation of aviation doses. We therefore calculated these spectra by performing Monte Carlo simulation of cosmic-ray propagation in the atmosphere using the PHITS code. The accuracy of the simulation was well verified by experimental data taken under various conditions, even near sea level. Based on a comprehensive analysis of the simulation results, we proposed an analytical model for estimating the cosmic-ray spectra of neutrons, protons, helium ions, muons, electrons, positrons and photons applicable to any location in the atmosphere at altitudes below 20 km. Our model, named PARMA, enables us to calculate the cosmic radiation doses rapidly with a precision equivalent to that of the Monte Carlo simulation, which requires much more computational time. With these properties, PARMA is capable of improving the accuracy and efficiency of the cosmic-ray exposure dose estimations not only for aircrews but also for the public on the ground.

  3. Systematic measurement of double-differential neutron production cross sections for deuteron-induced reactions at an incident energy of 102 MeV

    NASA Astrophysics Data System (ADS)

    Araki, Shouhei; Watanabe, Yukinobu; Kitajima, Mizuki; Sadamatsu, Hiroki; Nakano, Keita; Kin, Tadahiro; Iwamoto, Yosuke; Satoh, Daiki; Hagiwara, Masayuki; Yashima, Hiroshi; Shima, Tatsushi

    2017-01-01

    Double-differential neutron production cross sections (DDXs) for deuteron-induced reactions on Li, Be, C, Al, Cu, and Nb at 102 MeV were measured at forward angles ≤25° by means of a time-of-flight (TOF) method with NE213 liquid organic scintillators at the Research Center for Nuclear Physics (RCNP), Osaka University. The experimental DDXs and energy-integrated cross sections were compared with TENDL-2015 data and with Particle and Heavy Ion Transport code System (PHITS) calculations using a combination of the KUROTAMA model, the Liege Intra-Nuclear Cascade model, and the generalized evaporation model. The PHITS calculations showed better agreement with the experimental results than TENDL-2015 for all target nuclei, although the shape of the broad peak around 50 MeV was not satisfactorily reproduced.

  4. Optimization of GATE and PHITS Monte Carlo code parameters for spot scanning proton beam based on simulation with FLUKA general-purpose code

    NASA Astrophysics Data System (ADS)

    Kurosu, Keita; Das, Indra J.; Moskvin, Vadim P.

    2016-01-01

    Spot scanning, owing to its superior dose-shaping capability, provides unsurpassed dose conformity, in particular for complex targets. However, the robustness of the delivered dose distribution and prescription has to be verified. Monte Carlo (MC) simulation has the potential to provide significant advantages for high-precision particle therapy, especially for media containing inhomogeneities. However, the choice of computational parameters in the MC simulation codes GATE, PHITS and FLUKA, previously examined for uniform scanning proton beams, needs to be evaluated for spot scanning; that is, the relationship between the input parameters and the calculation results should be carefully scrutinized. The objective of this study was, therefore, to determine the optimal parameters for the spot scanning proton beam for both the GATE and PHITS codes, using data from a FLUKA simulation as a reference. The proton beam scanning system of the Indiana University Health Proton Therapy Center was modeled in FLUKA, and the geometry was subsequently and identically transferred to GATE and PHITS. Although the beam transport is managed by the spot scanning system, the spot location is always set at the center of a water phantom of 600 × 600 × 300 mm3 placed after the treatment nozzle. The percentage depth dose (PDD) is computed along the central axis using 0.5 × 0.5 × 0.5 mm3 voxels in the water phantom. The PDDs and proton ranges obtained with several computational parameters are then compared to those of FLUKA, and optimal parameters are determined from the accuracy of the proton range, suppressed dose deviation, and minimization of computational time. Our results indicate that the optimized parameters differ from those for uniform scanning, suggesting that a gold standard for setting computational parameters cannot be determined consistently for all proton therapy applications, since the impact of the parameter settings depends on the proton irradiation technique. We therefore conclude that computational parameters must be customized with reference to the optimized parameters of the corresponding irradiation technique in order to achieve artifact-free MC simulation for use in computational experiments and clinical treatments.
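
    One common way to extract a proton range from a PDD curve like those compared above is the depth at which the distal edge falls to a chosen fraction of the Bragg-peak maximum; the 80% level and the toy curve below are illustrative assumptions, since the abstract does not state which definition was used:

```python
# Distal range from a sampled percentage depth dose (PDD) curve:
# find where the dose drops below `level` * peak on the distal side
# and interpolate linearly between the bracketing samples.
def distal_range(depths_mm, pdd, level=0.8):
    peak = max(pdd)
    i_peak = pdd.index(peak)
    for i in range(i_peak, len(pdd) - 1):
        if pdd[i + 1] < level * peak:
            d0, d1 = depths_mm[i], depths_mm[i + 1]
            p0, p1 = pdd[i], pdd[i + 1]
            return d0 + (level * peak - p0) * (d1 - d0) / (p1 - p0)
    return depths_mm[-1]

depths = [0, 50, 100, 150, 160, 170, 180]   # mm (toy grid)
pdd =    [30, 35, 45, 80, 100, 40, 5]       # toy Bragg curve
print(round(distal_range(depths, pdd), 1))  # -> 163.3
```

    Comparing such a range value across codes is far more sensitive to stopping-power settings (e.g. the 75 eV ionization potential noted below) than a pointwise dose comparison.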

  5. Continuous energy adjoint transport for photons in PHITS

    NASA Astrophysics Data System (ADS)

    Malins, Alex; Machida, Masahiko; Niita, Koji

    2017-09-01

    Adjoint Monte Carlo can be an efficient algorithm for solving photon transport problems where the size of the tally is relatively small compared to the source. Such problems are typical in environmental radioactivity calculations, where natural or fallout radionuclides spread over a large area contribute to the air dose rate at a particular location. Moreover, photon transport with a continuous energy representation is vital for accurately calculating radiation protection quantities. Here we describe the incorporation of an adjoint Monte Carlo capability for continuous energy photon transport into the Particle and Heavy Ion Transport code System (PHITS). An adjoint cross section library for photon interactions was developed based on the JENDL-4.0 library, by adding cross sections for adjoint incoherent scattering and pair production. PHITS reads in the library and implements the adjoint transport algorithm of Hoogenboom. Adjoint pseudo-photons are spawned within the forward tally volume and transported through space. Currently, pseudo-photons can undergo coherent and incoherent scattering within the PHITS adjoint function; photoelectric absorption is treated implicitly. The calculation result is recovered from the pseudo-photon flux calculated over the true source volume, and a new adjoint tally function facilitates this conversion. This paper gives an overview of the new function and discusses potential future developments.

  6. Benchmarking Heavy Ion Transport Codes FLUKA, HETC-HEDS MARS15, MCNPX, and PHITS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronningen, Reginald Martin; Remec, Igor; Heilbronn, Lawrence H.

    Powerful accelerators such as spallation neutron sources, muon-collider/neutrino facilities, and rare isotope beam facilities must be designed with the consideration that they handle the beam power reliably and safely, and they must be optimized to yield maximum performance relative to their design requirements. The simulation codes used for design purposes must produce reliable results; if not, component and facility designs can become costly, have limited lifetime and usefulness, and could even be unsafe. The objective of this proposal is to assess the performance of the currently available codes PHITS, FLUKA, MARS15, MCNPX, and HETC-HEDS that could be used for design simulations involving heavy ion transport. We plan to assess their performance by performing simulations and comparing results against experimental data of benchmark quality. Quantitative knowledge of the biases and uncertainties of the simulations is essential, as this potentially impacts the safe, reliable and cost-effective design of any future radioactive ion beam facility. Further benchmarking of heavy-ion transport codes was one of the actions recommended in the Report of the 2003 RIA R&D Workshop.

  7. Development of Parallel Computing Framework to Enhance Radiation Transport Code Capabilities for Rare Isotope Beam Facility Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kostin, Mikhail; Mokhov, Nikolai; Niita, Koji

    A parallel computing framework has been developed for use with general-purpose radiation transport codes. The framework was implemented as a C++ module that uses MPI for message passing. It is intended to be used with older radiation transport codes implemented in Fortran 77, Fortran 90 or C. The module is largely independent of the radiation transport codes it can be used with, and is connected to the codes by means of a number of interface functions. The framework was developed and tested in conjunction with the MARS15 code. It is possible to use it with other codes such as PHITS, FLUKA and MCNP after certain adjustments. Besides the parallel computing functionality, the framework offers a checkpoint facility that allows restarting calculations from a saved checkpoint file. The checkpoint facility can be used in single-process calculations as well as in the parallel regime. The framework corrects some of the known problems with the scheduling and load balancing found in the original implementations of the parallel computing functionality in MARS15 and PHITS. The framework can be used efficiently on homogeneous systems and on networks of workstations, where interference from other users is possible.

  8. Measurement of Thick Target Neutron Yields at 0-Degree Bombarded With 140-MeV, 250-MeV And 350-MeV Protons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iwamoto, Yosuke; /JAERI, Kyoto; Taniguchi, Shingo

    Neutron energy spectra at 0° produced from stopping-length graphite, aluminum, iron and lead targets bombarded with 140, 250 and 350 MeV protons were measured at the neutron TOF course at RCNP, Osaka University. The neutron energy spectra were obtained by using the time-of-flight technique in the energy range from 10 MeV up to the incident proton energy. For comparison with the experimental results, Monte Carlo calculations with the PHITS and MCNPX codes were performed using the JENDL-HE and LA150 evaluated nuclear data files, the ISOBAR model implemented in PHITS, and the LAHET code in MCNPX. It was found that these calculated results at 0° generally agreed with the experimental results in the energy range above 20 MeV, except for graphite at 250 and 350 MeV.

  9. Reevaluation of secondary neutron spectra from thick targets upon heavy-ion bombardment

    NASA Astrophysics Data System (ADS)

    Satoh, D.; Kurosawa, T.; Sato, T.; Endo, A.; Takada, M.; Iwase, H.; Nakamura, T.; Niita, K.

    2007-12-01

    Previously published data on secondary neutron spectra from thick targets of C, Al, Cu and Pb bombarded with heavy ions from He to Xe are revised using a new set of neutron-detection efficiency values for a liquid organic scintillator calculated with SCINFUL-QMD. Additional data have been measured for bombardment of a C target by 400-MeV/nucleon C ions and 800-MeV/nucleon Si ions. The set of spectra is compared with calculation results from a Monte Carlo heavy-ion transport code, PHITS. It was found that PHITS is able to reproduce the secondary neutron spectra over a wide neutron-energy regime.

  10. Fluence-to-dose conversion coefficients for heavy ions calculated using the PHITS code and the ICRP/ICRU adult reference computational phantoms.

    PubMed

    Sato, Tatsuhiko; Endo, Akira; Niita, Koji

    2010-04-21

    The fluence to organ-absorbed-dose and effective-dose conversion coefficients for heavy ions with atomic numbers up to 28 and energies from 1 MeV/nucleon to 100 GeV/nucleon were calculated using the PHITS code coupled to the ICRP/ICRU adult reference computational phantoms, following the instruction given in ICRP Publication 103 (2007 (Oxford: Pergamon)). The conversion coefficients for effective dose equivalents derived using the radiation quality factors of both Q(L) and Q(y) relationships were also estimated, utilizing the functions for calculating the probability densities of absorbed dose in terms of LET (L) and lineal energy (y), respectively, implemented in PHITS. The calculation results indicate that the effective dose can generally give a conservative estimation of the effective dose equivalent for heavy-ion exposure, although it is occasionally too conservative especially for high-energy lighter-ion irradiations. It is also found from the calculation that the conversion coefficients for the Q(y)-based effective dose equivalents are generally smaller than the corresponding Q(L)-based values because of the conceptual difference between LET and y as well as the numerical incompatibility between the Q(L) and Q(y) relationships. The calculated data of these dose conversion coefficients are very useful for the dose estimation of astronauts due to cosmic-ray exposure.

  11. Inter-comparison of Dose Distributions Calculated by FLUKA, GEANT4, MCNP, and PHITS for Proton Therapy

    NASA Astrophysics Data System (ADS)

    Yang, Zi-Yi; Tsai, Pi-En; Lee, Shao-Chun; Liu, Yen-Chiang; Chen, Chin-Cheng; Sato, Tatsuhiko; Sheu, Rong-Jiun

    2017-09-01

    The dose distributions from proton pencil beam scanning were calculated with FLUKA, GEANT4, MCNP, and PHITS, in order to investigate their applicability to proton radiotherapy. The first case studied was the integrated depth dose curves (IDDCs) from 100 and 226-MeV proton pencil beams impinging on a water phantom. The calculated IDDCs agree with each other as long as each code employs 75 eV for the ionization potential of water. The second case considered similar conditions to the first, but with proton energies in a Gaussian distribution. Comparison with measurement indicates that the inter-code differences might be due not only to different stopping powers but also to different nuclear physics models. How the physics parameter settings affect the computation time was also discussed. In the third case, the applicability of each code to pencil beam scanning was confirmed by delivering a uniform volumetric dose distribution based on the treatment plan; the results showed general agreement among the codes, the treatment plan, and the measurement, except for some deviations in the penumbra region. This study has demonstrated that the selected codes are all capable of performing dose calculations for therapeutic scanning proton beams with proper physics settings.
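    The sensitivity to the 75 eV ionization potential noted above enters through the logarithmic term of the Bethe stopping-power formula. A simplified sketch for protons in water (shell and density corrections are omitted, so the numbers are only indicative):

```python
import math

def bethe_stopping_power(T_MeV, I_eV):
    """Mass stopping power (MeV cm^2/g) of water for protons from the bare
    Bethe formula. Shell/density corrections omitted: a sketch, not a reference."""
    me = 0.510999       # electron rest energy, MeV
    Mp = 938.272        # proton rest energy, MeV
    K = 0.307075        # 4*pi*N_A*r_e^2*m_e*c^2, MeV cm^2/mol
    ZA = 0.5551         # Z/A of water
    gamma = 1.0 + T_MeV / Mp
    beta2 = 1.0 - 1.0 / gamma**2
    # Maximum energy transfer to an electron in one collision
    Tmax = 2.0 * me * beta2 * gamma**2 / (1.0 + 2.0 * gamma * me / Mp + (me / Mp)**2)
    I = I_eV * 1.0e-6   # mean excitation energy, MeV
    arg = 2.0 * me * beta2 * gamma**2 * Tmax / I**2
    return K * ZA / beta2 * (0.5 * math.log(arg) - beta2)

s75 = bethe_stopping_power(100.0, 75.0)   # I = 75 eV, as in the inter-comparison
s78 = bethe_stopping_power(100.0, 78.0)   # a slightly different I-value
```

For 100 MeV protons this yields roughly 7.3 MeV cm²/g, and shifting I by a few eV changes the stopping power at the half-percent level, which is enough to visibly shift the distal edge of a calculated Bragg peak.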

  12. Overview of Recent Radiation Transport Code Comparisons for Space Applications

    NASA Astrophysics Data System (ADS)

    Townsend, Lawrence

    Recent advances in radiation transport code development for space applications have resulted in various comparisons of code predictions for a variety of scenarios and codes. Comparisons among both Monte Carlo and deterministic codes have been made and published by various groups and collaborations, including comparisons involving, but not limited to, HZETRN, HETC-HEDS, FLUKA, GEANT, PHITS, and MCNPX. In this work, an overview of recent code prediction inter-comparisons, including comparisons to available experimental data, is presented and discussed, with emphasis on the areas of agreement and disagreement among the various code predictions and published data.

  13. Consideration of the Protection Curtain's Shielding Ability after Identifying the Source of Scattered Radiation in the Angiography.

    PubMed

    Sato, Naoki; Fujibuchi, Toshioh; Toyoda, Takatoshi; Ishida, Takato; Ohura, Hiroki; Miyajima, Ryuichi; Orita, Shinichi; Sueyoshi, Tomonari

    2017-06-15

    To decrease radiation exposure to medical staff performing angiography, the in-room dose distribution during angiography was calculated using the Particle and Heavy Ion Transport code System (PHITS), a Monte Carlo code, and the source of scattered radiation was confirmed using a tungsten sheet by comparing the shielding performance among different sheet placements. Scattered radiation generated from the flat panel detector, X-ray tube and bed was calculated with PHITS. In this experiment, the source of scattered radiation was identified as the phantom or the acrylic window attached to the X-ray tube; thus, a protection curtain was placed on the bed to shield against scattered radiation at low positions. There was an average difference of 20% between the measured and calculated values. The H*(10) value decreased after placing the sheet on the right side of the phantom. Thus, the curtain can decrease scattered radiation. © Crown copyright 2016.

  14. Validation of Heavy Ion Transport Capabilities in PHITS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronningen, Reginald M.

    The performance of the Monte Carlo code system PHITS is validated for heavy ion transport capabilities by performing simulations and comparing results against experimental data from heavy ion reactions of benchmark quality. These data are from measurements of secondary neutron production cross sections in reactions of Xe at 400 MeV/u with lithium and lead targets, measurements of neutrons outside of thick concrete and iron shields, and measurements of isotope yields produced in the fragmentation of a 140 MeV/u 48Ca beam on a beryllium target and on a tantalum target. A practical example that tests magnetic field capabilities is shown for a simulated 48Ca beam at 500 MeV/u striking a lithium target to produce the rare isotope 44Si, with ion transport through a fragmentation-reaction magnetic pre-separator. The results of this study show that PHITS performs reliably in the simulation of radiation fields, which is necessary for designing safe, reliable and cost-effective future high-powered heavy-ion accelerators and rare isotope beam facilities.

  15. Simulation of the ALTEA experiment with Monte Carlo (PHITS) and deterministic (GNAC, SihverCC and Tripathi97) codes

    NASA Astrophysics Data System (ADS)

    La Tessa, Chiara; Mancusi, Davide; Rinaldi, Adele; di Fino, Luca; Zaconte, Veronica; Larosa, Marianna; Narici, Livio; Gustafsson, Katarina; Sihver, Lembit

    ALTEA-Space is the principal in-space experiment of an international and multidisciplinary project called ALTEA (Anomalous Long Term Effects on Astronauts). The measurements were performed on the International Space Station between August 2006 and July 2007 and aimed at characterising the space radiation environment inside the station. The analysis of the collected data provided the abundances of elements with charge 5 ≤ Z ≤ 26 and energy above 100 MeV/nucleon. The same results have been obtained by simulating the experiment with the three-dimensional Monte Carlo code PHITS (Particle and Heavy Ion Transport code System). The simulation accurately reproduces the composition of the space radiation environment as well as the geometry of the experimental apparatus; moreover, the presence of several materials that surround the device, e.g. the spacecraft hull and the shielding, has been taken into account. An estimate of the abundances has also been calculated with the help of experimental fragmentation cross sections taken from the literature and predictions of the deterministic codes GNAC, SihverCC and Tripathi97. The comparison between the experimental and simulated data has two important aspects: it validates the codes, giving possible hints on how to benchmark them, and it helps to interpret the measurements and thereby better understand the results.

  16. Optimization of GATE and PHITS Monte Carlo code parameters for uniform scanning proton beam based on simulation with FLUKA general-purpose code

    NASA Astrophysics Data System (ADS)

    Kurosu, Keita; Takashina, Masaaki; Koizumi, Masahiko; Das, Indra J.; Moskvin, Vadim P.

    2014-10-01

    Although the three general-purpose Monte Carlo (MC) simulation tools Geant4, FLUKA and PHITS have been used extensively, differences in their calculation results have been reported. The major causes are the implementation of the physical models, the preset value of the ionization potential, and the definition of the maximum step size. In order to achieve artifact-free MC simulation, an optimized parameter list for each simulation system is required. Several authors have already proposed optimized lists, but those studies were performed with simple systems, such as a bare water phantom. Since particle beams undergo transport, interaction and electromagnetic processes during beam delivery, establishing an optimized parameter list for the whole beam delivery system is of major importance. The purpose of this study was to determine the optimized parameter lists for GATE and PHITS using a computational model of a proton treatment nozzle. The simulation was performed with a broad scanning proton beam. The influence of the customizable parameters on the percentage depth dose (PDD) profile and the proton range was investigated by comparison with the FLUKA results, and the optimal parameters were then determined. The PDD profile and proton range obtained from our optimized parameter lists showed different characteristics from the results obtained with the simple system. This led to the conclusion that the physical models, particle transport mechanics and different geometry-based descriptions need accurate customization when planning computational experiments for artifact-free MC simulation.
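    One of the compared quantities, the proton range, is commonly reduced to a scalar such as the depth of the distal 80% dose point (R80). A sketch of extracting it from a sampled PDD by linear interpolation (the curve below is a toy array, not a simulated PDD):

```python
import numpy as np

def distal_r80(depth_cm, pdd):
    """Depth (cm) at which the dose falls to 80% of maximum on the distal
    side of the Bragg peak, by linear interpolation between samples."""
    pdd = np.asarray(pdd, dtype=float)
    i_peak = int(np.argmax(pdd))
    target = 0.8 * pdd[i_peak]
    # Walk down the distal edge until the dose crosses the 80% level.
    for i in range(i_peak, len(pdd) - 1):
        if pdd[i] >= target >= pdd[i + 1]:
            frac = (pdd[i] - target) / (pdd[i] - pdd[i + 1])
            return float(depth_cm[i] + frac * (depth_cm[i + 1] - depth_cm[i]))
    raise ValueError("no distal 80% crossing found")

# Toy Bragg-like curve (illustrative numbers only)
depth = np.array([0.0, 2.0, 4.0, 6.0, 7.0, 7.5, 7.8, 8.0])
dose  = np.array([30., 33., 38., 55., 90., 100., 40., 5.])
r80 = distal_r80(depth, dose)
```

Comparing such scalars across codes makes small range shifts, e.g. from a different ionization potential, immediately visible.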

  17. An Investigation of the Relationship Between Automated Machine Translation Evaluation Metrics and User Performance on an Information Extraction Task

    DTIC Science & Technology

    2007-01-01

    parameter dimension between the two models). 93 were tested. Model 1: log(pHits/(1 − pHits)) = α + β1 × MetricScore (6.6) The results for each of the… 505.67 oTERavg .357 .13 .007 log(pHits/(1 − pHits)), that is, the log-odds of correct task performance, of 2.79 over the intercept-only model. All… log(pHits/(1 − pHits)) = −1.15 − .418 × I[MT=2] − .527 × I[MT=3] + 1.78 × METEOR + 1.28 × METEOR × I[MT=2] + 1.86 × METEOR × I[MT=3] (6.7) Model 3: log(pHits/(1 − pHits))
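    The snippet above comes from a report fitting logistic regressions of task success (pHits) on machine-translation metric scores. Model 1, log(p/(1 − p)) = α + β1 × MetricScore, can be reproduced in miniature on synthetic data; the coefficients and data below are invented, not the report's:

```python
import numpy as np

# Synthetic data: task success probability driven by a metric score
# through a logistic link, mimicking the report's "Model 1".
rng = np.random.default_rng(0)
true_alpha, true_beta = -1.0, 3.0
score = rng.uniform(0.0, 1.0, 2000)                    # e.g. METEOR-like scores
p = 1.0 / (1.0 + np.exp(-(true_alpha + true_beta * score)))
hit = rng.uniform(size=2000) < p                       # correct-extraction outcomes

# Fit alpha, beta by gradient ascent on the mean log-likelihood.
alpha, beta = 0.0, 0.0
lr = 0.5
for _ in range(5000):
    pred = 1.0 / (1.0 + np.exp(-(alpha + beta * score)))
    alpha += lr * np.mean(hit - pred)                  # d(loglik)/d(alpha)
    beta += lr * np.mean((hit - pred) * score)         # d(loglik)/d(beta)
```

The fitted (alpha, beta) recover the generating values up to sampling noise, which is all the report's Model 1 asserts structurally.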

  18. A Platform to Build Mobile Health Apps: The Personal Health Intervention Toolkit (PHIT).

    PubMed

    Eckhoff, Randall Peter; Kizakevich, Paul Nicholas; Bakalov, Vesselina; Zhang, Yuying; Bryant, Stephanie Patrice; Hobbs, Maria Ann

    2015-06-01

    Personal Health Intervention Toolkit (PHIT) is an advanced cross-platform software framework targeted at personal self-help research on mobile devices. Following the subjective and objective measurement, assessment, and plan methodology for health assessment and intervention recommendations, the PHIT platform lets researchers quickly build mobile health research Android and iOS apps. They can (1) create complex data-collection instruments using a simple extensible markup language (XML) schema; (2) use Bluetooth wireless sensors; (3) create targeted self-help interventions based on collected data via XML-coded logic; (4) facilitate cross-study reuse from the library of existing instruments and interventions such as stress, anxiety, sleep quality, and substance abuse; and (5) monitor longitudinal intervention studies via daily upload to a Web-based dashboard portal. For physiological data, Bluetooth sensors collect real-time data with on-device processing. For example, using the BinarHeartSensor, the PHIT platform processes the heart rate data into heart rate variability measures, and plots these data as time-series waveforms. Subjective data instruments are user data-entry screens, comprising a series of forms with validation and processing logic. The PHIT instrument library consists of over 70 reusable instruments for various domains including cognitive, environmental, psychiatric, psychosocial, and substance abuse. Many are standardized instruments, such as the Alcohol Use Disorder Identification Test, Patient Health Questionnaire-8, and Post-Traumatic Stress Disorder Checklist. Autonomous instruments such as battery and global positioning system location support continuous background data collection. All data are acquired using a schedule appropriate to the app's deployment. The PHIT intelligent virtual advisor (iVA) is an expert system logic layer, which analyzes the data in real time on the device. 
This data analysis results in a tailored app of interventions and other data-collection instruments. For example, if a user anxiety score exceeds a threshold, the iVA might add a meditation intervention to the task list in order to teach the user how to relax, and schedule a reassessment using the anxiety instrument 2 weeks later to re-evaluate. If the anxiety score exceeds a higher threshold, then an advisory to seek professional help would be displayed. Using the easy-to-use PHIT scripting language, the researcher can program new instruments, the iVA, and interventions to their domain-specific needs. The iVA, instruments, and interventions are defined via XML files, which facilitates rapid app development and deployment. The PHIT Web-based dashboard portal provides the researcher access to all the uploaded data. After a secure login, the data can be filtered by criteria such as study, protocol, domain, and user. Data can also be exported into a comma-delimited file for further processing. The PHIT framework has proven to be an extensible, reconfigurable technology that facilitates mobile data collection and health intervention research. Additional plans include instrument development in other domains, additional health sensors, and a text messaging notification system.
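    The threshold-driven iVA behaviour described above can be sketched as a simple rule engine. All names and cut-offs below are invented for illustration; the real PHIT logic is authored in XML, not Python:

```python
# Hypothetical sketch of iVA-style threshold logic. The thresholds, task
# tuples and function name are illustrative, not the actual PHIT API.

MEDITATION_THRESHOLD = 10   # illustrative anxiety-score cut-offs
REFERRAL_THRESHOLD = 16

def advise(anxiety_score):
    """Return (kind, what, days_from_now) tasks a rule engine might emit
    for a single anxiety assessment."""
    tasks = []
    if anxiety_score > REFERRAL_THRESHOLD:
        # Highest tier: display an advisory to seek professional help.
        tasks.append(("advisory", "seek professional help", 0))
    elif anxiety_score > MEDITATION_THRESHOLD:
        # Middle tier: add an intervention and schedule a reassessment.
        tasks.append(("intervention", "meditation", 0))
        tasks.append(("reassessment", "anxiety instrument", 14))
    return tasks
```

A score of 12 would yield a meditation task plus a reassessment in two weeks; a score of 18 yields only the advisory.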

  19. A Platform to Build Mobile Health Apps: The Personal Health Intervention Toolkit (PHIT)

    PubMed Central

    2015-01-01

    Personal Health Intervention Toolkit (PHIT) is an advanced cross-platform software framework targeted at personal self-help research on mobile devices. Following the subjective and objective measurement, assessment, and plan methodology for health assessment and intervention recommendations, the PHIT platform lets researchers quickly build mobile health research Android and iOS apps. They can (1) create complex data-collection instruments using a simple extensible markup language (XML) schema; (2) use Bluetooth wireless sensors; (3) create targeted self-help interventions based on collected data via XML-coded logic; (4) facilitate cross-study reuse from the library of existing instruments and interventions such as stress, anxiety, sleep quality, and substance abuse; and (5) monitor longitudinal intervention studies via daily upload to a Web-based dashboard portal. For physiological data, Bluetooth sensors collect real-time data with on-device processing. For example, using the BinarHeartSensor, the PHIT platform processes the heart rate data into heart rate variability measures, and plots these data as time-series waveforms. Subjective data instruments are user data-entry screens, comprising a series of forms with validation and processing logic. The PHIT instrument library consists of over 70 reusable instruments for various domains including cognitive, environmental, psychiatric, psychosocial, and substance abuse. Many are standardized instruments, such as the Alcohol Use Disorder Identification Test, Patient Health Questionnaire-8, and Post-Traumatic Stress Disorder Checklist. Autonomous instruments such as battery and global positioning system location support continuous background data collection. All data are acquired using a schedule appropriate to the app’s deployment. The PHIT intelligent virtual advisor (iVA) is an expert system logic layer, which analyzes the data in real time on the device. 
This data analysis results in a tailored app of interventions and other data-collection instruments. For example, if a user anxiety score exceeds a threshold, the iVA might add a meditation intervention to the task list in order to teach the user how to relax, and schedule a reassessment using the anxiety instrument 2 weeks later to re-evaluate. If the anxiety score exceeds a higher threshold, then an advisory to seek professional help would be displayed. Using the easy-to-use PHIT scripting language, the researcher can program new instruments, the iVA, and interventions to their domain-specific needs. The iVA, instruments, and interventions are defined via XML files, which facilitates rapid app development and deployment. The PHIT Web-based dashboard portal provides the researcher access to all the uploaded data. After a secure login, the data can be filtered by criteria such as study, protocol, domain, and user. Data can also be exported into a comma-delimited file for further processing. The PHIT framework has proven to be an extensible, reconfigurable technology that facilitates mobile data collection and health intervention research. Additional plans include instrument development in other domains, additional health sensors, and a text messaging notification system. PMID:26033047

  20. Comprehensive and integrated district health systems strengthening: the Rwanda Population Health Implementation and Training (PHIT) Partnership

    PubMed Central

    2013-01-01

    Background: Nationally, health in Rwanda has been improving since 2000, with considerable improvement since 2005. Despite improvements, rural areas continue to lag behind urban sectors with regard to key health outcomes. Partners In Health (PIH) has been supporting the Rwanda Ministry of Health (MOH) in two rural districts in Rwanda since 2005. Since 2009, the MOH and PIH have spearheaded a health systems strengthening (HSS) intervention in these districts as part of the Rwanda Population Health Implementation and Training (PHIT) Partnership. The partnership is guided by the belief that HSS interventions should be comprehensive, integrated, responsive to local conditions, and address health care access, cost, and quality. The PHIT Partnership represents a collaboration between the MOH and PIH, with support from the National University of Rwanda School of Public Health, the National Institute of Statistics, Harvard Medical School, and Brigham and Women’s Hospital. Description of intervention: The PHIT Partnership’s health systems support aligns with the World Health Organization’s six health systems building blocks. HSS activities focus across all levels of the health system — community, health center, hospital, and district leadership — to improve health care access, quality, delivery, and health outcomes. Interventions are concentrated on three main areas: targeted support for health facilities, quality improvement initiatives, and a strengthened network of community health workers. Evaluation design: The impact of activities will be assessed using population-level outcomes data collected through oversampling of the demographic and health survey (DHS) in the intervention districts. The overall impact evaluation is complemented by an analysis of trends in facility health care utilization. A comprehensive costing project captures the total expenditures and financial inputs of the health care system to determine the cost of systems improvement. 
Targeted evaluations and operational research pieces focus on specific programmatic components, supported by partnership-supported work to build in-country research capacity. Discussion: Building on early successes, the Rwanda PHIT Partnership’s approach to HSS has already produced noticeable increases in facility capacity and quality of care. The rigorous planned evaluation of the Partnership’s HSS activities will contribute to global knowledge about intervention methodology, cost, and population health impact. PMID:23819573

  1. Comprehensive and integrated district health systems strengthening: the Rwanda Population Health Implementation and Training (PHIT) Partnership.

    PubMed

    Drobac, Peter C; Basinga, Paulin; Condo, Jeanine; Farmer, Paul E; Finnegan, Karen E; Hamon, Jessie K; Amoroso, Cheryl; Hirschhorn, Lisa R; Kakoma, Jean Baptise; Lu, Chunling; Murangwa, Yusuf; Murray, Megan; Ngabo, Fidele; Rich, Michael; Thomson, Dana; Binagwaho, Agnes

    2013-01-01

    Nationally, health in Rwanda has been improving since 2000, with considerable improvement since 2005. Despite improvements, rural areas continue to lag behind urban sectors with regard to key health outcomes. Partners In Health (PIH) has been supporting the Rwanda Ministry of Health (MOH) in two rural districts in Rwanda since 2005. Since 2009, the MOH and PIH have spearheaded a health systems strengthening (HSS) intervention in these districts as part of the Rwanda Population Health Implementation and Training (PHIT) Partnership. The partnership is guided by the belief that HSS interventions should be comprehensive, integrated, responsive to local conditions, and address health care access, cost, and quality. The PHIT Partnership represents a collaboration between the MOH and PIH, with support from the National University of Rwanda School of Public Health, the National Institute of Statistics, Harvard Medical School, and Brigham and Women's Hospital. The PHIT Partnership's health systems support aligns with the World Health Organization's six health systems building blocks. HSS activities focus across all levels of the health system - community, health center, hospital, and district leadership - to improve health care access, quality, delivery, and health outcomes. Interventions are concentrated on three main areas: targeted support for health facilities, quality improvement initiatives, and a strengthened network of community health workers. The impact of activities will be assessed using population-level outcomes data collected through oversampling of the demographic and health survey (DHS) in the intervention districts. The overall impact evaluation is complemented by an analysis of trends in facility health care utilization. A comprehensive costing project captures the total expenditures and financial inputs of the health care system to determine the cost of systems improvement. 
Targeted evaluations and operational research pieces focus on specific programmatic components, supported by partnership-supported work to build in-country research capacity. Building on early successes, the Rwanda PHIT Partnership's approach to HSS has already produced noticeable increases in facility capacity and quality of care. The rigorous planned evaluation of the Partnership's HSS activities will contribute to global knowledge about intervention methodology, cost, and population health impact.

  2. Application of JAERI quantum molecular dynamics model for collisions of heavy nuclei

    NASA Astrophysics Data System (ADS)

    Ogawa, Tatsuhiko; Hashimoto, Shintaro; Sato, Tatsuhiko; Niita, Koji

    2016-06-01

    The quantum molecular dynamics (QMD) model incorporated into the general-purpose radiation transport code PHITS was revised for accurate prediction of fragment yields in peripheral collisions. For more accurate simulation of peripheral collisions, the stability of the nuclei in their ground state was improved and the algorithm for rejecting invalid events was modified. In-medium corrections to the nucleon-nucleon cross sections were also considered. To clarify the effect of these improvements on the fragmentation of heavy nuclei, the new QMD model coupled with a statistical decay model was used to calculate fragment production cross sections for Ag and Au targets, which were compared with data from earlier measurements. It is shown that the revised version predicts the cross sections more accurately.

  3. Comparison of simulations with PHITS and HIBRAC with experimental data in the context of particle therapy monitoring

    PubMed Central

    Rohling, Heide; Sihver, Lembit; Priegnitz, Marlen; Enghardt, Wolfgang; Fiedler, Fine

    2014-01-01

    Therapeutic irradiation with protons and ions is advantageous over radiotherapy with photons due to its favorable dose deposition. Additionally, ion beams provide a higher relative biological effectiveness than photons. For this reason, an improved treatment of deep-seated tumors is achieved and normal tissue is spared. However, small deviations from the treatment plan can have a large impact on the dose distribution. Therefore, monitoring is required to assure the quality of the treatment. Particle therapy positron emission tomography (PT-PET) is the only clinically proven method that provides non-invasive monitoring of dose delivery. It makes use of the β+-activity produced by nuclear fragmentation during irradiation. In order to evaluate these PT-PET measurements, simulations of the β+-activity are necessary. It is therefore essential to know the yields of the β+-emitting nuclides at every position of the beam path as exactly as possible. We evaluated the three-dimensional Monte-Carlo simulation tool PHITS (version 2.30) [1] and the 1D deterministic simulation tool HIBRAC [2] with respect to the production of β+-emitting nuclides. The yields of the most important β+-emitting nuclides for carbon, lithium, helium and proton beams have been calculated. The results were then compared with experimental data obtained at GSI Helmholtzzentrum für Schwerionenforschung, Darmstadt, Germany. GEANT4 simulations provide an additional benchmark [3]. For PHITS, the impact of different nuclear reaction models, total cross-section models and evaporation models on the β+-emitter production has been studied. In general, PHITS underestimates the yields of positron emitters and cannot yet compete with GEANT4. The β+-emitters calculated with an extended HIBRAC code were in good agreement with the experimental data for carbon and proton beams and comparable to the GEANT4 results, see [4] and Fig. 1. 
Considering the simulation results and its speed compared with three-dimensional Monte-Carlo tools, HIBRAC is a good candidate for implementation in clinical routine PT-PET. Fig. 1. Depth-dependent yields of 11C and 15O produced during irradiation of a PMMA target with 140 MeV protons [4].
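    The β+-activity that PT-PET measures decays with the half-lives of the produced nuclides (about 20.4 min for 11C and 122 s for 15O). A sketch of the post-irradiation activity from these two emitters, with illustrative inventories:

```python
import math

# Post-irradiation beta+ activity. Half-lives are standard values;
# the nuclide inventories N0 at beam-off are illustrative only.
HALF_LIFE_S = {"C11": 20.36 * 60.0, "O15": 122.2}

def activity_bq(N0, nuclide, t_s):
    """Activity A(t) = lambda * N0 * exp(-lambda * t) for one nuclide."""
    lam = math.log(2.0) / HALF_LIFE_S[nuclide]
    return lam * N0 * math.exp(-lam * t_s)

N0 = {"C11": 1.0e8, "O15": 2.0e7}   # nuclei at end of irradiation (illustrative)
total_at_0 = sum(activity_bq(N0[n], n, 0.0) for n in N0)
total_at_600 = sum(activity_bq(N0[n], n, 600.0) for n in N0)
```

Ten minutes after beam-off the short-lived 15O contribution has largely vanished, which is why the timing of the PET acquisition shapes which yields matter most.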

  4. The Accuracy of Tank Main Armaments.

    DTIC Science & Technology

    1987-04-07

    width (m) 1.4, 3.2 hull height, width (m) 0.5, 1.0043, 1.1233, 0.357, 0.0, … The program produces the following hit probabilities: a) Phit = 0.52 for… hull defilade b) Phit = 0.74 for fully exposed c) Phit = 0.94 for the standard NATO target. The calculation of subsequent round hit probabilities is a more… hit probabilities: a) Phit = 0.66 for hull defilade b) Phit = 0.86 for fully exposed c) Phit = 0.98 for the standard NATO target. Moving Firer Versus

  5. New Approach for Nuclear Reaction Model in the Combination of Intra-nuclear Cascade and DWBA

    NASA Astrophysics Data System (ADS)

    Hashimoto, S.; Iwamoto, O.; Iwamoto, Y.; Sato, T.; Niita, K.

    2014-04-01

    We applied a new nuclear reaction model, a combination of the intra-nuclear cascade model and the distorted wave Born approximation (DWBA), to estimate neutron spectra in reactions induced by protons incident on 7Li and 9Be targets at incident energies below 50 MeV, using the Particle and Heavy Ion Transport code System (PHITS). The results obtained by PHITS with the new model reproduce the sharp peaks observed in the experimental double-differential cross sections as a result of taking into account transitions between discrete nuclear states in the DWBA. An excellent agreement was observed between the calculated results obtained using the combination model and experimental data on neutron yields from thick targets in the inclusive (p, xn) reaction.

  6. Radiation transport simulation of the Martian GCR surface flux and dose estimation using spherical geometry in PHITS compared to MSL-RAD measurements

    NASA Astrophysics Data System (ADS)

    Flores-McLaughlin, John

    2017-08-01

    Planetary bodies and spacecraft are predominantly exposed to isotropic radiation environments that are subject to transport and interaction in various material compositions and geometries. Specifically, the Martian surface radiation environment is composed of galactic cosmic radiation, secondary particles produced by their interaction with the Martian atmosphere, albedo particles from the Martian regolith and occasional solar particle events. Despite this complex physical environment with potentially significant locational and geometric dependencies, computational resources often limit radiation environment calculations to a one-dimensional or slab geometry specification. To better account for Martian geometry, spherical volumes with respective Martian material densities are adopted in this model. This physical description is modeled with the PHITS radiation transport code and compared to a portion of measurements from the Radiation Assessment Detector of the Mars Science Laboratory. Particle spectra measured between 15 November 2015 and 15 January 2016 and PHITS model results calculated for this time period are compared. Results indicate good agreement between simulated dose rates, proton, neutron and gamma spectra. This work was originally presented at the 1st Mars Space Radiation Modeling Workshop held in 2016 in Boulder, CO.
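    The difference between the slab and spherical descriptions discussed above is geometric: for a detector on the surface, the slant path through a spherical shell grows more slowly with zenith angle than the slab estimate h/cos(θ). A sketch with round illustrative numbers:

```python
import math

# Slant path to the top of a spherical atmospheric shell versus the
# slab ("flat-planet") approximation, for a detector on the surface.
# R is an approximate Mars radius; h is an illustrative shell thickness.

R = 3390.0   # surface radius, km (approximate Mars value)
h = 60.0     # shell thickness, km (illustrative)

def slant_spherical(theta_deg):
    """Path length (km) through the shell at zenith angle theta, from
    intersecting the line of sight with the sphere of radius R + h."""
    th = math.radians(theta_deg)
    return math.sqrt((R + h)**2 - (R * math.sin(th))**2) - R * math.cos(th)

def slant_slab(theta_deg):
    """Slab approximation: vertical thickness divided by cos(theta)."""
    return h / math.cos(math.radians(theta_deg))
```

At the zenith both agree (path = h), but near the horizon the slab estimate diverges while the spherical path stays finite, changing the atmospheric depth traversed by oblique GCR arrival directions.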

  7. Radiation transport simulation of the Martian GCR surface flux and dose estimation using spherical geometry in PHITS compared to MSL-RAD measurements.

    PubMed

    Flores-McLaughlin, John

    2017-08-01

    Planetary bodies and spacecraft are predominantly exposed to isotropic radiation environments that are subject to transport and interaction in various material compositions and geometries. Specifically, the Martian surface radiation environment is composed of galactic cosmic radiation, secondary particles produced by their interaction with the Martian atmosphere, albedo particles from the Martian regolith and occasional solar particle events. Despite this complex physical environment with potentially significant locational and geometric dependencies, computational resources often limit radiation environment calculations to a one-dimensional or slab geometry specification. To better account for Martian geometry, spherical volumes with respective Martian material densities are adopted in this model. This physical description is modeled with the PHITS radiation transport code and compared to a portion of measurements from the Radiation Assessment Detector of the Mars Science Laboratory. Particle spectra measured between 15 November 2015 and 15 January 2016 and PHITS model results calculated for this time period are compared. Results indicate good agreement between simulated dose rates, proton, neutron and gamma spectra. This work was originally presented at the 1st Mars Space Radiation Modeling Workshop held in 2016 in Boulder, CO. Copyright © 2017. Published by Elsevier Ltd.

  8. Neutron Productions from thin Be target irradiated by 50 MeV/u 238U beam

    NASA Astrophysics Data System (ADS)

    Lee, Hee-Seock; Oh, Joo-Hee; Jung, Nam-Suk; Oranj, Leila Mokhtari; Nakao, Noriaki; Uwamino, Yoshitomo

    2017-09-01

    Neutrons generated from a thin beryllium target by a 50 MeV/u 238U beam were measured using activation analysis at 15, 30, 45, and 90 degrees from the beam direction. A 0.085 mm-thick Be stripper of RIBF was used as the neutron-generating target. Activation detectors of bismuth, cobalt, and aluminum were placed outside the stripper chamber. The threshold reactions 209Bi(n, xn)210-xBi (x=4-8), 59Co(n, xn)60-xCo (x=2-5), 59Co(n, 2nα)54Mn, 27Al(n, α)24Na, and 27Al(n, 2nα)22Na were applied to measure the production rates of radionuclides. The neutron spectra were obtained using an unfolding method with the SAND-II code. All production rates and neutron spectra were compared with results calculated using the Monte Carlo codes PHITS and FLUKA. The FLUKA results showed better agreement with the measurements than the PHITS results. The discrepancies between the measurements and the calculations are discussed.
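    The SAND-II unfolding mentioned above iteratively adjusts a guess spectrum until the reaction rates computed from it match the measured activation rates, via multiplicative corrections weighted by each energy group's contribution to each reaction. A toy sketch of that idea (the response matrix and rates are made up, and this is not the SAND-II code itself):

```python
import numpy as np

# Toy multiplicative unfolding in the spirit of SAND-II. The goal is a
# spectrum phi with sigma @ phi equal to the measured rates M.
sigma = np.array([[1.0, 0.2, 0.0],     # reaction 1 response per energy group
                  [0.0, 0.5, 1.5],     # reaction 2
                  [0.3, 1.0, 0.3]])    # reaction 3 (all values invented)
phi_true = np.array([2.0, 1.0, 0.5])   # spectrum used to fabricate "data"
M = sigma @ phi_true                   # "measured" production rates

phi = np.ones(3)                       # flat initial guess spectrum
for _ in range(500):
    C = sigma @ phi                    # rates computed from current guess
    W = sigma * phi / C[:, None]       # weight of group j within reaction i
    # Weighted geometric-mean correction per group, applied multiplicatively.
    log_corr = (W * np.log(M / C)[:, None]).sum(axis=0) / W.sum(axis=0)
    phi = phi * np.exp(log_corr)
```

The multiplicative update keeps the spectrum positive by construction, which is one reason this family of algorithms is popular for activation unfolding.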

  9. Residual activity evaluation: a benchmark between ANITA, FISPACT, FLUKA and PHITS codes

    NASA Astrophysics Data System (ADS)

    Firpo, Gabriele; Viberti, Carlo Maria; Ferrari, Anna; Frisoni, Manuela

    2017-09-01

    The activity of residual nuclides dictates the radiation fields during periodic inspections/repairs (maintenance periods) and dismantling operations (decommissioning phase) of accelerator facilities (i.e., medical, industrial, research) and nuclear reactors. Correct prediction of material activation therefore allows for more accurate planning of these activities, in line with the ALARA (As Low As Reasonably Achievable) principles. The scope of the present work is to show the results of a comparison of residual total specific activity at a set of cooling-time instants (from zero up to 10 years after irradiation) as obtained by two analytical (FISPACT and ANITA) and two Monte Carlo (FLUKA and PHITS) codes, making use of their default nuclear data libraries. A set of 40 irradiation scenarios is considered, i.e. neutron and proton particles of different energies, ranging from zero to many hundreds of MeV, impinging on pure elements or materials of standard composition typically used in industrial applications (namely, AISI SS316 and Portland concrete). In some cases, experimental results were also available for a more thorough benchmark.
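    The benchmarked quantity, total residual specific activity versus cooling time, is in essence a sum of exponential decays of the activation products present at shutdown. A sketch with an invented inventory (the half-lives are standard values; the shutdown activities are illustrative):

```python
import math

# Residual specific activity versus cooling time for a few typical
# steel activation products. Shutdown activities are illustrative only.
products = {
    # nuclide: (half-life in days, specific activity at shutdown in Bq/g)
    "Mn-54": (312.2, 4.0e3),
    "Co-58": (70.86, 9.0e3),
    "Co-60": (1925.3, 1.5e3),   # 5.27 y
}

def total_activity(t_days):
    """Total specific activity (Bq/g) after a cooling time t."""
    return sum(a0 * math.exp(-math.log(2.0) * t_days / t_half)
               for t_half, a0 in products.values())

cooling = [0.0, 30.0, 365.0, 3650.0]   # cooling instants, days (0 to 10 years)
curve = [total_activity(t) for t in cooling]
```

After ten years the short-lived products are gone and the curve is dominated by the longest-lived nuclide, which is exactly the regime where the codes' decay-data libraries matter most.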

  10. New approach to description of (d,xn) spectra at energies below 50 MeV in Monte Carlo simulation by intra-nuclear cascade code with Distorted Wave Born Approximation

    NASA Astrophysics Data System (ADS)

    Hashimoto, S.; Iwamoto, Y.; Sato, T.; Niita, K.; Boudard, A.; Cugnon, J.; David, J.-C.; Leray, S.; Mancusi, D.

    2014-08-01

    A new approach to describing neutron spectra of deuteron-induced reactions in Monte Carlo particle transport simulations has been developed by combining the Intra-Nuclear Cascade of Liège (INCL) model and Distorted Wave Born Approximation (DWBA) calculations. We incorporated this combined method into the Particle and Heavy Ion Transport code System (PHITS) and applied it to estimate (d,xn) spectra on natLi, 9Be, and natC targets at incident energies ranging from 10 to 40 MeV. Double differential cross sections obtained by INCL and DWBA successfully reproduced the broad peaks and discrete peaks, respectively, at the same energies as those observed in experimental data. Furthermore, excellent agreement was observed between experimental data and PHITS results using the combined method for thick-target neutron yields over a wide range of neutron emission angles. We also applied the new method to estimate (d,xp) spectra in these reactions, and discuss its validity for proton emission spectra.
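The combination of a smooth cascade continuum with discrete DWBA transitions can be illustrated as follows; the Gaussian broadening, grid, and peak values are illustrative assumptions for plotting purposes, not the actual PHITS implementation.

```python
import numpy as np

def combine_ddx(e_grid, continuum, peaks, sigma):
    """Overlay discrete transition lines on a continuum spectrum (sketch).

    e_grid    : energy grid [MeV], uniform spacing
    continuum : broad-peak spectrum (e.g., from a cascade model) on e_grid
    peaks     : list of (energy [MeV], cross_section) discrete lines
    sigma     : Gaussian broadening width [MeV] applied to each line
    """
    spec = continuum.copy()
    dE = e_grid[1] - e_grid[0]
    for e0, xs in peaks:
        g = np.exp(-0.5 * ((e_grid - e0) / sigma) ** 2)
        g /= g.sum() * dE          # normalize the line shape to unit integral
        spec += xs * g             # each line contributes its cross section
    return spec
```

By construction, integrating the combined spectrum recovers the continuum integral plus the summed discrete cross sections.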

  11. Determining Optimal Evacuation Decision Policies for Disasters

    DTIC Science & Technology

    2012-03-01

    [Table-of-contents excerpt] 3.3 Calculating the Hit Probability (Phit); 3.4 Phit versus Vertical...; Figure 3.13 Large Probability Matrix (Map); Figure 3.14 Particle Trajectory with Phit data...; Figure 3.15 Phit versus Vertical Volatility; Figure 4.1 Cost-To

  12. Approved Methods and Algorithms for DoD Risk-Based Explosives Siting

    DTIC Science & Technology

    2007-02-02

    [Glossary excerpt] Pgha: probability of a person being in the glass hazard area; Phit: probability of hit; Phit(f): probability of hit for fatality; Phit(maji): probability of hit for major injury; Phit(mini): probability of hit for minor injury; Pi: debris probability densities at the ES; PMaj(pair): individual... combined high-angle and combined low-angle tables. A unique probability of hit is calculated for the three consequences of fatality, Phit(f), major injury

  13. From Amorphous to Defined: Balancing the Risks of Spiral Development

    DTIC Science & Technology

    2007-04-30

    [Plot-legend excerpt] Work started and active PhIt for [Requirements,Iter1], [Technology,Iter1], [Design,Iter1], [Manufacturing,Iter1], and [Use,Iter1] of the JavelinCalibration work packages, plotted against time in weeks (630-900).

  14. Analyzing the Surface Warfare Operational Effectiveness of an Offshore Patrol Vessel using Agent Based Modeling

    DTIC Science & Technology

    2012-09-01

    [List-of-figures excerpt] Figure 6. Marte Missile Phit vs. Range Profile; Figure 7. Exocet Missile Phit vs. Range Profile; Figure 8. Gun Phit vs. Range... Factors like range and Phit probability plots and agent-dependent factors could be directly implemented in MANA with little effort

  15. Too Little Too Soon? Modeling the Risks of Spiral Development

    DTIC Science & Technology

    2007-04-30

    [Plot-legend excerpt] Work started and active PhIt for [Requirements,Iter1], [Technology,Iter1], [Design,Iter1], and [Manufacturing,Iter1] of the JavelinCalibration work packages, plotted against time in weeks (270-900).

  16. Validation of the physical and RBE-weighted dose estimator based on PHITS coupled with a microdosimetric kinetic model for proton therapy.

    PubMed

    Takada, Kenta; Sato, Tatsuhiko; Kumada, Hiroaki; Koketsu, Junichi; Takei, Hideyuki; Sakurai, Hideyuki; Sakae, Takeji

    2018-01-01

    The microdosimetric kinetic model (MKM) is widely used for estimating relative biological effectiveness (RBE)-weighted doses for various radiotherapies because it can determine the surviving fraction of irradiated cells based on only the lineal energy distribution, independently of the radiation type and ion species. However, the applicability of the method to proton therapy has not yet been investigated thoroughly. In this study, we validated the RBE-weighted dose calculated by the MKM in tandem with the Monte Carlo code PHITS for proton therapy by considering the complete simulation geometry of the clinical proton beam line. The physical dose, lineal energy distribution, and RBE-weighted dose for a 155 MeV mono-energetic and spread-out Bragg peak (SOBP) beam of 60 mm width were evaluated. In estimating the physical dose, the depth dose distribution calculated by PHITS for the mono-energetic beam was consistent with the data measured by a diode detector. A maximum difference of 3.1% in the depth distribution was observed for the SOBP beam. In the RBE-weighted dose validation, the calculated lineal energy distributions generally agreed well with the published measurement data. The calculated and measured RBE-weighted doses were in excellent agreement, except at the Bragg peak region of the mono-energetic beam, where the calculation overestimated the measured data by ~15%. This research has provided a computational microdosimetric approach based on a combination of PHITS and MKM for typical clinical proton beams. The developed RBE-estimator function has potential application in the treatment planning system for various radiotherapies. © The Author 2017. Published by Oxford University Press on behalf of The Japan Radiation Research Society and Japanese Society for Radiation Oncology.
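The MKM step that converts a microdosimetric quantity into an RBE-weighted dose can be illustrated with the linear-quadratic survival relation: the alpha coefficient is modified by the saturation-corrected dose-mean specific energy, and the RBE-weighted dose is the reference-radiation dose producing the same survival. The parameter names and the assumption of a beta shared between test and reference radiations are illustrative simplifications, not the exact PHITS implementation.

```python
import math

def rbe_weighted_dose(dose, alpha0, beta, z1d_star, alpha_ref):
    """MKM-style RBE-weighted dose (hedged sketch).

    dose      : physical dose [Gy]
    alpha0    : LQ alpha in the limit of zero LET [1/Gy]
    beta      : LQ beta [1/Gy^2], assumed identical for reference radiation
    z1d_star  : saturation-corrected dose-mean specific energy [Gy]
    alpha_ref : LQ alpha of the reference radiation [1/Gy]
    """
    alpha = alpha0 + beta * z1d_star               # microdosimetric modification
    lethality = alpha * dose + beta * dose ** 2    # -ln(surviving fraction)
    # reference-radiation dose producing the same surviving fraction
    d_ref = (math.sqrt(alpha_ref ** 2 + 4.0 * beta * lethality) - alpha_ref) / (2.0 * beta)
    return d_ref
```

When the modified alpha equals the reference alpha (z1d_star = 0 and alpha0 = alpha_ref), the RBE-weighted dose reduces to the physical dose, as it should.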

  17. Validation of the physical and RBE-weighted dose estimator based on PHITS coupled with a microdosimetric kinetic model for proton therapy

    PubMed Central

    Sato, Tatsuhiko; Kumada, Hiroaki; Koketsu, Junichi; Takei, Hideyuki; Sakurai, Hideyuki; Sakae, Takeji

    2018-01-01

    Abstract The microdosimetric kinetic model (MKM) is widely used for estimating relative biological effectiveness (RBE)-weighted doses for various radiotherapies because it can determine the surviving fraction of irradiated cells based on only the lineal energy distribution, and it is independent of the radiation type and ion species. However, the applicability of the method to proton therapy has not yet been investigated thoroughly. In this study, we validated the RBE-weighted dose calculated by the MKM in tandem with the Monte Carlo code PHITS for proton therapy by considering the complete simulation geometry of the clinical proton beam line. The physical dose, lineal energy distribution, and RBE-weighted dose for a 155 MeV mono-energetic and spread-out Bragg peak (SOBP) beam of 60 mm width were evaluated. In estimating the physical dose, the calculated depth dose distribution by irradiating the mono-energetic beam using PHITS was consistent with the data measured by a diode detector. A maximum difference of 3.1% in the depth distribution was observed for the SOBP beam. In the RBE-weighted dose validation, the calculated lineal energy distributions generally agreed well with the published measurement data. The calculated and measured RBE-weighted doses were in excellent agreement, except at the Bragg peak region of the mono-energetic beam, where the calculation overestimated the measured data by ~15%. This research has provided a computational microdosimetric approach based on a combination of PHITS and MKM for typical clinical proton beams. The developed RBE-estimator function has potential application in the treatment planning system for various radiotherapies. PMID:29087492

  18. Calculation of dose contributions of electron and charged heavy particles inside phantoms irradiated by monoenergetic neutron.

    PubMed

    Satoh, Daiki; Takahashi, Fumiaki; Endo, Akira; Ohmachi, Yasushi; Miyahara, Nobuyuki

    2008-09-01

    The radiation-transport code PHITS with an event generator mode has been applied to analyze energy depositions of electrons and charged heavy particles in two spherical phantoms and a voxel-based mouse phantom upon neutron irradiation. The calculations using the spherical phantoms quantitatively clarified the type and energy of charged particles which are released through interactions of neutrons with the phantom elements and contribute to the radiation dose. The relative contribution of electrons increased with an increase in the size of the phantom and with a decrease in the energy of the incident neutrons. Calculations with the voxel-based mouse phantom for 2.0-MeV neutron irradiation revealed that the doses to different locations inside the body are uniform, and that the energy is mainly deposited by recoil protons. The present study has demonstrated that analysis using PHITS can yield dose distributions that are accurate enough for RBE evaluation.

  19. Efficient Matrix Models for Relational Learning

    DTIC Science & Technology

    2009-10-01

    [Excerpt] 4.5.3 Comparison to pLSI-pHITS; 5 Hierarchical Bayesian Collective... Behaviour of Newton vs. Stochastic Newton on a three-factor model. 4.5.3 Comparison to pLSI-pHITS. Caveat: Collective Matrix Factorization makes no guarantees... leads to better results; and another where a co-clustering model, pLSI-pHITS, has the advantage. pLSI-pHITS [24] is a relational clustering technique

  20. Analog quadrature signal to phase angle data conversion by a quadrature digitizer and quadrature counter

    DOEpatents

    Buchenauer, C. Jerald

    1984-01-01

    The quadrature phase angle φ(t) of a pair of quadrature signals S1(t) and S2(t) is digitally encoded on a real-time basis by a quadrature digitizer for fractional φ(t) rotational excursions and by a quadrature up/down counter for full φ(t) rotations. The pair of quadrature signals are of the form S1(t) = k(t) sin φ(t) and S2(t) = k(t) cos φ(t), where k(t) is a signal common to both. The quadrature digitizer and the quadrature up/down counter may be used together or singly as desired or required. Optionally, a digital-to-analog converter may follow the outputs of the quadrature digitizer and the quadrature up/down counter to provide an analog signal output of the quadrature phase angle φ(t).
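A software analogue of this scheme is straightforward: the fractional angle comes from the ratio of the two signals (so the common factor k(t) cancels), and full rotations are tracked by a software up/down counter that detects wrap-around between samples. This sketch mimics the digitizer/counter pairing as a principle; it is not the patented circuit.

```python
import math

def quadrature_phase(s1, s2):
    """Fractional phase from S1 = k sin(phi), S2 = k cos(phi); k cancels."""
    return math.atan2(s1, s2)

class QuadratureCounter:
    """Software up/down counter: accumulates full rotations by detecting
    wrap-around of the fractional phase between successive samples."""
    def __init__(self):
        self.turns = 0
        self.prev = None

    def update(self, s1, s2):
        phi = quadrature_phase(s1, s2)
        if self.prev is not None:
            if phi - self.prev < -math.pi:
                self.turns += 1      # wrapped forward through +pi
            elif phi - self.prev > math.pi:
                self.turns -= 1      # wrapped backward through -pi
        self.prev = phi
        return self.turns * 2.0 * math.pi + phi   # unwrapped phase
```

Sampling must be fast enough that the phase advances by less than half a turn between updates, mirroring the bandwidth requirement of the hardware counter.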

  1. PHIT for Duty, a Personal Health Intervention Tool for Psychological Health and Traumatic Brain Injury

    DTIC Science & Technology

    2015-04-01

    [Report-documentation excerpt] Award Number: W81XWH-11-2-0129. TITLE: PHIT for Duty, a Personal Health Intervention Tool for Psychological Health and Traumatic Brain Injury... health problems. PHIT for Duty integrates self-report and physiological sensor instruments to assess health status via brief weekly screening

  2. Approved Methods and Algorithms for DoD Risk-Based Explosives Siting

    DTIC Science & Technology

    2009-07-21

    [Glossary excerpt] Parameter used in determining probability of hit (Phit) by debris [Table 31, Table 32, Table 33, Eq. (157), Eq. (158)]; CCa: variable "Actual...; variable for probability of "being in the glass hazard area" [Eq. (60), Eq. (78)]; Phit: variable "Probability of hit", an array value indexed by consequence and mass bin [Eq. (156), Eq. (157)]; Phit(f): variable "Probability of hit for fatality" [Eq. (157), Eq. (158)]; Phit(maji): variable "Probability of hit for major

  3. Numerical Analysis of Organ Doses Delivered During Computed Tomography Examinations Using Japanese Adult Phantoms with the WAZA-ARI Dosimetry System.

    PubMed

    Takahashi, Fumiaki; Sato, Kaoru; Endo, Akira; Ono, Koji; Ban, Nobuhiko; Hasegawa, Takayuki; Katsunuma, Yasushi; Yoshitake, Takayasu; Kai, Michiaki

    2015-08-01

    A dosimetry system for computed tomography (CT) examinations, named WAZA-ARI, is being developed to accurately assess radiation doses to patients in Japan. For dose calculations in WAZA-ARI, organ doses were numerically analyzed using average adult Japanese male (JM) and female (JF) phantoms with the Particle and Heavy Ion Transport code System (PHITS). Experimental studies clarified the energy distribution of emitted photons and the dose profiles on the table for some multi-detector row CT (MDCT) devices. Numerical analyses using a source model in PHITS could specifically take into account emission of x rays from the tube to the table, with attenuation of photons through a beam-shaping filter, for each MDCT device based on the experimental results. The source model was validated by measuring the CT dose index (CTDI). Numerical analyses with PHITS revealed a concordance of organ doses with body sizes of the JM and JF phantoms. The organ doses in the JM phantoms were compared with data obtained using previously developed systems. In addition, the dose calculations in WAZA-ARI were verified against previously reported results obtained with realistic NUBAS phantoms and against radiation dose measurements using a physical Japanese model (THRA1 phantom). The results imply that numerical analyses using the Japanese phantoms and specified source models can give reasonable dose estimates for MDCT devices for typical Japanese adults.

  4. SURVIAC Bulletin: RPG Encounter Modeling, Vol 27, Issue 1, 2012

    DTIC Science & Technology

    2012-01-01

    return a probability of hit (PHIT) for the scenario. In the model, PHIT depends on the presented area of the targeted system and a set of errors influencing... simplifying assumptions, is data-driven, and uses simple yet proven methodologies to determine PHIT. The inputs to THREAT describe the target, the RPG, and... Point on 2-D Representation of a CH-47. The determination of PHIT by THREAT is performed using one of two possible methodologies. The first is a

  5. Computation of Weapons Systems Effectiveness

    DTIC Science & Technology

    2013-09-01

    [Excerpt] ...denoted as SSPD2. SSPD = SSPD1 * PNM + SSPD2 * PHIT (5.13). PNM and PHIT are the weighting factors used to balance the direct hits and the Gaussian miss distribution unique to guided weapons. The sum of PNM and PHIT can be equal to or smaller than 1 due to the presence of the outliers (gross...)... weapons to represent a zero miss distance for the PHIT component. SSPD2 computation for blast effect: SSPD2x = normcdf(LET/2, 0, ...) − normcdf(−LET/2, ...
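The weighted single-shot probability of damage (SSPD) combination in Eq. (5.13), and the normcdf band integral sketched for the blast effect, are easy to state in code. The lethal-envelope half-width LET/2 and the miss-distance sigma below are illustrative parameters, since the excerpt truncates the actual arguments.

```python
from math import erf, sqrt

def normcdf(x, mu, sigma):
    """Cumulative distribution function of a normal(mu, sigma) variable."""
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

def sspd(sspd1, pnm, sspd2, phit):
    """Eq. (5.13): weighted combination of direct-hit and near-miss damage."""
    return sspd1 * pnm + sspd2 * phit

def sspd2_blast(let, sigma):
    """Probability that a zero-mean Gaussian miss distance falls inside a
    lethal envelope of half-width LET/2 (illustrative reading of the excerpt)."""
    return normcdf(let / 2.0, 0.0, sigma) - normcdf(-let / 2.0, 0.0, sigma)
```

For a lethal envelope of one standard deviation on each side, the blast term evaluates to about 0.68, the familiar one-sigma coverage of a normal distribution.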

  6. Improving health information systems for decision making across five sub-Saharan African countries: Implementation strategies from the African Health Initiative.

    PubMed

    Mutale, Wilbroad; Chintu, Namwinga; Amoroso, Cheryl; Awoonor-Williams, Koku; Phillips, James; Baynes, Colin; Michel, Cathy; Taylor, Angela; Sherr, Kenneth

    2013-01-01

    Weak health information systems (HIS) are a critical challenge to reaching the health-related Millennium Development Goals, because health systems performance cannot be adequately assessed or monitored where HIS data are incomplete, inaccurate, or untimely. The Population Health Implementation and Training (PHIT) Partnerships were established in five sub-Saharan African countries (Ghana, Mozambique, Rwanda, Tanzania, and Zambia) to catalyze advances in strengthening district health systems. Interventions were tailored to the setting in which activities were planned. While all five PHIT Partnerships share the common goal of enhancing HIS and linking data with improved decision making, their specific strategies varied. Mozambique, Ghana, and Tanzania all focus on improving the quality and use of the existing Ministry of Health HIS, while the Zambia and Rwanda partnerships have introduced new information and communication technology systems or tools. All partnerships have adopted a flexible, iterative approach in designing and refining the development of new tools and approaches for HIS enhancement (such as routine data quality audits and automated troubleshooting), as well as improving decision making through timely feedback on health system performance (such as through summary data dashboards or routine data review meetings). The most striking differences between partnership approaches can be found in the level of emphasis of data collection (patient versus health facility) and, consequently, the level of decision-making enhancement (community, facility, district, or provincial leadership). Design differences across PHIT Partnerships reflect differing theories of change, particularly regarding what information is needed, who will use the information to effect change, and how this change is expected to manifest. The iterative process of using data to monitor and assess the health system has been heavily communication dependent, with challenges due to poor feedback loops. Implementation to date has highlighted the importance of engaging frontline staff and managers in improving data collection and its use for informing system improvement. Through rigorous process and impact evaluation, the PHIT teams hope to contribute to the evidence base in the areas of HIS strengthening, linking HIS with decision making, and its impact on measures of health system outputs and impact.

  7. Improving health information systems for decision making across five sub-Saharan African countries: Implementation strategies from the African Health Initiative

    PubMed Central

    2013-01-01

    Background Weak health information systems (HIS) are a critical challenge to reaching the health-related Millennium Development Goals, because health systems performance cannot be adequately assessed or monitored where HIS data are incomplete, inaccurate, or untimely. The Population Health Implementation and Training (PHIT) Partnerships were established in five sub-Saharan African countries (Ghana, Mozambique, Rwanda, Tanzania, and Zambia) to catalyze advances in strengthening district health systems. Interventions were tailored to the setting in which activities were planned. Comparisons across strategies While all five PHIT Partnerships share the common goal of enhancing HIS and linking data with improved decision making, their specific strategies varied. Mozambique, Ghana, and Tanzania all focus on improving the quality and use of the existing Ministry of Health HIS, while the Zambia and Rwanda partnerships have introduced new information and communication technology systems or tools. All partnerships have adopted a flexible, iterative approach in designing and refining the development of new tools and approaches for HIS enhancement (such as routine data quality audits and automated troubleshooting), as well as improving decision making through timely feedback on health system performance (such as through summary data dashboards or routine data review meetings). The most striking differences between partnership approaches can be found in the level of emphasis of data collection (patient versus health facility) and, consequently, the level of decision-making enhancement (community, facility, district, or provincial leadership). Discussion Design differences across PHIT Partnerships reflect differing theories of change, particularly regarding what information is needed, who will use the information to effect change, and how this change is expected to manifest. The iterative process of using data to monitor and assess the health system has been heavily communication dependent, with challenges due to poor feedback loops. Implementation to date has highlighted the importance of engaging frontline staff and managers in improving data collection and its use for informing system improvement. Through rigorous process and impact evaluation, the PHIT teams hope to contribute to the evidence base in the areas of HIS strengthening, linking HIS with decision making, and its impact on measures of health system outputs and impact. PMID:23819699

  8. PHITS simulations of absorbed dose out-of-field and neutron energy spectra for ELEKTA SL25 medical linear accelerator.

    PubMed

    Puchalska, Monika; Sihver, Lembit

    2015-06-21

    Monte Carlo (MC) based calculation methods for modeling photon and particle transport have several potential applications in radiotherapy. An essential requirement for successful radiation therapy is that the discrepancies between dose distributions calculated at the treatment planning stage and those delivered to the patient are minimized. It is also essential to minimize the dose to radiosensitive and critical organs. With the MC technique, the dose distributions from both the primary and scattered photons can be calculated. The out-of-field radiation doses are of particular concern when high-energy photons are used, since neutrons are then produced both in the accelerator head and inside the patient. Using the MC technique, the created photons and particles can be followed, and the transport and energy deposition in all the tissues of the patient can be estimated. This is of great importance during pediatric treatments for minimizing the risk to normal healthy tissue, e.g. secondary cancer. The purpose of this work was to evaluate the efficiency of the general-purpose 3D PHITS MC code as an alternative approach for photon beam specification. In this study, we developed a model of an ELEKTA SL25 accelerator and used the transport code PHITS to calculate the total absorbed dose and the neutron energy spectra inside and outside the treatment field. This model was validated against measurements performed with bubble detector spectrometers and a Bonner sphere for 18 MV linacs, including both photons and neutrons. The average absolute difference between the calculated and measured absorbed dose for the out-of-field region was around 11%. Taking into account the simplified simulated geometry, which does not include any potential scattering materials in the surroundings, the obtained result is very satisfactory. Good agreement between the simulated and measured neutron energy spectra was also observed when comparing with data found in the literature.

  9. PHITS simulations of absorbed dose out-of-field and neutron energy spectra for ELEKTA SL25 medical linear accelerator

    NASA Astrophysics Data System (ADS)

    Puchalska, Monika; Sihver, Lembit

    2015-06-01

    Monte Carlo (MC) based calculation methods for modeling photon and particle transport have several potential applications in radiotherapy. An essential requirement for successful radiation therapy is that the discrepancies between dose distributions calculated at the treatment planning stage and those delivered to the patient are minimized. It is also essential to minimize the dose to radiosensitive and critical organs. With the MC technique, the dose distributions from both the primary and scattered photons can be calculated. The out-of-field radiation doses are of particular concern when high-energy photons are used, since neutrons are then produced both in the accelerator head and inside the patient. Using the MC technique, the created photons and particles can be followed, and the transport and energy deposition in all the tissues of the patient can be estimated. This is of great importance during pediatric treatments for minimizing the risk to normal healthy tissue, e.g. secondary cancer. The purpose of this work was to evaluate the efficiency of the general-purpose 3D PHITS MC code as an alternative approach for photon beam specification. In this study, we developed a model of an ELEKTA SL25 accelerator and used the transport code PHITS to calculate the total absorbed dose and the neutron energy spectra inside and outside the treatment field. This model was validated against measurements performed with bubble detector spectrometers and a Bonner sphere for 18 MV linacs, including both photons and neutrons. The average absolute difference between the calculated and measured absorbed dose for the out-of-field region was around 11%. Taking into account the simplified simulated geometry, which does not include any potential scattering materials in the surroundings, the obtained result is very satisfactory. Good agreement between the simulated and measured neutron energy spectra was also observed when comparing with data found in the literature.

  10. Computational Transport Modeling of High-Energy Neutrons Found in the Space Environment

    NASA Technical Reports Server (NTRS)

    Cox, Brad; Theriot, Corey A.; Rohde, Larry H.; Wu, Honglu

    2012-01-01

    The high charge and high energy (HZE) particle radiation environment in space interacts with spacecraft materials and the human body to create a population of neutrons encompassing a broad kinetic energy spectrum. As an HZE ion penetrates matter, there is an increasing chance of fragmentation as penetration depth increases. When an ion fragments, secondary neutrons are released with velocities up to that of the primary ion, giving some neutrons very long penetration ranges. These secondary neutrons have a high relative biological effectiveness, are difficult to shield effectively, and can cause more biological damage than the primary ions in some scenarios. Ground-based irradiation experiments that simulate the space radiation environment must account for this spectrum of neutrons. Using the Particle and Heavy Ion Transport Code System (PHITS), it is possible to simulate a neutron environment that is characteristic of that found in spaceflight. Considering neutron dosimetry, the focus lies on the broad spectrum of recoil protons that are produced in biological targets. In a biological target, dose at a given penetration depth is primarily dependent upon recoil proton tracks. The PHITS code can be used to simulate a broad-energy neutron spectrum traversing biological targets, and it accounts for the recoil particle population. This project focuses on modeling a neutron beamline irradiation scenario for determining dose at increasing depth in water targets. Energy-deposition events and particle fluence can be simulated by establishing cross-sectional scoring routines at different depths in a target. This type of model is useful for correlating theoretical data with actual beamline radiobiology experiments. Other work exposed human fibroblast cells to a high-energy neutron source to study micronuclei induction in cells at increasing depth behind water shielding. Those findings provide supporting data describing dose vs. depth across a water-equivalent medium.
This poster presents PHITS data suggesting an increase in dose, up to roughly 10 cm depth, followed by a continual decrease as neutrons come to a stop in the target.

  11. PARMA: PHITS-based Analytical Radiation Model in the Atmosphere--Verification of Its Accuracy in Estimating Cosmic Radiation Doses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sato, Tatsuhiko; Satoh, Daiki; Endo, Akira

    Estimation of cosmic-ray spectra in the atmosphere has been an essential issue in the evaluation of aircrew doses. We therefore developed an analytical model that can predict the terrestrial neutron, proton, He nucleus, muon, electron, positron and photon spectra at altitudes below 20 km, based on the Monte Carlo simulation results of cosmic-ray propagation in the atmosphere performed by the PHITS code. The model was designated PARMA. In order to examine the accuracy of PARMA in terms of neutron dose estimation, we measured the neutron dose rates at altitudes between 20 and 10,400 m, using our developed dose monitor DARWIN mounted on an aircraft. Excellent agreement was observed between the measured dose rates and the corresponding data calculated by PARMA coupled with fluence-to-dose conversion coefficients, indicating that the model is applicable to route-dose calculations.
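The final dose step described here, folding each particle spectrum with fluence-to-dose conversion coefficients, is a single dot product per particle type. A minimal sketch (particle names, units, and values are illustrative):

```python
import numpy as np

def dose_rate(spectra, conv_coeffs):
    """Total dose rate from group fluxes folded with conversion coefficients.

    spectra     : dict of particle name -> group fluxes [cm^-2 s^-1 per bin]
    conv_coeffs : dict of particle name -> fluence-to-dose conversion
                  coefficients [pSv cm^2] on the same energy-group structure
    Returns the summed dose rate [pSv/s] over all particle types.
    """
    return sum(float(np.dot(spectra[p], conv_coeffs[p])) for p in spectra)
```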

  12. Well There’s Your Problem: Isolating the Crash-Inducing Bits in a Fuzzed File

    DTIC Science & Technology

    2012-10-01

    [Excerpt] ...CurrHD; end for; for all CDChance[i] do: calculate Phit[i] [see (7)]; BitReduction[i] ← CurrHD × CDChance[i]; ExpectedReduction[i] ← Phit[i]... at least one hit left to be found in the search space. Here we use the identity phit = 1 − pmiss (14) and observe that the chance of getting at least one hit in x tries is P(≥ 1 hit in x tries) = 1 − pmiss^x = 1 − (1 − phit)^x (15). Another way to interpret equation (15) is that if we try x
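The identity quoted in Eqs. (14)-(15) of this excerpt is a one-liner, shown here for concreteness:

```python
def p_at_least_one_hit(p_hit, tries):
    """P(at least one hit in x tries) = 1 - (1 - p_hit)**x, as in Eq. (15)."""
    return 1.0 - (1.0 - p_hit) ** tries
```

For example, two independent tries at p_hit = 0.5 give a 0.75 chance of at least one hit.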

  13. The Probability of Hitting a Polygonal Target

    DTIC Science & Technology

    1981-04-01

    required for the use of this method for computing the probability of hitting a polygonal target. These functions are: 1. PHIT (called by the user's main program); 2. FIJ (called by PHIT); 3. FUN (called by FIJ). The user must include all three of these in his main program, but needs only to call PHIT. The

  14. Estimation of Airborne Radioactivity Induced by 8-GeV-Class Electron LINAC Accelerator.

    PubMed

    Asano, Yoshihiro

    2017-10-01

    Airborne radioactivity induced by high-energy electrons from 6 to 10 GeV is estimated using analytical methods and the Monte Carlo codes PHITS and FLUKA. Measurements using a gas monitor with a NaI(Tl) scintillator are carried out on air from a dump room at SACLA, an x-ray free-electron laser facility with 7.8-GeV electrons, and are compared with the simulations.

  15. Estimation of whole-body radiation exposure from brachytherapy for oral cancer using a Monte Carlo simulation

    PubMed Central

    Ozaki, Y.; Kaida, A.; Miura, M.; Nakagawa, K.; Toda, K.; Yoshimura, R.; Sumi, Y.; Kurabayashi, T.

    2017-01-01

    Abstract Early stage oral cancer can be cured with oral brachytherapy, but whole-body radiation exposure status has not been previously studied. Recently, the International Commission on Radiological Protection Committee (ICRP) recommended the use of ICRP phantoms to estimate radiation exposure from external and internal radiation sources. In this study, we used a Monte Carlo simulation with ICRP phantoms to estimate whole-body exposure from oral brachytherapy. We used a Particle and Heavy Ion Transport code System (PHITS) to model oral brachytherapy with 192Ir hairpins and 198Au grains and to perform a Monte Carlo simulation on the ICRP adult reference computational phantoms. To confirm the simulations, we also computed local dose distributions from these small sources, and compared them with the results from Oncentra manual Low Dose Rate Treatment Planning (mLDR) software which is used in day-to-day clinical practice. We successfully obtained data on absorbed dose for each organ in males and females. Sex-averaged equivalent doses were 0.547 and 0.710 Sv with 192Ir hairpins and 198Au grains, respectively. Simulation with PHITS was reliable when compared with an alternative computational technique using mLDR software. We concluded that the absorbed dose for each organ and whole-body exposure from oral brachytherapy can be estimated with Monte Carlo simulation using PHITS on ICRP reference phantoms. Effective doses for patients with oral cancer were obtained. PMID:28339846
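The "sex-averaged equivalent dose" reported above follows the ICRP scheme: organ equivalent doses from the male and female reference phantoms are averaged, then tissue-weighted and summed into an effective dose. A hedged sketch (the organ key and weight shown are placeholders; an actual calculation uses the full ICRP 103 tissue-weighting factors, which sum to 1):

```python
def effective_dose(h_male, h_female, w_tissue):
    """ICRP-style effective dose [Sv] (sketch).

    h_male, h_female : dicts of organ -> equivalent dose [Sv] for the male
                       and female reference phantoms
    w_tissue         : dict of organ -> tissue weighting factor (sums to 1)
    """
    return sum(w * 0.5 * (h_male[t] + h_female[t]) for t, w in w_tissue.items())
```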

  16. Modeling Cell and Tumor-Metastasis Dosimetry with the Particle and Heavy Ion Transport Code System (PHITS) Software for Targeted Alpha-Particle Radionuclide Therapy.

    PubMed

    Lee, Dongyoul; Li, Mengshi; Bednarz, Bryan; Schultz, Michael K

    2018-06-26

    The use of targeted radionuclide therapy for cancer is on the rise. While beta-particle-emitting radionuclides have been extensively explored for targeted radionuclide therapy, alpha-particle-emitting radionuclides are emerging as effective alternatives. In this context, fundamental understanding of the interactions and dosimetry of these emitted particles with cells in the tumor microenvironment is critical to ascertaining the potential of alpha-particle-emitting radionuclides. One important parameter that can be used to assess these metrics is the S-value. In this study, we characterized several alpha-particle-emitting radionuclides (and their associated radionuclide progeny) regarding S-values in the cellular and tumor-metastasis environments. The Particle and Heavy Ion Transport code System (PHITS) was used to obtain S-values via Monte Carlo simulation for cell and tumor metastasis resulting from interactions with the alpha-particle-emitting radionuclides lead-212 (212Pb), actinium-225 (225Ac), and bismuth-213 (213Bi); these values were compared to the beta-particle-emitting radionuclides yttrium-90 (90Y) and lutetium-177 (177Lu) and the Auger-electron-emitting radionuclide indium-111 (111In). The effect of cellular internalization on the S-value was explored at increasing degrees of internalization for each radionuclide. This aspect of S-value determination was further explored in a cell line-specific fashion for six different cancer cell lines based on the cell dimensions obtained by confocal microscopy. S-values from PHITS were in good agreement with MIRDcell S-values (cellular S-values) and the values found by Hindié et al. (tumor S-values). In the cellular model, the 212Pb and 213Bi decay series produced S-values that were 50- to 120-fold higher than 177Lu, while the 225Ac decay series analysis suggested S-values that were 240- to 520-fold higher than 177Lu.
S-values arising with 100% cellular internalization were two- to sixfold higher for the nucleus when compared to 0% internalization. The tumor dosimetry model defines the relative merit of radionuclides and suggests alpha particles may be effective for large tumors as well as small tumor metastases. These results from PHITS modeling substantiate emerging evidence that alpha-particle-emitting radionuclides may be an effective alternative to beta-particle-emitting radionuclides for targeted radionuclide therapy due to preferred dose-deposition profiles in the cellular and tumor metastasis context. These results further suggest that internalization of alpha-particle-emitting radionuclides via radiolabeled ligands may increase the relative biological effectiveness of radiotherapeutics.
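In the MIRD formalism underlying these comparisons, the S-value is the mean absorbed dose to a target region per nuclear transformation in the source region: S = Σ Δ_i φ_i / m_target. A minimal sketch of that bookkeeping follows; the emission energies, absorbed fractions, and nucleus mass are invented for illustration and are not the PHITS-derived values from this study.

```python
# MIRD formalism: S(target <- source) = sum_i (Delta_i * phi_i) / m_target,
# where Delta_i is the mean energy emitted per decay for emission i (MeV),
# phi_i the absorbed fraction in the target, and m_target the target mass (kg).
MEV_TO_J = 1.602e-13  # joules per MeV

def s_value(emissions, target_mass_kg):
    """Absorbed dose to the target per decay (Gy/decay)."""
    return sum(e_mev * MEV_TO_J * phi for e_mev, phi in emissions) / target_mass_kg

# Hypothetical inputs: a single 6 MeV alpha depositing 30% of its energy in a
# ~1 ng cell nucleus, versus a beta spectrum with 0.15 MeV mean energy and a
# 1% absorbed fraction (betas mostly escape the cell).
nucleus_mass = 1e-12  # kg
s_alpha = s_value([(6.0, 0.30)], nucleus_mass)
s_beta = s_value([(0.15, 0.01)], nucleus_mass)
ratio = s_alpha / s_beta  # how much larger the alpha S-value is
```

With these invented numbers the alpha-to-beta S-value ratio is in the thousands, illustrating why the measured 50- to 520-fold enhancements (which include geometry and progeny effects) are plausible.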

  17. Carrier Air Wing Tactics Incorporating Navy Unmanned Combat Air System (NUCAS)

    DTIC Science & Technology

    2010-03-01

    Profile Curves of Mean Target Casualty Rate Versus GBU-31 Phit and NUCAS Sensor Aperture (After SAS Institute, 2010...Prediction Profile Curve of Mean Blue Survivability Percent Versus AIM- 120 Weapons Phit (After SAS Institute, 2010...Weapons Phit is a major factor in target destruction and blue survivability. Our approach shows how simulation, data farming techniques, and data

  18. Individual Combatant’s Weapons Firing Algorithm

    DTIC Science & Technology

    2010-04-01

    target selection prioritization scheme, aim point, mode of fire, and estimates on Phit /Pmiss for a single SME. Also undertaken in this phase of the...5 APPENDIX A: SME FUZZY ESTIMATES ON FACTORS AND ESTIMATES ON PHIT /PMISS.....6...influencing the target selection prioritization scheme, aim point, mode of fire, and estimates on Phit /Pmiss for a single SME. Also undertaken in this

  19. Excitation functions of the natCr(p,x)44Ti, 56Fe(p,x)44Ti, natNi(p,x)44Ti and 93Nb(p,x)44Ti reactions at energies up to 2.6 GeV

    NASA Astrophysics Data System (ADS)

    Titarenko, Yu. E.; Batyaev, V. F.; Pavlov, K. V.; Titarenko, A. Yu.; Zhivun, V. M.; Chauzova, M. V.; Balyuk, S. A.; Bebenin, P. V.; Ignatyuk, A. V.; Mashnik, S. G.; Leray, S.; Boudard, A.; David, J. C.; Mancusi, D.; Cugnon, J.; Yariv, Y.; Nishihara, K.; Matsuda, N.; Kumawat, H.; Stankovskiy, A. Yu.

    2016-06-01

    The paper presents measured cumulative yields of 44Ti for natCr, 56Fe, natNi and 93Nb samples irradiated by protons in the energy range 0.04-2.6 GeV. The obtained excitation functions are compared with calculations by the well-known codes ISABEL, Bertini, INCL4.2+ABLA, INCL4.5+ABLA07, PHITS, CASCADE07 and CEM03.02. The predictive power of these codes regarding the studied nuclides is analyzed.

  20. Estimation of relative biological effectiveness for boron neutron capture therapy using the PHITS code coupled with a microdosimetric kinetic model

    PubMed Central

    Horiguchi, Hironori; Sato, Tatsuhiko; Kumada, Hiroaki; Yamamoto, Tetsuya; Sakae, Takeji

    2015-01-01

    Abstract The absorbed doses deposited by boron neutron capture therapy (BNCT) can be categorized into four components: α and 7Li particles from the 10B(n, α)7Li reaction, 0.54-MeV protons from the 14N(n, p)14C reaction, the recoil protons from the 1H(n, n)1H reaction, and photons from the neutron beam and the 1H(n, γ)2H reaction. For evaluating the irradiation effect in tumors and the surrounding normal tissues in BNCT, it is of great importance to estimate the relative biological effectiveness (RBE) for each dose component in the same framework. We have, therefore, established a new method for estimating the RBE of all BNCT dose components on the basis of the microdosimetric kinetic model. This method employs the probability density of lineal energy, y, in a subcellular structure as the index for expressing RBE, which can be calculated using the microdosimetric function implemented in the particle transport simulation code PHITS. The accuracy of this method was tested by comparing the calculated RBE values with corresponding measured data in a water phantom irradiated with an epithermal neutron beam. The calculation technique developed in this study will be useful for biological dose estimation in treatment planning for BNCT. PMID:25428243
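A lineal-energy distribution f(y) such as the one this method feeds into the microdosimetric kinetic model is commonly summarized by its frequency-mean and dose-mean values, the standard microdosimetric moments. The sketch below computes both on a discretized distribution; the histogram values are invented to mimic a mixed BNCT field (a photon-like low-y component plus an alpha/7Li-like high-y tail) and are not data from this study.

```python
# Standard microdosimetric moments of a lineal-energy distribution f(y):
#   frequency mean  y_F = sum(y * f) / sum(f)
#   dose mean       y_D = sum(y^2 * f) / sum(y * f)
# (the dose distribution d(y) is proportional to y * f(y)).

def lineal_energy_means(y, f):
    norm = sum(f)
    first = sum(yi * fi for yi, fi in zip(y, f))    # sum of y * f(y)
    second = sum(yi * yi * fi for yi, fi in zip(y, f))  # sum of y^2 * f(y)
    return first / norm, second / first  # (y_F, y_D), keV/um

# Invented two-component example: many low-y (gamma-like) events plus a
# small high-y (alpha/7Li-like) tail.
y_bins = [1.0, 2.0, 5.0, 50.0, 150.0]    # lineal energy, keV/um
freq   = [0.40, 0.35, 0.20, 0.04, 0.01]  # f(y), already normalized
y_f, y_d = lineal_energy_means(y_bins, freq)
```

Note how the rare high-y tail dominates the dose mean: y_D far exceeds y_F, which is exactly why dose-mean (and saturation-corrected) lineal energy, rather than frequency-mean, drives RBE in the microdosimetric kinetic model.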

  1. Low-Resolution Screening of Early Stage Acquisition Simulation Scenario Development Decisions

    DTIC Science & Technology

    2012-12-01

    6 seconds) incorporating reload times and assumptions. Phit for min range is assumed to be 100% (excepting FGM- 148, which was estimated for a...User Interface HTN Hierarchical Task Network MCCDC Marine Corps Combat Development Command Phit Probability to hit the intended target Pkill...well beyond the scope of this study. 5. Weapon Capabilities Translation COMBATXXI develops situation probabilities to hit ( Phit ) and probabilities to

  2. Estimation of whole-body radiation exposure from brachytherapy for oral cancer using a Monte Carlo simulation.

    PubMed

    Ozaki, Y; Watanabe, H; Kaida, A; Miura, M; Nakagawa, K; Toda, K; Yoshimura, R; Sumi, Y; Kurabayashi, T

    2017-07-01

    Early-stage oral cancer can be cured with oral brachytherapy, but the resulting whole-body radiation exposure has not been previously studied. Recently, the International Commission on Radiological Protection (ICRP) recommended the use of ICRP phantoms to estimate radiation exposure from external and internal radiation sources. In this study, we used a Monte Carlo simulation with ICRP phantoms to estimate whole-body exposure from oral brachytherapy. We used the Particle and Heavy Ion Transport code System (PHITS) to model oral brachytherapy with 192Ir hairpins and 198Au grains and to perform a Monte Carlo simulation on the ICRP adult reference computational phantoms. To confirm the simulations, we also computed local dose distributions from these small sources and compared them with the results from Oncentra manual Low Dose Rate Treatment Planning (mLDR) software, which is used in day-to-day clinical practice. We successfully obtained data on the absorbed dose for each organ in males and females. Sex-averaged equivalent doses were 0.547 and 0.710 Sv with 192Ir hairpins and 198Au grains, respectively. Simulation with PHITS was reliable when compared with an alternative computational technique using mLDR software. We concluded that the absorbed dose for each organ and the whole-body exposure from oral brachytherapy can be estimated with Monte Carlo simulation using PHITS on ICRP reference phantoms. Effective doses for patients with oral cancer were obtained. © The Author 2017. Published by Oxford University Press on behalf of The Japan Radiation Research Society and Japanese Society for Radiation Oncology.
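The sex-averaged effective dose quoted here is, under ICRP Publication 103, the tissue-weighted average of male and female organ equivalent doses: E = Σ_T w_T (H_T^M + H_T^F)/2. A minimal sketch of that combination step follows; the tissue list is only a subset of the ICRP 103 tissues (so this is not a complete effective dose), and the per-organ doses are invented stand-ins for the PHITS organ-dose output.

```python
# ICRP Publication 103 effective dose:
#   E = sum_T w_T * (H_T_male + H_T_female) / 2
# i.e. tissue-weighted, sex-averaged equivalent doses.
TISSUE_WEIGHTS = {      # a subset of the ICRP 103 w_T values
    "lung": 0.12,
    "stomach": 0.12,
    "thyroid": 0.04,
    "liver": 0.04,
}

def effective_dose(male_doses, female_doses):
    """Sex-averaged, tissue-weighted dose (Sv) over the tissues listed above."""
    return sum(w * (male_doses[t] + female_doses[t]) / 2.0
               for t, w in TISSUE_WEIGHTS.items())

# Invented equivalent doses (Sv) standing in for per-organ simulation output.
male   = {"lung": 0.30, "stomach": 0.50, "thyroid": 1.20, "liver": 0.40}
female = {"lung": 0.34, "stomach": 0.54, "thyroid": 1.40, "liver": 0.44}
e = effective_dose(male, female)
```

A full calculation would sum over all ICRP 103 tissues, including the remainder tissues, with the complete weight set.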

  3. Simulations of the MATROSHKA experiment at the international space station using PHITS.

    PubMed

    Sihver, L; Sato, T; Puchalska, M; Reitz, G

    2010-08-01

    Concerns about the biological effects of space radiation are increasing rapidly due to the perspective of long-duration manned missions, both in relation to the International Space Station (ISS) and to manned interplanetary missions to the Moon and Mars in the future. As a preparation for these long-duration space missions, it is important to ensure an excellent capability to evaluate the impact of space radiation on human health, in order to secure the safety of the astronauts/cosmonauts and minimize their risks. It is therefore necessary to measure the radiation load on the personnel both inside and outside the space vehicles and certify that organ- and tissue-equivalent doses can be simulated as accurately as possible. In this paper, simulations are presented using the three-dimensional Monte Carlo Particle and Heavy-Ion Transport code System (PHITS) (Iwase et al. in J Nucl Sci Tech 39(11):1142-1151, 2002) of long-term dose measurements performed with the European Space Agency-supported MATROSHKA (MTR) experiment (Reitz and Berger in Radiat Prot Dosim 120:442-445, 2006). MATROSHKA is an anthropomorphic phantom containing over 6,000 radiation detectors, mimicking a human head and torso. The MTR experiment, led by the German Aerospace Center (DLR), was launched in January 2004 and has measured the absorbed doses from space radiation both inside and outside the ISS. Comparisons of simulations with measurements outside the ISS are presented. The results indicate that PHITS is a suitable tool for estimating the doses received from cosmic radiation and for studying the shielding of spacecraft against cosmic radiation.

  4. COMPARISON OF COSMIC-RAY ENVIRONMENTS ON EARTH, MOON, MARS AND IN SPACECRAFT USING PHITS.

    PubMed

    Sato, Tatsuhiko; Nagamatsu, Aiko; Ueno, Haruka; Kataoka, Ryuho; Miyake, Shoko; Takeda, Kazuo; Niita, Koji

    2017-09-29

    Estimation of cosmic-ray doses is of great importance not only in aircrew and astronaut dosimetry but also in the evaluation of background radiation exposure to the public. We therefore calculated the cosmic-ray doses on Earth, the Moon and Mars, as well as inside spacecraft, using the Particle and Heavy Ion Transport code System PHITS. The same cosmic-ray models and dose conversion coefficients were employed in the calculations to allow proper comparison of the simulation results for the different environments. It is quantitatively confirmed that the thickness of physical shielding, including the atmosphere and soil of the planets, is the most important parameter determining the cosmic-ray doses and their dominant contributors. The comparison also suggests that higher solar activity significantly reduces astronaut doses, particularly for interplanetary missions. The information obtained from this study is useful in the design of future space missions as well as of accelerator-based experiments dedicated to cosmic-ray research. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  5. Neutronics Assessments for a RIA Fragmentation Line Beam Dump Concept

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boles, J L; Reyes, S; Ahle, L E

    Heavy ion and radiation transport calculations are in progress for conceptual beam dump designs for the fragmentation line of the proposed Rare Isotope Accelerator (RIA). Using the computer code PHITS, a preliminary design of a motor-driven rotating wheel beam dump and adjacent downstream multipole has been modeled. Selected results of these calculations are given, including neutron and proton flux in the wheel, absorbed dose and displacements per atom in the hub materials, and heating from prompt radiation and from decay heat in the multipole.
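The displacements-per-atom quantity reported for the hub materials is conventionally derived from cascade damage energy via the NRT model, N_d = 0.8 T_dam / (2 E_d). A minimal sketch of that conversion follows; the 10 keV cascade energy and 40 eV displacement threshold are illustrative assumptions, not values from the RIA calculations.

```python
# NRT displacement model: a primary knock-on atom with damage energy T_dam
# produces N_d = 0.8 * T_dam / (2 * E_d) stable displacements, where E_d is
# the material's threshold displacement energy. Below threshold nothing is
# displaced; just above it, exactly one atom is displaced.
def nrt_displacements(t_dam_ev, e_d_ev):
    if t_dam_ev < e_d_ev:
        return 0.0
    if t_dam_ev < 2.0 * e_d_ev / 0.8:
        return 1.0
    return 0.8 * t_dam_ev / (2.0 * e_d_ev)

# Illustrative case: a 10 keV damage-energy cascade in a metal with E_d = 40 eV.
n_d = nrt_displacements(10_000.0, 40.0)
```

Dividing the accumulated displacements by the number of atoms in the tallied volume then gives the dpa figure quoted in such assessments.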

  6. Improvement of low energy atmospheric neutrino flux calculation using the JAM nuclear interaction model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Honda, M.; Kajita, T.; Kasahara, K.

    We present the calculation of the atmospheric neutrino fluxes with an interaction model named JAM, which is used in PHITS (Particle and Heavy-Ion Transport code System) [K. Niita et al., Radiation Measurements 41, 1080 (2006)]. The JAM interaction model agrees with the HARP experiment [HARP Collaboration, Astropart. Phys. 30, 124 (2008)] a little better than DPMJET-III [S. Roesler, R. Engel, and J. Ranft, arXiv:hep-ph/0012252]. After some modifications, it reproduces the muon flux below 1 GeV/c at balloon altitudes better than the modified DPMJET-III, which we used for the calculation of the atmospheric neutrino flux in previous works [T. Sanuki, M. Honda, T. Kajita, K. Kasahara, and S. Midorikawa, Phys. Rev. D 75, 043005 (2007); M. Honda, T. Kajita, K. Kasahara, S. Midorikawa, and T. Sanuki, Phys. Rev. D 75, 043006 (2007)]. Some improvements in the calculation of the atmospheric neutrino flux are also reported.

  7. Measurement of DT and DD components in neutron spectrum with a double-crystal time-of-flight spectrometer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Okada, K.; Okamoto, A.; Kitajima, S.

    To investigate the deuteron-to-triton density ratio in core plasmas, a new methodology is proposed based on measuring the deuterium-tritium (DT) to deuterium-deuterium (DD) neutron count-rate ratio with a double-crystal time-of-flight (TOF) spectrometer. Multi-discriminator electronic circuits for the first and second detectors are used in addition to the TOF technique. The optimum arrangement of the detectors and the discrimination window were examined by considering the relations between the geometrical arrangement and the deposited energy, using the Monte Carlo code PHITS (Particle and Heavy Ion Transport Code System). An experiment to verify the calculations was performed using DD neutrons from an accelerator.
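A double-crystal TOF spectrometer distinguishes DD (≈2.45 MeV) from DT (≈14.1 MeV) neutrons by their flight time between the two detectors; the underlying kinematics is a one-line conversion. The sketch below uses the relativistic form (14 MeV neutrons are mildly relativistic) and assumes a 1 m inter-detector flight path, which is an illustrative choice rather than the geometry of this instrument.

```python
import math

# Neutron kinetic energy from time of flight between two detectors:
#   v = L / t,  beta = v / c,  E = m_n c^2 * (1 / sqrt(1 - beta^2) - 1)
M_N_C2 = 939.565        # neutron rest energy, MeV
C = 299_792_458.0       # speed of light, m/s

def tof_to_energy_mev(flight_path_m, tof_s):
    beta = flight_path_m / tof_s / C
    return M_N_C2 * (1.0 / math.sqrt(1.0 - beta * beta) - 1.0)

# Assumed 1 m path: DD neutrons arrive after roughly 46 ns,
# DT neutrons after roughly 19.5 ns.
e_dd = tof_to_energy_mev(1.0, 46.3e-9)
e_dt = tof_to_energy_mev(1.0, 19.5e-9)
```

The more-than-twofold difference in flight time for a fixed path is what makes the TOF discrimination between the two fusion neutron groups straightforward.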

  8. FRENDY: A new nuclear data processing system being developed at JAEA

    NASA Astrophysics Data System (ADS)

    Tada, Kenichi; Nagaya, Yasunobu; Kunieda, Satoshi; Suyama, Kenya; Fukahori, Tokio

    2017-09-01

    JAEA has provided the evaluated nuclear data library JENDL and nuclear application codes such as MARBLE, SRAC, MVP and PHITS. These domestic codes have been widely used in many universities and industrial companies in Japan. However, we sometimes find problems in imported processing systems and need to revise them when a new version of JENDL is released. To overcome such problems and process the nuclear data immediately upon release, JAEA started developing a new nuclear data processing system, FRENDY, in 2013. This paper outlines the development of FRENDY and demonstrates its capabilities and performance through analyses of criticality experiments. The verification results indicate that FRENDY properly generates ACE files.

  9. Multi-scale modeling of irradiation effects in spallation neutron source materials

    NASA Astrophysics Data System (ADS)

    Yoshiie, T.; Ito, T.; Iwase, H.; Kaneko, Y.; Kawai, M.; Kishida, I.; Kunieda, S.; Sato, K.; Shimakawa, S.; Shimizu, F.; Hashimoto, S.; Hashimoto, N.; Fukahori, T.; Watanabe, Y.; Xu, Q.; Ishino, S.

    2011-07-01

    Changes in the mechanical properties of Ni under irradiation by 3 GeV protons were estimated by multi-scale modeling. The code consisted of four parts. The first part was based on the Particle and Heavy-Ion Transport code System (PHITS) for nuclear reactions, and modeled the interactions between high-energy protons and nuclei in the target. The second part covered atomic collisions by particles without nuclear reactions. Because the energy of the particles was high, subcascade analysis was employed. The direct formation of clusters and the number of mobile defects were estimated using molecular dynamics (MD) and kinetic Monte Carlo (kMC) methods in each subcascade. The third part treated damage structure evolution, estimated by reaction kinetics analysis. The fourth part estimated the mechanical property changes using three-dimensional discrete dislocation dynamics (DDD). Using this four-part code, stress-strain curves for high-energy proton-irradiated Ni were obtained.

  10. Heterogeneous Defensive Naval Weapon Assignment To Swarming Threats In Real Time

    DTIC Science & Technology

    2016-03-01

    threat Damage potential of target t if it hits the ship [integer from 0 to 3] _ ttarget phit Probability that target t hits the ship [probability...secondary weapon systems on target t [integer] _ tsec phit Probability that secondary weapon systems launched from target t hit the ship...pairing. These parameters are calculated as follows: 310 _ _t t tpriority target threat target phit = × × (3.1) 3_ 10 _ _t t tsec priority sec

  11. Measuring and Modeling Behavioral Decision Dynamics in Collective Evacuation

    DTIC Science & Technology

    2014-02-10

    Phit (t), was generated in advance from a well-defined stochastic process previously studied in [67]; details of its construction can be found there. The...value of Phit (t) on the Disaster Tab which is updated every second, Figure 1. Overview of behavioral network science experiment. A: Experimental setup at...Volume 9 | Issue 2 | e87380 7 however the overall trajectory is not shown. There were a total of 23 Phit (t) trajectories used in the experiment, with many

  12. Relational Learning via Collective Matrix Factorization

    DTIC Science & Technology

    2008-06-01

    well-known example of such a schema is pLSI- pHITS [13], which models document-word counts and document-document citations: E1 = words and E2 = E3...relational co- clustering include pLSI, pLSI- pHITS , the symmetric block models of Long et. al. [23, 24, 25], and Bregman tensor clustering [5] (which can...to pLSI- pHITS In this section we provide an example where the additional flexibility of collective matrix factorization leads to better results; and

  13. Estimation of relative biological effectiveness for boron neutron capture therapy using the PHITS code coupled with a microdosimetric kinetic model.

    PubMed

    Horiguchi, Hironori; Sato, Tatsuhiko; Kumada, Hiroaki; Yamamoto, Tetsuya; Sakae, Takeji

    2015-03-01

    The absorbed doses deposited by boron neutron capture therapy (BNCT) can be categorized into four components: α and (7)Li particles from the (10)B(n, α)(7)Li reaction, 0.54-MeV protons from the (14)N(n, p)(14)C reaction, the recoil protons from the (1)H(n, n)(1)H reaction, and photons from the neutron beam and the (1)H(n, γ)(2)H reaction. For evaluating the irradiation effect in tumors and the surrounding normal tissues in BNCT, it is of great importance to estimate the relative biological effectiveness (RBE) for each dose component in the same framework. We have, therefore, established a new method for estimating the RBE of all BNCT dose components on the basis of the microdosimetric kinetic model. This method employs the probability density of lineal energy, y, in a subcellular structure as the index for expressing RBE, which can be calculated using the microdosimetric function implemented in the particle transport simulation code PHITS. The accuracy of this method was tested by comparing the calculated RBE values with corresponding measured data in a water phantom irradiated with an epithermal neutron beam. The calculation technique developed in this study will be useful for biological dose estimation in treatment planning for BNCT. © The Author 2014. Published by Oxford University Press on behalf of The Japan Radiation Research Society and Japanese Society for Radiation Oncology.

  14. Shielding of relativistic protons.

    PubMed

    Bertucci, A; Durante, M; Gialanella, G; Grossi, G; Manti, L; Pugliese, M; Scampoli, P; Mancusi, D; Sihver, L; Rusek, A

    2007-06-01

    Protons are the most abundant component of the galactic cosmic radiation, and their energy spectrum peaks around 1 GeV. Shielding of relativistic protons is therefore a key problem in the radiation protection strategy for crewmembers involved in long-term missions in deep space. Hydrogen ions were accelerated up to 1 GeV at the NASA Space Radiation Laboratory, Brookhaven National Laboratory, New York. The proton beam was also shielded with thick (about 20 g/cm2) blocks of lucite (PMMA) or aluminium (Al). We found that the dose rate was increased 40-60% by the shielding and decreased as a function of the distance along the axis. Simulations using the general-purpose Particle and Heavy-Ion Transport code System (PHITS) show that the dose increase is mostly caused by secondary protons emitted by the target. The modified radiation field behind the shield was characterized for its biological effectiveness by measuring chromosomal aberrations in human peripheral blood lymphocytes exposed just behind the shield block, or to the direct beam, in the dose range 0.5-3 Gy. Notwithstanding the increased dose per incident proton, the fraction of aberrant cells at the same dose in the sample position was not significantly modified by the shield. The PHITS code simulations show that, although secondary protons are slower than the incident nuclei, the LET spectrum is still contained in the low-LET range (<10 keV/µm), which explains the approximately unit value measured for the relative biological effectiveness.

  15. SU-F-T-46: The Effect of Inter-Seed Attenuation and Tissue Composition in Prostate 125I Brachytherapy Dose Calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tamura, K; Araki, F; Ohno, T

    Purpose: To investigate the difference in dose distributions with and without the effects of inter-seed attenuation and tissue composition in prostate 125I brachytherapy dose calculations, using Monte Carlo simulations with the Particle and Heavy Ion Transport code System (PHITS). Methods: The dose distributions in 125I prostate brachytherapy were calculated using PHITS for non-simultaneous and simultaneous alignments of STM1251 sources in a water or prostate phantom for six patients. The PHITS input file was created from the DICOM-RT file, which includes the source coordinates and the structures for the clinical target volume (CTV) and the organs at risk (OARs) of urethra and rectum, using in-house Matlab software. Photon and electron cutoff energies were set to 1 keV and 100 MeV, respectively. The dose distributions were calculated with the kerma approximation and a voxel size of 1 × 1 × 1 mm3. The number of incident photons was set so that the statistical uncertainty (1σ) was less than 1%. The effect of inter-seed attenuation and prostate tissue composition was evaluated from dose-volume histograms (DVHs) for each structure, by comparison with the AAPM TG-43 dose calculation (which neglects inter-seed attenuation and prostate tissue composition). Results: The dose reduction due to inter-seed attenuation by the source capsules was approximately 2% for the CTV and OARs compared to TG-43. In addition, considering the prostate tissue composition reduced the D90 and V100 of the CTV by 6% and 1%, respectively. Conclusion: The dose reduction due to inter-seed attenuation and tissue composition needs to be considered in prostate 125I brachytherapy dose calculations.

  16. Simulations of MATROSHKA experiments at ISS using PHITS

    NASA Astrophysics Data System (ADS)

    Puchalska, Monika; Sihver, L.; Sato, T.; Berger, T.; Reitz, G.

    Concerns about the biological effects of space radiation are increasing rapidly due to the perspective of long-duration manned missions, both in relation to the International Space Station (ISS) and to manned interplanetary missions to the Moon and Mars in the future. As a preparation for these long-duration space missions it is important to ensure an excellent capability to evaluate the impact of space radiation on human health in order to secure the safety of the astronauts/cosmonauts and minimize their risks. It is therefore necessary to measure the radiation load on the personnel both inside and outside the space vehicles and certify that organ and tissue equivalent doses can be simulated as accurately as possible. In this paper we will present simulations using the three-dimensional Monte Carlo Particle and Heavy Ion Transport code System (PHITS) of long-term dose measurements performed with the ESA-supported experiment MATROSHKA (MTR), which is an anthropomorphic phantom containing over 6000 radiation detectors, mimicking a human head and torso. The MTR experiment, led by the German Aerospace Center (DLR), was launched in January 2004 and has measured the absorbed dose from space radiation both inside and outside the ISS. In this paper preliminary comparisons of measured and calculated dose and organ doses in the MTR located outside the ISS will be presented. The results confirm previous calculations and measurements which indicate that PHITS is a suitable tool for estimation of doses received from cosmic radiation and when performing shielding design studies of spacecraft. Acknowledgement: The research leading to these results has received funding from the European Commission in the frame of the FP7 HAMLET project (Project 218817).

  17. Development of the 3DHZETRN code for space radiation protection

    NASA Astrophysics Data System (ADS)

    Wilson, John; Badavi, Francis; Slaba, Tony; Reddell, Brandon; Bahadori, Amir; Singleterry, Robert

    Space radiation protection requires computationally efficient shield assessment methods that have been verified and validated. The HZETRN code is the engineering design code used for low Earth orbit dosimetric analysis and astronaut record keeping with end-to-end validation to twenty percent in Space Shuttle and International Space Station operations. HZETRN treated diffusive leakage only at the distal surface limiting its application to systems with a large radius of curvature. A revision of HZETRN that included forward and backward diffusion allowed neutron leakage to be evaluated at both the near and distal surfaces. That revision provided a deterministic code of high computational efficiency that was in substantial agreement with Monte Carlo (MC) codes in flat plates (at least to the degree that MC codes agree among themselves). In the present paper, the 3DHZETRN formalism capable of evaluation in general geometry is described. Benchmarking will help quantify uncertainty with MC codes (Geant4, FLUKA, MCNP6, and PHITS) in simple shapes such as spheres within spherical shells and boxes. Connection of the 3DHZETRN to general geometry will be discussed.

  18. Research capacity building integrated into PHIT projects: leveraging research and research funding to build national capacity.

    PubMed

    Hedt-Gauthier, Bethany L; Chilengi, Roma; Jackson, Elizabeth; Michel, Cathy; Napua, Manuel; Odhiambo, Jackline; Bawah, Ayaga

    2017-12-21

    Inadequate research capacity impedes the development of evidence-based health programming in sub-Saharan Africa. However, funding for research capacity building (RCB) is often insufficient and restricted, limiting institutions' ability to address current RCB needs. The Doris Duke Charitable Foundation's African Health Initiative (AHI) funded Population Health Implementation and Training (PHIT) partnership projects in five African countries (Ghana, Mozambique, Rwanda, Tanzania and Zambia) to implement health systems strengthening initiatives inclusive of RCB. Using Cooke's framework for RCB, RCB activity leaders from each country reported on RCB priorities, activities, program metrics, ongoing challenges and solutions. These were synthesized by the authorship team, identifying common challenges and lessons learned. For most countries, each of the RCB domains from Cooke's framework was a high priority. In about half of the countries, domain specific activities happened prior to PHIT. During PHIT, specific RCB activities varied across countries. However, all five countries used AHI funding to improve research administrative support and infrastructure, implement research trainings and support mentorship activities and research dissemination. While outcomes data were not systematically collected, countries reported holding 54 research trainings, forming 56 mentor-mentee relationships, training 201 individuals and awarding 22 PhD and Masters-level scholarships. Over the 5 years, 116 manuscripts were developed. Of the 59 manuscripts published in peer-reviewed journals, 29 had national first authors and 18 had national senior authors. Trainees participated in 99 conferences and projects held 37 forums with policy makers to facilitate research translation into policy. 
All five PHIT projects strongly reported an increase in RCB activities and commended the Doris Duke Charitable Foundation for prioritizing RCB, funding RCB at adequate levels and time frames and for allowing flexibility in funding so that each project could implement activities according to their trainees' needs. As a result, many common challenges for RCB, such as adequate resources and local and international institutional support, were not identified as major challenges for these projects. Overall recommendations are for funders to provide adequate and flexible funding for RCB activities and for institutions to offer a spectrum of RCB activities to enable continued growth, provide adequate mentorship for trainees and systematically monitor RCB activities.

  19. DIVWAG Model Documentation. Volume II. Programmer/Analyst Manual. Part 3. Chapter 7 Through 8.

    DTIC Science & Technology

    1976-07-01

    platoon area is circular. 2. The center of impact of the volley coincides with the center of the circular platoon area. (c) The fraction (PHIT) of rounds of...the volley expected to fall within the platoon area then is calculated as: PHIT = 1 - exp(-APLAT/2πr²), where APLAT is the area (in square meters...type located in the platoon area. This is accomplished as follows: CASi = Ni * [1 - exp(-PHIT*LAi*NOR)] (IV-8-11) where CASi = number of losses
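The two formulas in this excerpt, the fraction of a volley expected to fall in the circular platoon area and the resulting expected losses, can be sketched directly. The input values below are invented, and interpreting LAi as a per-round lethality factor and NOR as the number of rounds is an assumption on my part, not something the excerpt states.

```python
import math

# DIVWAG artillery assessment, as excerpted above:
#   PHIT = 1 - exp(-APLAT / (2*pi*r^2))          fraction of rounds in the area
#   CASi = Ni * (1 - exp(-PHIT * LAi * NOR))     expected losses of type i
def fraction_in_area(aplat_m2, r_m):
    return 1.0 - math.exp(-aplat_m2 / (2.0 * math.pi * r_m ** 2))

def expected_losses(n_i, phit, la_i, nor):
    return n_i * (1.0 - math.exp(-phit * la_i * nor))

# Invented values: a 5000 m^2 platoon area, 40 m dispersion radius,
# 30 personnel of type i, lethality factor 0.02, 12 rounds in the volley.
phit = fraction_in_area(5000.0, 40.0)
cas = expected_losses(30, phit, 0.02, 12)
```

Both expressions share the same exponential-saturation shape: losses and coverage grow with the argument but can never exceed the available total, which is why the model applies them in sequence.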

  20. Preliminary Design of an Alternative Fuels Combined Cycle Propulsion Plant for Naval Ship Applications.

    DTIC Science & Technology

    1981-06-01

    SIGMA Y- 2 .S tE D RTzSQRT(M*V/( PHIT *UTP*(..-VA**92))) C.2=-1*THETA*RHCM*IUT’**2/2. C3-2.*SIGMAY/(SF*ALPHA) H(VB-V) /20 SUMMO.0 X1=o.0 0010JUl,21 MNCl...Yl) /(l.+Yl) )CHT. 4*3*~)/P:*v7225!*.t2 *(3.*Yl)*( PHIT *,+.49*YlW*2)Wr( PHIT *n+7225*Yl*w2)*w (-2)) 0D0- -2. *C2*Y I IP(Xl .EQ. 0.0) GO TO 50 SUM

  1. MEMS PolyMUMPS-Based Miniature Microphone for Directional Sound Sensing

    DTIC Science & Technology

    2007-09-01

    of the translating mode Phir=-atan((2*wr*er*w)/(wr^2-w^2));% Phase constant rocking Phit =-atan((2*wt*et*w)/(wt^2-w^2));% Phase constant translating...2.5e-6)+1 Yl(count)=8e6*(At*sin(w.*t(count)+ Phit ) + Ar*cos(w.*t(count)+Phir)); %left membrane displacement as a function of time in micrometers...Xl(count)=-(((.5)^2-Yl(count).^2).^.5); Yr(count)=8e6*(At*sin(w.*t(count)+ Phit ) - Ar*cos(w.*t(count)+Phir)); %right membrane displacement

  2. International Infantry and Joint Services Small Arms Systems Symposium, Exhibition and Firing Demonstration

    DTIC Science & Technology

    2009-05-21

    range performance and PHit /PKill is essential ● System prepared for additional functionalities as technology matures Aimpoint BR8 – NATO Demo Aimpoint...have been delivered to FMV (Swedish Defence Materiel Administration) ● Demonstrated for NATO in Toledo 2007- 02-15: > 65% PHit at 1.2x1.2m targets...from 100 to 250m! ● Demonstrated in Sweden 2008-10-01: > 80% PHit at different targets from 120 to 150m! ● 100 units ordered in May 2009 for use on

  3. C6 GPMG and 40 mm AGL Weapon Integrated on RWS Mounted on TAPV Platform: Probability of Hit Methodology

    DTIC Science & Technology

    2010-09-01

    nationale, 2010 DRDC Valcartier CR 2010-237 i Abstract …….. A probability of hit ( PHit ) methodology has been developed to characterize the...CFB (Canadian Forces Base). Résumé …..... Une méthodologie de probabilité d’impact ( PHit ) a été développée pour caractériser la performance globale...the crew commander and gunner from their respective crew stations inside the vehicle. A probability of hit ( PHit ) methodology has been developed to

  4. Inherent Error in Asynchronous Digital Flight Controls.

    DTIC Science & Technology

    1980-02-01

    IMAXJ DIIMENSION AP(2,2) PBIP(2v2),CP(lv2) ,FC(2,2) iGC(2,1) , PHIT ’(4,4), I HC(2v2)vEC(2w1)wPHITI(4,4,l01),PSITI(4,4),PHTAU(4,4) ,PSTAU(4,4),I 4 INJ3EX(4...1.0 4 02. CON’T I NUE [𔃺 4 11 :: 2 7NT TI =T14-E:’LTA PHITl(2yJ1,11) =0.0 4 FHIT1(2y2vlJ.) EXP(-10.*TI) [DO 400 11 = lNP DO0 400 JJ = INP 400 PHIT (IlJJ...PHIT1(IIr,J.JYN’T) WRITE(6y860) 860 FORMAT (5X Y’PH IT’) [DO 861 1 -IYiNP 1361 WRITE(6v802) ( PHIT (I9vJ) ,J:=1,NP) DO :1800 KK2 1,J.6 IAU N1

  5. Computational investigation of suitable polymer gel composition for the QA of the beam components of a BNCT irradiation field.

    PubMed

    Tanaka, Kenichi; Sakurai, Yoshinori; Hayashi, Shin-Ichiro; Kajimoto, Tsuyoshi; Uchida, Ryohei; Tanaka, Hiroki; Takata, Takushi; Bengua, Gerard; Endo, Satoru

    2017-09-01

    This study investigated the optimum composition of the MAGAT polymer gel to be used in quality assurance measurements of the thermal neutron, fast neutron and gamma-ray components of the irradiation field used for boron neutron capture therapy at the Kyoto University Reactor. Simulations using the PHITS code showed that 6Li concentrations of 0, 10 and 100 ppm in the gel are potentially usable. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Experimental approach to measure thick target neutron yields induced by heavy ions for shielding

    NASA Astrophysics Data System (ADS)

    Trinh, N. D.; Fadil, M.; Lewitowicz, M.; Brouillard, C.; Clerc, T.; Damoy, S.; Desmezières, V.; Dessay, E.; Dupuis, M.; Grinyer, G. F.; Grinyer, J.; Jacquot, B.; Ledoux, X.; Madeline, A.; Menard, N.; Michel, M.; Morel, V.; Porée, F.; Rannou, B.; Savalle, A.

    2017-09-01

    Double differential (angular and energy) neutron distributions were measured using an activation foil technique. Reactions were induced by impinging two low-energy heavy-ion beams accelerated with the GANIL CSS1 cyclotron: (36S (12 MeV/u) and 208Pb (6.25 MeV/u)) onto thick natCu targets. Results have been compared to Monte-Carlo calculations from two codes (PHITS and FLUKA) for the purpose of benchmarking radiation protection and shielding requirements. This comparison suggests a disagreement between calculations and experiment, particularly for high-energy neutrons.

  7. Neutron density profile in the lunar subsurface produced by galactic cosmic rays

    NASA Astrophysics Data System (ADS)

    Ota, Shuya; Sihver, Lembit; Kobayashi, Shingo; Hasebe, Nobuyuki

    Neutron production by galactic cosmic rays (GCR) in the lunar subsurface is very important for lunar and planetary nuclear spectroscopy and space dosimetry. Further improvements in the accuracy with which this production can be estimated are therefore required. GCR, the main contributor to neutron production in the lunar subsurface, consists not only of protons but also of heavy components such as He, C, N, O, and Fe, so it is important to estimate the neutron production from these components precisely. The neutron production from GCR particles, including heavy components, in the lunar subsurface was therefore simulated with the Particle and Heavy Ion Transport code System (PHITS), using several heavy-ion interaction models. This work presents PHITS simulations of the neutron density as a function of depth (neutron density profile) in the lunar subsurface, and the results are compared with experimental data obtained by the Apollo 17 Lunar Neutron Probe Experiment (LNPE). Our previous study found that the accuracy of the proton-induced neutron production models is the most influential factor in precise calculations of neutron production in the lunar subsurface. Therefore, the proton-induced neutron production models were benchmarked against experimental data to estimate and improve the precision of the calculations. The calculated neutron production using the best model combination, Cugnon Old (E < 3 GeV) and JAM (E > 3 GeV), gave up to 30% higher values than the experimental results. A high-energy nuclear data file (JENDL-HE) was therefore used instead of the Cugnon Old model at energies below 3 GeV. The calculated neutron density profile then reproduced the experimental data from the LNPE within the experimental errors of 15% (measurement) + 30% (systematic). In this presentation, we summarize and discuss our calculated results of neutron production in the lunar subsurface.

  8. GEANT4 and PHITS simulations of the shielding of neutrons from the 252Cf source

    NASA Astrophysics Data System (ADS)

    Shin, Jae Won; Hong, Seung-Woo; Bak, Sang-In; Kim, Do Yoon; Kim, Chong Yeal

    2014-09-01

    Monte Carlo simulations are performed using GEANT4 and PHITS to study the neutron-shielding abilities of several materials: graphite, iron, polyethylene, NS-4-FR and KRAFTON-HB. A 252Cf neutron source is considered. For the GEANT4 simulations, high-precision (G4HP) models with G4NDL 4.2, based on ENDF/B-VII data, are used; for the PHITS simulations, the JENDL-4.0 library is used. The neutron-dose-equivalent rates with and without the five shielding materials are estimated and compared with experimental values. The differences between the shielding abilities calculated with GEANT4 (G4NDL 4.2) and PHITS (JENDL-4.0) are not significant for any of the cases considered in this work. The neutron-dose-equivalent rates obtained with GEANT4 and PHITS agree with the experimental rates within 20%, except for polyethylene, for which the discrepancies between our calculations and the experiments are less than 40%, as also observed in other simulation results.

  9. Pion and electromagnetic contribution to dose: Comparisons of HZETRN to Monte Carlo results and ISS data

    NASA Astrophysics Data System (ADS)

    Slaba, Tony C.; Blattnig, Steve R.; Reddell, Brandon; Bahadori, Amir; Norman, Ryan B.; Badavi, Francis F.

    2013-07-01

    Recent work has indicated that pion production and the associated electromagnetic (EM) cascade may be an important contribution to the total astronaut exposure in space. Recent extensions to the deterministic space radiation transport code, HZETRN, allow the production and transport of pions, muons, electrons, positrons, and photons. In this paper, the extended code is compared to the Monte Carlo codes, Geant4, PHITS, and FLUKA, in slab geometries exposed to galactic cosmic ray (GCR) boundary conditions. While improvements in the HZETRN transport formalism for the new particles are needed, it is shown that reasonable agreement on dose is found at larger shielding thicknesses commonly found on the International Space Station (ISS). Finally, the extended code is compared to ISS data on a minute-by-minute basis over a seven day period in 2001. The impact of pion/EM production on exposure estimates and validation results is clearly shown. The Badhwar-O'Neill (BO) 2004 and 2010 models are used to generate the GCR boundary condition at each time-step allowing the impact of environmental model improvements on validation results to be quantified as well. It is found that the updated BO2010 model noticeably reduces overall exposure estimates from the BO2004 model, and the additional production mechanisms in HZETRN provide some compensation. It is shown that the overestimates provided by the BO2004 GCR model in previous validation studies led to deflated uncertainty estimates for environmental, physics, and transport models, and allowed an important physical interaction (π/EM) to be overlooked in model development. Despite the additional π/EM production mechanisms in HZETRN, a systematic under-prediction of total dose is observed in comparison to Monte Carlo results and measured data.

  10. Circular Probable Error for Circular and Noncircular Gaussian Impacts

    DTIC Science & Technology

    2012-09-01

    1M simulated impacts ph(k)=mean(imp(:,1).^2+imp(:,2).^2<=CEP^2); % hit frequency on CEP end phit(j)=mean(ph...avg 100 hit frequencies to "incr n" end % GRAPHICS plot(i,phit,'r-'); % error exponent versus Ph estimate
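The excerpted MATLAB estimates the hit frequency inside the CEP circle by Monte Carlo. A self-contained Python sketch of the same idea; the sample count, sigmas, and seed below are illustrative assumptions, not values from the report:

```python
import random

def estimate_phit(cep, sigma_x, sigma_y, n=100_000, seed=1):
    """Estimate P(hit) inside a circle of radius `cep` for zero-mean
    Gaussian impacts (possibly noncircular: sigma_x != sigma_y)."""
    rng = random.Random(seed)
    hits = sum(
        1
        for _ in range(n)
        if rng.gauss(0.0, sigma_x) ** 2 + rng.gauss(0.0, sigma_y) ** 2 <= cep ** 2
    )
    return hits / n

# For a circular Gaussian, CEP = sigma * sqrt(2 ln 2) ≈ 1.1774 sigma,
# so the hit fraction inside the CEP circle should be close to 0.5.
print(estimate_phit(1.1774, 1.0, 1.0))  # ≈ 0.5
```

For noncircular impacts (sigma_x != sigma_y) the same estimator applies, which is the case the report's title addresses.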

  11. Organic Over-the-Horizon Targeting for the 2025 Surface Fleet

    DTIC Science & Technology

    2015-06-01

    Detection Phit Probability of Hit Pk Probability of Kill PLAN People's Liberation Army Navy PMEL Pacific Marine Environmental Laboratory...probability of hit (Phit). 2. Top-Level Functional Flow Block Diagram With the high-level functions of the project's systems of systems properly

  12. Warship Combat System Selection Methodology Based on Discrete Event Simulation

    DTIC Science & Technology

    2010-09-01

    Platform (from Spanish) PD Damage Probability xiv PHit Hit Probability PKill Kill Probability RSM Response Surface Model SAM Surface-Air Missile...such a large target allows an assumption that the probability of a hit (PHit) is one. This structure can be considered as a bridge; therefore, the

  13. 42nd Annual Armament Systems: Gun and Missile Systems

    DTIC Science & Technology

    2007-04-26

    to compare the various contenders: • FCS • Aero and flight dynamics of rounds • Phit and lethality • Direct and indirect fire capability Defence R&D...each other). • Guidance: Unguided, Command Guidance, Lock on Before Launch, Autonomous (needs Phit analysis). • Fuzing: Proximity – RF or Optical

  14. Study on detecting spatial distribution of neutrons and gamma rays using a multi-imaging plate system.

    PubMed

    Tanaka, Kenichi; Sakurai, Yoshinori; Endo, Satoru; Takada, Jun

    2014-06-01

    In order to measure the spatial distributions of neutrons and gamma rays separately using an imaging plate, the converter requirements for enhancing each specific component were investigated with the PHITS code. Enhancing fast neutrons using recoil protons from epoxy resin was not effective, owing to the high sensitivity of the imaging plate to gamma rays. However, an epoxy-resin converter doped with (10)B was found to have potential for thermal and epithermal neutrons, and graphite for gamma rays. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Target depth dependence of damage rate in metals by 150 MeV proton irradiation

    NASA Astrophysics Data System (ADS)

    Yoshiie, T.; Ishi, Y.; Kuriyama, Y.; Mori, Y.; Sato, K.; Uesugi, T.; Xu, Q.

    2015-01-01

    A series of irradiation experiments with 150 MeV protons was performed. The relationship between target depth (or shield thickness) and displacement damage during proton irradiation was obtained by in situ electrical resistance measurements at 20 K. Positron annihilation lifetime measurements were also performed at room temperature after irradiation, as a function of the target thickness. The displacement damage was found to be high close to the beam incident surface area, and decreased with increasing target depth. The experimental results were compared with damage production calculated with an advanced Monte Carlo particle transport code system (PHITS).

  16. Comparison with simulations to experimental data for photo-neutron reactions using SPring-8 Injector

    NASA Astrophysics Data System (ADS)

    Asano, Yoshihiro

    2017-09-01

    Simulations of photo-nuclear reactions using the Monte Carlo codes PHITS and FLUKA have been performed for comparison with data measured at the SPring-8 injector with 250 MeV and 961 MeV electrons. Measured production of bismuth-206 at the beam dumps, due to the photo-nuclear reaction 209Bi(γ,3n)206Bi and the high-energy neutron reaction 209Bi(n,4n)206Bi, has been compared with the simulations. Neutron leakage spectra outside the shield wall are also compared between experiments and simulations.

  17. Displacement damage calculations in PHITS for copper irradiated with charged particles and neutrons

    NASA Astrophysics Data System (ADS)

    Iwamoto, Yosuke; Niita, Koji; Sawai, Tomotsugu; Ronningen, R. M.; Baumann, Thomas

    2013-05-01

    The radiation damage model in the Particle and Heavy Ion Transport code System (PHITS) uses screened Coulomb scattering to evaluate the energy of the target primary knock-on atom (PKA) created by the projectile and by the "secondary particles," which include all particles created in the sequential nuclear reactions. We investigated the effect of nuclear reactions on displacement-per-atom (DPA) values for a copper target in the following cases: (1) 14 and 200 MeV proton incidence, (2) 14 and 200 MeV/nucleon 48Ca incidence, and (3) 14 and 200 MeV and reactor neutron incidence. For proton incidence, the fraction of the total DPA created directly by protons decreased with incident proton energy, while the fraction created by secondary particles increased. For 48Ca beams, DPA created by 48Ca is dominant over the 48Ca range. For 14 and 200 MeV neutron incidence, the fraction of DPA created by secondary particles increases with incident neutron energy. For reactor neutrons, copper recoils created by neutron-copper nuclear elastic scattering contribute to the total DPA. These results indicate that including both nuclear reactions and Coulomb scattering is necessary for DPA estimation over the wide energy range from eV to GeV.
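The abstract reports DPA trends but not the underlying displacement formula. As background, a standard estimate of the number of Frenkel pairs produced by a PKA of given damage energy is the Norgett-Robinson-Torrens (NRT) model, sketched below. PHITS's full DPA treatment additionally folds in Coulomb scattering and nuclear-reaction kinematics, which this snippet does not reproduce; the threshold value of 30 eV is the commonly used figure for copper.

```python
def nrt_displacements(t_dam_ev, e_d_ev=30.0):
    """Norgett-Robinson-Torrens (NRT) estimate of the number of
    Frenkel pairs produced by a PKA with damage energy t_dam_ev (eV).
    e_d_ev is the displacement threshold energy (~30 eV for Cu)."""
    if t_dam_ev < e_d_ev:
        return 0.0                       # too little energy: no displacement
    if t_dam_ev < 2.0 * e_d_ev / 0.8:
        return 1.0                       # a single displacement
    return 0.8 * t_dam_ev / (2.0 * e_d_ev)  # displacement cascade

print(nrt_displacements(20.0))    # below threshold -> 0.0
print(nrt_displacements(50.0))    # single displacement -> 1.0
print(nrt_displacements(6000.0))  # 0.8*6000/60 = 80.0
```

DPA then follows by summing NRT displacements over the PKA spectrum and dividing by the number of target atoms.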

  18. Analytical Model for Estimating the Zenith Angle Dependence of Terrestrial Cosmic Ray Fluxes

    PubMed Central

    Sato, Tatsuhiko

    2016-01-01

    A new model called “PHITS-based Analytical Radiation Model in the Atmosphere (PARMA) version 4.0” was developed to facilitate instantaneous estimation of not only omnidirectional but also angular differential energy spectra of cosmic ray fluxes anywhere in Earth’s atmosphere at nearly any given time. It consists of its previous version, PARMA3.0, for calculating the omnidirectional fluxes and several mathematical functions proposed in this study for expressing their zenith-angle dependences. The numerical values of the parameters used in these functions were fitted to reproduce the results of the extensive air shower simulation performed by Particle and Heavy Ion Transport code System (PHITS). The angular distributions of ground-level muons at large zenith angles were specially determined by introducing an optional function developed on the basis of experimental data. The accuracy of PARMA4.0 was closely verified using multiple sets of experimental data obtained under various global conditions. This extension enlarges the model’s applicability to more areas of research, including design of cosmic-ray detectors, muon radiography, soil moisture monitoring, and cosmic-ray shielding calculation. PARMA4.0 is available freely and is easy to use, as implemented in the open-access EXcel-based Program for Calculating Atmospheric Cosmic-ray Spectrum (EXPACS). PMID:27490175
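PARMA4.0's fitted zenith-angle functions are not reproduced in the abstract. Purely as an illustration of the kind of dependence being parameterized, the textbook cos^n approximation for ground-level muon intensity can be sketched; the exponent n ≈ 2 and the vertical intensity used below are generic textbook assumptions, not PARMA parameters:

```python
import math

def muon_flux_zenith(i_vertical, theta_rad, n=2.0):
    """Textbook cos^n approximation for the zenith-angle dependence of
    the ground-level muon flux (n ~ 2 near 1 GeV). Illustration only;
    PARMA4.0 uses its own fitted functions, especially at large angles."""
    return i_vertical * math.cos(theta_rad) ** n

print(muon_flux_zenith(70.0, 0.0))                    # vertical intensity
print(round(muon_flux_zenith(70.0, math.pi / 3), 2))  # 60 deg: 70*0.25 = 17.5
```

The abstract notes that the simple power-law picture breaks down at large zenith angles, which is why PARMA4.0 introduces an optional function fitted to experimental muon data there.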

  19. Analytical Model for Estimating the Zenith Angle Dependence of Terrestrial Cosmic Ray Fluxes.

    PubMed

    Sato, Tatsuhiko

    2016-01-01

    A new model called "PHITS-based Analytical Radiation Model in the Atmosphere (PARMA) version 4.0" was developed to facilitate instantaneous estimation of not only omnidirectional but also angular differential energy spectra of cosmic ray fluxes anywhere in Earth's atmosphere at nearly any given time. It consists of its previous version, PARMA3.0, for calculating the omnidirectional fluxes and several mathematical functions proposed in this study for expressing their zenith-angle dependences. The numerical values of the parameters used in these functions were fitted to reproduce the results of the extensive air shower simulation performed by Particle and Heavy Ion Transport code System (PHITS). The angular distributions of ground-level muons at large zenith angles were specially determined by introducing an optional function developed on the basis of experimental data. The accuracy of PARMA4.0 was closely verified using multiple sets of experimental data obtained under various global conditions. This extension enlarges the model's applicability to more areas of research, including design of cosmic-ray detectors, muon radiography, soil moisture monitoring, and cosmic-ray shielding calculation. PARMA4.0 is available freely and is easy to use, as implemented in the open-access EXcel-based Program for Calculating Atmospheric Cosmic-ray Spectrum (EXPACS).

  20. Fragmentation Cross Sections of 290 and 400 MeV/nucleon 12C Beamson Elemental Targets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeitlin, C.; Guetersloh, S.; Heilbronn, L.

    Charge-changing and fragment production cross sections at 0° have been obtained for interactions of 290 MeV/nucleon and 400 MeV/nucleon carbon beams with C, CH2, Al, Cu, Sn, and Pb targets. These beams are relevant to cancer therapy, space radiation, and the production of radioactive beams. We compare to previously published results using C and CH2 targets at similar beam energies. Due to ambiguities arising from the presence of multiple fragments on many events, previous publications have reported only cross sections for B and Be fragments. In this work we have extracted cross sections for all fragment species, using data obtained at three distinct values of angular acceptance, supplemented by data taken with the detector stack placed off the beam axis. A simulation of the experiment with the PHITS Monte Carlo code shows fair agreement with the data obtained with the large-acceptance detectors, but agreement is poor at small acceptance. The measured cross sections are also compared to the predictions of the one-dimensional cross-section models EPAX2 and NUCFRG2; the latter is presently used in NASA's space radiation transport calculations. Though PHITS and NUCFRG2 reproduce the charge-changing cross sections with reasonable accuracy, none of the models is able to accurately predict the fragment cross sections for all fragment species and target materials.
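Charge-changing cross sections of the kind discussed above are typically extracted from beam attenuation through a target of known thickness. A minimal sketch of that transmission relation; the beam counts and target parameters below are hypothetical illustrations, not the paper's data:

```python
import math

N_A = 6.022e23  # Avogadro's number (1/mol)

def charge_changing_xs(n_in, n_out, rho_g_cm3, a_g_mol, x_cm):
    """Cross section (cm^2) extracted from transmission through a target
    of thickness x_cm: N_out = N_in * exp(-n_t * sigma * x), where
    n_t = rho * N_A / A is the target atom density (atoms/cm^3)."""
    n_t = rho_g_cm3 * N_A / a_g_mol
    return math.log(n_in / n_out) / (n_t * x_cm)

# Hypothetical counts for a 2 cm carbon target (rho = 1.8 g/cm^3):
sigma = charge_changing_xs(1.0e6, 8.0e5, 1.8, 12.011, 2.0)
print(f"{sigma * 1e24:.2f} barn")  # ≈ 1.24 barn
```

In practice the CH2 and C targets are combined by subtraction to isolate the hydrogen contribution, and corrections for multiple reactions in thick targets are applied.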

  1. Calculation of out-of-field dose distribution in carbon-ion radiotherapy by Monte Carlo simulation.

    PubMed

    Yonai, Shunsuke; Matsufuji, Naruhiro; Namba, Masao

    2012-08-01

    Recent radiotherapy technologies, including carbon-ion radiotherapy, can improve the dose concentration in the target volume, thereby reducing not only side effects in organs at risk but also the secondary cancer risk within or near the irradiation field. However, the secondary cancer risk in the low-dose region is considered non-negligible, especially for younger patients. To achieve a whole-body dose estimate for each patient receiving carbon-ion radiotherapy, which is essential for risk assessment and epidemiological studies, Monte Carlo simulation plays an important role, because the treatment planning system can provide dose distributions only in or near the irradiation field and the measured data are limited. However, validation of Monte Carlo simulations is necessary. The primary purpose of this study was to establish a calculation method using the Monte Carlo code to estimate the dose and quality factor in the body and to validate the proposed method by comparison with experimental data. Furthermore, we show the distributions of dose equivalent in a phantom and identify the partial contribution of each radiation type. We proposed a calculation method based on a Monte Carlo simulation using the PHITS code to estimate absorbed dose, dose equivalent, and dose-averaged quality factor using the Q(L)-L relationship of the ICRP 60 recommendation. The values obtained by this method in modeling the passive beam line at the Heavy-Ion Medical Accelerator in Chiba were compared with our previously measured data. Our calculation model reproduced the measured values within a factor of 2, which reflects not only the uncertainty of the calculation method but also the assumptions of the geometrical modeling and of the PHITS code. We also showed the differences in doses and the partial contributions of each radiation type between passive and active carbon-ion beams using this calculation method. These results indicate that it is essential to include the dose from secondary neutrons in assessing the secondary cancer risk of patients receiving carbon-ion radiotherapy with active as well as passive beams. We established a calculation method with a Monte Carlo simulation to estimate the distribution of dose equivalent in the body as a first step toward routine risk assessment and an epidemiological study of carbon-ion radiotherapy at NIRS. This method has the advantage of being verifiable by measurement.
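The Q(L)-L relationship of ICRP 60 referenced above is a piecewise function of the unrestricted LET in water, L (keV/µm). A direct transcription:

```python
import math

def q_icrp60(l_kev_um):
    """ICRP 60 quality factor Q as a function of unrestricted LET
    L (keV/micron) in water."""
    if l_kev_um < 10.0:
        return 1.0
    if l_kev_um <= 100.0:
        return 0.32 * l_kev_um - 2.2
    return 300.0 / math.sqrt(l_kev_um)

print(q_icrp60(5.0))              # low LET -> 1.0
print(q_icrp60(50.0))             # 0.32*50 - 2.2 = 13.8
print(round(q_icrp60(400.0), 1))  # 300/sqrt(400) = 15.0
```

The dose-averaged quality factor in the study is then the dose-weighted mean of Q(L) over the LET distribution scored by the PHITS tallies.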

  2. Comparison of calculated beta- and gamma-ray doses after the Fukushima accident with data from single-grain luminescence retrospective dosimetry of quartz inclusions in a brick sample

    PubMed Central

    Endo, Satoru; Fujii, Keisuke; Kajimoto, Tsuyoshi; Tanaka, Kenichi; Stepanenko, Valeriy; Kolyzhenkov, Timofey; Petukhov, Aleksey; Akhmedova, Umukusum; Bogacheva, Viktoriia

    2018-01-01

    To estimate the beta- and gamma-ray doses in a brick sample taken from Odaka, Minami-Soma City, Fukushima Prefecture, Japan, a Monte Carlo calculation was performed with the Particle and Heavy Ion Transport code System (PHITS). The calculated results were compared with data obtained by single-grain retrospective luminescence dosimetry of quartz inclusions in the brick sample. The calculated result agreed well with the measured data. The dose increase measured at the brick surface was explained by the beta-ray contribution, and the slight slope in the dose profile deeper in the brick was due to the gamma-ray contribution. The skin dose at the sampling site was estimated from the calculated result as 164 mGy over 3 years. PMID:29385528

  3. Comparison of calculated beta- and gamma-ray doses after the Fukushima accident with data from single-grain luminescence retrospective dosimetry of quartz inclusions in a brick sample.

    PubMed

    Endo, Satoru; Fujii, Keisuke; Kajimoto, Tsuyoshi; Tanaka, Kenichi; Stepanenko, Valeriy; Kolyzhenkov, Timofey; Petukhov, Aleksey; Akhmedova, Umukusum; Bogacheva, Viktoriia

    2018-05-01

    To estimate the beta- and gamma-ray doses in a brick sample taken from Odaka, Minami-Soma City, Fukushima Prefecture, Japan, a Monte Carlo calculation was performed with the Particle and Heavy Ion Transport code System (PHITS). The calculated results were compared with data obtained by single-grain retrospective luminescence dosimetry of quartz inclusions in the brick sample. The calculated result agreed well with the measured data. The dose increase measured at the brick surface was explained by the beta-ray contribution, and the slight slope in the dose profile deeper in the brick was due to the gamma-ray contribution. The skin dose at the sampling site was estimated from the calculated result as 164 mGy over 3 years.

  4. Electron linear accelerator production and purification of scandium-47 from titanium dioxide targets.

    PubMed

    Rotsch, David A; Brown, M Alex; Nolen, Jerry A; Brossard, Thomas; Henning, Walter F; Chemerisov, Sergey D; Gromov, Roman G; Greene, John

    2018-01-01

    The photonuclear production of no-carrier-added (NCA) 47Sc from solid natTiO2 and the subsequent chemical processing and purification have been developed. Scandium-47 was produced by the 48Ti(γ,p)47Sc reaction with Bremsstrahlung photons produced from the braking of electrons in a high-Z (W or Ta) convertor. Production yields were simulated with the PHITS code (Particle and Heavy Ion Transport code System) and compared to experimental results. Irradiated TiO2 targets were dissolved in fuming H2SO4 in the presence of Na2SO4, and 47Sc was purified using the commercially available Eichrom DGA resin. Typical 47Sc recovery yields were >90%, with excellent specific activity for small batches (<185 MBq). Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Electron linear accelerator production and purification of scandium-47 from titanium dioxide targets

    DOE PAGES

    Rotsch, David A.; Brown, M. Alex; Nolen, Jerry A.; ...

    2017-11-06

    Here, the photonuclear production of no-carrier-added (NCA) 47Sc from solid natTiO2 and the subsequent chemical processing and purification have been developed. Scandium-47 was produced by the 48Ti(γ,p)47Sc reaction with Bremsstrahlung photons produced from the braking of electrons in a high-Z (W or Ta) convertor. Production yields were simulated with the PHITS code (Particle and Heavy Ion Transport code System) and compared to experimental results. Irradiated TiO2 targets were dissolved in fuming H2SO4 in the presence of Na2SO4, and 47Sc was purified using the commercially available Eichrom DGA resin. Typical 47Sc recovery yields were >90%, with excellent specific activity for small batches (<185 MBq).

  6. Electron linear accelerator production and purification of scandium-47 from titanium dioxide targets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rotsch, David A.; Brown, M. Alex; Nolen, Jerry A.

    Here, the photonuclear production of no-carrier-added (NCA) 47Sc from solid natTiO2 and the subsequent chemical processing and purification have been developed. Scandium-47 was produced by the 48Ti(γ,p)47Sc reaction with Bremsstrahlung photons produced from the braking of electrons in a high-Z (W or Ta) convertor. Production yields were simulated with the PHITS code (Particle and Heavy Ion Transport code System) and compared to experimental results. Irradiated TiO2 targets were dissolved in fuming H2SO4 in the presence of Na2SO4, and 47Sc was purified using the commercially available Eichrom DGA resin. Typical 47Sc recovery yields were >90%, with excellent specific activity for small batches (<185 MBq).

  7. A Model-Based Architecture Approach to Ship Design Linking Capability Needs to System Solutions

    DTIC Science & Technology

    2012-06-01

    NSSM NATO Sea Sparrow Missile RAM Rolling Airframe Missile CIWS Close-In Weapon System 3D Three Dimensional Ps Probability of Survival PHit...example effectiveness model. The primary MOP is the inverse of the probability of taking a hit (1-PHit), which, in this study, will be referred to as

  8. Implementation of tetrahedral-mesh geometry in Monte Carlo radiation transport code PHITS

    NASA Astrophysics Data System (ADS)

    Furuta, Takuya; Sato, Tatsuhiko; Han, Min Cheol; Yeom, Yeon Soo; Kim, Chan Hyeong; Brown, Justin L.; Bolch, Wesley E.

    2017-06-01

    A new function to treat tetrahedral-mesh geometry was implemented in the Particle and Heavy Ion Transport code System (PHITS). To accelerate the computational speed of the transport process, an original algorithm was introduced that initially prepares decomposition maps for the container box of the tetrahedral-mesh geometry. The computational performance was tested by conducting radiation transport simulations of 100 MeV protons and 1 MeV photons in a water phantom represented by a tetrahedral mesh. The simulation was repeated with a varying number of meshes, and the required computational times were compared with those of the conventional voxel representation. Our results show that the computational costs for each boundary crossing of the region mesh are essentially equivalent for both representations. This study suggests that the tetrahedral-mesh representation offers not only a flexible description of the transport geometry but also improved computational efficiency for radiation transport. Owing to the adaptability of tetrahedrons in both size and shape, dosimetrically equivalent objects can be represented by far fewer tetrahedrons than voxels. Our study additionally included dosimetric calculations using a computational human phantom. A significant acceleration of the computational speed, about 4 times, was confirmed by the adoption of a tetrahedral mesh over the traditional voxel-mesh geometry.
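The decomposition-map algorithm itself is not described in the abstract. The basic geometric primitive any tetrahedral-mesh tracker needs, deciding whether a point lies inside a given tetrahedron, can be sketched with signed volumes; this is a generic illustration, not PHITS's implementation:

```python
def _det3(a, b, c):
    """Determinant of the 3x3 matrix with rows a, b, c."""
    return (a[0] * (b[1] * c[2] - b[2] * c[1])
            - a[1] * (b[0] * c[2] - b[2] * c[0])
            + a[2] * (b[0] * c[1] - b[1] * c[0]))

def signed_volume(p0, p1, p2, p3):
    """Six times the signed volume of tetrahedron (p0, p1, p2, p3)."""
    d1 = [p1[i] - p0[i] for i in range(3)]
    d2 = [p2[i] - p0[i] for i in range(3)]
    d3 = [p3[i] - p0[i] for i in range(3)]
    return _det3(d1, d2, d3)

def point_in_tet(p, v0, v1, v2, v3, eps=1e-12):
    """True if p lies inside (or on) the tetrahedron: each sub-tetrahedron
    formed by replacing one vertex with p must keep the orientation of
    the original tetrahedron."""
    s = signed_volume(v0, v1, v2, v3)
    checks = [signed_volume(p, v1, v2, v3),
              signed_volume(v0, p, v2, v3),
              signed_volume(v0, v1, p, v3),
              signed_volume(v0, v1, v2, p)]
    return all(s * c >= -eps for c in checks)

unit = ([0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1])
print(point_in_tet([0.1, 0.1, 0.1], *unit))  # True
print(point_in_tet([1.0, 1.0, 1.0], *unit))  # False
```

The decomposition maps described in the paper amount to precomputing, for each cell of a coarse grid over the container box, which tetrahedrons it can contain, so that this test is run against only a handful of candidates per step.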

  9. Implementation of tetrahedral-mesh geometry in Monte Carlo radiation transport code PHITS.

    PubMed

    Furuta, Takuya; Sato, Tatsuhiko; Han, Min Cheol; Yeom, Yeon Soo; Kim, Chan Hyeong; Brown, Justin L; Bolch, Wesley E

    2017-06-21

    A new function to treat tetrahedral-mesh geometry was implemented in the Particle and Heavy Ion Transport code System (PHITS). To accelerate the computational speed of the transport process, an original algorithm was introduced that initially prepares decomposition maps for the container box of the tetrahedral-mesh geometry. The computational performance was tested by conducting radiation transport simulations of 100 MeV protons and 1 MeV photons in a water phantom represented by a tetrahedral mesh. The simulation was repeated with a varying number of meshes, and the required computational times were compared with those of the conventional voxel representation. Our results show that the computational costs for each boundary crossing of the region mesh are essentially equivalent for both representations. This study suggests that the tetrahedral-mesh representation offers not only a flexible description of the transport geometry but also improved computational efficiency for radiation transport. Owing to the adaptability of tetrahedrons in both size and shape, dosimetrically equivalent objects can be represented by far fewer tetrahedrons than voxels. Our study additionally included dosimetric calculations using a computational human phantom. A significant acceleration of the computational speed, about 4 times, was confirmed by the adoption of a tetrahedral mesh over the traditional voxel-mesh geometry.

  10. Development of Safety Analysis Code System of Beam Transport and Core for Accelerator Driven System

    NASA Astrophysics Data System (ADS)

    Aizawa, Naoto; Iwasaki, Tomohiko

    2014-06-01

    A safety analysis code system for the beam transport and core of an accelerator-driven system (ADS) has been developed for analyses of beam transients such as changes in the shape and position of the incident beam. The code system consists of a beam transport analysis part and a core analysis part. TRACE 3-D is employed in the beam transport analysis part to calculate the shape and incident position of the beam at the target. In the core analysis part, neutronics, thermo-hydraulics and cladding failure analyses are performed with the ADS dynamic calculation code ADSE, on the basis of an external source database calculated by PHITS and a cross-section database calculated by SRAC, together with programs for cladding failure analysis covering thermoelastic deformation and creep. Using this code system, beam transient analyses were performed for the ADS proposed by the Japan Atomic Energy Agency. As a result, the cladding temperature rises rapidly and plastic deformation occurs within several seconds; the cladding is further evaluated to fail by creep within a hundred seconds. These results show that beam transients can cause cladding failure.

  11. A comparative study of space radiation organ doses and associated cancer risks using PHITS and HZETRN.

    PubMed

    Bahadori, Amir A; Sato, Tatsuhiko; Slaba, Tony C; Shavers, Mark R; Semones, Edward J; Van Baalen, Mary; Bolch, Wesley E

    2013-10-21

    NASA currently uses one-dimensional deterministic transport to generate values of the organ dose equivalent needed to calculate stochastic radiation risk following crew space exposures. In this study, organ absorbed doses and dose equivalents are calculated for 50th percentile male and female astronaut phantoms using both the NASA High Charge and Energy Transport Code to perform one-dimensional deterministic transport and the Particle and Heavy Ion Transport Code System to perform three-dimensional Monte Carlo transport. Two measures of radiation risk, effective dose and risk of exposure-induced death (REID) are calculated using the organ dose equivalents resulting from the two methods of radiation transport. For the space radiation environments and simplified shielding configurations considered, small differences (<8%) in the effective dose and REID are found. However, for the galactic cosmic ray (GCR) boundary condition, compensating errors are observed, indicating that comparisons between the integral measurements of complex radiation environments and code calculations can be misleading. Code-to-code benchmarks allow for the comparison of differential quantities, such as secondary particle differential fluence, to provide insight into differences observed in integral quantities for particular components of the GCR spectrum.
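The effective dose compared above is a tissue-weighted sum of organ dose equivalents, E = Σ_T w_T H_T. The abstract does not state which weighting set was used; the ICRP 60 factors are shown below purely for illustration:

```python
# ICRP Publication 60 tissue weighting factors (they sum to 1.0).
W_T = {
    "gonads": 0.20,
    "red_bone_marrow": 0.12, "colon": 0.12, "lung": 0.12, "stomach": 0.12,
    "bladder": 0.05, "breast": 0.05, "liver": 0.05,
    "oesophagus": 0.05, "thyroid": 0.05,
    "skin": 0.01, "bone_surface": 0.01,
    "remainder": 0.05,
}

def effective_dose(organ_h_sv):
    """E = sum_T w_T * H_T over the tissues present in the input
    (organ dose equivalents H_T in Sv)."""
    return sum(W_T[t] * h for t, h in organ_h_sv.items())

# Uniform whole-body field of 10 mSv -> E = 10 mSv, since weights sum to 1.
print(round(effective_dose({t: 0.010 for t in W_T}), 6))  # ≈ 0.01 Sv
```

Because E collapses many organ dose equivalents into one number, two transport codes can agree on E while disagreeing organ by organ, which is exactly the compensating-error caution the abstract raises.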

  13. Robust Flight Controllers.

    DTIC Science & Technology

    1983-12-01


  14. A personal health information toolkit for health intervention research.

    PubMed

    Kizakevich, Paul N; Eckhoff, Randall; Weger, Stacey; Weeks, Adam; Brown, Janice; Bryant, Stephanie; Bakalov, Vesselina; Zhang, Yuying; Lyden, Jennifer; Spira, James

    2014-01-01

    With the emergence of mobile health (mHealth) apps, there is a growing demand for better tools for developing and evaluating mobile health interventions. Recently we developed the Personal Health Intervention Toolkit (PHIT), a software framework which eases app implementation and facilitates scientific evaluation. PHIT integrates self-report and physiological sensor instruments, evidence-based advisor logic, and self-help interventions such as meditation, health education, and cognitive behavior change. PHIT can be used to facilitate research, interventions for chronic diseases, risky behaviors, sleep, medication adherence, environmental monitoring, momentary data collection, health screening, and clinical decision support. In a series of usability evaluations, participants reported an overall usability score of 4.5 on a 1-5 Likert scale and a score of 85 on the System Usability Scale, corresponding to a high percentile rank of 95%.

  15. Characteristic evaluation of a Lithium-6 loaded neutron coincidence spectrometer.

    PubMed

    Hayashi, M; Kaku, D; Watanabe, Y; Sagara, K

    2007-01-01

    Characteristics of a (6)Li-loaded neutron coincidence spectrometer were investigated through both measurements and Monte Carlo simulations. The spectrometer consists of three (6)Li-glass scintillators embedded in a liquid organic scintillator, BC-501A, and can selectively detect neutrons that deposit their total energy in the BC-501A by means of a coincidence signal generated by the capture of thermalised neutrons in the (6)Li-glass scintillators. The relative efficiency and the energy response were measured using 4.7, 7.2 and 9.0 MeV monoenergetic neutrons and compared with Monte Carlo calculations performed by combining the neutron transport code PHITS with the scintillator response calculation code SCINFUL. The experimental light output spectra agreed well in shape with the calculated ones, and the calculation reproduced the energy dependence of the detection efficiency. The response matrices for 1-10 MeV neutrons were finally obtained.

  16. Analysis of dose-LET distribution in the human body irradiated by high energy hadrons.

    PubMed

    Sato, T; Tsuda, S; Sakamoto, Y; Yamaguchi, Y; Niita, K

    2003-01-01

    For the purposes of radiological protection, it is important to analyse the profiles of the particle field inside a human body irradiated by high energy hadrons, since such hadrons produce a variety of secondary particles that play an important role in the energy deposition process, and to characterise the radiation qualities of those particles. Therefore Monte Carlo calculations were performed to evaluate dose distributions in terms of the linear energy transfer of ionising particles (dose-LET distribution) using a newly developed particle transport code (Particle and Heavy Ion Transport code System, PHITS) for incident neutrons, protons and pions with energies from 100 MeV to 200 GeV. Based on these calculations, it was found that more than 80% and 90% of the total deposited energy is attributable to ionisation by particles with LET below 10 keV/μm for irradiation by neutrons and by the charged particles, respectively.
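
    The 80%/90% figures above are fractions of the total deposited energy coming from particles below a given LET. A small sketch of how such a fraction is extracted from a binned dose-LET distribution; the bin edges and per-bin doses are invented illustrative numbers, not the paper's calculated distributions.

```python
# Fraction of total absorbed dose delivered by particles below a given
# LET threshold, from a binned dose-LET distribution. The numbers are
# made up for illustration.

def dose_fraction_below(let_edges, dose_per_bin, threshold):
    """Dose fraction from LET below `threshold`; a bin straddling the
    threshold contributes proportionally (dose assumed uniform in LET
    within the bin)."""
    below = 0.0
    for lo, hi, d in zip(let_edges, let_edges[1:], dose_per_bin):
        if hi <= threshold:
            below += d
        elif lo < threshold:
            below += d * (threshold - lo) / (hi - lo)
    return below / sum(dose_per_bin)

edges = [0.1, 1.0, 10.0, 100.0, 1000.0]   # LET bin edges, keV/um
dose = [0.50, 0.35, 0.10, 0.05]           # dose per bin, arbitrary units
print(dose_fraction_below(edges, dose, 10.0))   # fraction below 10 keV/um
```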

  17. Calculation of energy-deposition distributions and microdosimetric estimation of the biological effect of a 9C beam.

    PubMed

    Mancusi, Davide; Sihver, Lembit; Niita, Koji; Li, Qiang; Sato, Tatsuhiko; Iwase, Hiroshi; Iwamoto, Yosuke; Matsuda, Norihiro; Sakamoto, Yukio; Nakashima, Hiroshi

    2009-04-01

    Among the alternative beams being recently considered for external cancer radiotherapy, (9)C has received some attention because it is expected that its biological effectiveness could be boosted by the beta-delayed emission of two alpha particles and a proton that takes place at the ion-stopping site. Experiments have been performed to characterise this exotic beam physically and models have been developed to estimate quantitatively its biological effect. Here, the particle and heavy-ion transport code system (PHITS) is used to calculate energy-deposition and linear energy transfer distributions for a (9)C beam in water and the results are compared with published data. Although PHITS fails to reproduce some of the features of the distributions, it suggests that the decay of (9)C contributes negligibly to the energy-deposition distributions, thus contradicting the previous interpretation of the measured data. We have also performed a microdosimetric calculation to estimate the biological effect of the decay, which was found to be negligible; previous microdosimetric Monte-Carlo calculations were found to be incorrect. An analytical argument, of geometrical nature, confirms this conclusion and gives a theoretical upper bound on the additional biological effectiveness of the decay. However, no explanation can be offered at present for the observed difference in the biological effectiveness between (9)C and (12)C; the reproducibility of this surprising result will be verified in coming experiments.

  18. Measurements and PHITS Monte Carlo Estimations of Residual Activities Induced by the 181 MeV Proton Beam in the Injection Area at J-PARC RCS Ring

    NASA Astrophysics Data System (ADS)

    Yamakawa, Emi; Yoshimoto, Masahiro; Kinsho, Michikazu

    At the injection area of the RCS ring of J-PARC, the residual gamma dose at the rectangular ceramic ducts, especially immediately downstream of the charge-exchange foil, has increased with the output beam power. To investigate the cause of the high residual activation, the residual gamma dose and the radioactive sources produced at the exterior surface of the ducts were measured with a GM survey meter and a portable germanium (Ge) semiconductor detector for an injected proton beam energy of 181 MeV. These measurements revealed that radioactive sources produced by nuclear reactions cause the high activation at the injection area. For a better understanding of the phenomena in the injection area, various simulations have been performed with the PHITS Monte Carlo code. The distributions of radioactive sources and residual gamma dose rates obtained from the calculations are consistent with the measurement results, showing that secondary neutrons and protons from nuclear reactions at the charge-exchange foil are the dominant cause of the high residual gamma dose at the ceramic ducts in the injection area. These measurements and calculations are a unique approach to revealing the cause of the high residual dose around the foil. This study is essential for future high-intensity proton accelerators using a stripping foil.

  19. Validation of PHITS Spallation Models from the Perspective of the Shielding Design of Transmutation Experimental Facility

    NASA Astrophysics Data System (ADS)

    Iwamoto, Hiroki; Meigo, Shin-ichiro

    2017-09-01

    The impact of different spallation models implemented in the particle transport code PHITS on the shielding design of the Transmutation Experimental Facility is investigated. For 400-MeV protons incident on a lead-bismuth eutectic target, the effective dose rate at the end of a thick radiation shield (3-m-thick iron and 3-m-thick concrete) calculated by the Liège intranuclear cascade (INC) model version 4.6 coupled with the GEM code (INCL4.6/GEM) is about twice that calculated by the Bertini INC model (Bertini/GEM). A comparison with experimental data for 500-MeV protons incident on a thick lead target suggests that the prediction accuracy of INCL4.6/GEM would be better than that of Bertini/GEM. In contrast, it is found that the dose rates in beam ducts in front of targets calculated by INCL4.6/GEM are lower than those by Bertini/GEM. Since both models underestimate the experimental results for neutron-production double-differential cross sections at 180° for 140-MeV protons incident on carbon, iron, and gold targets, it is concluded that a margin of a factor of two must be allowed for the uncertainty caused by the spallation models when estimating the dose rate induced by neutron streaming through a beam duct.

  20. Analytical Model for Estimating Terrestrial Cosmic Ray Fluxes Nearly Anytime and Anywhere in the World: Extension of PARMA/EXPACS.

    PubMed

    Sato, Tatsuhiko

    2015-01-01

    By extending our previously established model, here we present a new model called "PHITS-based Analytical Radiation Model in the Atmosphere (PARMA) version 3.0," which can instantaneously estimate terrestrial cosmic ray fluxes of neutrons, protons, ions with charge up to 28 (Ni), muons, electrons, positrons, and photons nearly anytime and anywhere in the Earth's atmosphere. The model comprises numerous analytical functions with parameters whose numerical values were fitted to reproduce the results of the extensive air shower (EAS) simulation performed by Particle and Heavy Ion Transport code System (PHITS). The accuracy of the EAS simulation was well verified using various experimental data, while that of PARMA3.0 was confirmed by the high R2 values of the fit. The models to be used for estimating radiation doses due to cosmic ray exposure, cosmic ray induced ionization rates, and count rates of neutron monitors were validated by investigating their capability to reproduce those quantities measured under various conditions. PARMA3.0 is available freely and is easy to use, as implemented in an open-access software program EXcel-based Program for Calculating Atmospheric Cosmic ray Spectrum (EXPACS). Because of these features, the new version of PARMA/EXPACS can be an important tool in various research fields such as geosciences, cosmic ray physics, and radiation research.
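
    PARMA's construction, fitting the parameters of analytical functions to PHITS air-shower results and judging the fit by R², can be sketched in miniature. The exponential depth-attenuation form, the 1% noise level, and every number below are assumptions for illustration, not PARMA's actual parametrisation or fitted values.

```python
import math
import random

# Miniature PARMA-style workflow: fit an assumed analytical form
# flux = a * exp(-depth / lam) to mock "simulation" output by
# log-linear least squares, then judge the fit by R^2.

random.seed(0)
depth = [20.0 * i for i in range(1, 51)]          # atmospheric depth, g/cm^2
flux = [120.0 * math.exp(-d / 160.0) * random.gauss(1.0, 0.01)
        for d in depth]                           # mock simulation, 1% noise

# Log-linear least squares: ln(flux) = ln(a) - depth/lam
ys = [math.log(f) for f in flux]
n = len(depth)
xbar = sum(depth) / n
ybar = sum(ys) / n
slope = (sum((x - xbar) * (y - ybar) for x, y in zip(depth, ys))
         / sum((x - xbar) ** 2 for x in depth))
a_fit = math.exp(ybar - slope * xbar)
lam_fit = -1.0 / slope

# Coefficient of determination in linear flux space
fit = [a_fit * math.exp(-d / lam_fit) for d in depth]
fbar = sum(flux) / n
r2 = 1.0 - (sum((f - g) ** 2 for f, g in zip(flux, fit))
            / sum((f - fbar) ** 2 for f in flux))
print(lam_fit, r2)
```

    With low-noise input the recovered attenuation length sits close to the true 160 g/cm² and R² is close to one, mirroring the "high R² values of the fit" quoted above.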

  2. Neutron production cross sections for (d,n) reactions at 55 MeV

    NASA Astrophysics Data System (ADS)

    Wakasa, T.; Goto, S.; Matsuno, M.; Mitsumoto, S.; Okada, T.; Oshiro, H.; Sakaguchi, S.

    2017-08-01

    The cross sections for (d,n) reactions on targets from natC to 197Au have been measured at a bombarding energy of 55 MeV and a laboratory scattering angle of θ_lab = 9.5°. The angular distributions for the natC(d,n) reaction have also been obtained at θ_lab = 0°-40°. The neutron energy spectra are dominated by deuteron breakup contributions and their peak positions can be reasonably reproduced by considering the Coulomb force effects. The data are compared with the TENDL-2015 nuclear data and Particle and Heavy Ion Transport code System (PHITS) calculations. Both calculations fail to reproduce the measured energy spectra and angular distributions.

  3. Review of the Microdosimetric Studies for High-Energy Charged Particle Beams Using a Tissue-Equivalent Proportional Counter

    NASA Astrophysics Data System (ADS)

    Tsuda, Shuichi; Sato, Tatsuhiko; Ogawa, Tatsuhiko; Sasaki, Shinichi

    Lineal energy (y) distributions were measured for various types of charged particles such as protons and iron, with kinetic energies of up to 500 MeV/u, via the use of a wall-less tissue-equivalent proportional counter (TEPC). Radial dependencies of y distributions were also experimentally evaluated to investigate the track structures of protons, carbon, and iron beams. This paper reviews a series of measured data using the aforementioned TEPC as well as assesses the systematic verification of a microdosimetric calculation model of a y distribution incorporated into the particle and heavy ion transport code system (PHITS) and associated track structure models.

  4. Study on optimization of multiionization-chamber system for BNCT.

    PubMed

    Fujii, T; Tanaka, H; Maruhashi, A; Ono, K; Sakurai, Y

    2011-12-01

    In order to monitor the stability of the doses from the four components, thermal neutrons, epi-thermal neutrons, fast neutrons and gamma rays, during BNCT irradiation, we are developing a multi-ionization-chamber system. The system consists of four kinds of ionization chambers, each with a specific sensitivity to one of the components. Since the suitable structure of each chamber depends on the energy spectrum of the irradiation field, the chamber structures were optimized for the epi-thermal neutron beam of the cyclotron-based epi-thermal neutron source (C-BENS) using the Monte Carlo simulation code PHITS, and suitable chamber structures were determined.

  5. A method for radiological characterization based on fluence conversion coefficients

    NASA Astrophysics Data System (ADS)

    Froeschl, Robert

    2018-06-01

    Radiological characterization of components in accelerator environments is often required to ensure adequate radiation protection during maintenance, transport and handling, as well as for the selection of the proper disposal pathway. The relevant quantities are typically weighted sums of specific activities with radionuclide-specific weighting coefficients. Traditional Monte Carlo methods either score radionuclide creation events directly, or score the particle fluences in the regions of interest and weight them off-line with radionuclide production cross sections. The presented method instead bases the radiological characterization on a set of fluence conversion coefficients. For a given irradiation profile and cool-down time, radionuclide production cross sections, material composition and radionuclide-specific weighting coefficients, a set of particle-type- and energy-dependent fluence conversion coefficients is computed. These fluence conversion coefficients can then be used in a Monte Carlo transport code to perform on-line weighting and directly obtain the desired radiological characterization, either with built-in multiplier features, as in the PHITS code, or with a dedicated user routine, as for the FLUKA code. The presented method has been validated against the standard event-based methods directly available in Monte Carlo transport codes.
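
    The algebra behind the method can be sketched: folding the production cross sections and the radionuclide-specific weights into one set of energy-dependent fluence conversion coefficients makes on-line weighting of the scored fluence identical to the traditional off-line per-radionuclide folding. All numbers below (cross sections, weights, fluences) are invented for illustration.

```python
# Fold radionuclide production cross sections sigma_r(E) and
# radionuclide-specific weighting coefficients w_r into one set of
# fluence conversion coefficients c(E) = sum_r w_r * sigma_r(E).
# Weighting the scored fluence with c(E) on-line then reproduces the
# off-line per-radionuclide folding. All numbers are invented.

XS = {"Na-22": [0.0, 0.3, 0.8],   # sigma_r(E) per energy bin, arb. units
      "Co-60": [0.1, 0.5, 0.2]}
W = {"Na-22": 2.0, "Co-60": 5.0}  # weighting coefficients (taken to already
                                  # include irradiation profile and cool-down)

def conversion_coefficients(xs, w):
    """One coefficient per energy bin: c(E) = sum_r w_r * sigma_r(E)."""
    nbins = len(next(iter(xs.values())))
    return [sum(w[r] * xs[r][e] for r in xs) for e in range(nbins)]

fluence = [4.0, 1.5, 0.25]        # scored fluence per energy bin

coeff = conversion_coefficients(XS, W)
online = sum(c * f for c, f in zip(coeff, fluence))     # on-line weighting
offline = sum(W[r] * sum(s * f for s, f in zip(XS[r], fluence))
              for r in XS)                              # off-line folding
print(online, offline)  # the two weightings agree
```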

  6. An AI approach for scheduling space-station payloads at Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Castillo, D.; Ihrie, D.; Mcdaniel, M.; Tilley, R.

    1987-01-01

    The Payload Processing for Space-Station Operations (PHITS) system is a prototype modeling tool capable of addressing many Space Station related concerns. Its object-oriented design, coupled with a powerful user interface, provides the user with the ability to easily define and model many applications. PHITS differs from many artificial-intelligence-based systems in that it couples scheduling with goal-directed simulation to ensure that on-orbit requirement dates are satisfied.

  7. Space Radiation Transport Code Development: 3DHZETRN

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Slaba, Tony C.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.

    2015-01-01

    The space radiation transport code, HZETRN, has been used extensively for research, vehicle design optimization, risk analysis, and related applications. One of the simplifying features of the HZETRN transport formalism is the straight-ahead approximation, wherein all particles are assumed to travel along a common axis. This reduces the governing equation to one spatial dimension, allowing enormous simplification and highly efficient computational procedures to be implemented. Despite the physical simplifications, the HZETRN code is widely used for space applications and has been found to agree well with fully 3D Monte Carlo simulations in many circumstances. Recent work has focused on the development of 3D transport corrections for neutrons and light ions (Z ≤ 2), for which the straight-ahead approximation is known to be less accurate. Within the development of 3D corrections, well-defined convergence criteria have been considered, allowing approximation errors at each stage in model development to be quantified. The present level of development assumes the neutron cross sections have an isotropic component treated within N explicit angular directions and a forward component represented by the straight-ahead approximation. The N = 1 solution refers to the straight-ahead treatment, while N = 2 represents the bi-directional model in current use for engineering design. The figure below shows neutrons, protons, and alphas for various values of N at locations in an aluminum sphere exposed to a solar particle event (SPE) spectrum. The neutron fluence converges quickly in simple geometry with N > 14 directions. The improved code, 3DHZETRN, transports neutrons, light ions, and heavy ions under space-like boundary conditions through general geometry while maintaining a high degree of computational efficiency.
A brief overview of the 3D transport formalism for neutrons and light ions is given, and extensive benchmarking results with the Monte Carlo codes Geant4, FLUKA, and PHITS are provided for a variety of boundary conditions and geometries. Improvements provided by the 3D corrections are made clear in the comparisons. Developments needed to connect 3DHZETRN to vehicle design and optimization studies will be discussed. Future theoretical development will relax the forward plus isotropic interaction assumption to more general angular dependence.

  8. Data Report for an Extensive Store Separation Test Program Conducted at Supersonic Speeds.

    DTIC Science & Technology

    1979-12-01


  9. An open, interoperable, and scalable prehospital information technology network architecture.

    PubMed

    Landman, Adam B; Rokos, Ivan C; Burns, Kevin; Van Gelder, Carin M; Fisher, Roger M; Dunford, James V; Cone, David C; Bogucki, Sandy

    2011-01-01

    Some of the most intractable challenges in prehospital medicine include response time optimization, inefficiencies at the emergency medical services (EMS)-emergency department (ED) interface, and the ability to correlate field interventions with patient outcomes. Information technology (IT) can address these and other concerns by ensuring that system and patient information is received when and where it is needed, is fully integrated with prior and subsequent patient information, and is securely archived. Some EMS agencies have begun adopting information technologies, such as wireless transmission of 12-lead electrocardiograms, but few agencies have developed a comprehensive plan for management of their prehospital information and integration with other electronic medical records. This perspective article highlights the challenges and limitations of integrating IT elements without a strategic plan, and proposes an open, interoperable, and scalable prehospital information technology (PHIT) architecture. The two core components of this PHIT architecture are 1) routers with broadband network connectivity to share data between ambulance devices and EMS system information services and 2) an electronic patient care report to organize and archive all electronic prehospital data. To successfully implement this comprehensive PHIT architecture, data and technology requirements must be based on best available evidence, and the system must adhere to health data standards as well as privacy and security regulations. Recent federal legislation prioritizing health information technology may position federal agencies to help design and fund PHIT architectures.

  10. SU-F-T-376: The Efficiency of Calculating Photonuclear Reaction On High-Energy Photon Therapy by Monte Carlo Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hirayama, S; Fujibuchi, T

    Purpose: Secondary neutrons, which are harmful to the human body, are generated by photonuclear reactions in high-energy photon therapy. Their characteristics are not known in detail, since the calculations needed to evaluate them take a very long time. Since version 2.80, the PHITS (Particle and Heavy Ion Transport code System) Monte Carlo code has had a new parameter, "pnimul", which forcibly raises the probability of photonuclear reactions in order to make the calculation more efficient. We investigated the optimum value of "pnimul" for high-energy photon therapy. Methods: An accelerator head geometry based on the specification of a Varian Clinac 21EX was used with PHITS ver. 2.80. A phantom (30 cm × 30 cm × 30 cm) filled with the composition defined by the ICRU (International Commission on Radiation Units) was placed at a source-surface distance of 100 cm. We calculated the neutron energy spectra at the surface of the ICRU phantom with "pnimul" set to 1, 10, 100, 1000 and 10000, and compared the total calculation times and the photon behavior using the PDD (percentage depth dose) and OCR (off-center ratio). Next, cutoff energies of 4, 5, 6 and 7 MeV for photons, electrons and positrons were investigated for their effect on calculation efficiency. Results: The total calculation time needed to bring the statistical errors of the neutron fluence within 1% decreased with increasing "pnimul". The PDD and OCR showed no differences due to the parameter. The calculation time decreased with increasing cutoff energy; however, the time needed to bring the photon errors within 1% did not decrease with the cutoff energy. Conclusion: The optimum values of "pnimul" and of the cutoff energy were investigated for high-energy photon therapy. The results suggest that using the optimum "pnimul" improves calculation efficiency; the effect of the cutoff energy needs further investigation.
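
    A parameter that forcibly raises a rare interaction probability is an interaction-forcing (importance-multiplier) scheme: multiply the probability by m and divide the weight of each interacting particle by m, and the estimator stays unbiased while far more histories contribute. The toy below illustrates that generic principle only; it is not PHITS's internal algorithm, and p, m and n are arbitrary.

```python
import random

# Interaction forcing with weight correction: bias a rare interaction
# probability p upward by a factor m and score each interaction with
# weight 1/m. The expected score per history is m*p * (1/m) = p either
# way, so the estimator is unbiased, but the forced run has far more
# scoring events and hence much lower variance.

def estimate(p, m, n, seed=1):
    """Mean weighted score over n histories with biased probability m*p
    (must stay <= 1) and per-event weight 1/m."""
    rng = random.Random(seed)
    return sum(1.0 / m for _ in range(n) if rng.random() < m * p) / n

p, n = 1e-4, 200_000
analog = estimate(p, 1, n)      # unbiased but only ~20 events: noisy
forced = estimate(p, 1000, n)   # ~20000 weighted events: low variance
print(analog, forced)           # both estimate p = 1e-4
```

    The forced estimate clusters tightly around p while the analog one fluctuates strongly run to run, which is why such a multiplier shortens the time to reach a 1% statistical error.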

  11. Application of new nuclear de-excitation model of PHITS for prediction of isomer yield and prompt gamma-ray production

    NASA Astrophysics Data System (ADS)

    Ogawa, Tatsuhiko; Hashimoto, Shintaro; Sato, Tatsuhiko; Niita, Koji

    2014-06-01

    A new nuclear de-excitation model, intended for accurate simulation of isomeric transition of excited nuclei, was incorporated into PHITS and applied to various situations to clarify the impact of the model. The case studies show that precise treatment of gamma de-excitation and consideration for isomer production are important for various applications such as detector performance prediction, radiation shielding calculations and the estimation of radioactive inventory including isomers.

  12. Portable Holographic Interferometry Testing System: Application to crack patching quality control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heslehurst, R.B.; Baird, J.P.; Williamson, H.M.

    Over recent years, the repair of metallic structures has been improved through the use of patches fabricated from composite materials and adhesively bonded to the damaged area. This technology is termed crack patching, and has been successfully and extensively used by the RAAF and the USAF. However, application of the technology to civilian registered aircraft has had limited success, due to the apparent lack of suitable quality assurance testing methods and the airworthiness regulators' concern over patch adhesion integrity. Holographic interferometry has previously shown the advantage of detecting out-of-plane deformations of the order of the wavelength of light (1 μm). Evidence will be presented that holography is able to detect changes in load path due to debonds and weakened adhesion in an adhesively bonded patch. A Portable Holographic Interferometry Testing System (PHITS), which overcomes the vibration-isolation problem associated with conventional holographic techniques, has been developed. The application of PHITS to crack patching technology now provides a suitable method to verify the integrity of bonded patches in situ.

  13. Measurement and simulation of the cross sections for the production of 148Gd in thin natW and 181Ta targets irradiated with 0.4- to 2.6-GeV protons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Titarenko, Yu. E., E-mail: Yury.Titarenko@itep.ru; Batyaev, V. F.; Titarenko, A. Yu.

    The cross sections for the production of 148Gd in natW and 181Ta targets irradiated by 0.4-, 0.6-, 0.8-, 1.2-, 1.6-, and 2.6-GeV protons at the ITEP accelerator complex have been measured by direct α spectrometry without chemical separation. The experimental data have been compared with the data obtained at other laboratories and with theoretical simulations of the yields on the basis of the BERTINI, ISABEL, CEM03.02, INCL4.2, INCL4.5, CASCADE07, and PHITS codes.

  14. Radiological characteristics of MRI-based VIP polymer gel under carbon beam irradiation

    NASA Astrophysics Data System (ADS)

    Maeyama, T.; Fukunishi, N.; Ishikawa, K. L.; Furuta, T.; Fukasaku, K.; Takagi, S.; Noda, S.; Himeno, R.; Fukuda, S.

    2015-02-01

    We study the radiological characteristics of VIP polymer gel dosimeters under carbon beam irradiation with energies of 135 and 290 AMeV. To evaluate the dose response of VIP polymer gels, the transverse (spin-spin) relaxation rate R2 of the dosimeters, measured by magnetic resonance imaging (MRI), is analysed as a function of linear energy transfer (LET) rather than of penetration depth, as is usually done in previous reports. The LET is evaluated using the particle transport simulation code PHITS. Our results reveal that the dose response decreases with increasing dose-averaged LET, and that the dose response-LET relation also varies with the incident carbon beam energy. The latter can be explained by taking into account the contribution from fragmentation products.
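
    The dose-averaged LET used above is the dose-weighted mean, LET_d = Σ_i D_i L_i / Σ_i D_i. A minimal sketch with invented component doses (not the PHITS-calculated distributions of the paper):

```python
# Dose-averaged LET: the dose-weighted mean
# LET_d = sum_i(D_i * L_i) / sum_i(D_i), where D_i is the dose
# contributed by particles of LET L_i. Component values are invented.

def dose_averaged_let(lets, doses):
    return sum(L * d for L, d in zip(lets, doses)) / sum(doses)

lets = [2.0, 12.0, 80.0]     # keV/um: e.g. primaries and fragments
doses = [0.50, 0.25, 0.25]   # fractional dose of each component
print(dose_averaged_let(lets, doses))  # -> 24.0
```

    Lighter fragmentation products lower the dose-averaged LET at a given depth, which is one way the beam-energy dependence noted above can arise.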

  15. Measurement of activation of helium gas by 238U beam irradiation at about 11 A MeV

    NASA Astrophysics Data System (ADS)

    Akashio, A.; Tanaka, K.; Imao, H.; Uwamino, Y.

    2017-09-01

    A new helium-gas stripper system has been applied at the 11 A MeV uranium beam of the Radioactive Isotope Beam Factory of the RIKEN accelerator facility. Although the gas stripper is important for the heavy-ion accelerator facility, the residual radiation that it generates is a serious problem for maintenance work. The residual dose was evaluated by using three-layered activation samples of aluminium and bismuth. The γ-rays from radionuclides produced by in-flight fission of the 238U beam, and from the chamber material activated by neutrons, were observed with a Ge detector and compared with values calculated with the Monte-Carlo simulation code PHITS.

  16. Double differential neutron spectra generated by the interaction of a 12 MeV/nucleon 36S beam on a thick natCu target

    NASA Astrophysics Data System (ADS)

    Trinh, N. D.; Fadil, M.; Lewitowicz, M.; Ledoux, X.; Laurent, B.; Thomas, J.-C.; Clerc, T.; Desmezières, V.; Dupuis, M.; Madeline, A.; Dessay, E.; Grinyer, G. F.; Grinyer, J.; Menard, N.; Porée, F.; Achouri, L.; Delaunay, F.; Parlog, M.

    2018-07-01

    Double differential neutron spectra (energy, angle) originating from a thick natCu target bombarded by a 12 MeV/nucleon 36S16+ beam were measured by the activation method and the time-of-flight technique at the Grand Accélérateur National d'Ions Lourds (GANIL). A neutron spectrum unfolding algorithm combining the SAND-II iterative method and Monte-Carlo techniques was developed for the analysis of the activation results, which cover a wide range of neutron energies. It was implemented in a graphical user interface program called GanUnfold. The experimental neutron spectra are compared to Monte-Carlo simulations performed using the PHITS and FLUKA codes.
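
    The SAND-II iterative method mentioned above updates each flux group multiplicatively using response-weighted ratios of measured to calculated activities. A toy sketch of that scheme follows; the responses, measured activities and flat starting spectrum are invented, and this is a schematic of the bare iteration, not the GanUnfold program.

```python
import math

# SAND-II-style multiplicative unfolding: each iteration rescales the
# group fluxes by response-weighted log-ratios of measured to
# calculated activities, driving the calculated activities toward the
# measurements.

R = [[1.0, 0.5, 0.1],    # response of reaction i in energy group j
     [0.1, 0.6, 1.2]]
measured = [2.1, 1.4]    # measured activities, arbitrary units
flux = [1.0, 1.0, 1.0]   # initial guess spectrum

for _ in range(200):
    calc = [sum(r * f for r, f in zip(Ri, flux)) for Ri in R]
    log_corr = [0.0] * len(flux)
    wsum = [0.0] * len(flux)
    for Ri, m, c in zip(R, measured, calc):
        for j, r in enumerate(Ri):
            w = r * flux[j] / c        # share of activity i due to group j
            log_corr[j] += w * math.log(m / c)
            wsum[j] += w
    flux = [f * math.exp(lc / ws) for f, lc, ws in zip(flux, log_corr, wsum)]

calc = [sum(r * f for r, f in zip(Ri, flux)) for Ri in R]
print(flux, calc)  # calc converges toward the measured activities
```

    With fewer reactions than flux groups the problem is underdetermined, which is why the full analysis perturbs inputs with Monte-Carlo sampling to estimate the spread of admissible spectra.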

  17. Monte Carlo calculations of positron emitter yields in proton radiotherapy.

    PubMed

    Seravalli, E; Robert, C; Bauer, J; Stichelbaut, F; Kurz, C; Smeets, J; Van Ngoc Ty, C; Schaart, D R; Buvat, I; Parodi, K; Verhaegen, F

    2012-03-21

    Positron emission tomography (PET) is a promising tool for monitoring the three-dimensional dose distribution in charged particle radiotherapy. PET imaging during or shortly after proton treatment is based on the detection of annihilation photons following the β+ decay of radionuclides resulting from nuclear reactions in the irradiated tissue. Therapy monitoring is achieved by comparing the measured spatial distribution of irradiation-induced β+ activity with the predicted distribution based on the treatment plan. The accuracy of the calculated distribution depends on the correctness of the computational models, implemented in the employed Monte Carlo (MC) codes, that describe the interactions of the charged particle beam with matter and the production of radionuclides and secondary particles. However, no well-established theoretical models exist for predicting the nuclear interactions, so phenomenological models are typically used, based on parameters derived from experimental data. Unfortunately, the experimental data presently available are insufficient to validate such phenomenological hadronic interaction models. Hence, a comparison among the models used by the different MC packages is desirable. In this work, starting from a common geometry, we compare the performance of the MCNPX, GATE and PHITS MC codes in predicting the amount and spatial distribution of proton-induced activity at therapeutic energies against the already experimentally validated PET modelling based on the FLUKA MC code. In particular, we show how the amount of β+ emitters produced in tissue-like media depends on the physics model and cross-section data used to describe the proton nuclear interactions, thus calling for future experimental campaigns aimed at supporting improvements of MC modelling for clinical application of PET monitoring.
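    The dependence of the β+ emitter yield on the nuclear cross sections can be illustrated with a simple depth-profile calculation. The Bragg-Kleeman range-energy relation used below is standard; the 12C(p,pn)11C cross-section table and the medium composition are illustrative placeholders, not evaluated data.

```python
import numpy as np

E0 = 150.0                      # incident proton energy (MeV)
alpha, p = 0.0022, 1.77         # Bragg-Kleeman constants for water (cm, MeV)
R = alpha * E0**p               # proton range (~15.6 cm here)

def energy_at_depth(z):
    """Residual proton energy (MeV) at depth z via Bragg-Kleeman."""
    return ((R - z) / alpha) ** (1.0 / p) if z < R else 0.0

# illustrative (p,pn) cross section: ~20 MeV threshold, broad plateau
E_tab   = np.array([0.0, 20.0, 30.0, 50.0, 100.0, 150.0])     # MeV
sig_tab = np.array([0.0, 0.0, 60.0, 90.0, 70.0, 60.0]) * 1e-27  # cm^2

n_C = 8.0e22        # carbon nuclei per cm^3 (illustrative tissue-like value)
fluence = 1.0e9     # protons per cm^2

z = np.linspace(0.0, R, 200)
E_z = np.array([energy_at_depth(zi) for zi in z])
sigma_z = np.interp(E_z, E_tab, sig_tab)
yield_z = fluence * n_C * sigma_z        # 11C nuclei per cm^3 at each depth
```

Because the production reactions have thresholds around 20 MeV, the activity profile falls to zero shortly before the Bragg peak; the distal edge of the β+ activity therefore shifts with the assumed cross-section shape, which is exactly what the code comparison probes.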

  18. The Martian surface radiation environment - a comparison of models and MSL/RAD measurements

    NASA Astrophysics Data System (ADS)

    Matthiä, Daniel; Ehresmann, Bent; Lohf, Henning; Köhler, Jan; Zeitlin, Cary; Appel, Jan; Sato, Tatsuhiko; Slaba, Tony; Martin, Cesar; Berger, Thomas; Boehm, Eckart; Boettcher, Stephan; Brinza, David E.; Burmeister, Soenke; Guo, Jingnan; Hassler, Donald M.; Posner, Arik; Rafkin, Scot C. R.; Reitz, Günther; Wilson, John W.; Wimmer-Schweingruber, Robert F.

    2016-03-01

    Context: The Radiation Assessment Detector (RAD) on the Mars Science Laboratory (MSL) has been measuring the radiation environment on the surface of Mars since August 6th 2012. MSL-RAD is the first instrument to provide detailed information about charged and neutral particle spectra and dose rates on the Martian surface, and one of the primary objectives of the RAD investigation is to help improve and validate current radiation transport models. Aims: Applying different numerical transport models with boundary conditions derived from the MSL-RAD environment, the goals of this work were to provide predictions for the particle spectra and the radiation exposure on the Martian surface complementing the RAD sensitive range and, at the same time, to validate the results against the experimental data where applicable. Such validated models can be used to predict dose rates for future manned missions as well as for performing shield optimization studies. Methods: Several particle transport models (GEANT4, PHITS, HZETRN/OLTARIS) were used to predict the particle flux and the corresponding radiation environment caused by galactic cosmic radiation on Mars. From the calculated particle spectra the dose rates on the surface are estimated. Results: Calculations of particle spectra and dose rates induced by galactic cosmic radiation on the Martian surface are presented. Although good agreement is found in many cases for the different transport codes, GEANT4, PHITS, and HZETRN/OLTARIS, some models still show large, sometimes order-of-magnitude discrepancies in certain particle spectra. We have found that RAD data are helping to make better choices of input parameters and physical models. Elements of these validated models can be applied to more detailed studies of how the radiation environment is influenced by solar modulation, the Martian atmosphere and soil, and changes due to the Martian seasonal pressure cycle. By extending the range of the calculated particle spectra with respect to the experimental data, additional information about the radiation environment is gained, and the contribution of different particle species to the dose is estimated.

  19. Intercomparison measurements with energy deposition spectrometer Liulin and TEPC Hawk at HIMAC, and related calculations with PHITS

    NASA Astrophysics Data System (ADS)

    Ploc, Ondrej; Uchihori, Yukio; Kitamura, H.; Kodaira, S.; Dachev, Tsvetan; Spurny, Frantisek; Jadrnickova, Iva; Mrazova, Zlata; Kubancak, Jan

    Liulin-type detectors are currently used in a wide range of cosmic radiation measurements, e.g. at alpine observatories and onboard aircraft and spacecraft. They provide energy deposition spectra up to 21 MeV; higher energy deposition events are stored in the last (overflow) channel. Their main advantages are portability (about the size of a pack of cigarettes) and the ability to record spectra as a function of time, so they can be used as personal dosimeters. Their well-known limitations are: (i) they are not tissue equivalent, (ii) they can be used as LET spectrometers only under specific conditions (e.g. a broad parallel beam), and (iii) energy deposition events from particles of LET in water above ~35 keV/µm are stored only in the overflow bin, so the spectral information is missing. The tissue equivalent proportional counter (TEPC) Hawk has none of these limitations but, on the other hand, cannot be used as a personal dosimeter because of its large size (a cylinder 16 cm in diameter and 34 cm long). An important fraction of the dose equivalent onboard spacecraft is caused by heavy ions. This contribution presents results from intercomparison measurements with Liulin and Hawk at the Heavy Ion Medical Accelerator in Chiba (HIMAC) and at cyclotron beams, and related calculations with PHITS (Particle and Heavy-ion Transport code System). The following particles/ions and energies were used: protons 70 MeV, He 150 MeV, Ne 400 MeV, C 135 MeV, C 290 MeV, and Fe 500 MeV. LET spectra were calculated with PHITS for both Liulin and Hawk. In the case of Liulin, the dose equivalent was calculated using simulations in which several tissue equivalent materials were used as the active volume instead of the silicon diode. The dose equivalents calculated in this way were compared with those measured with Hawk. LET spectra measured with Liulin and Hawk were compared for each ion at several points behind binary filters along the Bragg curve. Good agreement was observed for some configurations; for the other configurations, the differences were reasonably explained by factors such as the thickness of the stainless-steel TEPC wall and the size of Hawk's active volume.
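    Converting a measured LET spectrum to dose equivalent uses the standard ICRP 60 quality factor Q(L). The Q(L) piecewise definition below is the published one; the example dose-per-LET-bin distribution is illustrative, not a measured Liulin or Hawk spectrum.

```python
import numpy as np

def Q(L):
    """ICRP 60 quality factor vs unrestricted LET in water (keV/um)."""
    L = np.asarray(L, dtype=float)
    return np.where(L < 10, 1.0,
           np.where(L <= 100, 0.32 * L - 2.2, 300.0 / np.sqrt(L)))

# hypothetical dose distribution d(L): absorbed dose (Gy) per LET bin
L_bins = np.array([2.0, 8.0, 20.0, 60.0, 150.0])       # keV/um
d_L    = np.array([0.5, 0.3, 0.1, 0.07, 0.03]) * 1e-3  # Gy

D = d_L.sum()                    # total absorbed dose (Gy)
H = np.sum(Q(L_bins) * d_L)      # dose equivalent (Sv)
print(f"D = {D:.2e} Gy, H = {H:.2e} Sv, mean Q = {H / D:.2f}")
```

The mean quality factor H/D is the quantity that makes the Liulin (silicon, LET-limited) and Hawk (tissue equivalent) results directly comparable.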

  20. Measurement of 100- and 290-MeV/A Carbon Incident Neutron Production Cross Sections for Carbon, Nitrogen and Oxygen

    NASA Astrophysics Data System (ADS)

    Shigyo, N.; Uozumi, U.; Uehara, H.; Nishizawa, T.; Mizuno, T.; Takamiya, M.; Hashiguchi, T.; Satoh, D.; Sanami, T.; Koba, Y.; Takada, M.; Matsufuji, N.

    2014-05-01

    Neutron double-differential cross sections for carbon ions incident on carbon, nitrogen and oxygen targets have been measured for neutron energies down to 0.6 MeV over a wide angular range from 15° to 90°, with 100- and 290-MeV/A incident energies, at the Heavy Ion Medical Accelerator in Chiba (HIMAC), National Institute of Radiological Sciences. Two sizes of NE213 scintillators were used as neutron detectors to cover neutron energies from below 1 MeV to several hundred MeV. The neutron energy was measured by the time-of-flight technique between the beam pickup detector and an NE213 scintillator. Using the experimental data, the validity of the calculation results of the PHITS code was examined.
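    The time-of-flight energy determination used above is standard relativistic kinematics: the neutron velocity follows from the flight path and timing, and the kinetic energy from the Lorentz factor. The flight path and timing values in the example are illustrative, not the HIMAC geometry.

```python
import math

M_N = 939.565          # neutron rest mass energy (MeV)
C   = 299792458.0      # speed of light (m/s)

def neutron_energy_from_tof(flight_path_m, tof_s):
    """Kinetic energy (MeV) of a neutron from its flight path and TOF."""
    beta = flight_path_m / (C * tof_s)          # v / c
    if beta >= 1.0:
        raise ValueError("superluminal TOF: check timing calibration")
    gamma = 1.0 / math.sqrt(1.0 - beta * beta)  # Lorentz factor
    return M_N * (gamma - 1.0)                  # E_kin = m (gamma - 1)

# a neutron covering 5 m in 40 ns (beta ~ 0.42) carries roughly 94 MeV
E = neutron_energy_from_tof(5.0, 40e-9)
print(round(E, 1))
```

The quadratic sensitivity of E to the timing resolution is why the low-energy and high-energy ranges were covered with two detector sizes.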

  1. Thick-target transmission method for excitation functions of interaction cross sections

    NASA Astrophysics Data System (ADS)

    Aikawa, M.; Ebata, S.; Imai, S.

    2016-09-01

    We propose a method, called the thick-target transmission (T3) method, to obtain an excitation function of interaction cross sections. In an ordinary experiment measuring the excitation function of interaction cross sections by the transmission method, the beam energy must be changed for each cross section. In the T3 method, the excitation function is derived from the beam attenuation measured in targets of different thicknesses, without changing the beam energy. The advantages of the T3 method are its simplicity and its applicability to radioactive beams. To confirm this applicability, we performed a simulation of the 12C + 27Al system with the PHITS code instead of an actual experiment. Our results have large uncertainties but reproduce the tendency of the experimental data well.
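    The core of the T3 idea is that the interaction cross section at the energy the beam has reached inside the target follows from the attenuation between two thicknesses, via the exponential attenuation law. The transmission values below are hypothetical, not measured 12C + 27Al data.

```python
import math

N_A = 6.02214076e23
rho_Al, A_Al = 2.70, 26.98          # g/cm^3, g/mol for aluminium
n = rho_Al * N_A / A_Al             # target nuclei per cm^3

def sigma_from_transmission(T1, T2, dx_cm):
    """Cross section (cm^2) from transmissions T1, T2 measured with
    targets whose thicknesses differ by dx_cm: N(x) = N0 exp(-n sigma x)."""
    return math.log(T1 / T2) / (n * dx_cm)

# hypothetical transmissions for targets differing by 0.1 cm in thickness
sigma = sigma_from_transmission(0.95, 0.94, 0.1)
print(f"{sigma * 1e24:.2f} b")
```

Scanning the thickness pairs through the target maps out the excitation function, because the beam energy at each depth is fixed by the stopping power, with no accelerator retuning.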

  2. Analytical functions to predict cosmic-ray neutron spectra in the atmosphere.

    PubMed

    Sato, Tatsuhiko; Niita, Koji

    2006-09-01

    Estimation of cosmic-ray neutron spectra in the atmosphere is an essential issue in the evaluation of aircrew doses and the soft-error rates of semiconductor devices. We therefore performed Monte Carlo simulations of neutron spectra using the PHITS code, adopting the JENDL-High-Energy nuclear data library. Excellent agreement was observed between the calculated and measured spectra over a wide altitude range, even at ground level. Based on a comprehensive analysis of the simulation results, we propose analytical functions that can predict the cosmic-ray neutron spectra at any location in the atmosphere at altitudes below 20 km, considering the influence of local geometries, such as the ground and aircraft, on the spectra. The accuracy of the analytical functions was verified against various experimental data.

  3. Radiation Environment Inside Spacecraft

    NASA Technical Reports Server (NTRS)

    O'Neill, Patrick

    2015-01-01

    Dr. Patrick O'Neill, NASA Johnson Space Center, will present a detailed description of the radiation environment inside spacecraft. The free-space (outside) solar and galactic cosmic-ray and trapped Van Allen belt proton spectra are significantly modified as these ions propagate through various thicknesses of spacecraft structure and shielding material. In addition to losing energy, the ions create secondary ions as they interact with the structural materials. Nuclear interaction codes (FLUKA, GEANT4, HZETRN, MCNPX, CEM03, and PHITS) transport free-space spectra through different thicknesses of various materials. These "inside" energy spectra are then converted to linear energy transfer (LET) spectra and dose rates, which is what electronics system designers need. Model predictions are compared to radiation measurements made by instruments such as the Intra-Vehicular Charged Particle Directional Spectrometer (IV-CPDS) used inside the Space Station, Orion, and the Space Shuttle.

  4. Detector Calibration to Spontaneous Fission for the Study of Superheavy Elements Using Gas-Filled Recoil Ion Separator

    NASA Astrophysics Data System (ADS)

    Takeyama, Mirei; Kaji, Daiya; Morimoto, Kouji; Wakabayashi, Yasuo; Tokanai, Fuyuki; Morita, Kosuke

    Detector response to spontaneous fission (SF) of heavy nuclides produced in the 206Pb(48Ca,2n)252No reaction was investigated using a gas-filled recoil ion separator (GARIS). Kinetic energy distributions of the SF originating from 252No were observed by tuning the implantation depth of the evaporation residues (ER) in the detector. The focal-plane detector used in the GARIS experiments was calibrated by comparison with the known total kinetic energy (TKE) of SF from 252No. The correction value for the TKE calculation was deduced as a function of the implantation depth of 252No in the detector. Furthermore, we verified the results by comparing them with those obtained from a computer simulation using the Particle and Heavy Ion Transport code System (PHITS).

  5. Conversion coefficients from fluence to effective dose for heavy ions with energies up to 3 GeV/A.

    PubMed

    Sato, T; Tsuda, S; Sakamoto, Y; Yamaguchi, Y; Niita, K

    2003-01-01

    Radiological protection against high-energy heavy ions is an essential issue in the planning of long-term space missions. Fluence-to-effective-dose conversion coefficients have been calculated for heavy ions using the Particle and Heavy Ion Transport code System (PHITS) coupled with an anthropomorphic phantom of the MIRD5 type. The calculations were performed for incident protons and typical space heavy ions (deuterons, tritons, 3He, alpha particles, 12C, 20Ne, 40Ar, 40Ca and 56Fe) with energies up to 3 GeV/A in the isotropic and anterior-posterior irradiation geometries. A simple fitting formula that can predict the effective dose for almost all kinds of space heavy ions below 3 GeV/A within an accuracy of 30% is deduced from the results.
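    In practice such conversion coefficients are applied by table lookup, commonly with log-log interpolation between tabulated energies. The coefficient table below is a hypothetical placeholder, not the published PHITS/MIRD5 values.

```python
import numpy as np

# hypothetical fluence-to-effective-dose coefficients (Sv cm^2) vs energy
E_tab = np.array([10.0, 100.0, 1000.0, 3000.0])     # MeV/A
c_tab = np.array([0.1, 0.8, 4.0, 6.0]) * 1e-9       # Sv cm^2 (illustrative)

def effective_dose(fluence_cm2, energy_MeV_A):
    """Effective dose (Sv) via log-log interpolation of the coefficients."""
    logc = np.interp(np.log(energy_MeV_A), np.log(E_tab), np.log(c_tab))
    return fluence_cm2 * np.exp(logc)

dose = effective_dose(1.0e6, 300.0)   # 1e6 ions/cm^2 at 300 MeV/A
print(f"{dose:.2e} Sv")
```

Log-log interpolation is the usual choice because both the coefficients and the energies span orders of magnitude, so linear interpolation would bias the values between grid points.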

  6. Response function of a superheated drop neutron monitor with lead shell in the thermal to 400-MeV energy range.

    PubMed

    Itoga, Toshiro; Asano, Yoshihiro; Tanimura, Yoshihiko

    2011-07-01

    Superheated drop detectors are currently used for personal and environmental dosimetry, and their characteristics, such as neutron response and temperature dependence, are well known. A new bubble counter based on superheated drop technology has been developed by Framework Scientific. However, the response of this detector with a lead shell is not well characterized, especially above several tens of MeV. In this study, the response was measured with quasi-monoenergetic and monoenergetic neutron sources, with and without a lead shell. The experimental results were compared with Monte Carlo calculations using the 'Event Generator Mode' of the PHITS code with the JENDL-HE/2007 data library to clarify the response of this detector with a lead shell over the entire energy range.

  7. Optimization of a ΔE - E detector for 41Ca AMS

    NASA Astrophysics Data System (ADS)

    Hosoya, Seiji; Sasa, Kimikazu; Matsunaka, Tetsuya; Takahashi, Tsutomu; Matsumura, Masumi; Matsumura, Hiroshi; Sundquist, Mark; Stodola, Mark; Sueki, Keisuke

    2017-09-01

    A series of nuclides (14C, 26Al, and 36Cl) was measured using the 12UD Pelletron tandem accelerator before its replacement by the horizontal 6 MV tandem accelerator at the University of Tsukuba Tandem Accelerator Complex (UTTAC). This paper considers the modification of the accelerator mass spectrometry (AMS) measurement parameters to suit the current 6 MV tandem accelerator setup (e.g., terminal voltage, detected ion charge state, gas pressure, and the entrance-window material of the detector). The Particle and Heavy Ion Transport code System (PHITS) was also used to simulate the AMS measurement and determine the best conditions for suppressing isobaric interference. The spectra of 41Ca and 41K were then successfully separated and identified; the system achieved a background level of 41Ca/40Ca ~ 6 × 10^-14.
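    The reason a ΔE-E telescope separates the A = 41 isobars can be shown with the nonrelativistic Bethe scaling dE/dx ∝ mZ²/E: at the same total energy, the energy loss in a thin ΔE stage scales with Z². The constant k below is a hypothetical lump of the Bethe prefactor and the ΔE-layer thickness, purely for illustration.

```python
def delta_E(m_u, Z, E_MeV, k=0.5):
    """Energy loss (arbitrary units) of an ion in a thin dE detector,
    using the nonrelativistic scaling dE/dx ~ m * Z^2 / E."""
    return k * m_u * Z * Z / E_MeV

E = 60.0                       # total ion energy entering the telescope (MeV)
dE_Ca = delta_E(41, 20, E)     # 41Ca (Z = 20)
dE_K  = delta_E(41, 19, E)     # 41K  (Z = 19, isobaric interference)

# identical mass and energy, but the Z^2 factor gives Ca ~11% more energy loss
print(round(dE_Ca / dE_K, 3))
```

That ~11% separation in the ΔE coordinate, at equal total E, is what the PHITS simulation of gas pressure and window material is optimizing.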

  8. Design study of multi-imaging plate system for BNCT irradiation field at Kyoto university reactor.

    PubMed

    Tanaka, Kenichi; Sakurai, Yoshinori; Kajimoto, Tsuyoshi; Tanaka, Hiroki; Takata, Takushi; Endo, Satoru

    2016-09-01

    The converter configuration of a multi-imaging-plate system was investigated for quality assurance of the irradiation-field profile in boron neutron capture therapy. This was done by simulation calculations with the PHITS code for the fields of the Heavy Water Neutron Irradiation Facility of the Kyoto University Reactor. The converter constituents investigated were carbon for gamma rays, and polyethylene, with and without LiF at varied 6Li concentration, for thermal, epithermal, and fast neutrons. Suitable converter combinations were found for two components (gamma rays and thermal neutrons) in the standard thermal neutron mode, and for three components (gamma rays, epithermal neutrons, and thermal or fast neutrons) in the standard mixed or epithermal neutron modes, respectively.

  9. Attempt to Measure (n, xn) Double-Differential Cross Sections for Incident Neutron Energies above 100 MeV

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watanabe, T.; Kunieda, S.; Shigyo, N.

    An attempt has been made to develop an experimental technique for measuring (n, xn) double-differential cross sections at incident neutron energies above 100 MeV, using continuous-energy neutrons up to 400 MeV. Neutrons were produced by the spallation reaction of an 800 MeV proton beam incident on a thick, heavily shielded tungsten target at the WNR facility at Los Alamos National Laboratory. The energies of incident neutrons were determined by the time-of-flight method. Emitted neutrons were detected by the recoil-proton method. A phoswich detector consisting of NaI(Tl) and NE102A plastic scintillators was used for detecting the recoil protons. We compared the preliminary experimental cross-section data with calculations by the PHITS and QMD codes.

  10. Neutron production in deuteron-induced reactions on Li, Be, and C at an incident energy of 102 MeV

    NASA Astrophysics Data System (ADS)

    Araki, Shouhei; Watanabe, Yukinobu; Kitajima, Mizuki; Sadamatsu, Hiroki; Nakano, Keita; Kin, Tadahiro; Iwamoto, Yosuke; Satoh, Daiki; Hagiwara, Masayuki; Yashima, Hiroshi; Shima, Tatsushi

    2017-09-01

    Double-differential cross sections (DDXs) of deuteron-induced neutron production reactions on Li, Be, and C at 102 MeV were measured at forward angles (≤ 25°) by means of a time-of-flight method with NE213 liquid organic scintillators at the Research Center for Nuclear Physics, Osaka University. The experimental results were compared with model calculations with PHITS and DEURACS. The DEURACS calculation reproduces the experimental DDXs for C at very forward angles better than the PHITS one does. Moreover, the incident-energy dependence of the Li(d,xn) reaction was investigated by adding the DDX data measured previously at 25 and 40 MeV.

  11. Cell survival fraction estimation based on the probability densities of domain and cell nucleus specific energies using improved microdosimetric kinetic models.

    PubMed

    Sato, Tatsuhiko; Furusawa, Yoshiya

    2012-10-01

    Estimation of the survival fractions of cells irradiated with various particles over a wide linear energy transfer (LET) range is of great importance in the treatment planning of charged-particle therapy. Two computational models were developed for estimating survival fractions based on the concept of the microdosimetric kinetic model. They were designated the double-stochastic microdosimetric kinetic model and the stochastic microdosimetric kinetic model. The former takes into account the stochastic nature of both the domain and cell nucleus specific energies, whereas the latter represents the stochastic nature of the domain specific energy by its approximate mean value and variance to reduce the computation time. The probability densities of the domain and cell nucleus specific energies are the fundamental quantities for expressing survival fractions in these models. These densities are calculated using the microdosimetric and LET-estimator functions implemented in the Particle and Heavy Ion Transport code System (PHITS) in combination with the convolution or database method. Both the double-stochastic and stochastic microdosimetric kinetic models can reproduce the measured survival fractions for high-LET and high-dose irradiations, whereas a previously proposed microdosimetric kinetic model predicts lower values for these fractions, mainly because it ignores the stochastic nature of cell nucleus specific energies. The models we developed should contribute to a better understanding of the mechanism of cell inactivation, as well as improve the accuracy of treatment planning for charged-particle therapy.
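    The stochastic averaging at the heart of these models can be sketched with a toy Monte Carlo: survival is the expectation, over the randomly sampled domain specific energies z, of exp(-(A·z + B·z²)) summed over domains. All parameter values and the gamma-distributed specific-energy model below are hypothetical stand-ins for the PHITS-computed probability densities, not the authors' fitted values.

```python
import numpy as np

rng = np.random.default_rng(0)

A, B = 0.16, 0.05        # lethal-lesion coefficients per domain (hypothetical)
n_domains = 50           # domains per cell nucleus (hypothetical)
mean_z = 0.4             # mean domain specific energy in Gy (hypothetical)

def survival_fraction(n_cells=20000):
    """Average survival over cells with stochastic domain specific energies."""
    # gamma-distributed z models the spread of energy deposition per domain
    z = rng.gamma(shape=2.0, scale=mean_z / 2.0, size=(n_cells, n_domains))
    lesions = (A * z + B * z * z).sum(axis=1)   # lethal lesions per nucleus
    return np.exp(-lesions).mean()

S_stochastic = survival_fraction()
# deterministic reference: every domain receives exactly the mean energy
S_mean_field = np.exp(-n_domains * (A * mean_z + B * mean_z**2))
print(S_stochastic, S_mean_field)
```

The gap between the two numbers is the point of the abstract: replacing the full specific-energy distribution by its mean changes the predicted survival, which is why the stochastic treatment matters at high LET and high dose.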

  12. System and method for detection of dispersed broadband signals

    DOEpatents

    Qian, Shie; Dunham, Mark E.

    1999-06-08

    A system and method for detecting the presence of dispersed broadband signals in real time. The present invention utilizes a bank of matched filters for detecting the received dispersed broadband signals. Each matched filter uses a respective robust time template that has been designed to approximate the dispersed broadband signals of interest, and each time template varies across a spectrum of possible dispersed broadband signal time templates. The dispersed broadband signal x(t) is received by each of the matched filters, and if one or more matches occurs, the received data is determined to contain signal data of interest. This signal data can then be analyzed and/or transmitted to Earth for analysis, as desired. The system and method of the present invention will prove extremely useful in many fields, including satellite communications, plasma physics, and interstellar research. The varying time templates used in the bank of matched filters are determined as follows. The robust time domain template is assumed to take the form w(t) = A(t)cos{2πφ(t)}. Since the instantaneous frequency f(t) is known to be equal to the derivative of the phase φ(t), the trajectory of a joint time-frequency representation of x(t) is used as an approximation of φ'(t).
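    The detection scheme described above reduces to correlating the received signal against a bank of templates w(t) = A(t)cos(2πφ(t)), each with a different assumed dispersion law. The sketch below uses linear chirps and illustrative parameters (sample rate, frequency sweeps, noise level are not from the patent).

```python
import numpy as np

fs = 1024.0
t = np.arange(0, 1.0, 1.0 / fs)

def template(f0, f1):
    """Linear-chirp template: instantaneous frequency sweeps f0 -> f1 over 1 s."""
    phase = f0 * t + 0.5 * (f1 - f0) * t * t    # phi(t), so phi'(t) = f(t)
    return np.cos(2.0 * np.pi * phase)

# filter bank spanning a spectrum of possible dispersion (chirp) laws
bank = [template(f0, f1) for f0, f1 in [(50, 80), (50, 120), (50, 160)]]

# received signal: the middle chirp buried in white noise
rng = np.random.default_rng(1)
x = template(50, 120) + 0.5 * rng.standard_normal(t.size)

# matched filtering at zero lag reduces to an inner product with each template;
# the template whose dispersion law matches the signal scores highest
scores = [abs(np.dot(x, w)) for w in bank]
print(int(np.argmax(scores)))
```

Because chirps with different sweep rates decorrelate quickly, the winning filter both declares a detection and estimates the dispersion law of the received signal.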

  13. Measurement of Continuous-Energy Neutron-Incident Neutron-Production Cross Section

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shigyo, Nobuhiro; Kunieda, Satoshi; Watanabe, Takehito

    Continuous-energy neutron-incident neutron-production double-differential cross sections were measured at the Weapons Neutron Research (WNR) facility of the Los Alamos Neutron Science Center. The energy of emitted neutrons was derived from the energy deposition in a detector. The incident-neutron energy was obtained by the time-of-flight method between the spallation target of WNR and the emitted-neutron detector. Two types of detectors were adopted to cover the wide energy range of the neutrons. Liquid organic scintillators covered energies up to 100 MeV. Recoil-proton detectors, consisting of a recoil-proton radiator and phoswich-type NaI(Tl) scintillators, were used for neutrons above several tens of MeV. Iron and lead were used as sample materials. The experimental data were compared with evaluated nuclear data and with the results of the GNASH, JQMD, and PHITS codes.

  14. Dose distribution of a 125 keV mean energy microplanar x-ray beam for basic studies on microbeam radiotherapy.

    PubMed

    Ohno, Yumiko; Torikoshi, Masami; Suzuki, Masao; Umetani, Keiji; Imai, Yasuhiko; Uesugi, Kentaro; Yagi, Naoto

    2008-07-01

    A multislit collimator was designed and fabricated for basic studies on microbeam radiation therapy (MRT) with an x-ray energy of about 100 keV. It consists of 30 slits that are 25 µm high, 30 mm wide, and 5 mm thick in the beam direction. The slits were made of 25 µm-thick polyimide sheets separated by 175 µm-thick tungsten sheets. The authors measured the dose distribution of a single microbeam with a mean energy of 125 keV by a scanning-slit method, using a phosphor coupled to a charge-coupled device camera, and found that the ratios of the dose at the center of a microbeam to that at the midpositions between adjacent slits were 1050 and 760 on the two sides of the microbeam. This dose distribution was well reproduced by the Monte Carlo simulation code PHITS.

  15. Analysis of linear energy transfers and quality factors of charged particles produced by spontaneous fission neutrons from 252Cf and 244Pu in the human body.

    PubMed

    Endo, Akira; Sato, Tatsuhiko

    2013-04-01

    Absorbed doses, linear energy transfers (LETs) and quality factors of secondary charged particles in organs and tissues, generated via the interactions of the spontaneous fission neutrons from 252Cf and 244Pu within the human body, were studied using the Particle and Heavy Ion Transport Code System (PHITS) coupled with the ICRP Reference Phantom. Both the absorbed doses and the quality factors in target organs generally decrease with increasing distance from the source organ. The analysis of LET distributions of secondary charged particles led to the identification of the relationship between LET spectra and target-source organ locations. A comparison between human body-averaged mean quality factors and fluence-averaged radiation weighting factors showed that the current numerical conventions for the radiation weighting factors of neutrons, updated in ICRP 103, and the quality factors for internal exposure are valid.

  16. DEVELOPMENT OF A MULTIMODAL MONTE CARLO BASED TREATMENT PLANNING SYSTEM.

    PubMed

    Kumada, Hiroaki; Takada, Kenta; Sakurai, Yoshinori; Suzuki, Minoru; Takata, Takushi; Sakurai, Hideyuki; Matsumura, Akira; Sakae, Takeji

    2017-10-26

    To establish boron neutron capture therapy (BNCT), the University of Tsukuba is developing a treatment device and the peripheral devices required for BNCT, such as a treatment planning system. We are developing a new multimodal Monte Carlo based treatment planning system (development code name: Tsukuba Plan). Tsukuba Plan allows dose estimation in proton therapy, X-ray therapy and heavy-ion therapy in addition to BNCT, because the system employs PHITS as its Monte Carlo dose-calculation engine. Regarding BNCT, several verifications of the system are being carried out for practical use. The verification results demonstrate that Tsukuba Plan accurately estimates thermal neutron flux and gamma-ray dose, the fundamental radiation quantities of BNCT dosimetry. In addition to the practical use of Tsukuba Plan in BNCT, we are investigating its application to other radiation therapies.

  17. Measurement and simulation for a complementary imaging with the neutron and X-ray beams

    NASA Astrophysics Data System (ADS)

    Hara, Kaoru Y.; Sato, Hirotaka; Kamiyama, Takashi; Shinohara, Takenao

    2017-09-01

    Using a composite source system, we measured thermal-neutron and keV X-ray radiographs at the 45-MeV electron linear accelerator facility at Hokkaido University. The source system provides alternating neutron and X-ray beams by switching the production target on the electron-beam axis. In the measurement to demonstrate complementary imaging, a detector based on a vacuum-tube-type neutron color image intensifier was applied to both beams. In addition, to reduce the background in the neutron transmission spectra, test measurements using a gadolinium-type neutron grid were performed with a cold neutron source at Hokkaido University. The neutron and X-ray transmissions of various substances were also simulated using the PHITS code, and a data-analysis procedure for identifying the substance of a sample was investigated through the simulations.
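    The complementarity of the two probes follows from the Beer-Lambert transmission law, T = exp(-µt), applied with attenuation coefficients that differ strongly between neutrons and X-rays. The µ values below are rough illustrative magnitudes, not evaluated data for any specific energy.

```python
import numpy as np

materials = {
    #               mu_neutron, mu_xray  (1/cm, illustrative magnitudes)
    "polyethylene": (3.5, 0.2),   # hydrogenous: nearly opaque to neutrons
    "lead":         (0.4, 5.0),   # high-Z: nearly opaque to X-rays
}

def transmission(mu, t_cm):
    """Beer-Lambert transmission through thickness t_cm."""
    return np.exp(-mu * t_cm)

for name, (mu_n, mu_x) in materials.items():
    T_n = transmission(mu_n, 1.0)
    T_x = transmission(mu_x, 1.0)
    print(f"{name}: T_neutron = {T_n:.3f}, T_xray = {T_x:.3f}")
```

A sample that is dark in one radiograph and bright in the other is exactly what the substance-identification analysis exploits.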

  18. Measurements and analyses of the distribution of the radioactivity induced by the secondary neutrons produced by 17-MeV protons in compact cyclotron facility

    NASA Astrophysics Data System (ADS)

    Matsuda, Norihiro; Izumi, Yuichi; Yamanaka, Yoshiyuki; Gandou, Toshiyuki; Yamada, Masaaki; Oishi, Koji

    2017-09-01

    Reaction rates due to secondary neutrons produced by beam losses of 17-MeV protons were measured at a compact cyclotron facility with the foil activation method. The experimentally obtained distribution of the 197Au(n,γ)198Au reaction rates on the concrete walls suggests that the target and the electrostatic deflector, machine components for beam extraction from the compact cyclotron, are the principal beam-loss points. The measurements are compared with calculations by the Monte Carlo code PHITS. The calculated results based on the beam losses agree with the measured ones within 21%. In this facility, the measured reaction rates fall off exponentially with distance from the electrostatic deflector, more slowly than the inverse square of the distance.
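    The distinction drawn above, an exponential fall-off that is slower than an inverse-square law, can be seen by comparing the two shapes directly. The distances and attenuation length below are synthetic illustrative values, not the measured distribution.

```python
import numpy as np

x = np.array([1.0, 2.0, 4.0, 8.0])     # distance from the loss point (m)
rate_exp = np.exp(-x / 4.0)            # exponential, 4 m attenuation length
rate_inv2 = 1.0 / x**2                 # inverse-square (point source) reference

# normalize both to the first measurement point for comparison
r_e = rate_exp / rate_exp[0]
r_i = rate_inv2 / rate_inv2[0]
print(np.round(r_e, 3))
print(np.round(r_i, 3))
```

With these parameters the exponential retains far more of its intensity at large distance than the point-source law, i.e. scattered neutrons in the vault dominate over direct geometric dilution.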

  19. The origin of neutron biological effectiveness as a function of energy.

    PubMed

    Baiocco, G; Barbieri, S; Babini, G; Morini, J; Alloni, D; Friedland, W; Kundrát, P; Schmitt, E; Puchalska, M; Sihver, L; Ottolenghi, A

    2016-09-22

    The understanding of the impact of radiation quality on the early and late responses of biological targets to ionizing radiation exposure is necessarily grounded in the results of mechanistic studies starting from physical interactions. This is particularly true when the radiation field is mixed already at the physical stage, as is the case for neutron exposure. Neutron relative biological effectiveness (RBE) is energy dependent, maximal for energies of ~1 MeV, and varies significantly among different experiments. The aim of this work is to shed light on neutron biological effectiveness as a function of field characteristics through a comprehensive modeling approach that brings together transport calculations of neutrons through matter (with the code PHITS) and the predictive power of the biophysical track-structure code PARTRAC for DNA damage evaluation. Two energy-dependent neutron RBE models are proposed: the first is phenomenological and based only on the characterization of linear energy transfer on a microscopic scale; the second is purely ab initio and based on the induction of complex DNA damage. Results for the two models are compared and found to be in good qualitative agreement with current standards for radiation protection factors, which are agreed upon on the basis of RBE data.

  20. The origin of neutron biological effectiveness as a function of energy

    NASA Astrophysics Data System (ADS)

    Baiocco, G.; Barbieri, S.; Babini, G.; Morini, J.; Alloni, D.; Friedland, W.; Kundrát, P.; Schmitt, E.; Puchalska, M.; Sihver, L.; Ottolenghi, A.

    2016-09-01

    The understanding of the impact of radiation quality on the early and late responses of biological targets to ionizing radiation exposure is necessarily grounded in the results of mechanistic studies starting from physical interactions. This is particularly true when the radiation field is mixed already at the physical stage, as is the case for neutron exposure. Neutron relative biological effectiveness (RBE) is energy dependent, maximal for energies of ~1 MeV, and varies significantly among different experiments. The aim of this work is to shed light on neutron biological effectiveness as a function of field characteristics through a comprehensive modeling approach that brings together transport calculations of neutrons through matter (with the code PHITS) and the predictive power of the biophysical track-structure code PARTRAC for DNA damage evaluation. Two energy-dependent neutron RBE models are proposed: the first is phenomenological and based only on the characterization of linear energy transfer on a microscopic scale; the second is purely ab initio and based on the induction of complex DNA damage. Results for the two models are compared and found to be in good qualitative agreement with current standards for radiation protection factors, which are agreed upon on the basis of RBE data.

  1. The origin of neutron biological effectiveness as a function of energy

    PubMed Central

    Baiocco, G.; Barbieri, S.; Babini, G.; Morini, J.; Alloni, D.; Friedland, W.; Kundrát, P.; Schmitt, E.; Puchalska, M.; Sihver, L.; Ottolenghi, A.

    2016-01-01

    The understanding of the impact of radiation quality on early and late responses of biological targets to ionizing radiation exposure is necessarily grounded in the results of mechanistic studies starting from the physical interactions. This is particularly true when, already at the physical stage, the radiation field is mixed, as is the case for neutron exposure. Neutron relative biological effectiveness (RBE) is energy dependent, maximal for energies of ~1 MeV, and varies significantly among different experiments. The aim of this work is to shed light on neutron biological effectiveness as a function of field characteristics through a comprehensive modeling approach that brings together transport calculations of neutrons through matter (with the code PHITS) and the predictive power of the biophysical track-structure code PARTRAC for DNA damage evaluation. Two energy-dependent neutron RBE models are proposed: the first is phenomenological and based only on the characterization of linear energy transfer on a microscopic scale; the second is purely ab initio and based on the induction of complex DNA damage. Results for the two models are compared and found to be in good qualitative agreement with current standards for radiation protection factors, which are agreed upon on the basis of RBE data. PMID:27654349

  2. Fragmentation Cross Sections of Medium-Energy 35Cl, 40Ar, and 48Ti Beams on Elemental Targets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeitlin, C.; Guetersloh, S.; Heilbronn, L.

    Charge-changing and fragment production cross sections at 0° have been obtained for interactions of 290, 400, and 650 MeV/nucleon 40Ar beams, 650 and 1000 MeV/nucleon 35Cl beams, and a 1000 MeV/nucleon 48Ti beam. Targets of C, CH2, Al, Cu, Sn, and Pb were used. Using standard analysis methods, we obtain fragment cross sections for charges as low as 8 for the Cl and Ar beams, and as low as 10 for the Ti beam. Using data obtained with small-acceptance detectors, we report fragment production cross sections for charges as low as 5, corrected for acceptance using a simple model of fragment angular distributions. With the lower-charged fragment cross sections, we can compare the data to predictions from several models (including NUCFRG2, EPAX2, and PHITS) in a region largely unexplored in earlier work. As found in earlier work with other beams, NUCFRG2 and PHITS predictions agree reasonably well with the data for charge-changing cross sections, but do not accurately predict the fragment production cross sections. The cross sections for the lightest fragments demonstrate the inadequacy of several models in which the cross sections fall monotonically with the charge of the fragment. PHITS, despite not agreeing particularly well with the fragment production cross sections on average, nonetheless qualitatively reproduces some significant features of the data that are missing from the other models.

  3. System and method for constructing filters for detecting signals whose frequency content varies with time

    DOEpatents

    Qian, Shie; Dunham, Mark E.

    1996-01-01

    A system and method for constructing a bank of filters which detect the presence of signals whose frequency content varies with time. The present invention includes a novel system and method for developing one or more time templates designed to match the received signals of interest; the bank of matched filters uses the one or more time templates to detect the received signals. Each matched filter compares the received signal x(t) with a respective, unique time template that has been designed to approximate a form of the signals of interest. The robust time-domain template is assumed to be of the form w(t)=A(t)cos{2πφ(t)}, and the present invention uses the trajectory of a joint time-frequency representation of x(t) as an approximation of the instantaneous frequency function φ'(t). First, numerous data samples of the received signal x(t) are collected. A joint time-frequency representation is then applied to represent the signal, preferably using the time-frequency distribution series (also known as the Gabor spectrogram). The joint time-frequency transformation represents the analyzed signal energy at time t and frequency f as P(t,f), a three-dimensional plot of time vs. frequency vs. signal energy. Then P(t,f) is reduced to a multivalued function f(t), a two-dimensional plot of time vs. frequency, using a thresholding process. Curve-fitting steps are then performed on the time/frequency plot, preferably using Levenberg-Marquardt curve-fitting techniques, to derive a general instantaneous frequency function φ'(t) which best fits the multivalued function f(t), a trajectory of the joint time-frequency domain representation of x(t). Integrating φ'(t) along t yields φ(t), which is then inserted into the form of the time template equation. A suitable amplitude A(t) is also preferably determined. Once the time template has been determined, one or more filters are developed, each of which uses a version or form of the time template.
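    The template-construction and matched-filtering steps described above can be sketched in a few lines. This is a minimal illustration assuming a polynomial fit of the instantaneous frequency φ'(t) and unit amplitude A(t); the chirp coefficients and function names are hypothetical, not taken from the patent:

    ```python
    import numpy as np

    def make_time_template(freq_poly, t):
        """Build w(t) = cos(2*pi*phi(t)) from a polynomial fit of the
        instantaneous frequency phi'(t); A(t) is taken as 1 for simplicity."""
        phi_prime = np.polyval(freq_poly, t)      # fitted phi'(t) in Hz
        dt = t[1] - t[0]
        phi = np.cumsum(phi_prime) * dt           # phi(t) = integral of phi'(t) dt
        return np.cos(2.0 * np.pi * phi)

    def matched_filter_score(x, template):
        """Normalized correlation of a received signal x(t) with the template."""
        w = template / np.linalg.norm(template)
        return float(abs(np.dot(x, w)) / np.linalg.norm(x))

    # Hypothetical linear chirp: phi'(t) = 50 + 100*t (Hz) over 1 s.
    t = np.linspace(0.0, 1.0, 4000)
    template = make_time_template([100.0, 50.0], t)
    score_signal = matched_filter_score(template, template)
    score_noise = matched_filter_score(
        np.random.default_rng(0).normal(size=t.size), template)
    ```

    A matching signal yields a normalized score near 1, while uncorrelated noise scores much lower, which is the detection contrast the filter bank exploits.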

  4. SU-C-BRC-05: Monte Carlo Calculations to Establish a Simple Relation of Backscatter Dose Enhancement Around High-Z Dental Alloy to Its Atomic Number

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Utsunomiya, S; Kushima, N; Katsura, K

    Purpose: To establish a simple relation of the backscatter dose enhancement around a high-Z dental alloy in head and neck radiation therapy to its average atomic number, based on Monte Carlo calculations. Methods: The PHITS Monte Carlo code was used to calculate the dose enhancement, which is quantified by the backscatter dose factor (BSDF). The accuracy of the beam modeling with PHITS was verified by comparison with basic measured data, namely PDDs and dose profiles. In the simulation, a 1 cm cube of high-Z alloy was embedded into a tough-water phantom irradiated by a 6-MV (nominal) X-ray beam of 10 cm × 10 cm field size from a Novalis TX (Brainlab). Ten different high-Z materials (Al, Ti, Cu, Ag, Au-Pd-Ag, I, Ba, W, Au, Pb) were considered. The accuracy of the calculated BSDF was verified by comparison with data measured by Gafchromic EBT3 films placed 0 to 10 mm away from a high-Z alloy (Au-Pd-Ag). We derived an approximate equation relating the BSDF and the range of backscatter to the average atomic number of the high-Z alloy. Results: The calculated BSDF showed excellent agreement with that measured by Gafchromic EBT3 films 0 to 10 mm away from the high-Z alloy. We found a simple linear relation of the BSDF and the range of backscatter to the average atomic number of the dental alloys. The latter relation is explained by the fact that the energy spectrum of the backscattered electrons depends strongly on the average atomic number. Conclusion: We found a simple relation of the backscatter dose enhancement around high-Z alloys to their average atomic number based on Monte Carlo calculations. This work provides a simple and useful method to estimate the backscatter dose enhancement from dental alloys and the corresponding optimal thickness of a dental spacer to prevent mucositis effectively.
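    The BSDF and its linear dependence on average atomic number can be illustrated with a minimal sketch. The (Z, BSDF) pairs below are hypothetical placeholders for illustration only, not the measured values from this study:

    ```python
    import numpy as np

    def backscatter_dose_factor(dose_with_alloy, dose_reference):
        """BSDF: ratio of the dose just upstream of the high-Z alloy to the
        dose at the same point in the homogeneous water phantom."""
        return dose_with_alloy / dose_reference

    # Hypothetical (average atomic number, BSDF) pairs, Al ... Pb.
    z = np.array([13.0, 22.0, 29.0, 47.0, 74.0, 79.0, 82.0])
    bsdf = 1.0 + 0.004 * z + np.array(
        [0.002, -0.001, 0.0, 0.003, -0.002, 0.001, 0.0])  # small scatter

    # Linear relation BSDF ~ a*Z + b, of the kind the study reports.
    slope, intercept = np.polyfit(z, bsdf, 1)
    ```

    With such a fit in hand, the expected dose enhancement for an alloy of known average atomic number follows from a single evaluation of the line.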

  5. Cross section measurement of residues produced in proton- and deuteron-induced spallation reactions on 93Zr at 105 MeV/u using the inverse kinematics method

    NASA Astrophysics Data System (ADS)

    Kawase, Shoichiro; Watanabe, Yukinobu; Wang, He; Otsu, Hideaki; Sakurai, Hiroyoshi; Takeuchi, Satoshi; Togano, Yasuhiro; Nakamura, Takashi; Maeda, Yukie; Ahn, Deuk Soon; Aikawa, Masayuki; Araki, Shouhei; Chen, Sidong; Chiga, Nobuyuki; Doornenbal, Pieter; Fukuda, Naoki; Ichihara, Takashi; Isobe, Tadaaki; Kawakami, Shunsuke; Kin, Tadahiro; Kondo, Yosuke; Koyama, Shunpei; Kubo, Toshiyuki; Kubono, Shigeru; Kurokawa, Meiko; Makinaga, Ayano; Matsushita, Masafumi; Matsuzaki, Teiichiro; Michimasa, Shin'ichiro; Momiyama, Satoru; Nagamine, Shunsuke; Nakano, Keita; Niikura, Megumi; Ozaki, Tomoyuki; Saito, Atsumi; Saito, Takeshi; Shiga, Yoshiaki; Shikata, Mizuki; Shimizu, Yohei; Shimoura, Susumu; Sumikama, Toshiyuki; Söderström, Pär-Anders; Suzuki, Hiroshi; Takeda, Hiroyuki; Taniuchi, Ryo; Tsubota, Jun'ichi; Watanabe, Yasushi; Wimmer, Kathrin; Yamamoto, Tatsuya; Yoshida, Koichi

    2017-09-01

    Isotopic production cross sections in the proton- and deuteron-induced spallation reactions on 93Zr at an energy of 105 MeV/u were measured in inverse kinematics for the development of realistic nuclear transmutation processes for long-lived fission products (LLFPs) with neutron and light-ion beams. The experimental results were compared to PHITS calculations describing the intra-nuclear cascade and evaporation processes. Although overall agreement was obtained, a large overestimation of the production cross sections for the removal of a few nucleons was seen. A clear shell effect associated with the neutron magic number N = 50 was observed in the measured isotopic production yields of the Zr and Y isotopes, which is reproduced reasonably well by the PHITS calculation.

  6. New estimation method of neutron skyshine for a high-energy particle accelerator

    NASA Astrophysics Data System (ADS)

    Oh, Joo-Hee; Jung, Nam-Suk; Lee, Hee-Seock; Ko, Seung-Kook

    2016-09-01

    Skyshine is the dominant component of the prompt radiation at off-site locations. Several experimental studies have been carried out to estimate the neutron skyshine at a few accelerator facilities. In this work, the neutron transport from a source location to off-site locations was simulated using the Monte Carlo codes FLUKA and PHITS. The transport paths were classified as skyshine, direct (transport), groundshine, and multiple-shine in order to understand the contribution of each path and to develop a general evaluation method. The effect of each path was estimated in terms of the dose at distant locations. The neutron dose was calculated using the neutron energy spectra obtained from detectors placed up to a maximum of 1 km from the accelerator. The highest altitude of the sky region in this simulation was set at 2 km from the floor of the accelerator facility. The initial model of this study was the 10 GeV electron accelerator PAL-XFEL. Different compositions and densities of air, soil, and ordinary concrete were applied in the calculation, and their dependences were reviewed. The estimation method used in this study was compared with the well-known methods suggested by Rindi, Stevenson, and Stapleton, and also with the simple code SHINE3. The results obtained using this method agreed well with those using Rindi's formula.

  7. Measurement of microdosimetric spectra with a wall-less tissue-equivalent proportional counter for a 290 MeV/u 12C beam.

    PubMed

    Tsuda, Shuichi; Sato, Tatsuhiko; Takahashi, Fumiaki; Satoh, Daiki; Endo, Akira; Sasaki, Shinichi; Namito, Yoshihito; Iwase, Hiroshi; Ban, Shuichi; Takada, Masashi

    2010-09-07

    The frequency distribution of the lineal energy, y, of a 290 MeV/u carbon beam was measured to obtain the dose-weighted mean of y and compare it with the linear energy transfer (LET). In the experiment, a wall-less tissue-equivalent proportional counter (TEPC) in a cylindrical volume with a simulated diameter of 0.72 microm was used. The measured frequency distribution of y as well as its dose-mean value agrees within 10% uncertainty with the corresponding data from microdosimetric calculations using the PHITS code. The ratio of the measured dose-mean lineal energy to the LET of the 290 MeV/u carbon beam is 0.73, which is much smaller than the corresponding data obtained by a wall TEPC. This result demonstrates that a wall-less TEPC is necessary to precisely measure the dose-mean of y for energetic heavy ion beams.
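    The frequency-mean and dose-mean of y used in comparisons like this one follow directly from the measured distribution f(y). This is a minimal sketch with a hypothetical two-bin spectrum, not the measured TEPC data:

    ```python
    import numpy as np

    def frequency_mean_y(y, f):
        """Frequency-mean lineal energy: y_F = sum(y f) / sum(f)."""
        y, f = np.asarray(y, dtype=float), np.asarray(f, dtype=float)
        return float(np.sum(y * f) / np.sum(f))

    def dose_mean_y(y, f):
        """Dose-mean lineal energy: y_D = sum(y^2 f) / sum(y f)."""
        y, f = np.asarray(y, dtype=float), np.asarray(f, dtype=float)
        return float(np.sum(y * y * f) / np.sum(y * f))

    # Hypothetical two-bin frequency distribution f(y), for illustration only.
    y_bins = [10.0, 20.0]   # lineal energy, keV/um
    freq = [0.5, 0.5]       # relative frequency in each bin
    ```

    Because y_D weights each bin by its dose contribution rather than its frequency, y_D always lies at or above y_F, which is why the dose-mean is the quantity compared against LET.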

  8. Comparison between Monte Carlo simulation and measurement with a 3D polymer gel dosimeter for dose distributions in biological samples

    NASA Astrophysics Data System (ADS)

    Furuta, T.; Maeyama, T.; Ishikawa, K. L.; Fukunishi, N.; Fukasaku, K.; Takagi, S.; Noda, S.; Himeno, R.; Hayashi, S.

    2015-08-01

    In this research, we used a 135 MeV/nucleon carbon-ion beam to irradiate a biological sample composed of fresh chicken meat and bones, which was placed in front of a PAGAT gel dosimeter, and compared the measured and simulated transverse-relaxation-rate (R2) distributions in the gel dosimeter. We experimentally measured the three-dimensional R2 distribution, which records the dose induced by particles penetrating the sample, by using magnetic resonance imaging. The obtained R2 distribution reflected the heterogeneity of the biological sample. We also conducted Monte Carlo simulations using the PHITS code by reconstructing the elemental composition of the biological sample from its computed tomography images while taking into account the dependence of the gel response on the linear energy transfer. The simulation reproduced the experimental distal edge structure of the R2 distribution with an accuracy under about 2 mm, which is approximately the same as the voxel size currently used in treatment planning.

  9. Development of a calculation method for estimating specific energy distribution in complex radiation fields.

    PubMed

    Sato, Tatsuhiko; Watanabe, Ritsuko; Niita, Koji

    2006-01-01

    Estimation of the specific energy distribution in a human body exposed to complex radiation fields is of great importance in the planning of long-term space missions and heavy ion cancer therapies. With the aim of developing a tool for this estimation, the specific energy distributions in liquid water around the tracks of several HZE particles with energies up to 100 GeV n(-1) were calculated by performing track structure simulation with the Monte Carlo technique. In the simulation, the targets were assumed to be spherical sites with diameters from 1 nm to 1 microm. An analytical function to reproduce the simulation results was developed in order to predict the distributions of all kinds of heavy ions over a wide energy range. The incorporation of this function into the Particle and Heavy Ion Transport code System (PHITS) enables us to calculate the specific energy distributions in complex radiation fields in a short computational time.

  10. Dose distribution of a 125 keV mean energy microplanar x-ray beam for basic studies on microbeam radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ohno, Yumiko; Torikoshi, Masami; Suzuki, Masao

    A multislit collimator was designed and fabricated for basic studies on microbeam radiation therapy (MRT) with an x-ray energy of about 100 keV. It consists of 30 slits that are 25 μm high, 30 mm wide, and 5 mm thick in the beam direction. The slits were made of 25 μm-thick polyimide sheets separated by 175 μm-thick tungsten sheets. The authors measured the dose distribution of a single microbeam with a mean energy of 125 keV by a scanning-slit method using a phosphor coupled to a charge-coupled-device camera and found that the ratios of the dose at the center of a microbeam to that at the midpositions to adjacent slits were 1050 and 760 for the two sides of the microbeam. This dose distribution was well reproduced by the Monte Carlo simulation code PHITS.

  11. Comparison between Monte Carlo simulation and measurement with a 3D polymer gel dosimeter for dose distributions in biological samples.

    PubMed

    Furuta, T; Maeyama, T; Ishikawa, K L; Fukunishi, N; Fukasaku, K; Takagi, S; Noda, S; Himeno, R; Hayashi, S

    2015-08-21

    In this research, we used a 135 MeV/nucleon carbon-ion beam to irradiate a biological sample composed of fresh chicken meat and bones, which was placed in front of a PAGAT gel dosimeter, and compared the measured and simulated transverse-relaxation-rate (R2) distributions in the gel dosimeter. We experimentally measured the three-dimensional R2 distribution, which records the dose induced by particles penetrating the sample, by using magnetic resonance imaging. The obtained R2 distribution reflected the heterogeneity of the biological sample. We also conducted Monte Carlo simulations using the PHITS code by reconstructing the elemental composition of the biological sample from its computed tomography images while taking into account the dependence of the gel response on the linear energy transfer. The simulation reproduced the experimental distal edge structure of the R2 distribution with an accuracy under about 2 mm, which is approximately the same as the voxel size currently used in treatment planning.

  12. Measurement of the neutron angular distribution from a beryllium target bombarded with a 345-MeV/u 238U beam at the RIKEN RI beam factory

    NASA Astrophysics Data System (ADS)

    Nakao, Noriaki; Uwamino, Yoshitomo; Tanaka, Kanenobu

    2018-05-01

    The angular distribution of neutrons produced from a 4-mm-thick beryllium target bombarded with a 345-MeV/u 238U beam was measured outside the target chamber using bismuth and aluminum activation detectors at angles of 4.5°, 10°, 30°, 60°, 70°, and 90° from the beam axis. Following two hours of irradiation and photo-peak analyses, the production rates of the radionuclides were obtained for the 209Bi(n,xn)210-xBi (x = 4-12) and 27Al(n,α)24Na reactions. Using the Particle and Heavy Ion Transport code System (PHITS), a Monte Carlo simulation of the production rates was performed; the ratios of the calculated to the experimental results (C/E) generally ranged from 0.6 to 1.0, and from 0.4 to 1.3 in the worst cases.

  13. Strengthening integrated primary health care in Sofala, Mozambique

    PubMed Central

    2013-01-01

    Background Large increases in health sector investment and policies favoring upgrading and expanding the public sector health network have prioritized maternal and child health in Mozambique and, over the past decade, Mozambique has achieved substantial improvements in maternal and child health indicators. Over this same period, the government of Mozambique has continued to decentralize the management of public sector resources to the district level, including in the health sector, with the aim of bringing decision-making and resources closer to service beneficiaries. Weak district level management capacity has hindered the decentralization process, and building this capacity is an important link to ensure that resources translate to improved service delivery and further improvements in population health. A consortium of the Ministry of Health, Health Alliance International, Eduardo Mondlane University, and the University of Washington are implementing a health systems strengthening model in Sofala Province, central Mozambique. Description of implementation The Mozambique Population Health Implementation and Training (PHIT) Partnership focuses on improving the quality of routine data and its use through appropriate tools to facilitate decision making by health system managers; strengthening management and planning capacity and funding district health plans; and building capacity for operations research to guide system-strengthening efforts. This seven-year effort covers all 13 districts and 146 health facilities in Sofala Province. Evaluation design A quasi-experimental controlled time-series design will be used to assess the overall impact of the partnership strategy on under-5 mortality by examining changes in mortality pre- and post-implementation in Sofala Province compared with neighboring Manica Province. 
The evaluation will compare a broad range of input, process, output, and outcome variables to strengthen the plausibility that the partnership strategy led to health system improvements and subsequent population health impact. Discussion The Mozambique PHIT Partnership expects to provide evidence on the effect of efforts to improve data quality coupled with the introduction of tools, training, and supervision to improve evidence-based decision making. This contribution to the knowledge base on what works to enhance health systems is highly replicable for rapid scale-up to other provinces in Mozambique, as well as other sub-Saharan African countries with limited resources and a commitment to comprehensive primary health care. PMID:23819552

  14. Fragmentation cross sections of medium-energy {sup 35}Cl, {sup 40}Ar, and {sup 48}Ti beams on elemental targets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeitlin, C.; Guetersloh, S.; Heilbronn, L.

    Charge-changing and fragment production cross sections at 0 deg. have been obtained for interactions of 290, 400, and 650 MeV/nucleon {sup 40}Ar beams, 650 and 1000 MeV/nucleon {sup 35}Cl beams, and a 1000 MeV/nucleon {sup 48}Ti beam. Targets of C, CH{sub 2}, Al, Cu, Sn, and Pb were used. Using standard analysis methods, we obtained fragment cross sections for charges as low as 8 for the Cl and Ar beams and as low as 10 for the Ti beam. Using data obtained with small-acceptance detectors, we report fragment production cross sections for charges as low as 5, corrected for acceptance using a simple model of fragment angular distributions. With the lower-charged fragment cross sections, we can compare the data to predictions from several models (including NUCFRG2, EPAX2, and PHITS) in a region largely unexplored in earlier work. As found in earlier work with other beams, NUCFRG2 and PHITS predictions agree reasonably well with the data for charge-changing cross sections, but these models do not accurately predict the fragment production cross sections. The cross sections for the lightest fragments demonstrate the inadequacy of several models in which the cross sections fall monotonically with the charge of the fragment. PHITS, despite its not agreeing particularly well with the fragment production cross sections on average, nonetheless qualitatively reproduces some significant features of the data that are missing from the other models.

  15. Monte Carlo Simulations Comparing the Response of a Novel Hemispherical Tepc to Existing Spherical and Cylindrical Tepcs for Neutron Monitoring and Dosimetry.

    PubMed

    Broughton, David P; Waker, Anthony J

    2017-05-01

    Neutron dosimetry in reactor fields is currently mainly conducted with unwieldy flux monitors. Tissue Equivalent Proportional Counters (TEPCs) have been shown to have the potential to improve the accuracy of neutron dosimetry in these fields, and Multi-Element Tissue Equivalent Proportional Counters (METEPCs) could reduce the size of instrumentation required to do so. Complexity of current METEPC designs has inhibited their use beyond research. This work proposes a novel hemispherical counter with a wireless anode ball in place of the traditional anode wire as a possible solution for simplifying manufacturing. The hemispherical METEPC element was analyzed as a single TEPC to first demonstrate the potential of this new design by evaluating its performance relative to the reference spherical TEPC design and a single element from a cylindrical METEPC. Energy deposition simulations were conducted using the Monte Carlo code PHITS for both monoenergetic 2.5 MeV neutrons and the neutron energy spectrum of Cf-D2O moderated. In these neutron fields, the hemispherical counter appears to be a good alternative to the reference spherical geometry, performing slightly better than the cylindrical counter, which tends to underrespond to H*(10) for the lower neutron energies of the Cf-D2O moderated field. These computational results are promising, and if follow-up experimental work demonstrates the hemispherical counter works as anticipated, it will be ready to be incorporated into an METEPC design.

  16. Health system strengthening: a qualitative evaluation of implementation experience and lessons learned across five African countries.

    PubMed

    Rwabukwisi, Felix Cyamatare; Bawah, Ayaga A; Gimbel, Sarah; Phillips, James F; Mutale, Wilbroad; Drobac, Peter

    2017-12-21

    Achieving the United Nations Sustainable Development Goals in sub-Saharan Africa will require substantial improvements in the coverage and performance of primary health care delivery systems. Projects supported by the Doris Duke Charitable Foundation's (DDCF) African Health Initiative (AHI) created public-private-academic and community partnerships in five African countries to implement and evaluate district-level health system strengthening interventions. In this study, we captured common implementation experiences and lessons learned to understand core elements of successful health systems interventions. We used qualitative data from key informant interviews and annual progress reports from the five Population Health Implementation and Training (PHIT) partnership projects funded through AHI in Ghana, Mozambique, Rwanda, Tanzania, and Zambia. Four major overarching lessons were highlighted. First, variety and inclusiveness of concerned key players (public, academic and private) are necessary to address complex health system issues at all levels. Second, a learning culture that promotes evidence creation and ability to efficiently adapt were key in order to meet changing contextual needs. Third, inclusion of strong implementation science tools and strategies allowed informed and measured learning processes and efficient dissemination of best practices. Fourth, five to seven years was the minimum time frame necessary to effectively implement complex health system strengthening interventions and generate the evidence base needed to advocate for sustainable change for the PHIT partnership projects. The AHI experience has raised remaining, if not overlooked, challenges and potential solutions to address complex health systems strengthening intervention designs and implementation issues, while aiming to measurably accomplish sustainable positive change in dynamic, learning, and varied contexts.

  17. Method for the prediction of the effective dose equivalent to the crew of the International Space Station

    NASA Astrophysics Data System (ADS)

    El-Jaby, Samy; Tomi, Leena; Sihver, Lembit; Sato, Tatsuhiko; Richardson, Richard B.; Lewis, Brent J.

    2014-03-01

    This paper describes a methodology for assessing the pre-mission exposure of space crew aboard the International Space Station (ISS) in terms of an effective dose equivalent. In this approach, the PHITS Monte Carlo code was used to assess the particle transport of galactic cosmic radiation (GCR) and trapped radiation for solar maximum and minimum conditions through an aluminum shield thickness. From these predicted spectra, and using fluence-to-dose conversion factors, a scaling ratio of the effective dose equivalent rate to the ICRU ambient dose equivalent rate at a 10 mm depth was determined. Only contributions from secondary neutrons, protons, and alpha particles were considered in this analysis. Measurements made with a tissue equivalent proportional counter (TEPC) located at Service Module panel 327, as captured through a semi-empirical correlation in the ISSCREM code, were then scaled using this conversion factor to predict the effective dose equivalent. This analysis shows that at this location within the service module, the total effective dose equivalent is 10-30% less than the total TEPC dose equivalent. Approximately 75-85% of the effective dose equivalent is derived from the GCR. This methodology provides an opportunity for pre-flight predictions of the effective dose equivalent and therefore offers a means to assess the health risks of radiation exposure to ISS flight crew.
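    The scaling step described above reduces to multiplying a TEPC reading by the PHITS-derived dose ratio. In this minimal sketch, the TEPC rate and the ratio value of 0.8 (a mid-range illustration of the reported 10-30% reduction) are hypothetical, not values from the paper:

    ```python
    def effective_dose_equivalent_rate(tepc_rate_uSv_h, scaling_ratio):
        """Scale a TEPC ambient dose equivalent rate by the PHITS-derived
        ratio of effective dose equivalent rate to H*(10) rate."""
        return tepc_rate_uSv_h * scaling_ratio

    # Hypothetical TEPC reading and illustrative mid-range scaling ratio.
    tepc_rate = 10.0   # uSv/h, hypothetical
    e_rate = effective_dose_equivalent_rate(tepc_rate, 0.8)
    ```

    Keeping the conversion as a single pre-computed ratio is what makes pre-flight predictions cheap: only the TEPC-based dose-rate model needs to be evaluated per mission.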

  18. Fragmentation of {sup 14}N, {sup 16}O, {sup 20}Ne, and {sup 24}Mg nuclei at 290 to 1000 MeV/nucleon

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeitlin, C.; Miller, J.; Guetersloh, S.

    We report fragmentation cross sections measured at 0 deg. for beams of {sup 14}N, {sup 16}O, {sup 20}Ne, and {sup 24}Mg ions, at energies ranging from 290 MeV/nucleon to 1000 MeV/nucleon. Beams were incident on targets of C, CH{sub 2}, Al, Cu, Sn, and Pb, with the C and CH{sub 2} target data used to obtain hydrogen-target cross sections. Using methods established in earlier work, cross sections obtained with both large-acceptance and small-acceptance detectors are extracted from the data and, when necessary, corrected for acceptance effects. The large-acceptance data yield cross sections for fragments with charges approximately half of the beam charge and above, with minimal corrections. Cross sections for lighter fragments are obtained from small-acceptance spectra, with more significant, model-dependent corrections that account for the fragment angular distributions. Results for both charge-changing and fragment production cross sections are compared to the predictions of the Los Alamos version of the quark gluon string model (LAQGSM) as well as the NASA Nuclear Fragmentation (NUCFRG2) model and the Particle and Heavy Ion Transport System (PHITS) model. For all beams and targets, cross sections for fragments as light as He are compared to the models. Estimates of multiplicity-weighted helium production cross sections are obtained from the data and compared to PHITS and LAQGSM predictions. Summary statistics show that the level of agreement between data and predictions is slightly better for PHITS than for either NUCFRG2 or LAQGSM.

  19. Calculation of dose distribution above contaminated soil

    NASA Astrophysics Data System (ADS)

    Kuroda, Junya; Tenzou, Hideki; Manabe, Seiya; Iwakura, Yukiko

    2017-07-01

    The purpose of this study was to assess the relationship between altitude and the distribution of the ambient dose rate in the air over a soil decontamination area by using the PHITS simulation code. The geometry was a 1000 m × 1000 m area, extending 1 m deep into the soil and 100 m in altitude above the ground, simulating the area of residences or school grounds. The contaminated region was assumed to be uniformly contaminated with Cs-137 γ-ray sources. The air dose distribution and the spatial resolution were evaluated from the gamma-ray flux at altitudes of 1, 5, 10, and 20 m. The decontamination effect was quantified by defining a sharpness S, the ratio of the average flux to the flux at the center of the decontaminated area at each altitude. The suitable flight altitude of a drone is found to be less than 15 m above a residence and 31 m above school grounds to confirm the decontamination effect. The calculation results can help determine a drone flight plan that minimizes the crash risk.
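    The sharpness S defined above can be sketched as follows, for hypothetical one-dimensional flux profiles (the profile values and center index are illustrative, not simulation output):

    ```python
    import numpy as np

    def sharpness(flux_profile, center_index):
        """Sharpness S: ratio of the average gamma-ray flux over the profile
        to the flux at the centre of the decontaminated area; S near 1 means
        the decontamination effect is no longer resolved at that altitude."""
        flux_profile = np.asarray(flux_profile, dtype=float)
        return float(flux_profile.mean() / flux_profile[center_index])

    # Hypothetical profiles: the decontaminated centre cell (index 2) shows
    # a clear flux deficit at low altitude, washed out at high altitude.
    profile_low = [1.0, 1.0, 0.2, 1.0, 1.0]
    profile_high = [1.0, 1.0, 0.9, 1.0, 1.0]
    s_low = sharpness(profile_low, 2)
    s_high = sharpness(profile_high, 2)
    ```

    A flight-planning rule of the kind the paper derives would then keep the drone below the altitude at which S drops toward 1.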

  20. Systematic measurement of lineal energy distributions for proton, He and Si ion beams over a wide energy range using a wall-less tissue equivalent proportional counter.

    PubMed

    Tsuda, Shuichi; Sato, Tatsuhiko; Takahashi, Fumiaki; Satoh, Daiki; Sasaki, Shinichi; Namito, Yoshihito; Iwase, Hiroshi; Ban, Shuichi; Takada, Masashi

    2012-01-01

    The frequency distributions of the lineal energy, y, of 160 MeV proton, 150 MeV/u helium, and 490 MeV/u silicon ion beams were measured using a wall-less tissue equivalent proportional counter (TEPC) with a site size of 0.72 µm. The measured frequency distributions of y as well as the dose-mean values, y(D), agree with the corresponding data calculated using the microdosimetric function of the particle and heavy ion transport code system PHITS. The values of y(D) increase in the range of LET below ~10 keV µm(-1) because of discrete energy deposition by delta rays, while the relation is reversed above ~10 keV µm(-1) as the amount of energy escaping via delta rays increases. These results indicate that care should be taken with the difference between y(D) and LET when estimating the ionization density that usually relates to relative biological effectiveness (RBE) of energetic heavy ions.

  1. Reaction mechanism interplay in determining the biological effectiveness of neutrons as a function of energy.

    PubMed

    Baiocco, G; Alloni, D; Babini, G; Mariotti, L; Ottolenghi, A

    2015-09-01

    Neutron relative biological effectiveness (RBE) is found to be energy dependent, being maximal for energies of ∼1 MeV. This is reflected in the choice of radiation weighting factors wR for radiation protection purposes. In order to trace the physical origin of this behaviour, a detailed study of energy deposition processes and their full energy dependence is necessary. In this work, the Monte Carlo transport code PHITS was used to characterise the main secondary products responsible for energy deposition in a 'human-sized' soft-tissue spherical phantom, irradiated by monoenergetic neutrons with energies around the maximal RBE/wR. Thereafter, results on the microdosimetric characterisation of secondary protons were used as input to track-structure calculations performed with PARTRAC, thus evaluating the corresponding DNA damage induction. Within the proposed simplified approach, the results suggest a relevant role of secondary protons in inducing the maximal biological effectiveness for 1 MeV neutrons.

  2. Management of cosmic radiation exposure for aircraft crew in Japan.

    PubMed

    Yasuda, Hiroshi; Sato, Tatsuhiko; Yonehara, Hidenori; Kosako, Toshiso; Fujitaka, Kazunobu; Sasaki, Yasuhito

    2011-07-01

    The International Commission on Radiological Protection has recommended that cosmic radiation exposure of crew in commercial jet aircraft be considered as occupational exposure. In Japan, the Radiation Council of the government has established a guideline that requests domestic airlines to voluntarily keep the effective dose of cosmic radiation for aircraft crew below 5 mSv y⁻¹. The guideline also gives some advice and policies regarding the method of cosmic radiation dosimetry, the necessity of explanation and education about this issue, a way to view and record dose data, and the necessity of medical examination for crew. The National Institute of Radiological Sciences helps the airlines to follow the guideline, particularly for the determination of aviation route doses by numerical simulation. The calculation is performed using an original, easy-to-use program package called 'JISCARD EX' coupled with a PHITS-based analytical model and a GEANT4-based particle tracing code. The new radiation weighting factors recommended in 2007 are employed for effective dose determination. The annual individual doses of aircraft crew were estimated using this program.
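Route-dose determination of the kind described above amounts to integrating altitude- and location-dependent dose rates over the flight profile. A minimal sketch with made-up dose rates (JISCARD EX itself is not reproduced here; the segment values are illustrative assumptions):

```python
def route_dose_uSv(segments):
    """Total effective dose for one flight: sum over route segments of
    (duration [h] x effective dose rate [uSv/h])."""
    return sum(t * rate for t, rate in segments)

# illustrative climb / high-latitude cruise / descent profile (made-up rates)
flight = [(0.5, 1.0), (9.0, 5.0), (0.5, 1.0)]
per_flight = route_dose_uSv(flight)       # uSv for one flight
annual_mSv = 50 * per_flight / 1000.0     # 50 such flights per year
```

For this toy profile the annual total comes to 2.3 mSv, below the 5 mSv y⁻¹ guideline value mentioned in the record.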

  3. Depth profile of production yields of natPb(p, xn) 206,205,204,203,202,201Bi nuclear reactions

    NASA Astrophysics Data System (ADS)

    Mokhtari Oranj, Leila; Jung, Nam-Suk; Kim, Dong-Hyun; Lee, Arim; Bae, Oryun; Lee, Hee-Seock

    2016-11-01

    Experimental and simulation studies on the depth profiles of production yields of natPb(p, xn) 206,205,204,203,202,201Bi nuclear reactions were carried out. Irradiation experiments were performed at the high-intensity proton linac facility (KOMAC) in Korea. The targets, irradiated by 100-MeV protons, were arranged in a stack consisting of natural Pb, Al, and Au foils and Pb plates. The proton beam intensity was determined by the activation analysis method using the 27Al(p, 3p1n)24Na, 197Au(p, p1n)196Au, and 197Au(p, p3n)194Au monitor reactions and also by Gafchromic film dosimetry. The yields of the radionuclides produced in the natPb activation foils and monitor foils were measured with an HPGe spectroscopy system. Monte Carlo simulations were performed with the FLUKA, PHITS/DCHAIN-SP, and MCNPX/FISPACT codes, and the calculated data were compared with the experimental results. A satisfactory agreement was observed between the present experimental data and the simulations.
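The monitor-reaction technique mentioned above infers the beam intensity from the standard activation equation. A hedged sketch under simplifying assumptions (constant beam current during irradiation; the cross section and foil areal density are made-up round numbers, and the ~15 h half-life corresponds to 24Na):

```python
import math

def beam_intensity(a_eob_bq, n_areal, sigma_cm2, half_life_s, t_irr_s):
    """Beam intensity (protons/s) from the end-of-bombardment activity of
    a monitor foil: A_EOB = I * n * sigma * (1 - exp(-lambda * t_irr))."""
    lam = math.log(2.0) / half_life_s
    return a_eob_bq / (n_areal * sigma_cm2 * (1.0 - math.exp(-lam * t_irr_s)))

# round-trip consistency check with made-up numbers
true_i = 1.0e12                      # protons/s
n = 6.0e20                           # foil atoms per cm^2 (illustrative)
sigma = 1.0e-26                      # cm^2, i.e. 10 mb (illustrative)
t_half, t_irr = 15 * 3600.0, 3600.0  # ~24Na half-life; 1 h irradiation
lam = math.log(2.0) / t_half
a_eob = true_i * n * sigma * (1.0 - math.exp(-lam * t_irr))
i_est = beam_intensity(a_eob, n, sigma, t_half, t_irr)
```

The inversion recovers the assumed intensity exactly, which is the self-consistency the monitor method relies on.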

  4. Dose calculations at high altitudes and in deep space with GEANT4 using BIC and JQMD models for nucleus nucleus reactions

    NASA Astrophysics Data System (ADS)

    Sihver, L.; Matthiä, D.; Koi, T.; Mancusi, D.

    2008-10-01

    Radiation exposure of aircrew is more and more recognized as an occupational hazard. The ionizing environment at standard commercial aircraft flight altitudes consists mainly of secondary particles, of which the neutrons give a major contribution to the dose equivalent. Accurate estimations of neutron spectra in the atmosphere are therefore essential for correct calculations of aircrew doses. Energetic solar particle events (SPE) could also lead to significantly increased dose rates, especially on routes close to the North Pole, e.g. for flights between Europe and the USA. It is also well known that the radiation environment encountered by personnel aboard low Earth orbit (LEO) spacecraft or aboard a spacecraft traveling outside the Earth's protective magnetosphere is much harsher compared with that within the atmosphere, since the personnel are exposed to radiation from both galactic cosmic rays (GCR) and SPE. The relative contribution to the dose from GCR when traveling outside the Earth's magnetosphere, e.g. to the Moon or Mars, is even greater, and reliable and accurate particle and heavy ion transport codes are essential to calculate the radiation risks for both aircrew and personnel on spacecraft. We have therefore performed calculations of neutron distributions in the atmosphere, total dose equivalents, and quality factors at different depths in a water sphere in an imaginary spacecraft during solar minimum in a geosynchronous orbit. The calculations were performed with the GEANT4 Monte Carlo (MC) code using both the binary cascade (BIC) model, which is part of the standard GEANT4 package, and the JQMD model, which is used in the particle and heavy ion transport code PHITS.

  5. Estimation of the influence of radical effect in the proton beams using a combined approach with physical data and gel data

    NASA Astrophysics Data System (ADS)

    Haneda, K.

    2016-04-01

    The purpose of this study was to estimate the impact of the radical effect in proton beams using a combined approach with physical data and gel data. The study used two dosimeters: ionization chambers and polymer gel dosimeters. Polymer gel dosimeters have specific advantages over other dosimeters: they measure a chemical reaction, and they are at the same time a phantom that can map dose continuously and easily in three dimensions. First, the depth-dose curve for a 210 MeV proton beam was measured using an ionization chamber and a gel dosimeter. Second, the spatial distribution of the physical dose was calculated with the Monte Carlo code system PHITS; to verify the accuracy of the Monte Carlo calculation, the results were compared with the experimental ionization-chamber data, and the simulated and measured depth-dose distributions showed good agreement. The spatial distribution of the gel dose, with a threshold LET value for the proton beam, was calculated with the same simulation code. Last, the relative distribution of the radical effect was evaluated at each depth as the quotient of the relative doses obtained from the physical and gel data. The agreement between the relative distributions of the gel dosimeter and the radical effect was good for the proton beams.
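The final step described above, taking the depthwise quotient of relative doses, can be sketched as follows (the arrays are toy values, not the measured 210 MeV data):

```python
import numpy as np

def radical_effect_ratio(physical, gel):
    """Depthwise quotient of the normalised gel and physical depth-dose
    curves, i.e. a relative radical-effect distribution."""
    phys_rel = np.asarray(physical, float) / np.max(physical)
    gel_rel = np.asarray(gel, float) / np.max(gel)
    return gel_rel / phys_rel

# if the gel response were purely proportional to the physical dose,
# the ratio would be flat at 1.0 everywhere
ratio = radical_effect_ratio([1.0, 2.0, 4.0, 2.0], [2.0, 4.0, 8.0, 4.0])
```

Any depth-dependent departure of this ratio from a constant is what the study attributes to the radical (chemical) component of the gel response.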

  6. Measurement of the stochastic radial dose distribution for a 30-MeV proton beam using a wall-less tissue-equivalent proportional counter

    PubMed Central

    Tsuda, S.; Sato, T.; Ogawa, T.

    2016-01-01

    The frequency distribution of the lineal energy, y, of a 30-MeV proton beam was measured as a function of the radial distance from the beam path, and the dose-mean of y, ȳD, was obtained to investigate its radial dependence. A wall-less tissue-equivalent proportional counter with a cylindrical volume of simulated diameters 0.36, 0.72 and 1.44 µm was used for the measurement of the y distributions, yf(y). The measured values of yf(y) summed in the radial direction agreed fairly well with the corresponding microdosimetric calculations using the PHITS code. The ȳD value of the 30-MeV proton beam took its smallest value at r = 0.0 and gradually increased with radial distance, whereas the ȳD values of heavy ions such as iron decrease rapidly with radial distance. This experimental result demonstrates that the stochastic deposited-energy distribution of high-energy protons in the microscopic region is rather constant in the core as well as in the penumbra region of the track structure. PMID:25956785

  7. Microdosimetric Modeling of Biological Effectiveness for Boron Neutron Capture Therapy Considering Intra- and Intercellular Heterogeneity in 10B Distribution.

    PubMed

    Sato, Tatsuhiko; Masunaga, Shin-Ichiro; Kumada, Hiroaki; Hamada, Nobuyuki

    2018-01-17

    We here propose a new model for estimating the biological effectiveness of boron neutron capture therapy (BNCT), considering intra- and intercellular heterogeneity in the ¹⁰B distribution. The new model was developed from our previously established stochastic microdosimetric kinetic model, which determines the surviving fraction of cells irradiated with any type of radiation. In the model, the probability density of absorbed doses on microscopic scales is the fundamental physical index for characterizing the radiation fields. A new computational method was established to determine this probability density for application to BNCT using the Particle and Heavy Ion Transport code System, PHITS. The parameters used in the model were determined from the measured surviving fractions of tumor cells administered two kinds of ¹⁰B compounds. The model quantitatively highlighted the indispensable need to consider the synergistic effect and the dose dependence of the biological effectiveness when estimating the therapeutic effect of BNCT. The model can predict the biological effectiveness of newly developed ¹⁰B compounds based on their intra- and intercellular distributions, and it can thus play an important role not only in treatment planning but also in drug-discovery research for future BNCT.
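The stochastic model itself is not reproduced in the abstract; as a hedged illustration, the simpler deterministic microdosimetric kinetic model relates the surviving fraction to the dose-mean specific energy per event in a domain, z1D (the parameter values below are illustrative, not those fitted in this work):

```python
import math

def mkm_survival(dose_gy, z1d_gy, alpha0=0.13, beta=0.05):
    """Deterministic microdosimetric kinetic model:
        ln S = -(alpha0 + beta * z1D) * D - beta * D**2
    with z1D the dose-mean specific energy per event [Gy].
    alpha0 [Gy^-1] and beta [Gy^-2] are illustrative tissue parameters."""
    alpha = alpha0 + beta * z1d_gy
    return math.exp(-alpha * dose_gy - beta * dose_gy ** 2)

low_let = mkm_survival(2.0, 0.5)    # sparsely ionising field, 2 Gy
high_let = mkm_survival(2.0, 3.0)   # densely ionising field, same dose
```

At equal absorbed dose, the denser field (larger z1D) yields a lower surviving fraction, which is the mechanism by which microdosimetric heterogeneity, including that of the ¹⁰B capture events, enters the effectiveness estimate.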

  8. Dosimetric investigation of proton therapy on CT-based patient data using Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Chongsan, T.; Liamsuwan, T.; Tangboonduangjit, P.

    2016-03-01

    The aim of radiotherapy is to deliver a high radiation dose to the tumor with a low radiation dose to healthy tissues. Protons have a Bragg peak that delivers a high radiation dose to the tumor with little exit dose or dose tail. Proton therapy is therefore promising for treating deep-seated tumors and tumors located close to organs at risk. Moreover, the physical characteristics of protons are suitable for treating cancer in pediatric patients. This work developed a computational platform for calculating proton dose distributions using the Monte Carlo (MC) technique and the patient's anatomical data. The studied case is a pediatric patient with a primary brain tumor. PHITS will be used for the MC simulation; therefore, patient-specific CT-DICOM files were converted to PHITS input. A MATLAB optimization program was developed to create a beam-delivery control file for this study. The optimization program requires the proton beam data. All these data were calculated in this work using analytical formulas, and the calculation accuracy was tested before the beam-delivery control file is used for the MC simulation. This study will be useful for researchers who aim to investigate proton dose distributions in patients but do not have access to proton therapy machines.
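Converting CT-DICOM data to Monte Carlo input typically involves mapping CT numbers to material densities. A common textbook-style sketch (the bilinear calibration below is an assumption for illustration; the actual conversion used in this work is not described in the abstract):

```python
def hu_to_density(hu):
    """Map a CT number (Hounsfield units) to mass density in g/cm^3.
    Piecewise-linear calibration; the slopes are illustrative, and real
    calibrations are scanner- and protocol-specific."""
    hu = max(hu, -1000.0)            # clamp below the air value
    if hu <= 0.0:                    # air .. water segment
        return 1.0 + hu / 1000.0
    return 1.0 + 0.0006 * hu         # soft tissue .. bone segment (assumed slope)
```

Applying such a mapping voxel by voxel, followed by binning densities into a manageable set of materials, is the usual route from a DICOM image stack to a voxelized MC geometry.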

  9. Secondary Neutron Doses to Pediatric Patients During Intracranial Proton Therapy: Monte Carlo Simulation of the Neutron Energy Spectrum and its Organ Doses.

    PubMed

    Matsumoto, Shinnosuke; Koba, Yusuke; Kohno, Ryosuke; Lee, Choonsik; Bolch, Wesley E; Kai, Michiaki

    2016-04-01

    Proton therapy has the physical advantage of a Bragg peak that can provide a better dose distribution than conventional x-ray therapy. However, radiation exposure of normal tissues cannot be ignored, because it is likely to increase the risk of secondary cancer. It is therefore necessary to evaluate the secondary neutrons generated by the interaction of the proton beam with the treatment beam-line structure, and thus to perform the optimization of radiation protection in proton therapy. In this research, the organ doses and energy spectra of secondary neutrons were calculated using Monte Carlo simulations. The Particle and Heavy Ion Transport code System (PHITS) was used to simulate proton transport and the interactions with the treatment beam-line structure, modeled on the double-scattering body of the treatment nozzle at the National Cancer Center Hospital East. The organ doses were calculated in a hybrid computational phantom simulating a 5-y-old boy. In general, secondary neutron doses were found to decrease with increasing distance from the treatment field. The secondary neutron energy spectra were characterized by three energy peaks: 1×10, 1, and 100 MeV. The block collimator and the patient collimator contributed significantly to the organ doses. In particular, the secondary neutrons from the patient collimator were 30 times higher than those from the first scatterer. These results suggest that proactive protection will be required in the design of treatment beam-line structures and that organ doses from secondary neutrons could thereby be reduced.

  10. Nuclear fragmentation of GCR-like ions: comparisons between data and PHITS

    NASA Astrophysics Data System (ADS)

    Zeitlin, Cary; Guetersloh, Stephen; Heilbronn, Lawrence; Miller, Jack; Sihver, Lembit; Mancusi, Davide; Fukumura, Aki; Iwata, Yoshi; Murakami, Takeshi

    We present a summary of results from recent work in which we have compared nuclear fragmentation cross-section data to predictions of the PHITS Monte Carlo simulation. The studies used beams of ¹²C, ³⁵Cl, ⁴⁰Ar, ⁴⁸Ti, and ⁵⁶Fe at energies ranging from 290 MeV/nucleon to 1000 MeV/nucleon. Some of the data were obtained at the Brookhaven National Laboratory, others at the National Institute of Radiological Sciences in Japan. These energies and ion species are representative of the heavy-ion component of the Galactic Cosmic Rays (GCR), which contribute significantly to the dose and dose equivalent that will be received by astronauts on deep-space missions. A critical need for NASA is the ability to accurately model the transport of GCR heavy ions through matter, including spacecraft walls, equipment racks, and other shielding materials, as well as through tissue. Nuclear interaction cross sections are of primary importance in the GCR transport problem. These interactions generally cause the incoming ion to break up (fragment) into one or more lighter ions, which continue approximately along the initial trajectory and with approximately the same velocity the incoming ion had prior to the interaction. Since the radiation dose delivered by a particle is proportional to the square of the quantity charge/velocity, i.e., to (Z/β)², fragmentation reduces the dose (and, typically, dose equivalent) delivered by incident ions. The other mechanism by which dose can be reduced is ionization energy loss, which can lead to some particles stopping in the shielding. This is the conventional notion of shielding, but it is not applicable to human spaceflight, since the particles in the GCR tend to be highly energetic and because shielding must be relatively thin in order to keep overall mass, and thus launch costs, within reason. 
To support these goals, our group has systematically measured a large number of nuclear cross sections, intended to be used as either input to, or validation of, NASA transport models. A database containing over 200 charge-changing cross sections and over 2000 fragment-production cross sections is nearing completion, with most results available online. In the past year, we have been investigating the PHITS (Particle and Heavy Ion Transport code System) model of Niita et al. For the purpose of modeling nuclear interactions, PHITS combines the Jet AA Microscopic Transport Model (JAM) hadron cascade model, the JAERI Quantum Molecular Dynamics (JQMD) model, and the Generalized Evaporation Model (GEM). We will present detailed comparisons of our data to the cross sections and fragment angular distributions that arise from this model. The model contains some significant deficiencies but, as we will show, also represents a significant advance over older, simpler models of fragmentation.
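The (Z/β)² scaling mentioned above makes the dose-reducing effect of fragmentation easy to quantify. A sketch with a hypothetical fragmentation channel (the charge split below is invented purely for illustration):

```python
def relative_dose(z, beta):
    """Dose deposited per unit path length scales as (Z/beta)^2."""
    return (z / beta) ** 2

# fragments inherit the parent's velocity, so at fixed beta the summed
# fragment dose, sum(Z_i^2), is always below the parent's Z^2
beta = 0.8
parent = relative_dose(26, beta)                          # 56Fe projectile
fragments = sum(relative_dose(z, beta) for z in (20, 6))  # hypothetical split
```

Since the sum of squares of the fragment charges is always less than the square of the parent charge, every fragmentation event lowers the dose per unit path, which is the shielding benefit the record describes.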

  11. Path Toward a Unified Geometry for Radiation Transport

    NASA Astrophysics Data System (ADS)

    Lee, Kerry

    The Direct Accelerated Geometry for Radiation Analysis and Design (DAGRAD) element of the RadWorks Project under Advanced Exploration Systems (AES) within the Space Technology Mission Directorate (STMD) of NASA will enable new designs and concepts of operation for radiation risk assessment, mitigation and protection. This element is designed to produce a solution that will allow NASA to calculate the transport of space radiation through complex CAD models using the state-of-the-art analytic and Monte Carlo radiation transport codes. Due to the inherent hazard of astronaut and spacecraft exposure to ionizing radiation in low-Earth orbit (LEO) or in deep space, risk analyses must be performed for all crew vehicles and habitats. Incorporating these analyses into the design process can minimize the mass needed solely for radiation protection. Transport of the radiation fields as they pass through shielding and body materials can be simulated using Monte Carlo techniques or described by the Boltzmann equation, which is obtained by balancing changes in particle fluxes as they traverse a small volume of material with the gains and losses caused by atomic and nuclear collisions. Deterministic codes that solve the Boltzmann transport equation, such as HZETRN (high charge and energy transport code developed by NASA LaRC), are generally computationally faster than Monte Carlo codes such as FLUKA, GEANT4, MCNP(X) or PHITS; however, they are currently limited to transport in one dimension, which poorly represents the secondary light ion and neutron radiation fields. NASA currently uses HZETRN space radiation transport software, both because it is computationally efficient and because proven methods have been developed for using this software to analyze complex geometries. 
Although Monte Carlo codes describe the relevant physics in a fully three-dimensional manner, their computational costs have thus far prevented their widespread use for the analysis of complex CAD models, leading to the creation and maintenance of toolkit-specific, simplified geometry models. The work presented here builds on the Direct Accelerated Geometry Monte Carlo (DAGMC) toolkit developed for use with the Monte Carlo N-Particle (MCNP) transport code. The workflow for performing radiation transport on CAD models using MCNP and FLUKA has been demonstrated, and the results of analyses of realistic spacecraft/habitats will be presented. Future work is planned that will further automate this process and enable the use of multiple radiation transport codes on identical geometry models imported from CAD. This effort will enhance the modeling tools used by NASA to accurately evaluate astronaut space radiation risk and to determine the protection provided by as-designed exploration mission vehicles and habitats.
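The contrast drawn above between deterministic and Monte Carlo transport can be illustrated on the simplest possible problem, uncollided transmission through a slab, where the analytic Boltzmann-equation result is exp(-Σt) and a Monte Carlo code instead samples free paths (the cross section and thickness below are arbitrary illustrative values):

```python
import math
import random

def mc_transmission(sigma_per_cm, thickness_cm, n=200_000, seed=42):
    """Monte Carlo estimate of the uncollided fraction through a slab:
    sample free paths from the exponential distribution and count
    particles whose first collision lies beyond the slab."""
    rng = random.Random(seed)
    crossed = sum(
        1 for _ in range(n)
        if -math.log(1.0 - rng.random()) / sigma_per_cm > thickness_cm
    )
    return crossed / n

analytic = math.exp(-0.5 * 2.0)      # Sigma_t = 0.5 cm^-1, 2 cm slab
estimate = mc_transmission(0.5, 2.0)
```

The Monte Carlo answer converges to the analytic one only as 1/sqrt(n), which in miniature is why full MC transport through complex CAD geometries is so much more expensive than deterministic solvers like HZETRN.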

  12. Environmental Acoustic Considerations for Passive Detection of Maritime Targets by Hydrophones in a Deep Ocean Trench

    DTIC Science & Technology

    2010-06-01

    Science and Technology. Available: http://cmst.curtin.edu.au/local/docs/ products / actup_v2_2l_installation_user_guide.pdf (accessed 2 June 2010...noisecurve112(:,6)); %% Intergrating Noise Level Trench A n2=0; Itot=0; phi_t=atan(D1/L1); m=1; while (phi(m,1)>phi_t) m=m+1; end

  13. PHIT for Duty, a Personal Health Intervention Tool for Psychological Health and Traumatic Brain Injury

    DTIC Science & Technology

    2014-04-01

    of this project is to help prevent psychological disorders in high-risk individuals with early symptoms of stress, depression, substance use, and...questionnaires in five domains (i.e., stress, anxiety, sleep quality, depression, and alcohol use). An expert system, called the intelligent virtual...problems mentioned were depression, anxiety and sleep issues. Additional post deployment health problems discussed include stress, aggression, social

  14. PHIT for Duty, a Personal Health Intervention Tool for Psychological Health and Traumatic Brain Injury

    DTIC Science & Technology

    2016-06-01

    smartphone or tablet computer platforms, including both Google Android™ and Apple iOS based devices. Recruiting for the pilot study was very...framework design. 15. SUBJECT TERMS PTSD, post-traumatic stress disorder, mobile health, self-help, iOS, Android, mindfulness, relaxation... study and subsequent randomized controlled trial (RCT) with post-deployed personnel; and (5) adapting the developed system for several popular

  15. Multistep Lattice-Voxel method utilizing lattice function for Monte-Carlo treatment planning with pixel based voxel model.

    PubMed

    Kumada, H; Saito, K; Nakamura, T; Sakae, T; Sakurai, H; Matsumura, A; Ono, K

    2011-12-01

    Treatment planning for boron neutron capture therapy generally utilizes Monte Carlo methods for the calculation of the dose distribution. The new treatment planning system JCDS-FX employs the multi-purpose Monte Carlo code PHITS to calculate the dose distribution. JCDS-FX makes it possible to build a precise voxel model consisting of pixel-based voxel cells on the scale of 0.4×0.4×2.0 mm³ per voxel in order to perform high-accuracy dose estimation, e.g. for the purpose of calculating the dose distribution in a human body. However, the miniaturization of the voxel size increases the calculation time considerably. The aim of this study is to investigate sophisticated modeling methods which can perform Monte Carlo calculations for human geometry efficiently. Thus, we devised a new voxel modeling method, the "Multistep Lattice-Voxel method," which can configure a voxel model that combines different voxel sizes by applying the lattice function repeatedly. To verify the performance of calculations with this modeling method, several calculations for human geometry were carried out. The results demonstrated that the Multistep Lattice-Voxel method can reduce the calculation time for a precise voxel model substantially while maintaining high-accuracy dose estimation.
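The nesting idea behind a multistep lattice can be sketched as a coarse grid that carries refined sub-grids only where high resolution matters. The class below is a schematic illustration of that data structure, not the JCDS-FX/PHITS implementation (sizes, materials, and defaults are invented):

```python
class MultistepVoxelModel:
    """Coarse uniform grid with optional refined sub-grids inside selected
    voxels, mimicking the nested-lattice idea: fine cells only where needed."""

    def __init__(self, coarse_size, coarse, refined=None):
        self.size = coarse_size          # coarse voxel edge length (mm)
        self.coarse = coarse             # {(i, j, k): material}
        self.refined = refined or {}     # {(i, j, k): (n_sub, sub_cells)}

    def material_at(self, x, y, z):
        idx = tuple(int(c // self.size) for c in (x, y, z))
        if idx in self.refined:
            n_sub, sub_cells = self.refined[idx]
            fine = self.size / n_sub     # edge length of the sub-voxels
            sub = tuple(int((c % self.size) // fine) for c in (x, y, z))
            return sub_cells.get(sub, "tissue")
        return self.coarse.get(idx, "air")

# one coarse 10 mm voxel, refined 2x2x2 with a single "bone" sub-cell
model = MultistepVoxelModel(
    10.0,
    coarse={(0, 0, 0): "tissue"},
    refined={(0, 0, 0): (2, {(0, 0, 0): "bone"})},
)
```

Only the refined voxel stores sub-cells, so memory and lookup cost grow with the region of interest rather than with the whole body, which is the efficiency gain the abstract reports.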

  17. Differential cross-sections measurements for hadrontherapy: 50 MeV/A 12C reactions on H, C, O, Al and natTi targets

    NASA Astrophysics Data System (ADS)

    Divay, C.; Colin, J.; Cussol, D.; Finck, Ch.; Karakaya, Y.; Labalme, M.; Rousseau, M.; Salvador, S.; Vanstalle, M.

    2017-09-01

    In order to preserve the benefits of carbon-ion treatment, the dose and biological effects induced by secondary fragments must be taken into account when simulating the treatment plan. The Monte Carlo simulation codes used for this purpose rely on nuclear models that must be constrained by experimental data. It is hence necessary to have precise measurements of the production rates of these fragments along the entire beam path and over its whole energy range. In this context, our collaboration has started a series of experiments aiming to measure the double-differential fragmentation cross-sections of carbon on thin targets of medical interest. In March 2015, an experiment was performed with a 50 MeV/nucleon 12C beam at GANIL. During this experiment, energy and angular differential cross-section distributions on H, C, O, Al and natTi targets were measured. In the following, the experimental set-up and analysis process are briefly described, and some experimental results are presented. Comparisons with several exit-channel models from PHITS and Geant4 show large discrepancies with the experimental data. Finally, the in-house Sliipie model is briefly presented, and preliminary results are compared to the data with a promising outcome.
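Extracting a differential cross-section from such measurements reduces, per angular bin, to normalizing the fragment counts by the beam fluence, the target areal density, and the detector solid angle. A sketch with made-up counting numbers (detection efficiency folded into one factor; divide further by the energy-bin width for a double-differential value):

```python
BARN_CM2 = 1.0e-24

def diff_cross_section_cm2_sr(n_frag, n_beam, n_areal, d_omega_sr, eff=1.0):
    """Angular differential cross-section from raw counts:
    dsigma/dOmega = N_frag / (N_beam * n_target * dOmega * efficiency)."""
    return n_frag / (n_beam * n_areal * d_omega_sr * eff)

# illustrative counting example (all values invented)
xs = diff_cross_section_cm2_sr(
    n_frag=500, n_beam=1.0e9, n_areal=1.0e21, d_omega_sr=1.0e-3)
xs_barn_sr = xs / BARN_CM2
```

For these toy numbers the result is 0.5 b/sr; in practice the dominant work lies in determining the efficiency, acceptance, and beam normalization that enter the denominator.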

  18. Measurement And Calculation of High-Energy Neutron Spectra Behind Shielding at the CERF 120-GeV/C Hadron Beam Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakao, N.; /SLAC; Taniguchi, S.

    Neutron energy spectra were measured behind the lateral shield of the CERF (CERN-EU High Energy Reference Field) facility at CERN with a 120 GeV/c positive hadron beam (a mixture of mainly protons and pions) on a cylindrical copper target (7 cm diameter by 50 cm long). An NE213 organic liquid scintillator (12.7 cm diameter by 12.7 cm long) was located at various longitudinal positions behind shields of 80- and 160-cm-thick concrete and 40-cm-thick iron. The measurement locations cover an angular range with respect to the beam axis between 13° and 133°. Neutron energy spectra in the energy range between 32 MeV and 380 MeV were obtained by unfolding the measured pulse-height spectra with the detector response functions, which had been verified in the neutron energy range up to 380 MeV in separate experiments. Since the source term and experimental geometry in this experiment are well characterized and simple, and the results are given in the form of energy spectra, these experimental results are very useful as benchmark data for checking the accuracy of simulation codes and nuclear data. Monte Carlo simulations of the experimental setup were performed with the FLUKA, MARS and PHITS codes. Simulated spectra for the 80-cm-thick concrete often agree within the experimental uncertainties. On the other hand, for the 160-cm-thick concrete and the iron shield, differences are generally larger than the experimental uncertainties, yet within a factor of 2. Based on source-term simulations, the observed discrepancies among simulations of spectra outside the shield can be partially explained by differences in the high-energy hadron production in the copper target.
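The unfolding step described above solves the linear model m = R φ, relating the measured pulse-height spectrum m to the neutron spectrum φ through the detector response matrix R. A deliberately naive sketch (real unfolding codes add regularisation and a-priori spectra; the 2×2 response below is synthetic):

```python
import numpy as np

def unfold_spectrum(response, measured):
    """Naive unfolding of a pulse-height spectrum m = R @ phi by least
    squares, with a non-negativity clip; production unfolding codes add
    regularisation, but the underlying linear model is the same."""
    phi, *_ = np.linalg.lstsq(np.asarray(response, float),
                              np.asarray(measured, float), rcond=None)
    return np.clip(phi, 0.0, None)

# synthetic round trip: fold a known spectrum, then recover it
R = np.array([[1.0, 0.2],
              [0.1, 1.0]])
phi_true = np.array([3.0, 5.0])
measured = R @ phi_true
phi_est = unfold_spectrum(R, measured)
```

The round trip recovers the assumed spectrum exactly here because the synthetic response is well conditioned; with realistic, nearly singular responses, the choice of regularisation dominates the result.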

  19. Rational evaluation of the therapeutic effect and dosimetry of auger electrons for radionuclide therapy in a cell culture model.

    PubMed

    Shinohara, Ayaka; Hanaoka, Hirofumi; Sakashita, Tetsuya; Sato, Tatsuhiko; Yamaguchi, Aiko; Ishioka, Noriko S; Tsushima, Yoshito

    2018-02-01

    Radionuclide therapy with low-energy Auger electron emitters may provide high antitumor efficacy while keeping the toxicity to normal organs low. Here we evaluated the usefulness of an Auger electron emitter, compared with a beta emitter, for tumor treatment in in vitro models, and conducted a dosimetry simulation using radioiodine-labeled metaiodobenzylguanidine (MIBG) as a model compound. We evaluated the cellular uptake of ¹²⁵I-MIBG and the therapeutic effects of ¹²⁵I- and ¹³¹I-MIBG in 2D and 3D PC-12 cell culture models. We used a Monte Carlo simulation code (PHITS) to calculate the absorbed radiation dose of ¹²⁵I or ¹³¹I in computer simulation models of the 2D and 3D cell cultures. In the dosimetry calculation for the 3D model, several distribution patterns of the radionuclide were applied. A higher cumulative dose was observed in the 3D model due to the prolonged retention of MIBG compared to the 2D model. However, ¹²⁵I-MIBG showed a greater therapeutic effect in the 2D model than in the 3D model (respective EC50 values in the 2D and 3D models: 86.9 and 303.9 MBq/cell), whereas ¹³¹I-MIBG showed the opposite result (respective EC50 values in the 2D and 3D models: 49.4 and 30.2 MBq/cell). The therapeutic effect of ¹²⁵I-MIBG was lower than that of ¹³¹I-MIBG in both models, but the radionuclide-derived difference was smaller in the 2D model. The dosimetry simulation with PHITS revealed the influence of radiation quality, the crossfire effect, the radionuclide distribution, and the tumor shape on the absorbed dose. Applying the heterogeneous distribution patterns dramatically changed the radiation dose distribution of ¹²⁵I-MIBG and mitigated the difference between the estimated and measured therapeutic effects of ¹²⁵I-MIBG. 
The therapeutic effect of ¹²⁵I-MIBG was comparable to that of ¹³¹I-MIBG in the 2D model, but its efficacy was inferior to that of ¹³¹I-MIBG in the 3D model, since the crossfire effect is negligible and the homogeneous distribution of the radionuclide was insufficient. Thus, Auger electrons would be suitable for treating small tumors. The design of radiopharmaceuticals with Auger electron emitters requires particularly careful consideration of achieving a homogeneous distribution of the compound in the tumor.

  20. The Salt-Gradient Solar Pond.

    DTIC Science & Technology

    1983-02-01

  1. Electromagnetic Radiation in the Atmosphere Generated by Excess Negative Charge in a Nuclear-Electromagnetic Cascade

    NASA Astrophysics Data System (ADS)

    Malyshevsky, V. S.; Fomin, G. V.

    2017-01-01

    On the basis of the analytical model "PARMA" (PHITS-based Analytical Radiation Model in the Atmosphere), developed to model particle fluxes of secondary cosmic radiation in the Earth's atmosphere, we have calculated the characteristics of radio waves emitted by excess negative charge in an electromagnetic cascade. The results may be of use in an analysis of experimental data on radio emission of electron-photon showers in the atmosphere.

  2. PHIT for Duty, a Personal Health Intervention Tool for Psychological Health and Traumatic Brain Injury

    DTIC Science & Technology

    2012-04-01

    heart rate (HR), heart rate variability (HRV), and body motion and transmit data to the smartphone via Bluetooth wireless. The planned suite of...behaviors (e.g., alcohol use, exercise) are combined with objective measures (e.g., HRV arousal measures) to form an overall health status assessment...of primary health domains (PTSD, depression, anxiety, stress, alcohol use). Scheduled instrument and intervention tasks will be listed on the

  3. Evaluation of HRV Biofeedback as a Resilience Building Reserve Component

    DTIC Science & Technology

    2017-08-01

    Inventions, patent applications, and/or licenses Nothing to report  Other Products  Study recruitment/announcement video  Mobile app for...International Award Amount: $1,833,144 Study / Product Aim(s) 1. Develop and test the PHIT platform for use with the BART protocol. 2. Examine...STATEMENT Approved for Public Release; Distribution Unlimited 13. SUPPLEMENTARY NOTES 14. ABSTRACT The specific aims of this study are to (1) develop a

  4. Path Toward a Unified Geometry for Radiation Transport

    NASA Technical Reports Server (NTRS)

    Lee, Kerry; Barzilla, Janet; Davis, Andrew; Zachmann

    2014-01-01

    The Direct Accelerated Geometry for Radiation Analysis and Design (DAGRAD) element of the RadWorks Project under Advanced Exploration Systems (AES) within the Space Technology Mission Directorate (STMD) of NASA will enable new designs and concepts of operation for radiation risk assessment, mitigation and protection. This element is designed to produce a solution that will allow NASA to calculate the transport of space radiation through complex computer-aided design (CAD) models using the state-of-the-art analytic and Monte Carlo radiation transport codes. Due to the inherent hazard of astronaut and spacecraft exposure to ionizing radiation in low-Earth orbit (LEO) or in deep space, risk analyses must be performed for all crew vehicles and habitats. Incorporating these analyses into the design process can minimize the mass needed solely for radiation protection. Transport of the radiation fields as they pass through shielding and body materials can be simulated using Monte Carlo techniques or described by the Boltzmann equation, which is obtained by balancing changes in particle fluxes as they traverse a small volume of material with the gains and losses caused by atomic and nuclear collisions. Deterministic codes that solve the Boltzmann transport equation, such as HZETRN [high charge and energy transport code developed by NASA Langley Research Center (LaRC)], are generally computationally faster than Monte Carlo codes such as FLUKA, GEANT4, MCNP(X) or PHITS; however, they are currently limited to transport in one dimension, which poorly represents the secondary light ion and neutron radiation fields. NASA currently uses HZETRN space radiation transport software, both because it is computationally efficient and because proven methods have been developed for using this software to analyze complex geometries. 
Although Monte Carlo codes describe the relevant physics in a fully three-dimensional manner, their computational costs have thus far prevented their widespread use for analysis of complex CAD models, leading to the creation and maintenance of toolkit-specific simplified geometry models. The work presented here builds on the Direct Accelerated Geometry Monte Carlo (DAGMC) toolkit developed for use with the Monte Carlo N-Particle (MCNP) transport code. The workflow for achieving radiation transport on CAD models using MCNP and FLUKA has been demonstrated, and the results of analyses on realistic spacecraft/habitats will be presented. Future work is planned that will further automate this process and enable the use of multiple radiation transport codes on identical geometry models imported from CAD. This effort will enhance the modeling tools used by NASA to accurately evaluate the astronaut space radiation risk and determine the protection provided by as-designed exploration mission vehicles and habitats.

  5. PHIT for Duty, a Personal Health Intervention Tool for Psychological Health and Traumatic Brain Injury

    DTIC Science & Technology

    2013-04-01

    Findings Post-deployment health problems. The top three post-deployment health problems mentioned were depression, anxiety and sleep issues... anxiety) are immediately available to the iVA, which is able to determine how to proceed with the user. The iVA may choose to schedule a screening... anxiety, sleep quality, depression, and alcohol use). For each domain, the screening data are analyzed by the iVA and a subsequent detailed assessment

  6. An Augmented γ-Spray System to Visualize Biological Effects for Human Body

    NASA Astrophysics Data System (ADS)

    Manabe, Seiya; Tenzou, Hideki; Kasuga, Takaaki; Iwakura, Yukiko; Johnston, Robert

    2017-09-01

    The purpose of this study was to develop a new educational system with an easy-to-use interface to support comprehension of the biological effects of radiation on the human body within a short period of time. A paint spray-gun was used as a gamma-ray source mock-up for the system. The application screen shows the figure of a human body for radiation deposition using the γ-Sprayer, a virtual radiation source, as well as the equivalent dose and a panel for setting the irradiation conditions. While the learner stands in front of the PC monitor, the virtual radiation source is used to deposit radiation on the displayed graphic of the human body. While the learner pulls the trigger, tissue damage is calculated by interpolating data pre-calculated with the PHITS simulation code with respect to the irradiation time, incident position, and distance from the screen. It was confirmed that the damage was well represented by the interpolation method. The augmented γ-Spray system was assessed by questionnaire: a pre-post questionnaire was administered to 41 students at the National Institute of Technology, Kagawa College. It was also confirmed that the system is capable of teaching basic radiation protection concepts, a quantitative feel for radiation dose, and the biological effects of radiation.
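A minimal sketch of the interpolation step this record describes, assuming a hypothetical table of dose rates pre-computed with PHITS as a function of source distance (the values, grid, and function names below are invented for illustration):

```python
import numpy as np

# Hypothetical pre-computed PHITS results: dose rate (mSv/s) on the
# displayed body as a function of source-to-screen distance (cm).
distance_cm = np.array([10.0, 20.0, 40.0, 80.0])
dose_rate = np.array([8.0, 2.0, 0.5, 0.125])   # falls off with distance

def estimated_dose(distance, trigger_seconds):
    """Piecewise-linearly interpolate the pre-computed dose rate at the
    current distance, then scale by how long the trigger is held."""
    rate = np.interp(distance, distance_cm, dose_rate)
    return rate * trigger_seconds
```

At 20 cm held for 2 s this returns 4.0 mSv; a real implementation would also interpolate over incident position, as the abstract notes.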

  7. SU-F-T-62: Three-Dimensional Dosimetric Gamma Analysis for Impacts of Tissue Inhomogeneity Using Monte Carlo Simulation in Intracavitary Brachytherapy for Cervix Carcinoma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, Tran Thi Thao; Nakamoto, Takahiro; Shibayama, Yusuke

    Purpose: The aim of this study was to investigate the impacts of tissue inhomogeneity on dose distributions using a three-dimensional (3D) gamma analysis in cervical intracavitary brachytherapy using Monte Carlo (MC) simulations. Methods: MC simulations for comparison of dose calculations were performed in a water phantom and a series of CT images of a cervical cancer patient (stage: Ib; age: 27) by employing an MC code, the Particle and Heavy Ion Transport code System (PHITS) version 2.73. The 192Ir source was set at fifteen dwell positions, according to clinical practice, in an applicator consisting of a tandem and two ovoids. Dosimetric comparisons were performed for the dose distributions in the water phantom and CT images by using gamma index images and gamma pass rates (%). The gamma index is the minimum Euclidean distance between the two 3D spatial dose distributions of the water phantom and CT images in the same space. The gamma pass rate (%) is the percentage of agreement points, i.e. points at which the two dose distributions are similar within an acceptance criterion (3 mm/3%). The volumes of physical and clinical interest for the gamma analysis were the whole calculated volume and a region receiving more than t% of the dose (close to the target), respectively. Results: The gamma pass rates were 77.1% for the whole calculated volume and 92.1% for the region receiving more than 1% of the dose. Differences of 7.7% to 22.9% between the two dose distributions in the water phantom and CT images were found around the applicator region and near the target. Conclusion: This work revealed large differences in the dose distributions near the target in the presence of tissue inhomogeneity. Therefore, tissue inhomogeneity should be corrected for in dose calculations for clinical treatment.
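The gamma comparison this record relies on can be sketched with a brute-force implementation of the global 3 mm/3% criterion; the dose grids are hypothetical, and production tools use far more optimized searches than this direct double loop:

```python
import numpy as np

def gamma_pass_rate(dose_ref, dose_eval, spacing_mm=1.0,
                    dta_mm=3.0, dd_percent=3.0):
    """Brute-force 3D gamma pass rate with a global 3 mm / 3% criterion.

    dose_ref, dose_eval: 3D dose arrays on the same grid (hypothetical).
    Returns the percentage of reference points with gamma <= 1.
    """
    dd = dd_percent / 100.0 * dose_ref.max()      # global dose tolerance
    # Coordinate grids in mm for every evaluation point
    idx = np.indices(dose_ref.shape).astype(float) * spacing_mm
    pts = list(np.ndindex(dose_ref.shape))
    pass_count = 0
    for p in pts:
        r = np.array(p, dtype=float) * spacing_mm
        dist2 = sum((idx[k] - r[k]) ** 2 for k in range(3)) / dta_mm**2
        dose2 = (dose_eval - dose_ref[p]) ** 2 / dd**2
        gamma = np.sqrt(dist2 + dose2).min()      # minimum over eval grid
        if gamma <= 1.0:
            pass_count += 1
    return 100.0 * pass_count / len(pts)
```

Identical distributions score 100%; a uniform 100% dose error fails everywhere, since no nearby point can satisfy the 3% dose tolerance.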

  8. A mechanistic model of environmental oxygen influence on the deterministic effects to human skin from space radiations

    NASA Astrophysics Data System (ADS)

    Flores-McLaughlin, John

    During human spaceflight missions, controlled variation of atmospheric pressure and oxygen concentration from a sea-level based normal to hyperoxic levels may occur as part of operational procedure. This activity is of interest because it provides the relevant radiation exposure and dynamic oxygen concentration parameters that may lead to varying radiation sensitivity in the skin and other organs. Tumor hypoxia has been indicated as a primary factor in the decrease in efficacy of radiation therapy. These oxygen concentration effects have been largely demonstrated with low-LET radiations and to a lesser degree with high-LET primary radiations such as protons and heavy ions common in space exposure. In order to analyze the variation of oxygen concentration in human skin from spaceflight activities, a mathematical model of oxygen transport through the human cardiorespiratory system with pulmonary and cutaneous intake was implemented. Oxygen concentration was simulated at the various skin layers, from dermis to epidermis. Skin surface radiation doses and spectra from relatively high flux Solar Particle Events (SPEs) were calculated by the PHITS radiation transport code over a range of spacecraft and spacesuit thicknesses in terms of aluminum equivalence. A series of anatomical skin and shielding thicknesses were chosen to encompass the scope of radiation exposure levels as indicated by existing NASA skin phantom studies. To model the influence of oxygen with radiation exposure, microdosimetric oxygen fixation simulations were implemented using the Monte-Carlo-Damage-Simulation (MCDS) code. From these outputs, occurrence of DNA double strand breaks (DSBs) and relative biological effect (RBE) from radiation exposure with oxygen concentration dependence was established and correlated to spaceflight activities. It was determined that minimal but observable oxygen concentration transients occur in skin during environmental oxygen changes in spaceflight. 
The most significant transients occurred in the thickest epidermal layers with relatively high amounts of diffusion. Accordingly, these thickest epidermal layers also showed the greatest spaceflight-induced transients of RBE relative to sea-level atmosphere exposures.

  9. Effect of secondary electron generation on dose enhancement in Lipiodol with and without a flattening filter.

    PubMed

    Kawahara, Daisuke; Ozawa, Shuichi; Saito, Akito; Kimura, Tomoki; Suzuki, Tatsuhiko; Tsuneda, Masato; Tanaka, Sodai; Nakashima, Takeo; Ohno, Yoshimi; Murakami, Yuji; Nagata, Yasushi

    2018-03-01

    Lipiodol, which is used in transcatheter arterial chemoembolization before liver stereotactic body radiation therapy (SBRT), remains present during SBRT. Previously, we reported the dose enhancement in Lipiodol using a 10 MV (10×) flattening-filter-free beam. In this study, we compared the dose enhancement in Lipiodol and evaluated the probability of electron generation (PEG) underlying the dose enhancement using flattening filter (FF) and flattening-filter-free (FFF) beams. FF and FFF beams at 6 MV (6×) and 10× were delivered by a TrueBeam. The dose enhancement factor (DEF), energy spectrum, and PEG were calculated using the Monte Carlo (MC) code BEAMnrc and the Particle and Heavy Ion Transport code System (PHITS). DEFs for the FF and FFF 6× beams were 7.0% and 17.0% at the center of Lipiodol (depth, 6.5 cm). DEFs for the FF and FFF 10× beams were 8.2% and 10.5% at the center of Lipiodol. Spectral analysis revealed that the FFF beams contained more low-energy (0-0.3 MeV) electrons than the FF beams, and the FF beams contained more high-energy (>0.3 MeV) electrons than the FFF beams in Lipiodol. The difference between FFF and FF beam DEFs was larger for 6× than for 10×; this occurred because the 10× beams contained more high-energy electrons. The PEGs for photoelectric absorption and Compton scattering for the FFF beams were higher than those for the FF beams, and the PEG for photoelectric absorption was higher than that for Compton scattering. The FFF beams contained more low-energy photons, which contributed to the dose enhancement. Energy spectra and PEGs are useful for analyzing the mechanisms of dose enhancement. © 2018 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.

  10. PHITS simulations of the Protective curtain experiment onboard the Service module of ISS: Comparison with absorbed doses measured with TLDs

    NASA Astrophysics Data System (ADS)

    Ploc, Ondřej; Sihver, Lembit; Kartashov, Dmitry; Shurshakov, Vyacheslav; Tolochek, Raisa

    2013-12-01

    "Protective curtain" was a physical experiment onboard the International Space Station (ISS) aimed at measuring the dose-reducing effect of additional shielding made of hygienic water-soaked wipes and towels placed on the wall of the crew cabin of the Service module Zvezda. The measurements were performed with 12 detector packages composed of thermoluminescent detectors (TLDs) and plastic nuclear track detectors (PNTDs) placed at the Protective curtain, so that they formed pairs of shielded and unshielded detectors.

  11. Design of an electron-accelerator-driven compact neutron source for non-destructive assay

    NASA Astrophysics Data System (ADS)

    Murata, A.; Ikeda, S.; Hayashizaki, N.

    2017-09-01

    The threat of nuclear and radiological terrorism remains one of the greatest challenges to international security, and the threat is constantly evolving. In order to prevent nuclear terrorism, it is important to block the unlawful import of nuclear materials such as uranium and plutonium. Development of technologies for non-destructive measurement, detection and recognition of nuclear materials is essential for control at national borders. At the Tokyo Institute of Technology, a compact neutron source system driven by an electron accelerator has been designed for non-destructive assay (NDA). This system is composed of an S-band (2.856 GHz) RF gun, a tungsten target to produce photons by bremsstrahlung, a beryllium target, which is well suited to generating neutrons because of the low threshold energy of its photonuclear reactions, and a moderator to thermalize the fast neutrons. The advantage of this system is that it can accelerate a short pulse beam with a pulse width of less than 1 μs, which is difficult to produce with conventional neutron generators. The numbers of photons and neutrons produced by the electron beams were simulated using the Monte Carlo simulation code PHITS 2.82. When the RF gun is operated with an average electron beam current of 0.1 mA, the expected neutron intensities are 1.19 × 10⁹ n/s and 9.94 × 10⁹ n/s for incident electron beam energies of 5 MeV and 10 MeV, respectively.
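As a quick consistency check on the figures in this record, the per-electron neutron yield implied by the reported intensities can be computed directly; only the elementary charge is added, everything else comes from the abstract:

```python
E_CHARGE = 1.602e-19   # C, elementary charge

def neutrons_per_electron(beam_current_A, neutron_rate_per_s):
    """Neutron yield per incident electron at a given average current."""
    electrons_per_s = beam_current_A / E_CHARGE   # electrons delivered per second
    return neutron_rate_per_s / electrons_per_s

# Intensities reported in the abstract at 0.1 mA average current
y5 = neutrons_per_electron(0.1e-3, 1.19e9)    # 5 MeV electrons
y10 = neutrons_per_electron(0.1e-3, 9.94e9)   # 10 MeV electrons
```

This gives roughly 1.9 × 10⁻⁶ and 1.6 × 10⁻⁵ neutrons per electron for the 5 MeV and 10 MeV beams, i.e. nearly an order of magnitude gained by doubling the beam energy.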

  12. Leakage of radioactive materials from particle accelerator facilities by non-radiation disasters like fire and flooding and its environmental impacts

    NASA Astrophysics Data System (ADS)

    Lee, A.; Jung, N. S.; Mokhtari Oranj, L.; Lee, H. S.

    2018-06-01

    The leakage of radioactive materials generated at particle accelerator facilities is one of the important issues from the viewpoint of radiation safety. In this study, fire and flooding at particle accelerator facilities were considered as non-radiation disasters that can result in the leakage of radioactive materials. To analyse the expected effects of each disaster, a case study of fire- and flood-affected particle accelerator facilities was carried out, comprising an investigation of the properties of the relevant materials present in the accelerator tunnel and an estimation of their activity. Five major materials in the tunnel were investigated: dust, insulators, concrete, metals and paints. The activation levels of the materials of concern were calculated using several Monte Carlo codes (MCNPX 2.7+SP-FISPACT 2007, FLUKA 2011.4c and PHITS 2.64+DCHAIN-SP 2001). The impact weight to the environment was estimated for different beam particles (electron, proton, carbon and uranium) and different beam energies (100, 430, 600 and 1000 MeV/nucleon). Taking into consideration the leakage paths of radioactive materials due to fire and flooding, the activation levels of the selected materials and the impacts on the environment were evaluated. In the case of flooding, dust, concrete and metal were found to be the main concerns; in the case of fire, dust, insulators and paint. As expected, the influence of normal fire and flooding at electron accelerator facilities would be relatively low in both cases.

  13. A Systems Engineering Approach to Allocate Resources Between Protection and Sensors for Ground Systems for Offensive Operations in an Urban Environment

    DTIC Science & Technology

    2014-09-01

    progress. In the early ages, armor started out as merely thick hides or leather that was draped over the body for protection. As humankind evolved, the...parameters. [The remainder of the snippet is a flattened table of agent parameters: quantity, weapons, range (m), armour penetration (mm RHA), standard deviation, P(hit) and inherent armour thickness (mm RHA) for vehicles such as the Stryker IFV, Bradley AFV, BMP-2, M1A2 MBT, T-90 and AH-64D.]

  14. Fast skin dose estimation system for interventional radiology

    PubMed Central

    Takata, Takeshi; Kotoku, Jun’ichi; Maejima, Hideyuki; Kumagai, Shinobu; Arai, Norikazu; Kobayashi, Takenori; Shiraishi, Kenshiro; Yamamoto, Masayoshi; Kondo, Hiroshi; Furui, Shigeru

    2018-01-01

    Abstract To minimise the radiation dermatitis related to interventional radiology (IR), rapid and accurate dose estimation has been sought for all procedures. We propose a technique for estimating the patient skin dose rapidly and accurately using Monte Carlo (MC) simulation with a graphical processing unit (GPU, GTX 1080; Nvidia Corp.). The skin dose distribution is simulated based on an individual patient’s computed tomography (CT) dataset for fluoroscopic conditions after the CT dataset has been segmented into air, water and bone based on pixel values. The skin is assumed to be one layer at the outer surface of the body. Fluoroscopic conditions are obtained from a log file of a fluoroscopic examination. Estimating the absorbed skin dose distribution requires calibration of the dose simulated by our system. For this purpose, a linear function was used to approximate the relation between the simulated dose and the measured dose using radiophotoluminescence (RPL) glass dosimeters in a water-equivalent phantom. Differences of maximum skin dose between our system and the Particle and Heavy Ion Transport code System (PHITS) were as high as 6.1%. The relative statistical error (2 σ) for the simulated dose obtained using our system was ≤3.5%. Using a GPU, the simulation on the chest CT dataset aiming at the heart was within 3.49 s on average: the GPU is 122 times faster than a CPU (Core i7–7700K; Intel Corp.). Our system (using the GPU, the log file, and the CT dataset) estimated the skin dose more rapidly and more accurately than conventional methods. PMID:29136194

  15. Fast skin dose estimation system for interventional radiology.

    PubMed

    Takata, Takeshi; Kotoku, Jun'ichi; Maejima, Hideyuki; Kumagai, Shinobu; Arai, Norikazu; Kobayashi, Takenori; Shiraishi, Kenshiro; Yamamoto, Masayoshi; Kondo, Hiroshi; Furui, Shigeru

    2018-03-01

    To minimise the radiation dermatitis related to interventional radiology (IR), rapid and accurate dose estimation has been sought for all procedures. We propose a technique for estimating the patient skin dose rapidly and accurately using Monte Carlo (MC) simulation with a graphical processing unit (GPU, GTX 1080; Nvidia Corp.). The skin dose distribution is simulated based on an individual patient's computed tomography (CT) dataset for fluoroscopic conditions after the CT dataset has been segmented into air, water and bone based on pixel values. The skin is assumed to be one layer at the outer surface of the body. Fluoroscopic conditions are obtained from a log file of a fluoroscopic examination. Estimating the absorbed skin dose distribution requires calibration of the dose simulated by our system. For this purpose, a linear function was used to approximate the relation between the simulated dose and the measured dose using radiophotoluminescence (RPL) glass dosimeters in a water-equivalent phantom. Differences of maximum skin dose between our system and the Particle and Heavy Ion Transport code System (PHITS) were as high as 6.1%. The relative statistical error (2 σ) for the simulated dose obtained using our system was ≤3.5%. Using a GPU, the simulation on the chest CT dataset aiming at the heart was within 3.49 s on average: the GPU is 122 times faster than a CPU (Core i7-7700K; Intel Corp.). Our system (using the GPU, the log file, and the CT dataset) estimated the skin dose more rapidly and more accurately than conventional methods.
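The calibration step described in this record, a linear mapping from simulated to measured dose, can be sketched as follows; the paired dose values are hypothetical stand-ins for RPL glass dosimeter measurements:

```python
import numpy as np

# Hypothetical paired data: dose simulated by the MC system vs. dose
# measured with RPL glass dosimeters in a water-equivalent phantom.
simulated = np.array([10.0, 20.0, 40.0, 80.0])   # arbitrary simulation units
measured = np.array([12.1, 23.9, 48.2, 95.8])    # mGy

# Least-squares linear calibration: measured ≈ a * simulated + b
a, b = np.polyfit(simulated, measured, 1)

def calibrated_dose(sim_dose):
    """Convert a raw simulated dose to an absorbed-dose estimate."""
    return a * sim_dose + b
```

Once fitted on phantom data, the same (a, b) pair converts every subsequent patient-specific simulation into an absorbed-dose estimate without re-measurement.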

  16. Space Radiation Dosimetry to Evaluate the Effect of Polyethylene Shielding in the Russian Segment of the International Space Station

    NASA Astrophysics Data System (ADS)

    Nagamatsu, Aiko; Casolino, Marco; Larsson, Oscar; Ito, Tsuyoshi; Yasuda, Nakahiro; Kitajo, Keiichi; Shimada, Ken; Takeda, Kazuo; Tsuda, Shuichi; Sato, Tatsuhiko

    As a part of the Alteino Long Term Cosmic Ray measurements on board the International Space Station (ALTCRISS) project, the shielding effect of polyethylene (PE) was evaluated in the Russian segment of the ISS using active and passive dosimeter systems with and without PE shielding. For the passive dosimeter system, PADLES (Passive Dosimeter for Life-Science and Experiments in Space) was used in the project, which consists of thermoluminescent dosimeters (TLDs) and CR-39 plastic nuclear track detectors (PNTDs) attached to a radiator. Not only CR-39 PNTD itself but also a tissue-equivalent material, NAN-JAERI, was employed as the radiator in order to investigate whether CR-39 PNTD can be used as a surrogate for tissue-equivalent material in space dosimetry. The agreement between the doses measured by PADLES with CR-39 PNTD and NAN-JAERI radiators was quite satisfactory, indicating that the tissue-equivalent dose can be measured by conventional PADLES even though CR-39 PNTD is not a perfectly tissue-equivalent material. It was found that the shielding effect of PE varies with location inside the spacecraft: it became less significant with an increase in the mean thickness of the wall. This tendency was also verified by Monte Carlo simulation using the PHITS code. Throughout the flight experiments, in a series of four phases of the ALTCRISS project from December 2005 to October 2007, we assessed the ability of PE to decrease radiation doses in low Earth orbit (LEO).

  17. Measurements and Monte Carlo calculations of forward-angle secondary-neutron-production cross-sections for 137 and 200 MeV proton-induced reactions in carbon

    NASA Astrophysics Data System (ADS)

    Iwamoto, Yosuke; Hagiwara, Masayuki; Matsumoto, Tetsuro; Masuda, Akihiko; Iwase, Hiroshi; Yashima, Hiroshi; Shima, Tatsushi; Tamii, Atsushi; Nakamura, Takashi

    2012-10-01

    Secondary neutron-production double-differential cross-sections (DDXs) have been measured for interactions of 137 MeV and 200 MeV protons in a natural carbon target. The data were measured between 0° and 25° in the laboratory. DDXs were obtained with high energy resolution in the energy region from 3 MeV up to the maximum energy. The experimental data for 137 MeV protons at 10° and 25° were in good agreement with those for 113 MeV protons at 7.5° and 30° measured at LANSCE/WNR in the energy region below 80 MeV. Benchmark calculations were carried out with the PHITS code using the evaluated nuclear data files JENDL/HE-2007 and ENDF/B-VII, and the theoretical models Bertini-GEM and ISOBAR-GEM. For 137 MeV proton incidence, calculations using JENDL/HE-2007 generally reproduced the shape and intensity of the experimental spectra well, including the ground state of 12N produced by the 12C(p,n)12N reaction. For 200 MeV proton incidence, all calculated results underestimated the experimental data by a factor of two, except for the calculation using the ISOBAR model. ISOBAR predicts nucleon emission at forward angles qualitatively better than the Bertini model. These experimental data will be useful for evaluating carbon data and as benchmark data for investigating the validity of Monte Carlo simulations for the shielding design of accelerator facilities.

  18. Spallation reaction study for fission products in nuclear waste: Cross section measurements for 137Cs, 90Sr and 107Pd on proton and deuteron

    NASA Astrophysics Data System (ADS)

    Wang, He; Otsu, Hideaki; Sakurai, Hiroyoshi; Ahn, DeukSoon; Aikawa, Masayuki; Ando, Takashi; Araki, Shouhei; Chen, Sidong; Chiga, Nobuyuki; Doornenbal, Pieter; Fukuda, Naoki; Isobe, Tadaaki; Kawakami, Shunsuke; Kawase, Shoichiro; Kin, Tadahiro; Kondo, Yosuke; Koyama, Shupei; Kubono, Shigeru; Maeda, Yukie; Makinaga, Ayano; Matsushita, Masafumi; Matsuzaki, Teiichiro; Michimasa, Shinichiro; Momiyama, Satoru; Nagamine, Shunsuke; Nakamura, Takashi; Nakano, Keita; Niikura, Megumi; Ozaki, Tomoyuki; Saito, Atsumi; Saito, Takeshi; Shiga, Yoshiaki; Shikata, Mizuki; Shimizu, Yohei; Shimoura, Susumu; Sumikama, Toshiyuki; Söderström, Pär-Anders; Suzuki, Hiroshi; Takeda, Hiroyuki; Takeuchi, Satoshi; Taniuchi, Ryo; Togano, Yasuhiro; Tsubota, Junichi; Uesaka, Meiko; Watanabe, Yasushi; Watanabe, Yukinobu; Wimmer, Kathrin; Yamamoto, Tatsuya; Yoshida, Koichi

    2017-09-01

    Spallation reactions of the long-lived fission products 137Cs, 90Sr and 107Pd have been studied for the purpose of nuclear waste transmutation. The cross sections for proton- and deuteron-induced spallation were obtained in inverse kinematics at the RIKEN Radioactive Isotope Beam Factory. Both the target and energy dependences of the cross sections have been investigated systematically, and the cross-section differences between proton and deuteron are found to be larger for lighter fragments. The experimental data are compared with the SPACS semi-empirical parameterization and with PHITS calculations including both the intra-nuclear cascade and evaporation processes.

  19. Light-ion Production from O, Si, Fe and Bi Induced by 175 MeV Quasi-monoenergetic Neutrons

    NASA Astrophysics Data System (ADS)

    Bevilacqua, R.; Pomp, S.; Jansson, K.; Gustavsson, C.; Österlund, M.; Simutkin, V.; Hayashi, M.; Hirayama, S.; Naitou, Y.; Watanabe, Y.; Hjalmarsson, A.; Prokofiev, A.; Tippawan, U.; Lecolley, F.-R.; Marie, N.; Leray, S.; David, J.-C.; Mashnik, S.

    2014-05-01

    We have measured double-differential cross sections for the interaction of 175 MeV quasi-monoenergetic neutrons with O, Si, Fe and Bi. We have compared these results with model calculations using INCL4.5-Abla07, MCNP6 and TALYS-1.2. We have also compared our data with PHITS calculations, in which the pre-equilibrium stage of the reaction was described using, respectively, the JENDL/HE-2007 evaluated data library, the quantum molecular dynamics model (QMD) and a modified version of QMD (MQMD) that includes a surface coalescence model. The most crucial aspect is the formation and emission of composite particles in the pre-equilibrium stage.

  20. Directed educational training improves coding and billing skills for residents.

    PubMed

    Benke, James R; Lin, Sandra Y; Ishman, Stacey L

    2013-03-01

    To determine if coding and billing acumen improves after a single directed educational training session. Case-control series. Fourteen otolaryngology practitioners, including trainees, each completed two clinical scenarios before and after a directed educational session covering basic skills and common mistakes in otolaryngology billing and coding. Ten practitioners had never coded before, while four regularly billed and coded in a clinical setting. Individuals with no previous billing experience had a mean score of 54% (median 55%) before the educational session, significantly lower than that of the experienced billers, who averaged 82% (median 83%, p=0.002). After the educational billing and coding session, the inexperienced billers' mean score improved to 62% (median 67%), which was still statistically lower than that of the experienced billers, who averaged 76% (median 75%, p=0.039). The inexperienced billers demonstrated a significant improvement in their total score after the intervention (P=0.019); however, the change observed in experienced billers before and after the educational intervention was not significant (P=0.469). Billing and coding skill improved after a single directed education session. Residents, who are not responsible for regular billing and coding, were found to have the greatest improvement in skill. However, providers who regularly bill and code had no significant improvement after this session. These data suggest that a single 90-minute billing and coding education session is effective in preparing those with limited experience to competently bill and code. Copyright © 2012. Published by Elsevier Ireland Ltd.

  1. Development of a Muon Rotating Target for J-PARC/MUSE

    NASA Astrophysics Data System (ADS)

    Makimura, Shunsuke; Kobayashi, Yasuo; Miyake, Yasuhiro; Kawamura, Naritoshi; Strasser, Patrick; Koda, Akihiro; Shimomura, Koichiro; Fujimori, Hiroshi; Nishiyama, Kusuo; Kato, Mineo; Kojima, Kenji; Higemoto, Wataru; Ito, Takashi; Shimizu, Ryou; Kadono, Ryosuke

    At the J-PARC muon science facility (J-PARC/MUSE), a graphite target with a thickness of 20 mm has been used in vacuum to obtain an intense pulsed muon beam from the RCS 3-GeV proton beam [1], [2]. In the current design, the target frame is constructed of copper with an embedded stainless steel tube for water cooling. The energy deposited by the proton beam at 1 MW is evaluated to be 3.3 kW in the graphite target and 600 W in the copper frame by a Monte Carlo simulation code, PHITS [3]. Graphite materials are known to lose their crystal structure and can shrink under intense proton beam irradiation. Consequently, the lifetime of the muon target is essentially determined by the radiation damage in graphite, and is evaluated to be half a year [4]. Hence, we are planning to distribute the radiation damage by rotating a graphite wheel. Although the lifetime of the graphite in this case will be more than 10 years, the design of the bearing must be carefully considered. Because the bearing in J-PARC/MUSE is used in vacuum, under high radiation, and at high temperature, an inorganic, solid lubricant must be applied to the bearing. At the same time, the temperature of the bearing must be kept low to extend its lifetime. In 2009, a mock-up of the Muon Rotating Target, which could heat and rotate a graphite wheel, was fabricated. Several tests were then started to select the lubricant and to determine the structure of the Muon Rotating Target, the control system, and so on. In this report, the present status of the Muon Rotating Target for J-PARC/MUSE, especially the development of a rotation system operating in vacuum, is described.

  2. Monte Carlo study of out-of-field exposure in carbon-ion radiotherapy with a passive beam: Organ doses in prostate cancer treatment.

    PubMed

    Yonai, Shunsuke; Matsufuji, Naruhiro; Akahane, Keiichi

    2018-04-23

    The aim of this work was to estimate typical dose equivalents to out-of-field organs during carbon-ion radiotherapy (CIRT) with a passive beam for prostate cancer treatment. Additionally, sensitivity analyses of organ doses for various beam parameters and phantom sizes were performed. Because the CIRT out-of-field dose depends on the beam parameters, the typical values of those parameters were determined from statistical data on the target properties of patients who received CIRT at the Heavy-Ion Medical Accelerator in Chiba (HIMAC). Using these typical beam-parameter values, out-of-field organ dose equivalents during CIRT for typical prostate treatment were estimated by Monte Carlo simulations using the Particle and Heavy-Ion Transport Code System (PHITS) and the ICRP reference phantom. The results showed that the dose decreased with distance from the target, ranging from 116 mSv in the testes to 7 mSv in the brain. The organ dose equivalents per treatment dose were lower than those either in 6-MV intensity-modulated radiotherapy or in brachytherapy with an Ir-192 source for organs within 40 cm of the target. Sensitivity analyses established that the differences from typical values were within ∼30% for all organs, except the sigmoid colon. The typical out-of-field organ dose equivalents during passive-beam CIRT were shown. The low sensitivity of the dose equivalent in organs farther than 20 cm from the target indicated that individual dose assessments required for retrospective epidemiological studies may be limited to organs around the target in cases of passive-beam CIRT for prostate cancer. Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  3. Research on pre-processing of QR Code

    NASA Astrophysics Data System (ADS)

    Sun, Haixing; Xia, Haojie; Dong, Ning

    2013-10-01

    QR codes encode many kinds of information thanks to their advantages: large storage capacity, high reliability, omnidirectional high-speed reading, small printed size, and efficient representation of Chinese characters. To obtain a cleaner binarized image from a complex background and to improve the QR code recognition rate, this paper investigates pre-processing methods for QR (Quick Response) codes and presents algorithms and results of image pre-processing for QR code recognition. The conventional method is improved by adapting Sauvola's adaptive text-binarization method. In addition, the paper introduces QR code extraction that adapts to different image sizes and a flexible image-correction approach, improving the efficiency and accuracy of QR code image processing.
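    Sauvola's adaptive method, which the paper adapts, thresholds each pixel against a value computed from the local mean and standard deviation. A minimal floating-point sketch follows; the window size and the k and R constants are common textbook defaults, not the paper's settings:

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter

    def sauvola_binarize(img, window=15, k=0.2, R=128.0):
        """Binarize a grayscale image with Sauvola's adaptive threshold:
        T = m * (1 + k * (s / R - 1)), where m and s are the local mean and
        standard deviation in a window x window neighbourhood."""
        img = img.astype(np.float64)
        mean = uniform_filter(img, size=window, mode='reflect')
        sq_mean = uniform_filter(img * img, size=window, mode='reflect')
        std = np.sqrt(np.maximum(sq_mean - mean * mean, 0.0))
        thresh = mean * (1.0 + k * (std / R - 1.0))
        # 1 = light background, 0 = dark modules (the QR code data)
        return (img > thresh).astype(np.uint8)
    ```

    Because the threshold tracks the local mean, the method separates dark QR modules from uneven illumination that would defeat a single global threshold.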

  4. Quality improvement of International Classification of Diseases, 9th revision, diagnosis coding in radiation oncology: single-institution prospective study at University of California, San Francisco.

    PubMed

    Chen, Chien P; Braunstein, Steve; Mourad, Michelle; Hsu, I-Chow J; Haas-Kogan, Daphne; Roach, Mack; Fogh, Shannon E

    2015-01-01

    Accurate International Classification of Diseases (ICD) diagnosis coding is critical for patient care, billing purposes, and research endeavors. In this single-institution study, we evaluated our baseline ICD-9 (9th revision) diagnosis coding accuracy, identified the most common errors contributing to inaccurate coding, and implemented a multimodality strategy to improve radiation oncology coding. We prospectively studied ICD-9 coding accuracy in our radiation therapy-specific electronic medical record system. Baseline ICD-9 coding accuracy was obtained from chart review of all patients treated at our institution between March and June of 2010. To improve performance, an educational session highlighted common coding errors, and a user-friendly software tool, RadOnc ICD Search, version 1.0, for coding radiation oncology-specific diagnoses was implemented. We then prospectively analyzed ICD-9 coding accuracy for all patients treated from July 2010 to June 2011, with the goal of maintaining 80% or higher coding accuracy. Data on coding accuracy were analyzed and fed back monthly to individual providers. Baseline coding accuracy for physicians was 70% (463 of 661 cases). Only 46% of physicians had coding accuracy above 80%. The most common errors involved metastatic cases, in which primary or secondary site ICD-9 codes were either incorrect or missing, and special procedures such as stereotactic radiosurgery cases. After implementing our project, overall coding accuracy rose to 92% (range, 86%-96%). The median accuracy for all physicians was 93% (range, 77%-100%), with only 1 attending having accuracy below 80%. Incorrect primary and secondary ICD-9 codes in metastatic cases showed the most significant improvement (10% vs 2% after intervention). Identifying common coding errors and implementing both education and systems changes led to significantly improved coding accuracy. This quality assurance project highlights the potential problem of ICD-9 coding accuracy by physicians and offers an approach to effectively address this shortcoming. Copyright © 2015. Published by Elsevier Inc.

  5. Production of an 15O beam using a stable oxygen ion beam for in-beam PET imaging

    NASA Astrophysics Data System (ADS)

    Mohammadi, Akram; Yoshida, Eiji; Tashima, Hideaki; Nishikido, Fumihiko; Inaniwa, Taku; Kitagawa, Atsushi; Yamaya, Taiga

    2017-03-01

    In advanced ion therapy, the 15O ion beam is a promising candidate to treat hypoxic tumors and simultaneously monitor the delivered dose to a patient using PET imaging. This study aimed to produce an 15O beam by projectile fragmentation of a stable 16O beam in an optimal material, followed by in-beam PET imaging using a prototype OpenPET system developed in the authors' group. The study was carried out in three steps: selection of the optimal target based on the highest production rate of 15O fragments; experimental production of the beam using the optimal target in the Heavy Ion Medical Accelerator in Chiba (HIMAC) secondary beam course; and realization of in-beam PET imaging for the produced beam. The optimal target evaluations were performed using the Monte Carlo simulation code PHITS. The fluence and mean energy of the secondary particles were simulated, and the optimal target was selected based on the production rate of 15O fragments. The highest production rate of 15O was observed for a liquid hydrogen target: 3.27% for a 53 cm thick target from the 16O beam of 430 MeV/u. Since liquid hydrogen is not practically applicable in the HIMAC secondary beam course, a hydrogen-rich polyethylene material, the second-best target in the simulation results, was selected as the experimental target. Three polyethylene targets with thicknesses of 5, 11, or 14 cm were used to produce the 15O beam without any degrader in the beam course. The highest production rate was measured as around 0.87% for the 11 cm thick polyethylene target from the 16O beam of 430 MeV/u when the angular and momentum acceptances were set at ±13 mrad and ±2.5%, respectively. The purity of the produced beam was around 75% for all three targets, insufficient for clinical application, but it was increased to a sufficiently high 97% by inserting a 1.76 cm thick wedge-shaped aluminum degrader into the beam course.
In-beam PET imaging was also performed for all produced beams using the OpenPET system. The purity improvement of the produced 15O beams was confirmed from the PET images.

  6. [National Antimicrobial Resistance Surveillance System (NAMRSS) external quality assessment studies: 2011-2016].

    PubMed

    Süzük Yıldız, Serap; Şimşek, Hüsniye; Çöplü, Nilay; Gülay, Zeynep

    2017-07-01

    The establishment of sustainable and evidence-based surveillance systems is recommended by the World Health Organization (WHO) for the prevention of antimicrobial resistance. As a requirement of such surveillance systems, participants are expected to take part in an external quality assessment (EQA) program. In this scope, the National Antimicrobial Resistance Surveillance System (NARSS) has been operated within the Public Health Institute of Turkey (PHIT) since 2011. As part of this surveillance, the NARSS EQA program has been implemented in one cycle per year since 2011, with four isolates sent to participants in each cycle. The aim of this study was to evaluate six years of EQA results from NARSS participants between 2011 and 2016. The surveillance system comprised 118 laboratories. Escherichia coli, Klebsiella pneumoniae, Pseudomonas aeruginosa, Staphylococcus aureus, Streptococcus pneumoniae, Enterococcus faecium/faecalis and Acinetobacter baumannii isolates within the scope of the surveillance were sent to participants. Participants were required to identify the bacteria to the species level, verify the antibiotic susceptibility test results, and detect specified resistance mechanisms of the isolates using validated test methods. The isolates were cultured with routine microbiological methods and sent to participants at ambient temperature in suitable transport media, packed in triple-layer carrier pouches and shipped via PTT Cargo. Participants entered their results into a web-based system using passwords issued by PHIT, and the results were analyzed with SPSS. A total of twenty-three isolates were sent to participants between 2011 and 2016. Participants commonly preferred automated systems for bacterial identification and antibiotic susceptibility testing; use of MALDI-TOF MS systems rose to 15.65% in 2016. Errors in bacterial identification were generally minor, but the error rate was high for antimicrobial susceptibility results close to clinical breakpoints. Although not required for antibiotic susceptibility testing, phenotypic tests were widely used to determine the specific resistance mechanisms that are important for epidemiological data. In 2016, 80% of participants used EUCAST standards. This study shows that the NARSS EQA program is a good performance tool for sustainable and evidence-based surveillance, that the quality of national antimicrobial resistance data is sufficiently good, and that the data can be shared on international platforms. In addition, the regular conduct of national surveillance has had a positive effect on laboratories' self-improvement in achieving up-to-date and accurate results.

  7. Improving data quality across 3 sub-Saharan African countries using the Consolidated Framework for Implementation Research (CFIR): results from the African Health Initiative.

    PubMed

    Gimbel, Sarah; Mwanza, Moses; Nisingizwe, Marie Paul; Michel, Cathy; Hirschhorn, Lisa

    2017-12-21

    High-quality data are critical to inform, monitor and manage health programs. Over the seven-year African Health Initiative of the Doris Duke Charitable Foundation, three of the five Population Health Implementation and Training (PHIT) partnership projects in Mozambique, Rwanda, and Zambia introduced strategies to improve the quality and evaluation of routinely collected data at the primary health care level, and to stimulate their use in evidence-based decision-making. Using the Consolidated Framework for Implementation Research (CFIR) as a guide, this paper: 1) describes and categorizes data quality assessment and improvement activities of the projects, and 2) identifies core intervention components and implementation strategy adaptations introduced to improve data quality in each setting. The CFIR was adapted through a qualitative theme reduction process involving discussions with key informants from each project, who identified two domains and ten constructs most relevant to the study aim of describing and comparing each country's data quality assessment approach and implementation process. Data were collected on each project's data quality improvement strategies, activities implemented, and results via a semi-structured questionnaire with closed and open-ended items administered to health management information systems leads in each country, with complementary data abstraction from project reports. Across the three projects, intervention components that aligned with user priorities and government systems were perceived to be relatively advantageous, and were more readily adapted and adopted. Activities that both assessed and improved data quality (including data quality assessments, mentorship and supportive supervision, and establishment and/or strengthening of electronic medical record systems) received higher ranking scores from respondents. 
Our findings suggest that, at a minimum, successful data quality improvement efforts should include routine audits linked to ongoing, on-the-job mentoring at the point of service. This pairing of interventions engages health workers in data collection, cleaning, and analysis of real-world data, and thus provides important skills building with on-site mentoring. The effect of these core components is strengthened by performance review meetings that unify multiple health system levels (provincial, district, facility, and community) to assess data quality, highlight areas of weakness, and plan improvements.

  8. Proceedings of the OECD/CSNI workshop on transient thermal-hydraulic and neutronic codes requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ebert, D.

    1997-07-01

    This is a report on the CSNI Workshop on Transient Thermal-Hydraulic and Neutronic Codes Requirements held at Annapolis, Maryland, USA, November 5-8, 1996. This experts' meeting consisted of 140 participants from 21 countries; 65 invited papers were presented. The meeting was divided into five areas: (1) current and prospective plans of thermal-hydraulic code development; (2) current and anticipated uses of thermal-hydraulic codes; (3) advances in modeling of thermal-hydraulic phenomena and associated additional experimental needs; (4) numerical methods in multi-phase flows; and (5) programming languages, code architectures, and user interfaces. The workshop consensus identified the following important action items to be addressed by the international community in order to maintain and improve the calculational capability: (a) preserve current code expertise and institutional memory, (b) preserve the ability to use the existing investment in plant transient analysis codes, (c) maintain essential experimental capabilities, (d) develop advanced measurement capabilities to support future code validation work, (e) integrate existing analytical capabilities so as to improve performance and reduce operating costs, (f) exploit the proven advances in code architecture, numerics, graphical user interfaces, and modularization in order to improve code performance and scrutability, and (g) more effectively utilize user experience in modifying and improving the codes.

  9. Coding tools investigation for next generation video coding based on HEVC

    NASA Astrophysics Data System (ADS)

    Chen, Jianle; Chen, Ying; Karczewicz, Marta; Li, Xiang; Liu, Hongbin; Zhang, Li; Zhao, Xin

    2015-09-01

    The new state-of-the-art video coding standard, H.265/HEVC, was finalized in 2013 and achieves roughly 50% bit rate saving compared to its predecessor, H.264/MPEG-4 AVC. This paper provides evidence that there is still potential for further coding efficiency improvements. A brief overview of HEVC is first given in the paper. Then, our improvements to each main module of HEVC are presented. For instance, the recursive quadtree block structure is extended to support larger coding units and transform units. The motion information prediction scheme is improved by advanced temporal motion vector prediction, which inherits the motion information of each small block within a large block from a temporal reference picture. Cross-component prediction with a linear prediction model improves intra prediction, and overlapped block motion compensation improves the efficiency of inter prediction. Furthermore, coding of both intra and inter prediction residuals is improved by an adaptive multiple transform technique. Finally, in addition to the deblocking filter and SAO, an adaptive loop filter is applied to further enhance reconstructed picture quality. This paper describes the above-mentioned techniques in detail and evaluates their coding performance benefits based on the common test conditions used during HEVC development. The simulation results show that significant performance improvement over the HEVC standard can be achieved, especially for high resolution video material.
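    Of the tools listed above, cross-component prediction with a linear model is the easiest to illustrate: a chroma block is predicted from co-located reconstructed luma samples as chroma ≈ α·luma + β, with the two parameters fitted on already-reconstructed neighbouring samples. A floating-point toy sketch; real codecs derive α and β with low-complexity integer arithmetic from downsampled luma, not a full least-squares solve:

    ```python
    import numpy as np

    def cclm_params(luma_nb, chroma_nb):
        """Least-squares fit of chroma = alpha * luma + beta over the
        reconstructed neighbouring samples of the current block."""
        A = np.vstack([luma_nb, np.ones_like(luma_nb)]).T
        alpha, beta = np.linalg.lstsq(A, chroma_nb, rcond=None)[0]
        return alpha, beta

    def cclm_predict(rec_luma_block, alpha, beta):
        # Apply the fitted linear model to the co-located luma samples.
        return alpha * rec_luma_block + beta
    ```

    When luma and chroma are strongly correlated, as in most natural content, the model removes much of the chroma signal before residual coding.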

  10. Code Team Training: Demonstrating Adherence to AHA Guidelines During Pediatric Code Blue Activations.

    PubMed

    Stewart, Claire; Shoemaker, Jamie; Keller-Smith, Rachel; Edmunds, Katherine; Davis, Andrew; Tegtmeyer, Ken

    2017-10-16

    Pediatric code blue activations are infrequent events with a high mortality rate despite the best efforts of code teams. The best method for training these code teams is debatable; however, it is clear that training is needed to ensure adherence to American Heart Association (AHA) Resuscitation Guidelines and to prevent the skill decay that invariably occurs after Pediatric Advanced Life Support training. The objectives of this project were to train a multidisciplinary, multidepartmental code team and to measure this team's adherence to AHA guidelines during code simulation. Multidisciplinary code team training sessions were held using high-fidelity, in situ simulation. Sessions were held several times per month. Each session was filmed and reviewed for adherence to 5 AHA guidelines: chest compression rate, ventilation rate, chest compression fraction, use of a backboard, and use of a team leader. After the first study period, modifications were made to the code team, including implementation of just-in-time training and alteration of the compression team. Thirty-eight sessions were completed, with 31 eligible for video analysis. During the first study period, 1 session adhered to all AHA guidelines. During the second study period, after alteration of the code team and implementation of just-in-time training, no sessions adhered to all AHA guidelines; however, the percentage of sessions adhering to the ventilation rate and chest compression rate guidelines improved, as did the median ventilation rate. We present a method for training a large code team drawn from multiple hospital departments and a method of assessing code team performance. Despite subjective improvement in code team positioning, communication, and role completion, and some improvement in ventilation rate and chest compression rate, we failed to consistently demonstrate improvement in adherence to all guidelines.

  11. Improve load balancing and coding efficiency of tiles in high efficiency video coding by adaptive tile boundary

    NASA Astrophysics Data System (ADS)

    Chan, Chia-Hsin; Tu, Chun-Chuan; Tsai, Wen-Jiin

    2017-01-01

    High efficiency video coding (HEVC) not only improves coding efficiency drastically compared to the well-known H.264/AVC but also introduces coding tools for parallel processing, one of which is tiles. Tile partitioning is allowed to be arbitrary in HEVC, but how to decide tile boundaries remains an open issue. An adaptive tile boundary (ATB) method is proposed to select a better tile partitioning to improve load balancing (ATB-LoadB) and coding efficiency (ATB-Gain) with a unified scheme. Experimental results show that, compared to ordinary uniform-space partitioning, the proposed ATB can save up to 17.65% of encoding time in parallel encoding scenarios and can reduce total bit rates by up to 0.8% for coding efficiency.
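    The paper's actual ATB algorithm is not reproduced here, but the load-balancing objective behind it can be sketched: given estimated per-CTU-row encoding costs, choose tile boundaries that minimize the maximum per-tile load. The classic linear-partition dynamic program below does exactly that for one dimension (function and variable names are illustrative):

    ```python
    def balanced_boundaries(costs, k):
        """Split `costs` (e.g. estimated per-CTU-row encoding times) into k
        contiguous tiles minimizing the maximum tile load.
        Returns (max_load, interior boundary indices)."""
        n = len(costs)
        prefix = [0] * (n + 1)
        for i, c in enumerate(costs):
            prefix[i + 1] = prefix[i] + c
        INF = float('inf')
        # dp[j][i]: best achievable max-load when the first i rows form j tiles
        dp = [[INF] * (n + 1) for _ in range(k + 1)]
        cut = [[0] * (n + 1) for _ in range(k + 1)]
        dp[0][0] = 0
        for j in range(1, k + 1):
            for i in range(j, n + 1):
                for m in range(j - 1, i):
                    load = max(dp[j - 1][m], prefix[i] - prefix[m])
                    if load < dp[j][i]:
                        dp[j][i], cut[j][i] = load, m
        bounds, i = [], n
        for j in range(k, 0, -1):
            bounds.append(i)
            i = cut[j][i]
        return dp[k][n], sorted(bounds[1:])
    ```

    A uniform split of [1, 1, 1, 3] into two tiles would load one core with 4 units of work; the DP instead cuts after the third row so no tile exceeds 3.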

  12. Error Control Coding Techniques for Space and Satellite Communications

    NASA Technical Reports Server (NTRS)

    Costello, Daniel J., Jr.; Takeshita, Oscar Y.; Cabral, Hermano A.

    1998-01-01

    It is well known that the BER performance of a parallel concatenated turbo-code improves roughly as 1/N, where N is the information block length. However, it has been observed by Benedetto and Montorsi that for most parallel concatenated turbo-codes, the FER performance does not improve monotonically with N. In this report, we study the FER of turbo-codes, and the effects of their concatenation with an outer code. Two methods of concatenation are investigated: across several frames and within each frame. Some asymmetric codes are shown to have excellent FER performance with an information block length of 16384. We also show that the proposed outer coding schemes can improve the BER performance as well by eliminating pathological frames generated by the iterative MAP decoding process.

  13. The application of LDPC code in MIMO-OFDM system

    NASA Astrophysics Data System (ADS)

    Liu, Ruian; Zeng, Beibei; Chen, Tingting; Liu, Nan; Yin, Ninghao

    2018-03-01

    The combination of MIMO and OFDM technology has become one of the key technologies of fourth-generation mobile communication: it can overcome the frequency-selective fading of the wireless channel, increase system capacity, and improve frequency utilization. Error-correcting coding introduced into the system can further improve its performance. The LDPC (low-density parity-check) code is an error-correcting code that can improve system reliability and anti-interference ability, and its decoding is simple and easy to implement. This paper mainly discusses the application of LDPC codes in MIMO-OFDM systems.
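    The decoding simplicity mentioned above can be illustrated with hard-decision bit-flipping, the simplest LDPC decoding algorithm: flip the bit involved in the most unsatisfied parity checks until the syndrome vanishes. For compactness, the tiny (7,4) Hamming parity-check matrix below stands in for a real LDPC matrix, which would be far larger and sparser:

    ```python
    import numpy as np

    # Parity-check matrix of the (7,4) Hamming code, standing in for an LDPC H.
    H = np.array([[1, 0, 1, 0, 1, 0, 1],
                  [0, 1, 1, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1, 1, 1]], dtype=np.uint8)

    def bit_flip_decode(H, r, max_iter=10):
        """Hard-decision bit-flipping: while some parity checks fail, flip the
        bit that participates in the largest number of failing checks."""
        r = r.copy()
        for _ in range(max_iter):
            syndrome = H.dot(r) % 2
            if not syndrome.any():
                break  # all parity checks satisfied
            votes = H[syndrome == 1].sum(axis=0)  # failing-check count per bit
            r[np.argmax(votes)] ^= 1
        return r
    ```

    Soft-decision belief propagation performs much better on real LDPC codes, but follows the same pattern of iterating over the sparse parity-check constraints.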

  14. Continuous Codes and Standards Improvement (CCSI)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rivkin, Carl H; Burgess, Robert M; Buttner, William J

    2015-10-21

    As of 2014, the majority of the codes and standards required to initially deploy hydrogen technologies infrastructure in the United States have been promulgated. These codes and standards will be field tested through their application to actual hydrogen technologies projects. Continuous codes and standards improvement (CCSI) is a process of identifying code issues that arise during project deployment and then developing code solutions to these issues. These solutions would typically be proposed amendments to codes and standards. The process is continuous because, as technology and the state of safety knowledge develop, there will be a need to monitor the application of codes and standards and improve them based on information gathered during their application. This paper will discuss code issues that have surfaced through hydrogen technologies infrastructure project deployment and potential code changes that would address these issues. The issues that this paper will address include (1) setback distances for bulk hydrogen storage, (2) code-mandated hazard analyses, (3) sensor placement and communication, (4) the use of approved equipment, and (5) system monitoring and maintenance requirements.

  15. Recent improvements of reactor physics codes in MHI

    NASA Astrophysics Data System (ADS)

    Kosaka, Shinya; Yamaji, Kazuya; Kirimura, Kazuki; Kamiyama, Yohei; Matsumoto, Hideki

    2015-12-01

    This paper introduces recent improvements to reactor physics codes at Mitsubishi Heavy Industries, Ltd. (MHI). MHI has developed a new neutronics design code system, Galaxy/Cosmo-S (GCS), for PWR core analysis. After TEPCO's Fukushima Daiichi accident, it became necessary to consider design extension conditions that had not been covered explicitly by the former safety licensing analyses. Under these circumstances, MHI made several improvements to the GCS code system. A new resonance calculation model for the lattice physics code and a homogeneous cross section representation model for the core simulator have been developed to cover a wider range of core conditions corresponding to severe accident states, such as anticipated transient without scram (ATWS) analysis and criticality evaluation of a dried-up spent fuel pit. As a result of these improvements, the GCS code system is applicable to a very wide range of calculations with good accuracy for any core condition, as long as the fuel is not damaged. In this paper, the outline of the GCS code system is described briefly and recent relevant development activities are presented.

  16. Recent improvements of reactor physics codes in MHI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kosaka, Shinya, E-mail: shinya-kosaka@mhi.co.jp; Yamaji, Kazuya; Kirimura, Kazuki

    2015-12-31

    This paper introduces recent improvements to reactor physics codes at Mitsubishi Heavy Industries, Ltd. (MHI). MHI has developed a new neutronics design code system, Galaxy/Cosmo-S (GCS), for PWR core analysis. After TEPCO's Fukushima Daiichi accident, it became necessary to consider design extension conditions that had not been covered explicitly by the former safety licensing analyses. Under these circumstances, MHI made several improvements to the GCS code system. A new resonance calculation model for the lattice physics code and a homogeneous cross section representation model for the core simulator have been developed to cover a wider range of core conditions corresponding to severe accident states, such as anticipated transient without scram (ATWS) analysis and criticality evaluation of a dried-up spent fuel pit. As a result of these improvements, the GCS code system is applicable to a very wide range of calculations with good accuracy for any core condition, as long as the fuel is not damaged. In this paper, the outline of the GCS code system is described briefly and recent relevant development activities are presented.

  17. Modification and Validation of Conceptual Design Aerodynamic Prediction Method HASC95 With VTXCHN

    NASA Technical Reports Server (NTRS)

    Albright, Alan E.; Dixon, Charles J.; Hegedus, Martin C.

    1996-01-01

    A conceptual/preliminary design level subsonic aerodynamic prediction code, HASC (High Angle of Attack Stability and Control), has been improved in several areas, validated, and documented. The improved code includes improved methodologies for increased accuracy and robustness, and simplified input/output files. An engineering method called VTXCHN (Vortex Chine) for predicting nose vortex shedding from circular and non-circular forebodies with sharp chine edges has been improved and integrated into the HASC code. This report contains a summary of modifications, a description of the code, a user's guide, and validation of HASC. Appendices include discussion of a new HASC utility code, listings of sample input and output files, and a discussion of the application of HASC to buffet analysis.

  18. DCT based interpolation filter for motion compensation in HEVC

    NASA Astrophysics Data System (ADS)

    Alshin, Alexander; Alshina, Elena; Park, Jeong Hoon; Han, Woo-Jin

    2012-10-01

    The High Efficiency Video Coding (HEVC) draft standard has the challenging goal of improving coding efficiency by a factor of two compared to H.264/AVC. Many aspects of the traditional hybrid coding framework were improved during the new standard's development. Motion compensated prediction, in particular the interpolation filter, is one area that was improved significantly over H.264/AVC. This paper presents the details of the interpolation filter design in the draft HEVC standard. The coding efficiency improvement over the H.264/AVC interpolation filter is studied, and experimental results are presented, which show a 4.0% average bitrate reduction for the luma component and an 11.3% average bitrate reduction for the chroma components. The coding efficiency gains are significant for some video sequences and can reach up to 21.7%.
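    The principle behind a DCT-based interpolation filter can be sketched directly: express the integer-position samples in a DCT basis, then evaluate the series at the desired fractional position. HEVC derives short fixed integer-coefficient filters from this idea; the toy version below evaluates the expansion in floating point and is not the standardized filter:

    ```python
    import numpy as np

    def dct_interpolate(x, pos):
        """Evaluate the DCT-II series of samples x at a (possibly fractional)
        position pos. At integer positions this reproduces x exactly."""
        N = len(x)
        n = np.arange(N)
        k = np.arange(1, N)
        # forward (unnormalized) DCT-II coefficients
        X0 = x.sum()
        Xk = np.array([(x * np.cos(np.pi * kk * (n + 0.5) / N)).sum() for kk in k])
        # inverse DCT evaluated at an arbitrary real position
        return (X0 + 2.0 * (Xk * np.cos(np.pi * k * (pos + 0.5) / N)).sum()) / N
    ```

    For example, dct_interpolate(x, 2.5) gives a half-pel sample between x[2] and x[3]; collecting the weights this expression assigns to each x[n] for a fixed fractional offset yields the taps of an interpolation filter.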

  19. Triple ionization chamber method for clinical dose monitoring with a Be-covered Li BNCT field.

    PubMed

    Nguyen, Thanh Tat; Kajimoto, Tsuyoshi; Tanaka, Kenichi; Nguyen, Chien Cong; Endo, Satoru

    2016-11-01

    Fast neutron, gamma-ray, and boron doses have different relative biological effectiveness (RBE). In boron neutron capture therapy (BNCT), the clinical dose is the total of these dose components multiplied by their RBE. Clinical dose monitoring is necessary for quality assurance of the irradiation profile; therefore, the fast neutron, gamma-ray, and boron doses should be separately monitored. To estimate these doses separately, and to monitor the boron dose without monitoring the thermal neutron fluence, the authors propose a triple ionization chamber method using graphite-walled carbon dioxide gas (C-CO2), tissue-equivalent plastic-walled tissue-equivalent gas (TE-TE), and boron-loaded tissue-equivalent plastic-walled tissue-equivalent gas [TE(B)-TE] chambers. To use this method for dose monitoring in a neutron and gamma-ray field moderated by D2O from a Be-covered Li target (Be-covered Li BNCT field), the relative sensitivities of these ionization chambers are required. The relative sensitivities of the TE-TE, C-CO2, and TE(B)-TE chambers to fast neutron, gamma-ray, and boron doses are calculated with the Particle and Heavy Ion Transport code System (PHITS). The relative sensitivity of the TE(B)-TE chamber is calculated with the same method as for the TE-TE and C-CO2 chambers in the paired chamber method. In the Be-covered Li BNCT field, the relative sensitivities of the ionization chambers to fast neutron, gamma-ray, and boron doses are calculated from the kerma ratios, mass attenuation coefficient tissue-to-wall ratios, and W-values. The Be-covered Li BNCT field consists of neutrons and gamma-rays emitted from a Be-covered Li target, and this field is simulated using PHITS with the ENDF-VII cross section library. The kerma ratios and mass attenuation coefficient tissue-to-wall ratios are determined from the energy spectra of neutrons and gamma-rays in the Be-covered Li BNCT field. The W-value is calculated from the recoil charged-particle spectra produced by collisions of neutrons and gamma-rays with the wall and gas materials of the ionization chambers in the gas cavities of the TE-TE, C-CO2, and TE(B)-TE chambers (10B concentrations of 10, 50, and 100 ppm in the TE wall). The calculated relative sensitivity of the C-CO2 chamber to the fast neutron dose in the Be-covered Li BNCT field is 0.029, and those of the TE-TE and TE(B)-TE chambers are both 0.965. The relative sensitivities of the C-CO2, TE-TE, and TE(B)-TE chambers to the gamma-ray dose in the Be-covered Li BNCT field are all 1 within the 1% calculation uncertainty. The relative sensitivities of TE(B)-TE to the boron dose with concentrations of 10, 50, and 100 ppm 10B are calculated to be 0.865 times the ratio of the in-tumor to in-chamber-wall boron concentration. The fast neutron, gamma-ray, and boron doses of a tumor in air can be separately monitored by the triple ionization chamber method in the Be-covered Li BNCT field. The results show that these doses can be easily converted to the clinical dose with the depth correction factor in the body and the RBE.

  20. Test code for the assessment and improvement of Reynolds stress models

    NASA Technical Reports Server (NTRS)

    Rubesin, M. W.; Viegas, J. R.; Vandromme, D.; Minh, H. HA

    1987-01-01

    An existing two-dimensional, compressible flow, Navier-Stokes computer code, containing a full Reynolds stress turbulence model, was adapted for use as a test bed for assessing and improving turbulence models based on turbulence simulation experiments. To date, comparisons of code results with simulated channel flow and flow over an oscillating flat plate have shown that the turbulence model used in the code needs improvement for these flows. It is also shown that direct simulations of turbulent flows over a range of Reynolds numbers are needed to guide subsequent improvement of turbulence models.

  1. A Comprehensive Approach to Convert a Radiology Department From Coding Based on International Classification of Diseases, Ninth Revision, to Coding Based on International Classification of Diseases, Tenth Revision.

    PubMed

    McBee, Morgan P; Laor, Tal; Pryor, Rebecca M; Smith, Rachel; Hardin, Judy; Ulland, Lisa; May, Sally; Zhang, Bin; Towbin, Alexander J

    2018-02-01

    The purpose of this study was to adapt our radiology reports to provide the documentation required for specific International Classification of Diseases, Tenth Revision (ICD-10) diagnosis coding. Baseline data were analyzed to identify the reports with the greatest number of unspecified ICD-10 codes assigned by computer-assisted coding software. A two-part quality improvement initiative was subsequently implemented. The first component involved improving clinical histories by using technologists to obtain information directly from the patients or caregivers, which was then imported into the radiologist's report within the speech recognition software. The second component involved standardization of report terminology and creation of four different structured report templates to determine which yielded the fewest reports with an unspecified ICD-10 code assigned by an automated coding engine. In all, 12,077 reports were included in the baseline analysis. Of these, 5,151 (43%) had an unspecified ICD-10 code. The majority of deficient reports were for radiographs (n = 3,197; 62%). Inadequacies included insufficient clinical history and a lack of detailed fracture descriptions. Therefore, the focus was on standardizing terminology and testing different structured reports for radiographs obtained for fractures. At baseline, 58% of radiography reports contained a complete clinical history, with improvement to >95% 8 months later. The proportion of reports that contained an unspecified ICD-10 code improved from 43% at baseline to 27% at completion of this study (P < .0001). The number of radiology studies with a specific ICD-10 code can be improved through quality improvement methodology, specifically through the use of technologist-acquired clinical histories and structured reporting. Copyright © 2017 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  2. A novel concatenated code based on the improved SCG-LDPC code for optical transmission systems

    NASA Astrophysics Data System (ADS)

    Yuan, Jian-guo; Xie, Ya; Wang, Lin; Huang, Sheng; Wang, Yong

    2013-01-01

Based on the optimization and improvement of the construction method for the systematically constructed Gallager (SCG) (4, k) code, a novel SCG low-density parity-check (SCG-LDPC) (3969,3720) code suitable for optical transmission systems is constructed. The novel SCG-LDPC (6561,6240) code with a code rate of 95.1% is constructed by increasing the length of the SCG-LDPC (3969,3720) code, so that the code rate of LDPC codes can better meet the high requirements of optical transmission systems. A novel concatenated code is then constructed by concatenating the SCG-LDPC(6561,6240) code and the BCH(127,120) code, which has a code rate of 94.5%. The simulation results and analyses show that the net coding gain (NCG) of the BCH(127,120)+SCG-LDPC(6561,6240) concatenated code is 2.28 dB and 0.48 dB higher than that of the classic RS(255,239) code and the SCG-LDPC(6561,6240) code, respectively, at a bit error rate (BER) of 10^-7.

  3. National Combustion Code Parallel Performance Enhancements

    NASA Technical Reports Server (NTRS)

    Quealy, Angela; Benyo, Theresa (Technical Monitor)

    2002-01-01

    The National Combustion Code (NCC) is being developed by an industry-government team for the design and analysis of combustion systems. The unstructured grid, reacting flow code uses a distributed memory, message passing model for its parallel implementation. The focus of the present effort has been to improve the performance of the NCC code to meet combustor designer requirements for model accuracy and analysis turnaround time. Improving the performance of this code contributes significantly to the overall reduction in time and cost of the combustor design cycle. This report describes recent parallel processing modifications to NCC that have improved the parallel scalability of the code, enabling a two hour turnaround for a 1.3 million element fully reacting combustion simulation on an SGI Origin 2000.

  4. Current and anticipated uses of thermalhydraulic and neutronic codes at PSI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aksan, S.N.; Zimmermann, M.A.; Yadigaroglu, G.

    1997-07-01

The thermalhydraulic and/or neutronic codes in use at PSI mainly provide the capability to perform deterministic safety analysis for Swiss NPPs and also serve as analysis tools for experimental facilities for LWR and ALWR simulations. In relation to these applications, physical model development and improvements, and assessment of the codes are also essential components of the activities. In this paper, a brief overview is provided on the thermalhydraulic and/or neutronic codes used for safety analysis of LWRs, at PSI, and also of some experiences and applications with these codes. Based on these experiences, additional assessment needs are indicated, together with some model improvement needs. The future needs that could be used to specify both the development of a new code and also improvement of available codes are summarized.

  5. Measurement and simulation of the cross sections for nuclide production in {sup nat}W and {sup 181}Ta targets irradiated with 0.04- to 2.6-GeV protons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Titarenko, Yu. E., E-mail: Yury.Titarenko@itep.ru; Batyaev, V. F.; Titarenko, A. Yu.

The cross sections for nuclide production in thin {sup nat}W and {sup 181}Ta targets irradiated by 0.04- to 2.6-GeV protons have been measured by direct {gamma} spectrometry using two {gamma} spectrometers with resolutions of 1.8 and 1.7 keV at the {sup 60}Co 1332-keV {gamma} line. As a result, 1895 yields of radioactive residual product nuclei have been obtained. The {sup 27}Al(p, x){sup 22}Na reaction has been used as a monitor reaction. The experimental data have been compared with the MCNPX (BERTINI, ISABEL), CEM03.02, INCL4.2, INCL4.5, PHITS, and CASCADE07 calculations.

  6. Cross sections for nuclide production in proton- and deuteron-induced reactions on 93Nb measured using the inverse kinematics method

    NASA Astrophysics Data System (ADS)

    Nakano, Keita; Watanabe, Yukinobu; Kawase, Shoichiro; Wang, He; Otsu, Hideaki; Sakurai, Hiroyoshi; Takeuchi, Satoshi; Togano, Yasuhiro; Nakamura, Takashi; Maeda, Yukie; Ahn, Deuk Soon; Aikawa, Masayuki; Araki, Shouhei; Chen, Sidong; Chiga, Nobuyuki; Doornenbal, Pieter; Fukuda, Naoki; Ichihara, Takashi; Isobe, Tadaaki; Kawakami, Shunsuke; Kin, Tadahiro; Kondo, Yosuke; Koyama, Shunpei; Kubo, Toshiyuki; Kubono, Shigeru; Kurokawa, Meiko; Makinaga, Ayano; Matsushita, Masafumi; Matsuzaki, Teiichiro; Michimasa, Shin'ichiro; Momiyama, Satoru; Nagamine, Shunsuke; Niikura, Megumi; Ozaki, Tomoyuki; Saito, Atsumi; Saito, Takeshi; Shiga, Yoshiaki; Shikata, Mizuki; Shimizu, Yohei; Shimoura, Susumu; Sumikama, Toshiyuki; Söderström, Pär-Anders; Suzuki, Hiroshi; Takeda, Hiroyuki; Taniuchi, Ryo; Tsubota, Jun'ichi; Watanabe, Yasushi; Wimmer, Kathrin; Yamamoto, Tatsuya; Yoshida, Koichi

    2017-09-01

    Isotopic production cross sections were measured for proton- and deuteron-induced reactions on 93Nb by means of the inverse kinematics method at RIKEN Radioactive Isotope Beam Factory. The measured production cross sections of residual nuclei in the reaction 93Nb + p at 113 MeV/u were compared with previous data measured by the conventional activation method in the proton energy range between 46 and 249 MeV. The present inverse kinematics data of four reaction products (90Mo, 90Nb, 88Y, and 86Y) were in good agreement with the data of activation measurement. Also, the model calculations with PHITS describing the intra-nuclear cascade and evaporation processes generally well reproduced the measured isotopic production cross sections.

  7. Measurement and simulation of the cross sections for nuclide production in {sup 93}Nb and {sup nat}Ni targets irradiated with 0.04- to 2.6-GeV protons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Titarenko, Yu. E., E-mail: Yury.Titarenko@itep.ru; Batyaev, V. F.; Titarenko, A. Yu.

    The cross sections for nuclide production in thin {sup 93}Nb and {sup nat}Ni targets irradiated by 0.04- to 2.6-GeV protons have been measured by direct {gamma} spectrometry using two {gamma} spectrometers with the resolutions of 1.8 and 1.7 keV in the {sup 60}Co 1332-keV {gamma} line. As a result, 1112 yields of radioactive residual nuclei have been obtained. The {sup 27}Al(p, x){sup 22}Na reaction has been used as a monitor reaction. The experimental data have been compared with the MCNPX (BERTINI, ISABEL), CEM03.02, INCL4.2, INCL4.5, PHITS, and CASCADE07 calculations.

  8. Measurement and simulation of the cross sections for nuclide production in {sup 56}Fe and {sup nat}Cr targets irradiated with 0.04- to 2.6-GeV protons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Titarenko, Yu. E., E-mail: Yury.Titarenko@itep.ru; Batyaev, V. F.; Titarenko, A. Yu.

    The cross sections for nuclide production in thin {sup 56}Fe and {sup nat}Cr targets irradiated by 0.04-2.6-GeV protons are measured by direct {gamma} spectrometry using two {gamma} spectrometers with the resolutions of 1.8 and 1.7 keV for the {sup 60}Co 1332-keV {gamma} line. As a result, 649 yields of radioactive residual product nuclei have been obtained. The {sup 27}Al(p, x){sup 22}Na reaction has been used as a monitor reaction. The experimental data are compared with the MCNPX (BERTINI, ISABEL), CEM03.02, INCL4.2, INCL4.5, PHITS, and CASCADE07 calculations.

  9. An Ethnographically Informed Participatory Design of Primary Healthcare Information Technology in a Developing Country Setting.

    PubMed

    Shidende, Nima Herman; Igira, Faraja Teddy; Mörtberg, Christina Margaret

    2017-01-01

Ethnography, with its emphasis on understanding activities where they occur, and its use of qualitative data-gathering techniques rich in description, has a long tradition in Participatory Design (PD). Yet there are limited methodological insights into its application in developing countries. This paper proposes an ethnographically informed PD approach, which can be applied when designing Primary Healthcare Information Technology (PHIT). We use findings from a larger multidisciplinary project, the Health Information Systems Project (HISP), to elaborate how ethnography can be used to facilitate the participation of health practitioners in developing-country settings, as well as to indicate the importance of the ethnographic approach to participatory Health Information Technology (HIT) designers. Furthermore, the paper discusses the pros and cons of using an ethnographic approach in designing HIT.

  10. A code-aided carrier synchronization algorithm based on improved nonbinary low-density parity-check codes

    NASA Astrophysics Data System (ADS)

    Bai, Cheng-lin; Cheng, Zhi-hui

    2016-09-01

In order to further improve the carrier synchronization estimation range and accuracy at low signal-to-noise ratio (SNR), this paper proposes a code-aided carrier synchronization algorithm based on improved nonbinary low-density parity-check (NB-LDPC) codes to study the polarization-division-multiplexing coherent optical orthogonal frequency division multiplexing (PDM-CO-OFDM) system performance in the cases of quadrature phase shift keying (QPSK) and 16 quadrature amplitude modulation (16-QAM) modes. The simulation results indicate that this algorithm can enlarge the frequency and phase offset estimation ranges and greatly enhance the accuracy of the system, and the bit error rate (BER) performance of the system is improved effectively compared with that of a system employing the traditional NB-LDPC code-aided carrier synchronization algorithm.

  11. Probability Quantization for Multiplication-Free Binary Arithmetic Coding

    NASA Technical Reports Server (NTRS)

    Cheung, K. -M.

    1995-01-01

    A method has been developed to improve on Witten's binary arithmetic coding procedure of tracking a high value and a low value. The new method approximates the probability of the less probable symbol, which improves the worst-case coding efficiency.
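The high/low interval tracking at the heart of Witten-style binary arithmetic coding can be sketched in a few lines. This is an illustrative floating-point toy (the function names and structure are assumptions, not the paper's method; practical coders use integer arithmetic with renormalization, and the paper's multiplication-free probability approximation is not reproduced here):

```python
def ac_encode(bits, p0):
    """Shrink [low, high) around each symbol; p0 = probability of a 0 bit."""
    low, high = 0.0, 1.0
    for b in bits:
        # Split point: the multiplication a multiplication-free coder approximates.
        mid = low + (high - low) * p0
        if b == 0:
            high = mid   # 0 takes the lower sub-interval
        else:
            low = mid    # 1 takes the upper sub-interval
    return (low + high) / 2  # any value inside the final interval decodes

def ac_decode(x, p0, n):
    """Replay the same interval splits to recover n bits from the code value x."""
    low, high = 0.0, 1.0
    out = []
    for _ in range(n):
        mid = low + (high - low) * p0
        if x < mid:
            out.append(0)
            high = mid
        else:
            out.append(1)
            low = mid
    return out
```

Round-tripping a short sequence, e.g. `ac_decode(ac_encode([0, 1, 1, 0, 0], 0.7), 0.7, 5)`, returns the original bits; this float version loses precision beyond a few dozen symbols, which is exactly why production coders renormalize.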

  12. A simple clinical coding strategy to improve recording of child maltreatment concerns: an audit study.

    PubMed

    McGovern, Andrew Peter; Woodman, Jenny; Allister, Janice; van Vlymen, Jeremy; Liyanage, Harshana; Jones, Simon; Rafi, Imran; de Lusignan, Simon; Gilbert, Ruth

    2015-01-14

Recording concerns about child maltreatment, including minor concerns, is recommended by the General Medical Council (GMC) and National Institute for Health and Clinical Excellence (NICE), but there is evidence of substantial under-recording. To determine whether a simple coding strategy improved recording of maltreatment-related concerns in electronic primary care records. Clinical audit of rates of maltreatment-related coding before (January 2010 to December 2011) and after (January to December 2012) implementation of a simple coding strategy in 11 English family practices. The strategy included encouraging general practitioners to use, always and as a minimum, the Read code 'Child is cause for concern'. A total of 25,106 children aged 0-18 years were registered with these practices. We also undertook a qualitative service evaluation to investigate barriers to recording. Outcomes were recording of 1) any maltreatment-related codes, 2) child protection proceedings and 3) child was a cause for concern. We found increased recording of any maltreatment-related code (rate ratio 1.4; 95% CI 1.1-1.6), child protection proceedings (RR 1.4; 95% CI 1.1-1.6) and cause for concern (RR 2.5; 95% CI 1.8-3.4) after implementation of the coding strategy. Clinicians cited the simplicity of the coding strategy as the most important factor assisting implementation. This simple coding strategy improved clinicians' recording of maltreatment-related concerns in a small sample of practices with some 'buy-in'. Further research should investigate how recording can best support the doctor-patient relationship. HOW THIS FITS IN: Recording concerns about child maltreatment, including minor concerns, is recommended by the General Medical Council (GMC) and National Institute for Health and Clinical Excellence (NICE), but there is evidence of substantial under-recording. 
We describe a simple clinical coding strategy that helped general practitioners to improve recording of maltreatment-related concerns. These improvements could improve case finding of children at risk and information sharing.

  13. Improvements in the MGA Code Provide Flexibility and Better Error Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruhter, W D; Kerr, J

    2005-05-26

The Multi-Group Analysis (MGA) code is widely used to determine nondestructively the relative isotopic abundances of plutonium by gamma-ray spectrometry. MGA users have expressed concern about the lack of flexibility and transparency in the code. Users often have to ask the code developers for modifications to the code to accommodate new measurement situations, such as additional peaks being present in the plutonium spectrum or expected peaks being absent. We are testing several new improvements to a prototype, general gamma-ray isotopic analysis tool with the intent of either revising or replacing the MGA code. These improvements will give the user the ability to modify, add, or delete the gamma- and x-ray energies and branching intensities used by the code in determining a more precise gain and in the determination of the relative detection efficiency. We have also fully integrated the determination of the relative isotopic abundances with the determination of the relative detection efficiency to provide a more accurate determination of the errors in the relative isotopic abundances. We provide details in this paper on these improvements and a comparison of results obtained with current versions of the MGA code.

  14. Optimization of Particle-in-Cell Codes on RISC Processors

    NASA Technical Reports Server (NTRS)

    Decyk, Viktor K.; Karmesin, Steve Roy; Boer, Aeint de; Liewer, Paulette C.

    1996-01-01

General strategies are developed to optimize particle-in-cell codes written in Fortran for the RISC processors commonly used on massively parallel computers. These strategies include data reorganization to improve cache utilization and code reorganization to improve the efficiency of arithmetic pipelines.
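One data reorganization commonly used in particle-in-cell codes, offered here as a generic illustration rather than the specific strategy of this paper, is to keep particles sorted by the grid cell they occupy, so that field-gather and deposit loops stream through memory contiguously:

```python
import numpy as np

# Illustrative particle arrays in a structure-of-arrays layout.
rng = np.random.default_rng(0)
n_particles, n_cells = 10_000, 64
cell = rng.integers(0, n_cells, n_particles)  # cell index of each particle
vx = rng.random(n_particles)                  # a per-particle quantity

# Reorder every particle array by cell index; afterwards all particles in
# the same cell are adjacent, so a loop over cells reads memory sequentially
# instead of hopping across the arrays (better cache utilization).
order = np.argsort(cell, kind="stable")
cell, vx = cell[order], vx[order]

assert (np.diff(cell) >= 0).all()  # particles are now grouped cell by cell
```

In practice the sort is repeated only every few time steps, since particles drift between cells slowly relative to the cost of re-sorting.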

  15. Evaluation of three coding schemes designed for improved data communication

    NASA Technical Reports Server (NTRS)

    Snelsire, R. W.

    1974-01-01

Three coding schemes designed for improved data communication are evaluated. Four block codes are evaluated relative to a quality function, which depends on both the amount of data rejected and the error rate. The Viterbi maximum-likelihood decoding algorithm is reviewed as a decoding procedure. The evaluation is obtained by simulating the system on a digital computer. Short-constraint-length rate-1/2 quick-look codes are studied, and their performance is compared to that of general nonsystematic codes.
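The Viterbi algorithm the report reviews can be demonstrated on the classic rate-1/2, constraint-length-3 convolutional code with generators (7, 5) in octal. This is a generic textbook sketch (the specific codes simulated in the report are not reproduced):

```python
# Rate-1/2, constraint-length-3 convolutional code, generators (7, 5) octal.
G = [0b111, 0b101]

def conv_encode(bits):
    """Encode a bit list; two zero flush bits terminate the trellis in state 0."""
    state, out = 0, []
    for u in bits + [0, 0]:
        reg = (u << 2) | state          # register = [u, previous two input bits]
        for g in G:
            out.append(bin(reg & g).count("1") & 1)  # parity of the tapped bits
        state = reg >> 1                # shift: u becomes the newest history bit
    return out

def viterbi_decode(rx, n):
    """Hard-decision Viterbi: keep the best-metric path into each of 4 states."""
    INF = float("inf")
    metrics = [0, INF, INF, INF]        # encoder starts in state 0
    paths = [[] for _ in range(4)]
    for t in range(0, len(rx), 2):
        r = rx[t:t + 2]
        new_metrics, new_paths = [INF] * 4, [None] * 4
        for s in range(4):
            if metrics[s] == INF:
                continue
            for u in (0, 1):            # try both input bits from state s
                reg = (u << 2) | s
                expected = [bin(reg & g).count("1") & 1 for g in G]
                branch = (expected[0] != r[0]) + (expected[1] != r[1])
                ns = reg >> 1
                if metrics[s] + branch < new_metrics[ns]:
                    new_metrics[ns] = metrics[s] + branch
                    new_paths[ns] = paths[s] + [u]
        metrics, paths = new_metrics, new_paths
    best = min(range(4), key=lambda s: metrics[s])
    return paths[best][:n]              # strip the two flush bits
```

Because this code's free distance is 5, the terminated decoder recovers the message exactly even when any single channel bit is flipped.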

  16. Automatic Data Traffic Control on DSM Architecture

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael; Jin, Hao-Qiang; Yan, Jerry; Kwak, Dochan (Technical Monitor)

    2000-01-01

We study data traffic on distributed shared memory machines and conclude that data placement and grouping improve the performance of scientific codes. We present several methods that users can employ to improve data traffic in their codes. We report on the implementation of a tool which detects the code fragments causing data congestion and advises the user on improvements to data routing in these fragments. The capabilities of the tool include deduction of data alignment and affinity from the source code; detection of code constructs having abnormally high cache or TLB misses; and generation of data placement constructs. We demonstrate the capabilities of the tool in experiments with the NAS parallel benchmarks and with a simple computational fluid dynamics application, ARC3D.

  17. Update to the NASA Lewis Ice Accretion Code LEWICE

    NASA Technical Reports Server (NTRS)

    Wright, William B.

    1994-01-01

    This report is intended as an update to NASA CR-185129 'User's Manual for the NASA Lewis Ice Accretion Prediction Code (LEWICE).' It describes modifications and improvements made to this code as well as changes to the input and output files, interactive input, and graphics output. The comparison of this code to experimental data is shown to have improved as a result of these modifications.

  18. Computer-assisted coding and clinical documentation: first things first.

    PubMed

    Tully, Melinda; Carmichael, Angela

    2012-10-01

Computer-assisted coding tools have the potential to drive improvements in seven areas: transparency of coding; productivity (generally by 20 to 25 percent for inpatient claims); accuracy (by improving specificity of documentation); cost containment (by reducing overtime expenses, audit fees, and denials); compliance; efficiency; and consistency.

  19. Training and support to improve ICD coding quality: A controlled before-and-after impact evaluation.

    PubMed

    Dyers, Robin; Ward, Grant; Du Plooy, Shane; Fourie, Stephanus; Evans, Juliet; Mahomed, Hassan

    2017-05-24

    The proposed National Health Insurance policy for South Africa (SA) requires hospitals to maintain high-quality International Statistical Classification of Diseases (ICD) codes for patient records. While considerable strides had been made to improve ICD coding coverage by digitising the discharge process in the Western Cape Province, further intervention was required to improve data quality. The aim of this controlled before-and-after study was to evaluate the impact of a clinician training and support initiative to improve ICD coding quality. To compare ICD coding quality between two central hospitals in the Western Cape before and after the implementation of a training and support initiative for clinicians at one of the sites. The difference in differences in data quality between the intervention site and the control site was calculated. Multiple logistic regression was also used to determine the odds of data quality improvement after the intervention and to adjust for potential differences between the groups. The intervention had a positive impact of 38.0% on ICD coding completeness over and above changes that occurred at the control site. Relative to the baseline, patient records at the intervention site had a 6.6 (95% confidence interval 3.5 - 16.2) adjusted odds ratio of having a complete set of ICD codes for an admission episode after the introduction of the training and support package. The findings on impact on ICD coding accuracy were not significant. There is sufficient pragmatic evidence that a training and support package will have a considerable positive impact on ICD coding completeness in the SA setting.
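The difference-in-differences calculation used in the study is simple enough to state directly. The numbers below are hypothetical placeholders chosen only to illustrate the arithmetic, not the study's actual completeness rates:

```python
def diff_in_diff(intervention_pre, intervention_post, control_pre, control_post):
    """Impact = change at the intervention site minus change at the control
    site, which nets out trends affecting both sites equally."""
    return (intervention_post - intervention_pre) - (control_post - control_pre)

# Hypothetical ICD-coding completeness proportions before/after the intervention.
impact = diff_in_diff(0.40, 0.82, 0.42, 0.46)
print(f"{impact:.0%}")  # 42-point gain minus 4-point control gain = 38% impact
```

Adjusting for baseline differences between the sites, as the study does with multiple logistic regression, is a separate step on top of this raw contrast.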

  20. Improving coding accuracy in an academic practice.

    PubMed

    Nguyen, Dana; O'Mara, Heather; Powell, Robert

    2017-01-01

Practice management has become an increasingly important component of graduate medical education. This applies to every practice environment: private, academic, and military. One of the most critical aspects of practice management is documentation and coding for physician services, as they directly affect the financial success of any practice. Our quality improvement project aimed to implement a new and innovative method for teaching billing and coding in a longitudinal fashion in a family medicine residency. We hypothesized that implementation of a new teaching strategy would increase coding accuracy rates among residents and faculty. Design: single group, pretest-posttest. Setting: military family medicine residency clinic. Study populations: 7 faculty physicians and 18 resident physicians participated as learners in the project. Educational intervention: monthly structured coding learning sessions in the academic curriculum that involved learner-presented cases, small-group case review, and large-group discussion. Outcomes measured: overall coding accuracy (compliance) percentage and coding accuracy per year group for the subjects who were able to participate longitudinally. Statistical tests used: average coding accuracy for the population; paired t test to assess improvement between two intervention periods, both aggregate and by year group. Overall coding accuracy rates remained stable over the course of time regardless of the modality of the educational intervention. A paired t test was conducted to compare coding accuracy rates at baseline (mean (M)=26.4%, SD=10%) to accuracy rates after all educational interventions were complete (M=26.8%, SD=12%); t24=-0.127, P=.90. Didactic teaching and small-group discussion sessions did not improve overall coding accuracy in a residency practice. Future interventions could focus on educating providers at the individual level.

  1. Application discussion of source coding standard in voyage data recorder

    NASA Astrophysics Data System (ADS)

    Zong, Yonggang; Zhao, Xiandong

    2018-04-01

This paper analyzes the disadvantages of the audio and video compression coding technology used by the voyage data recorder and takes into account the improved performance of audio and video acquisition equipment. An approach to improving the audio and video compression coding technology of the voyage data recorder is proposed, and the feasibility of adopting the new compression coding technology is analyzed from both economic and technical aspects.

  2. Leveraging Code Comments to Improve Software Reliability

    ERIC Educational Resources Information Center

    Tan, Lin

    2009-01-01

    Commenting source code has long been a common practice in software development. This thesis, consisting of three pieces of work, made novel use of the code comments written in natural language to improve software reliability. Our solution combines Natural Language Processing (NLP), Machine Learning, Statistics, and Program Analysis techniques to…

  3. Software quality and process improvement in scientific simulation codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ambrosiano, J.; Webster, R.

    1997-11-01

This report contains viewgraphs on the quest to develop better simulation code quality through process modeling and improvement. This study is based on the experience of the authors and interviews with ten subjects chosen from simulation code development teams at LANL. This study is descriptive rather than scientific.

  4. Use, Assessment, and Improvement of the Loci-CHEM CFD Code for Simulation of Combustion in a Single Element GO2/GH2 Injector and Chamber

    NASA Technical Reports Server (NTRS)

    Westra, Douglas G.; Lin, Jeff; West, Jeff; Tucker, Kevin

    2006-01-01

    This document is a viewgraph presentation of a paper that documents a continuing effort at Marshall Space Flight Center (MSFC) to use, assess, and continually improve CFD codes to the point of material utility in the design of rocket engine combustion devices. This paper describes how the code is presently being used to simulate combustion in a single element combustion chamber with shear coaxial injectors using gaseous oxygen and gaseous hydrogen propellants. The ultimate purpose of the efforts documented is to assess and further improve the Loci-CHEM code and the implementation of it. Single element shear coaxial injectors were tested as part of the Staged Combustion Injector Technology (SCIT) program, where detailed chamber wall heat fluxes were measured. Data was taken over a range of chamber pressures for propellants injected at both ambient and elevated temperatures. Several test cases are simulated as part of the effort to demonstrate use of the Loci-CHEM CFD code and to enable us to make improvements in the code as needed. The simulations presented also include a grid independence study on hybrid grids. Several two-equation eddy viscosity low Reynolds number turbulence models are also evaluated as part of the study. All calculations are presented with a comparison to the experimental data. Weaknesses of the code relative to test data are discussed and continuing efforts to improve the code are presented.

  5. The optimal code searching method with an improved criterion of coded exposure for remote sensing image restoration

    NASA Astrophysics Data System (ADS)

    He, Lirong; Cui, Guangmang; Feng, Huajun; Xu, Zhihai; Li, Qi; Chen, Yueting

    2015-03-01

Coded exposure photography makes motion de-blurring a well-posed problem. The integration pattern of light is modulated by opening and closing the shutter within the exposure time, changing the traditional shutter frequency spectrum into a wider frequency band in order to preserve more image information in the frequency domain. The method used to search for the optimal code is significant for coded exposure. In this paper, an improved criterion for the optimal code search is proposed by analyzing the relationship between the code length and the number of ones in the code, considering the noise effect on code selection with the affine noise model. The optimal code is then obtained using a genetic search algorithm based on the proposed selection criterion. Experimental results show that the time consumed in searching for the optimal code decreases with the presented method. The restored image shows better subjective quality and superior objective evaluation values.

  6. A multicenter collaborative approach to reducing pediatric codes outside the ICU.

    PubMed

    Hayes, Leslie W; Dobyns, Emily L; DiGiovine, Bruno; Brown, Ann-Marie; Jacobson, Sharon; Randall, Kelly H; Wathen, Beth; Richard, Heather; Schwab, Carolyn; Duncan, Kathy D; Thrasher, Jodi; Logsdon, Tina R; Hall, Matthew; Markovitz, Barry

    2012-03-01

    The Child Health Corporation of America formed a multicenter collaborative to decrease the rate of pediatric codes outside the ICU by 50%, double the days between these events, and improve the patient safety culture scores by 5 percentage points. A multidisciplinary pediatric advisory panel developed a comprehensive change package of process improvement strategies and measures for tracking progress. Learning sessions, conference calls, and data submission facilitated collaborative group learning and implementation. Twenty Child Health Corporation of America hospitals participated in this 12-month improvement project. Each hospital identified at least 1 noncritical care target unit in which to implement selected elements of the change package. Strategies to improve prevention, detection, and correction of the deteriorating patient ranged from relatively simple, foundational changes to more complex, advanced changes. Each hospital selected a broad range of change package elements for implementation using rapid-cycle methodologies. The primary outcome measure was reduction in codes per 1000 patient days. Secondary outcomes were days between codes and change in patient safety culture scores. Code rate for the collaborative did not decrease significantly (3% decrease). Twelve hospitals reported additional data after the collaborative and saw significant improvement in code rates (24% decrease). Patient safety culture scores improved by 4.5% to 8.5%. A complex process, such as patient deterioration, requires sufficient time and effort to achieve improved outcomes and create a deeply embedded culture of patient safety. The collaborative model can accelerate improvements achieved by individual institutions.

  7. Quality improvement utilizing in-situ simulation for a dual-hospital pediatric code response team.

    PubMed

    Yager, Phoebe; Collins, Corey; Blais, Carlene; O'Connor, Kathy; Donovan, Patricia; Martinez, Maureen; Cummings, Brian; Hartnick, Christopher; Noviski, Natan

    2016-09-01

    Given the rarity of in-hospital pediatric emergency events, identification of gaps and inefficiencies in the code response can be difficult. In-situ, simulation-based medical education programs can identify unrecognized systems-based challenges. We hypothesized that developing an in-situ, simulation-based pediatric emergency response program would identify latent inefficiencies in a complex, dual-hospital pediatric code response system and allow rapid intervention testing to improve performance before implementation at an institutional level. Pediatric leadership from two hospitals with a shared pediatric code response team employed the Institute for Healthcare Improvement's (IHI) Breakthrough Model for Collaborative Improvement to design a program consisting of Plan-Do-Study-Act cycles occurring in a simulated environment. The objectives of the program were to 1) identify inefficiencies in our pediatric code response; 2) correlate to current workflow; 3) employ an iterative process to test quality improvement interventions in a safe environment; and 4) measure performance before actual implementation at the institutional level. Twelve dual-hospital, in-situ, simulated, pediatric emergencies occurred over one year. The initial simulated event allowed identification of inefficiencies including delayed provider response, delayed initiation of cardiopulmonary resuscitation (CPR), and delayed vascular access. These gaps were linked to process issues including unreliable code pager activation, slow elevator response, and lack of responder familiarity with layout and contents of code cart. From first to last simulation with multiple simulated process improvements, code response time for secondary providers coming from the second hospital decreased from 29 to 7 min, time to CPR initiation decreased from 90 to 15 s, and vascular access obtainment decreased from 15 to 3 min. 
Some of these simulated process improvements were adopted into the institutional response while others continue to be trended over time for evidence that observed changes represent a true new state of control. Utilizing the IHI's Breakthrough Model, we developed a simulation-based program to 1) successfully identify gaps and inefficiencies in a complex, dual-hospital, pediatric code response system and 2) provide an environment in which to safely test quality improvement interventions before institutional dissemination. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  8. FDA adverse Event Problem Codes: standardizing the classification of device and patient problems associated with medical device use.

    PubMed

    Reed, Terrie L; Kaufman-Rivi, Diana

    2010-01-01

    The broad array of medical devices and the potential for device failures, malfunctions, and other adverse events associated with each device creates a challenge for public health device surveillance programs. Coding reported events by type of device problem provides one method for identifying a potential signal of a larger device issue. The Food and Drug Administration's (FDA) Center for Devices and Radiological Health (CDRH) Event Problem Codes that are used to report adverse events previously lacked a structured set of controls for code development and maintenance. Over time this led to inconsistent, ambiguous, and duplicative concepts being added to the code set on an ad-hoc basis. Recognizing the limitation of its coding system the FDA set out to update the system to improve its usefulness within FDA and as a basis of a global standard to identify important patient and device outcomes throughout the medical community. In 2004, FDA and the National Cancer Institute (NCI) signed a Memorandum of Understanding (MOU) whereby NCI agreed to provide terminology development and maintenance services to all FDA Centers. Under this MOU, CDRH's Office of Surveillance and Biometrics (OSB) convened a cross-Center workgroup and collaborated with staff at NCI Enterprise Vocabulary Service (EVS) to streamline the Patient and Device Problem Codes and integrate them into the NCI Thesaurus and Meta-Thesaurus. This initiative included many enhancements to the Event Problem Codes aimed at improving code selection as well as improving adverse event report analysis. LIMITATIONS & RECOMMENDATIONS: Staff resources, database concerns, and limited collaboration with external groups in the initial phases of the project are discussed. Adverse events associated with medical device use can be better understood when they are reported using a consistent and well-defined code set. 
This FDA initiative was an attempt to improve the structure and add control mechanisms to an existing code set, improve analysis tools that will better identify device safety trends, and improve the ability to prevent or mitigate effects of adverse events associated with medical device use.

  9. Comparing the coding of complications in Queensland and Victorian admitted patient data.

    PubMed

    Michel, Jude L; Cheng, Diana; Jackson, Terri J

    2011-08-01

To examine differences between Queensland and Victorian coding of hospital-acquired conditions and suggest ways to improve the usefulness of these data in the monitoring of patient safety events. Secondary analysis of admitted patient episode data collected in Queensland and Victoria. Comparison of depth of coding, and of patterns in the coding of ten commonly coded complications of five elective procedures. Comparison of the mean number of complication codes assigned per episode revealed that Victoria assigns more valid codes than Queensland for all procedures, the difference being statistically significant in every case. The proportion of codes flagged as complications was consistently lower for Queensland across the 10 common complications for each of the five selected elective procedures. The estimated complication rates for the five procedures showed Victoria to have an apparently higher complication rate than Queensland for 35 of the 50 complications examined. Our findings demonstrate that the coding of complications is more comprehensive in Victoria than in Queensland. It is known that inconsistencies exist between states in routine hospital data quality. Comparative use of patient safety indicators should be viewed with caution until standards are improved across Australia. More exploration of data quality issues is needed to identify areas for improvement.

  10. A multidisciplinary approach to vascular surgery procedure coding improves coding accuracy, work relative value unit assignment, and reimbursement.

    PubMed

    Aiello, Francesco A; Judelson, Dejah R; Messina, Louis M; Indes, Jeffrey; FitzGerald, Gordon; Doucet, Danielle R; Simons, Jessica P; Schanzer, Andres

    2016-08-01

    Vascular surgery procedural reimbursement depends on accurate procedural coding and documentation. Despite the critical importance of correct coding, there has been a paucity of research focused on the effect of direct physician involvement. We hypothesize that direct physician involvement in procedural coding will lead to improved coding accuracy, increased work relative value unit (wRVU) assignment, and increased physician reimbursement. This prospective observational cohort study evaluated procedural coding accuracy of fistulograms at an academic medical institution (January-June 2014). All fistulograms were coded by institutional coders (traditional coding) and by a single vascular surgeon whose codes were verified by two institution coders (multidisciplinary coding). The coding methods were compared, and differences were translated into revenue and wRVUs using the Medicare Physician Fee Schedule. Comparison between traditional and multidisciplinary coding was performed for three discrete study periods: baseline (period 1), after a coding education session for physicians and coders (period 2), and after a coding education session with implementation of an operative dictation template (period 3). The accuracy of surgeon operative dictations during each study period was also assessed. An external validation at a second academic institution was performed during period 1 to assess and compare coding accuracy. During period 1, traditional coding resulted in a 4.4% (P = .004) loss in reimbursement and a 5.4% (P = .01) loss in wRVUs compared with multidisciplinary coding. During period 2, no significant difference was found between traditional and multidisciplinary coding in reimbursement (1.3% loss; P = .24) or wRVUs (1.8% loss; P = .20). During period 3, traditional coding yielded a higher overall reimbursement (1.3% gain; P = .26) than multidisciplinary coding. 
This increase, however, was due to errors by institution coders, with six inappropriately used codes resulting in a higher overall reimbursement that was subsequently corrected. Assessment of physician documentation showed improvement, with decreased documentation errors at each period (11% vs 3.1% vs 0.6%; P = .02). Overall, between period 1 and period 3, multidisciplinary coding resulted in a significant increase in additional reimbursement ($17.63 per procedure; P = .004) and wRVUs (0.50 per procedure; P = .01). External validation at a second academic institution was performed to assess coding accuracy during period 1. Similar to institution 1, traditional coding revealed an 11% loss in reimbursement ($13,178 vs $14,630; P = .007) and a 12% loss in wRVU (293 vs 329; P = .01) compared with multidisciplinary coding. Physician involvement in the coding of endovascular procedures leads to improved procedural coding accuracy, increased wRVU assignments, and increased physician reimbursement. Copyright © 2016 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.

  11. Applicability of Monte-Carlo Simulation to Equipment Design of Radioactive Noble Gas Monitor

    NASA Astrophysics Data System (ADS)

    Sakai, Hirotaka; Hattori, Kanako; Umemura, Norihiro

In nuclear facilities, radioactive noble gas is continuously monitored using a radioactive noble gas monitor with a beta-sensitive plastic scintillation radiation detector. The detection efficiency of the monitor is generally calibrated using a calibration loop and standard radioactive noble gases such as 85Kr. In this study, the applicability of PHITS to the equipment design of the radioactive noble gas monitor was evaluated by comparing calculated results with the results of actual calibration loop tests, with the aim of simplifying the design evaluation of radiation monitors. It was confirmed that, once the monitor was modeled, the calculated results agreed well with the test results. In addition, the key parameters for equipment design, such as the thickness of the detector window and the depth of the sampler, were identified and evaluated.

  12. Spallation reaction study for fission products in nuclear waste: Cross section measurements for 137Cs and 90Sr on proton and deuteron

    NASA Astrophysics Data System (ADS)

    Wang, H.; Otsu, H.; Sakurai, H.; Ahn, D. S.; Aikawa, M.; Doornenbal, P.; Fukuda, N.; Isobe, T.; Kawakami, S.; Koyama, S.; Kubo, T.; Kubono, S.; Lorusso, G.; Maeda, Y.; Makinaga, A.; Momiyama, S.; Nakano, K.; Niikura, M.; Shiga, Y.; Söderström, P.-A.; Suzuki, H.; Takeda, H.; Takeuchi, S.; Taniuchi, R.; Watanabe, Ya.; Watanabe, Yu.; Yamasaki, H.; Yoshida, K.

    2016-03-01

    We have studied spallation reactions for the fission products 137Cs and 90Sr for the purpose of nuclear waste transmutation. The spallation cross sections on the proton and deuteron were obtained in inverse kinematics for the first time using secondary beams of 137Cs and 90Sr at 185 MeV/nucleon at the RIKEN Radioactive Isotope Beam Factory. The target dependence has been investigated systematically, and the cross-section differences between the proton and deuteron are found to be larger for lighter spallation products. The experimental data are compared with the PHITS calculation, which includes cascade and evaporation processes. Our results suggest that both proton- and deuteron-induced spallation reactions are promising mechanisms for the transmutation of radioactive fission products.

  13. Quality Scalability Aware Watermarking for Visual Content.

    PubMed

    Bhowmik, Deepayan; Abhayaratne, Charith

    2016-11-01

Scalable coding-based content adaptation poses serious challenges to traditional watermarking algorithms, which do not consider the scalable coding structure and hence cannot guarantee correct watermark extraction in the media consumption chain. In this paper, we propose a novel concept of scalable blind watermarking that ensures more robust watermark extraction at various compression ratios while not affecting the visual quality of the host media. The proposed algorithm generates a scalable and robust watermarked image code-stream that allows the user to constrain embedding distortion for target content adaptations. The watermarked image code-stream consists of hierarchically nested joint distortion-robustness coding atoms. The code-stream is generated by a new wavelet-domain blind watermarking algorithm guided by a quantization-based binary tree. The code-stream can be truncated at any distortion-robustness atom to generate the watermarked image with the desired distortion-robustness requirements. A blind extractor is capable of extracting the watermark data from the watermarked images. The algorithm is further extended to incorporate the bit-plane discarding-based quantization model used in scalable coding-based content adaptation, e.g., JPEG2000. This improves the robustness against quality scalability of JPEG2000 compression. The simulation results verify the feasibility of the proposed concept, its applications, and its improved robustness against quality scalable content adaptation. Our proposed algorithm also outperforms existing methods, showing a 35% improvement. In terms of robustness to quality scalable video content adaptation using Motion JPEG2000 and wavelet-based scalable video coding, the proposed method shows major improvement for video watermarking.

  14. [Transposition errors during learning to reproduce a sequence by the right- and the left-hand movements: simulation of positional and movement coding].

    PubMed

    Liakhovetskiĭ, V A; Bobrova, E V; Skopin, G N

    2012-01-01

Transposition errors during the reproduction of a hand movement sequence make it possible to obtain important information on the internal representation of this sequence in the motor working memory. Analysis of such errors showed that learning to reproduce sequences of left-hand movements improves the system of positional coding (coding of positions), while learning of right-hand movements improves the system of vector coding (coding of movements). Learning of right-hand movements after left-hand performance involved the system of positional coding "imposed" by the left hand. Learning of left-hand movements after right-hand performance activated the system of vector coding. Transposition errors during learning to reproduce movement sequences can be explained by a neural network using either vector coding or both vector and positional coding.

  15. Improved Iterative Decoding of Network-Channel Codes for Multiple-Access Relay Channel.

    PubMed

    Majumder, Saikat; Verma, Shrish

    2015-01-01

Cooperative communication using relay nodes is one of the most effective means of exploiting space diversity for low-cost nodes in a wireless network. In cooperative communication, users, besides communicating their own information, also relay the information of other users. In this paper we investigate a scheme where cooperation is achieved using a common relay node which performs network coding to provide space diversity for two information nodes transmitting to a base station. We propose a scheme which uses a Reed-Solomon error-correcting code for encoding the information bits at the user nodes and a convolutional code as the network code, instead of XOR-based network coding. Based on this encoder, we propose iterative soft decoding of the joint network-channel code by treating it as a concatenated Reed-Solomon convolutional code. Simulation results show significant improvement in performance compared to the existing scheme based on compound codes.
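The XOR-based baseline that the proposed Reed-Solomon/convolutional scheme replaces can be sketched in a few lines; the packet contents and function names below are invented for illustration, not taken from the paper:

```python
# Sketch of XOR-based network coding at a relay: the relay transmits the
# bitwise XOR of the two users' packets, and the base station recovers
# either packet given the other, since XOR is its own inverse.

def xor_network_code(packet_a: bytes, packet_b: bytes) -> bytes:
    """Relay combines the two users' equal-length packets with bitwise XOR."""
    assert len(packet_a) == len(packet_b)
    return bytes(a ^ b for a, b in zip(packet_a, packet_b))

def recover(relayed: bytes, known: bytes) -> bytes:
    """Base station recovers the missing packet from the relayed XOR."""
    return xor_network_code(relayed, known)

a, b = b"user-A-data", b"user-B-data"   # hypothetical payloads
r = xor_network_code(a, b)
assert recover(r, a) == b and recover(r, b) == a
```

This redundancy gives diversity (a lost packet can be rebuilt from the relay transmission) but, unlike a convolutional network code, contributes no additional coding gain to the iterative decoder.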

  16. Comparison of the mean quality factors for astronauts calculated using the Q-functions proposed by ICRP, ICRU, and NASA

    NASA Astrophysics Data System (ADS)

    Sato, T.; Endo, A.; Niita, K.

    2013-07-01

For the estimation of the radiation risk for astronauts, not only the organ absorbed doses but also their mean quality factors must be evaluated. Three functions have been proposed by different organizations for expressing the radiation quality: the Q(L), Q(y), and QNASA(Z, E) relationships as defined in International Commission on Radiological Protection (ICRP) Publication 60, International Commission on Radiation Units and Measurements (ICRU) Report 40, and National Aeronautics and Space Administration (NASA) TP-2011-216155, respectively. The Q(L) relationship is the simplest and most widely used for space dosimetry, but the use of the latter two functions enables consideration of the differences in the track structure of various charged particles during the risk estimation. Therefore, we calculated the mean quality factors in organs and tissues of the ICRP/ICRU reference voxel phantoms for isotropic exposure to various mono-energetic particles using the three Q-functions. The Particle and Heavy Ion Transport code System PHITS was employed to simulate the particle motions inside the phantoms. The effective dose equivalents and the phantom-averaged effective quality factors for the astronauts were then estimated from the calculated mean quality factors multiplied by the fluence-to-dose conversion coefficients and cosmic-ray fluxes inside a spacecraft. It was found from the calculations that QNASA generally gives the largest values of the phantom-averaged effective quality factors among the three Q-functions for neutron, proton, and lighter-ion irradiation, whereas Q(L) provides the largest values for heavier-ion irradiation. Overall, the introduction of QNASA instead of Q(L) or Q(y) in astronaut dosimetry results in an increase in the effective dose equivalents, because the majority of the doses are contributed by protons and neutrons, although this tendency may change depending on the calculation conditions.
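The final step described above, multiplying mean quality factors by fluence-to-dose conversion coefficients and cosmic-ray fluxes, amounts to a weighted sum over particle types and energy bins. A minimal sketch of that bookkeeping; every number below is invented for illustration, not PHITS output or data from the cited reports:

```python
# Hypothetical sketch: effective dose equivalent as the sum over particle
# types and energy bins of Q(E) * d(E) * phi(E) * dE, where Q is the mean
# quality factor, d the fluence-to-dose conversion coefficient (Gy cm^2),
# phi the flux (cm^-2 MeV^-1), and dE the bin width (MeV).

def effective_dose_equivalent(spectra):
    """spectra: {particle: [(Q, d, phi, dE), ...] per energy bin} -> Sv."""
    return sum(Q * d * phi * dE
               for bins in spectra.values()
               for Q, d, phi, dE in bins)

spectra = {  # invented two-bin proton and one-bin neutron spectra
    "proton":  [(1.2, 4.0e-12, 1.0e3, 10.0), (1.5, 6.0e-12, 5.0e2, 10.0)],
    "neutron": [(8.0, 3.0e-12, 2.0e2, 10.0)],
}
H = effective_dose_equivalent(spectra)  # ~1.4e-7 Sv with these inputs
```

Swapping the Q values per bin (Q(L) vs Q(y) vs QNASA) while keeping d and phi fixed is exactly the comparison the study performs at the phantom level.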

  17. SU-E-T-132: Assess the Shielding of Secondary Neutrons From Patient Collimator in Proton Therapy Considering Secondary Photons Generated in the Shielding Process with Monte Carlo Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yamanaka, M; Takashina, M; Kurosu, K

Purpose: In this study we present a Monte Carlo based evaluation of the shielding effect for secondary neutrons from the patient collimator, and of the secondary photons emitted in the process of neutron shielding, using a combination of moderator and boron-10 placed around the patient collimator. Methods: The PHITS Monte Carlo radiation transport code was used to simulate the proton beam (Ep = 64 to 93 MeV) from a proton therapy facility. In this study, moderators (water, polyethylene, and paraffin) and boron (pure {sup 10}B) were placed around the patient collimator in this order. The ratio of moderator and boron thicknesses was varied while fixing the total thickness at 3 cm. The secondary neutron and photon doses were evaluated as the ambient dose equivalent per absorbed dose [H*(10)/D]. Results: The secondary neutrons are shielded more effectively by the combination of moderators and boron. The most effective combination for shielding neutrons is polyethylene 2.4 cm thick with boron 0.6 cm thick, giving a maximum reduction rate of 47.3%. The H*(10)/D of secondary photons in the control case is less than that of neutrons by two orders of magnitude, and the maximum increase of secondary photons is 1.0 µSv/Gy with polyethylene 2.8 cm thick and boron 0.2 cm thick. Conclusion: The combination of moderators and boron is beneficial for shielding secondary neutrons. Both the secondary photons in the control case and those emitted during neutron shielding are far lower than the secondary neutrons, and photons have a lower RBE than neutrons. Therefore the secondary photons can be ignored when shielding the neutrons. This work was supported by JSPS Core-to-Core Program (No.23003).

  18. Prediction of the solar modulation of galactic cosmic rays and radiation dose of aircrews up to the solar cycle 26

    NASA Astrophysics Data System (ADS)

    Miyake, S.; Kataoka, R.; Sato, T.

    2016-12-01

The solar modulation of galactic cosmic rays (GCRs), the variation of the terrestrial GCR flux caused by changes in the heliospheric environment, is basically anti-correlated with solar activity, with the so-called 11-year periodicity. In the current weak solar cycle 24, we expect the flux of GCRs to be higher than in previous solar cycles, leading to increased radiation exposure in space and in the atmosphere. In order to quantitatively evaluate the possible solar modulation of GCRs and the resultant radiation exposure at flight altitude during solar cycles 24, 25, and 26, we have developed a time-dependent, three-dimensional model of the solar modulation of GCRs. Our model can give the flux of GCRs anywhere in the heliosphere by assuming the variation of the solar wind velocity, the strength of the interplanetary magnetic field, and its tilt angle. We solve the curvature and gradient drift motion of GCRs in the heliospheric magnetic field, and therefore reproduce the 22-year variation of the solar modulation of GCRs. It is quantitatively confirmed that our model reproduces the energy spectra observed by BESS and PAMELA. We then calculate the variation of the GCR energy spectra during solar cycles 24, 25, and 26 by extrapolating the solar wind parameters and tilt angle. We also calculate the neutron monitor counting rate and the radiation dose of aircrews at flight altitude, using air-shower simulations performed with PHITS (Particle and Heavy Ion Transport code System). In this presentation, we report the quantitative forecast values of the solar modulation of GCRs, the neutron monitor counting rate, and the radiation dose at flight altitude up to cycle 26, including a discussion of the charge-sign dependence of those results.

  19. SU-F-T-129: Impact of Radial Fluctuations in RBE for Therapeutic Proton Beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Butkus, M; Palmer, T

Purpose: To evaluate the off-axis relative biological effectiveness (RBE) for actively scanned proton beams and determine whether a constant radial RBE can be assumed. Methods: The PHITS Monte Carlo code paired with a microscopic analytical function was used to determine probability distribution functions of the lineal energy in 0.3-µm diameter spheres throughout a water phantom. Twenty million primary protons were simulated for a 0.6-cm diameter pencil beam. Beam energies corresponding to Bragg peak depths of 50, 100, 150, 200, 250, and 300 mm were used and evaluated transversely every millimeter and radially for annuli of 1.0, 2.0, 3.0, 3.2, 3.4, 3.6, 4.0, 5.0, 10.0, 15.0, 20.0, and 25.0 mm outer radius. The acquired probability distributions were reduced to dose-mean lineal energies and applied to the modified microdosimetric kinetic model, for human submandibular gland (HSG) cells, to calculate the relative biological effectiveness (RBE) compared with 60Co beams at the 10% survival threshold. Results: RBE was generally seen to increase with distance from the central axis (CAX). However, this increase was only seen in low-dose regions, and its overall effect on the transverse biological dose remains low. In the entrance region of the phantom (10 mm depth), the difference between minimum and maximum calculated RBEs varied between 15.22 and 18.88% across energies. At the Bragg peak, this difference ranged from 3.15 to 26.77%. Despite these rather large variations, the dose-weighted RBE and the CAX RBE differed by less than 0.14% at 10 mm depth and less than 0.16% at the Bragg peak. Similarly small variations were found at all depths proximal to the Bragg peak. Conclusion: Although proton RBE does vary radially, its overall effect on biological dose is minimal, and the use of a radially constant RBE in treatment planning for scanned proton beams would not produce large errors.
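Why large radial RBE variations barely move the overall figure follows directly from the dose weighting: the mean is sum(D_i * RBE_i) / sum(D_i) over the annuli, so high-RBE annuli with negligible dose contribute almost nothing. A minimal sketch with invented annulus doses and RBEs (not the study's values):

```python
# Dose-weighted mean RBE over radial annuli; all inputs are illustrative.

def dose_weighted_rbe(annuli):
    """annuli: list of (dose, rbe) pairs, one per radial annulus."""
    total_dose = sum(d for d, _ in annuli)
    return sum(d * r for d, r in annuli) / total_dose

annuli = [(10.0, 1.10),   # central axis: dominates the dose
          (1.0,  1.15),   # mid annulus
          (0.01, 1.40)]   # far annulus: high RBE, negligible dose
mean_rbe = dose_weighted_rbe(annuli)  # ~1.105, close to the CAX value
```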

  20. Characteristics of proton beams and secondary neutrons arising from two different beam nozzles

    NASA Astrophysics Data System (ADS)

    Choi, Yeon-Gyeong; Kim, Yu-Seok

    2015-10-01

A tandem or a Van de Graaff accelerator with an energy of 3 MeV is typically used for Proton Induced X-ray Emission (PIXE) analysis. In this study, instead of the typical low-energy accelerator, the beam line used for the PIXE analysis was designed around a 13-MeV cyclotron installed to increase the production of isotopes. For the PIXE analysis, the proton beam should be focused on the target through a nozzle after degrading the proton beam's energy from 13 MeV to 3 MeV with an energy degrader. Previous studies were conducted to determine the most appropriate material for, and thickness of, the energy degrader. From the energy distribution of the degraded proton beam and the neutron occurrence rate at the degrader, an aluminum nozzle of X thickness was determined to be the most appropriate nozzle construction. Neutrons are created by the collision of 3-MeV protons in the nozzle after passage through the energy degrader. In addition, a proton beam of sufficient intensity is required for a non-destructive PIXE analysis. Therefore, if the nozzle design is to be optimized, the number of neutrons that arise from the collision of protons inside the nozzle, as well as the track direction of the generated secondary neutrons, must be considered, with the primary aim of ensuring that a sufficient number of protons pass through the nozzle as a direct beam. A number of laboratories are currently conducting research related to the design of nozzles used with accelerators, mostly in medical fields. This paper presents a comparative analysis of two typical nozzle shapes in order to minimize the loss of protons and the generation of secondary neutrons. The neutron occurrence rate and the number of protons passing through the nozzle were analyzed using the Particle and Heavy Ion Transport code System (PHITS) in order to identify the nozzle that delivers the strongest proton beam.

  1. Does a colour-coded blood pressure diary improve blood pressure control for patients in general practice: the CoCo trial.

    PubMed

    Steurer-Stey, Claudia; Zoller, Marco; Chmiel Moshinsky, Corinne; Senn, Oliver; Rosemann, Thomas

    2010-04-14

Insufficient blood pressure control is a frequent problem despite the existence of effective treatment. Insufficient adherence to self-monitoring as well as to therapy is a common reason. Blood pressure self-measurement at home (Home Blood Pressure Measurement, HBPM) has positive effects on treatment adherence and is helpful in achieving the target blood pressure. Only a few studies have investigated whether adherence to HBPM can be improved through simple measures, resulting also in better blood pressure control. Improvement of self-monitoring and of blood pressure control by using a new colour-coded blood pressure diary. Change in systolic and/or diastolic blood pressure 6 months after using the new colour-coded blood pressure diary. Secondary outcome: Adherence to blood pressure self-measurement (number of measurements/entries). Randomised controlled study. 138 adult patients in primary care with uncontrolled hypertension despite therapy. The control group uses a conventional blood pressure diary; the intervention group uses the new colour-coded blood pressure diary (green, yellow, red, according to a traffic-light system). EXPECTED RESULTS/CONCLUSION: The visual separation of entries into three colour-coded risk areas (green: blood pressure in the target range < 140/< 90 mmHg; red: blood pressure in the danger zone > 180/> 110 mmHg) leads to better self-monitoring compared with the conventional (non-colour-coded) blood pressure booklet. The colour-coded, visualised information supports improved perception (awareness and interpretation) of blood pressure and triggers correct behaviour, in terms of improved adherence to the recommended treatment as well as better communication between patients and doctors, resulting in improved blood pressure control. ClinicalTrials.gov ID NCT01013467.
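The traffic-light logic of the diary can be sketched as a simple classifier. The cut-offs below are assumptions inferred from the target (<140/<90 mmHg) and danger (>180/>110 mmHg) zones in the abstract; the trial's actual yellow band is not fully specified there:

```python
# Hypothetical traffic-light classification of a home blood pressure reading.
# Thresholds are assumptions based on the abstract, not the trial protocol.

def zone(systolic: int, diastolic: int) -> str:
    if systolic > 180 or diastolic > 110:
        return "red"      # danger zone
    if systolic < 140 and diastolic < 90:
        return "green"    # within target range
    return "yellow"       # above target, below danger zone

assert zone(128, 82) == "green"
assert zone(150, 95) == "yellow"
assert zone(185, 100) == "red"
```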

  2. HEC Applications on Columbia Project

    NASA Technical Reports Server (NTRS)

    Taft, Jim

    2004-01-01

NASA's Columbia system consists of a cluster of twenty 512-processor SGI Altix systems. Each of these systems is 3 TFLOP/s in peak performance - approximately the same as the entire compute capability at NAS just one year ago. Each 512p system is a single-system-image machine with one Linux OS, one high-performance file system, and one globally shared memory. The NAS Terascale Applications Group (TAG) is chartered to assist in scaling NASA's mission-critical codes to at least 512p in order to significantly improve emergency response during flight operations, as well as provide significant improvements in the codes and in the rate of scientific discovery across the scientific disciplines within NASA's Missions. Recent accomplishments are 4x improvements to codes in the ocean modeling community, 10x performance improvements in a number of computational fluid dynamics codes used in aero-vehicle design, and 5x improvements in a number of space science codes dealing in extreme physics. The TAG group will continue its scaling work to 2048p and beyond (10240 cpus) as the Columbia system becomes fully operational and the upgrades to the SGI NUMAlink memory fabric are in place. The NUMAlink upgrades dramatically improve system scalability for a single application. These upgrades will allow a number of codes to execute faster at higher fidelity than ever before on any other system, thus increasing the rate of scientific discovery even further.

  3. Evaluation of large girth LDPC codes for PMD compensation by turbo equalization.

    PubMed

    Minkov, Lyubomir L; Djordjevic, Ivan B; Xu, Lei; Wang, Ting; Kueppers, Franko

    2008-08-18

Large-girth quasi-cyclic LDPC codes have been experimentally evaluated for use in PMD compensation by turbo equalization for a 10 Gb/s NRZ optical transmission system, observing one sample per bit. The net effective coding gain improvement of the girth-10, rate-0.906 code of length 11936 over a maximum a posteriori probability (MAP) detector, for a differential group delay of 125 ps, is 6.25 dB at a BER of 10^-6. The girth-10 LDPC code of rate 0.8 outperforms the girth-10 code of rate 0.906 by 2.75 dB, and provides a net effective coding gain improvement of 9 dB at the same BER. It is experimentally determined that girth-10 LDPC codes of length around 15000 approach the channel capacity limit within 1.25 dB.
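Net effective coding gain figures like those above are conventionally the SNR saved at the target BER, discounted by the code-rate overhead. A sketch of that arithmetic with invented SNR values (not the paper's measurements):

```python
import math

# Conventional net coding gain: gross gain (SNR saved at the target BER)
# plus 10*log10(rate), which charges the redundancy of the code.
# The SNR values below are invented for illustration.

def net_coding_gain_db(snr_uncoded_db: float, snr_coded_db: float,
                       rate: float) -> float:
    gross = snr_uncoded_db - snr_coded_db   # dB saved at the target BER
    return gross + 10 * math.log10(rate)    # rate penalty (negative for rate < 1)

necg = net_coding_gain_db(14.0, 7.0, 0.906)  # ~6.57 dB with these inputs
```

This is why a lower-rate code must save noticeably more raw SNR before its net gain exceeds that of a higher-rate code.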

  4. Implementation of radiation shielding calculation methods. Volume 1: Synopsis of methods and summary of results

    NASA Technical Reports Server (NTRS)

    Capo, M. A.; Disney, R. K.

    1971-01-01

The work performed in the following areas is summarized: (1) A realistic nuclear-propelled vehicle was analyzed using the Marshall Space Flight Center computer code package. This code package includes one- and two-dimensional discrete ordinate transport, point kernel, and single-scatter techniques, as well as cross-section preparation and data-processing codes. (2) Techniques were developed to improve the automated data transfer in the coupled computation method of the computer code package and to improve the utilization of this code package on the Univac-1108 computer system. (3) The MSFC master data libraries were updated.

  5. Deep Learning Methods for Improved Decoding of Linear Codes

    NASA Astrophysics Data System (ADS)

    Nachmani, Eliya; Marciano, Elad; Lugosch, Loren; Gross, Warren J.; Burshtein, David; Be'ery, Yair

    2018-02-01

The problem of low-complexity, close-to-optimal channel decoding of linear codes with short to moderate block length is considered. It is shown that deep learning methods can be used to improve a standard belief propagation decoder, despite the large example space. Similar improvements are obtained for the min-sum algorithm. It is also shown that tying the parameters of the decoders across iterations, so as to form a recurrent neural network architecture, can be implemented with comparable results. The advantage is that significantly fewer parameters are required. We also introduce a recurrent neural decoder architecture based on the method of successive relaxation. Improvements over standard belief propagation are also observed on sparser Tanner graph representations of the codes. Furthermore, we demonstrate that the neural belief propagation decoder can be used to improve the performance, or alternatively reduce the computational complexity, of a close-to-optimal decoder of short BCH codes.
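The plain min-sum baseline that such neural decoders improve on (by learning per-edge weights) can be sketched for a toy code; the (7,4) Hamming parity-check matrix below is a stand-in for the BCH/LDPC codes in the paper, not taken from it:

```python
import numpy as np

# Parity-check matrix of a (7,4) Hamming code -- a toy stand-in.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def min_sum_decode(llr, H, iters=10):
    """Plain (unweighted) min-sum decoding of channel LLRs (positive = bit 0)."""
    m, n = H.shape
    c2v = np.zeros((m, n))                  # check-to-variable messages
    hard = llr < 0
    for _ in range(iters):
        total = llr + c2v.sum(axis=0)       # posterior LLR per variable
        v2c = total - c2v                   # variable-to-check messages
        for i in range(m):
            idx = np.flatnonzero(H[i])
            for j in idx:                   # sign product * min magnitude
                others = v2c[i, idx[idx != j]]
                c2v[i, j] = np.prod(np.sign(others)) * np.min(np.abs(others))
        hard = (llr + c2v.sum(axis=0)) < 0  # hard decision
        if not np.any((H @ hard.astype(int)) % 2):
            break                           # all parity checks satisfied
    return hard.astype(int)

# All-zero codeword with bit 0 flipped by the channel: min-sum corrects it.
decoded = min_sum_decode(np.array([-2.0, 2, 2, 2, 2, 2, 2]), H)
```

A neural variant in the spirit of the paper would multiply each message by a trainable weight and unroll the iterations as network layers; the message-passing structure stays the same.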

  6. Error-correction coding

    NASA Technical Reports Server (NTRS)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.

  7. Improving the sensitivity and specificity of the abbreviated injury scale coding system.

    PubMed Central

    Kramer, C F; Barancik, J I; Thode, H C

    1990-01-01

    The Abbreviated Injury Scale with Epidemiologic Modifications (AIS 85-EM) was developed to make it possible to code information about anatomic injury types and locations that, although generally available from medical records, is not codable under the standard Abbreviated Injury Scale, published by the American Association for Automotive Medicine in 1985 (AIS 85). In a population-based sample of 3,223 motor vehicle trauma cases, 68 percent of the patients had one or more injuries that were coded to the AIS 85 body region nonspecific category external. When the same patients' injuries were coded using the AIS 85-EM coding procedure, only 15 percent of the patients had injuries that could not be coded to a specific body region. With AIS 85-EM, the proportion of codable head injury cases increased from 16 percent to 37 percent, thereby improving the potential for identifying cases with head and threshold brain injury. The data suggest that body region coding of all injuries is necessary to draw valid and reliable conclusions about changes in injury patterns and their sequelae. The increased specificity of body region coding improves assessments of the efficacy of injury intervention strategies and countermeasure programs using epidemiologic methodology. PMID:2116633

  8. Enhancing Scalability and Efficiency of the TOUGH2_MP for LinuxClusters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Keni; Wu, Yu-Shu

    2006-04-17

TOUGH2_MP, the parallel version of the TOUGH2 code, has been enhanced by implementing more efficient communication schemes. This enhancement is achieved through reducing the number of small messages and the volume of large messages. The message exchange speed is further improved by using non-blocking communications for both linear and nonlinear iterations. In addition, we have modified the AZTEC parallel linear-equation solver to use non-blocking communication. Through improved code structuring and bug fixing, the new version of the code is now more stable, while demonstrating similar or even better nonlinear iteration convergence speed than the original TOUGH2 code. As a result, the new version of TOUGH2_MP is significantly more efficient. In this paper, the scalability and efficiency of the parallel code are demonstrated by solving two large-scale problems. The testing results indicate that the speedup of the code may depend on both problem size and complexity. In general, the code has excellent scalability in memory requirement as well as computing time.

  9. Overview of NASA Multi-dimensional Stirling Convertor Code Development and Validation Effort

    NASA Technical Reports Server (NTRS)

    Tew, Roy C.; Cairelli, James E.; Ibrahim, Mounir B.; Simon, Terrence W.; Gedeon, David

    2002-01-01

    A NASA grant has been awarded to Cleveland State University (CSU) to develop a multi-dimensional (multi-D) Stirling computer code with the goals of improving loss predictions and identifying component areas for improvement. The University of Minnesota (UMN) and Gedeon Associates are teamed with CSU. Development of test rigs at UMN and CSU and validation of the code against test data are part of the effort. The one-dimensional (1-D) Stirling codes used for design and performance prediction do not rigorously model regions of the working space where abrupt changes in flow area occur (such as manifolds and other transitions between components). Certain hardware experiences have demonstrated large performance gains by varying manifold and heat exchanger designs to improve flow distributions in the heat exchangers; 1-D codes were not able to predict these performance gains. An accurate multi-D code should improve understanding of the effects of area changes along the main flow axis, of the sensitivity of performance to slight changes in internal geometry, and, in general, of the various internal thermodynamic losses. The commercial CFD-ACE code has been chosen for development of the multi-D code. This 2-D/3-D code has highly developed pre- and post-processors and moving-boundary capability. Preliminary attempts at validation of CFD-ACE models of the MIT gas spring and "two space" test rigs were encouraging. Also, CSU's simulations of the UMN oscillating-flow rig compare well with flow visualization results from UMN. A complementary Department of Energy (DOE) Regenerator Research effort is aiding in development of the regenerator matrix models that will be used in the multi-D Stirling code. This paper reports on the progress and challenges of this effort.

  10. CFD Code Development for Combustor Flows

    NASA Technical Reports Server (NTRS)

    Norris, Andrew

    2003-01-01

    During the lifetime of this grant, work was performed in the areas of model development, code development, code validation and code application. Model development included the PDF combustion module, chemical kinetics based on thermodynamics, neural network storage of chemical kinetics, ILDM chemical kinetics and assumed-PDF work. Many of these models were then implemented in the code, and many improvements were made to the code itself, including the addition of new chemistry integrators, property evaluation schemes, new chemistry models and turbulence-chemistry interaction methodology. All new models and code improvements were validated, and the code was applied to the ZCET program and the NPSS GEW combustor program. Several important items remain under development, including NOx post-processing, assumed-PDF model development and chemical kinetics development. It is expected that this work will continue under the new grant.

  11. Side information in coded aperture compressive spectral imaging

    NASA Astrophysics Data System (ADS)

    Galvis, Laura; Arguello, Henry; Lau, Daniel; Arce, Gonzalo R.

    2017-02-01

    Coded aperture compressive spectral imagers sense a three-dimensional cube by using two-dimensional projections of the coded and spectrally dispersed source. These imaging systems often rely on FPA detectors, SLMs, digital micromirror devices (DMDs), and dispersive elements. The use of DMDs to implement the coded apertures facilitates the capture of multiple projections, each admitting a different coded aperture pattern. The DMD makes it possible not only to collect a sufficient number of measurements for spectrally rich or spatially detailed scenes, but also to design the spatial structure of the coded apertures to maximize the information content of the compressive measurements. Although sparsity is the only signal characteristic usually assumed for reconstruction in compressive sensing, other forms of prior information, such as side information, have been included as a way to improve the quality of the reconstructions. This paper presents the coded aperture design in a compressive spectral imager with side information in the form of RGB images of the scene. The use of RGB images as side information in the compressive sensing architecture has two main advantages: the RGB image is used not only to improve the reconstruction quality but also to optimally design the coded apertures for the sensing process. The coded aperture design is based on the RGB scene, and thus the coded aperture structure exploits key features such as scene edges. Reconstructions from real, noisy compressed measurements demonstrate the benefit of the designed coded apertures in addition to the improvement in reconstruction quality obtained by the use of side information.
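    The forward sensing model described above (a coded aperture modulates the scene, a dispersive element shears the spectral bands, and the detector integrates them into one 2-D measurement) can be sketched in a highly simplified toy form. Everything here is an illustrative assumption, not the paper's actual optical model:

```python
def cassi_snapshot(cube, mask):
    """Toy single-shot model: a binary coded aperture modulates the scene,
    a dispersive element shifts spectral band b by b detector columns, and
    the FPA integrates all bands into one 2-D measurement."""
    rows, cols, bands = len(cube), len(cube[0]), len(cube[0][0])
    det = [[0.0] * (cols + bands - 1) for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            for b in range(bands):
                det[r][c + b] += mask[r][c] * cube[r][c][b]
    return det

# 1x2-pixel scene with 2 spectral bands; the mask blocks the second column
snapshot = cassi_snapshot([[[1.0, 2.0], [3.0, 4.0]]], [[1, 0]])
```

    Designing the binary `mask` from an RGB image of the scene, as the paper proposes, amounts to choosing which entries of this forward operator are nonzero.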

  12. Throughput of Coded Optical CDMA Systems with AND Detectors

    NASA Astrophysics Data System (ADS)

    Memon, Kehkashan A.; Umrani, Fahim A.; Umrani, A. W.; Umrani, Naveed A.

    2012-09-01

    Conventional detection techniques used in optical code-division multiple access (OCDMA) systems are not optimal and result in poor bit error rate performance. This paper analyzes the coded performance of optical CDMA systems with AND detectors for enhanced throughput efficiency and improved error rate performance. The results show that the use of AND detectors significantly improves the performance of an optical channel.
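    Why an AND detector outperforms simple threshold detection can be seen in a toy on-off-keying model (hypothetical chip patterns, not the paper's system): interference must light every chip position of the user's code to cause a false detection, rather than merely enough positions to cross a threshold.

```python
def and_detector(chips, positions):
    """Decide '1' only if every chip position of the user's code is lit."""
    return int(all(chips[p] for p in positions))

def threshold_detector(chips, positions, thresh):
    """Conventional detection: count lit code positions against a threshold."""
    return int(sum(chips[p] for p in positions) >= thresh)

# User's code occupies chips 0, 3, 5; interference lights chips 3 and 5 only.
interference_only = [0, 0, 0, 1, 0, 1, 0]
```

    Here the threshold detector (with threshold 2) false-alarms on the interference pattern, while the AND detector does not.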

  13. Extensions and improvements on XTRAN3S

    NASA Technical Reports Server (NTRS)

    Borland, C. J.

    1989-01-01

    Improvements to the XTRAN3S computer program are summarized. Work on this code, for steady and unsteady aerodynamic and aeroelastic analysis in the transonic flow regime has concentrated on the following areas: (1) Maintenance of the XTRAN3S code, including correction of errors, enhancement of operational capability, and installation on the Cray X-MP system; (2) Extension of the vectorization concepts in XTRAN3S to include additional areas of the code for improved execution speed; (3) Modification of the XTRAN3S algorithm for improved numerical stability for swept, tapered wing cases and improved computational efficiency; and (4) Extension of the wing-only version of XTRAN3S to include pylon and nacelle or external store capability.

  14. How do primary care doctors in England and Wales code and manage people with chronic kidney disease? Results from the National Chronic Kidney Disease Audit.

    PubMed

    Kim, Lois G; Cleary, Faye; Wheeler, David C; Caplin, Ben; Nitsch, Dorothea; Hull, Sally A

    2017-10-16

    In the UK, primary care records are electronic and require doctors to ascribe disease codes to direct care plans and facilitate safe prescribing. We investigated factors associated with coding of chronic kidney disease (CKD) in patients with reduced kidney function and the impact this has on patient management. We identified patients meeting biochemical criteria for CKD (two estimated glomerular filtration rates <60 mL/min/1.73 m2 taken >90 days apart) from 1039 general practitioner (GP) practices in a UK audit. Clustered logistic regression was used to identify factors associated with coding for CKD and improvement in coding as a result of the audit process. We investigated the relationship between coding and five interventions recommended for CKD: achieving blood pressure targets, proteinuria testing, statin prescription and flu and pneumococcal vaccination. Of 256 000 patients with biochemical CKD, 30% did not have a GP CKD code. Males, older patients, those with more severe CKD, diabetes or hypertension or those prescribed statins were more likely to have a CKD code. Among those with continued biochemical CKD following audit, these same characteristics increased the odds of improved coding. Patients without any kidney diagnosis were less likely to receive optimal care than those coded for CKD [e.g. odds ratio for meeting blood pressure target 0.78 (95% confidence interval 0.76-0.79)]. Older age, male sex, diabetes and hypertension are associated with coding for those with biochemical CKD. CKD coding is associated with receiving key primary care interventions recommended for CKD. Increased efforts to incentivize CKD coding may improve outcomes for CKD patients. © The Author 2017. Published by Oxford University Press on behalf of ERA-EDTA.
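    The odds ratios reported in this audit are standard 2×2-table quantities; a minimal sketch of the computation with a Wald 95% confidence interval, using hypothetical counts chosen purely for illustration (not the audit's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = coded & met target, b = coded & missed target,
    c = uncoded & met target, d = uncoded & missed target."""
    or_ = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration only
or_, lo, hi = odds_ratio_ci(7000, 3000, 2500, 1500)
```

    The study's clustered logistic regression additionally adjusts for practice-level clustering, which this raw 2×2 calculation does not.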

  15. An international survey of building energy codes and their implementation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Meredydd; Roshchanka, Volha; Graham, Peter

    Buildings are key to low-carbon development everywhere, and many countries have introduced building energy codes to improve energy efficiency in buildings. Yet, building energy codes can only deliver results when the codes are implemented. For this reason, studies of building energy codes need to consider implementation of building energy codes in a consistent and comprehensive way. This research identifies elements and practices in implementing building energy codes, covering codes in 22 countries that account for 70% of global energy demand from buildings. Access to the benefits of building energy codes depends on comprehensive coverage of buildings by type, age, size, and geographic location; an implementation framework that involves a certified agency to inspect construction at critical stages; and independently tested, rated, and labeled building energy materials. Training and supporting tools are another element of successful code implementation, and their role is growing in importance, given the increasing flexibility and complexity of building energy codes. Some countries have also introduced compliance evaluation and compliance checking protocols to improve implementation. This article provides examples of practices that countries have adopted to assist with implementation of building energy codes.

  16. Performance of data-compression codes in channels with errors. Final report, October 1986-January 1987

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1987-10-01

    Huffman codes, comma-free codes, and block codes with shift indicators are important candidate message-compression codes for improving the efficiency of communications systems. This study was undertaken to determine if these codes could be used to increase the throughput of the fixed very-low-frequency (FVLF) communication system. This application involves the use of compression codes in a channel with errors.
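    The compression side of the trade-off studied here starts from how such codes are built. As a minimal illustration (a textbook Huffman construction, not the FVLF system's actual code):

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a Huffman code table {symbol: bitstring} from symbol counts."""
    heap = [(w, i, s) for i, (s, w) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    tiebreak = len(heap)                 # unique ints keep tuple compares safe
    while len(heap) > 1:
        w1, _, t1 = heapq.heappop(heap)  # two lightest subtrees
        w2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, tiebreak, (t1, t2)))
        tiebreak += 1
    table = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):      # internal node: branch 0/1
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:                            # leaf symbol
            table[tree] = prefix or "0"
    walk(heap[0][2], "")
    return table

table = huffman_code("aaaabbc")
encoded = "".join(table[ch] for ch in "aaaabbc")
```

    The resulting code is prefix-free, which is exactly why a single channel bit error can desynchronize the decoder for many symbols, the failure mode this report studies.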

  17. Improved convolutional coding

    NASA Technical Reports Server (NTRS)

    Doland, G. D.

    1970-01-01

    Convolutional coding, used to upgrade digital data transmission under adverse signal conditions, has been improved by a method which ensures data transitions, permitting bit synchronizer operation at lower signal levels. Method also increases decoding ability by removing ambiguous condition.

  18. Photoionization and High Density Gas

    NASA Technical Reports Server (NTRS)

    Kallman, T.; Bautista, M.; White, Nicholas E. (Technical Monitor)

    2002-01-01

    We present results of calculations using the XSTAR version 2 computer code. This code is loosely based on the XSTAR v.1 code, which has been available for public use for some time. However, it represents an improvement and update in several major respects, including atomic data, code structure, user interface, and an improved physical description of ionization/excitation. In particular, it is now applicable to high density situations in which significant excited atomic level populations are likely to occur. We describe the computational techniques and assumptions, and present sample runs with particular emphasis on high density situations.

  19. Engine dynamic analysis with general nonlinear finite element codes. Part 2: Bearing element implementation, overall numerical characteristics and benchmarking

    NASA Technical Reports Server (NTRS)

    Padovan, J.; Adams, M.; Fertis, J.; Zeid, I.; Lam, P.

    1982-01-01

    Finite element codes are used in modelling the rotor-bearing-stator structures common to the turbine industry. Strategies are developed that enable engine dynamic simulation with available general-purpose finite element codes. The bearing elements developed are benchmarked by incorporating them into a general-purpose code (ADINA), and the numerical characteristics of finite-element rotor-bearing-stator simulations are evaluated through the use of various types of explicit and implicit numerical integration operators. The overall numerical efficiency of the procedure is also improved.

  20. Coding for spread spectrum packet radios

    NASA Technical Reports Server (NTRS)

    Omura, J. K.

    1980-01-01

    Packet radios are often expected to operate in a radio communication network environment where there tend to be man-made interference signals. To combat such interference, spread spectrum waveforms are being considered for some applications. The use of convolutional coding with Viterbi decoding to further improve the performance of spread spectrum packet radios is examined. At a bit error rate of 0.00001, improvements in performance of 4 dB to 5 dB can easily be achieved with such coding without any change in data rate or spread spectrum bandwidth. This coding gain is more dramatic in an interference environment.
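    The coding gain quoted above comes from convolutional coding with Viterbi decoding. A self-contained sketch of a textbook rate-1/2, constraint-length-3 code (generators 7 and 5 octal, an illustrative choice rather than the radios' actual code) shows the encode/decode round trip and the correction of a channel bit error:

```python
def conv_encode(bits, gens=(0b111, 0b101), k=3):
    """Rate-1/2 convolutional encoder, constraint length 3, zero-flushed."""
    state, out = 0, []
    for b in bits + [0] * (k - 1):               # flush with k-1 zeros
        state = ((state << 1) | b) & ((1 << k) - 1)
        out += [bin(state & g).count("1") & 1 for g in gens]
    return out

def viterbi_decode(rx, gens=(0b111, 0b101), k=3, nbits=None):
    """Hard-decision Viterbi decoding over the 2**(k-1)-state trellis."""
    n_states = 1 << (k - 1)
    INF = float("inf")
    metric = [0.0] + [INF] * (n_states - 1)      # encoder starts in state 0
    paths = [[] for _ in range(n_states)]
    for t in range(0, len(rx), 2):
        new_metric = [INF] * n_states
        new_paths = [None] * n_states
        for s in range(n_states):
            if metric[s] == INF:
                continue
            for b in (0, 1):
                reg = ((s << 1) | b) & ((1 << k) - 1)   # shift-register content
                nxt = reg & (n_states - 1)              # next trellis state
                expect = [bin(reg & g).count("1") & 1 for g in gens]
                dist = (expect[0] != rx[t]) + (expect[1] != rx[t + 1])
                if metric[s] + dist < new_metric[nxt]:  # keep the survivor
                    new_metric[nxt] = metric[s] + dist
                    new_paths[nxt] = paths[s] + [b]
        metric, paths = new_metric, new_paths
    best = min(range(n_states), key=lambda s: metric[s])
    return paths[best][:nbits] if nbits else paths[best]

decoded = viterbi_decode(conv_encode([1, 0, 1, 1, 0]), nbits=5)
```

    With free distance 5, this toy code corrects any two channel bit errors within a constraint span, which is the mechanism behind the dB-level gains the paper reports for its (unspecified here) code parameters.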

  1. Improvements and applications of COBRA-TF for stand-alone and coupled LWR safety analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Avramova, M.; Cuervo, D.; Ivanov, K.

    2006-07-01

    The advanced thermal-hydraulic subchannel code COBRA-TF has recently been improved and applied to stand-alone and coupled LWR core calculations at the Pennsylvania State Univ. in cooperation with AREVA NP GmbH (Germany) and the Technical Univ. of Madrid. To prepare COBRA-TF for academic and industrial applications, including safety margin evaluations and LWR core design analyses, the code programming, numerics, and basic models were revised and substantially improved. The code has undergone an extensive validation, verification, and qualification program. (authors)

  2. Study of proton- and deuteron-induced spallation reactions on the long-lived fission product 93Zr at 105 MeV/nucleon in inverse kinematics

    NASA Astrophysics Data System (ADS)

    Kawase, Shoichiro; Nakano, Keita; Watanabe, Yukinobu; Wang, He; Otsu, Hideaki; Sakurai, Hiroyoshi; Ahn, Deuk Soon; Aikawa, Masayuki; Ando, Takashi; Araki, Shouhei; Chen, Sidong; Chiga, Nobuyuki; Doornenbal, Pieter; Fukuda, Naoki; Isobe, Tadaaki; Kawakami, Shunsuke; Kin, Tadahiro; Kondo, Yosuke; Koyama, Shunpei; Kubono, Shigeru; Maeda, Yukie; Makinaga, Ayano; Matsushita, Masafumi; Matsuzaki, Teiichiro; Michimasa, Shin'ichiro; Momiyama, Satoru; Nagamine, Shunsuke; Nakamura, Takashi; Niikura, Megumi; Ozaki, Tomoyuki; Saito, Atsumi; Saito, Takeshi; Shiga, Yoshiaki; Shikata, Mizuki; Shimizu, Yohei; Shimoura, Susumu; Sumikama, Toshiyuki; Söderström, Pär-Anders; Suzuki, Hiroshi; Takeda, Hiroyuki; Takeuchi, Satoshi; Taniuchi, Ryo; Togano, Yasuhiro; Tsubota, Jun'ichi; Uesaka, Meiko; Watanabe, Yasushi; Wimmer, Kathrin; Yamamoto, Tatsuya; Yoshida, Koichi

    2017-09-01

    Spallation reactions for the long-lived fission product ^{93}Zr have been studied in order to provide basic data necessary for nuclear waste transmutation. Isotopic-production cross sections via proton- and deuteron-induced spallation reactions on ^{93}Zr at 105 MeV/nucleon were measured in inverse kinematics at the RIKEN Radioactive Isotope Beam Factory. Remarkable jumps in isotopic production originating from the neutron magic number N=50 were observed in Zr and Y isotopes. The experimental results were compared to the PHITS calculations considering both the intranuclear cascade and evaporation processes, and the calculations greatly overestimated the measured production yield, corresponding to few-nucleon-removal reactions. The present data suggest that the spallation reaction is a potential candidate for the treatment of ^{93}Zr in spent nuclear fuel.

  3. Analysis of Soft Error Rates in 65- and 28-nm FD-SOI Processes Depending on BOX Region Thickness and Body Bias by Monte-Carlo Based Simulations

    NASA Astrophysics Data System (ADS)

    Zhang, Kuiyuan; Umehara, Shigehiro; Yamaguchi, Junki; Furuta, Jun; Kobayashi, Kazutoshi

    2016-08-01

    This paper analyzes how body bias and BOX region thickness affect soft error rates in 65-nm SOTB (Silicon on Thin BOX) and 28-nm UTBB (Ultra Thin Body and BOX) FD-SOI processes. Soft errors are induced by alpha-particle and neutron irradiation and the results are then analyzed by Monte Carlo based simulation using PHITS-TCAD. The alpha-particle-induced single event upset (SEU) cross-section and neutron-induced soft error rate (SER) obtained by simulation are consistent with measurement results. We clarify that SERs decreased in response to an increase in the BOX thickness for SOTB while SERs in UTBB are independent of BOX thickness. We also discover SOTB develops a higher tolerance to soft errors when reverse body bias is applied while UTBB become more susceptible.

  4. New shielding material development for compact accelerator-driven neutron source

    NASA Astrophysics Data System (ADS)

    Hu, Guang; Hu, Huasi; Wang, Sheng; Han, Hetong; Otake, Y.; Pan, Ziheng; Taketani, A.; Ota, H.; Hashiguchi, Takao; Yan, Mingfei

    2017-04-01

    The Compact Accelerator-driven Neutron Source (CANS), and especially the transportable neutron source, calls for highly effective shielding material, so a new shielding material is developed in this investigation. The composition of the shielding material was designed and many samples were manufactured, after which attenuation experiments were carried out. In these measurements, detector dead time becomes significant when the proton beam is too strong; to establish the linear and nonlinear ranges of the detector, two proton beam currents were employed in the Pb attenuation measurements. The transmission ratios of the new shielding material, polyethylene (PE), PE + Pb, and BPE + Pb were then measured at a suitable proton current. Since the experimental neutron and γ-ray signals appear together, MCNP and PHITS simulations were applied to assist the analysis. The new shielding material reduces weight and volume compared with BPE + Pb and PE + Pb.

  5. Pre-engineering Spaceflight Validation of Environmental Models and the 2005 HZETRN Simulation Code

    NASA Technical Reports Server (NTRS)

    Nealy, John E.; Cucinotta, Francis A.; Wilson, John W.; Badavi, Francis F.; Dachev, Ts. P.; Tomov, B. T.; Walker, Steven A.; DeAngelis, Giovanni; Blattnig, Steve R.; Atwell, William

    2006-01-01

    The HZETRN code has been identified by NASA for engineering design in the next phase of space exploration, highlighting a return to the Moon in preparation for a Mars mission. In response, a new series of algorithms, beginning with 2005 HZETRN, will be issued, correcting some prior limitations and improving control of propagated errors, along with established code verification processes. Code validation will use new and improved low Earth orbit (LEO) environmental models with a recently improved International Space Station (ISS) shield model to validate computational models and procedures against measured data taken aboard ISS. These validated models will provide a basis for flight-testing the designs of future space vehicles and systems of the Constellation program in the LEO environment.

  6. A reduced complexity highly power/bandwidth efficient coded FQPSK system with iterative decoding

    NASA Technical Reports Server (NTRS)

    Simon, M. K.; Divsalar, D.

    2001-01-01

    Based on a representation of FQPSK as a trellis-coded modulation, this paper investigates the potential improvement in power efficiency obtained from the application of simple outer codes to form a concatenated coding arrangement with iterative decoding.

  7. Exploring a QoS Driven Scheduling Approach for Peer-to-Peer Live Streaming Systems with Network Coding

    PubMed Central

    Cui, Laizhong; Lu, Nan; Chen, Fu

    2014-01-01

    Most large-scale peer-to-peer (P2P) live streaming systems use a mesh to organize peers and leverage pull scheduling to transmit packets, providing robustness in dynamic environments. Pull scheduling, however, brings large packet delay. Network coding makes push scheduling feasible in mesh P2P live streaming and improves its efficiency, but it may also introduce extra delay and coding computational overhead. To improve packet delay, streaming quality, and coding overhead, we propose a QoS-driven push scheduling approach. The main contributions of this paper are as follows: (i) we introduce a new network coding method to increase the content diversity and reduce the complexity of scheduling; (ii) we formulate the push scheduling as an optimization problem and transform it to a min-cost flow problem in order to solve it in polynomial time; (iii) we propose a push scheduling algorithm to reduce the coding overhead and do extensive experiments to validate the effectiveness of our approach. Compared with previous approaches, the simulation results demonstrate that packet delay, continuity index, and coding ratio of our system can be significantly improved, especially in dynamic environments. PMID:25114968
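    The core mechanism that makes push scheduling with network coding work, peers pushing random linear combinations of source blocks until a receiver's coefficient matrix reaches full rank, can be sketched over GF(2). This is an illustrative simplification; the paper's own coding method and its min-cost flow formulation are not reproduced here:

```python
import random

def gf2_rank(rows):
    """Rank over GF(2); each row is an int bitmask of coding coefficients."""
    basis = {}                       # highest set bit -> basis row
    for r in rows:
        while r:
            hb = r.bit_length() - 1
            if hb in basis:
                r ^= basis[hb]       # reduce by the existing pivot
            else:
                basis[hb] = r        # r is independent: new pivot
                break
    return len(basis)

def push_coded_packets(n_blocks, n_packets, seed=0):
    """Each pushed packet XORs a random nonzero subset of the source blocks;
    the bitmask records which blocks were combined."""
    rng = random.Random(seed)
    return [rng.randrange(1, 1 << n_blocks) for _ in range(n_packets)]

# A peer can decode the generation once its received coefficient
# vectors span GF(2)^n, i.e. the rank reaches n_blocks.
pkts = push_coded_packets(n_blocks=8, n_packets=12, seed=1)
decodable = gf2_rank(pkts) == 8
```

    Pushing coded packets rather than specific ones is what removes the per-packet request round trip, at the cost of the Gaussian-elimination decoding overhead the paper tries to reduce.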

  8. Surface code implementation of block code state distillation.

    PubMed

    Fowler, Austin G; Devitt, Simon J; Jones, Cody

    2013-01-01

    State distillation is the process of taking a number of imperfect copies of a particular quantum state and producing fewer better copies. Until recently, the lowest overhead method of distilling states produced a single improved |A〉 state given 15 input copies. New block code state distillation methods can produce k improved |A〉 states given 3k + 8 input copies, potentially significantly reducing the overhead associated with state distillation. We construct an explicit surface code implementation of block code state distillation and quantitatively compare the overhead of this approach to the old. We find that, using the best available techniques, for parameters of practical interest, block code state distillation does not always lead to lower overhead, and, when it does, the overhead reduction is typically less than a factor of three.
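    The raw input-copy arithmetic behind the comparison is easy to check; a few lines contrasting the 15-to-1 round with the (3k + 8)-to-k block round. This counts only raw copies per round — as the abstract stresses, the actual overhead comparison must account for surface-code spacetime volume, which this sketch does not:

```python
def inputs_per_output(k):
    """Raw |A> copies consumed per improved output state in one round."""
    old_15_to_1 = 15.0            # 15 inputs -> 1 output
    block = (3 * k + 8) / k       # block code: 3k+8 inputs -> k outputs
    return old_15_to_1, block

# For k = 8 the block round uses 32 inputs for 8 outputs, i.e. 4 per output;
# as k grows, the ratio approaches 3 inputs per output.
old, block = inputs_per_output(8)
```

    The fivefold raw-copy advantage at large k is precisely what does not translate into a fivefold overhead reduction once the surface-code implementation cost is included.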

  9. Surface code implementation of block code state distillation

    PubMed Central

    Fowler, Austin G.; Devitt, Simon J.; Jones, Cody

    2013-01-01

    State distillation is the process of taking a number of imperfect copies of a particular quantum state and producing fewer better copies. Until recently, the lowest overhead method of distilling states produced a single improved |A〉 state given 15 input copies. New block code state distillation methods can produce k improved |A〉 states given 3k + 8 input copies, potentially significantly reducing the overhead associated with state distillation. We construct an explicit surface code implementation of block code state distillation and quantitatively compare the overhead of this approach to the old. We find that, using the best available techniques, for parameters of practical interest, block code state distillation does not always lead to lower overhead, and, when it does, the overhead reduction is typically less than a factor of three. PMID:23736868

  10. Summary of papers on current and anticipated uses of thermal-hydraulic codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Caruso, R.

    1997-07-01

    The author reviews a range of recent papers which discuss possible uses and future development needs for thermal/hydraulic codes in the nuclear industry. From this review, eight common recommendations are extracted. They are: improve the user interface so that more people can use the code, so that models are easier and less expensive to prepare and maintain, and so that the results are scrutable; design the code so that it can easily be coupled to other codes, such as core physics, containment, fission product behaviour during severe accidents; improve the numerical methods to make the code more robust and especially faster running, particularly for low pressure transients; ensure that future code development includes assessment of code uncertainties as an integral part of code verification and validation; provide extensive user guidelines or structure the code so that the `user effect` is minimized; include the capability to model multiple fluids (gas and liquid phase); design the code in a modular fashion so that new models can be added easily; provide the ability to include detailed or simplified component models; build on work previously done with other codes (RETRAN, RELAP, TRAC, CATHARE) and other code validation efforts (CSAU, CSNI SET and IET matrices).

  11. Xylose utilizing Zymomonas mobilis with improved ethanol production in biomass hydrolysate medium

    DOEpatents

    Caimi, Perry G; Hitz, William D; Viitanen, Paul V; Stieglitz, Barry

    2013-10-29

    Xylose-utilizing, ethanol-producing strains of Zymomonas mobilis with improved performance in medium comprising biomass hydrolysate were isolated using an adaptation process. Independently isolated strains were found to have independent mutations in the same coding region. Mutation in this coding region may be engineered to confer the improved phenotype.

  12. Xylose utilizing zymomonas mobilis with improved ethanol production in biomass hydrolysate medium

    DOEpatents

    Caimi, Perry G; Hitz, William D; Stieglitz, Barry; Viitanen, Paul V

    2013-07-02

    Xylose-utilizing, ethanol-producing strains of Zymomonas mobilis with improved performance in medium comprising biomass hydrolysate were isolated using an adaptation process. Independently isolated strains were found to have independent mutations in the same coding region. Mutation in this coding region may be engineered to confer the improved phenotype.

  13. Sensitivity of Claims-Based Algorithms to Ascertain Smoking Status More Than Doubled with Meaningful Use.

    PubMed

    Huo, Jinhai; Yang, Ming; Tina Shih, Ya-Chen

    2018-03-01

    The "meaningful use of certified electronic health record" policy requires eligible professionals to record smoking status for more than 50% of all individuals aged 13 years or older in 2011 to 2012. To explore whether the coding to document smoking behavior has increased over time and to assess the accuracy of smoking-related diagnosis and procedure codes in identifying previous and current smokers. We conducted an observational study with 5,423,880 enrollees from the year 2009 to 2014 in the Truven Health Analytics database. Temporal trends of smoking coding, sensitivity, specificity, positive predictive value, and negative predictive value were measured. The rate of coding of smoking behavior improved significantly by the end of the study period. The proportion of patients in the claims data recorded as current smokers increased 2.3-fold and the proportion of patients recorded as previous smokers increased 4-fold during the 6-year period. The sensitivity of each International Classification of Diseases, Ninth Revision, Clinical Modification code was generally less than 10%. The diagnosis code of tobacco use disorder (305.1X) was the most sensitive code (9.3%) for identifying smokers. The specificities of these codes and the Current Procedural Terminology codes were all more than 98%. A large improvement in the coding of current and previous smoking behavior has occurred since the inception of the meaningful use policy. Nevertheless, the use of diagnosis and procedure codes to identify smoking behavior in administrative data is still unreliable. This suggests that quality improvements toward medical coding on smoking behavior are needed to enhance the capability of claims data for smoking-related outcomes research. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
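    The accuracy measures reported here are standard confusion-matrix quantities. A minimal sketch with hypothetical counts, chosen to echo (not reproduce) the reported pattern of low sensitivity with very high specificity:

```python
def claims_accuracy(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV of a claims-based smoking flag
    against a reference standard (e.g. chart review)."""
    return {
        "sensitivity": tp / (tp + fn),   # flagged smokers / all true smokers
        "specificity": tn / (tn + fp),   # unflagged nonsmokers / all nonsmokers
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Hypothetical counts: the code catches few true smokers but rarely misfires
m = claims_accuracy(tp=93, fp=10, fn=907, tn=4000)
```

    With these illustrative counts the flag finds only 9.3% of true smokers (low sensitivity) while almost never labeling a nonsmoker as a smoker (specificity above 98%), the combination the study observed for individual diagnosis codes.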

  14. Improvements of MCOR: A Monte Carlo depletion code system for fuel assembly reference calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tippayakul, C.; Ivanov, K.; Misu, S.

    2006-07-01

    This paper presents the improvements of MCOR, a Monte Carlo depletion code system for fuel assembly reference calculations. The improvements of MCOR were initiated by the cooperation between the Penn State Univ. and AREVA NP to enhance the original Penn State Univ. MCOR version in order to be used as a new Monte Carlo depletion analysis tool. Essentially, a new depletion module using KORIGEN is utilized to replace the existing ORIGEN-S depletion module in MCOR. Furthermore, the online burnup cross section generation by the Monte Carlo calculation is implemented in the improved version instead of using the burnup cross section library pre-generated by a transport code. Other code features have also been added to make the new MCOR version easier to use. This paper, in addition, presents the result comparisons of the original and the improved MCOR versions against CASMO-4 and OCTOPUS. It was observed in the comparisons that there were quite significant improvements of the results in terms of k-inf, fission rate distributions and isotopic contents. (authors)

  15. Evaluation in industry of a draft code of practice for manual handling.

    PubMed

    Ashby, Liz; Tappin, David; Bentley, Tim

    2004-05-01

    This paper reports findings from a study which evaluated the draft New Zealand Code of Practice for Manual Handling. The evaluation assessed the ease of use, applicability and validity of the Code and in particular the associated manual handling hazard assessment tools, within New Zealand industry. The Code was studied in a sample of eight companies from four sectors of industry. Subjective feedback and objective findings indicated that the Code was useful, applicable and informative. The manual handling hazard assessment tools incorporated in the Code could be adequately applied by most users, with risk assessment outcomes largely consistent with the findings of researchers using more specific ergonomics methodologies. However, some changes were recommended to the risk assessment tools to improve usability and validity. The evaluation concluded that both the Code and the tools within it would benefit from simplification, improved typography and layout, and industry-specific information on manual handling hazards.

  16. Local statistics adaptive entropy coding method for the improvement of H.26L VLC coding

    NASA Astrophysics Data System (ADS)

    Yoo, Kook-yeol; Kim, Jong D.; Choi, Byung-Sun; Lee, Yung Lyul

    2000-05-01

    In this paper, we propose an adaptive entropy coding method to improve the VLC coding efficiency of the H.26L TML-1 codec. First, we show that the VLC coding presented in TML-1 does not satisfy the sibling property of entropy coding. We then modify the coding method into a local-statistics-adaptive one that satisfies the property. The proposed method, based on local symbol statistics, dynamically changes the mapping relationship between symbols and bit patterns in the VLC table according to the sibling property; the codewords in the VLC table of the TML-1 codec themselves are not changed. Since the changed mapping relationship is also derived at the decoder side from the decoded symbols, the proposed VLC coding method does not require any overhead information. The simulation results show that the proposed method gives about 30% and 37% reductions in average bit rate for MB type and CBP information, respectively.
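    The central mechanism, re-ranking symbols by running counts so that frequent symbols map to shorter fixed codewords without transmitting any side information, can be sketched as follows. The symbols and codeword table here are hypothetical, not the TML-1 tables:

```python
class AdaptiveMapper:
    """Fixed codeword set; the symbol->codeword mapping is re-derived from
    running symbol counts, so an identical mapper at the decoder stays in
    sync from the decoded symbols alone -- no overhead is transmitted."""

    def __init__(self, symbols, codewords):
        self.codewords = list(codewords)        # ordered shortest first
        self.counts = {s: 0 for s in symbols}

    def code_for(self, symbol):
        # Stable sort: more frequent symbols map to shorter codewords.
        order = sorted(self.counts, key=lambda s: -self.counts[s])
        cw = self.codewords[order.index(symbol)]
        self.counts[symbol] += 1                # update AFTER coding
        return cw

enc = AdaptiveMapper("abc", ["0", "10", "11"])
bits = "".join(enc.code_for(s) for s in "bbbab")
```

    For this toy input the adaptive mapping spends 7 bits where the static mapping (a→"0", b→"10", c→"11") would spend 9, because "b" quickly claims the one-bit codeword.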

  17. Accuracy of clinical coding for procedures in oral and maxillofacial surgery.

    PubMed

    Khurram, S A; Warner, C; Henry, A M; Kumar, A; Mohammed-Ali, R I

    2016-10-01

    Clinical coding has important financial implications, and discrepancies in the assigned codes can directly affect the funding of a department and hospital. Over the last few years, numerous oversights have been noticed in the coding of oral and maxillofacial (OMF) procedures. To establish the accuracy and completeness of coding, we retrospectively analysed the records of patients during two time periods: March to May 2009 (324 patients), and January to March 2014 (200 patients). Two investigators independently collected and analysed the data to ensure accuracy and remove bias. A large proportion of operations were not assigned all the relevant codes, and only 32% - 33% were correct in both cycles. To our knowledge, this is the first reported audit of clinical coding in OMFS, and it highlights serious shortcomings that have substantial financial implications. Better input by the surgical team and improved communication between the surgical and coding departments will improve accuracy. Copyright © 2016 The British Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  18. 75 FR 46903 - Notice of Proposed Changes to the National Handbook of Conservation Practices for the Natural...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-04

    ... Treatment (Code 521D), Pond Sealing or Lining--Soil Dispersant Treatment (Code 521B), Salinity and Sodic Soil Management (Code 610), Stream Habitat Improvement and Management (Code 395), Vertical Drain (Code... the criteria section; an expansion of the considerations section to include fish and wildlife and soil...

  19. Speech coding at low to medium bit rates

    NASA Astrophysics Data System (ADS)

    Leblanc, Wilfred Paul

    1992-09-01

Improved search techniques coupled with improved codebook design methodologies are proposed to improve the performance of conventional code-excited linear predictive (CELP) coders for speech. Improved methods for quantizing the short-term filter are developed by applying a tree search algorithm and joint codebook design to multistage vector quantization. Joint codebook design procedures are developed to design locally optimal multistage codebooks. Weighting during centroid computation is introduced to improve the outlier performance of the multistage vector quantizer. Multistage vector quantization is shown to be robust both to varying input characteristics and to channel errors. Spectral distortions of about 1 dB are obtained at rates of 22-28 bits/frame. Structured codebook design procedures for the excitation in CELP coders are compared to general codebook design procedures. Little is lost by imposing significant structure on the excitation codebooks, while search complexity is greatly reduced. Sparse multistage configurations are proposed to reduce computational complexity and memory size. Improved search procedures are applied to CELP coding that attempt joint optimization of the short-term filter, the adaptive codebook, and the excitation. Improvements in signal-to-noise ratio of 1-2 dB are realized in practice.
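The tree (M-best) search over a two-stage vector quantizer can be sketched as follows; the codebooks here are random toys rather than jointly designed ones, and `keep` plays the role of the tree-search breadth:

```python
import numpy as np

def two_stage_vq(x, cb1, cb2, keep=2):
    """M-best tree search over a two-stage vector quantizer: retain the
    `keep` best stage-1 candidates instead of only the nearest one, then
    refine each with the stage-2 codebook and keep the jointly best pair."""
    d1 = np.sum((cb1 - x) ** 2, axis=1)
    best = None
    for i in np.argsort(d1)[:keep]:
        r = x - cb1[i]                       # stage-1 residual
        j = int(np.argmin(np.sum((cb2 - r) ** 2, axis=1)))
        err = float(np.sum((r - cb2[j]) ** 2))
        if best is None or err < best[0]:
            best = (err, int(i), j)
    return best                              # (distortion, index1, index2)

rng = np.random.default_rng(0)
cb1 = rng.normal(size=(8, 4))                # toy codebooks, not trained
cb2 = 0.2 * rng.normal(size=(8, 4))
x = rng.normal(size=4)
err, i1, i2 = two_stage_vq(x, cb1, cb2, keep=3)
```

With `keep=1` this degenerates to the greedy stage-by-stage search; widening the tree can only match or lower the final distortion, which is the point of the tree-search improvement.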

  20. Coding in Stroke and Other Cerebrovascular Diseases.

    PubMed

    Korb, Pearce J; Jones, William

    2017-02-01

    Accurate coding is critical for clinical practice and research. Ongoing changes to diagnostic and billing codes require the clinician to stay abreast of coding updates. Payment for health care services, data sets for health services research, and reporting for medical quality improvement all require accurate administrative coding. This article provides an overview of coding principles for patients with strokes and other cerebrovascular diseases and includes an illustrative case as a review of coding principles in a patient with acute stroke.

  1. GAMERA - The New Magnetospheric Code

    NASA Astrophysics Data System (ADS)

    Lyon, J.; Sorathia, K.; Zhang, B.; Merkin, V. G.; Wiltberger, M. J.; Daldorff, L. K. S.

    2017-12-01

The Lyon-Fedder-Mobarry (LFM) code has been a main-line magnetospheric simulation code for 30 years. The code base, designed in the age of memory-to-memory vector machines, is still in wide use for science production but needs upgrading to ensure its long-term sustainability. In this presentation, we will discuss our recent efforts to update and improve that code base and also highlight some recent results. The new project, GAMERA (Grid Agnostic MHD for Extended Research Applications), has kept the original design characteristics of the LFM while making significant improvements. The original design included high-order numerical differencing with very aggressive limiting, the ability to use arbitrary but logically rectangular grids, and maintenance of div B = 0 through the use of the Yee grid. Significant improvements include high-order upwinding and a non-clipping limiter. Another improvement with wider applicability is an improved averaging technique for the singularities in polar and spherical grids. The new code adopts a hybrid structure: multi-threaded OpenMP with an overarching MPI layer for large-scale and coupled applications. The MPI layer uses a combination of standard MPI and the Global Array Toolkit from PNL to provide a lightweight mechanism for coupling codes together concurrently. The single-processor code is highly efficient and can run magnetospheric simulations at the default CCMC resolution faster than real time on a MacBook Pro. We have run the new code through the Athena suite of tests, and the results compare favorably with the codes available to the astrophysics community. LFM/GAMERA has been applied to many different situations, ranging from the inner and outer heliosphere to the magnetospheres of Venus, the Earth, Jupiter, and Saturn. We present example results for the Earth's magnetosphere, including a coupled ring current (RCM), the magnetospheres of Jupiter and Saturn, and the inner heliosphere.

  2. Coding For Compression Of Low-Entropy Data

    NASA Technical Reports Server (NTRS)

    Yeh, Pen-Shu

    1994-01-01

    Improved method of encoding digital data provides for efficient lossless compression of partially or even mostly redundant data from low-information-content source. Method of coding implemented in relatively simple, high-speed arithmetic and logic circuits. Also increases coding efficiency beyond that of established Huffman coding method in that average number of bits per code symbol can be less than 1, which is the lower bound for Huffman code.
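The sub-1-bit effect can be illustrated with a toy block code for a mostly-zero binary source (the actual code tables of the method differ):

```python
def encode_blocks(bits, n=3):
    """Toy block code for a low-entropy binary source: an all-zero block
    of n symbols costs 1 bit, any other block costs 1 + n bits. For
    mostly-zero data the average cost drops below 1 bit/symbol, which no
    one-codeword-per-symbol Huffman code can achieve."""
    out = []
    for i in range(0, len(bits), n):
        blk = bits[i:i + n]
        out.append("0" if set(blk) == {"0"} else "1" + blk)
    return "".join(out)

data = "000000000100000000000010000000"   # 30 symbols, mostly zero
code = encode_blocks(data)
rate = len(code) / len(data)              # average bits per source symbol
```

Here 8 of the 10 blocks are all-zero, so the 30 symbols compress to 16 bits, an average of about 0.53 bits per symbol.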

  3. Potential of coded excitation in medical ultrasound imaging.

    PubMed

    Misaridis, T X; Gammelmark, K; Jørgensen, C H; Lindberg, N; Thomsen, A H; Pedersen, M H; Jensen, J A

    2000-03-01

Improvement in signal-to-noise ratio (SNR) and/or penetration depth can be achieved in medical ultrasound by using long coded waveforms, in a manner similar to radar or sonar. However, the time-bandwidth product (TB) improvement, and thereby the SNR improvement, is considerably lower in medical ultrasound due to the lower available bandwidth. There is still room for about 20 dB of improvement in SNR, which would yield a penetration depth of up to 20 cm at 5 MHz [M. O'Donnell, IEEE Trans. Ultrason. Ferroelectr. Freq. Contr., 39(3) (1992) 341]. The limited TB additionally yields unacceptably high range sidelobes. However, the frequency weighting from the ultrasonic transducer's bandwidth, although suboptimal, can be beneficial for sidelobe reduction. The purpose of this study is an experimental evaluation of the above considerations in a coded excitation ultrasound system. A coded excitation system based on a modified commercial scanner is presented. A predistorted FM signal is proposed in order to keep the resulting range sidelobes at acceptably low levels. The effect of the transducer is taken into account in the design of the compression filter. Intensity levels have been considered, and simulations of the expected improvement in SNR are also presented. Images of a wire phantom and clinical images have been taken with the coded system. The images show a significant improvement in penetration depth while preserving both axial resolution and contrast.
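The FM-coded-excitation idea reduces to transmitting a long chirp and compressing the echo with a matched filter. The sketch below uses a plain Hanning amplitude taper as a crude stand-in for the paper's predistortion, and all waveform parameters are illustrative:

```python
import numpy as np

fs = 50e6                                  # sample rate (assumed)
n = 1000                                   # 20 us burst at fs
t = np.arange(n) / fs
f0, f1 = 3e6, 7e6                          # linear sweep around 5 MHz (assumed)
phase = 2 * np.pi * (f0 * t + (f1 - f0) / (2 * t[-1]) * t ** 2)
chirp = np.sin(phase) * np.hanning(n)      # taper: crude sidelobe control,
                                           # standing in for predistortion

# echo buried in silence, then matched-filter (pulse-compression) output
echo = np.concatenate([np.zeros(500), chirp, np.zeros(500)])
compressed = np.convolve(echo, chirp[::-1], mode="same")
peak = int(np.argmax(np.abs(compressed)))  # should sit at the echo centre
```

The time-bandwidth product here is roughly 20 us x 4 MHz = 80, i.e. on the order of 19 dB of potential SNR gain, consistent with the modest TB values the abstract describes for medical ultrasound.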

  4. Improved Algorithms Speed It Up for Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hazi, A

    2005-09-20

Huge computers, huge codes, complex problems to solve. The longer it takes to run a code, the more it costs. One way to speed things up and save time and money is through hardware improvements--faster processors, different system designs, bigger computers. But another side of supercomputing can reap savings in time and speed: software improvements to make codes--particularly the mathematical algorithms that form them--run faster and more efficiently. Speed up math? Is that really possible? According to Livermore physicist Eugene Brooks, the answer is a resounding yes. ''Sure, you get great speed-ups by improving hardware,'' says Brooks, the deputy leader for Computational Physics in N Division, which is part of Livermore's Physics and Advanced Technologies (PAT) Directorate. ''But the real bonus comes on the software side, where improvements in software can lead to orders of magnitude improvement in run times.'' Brooks knows whereof he speaks. Working with Laboratory physicist Abraham Szoeke and others, he has been instrumental in devising ways to shrink the running time of what has, historically, been a tough computational nut to crack: radiation transport codes based on the statistical or Monte Carlo method of calculation. And Brooks is not the only one. Others around the Laboratory, including physicists Andrew Williamson, Randolph Hood, and Jeff Grossman, have come up with innovative ways to speed up Monte Carlo calculations using pure mathematics.

  5. Incorporation of Dynamic SSI Effects in the Design Response Spectra

    NASA Astrophysics Data System (ADS)

    Manjula, N. K.; Pillai, T. M. Madhavan; Nagarajan, Praveen; Reshma, K. K.

    2018-05-01

Many past studies of dynamic soil-structure interaction have revealed both the detrimental and the advantageous effects of soil flexibility. Based on such studies, the design response spectra of international seismic codes are being improved worldwide. This paper presents the improvements required in the short-period range of the design response spectrum of the Indian seismic code (IS 1893:2002). As the recent code revision has not incorporated the short-period amplifications, the proposals given in this paper are equally applicable to the latest revision (IS 1893:2016). Analyses of single-degree-of-freedom systems are performed to determine the required improvements. The proposed modifications to the constant-acceleration portion of the spectrum are evaluated with respect to the current design spectra in Eurocode 8.
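A minimal version of such a single-degree-of-freedom analysis can be sketched as follows (central-difference integration, a toy harmonic ground pulse, and all numbers illustrative rather than code-compatible motions); it reproduces the short-period resonance that spectral-shape proposals address:

```python
import numpy as np

def sdof_sa(T, ag, dt, zeta=0.05):
    """Peak pseudo-spectral acceleration of an elastic SDOF oscillator
    (period T, damping ratio zeta) under ground acceleration ag,
    integrated by central differences: u'' + 2*zeta*wn*u' + wn^2*u = -ag."""
    wn = 2.0 * np.pi / T
    u0 = u1 = 0.0
    umax = 0.0
    lhs = 1.0 / dt ** 2 + zeta * wn / dt
    for a in ag:
        u2 = (-a - wn ** 2 * u1 + (2.0 * u1 - u0) / dt ** 2
              + zeta * wn * u0 / dt) / lhs
        umax = max(umax, abs(u2))
        u0, u1 = u1, u2
    return wn ** 2 * umax      # pseudo-acceleration Sa = wn^2 * |u|max

dt = 0.005
t = np.arange(0.0, 4.0, dt)
# toy input: 1.5 s harmonic pulse with a 0.3 s period, then free vibration
ag = np.where(t < 1.5, np.sin(2.0 * np.pi * t / 0.3), 0.0)
sa = {T: sdof_sa(T, ag, dt) for T in (0.1, 0.3, 1.0)}
```

The oscillator whose period matches the excitation responds far more strongly, which is why the shape of the constant-acceleration (short-period) plateau matters so much in design spectra.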

  6. Structural and photophysical considerations of singlet fission organic thin films for solar photochemistry

    NASA Astrophysics Data System (ADS)

    Ryerson, Joseph L.

Singlet fission (SF) is a multichromophore charge multiplication process in organic systems in which a singlet exciton shares its energy with a neighboring chromophore, thus generating two triplet excitons from one photon. SF chromophores can boost photocurrent in solar cells, raising the maximum theoretical power conversion efficiency of a single-junction solar cell from ~33% to ~45%. Thin film (TF) preparation techniques, steady-state and time-resolved spectroscopic methods, and numerous advanced calculations were used to study the three systems presented here, all of which exhibit polymorphism. TFs of 1,3-diphenylisobenzofuran (1) were prepared, and two polymorphs, alpha-1 and beta-1, were discovered and characterized. alpha-1 films exhibit phi_T near 200% and low phi_F, whereas the dominant photophysical processes in the beta-1 polymorph are prompt and excimer emissions, with phi_T around 10%. Absorption fitting revealed that the S1 state of beta-1 is lower than that of alpha-1, and therefore SF and the correlated triplet state 1(TT) are energetically inaccessible to beta-1. The SF mechanism in TFs of each polymorph is outlined in great detail. Polymorphism in tetracene (Tc), a near-200% phi_T SF material, has been previously documented, although morphology considerations have been neglected. While crystallite size has been shown to affect dynamics, the two Tc polymorphs, I and II, have not been analyzed in a thorough comparison of dynamics and photophysics. Tc II films show SF rates that are independent of crystallite size, and SF occurs more rapidly than in Tc I. The slower Tc I SF rates are highly dependent on grain size. Coupling calculations suggested that Tc I should be faster, but these calculations are limited, and more sophisticated, multimolecule calculations are needed to support the experimental results. Two extremely stable indigo derivatives, Cibalackrot (2) and a tert-butylated derivative (3), were structurally and photophysically characterized in solution and in TFs.
Two crystalline polymorphs (2-alpha, 2-beta) and an amorphous phase (2a) of 2, as well as a crystalline (3-alpha) and an amorphous (3a) phase of 3, were deposited by thermal evaporation. phi_T values of less than 25% were observed for all morphologies except 2-beta (phi_T = 50%). Excimer formation dominates the relaxation pathways in TFs of 2 and 3.

  7. Computerized evaluation of holographic interferograms for fatigue crack detection in riveted lap joints

    NASA Astrophysics Data System (ADS)

    Zhou, Xiang

    Using an innovative portable holographic inspection and testing system (PHITS) developed at the Australian Defence Force Academy, fatigue cracks in riveted lap joints can be detected by visually inspecting the abnormal fringe changes recorded on holographic interferograms. In this thesis, for automatic crack detection, some modern digital image processing techniques are investigated and applied to holographic interferogram evaluation. Fringe analysis algorithms are developed for identification of the crack-induced fringe changes. Theoretical analysis of PHITS and riveted lap joints and two typical experiments demonstrate that the fatigue cracks in lightly-clamped joints induce two characteristic fringe changes: local fringe discontinuities at the cracking sites; and the global crescent fringe distribution near to the edge of the rivet hole. Both of the fringe features are used for crack detection in this thesis. As a basis of the fringe feature extraction, an algorithm for local fringe orientation calculation is proposed. For high orientation accuracy and computational efficiency, Gaussian gradient filtering and neighboring direction averaging are used to minimize the effects of image background variations and random noise. The neighboring direction averaging is also used to approximate the fringe directions in centerlines of bright and dark fringes. Experimental results indicate that for high orientation accuracy the scales of the Gaussian filter and neighboring direction averaging should be chosen according to the local fringe spacings. The orientation histogram technique is applied to detect the local fringe discontinuity due to the fatigue cracks. The Fourier descriptor technique is used to characterize the global fringe distribution change from a circular to a crescent distribution with the fatigue crack growth. Experiments and computer simulations are conducted to analyze the detectability and reliability of crack detection using the two techniques. 
Results demonstrate that the Fourier descriptor technique is more promising for detecting short cracks near the edge of the rivet head. However, it is not as reliable as the fringe orientation technique for detecting long through cracks. For reliability, both techniques should be used in practical crack detection. Neither the Fourier descriptor technique nor the orientation histogram technique has previously been applied to holographic interferometry. While this work relates primarily to interferograms of cracked rivets, the techniques could readily be applied to other areas of fringe pattern analysis.
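The local-orientation step can be sketched with a double-angle average of image gradients over a patch; this is a simplified stand-in for the thesis's Gaussian-gradient and neighborhood-averaging scheme, applied here globally to a synthetic straight-fringe pattern:

```python
import numpy as np

def fringe_orientation(img):
    """Dominant fringe orientation (radians, in [0, pi)) of an image
    patch. Gradients are averaged in the double-angle representation so
    that opposite gradient directions, both perpendicular to the same
    fringe, reinforce rather than cancel."""
    gy, gx = np.gradient(img.astype(float))   # axis 0 = rows (y), axis 1 = x
    c = np.sum(gx * gx - gy * gy)             # cos(2*theta) component
    s = np.sum(2.0 * gx * gy)                 # sin(2*theta) component
    grad_angle = 0.5 * np.arctan2(s, c)       # dominant gradient direction
    return (grad_angle + np.pi / 2) % np.pi   # fringes run perpendicular

# synthetic straight fringes oriented at 30 degrees, period 8 px
y, x = np.mgrid[0:64, 0:64]
theta = np.deg2rad(30.0)
img = np.cos(2 * np.pi * (x * np.sin(theta) - y * np.cos(theta)) / 8.0)
est = np.rad2deg(fringe_orientation(img))     # recovered angle in degrees
```

On a real interferogram this sum would be taken over a sliding window sized to the local fringe spacing, producing the per-pixel orientation map that the orientation-histogram detector consumes.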

  8. A simple approach to improve recording of concerns about child maltreatment in primary care records: developing a quality improvement intervention

    PubMed Central

    Woodman, Jenny; Allister, Janice; Rafi, Imran; de Lusignan, Simon; Belsey, Jonathan; Petersen, Irene; Gilbert, Ruth

    2012-01-01

    Background Information is lacking on how concerns about child maltreatment are recorded in primary care records. Aim To determine how the recording of child maltreatment concerns can be improved. Design and setting Development of a quality improvement intervention involving: clinical audit, a descriptive survey, telephone interviews, a workshop, database analyses, and consensus development in UK general practice. Method Descriptive analyses and incidence estimates were carried out based on 11 study practices and 442 practices in The Health Improvement Network (THIN). Telephone interviews, a workshop, and a consensus development meeting were conducted with lead GPs from 11 study practices. Results The rate of children with at least one maltreatment-related code was 8.4/1000 child years (11 study practices, 2009–2010), and 8.0/1000 child years (THIN, 2009–2010). Of 25 patients with known maltreatment, six had no maltreatment-related codes recorded, but all had relevant free text, scanned documents, or codes. When stating their reasons for undercoding maltreatment concerns, GPs cited damage to the patient relationship, uncertainty about which codes to use, and having concerns about recording information on other family members in the child’s records. Consensus recommendations are to record the code ‘child is cause for concern’ as a red flag whenever maltreatment is considered, and to use a list of codes arranged around four clinical concepts, with an option for a templated short data entry form. Conclusion GPs under-record maltreatment-related concerns in children’s electronic medical records. As failure to use codes makes it impossible to search or audit these cases, an approach designed to be simple and feasible to implement in UK general practice was recommended. PMID:22781996

  9. Optimized iterative decoding method for TPC coded CPM

    NASA Astrophysics Data System (ADS)

    Ma, Yanmin; Lai, Penghui; Wang, Shilian; Xie, Shunqin; Zhang, Wei

    2018-05-01

Turbo Product Code (TPC) coded Continuous Phase Modulation (CPM) systems (TPC-CPM) have been widely used in aeronautical telemetry and satellite communication. This paper investigates improvements and optimizations of the TPC-CPM system. We first add an interleaver and deinterleaver to the TPC-CPM system and then decode iteratively. However, the improved system converges poorly. To overcome this issue, we use Extrinsic Information Transfer (EXIT) analysis to find the optimal factors for the system. Experiments show that our method effectively improves convergence performance.
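The added interleaving stage can be sketched as a row-in, column-out block interleaver with its matching deinterleaver; the 4 x 8 dimensions are illustrative, not the paper's parameters:

```python
def interleave(seq, rows=4, cols=8):
    """Write the sequence into a rows x cols grid row by row, then read
    it out column by column -- the classic block interleaver inserted
    between an outer coder and the modulator."""
    assert len(seq) == rows * cols
    grid = [seq[r * cols:(r + 1) * cols] for r in range(rows)]
    return [grid[r][c] for c in range(cols) for r in range(rows)]

def deinterleave(seq, rows=4, cols=8):
    """Inverse operation: write column by column, read row by row."""
    assert len(seq) == rows * cols
    grid = [seq[c * rows:(c + 1) * rows] for c in range(cols)]
    return [grid[c][r] for r in range(rows) for c in range(cols)]

data = list(range(32))
scrambled = interleave(data)
```

Spreading adjacent symbols apart in this way decorrelates the extrinsic information exchanged between the TPC decoder and the CPM demodulator across iterations, which is what makes the iterative loop worth analyzing with EXIT charts.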

  10. An Object-Oriented Computer Code for Aircraft Engine Weight Estimation

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.; Naylor, Bret A.

    2009-01-01

    Reliable engine-weight estimation at the conceptual design stage is critical to the development of new aircraft engines. It helps to identify the best engine concept amongst several candidates. At NASA Glenn Research Center (GRC), the Weight Analysis of Turbine Engines (WATE) computer code, originally developed by Boeing Aircraft, has been used to estimate the engine weight of various conceptual engine designs. The code, written in FORTRAN, was originally developed for NASA in 1979. Since then, substantial improvements have been made to the code to improve the weight calculations for most of the engine components. Most recently, to improve the maintainability and extensibility of WATE, the FORTRAN code has been converted into an object-oriented version. The conversion was done within the NASA's NPSS (Numerical Propulsion System Simulation) framework. This enables WATE to interact seamlessly with the thermodynamic cycle model which provides component flow data such as airflows, temperatures, and pressures, etc., that are required for sizing the components and weight calculations. The tighter integration between the NPSS and WATE would greatly enhance system-level analysis and optimization capabilities. It also would facilitate the enhancement of the WATE code for next-generation aircraft and space propulsion systems. In this paper, the architecture of the object-oriented WATE code (or WATE++) is described. Both the FORTRAN and object-oriented versions of the code are employed to compute the dimensions and weight of a 300-passenger aircraft engine (GE90 class). Both versions of the code produce essentially identical results as should be the case.

  11. PheProb: probabilistic phenotyping using diagnosis codes to improve power for genetic association studies.

    PubMed

    Sinnott, Jennifer A; Cai, Fiona; Yu, Sheng; Hejblum, Boris P; Hong, Chuan; Kohane, Isaac S; Liao, Katherine P

    2018-05-17

    Standard approaches for large scale phenotypic screens using electronic health record (EHR) data apply thresholds, such as ≥2 diagnosis codes, to define subjects as having a phenotype. However, the variation in the accuracy of diagnosis codes can impair the power of such screens. Our objective was to develop and evaluate an approach which converts diagnosis codes into a probability of a phenotype (PheProb). We hypothesized that this alternate approach for defining phenotypes would improve power for genetic association studies. The PheProb approach employs unsupervised clustering to separate patients into 2 groups based on diagnosis codes. Subjects are assigned a probability of having the phenotype based on the number of diagnosis codes. This approach was developed using simulated EHR data and tested in a real world EHR cohort. In the latter, we tested the association between low density lipoprotein cholesterol (LDL-C) genetic risk alleles known for association with hyperlipidemia and hyperlipidemia codes (ICD-9 272.x). PheProb and thresholding approaches were compared. Among n = 1462 subjects in the real world EHR cohort, the threshold-based p-values for association between the genetic risk score (GRS) and hyperlipidemia were 0.126 (≥1 code), 0.123 (≥2 codes), and 0.142 (≥3 codes). The PheProb approach produced the expected significant association between the GRS and hyperlipidemia: p = .001. PheProb improves statistical power for association studies relative to standard thresholding approaches by leveraging information about the phenotype in the billing code counts. The PheProb approach has direct applications where efficient approaches are required, such as in Phenome-Wide Association Studies.
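A toy analogue of the PheProb idea (not the published model) can be sketched as a two-component Poisson mixture fitted by EM to per-patient code counts, with the posterior weight of the higher-mean component serving as the phenotype probability:

```python
import numpy as np

def pheprob_like(counts, iters=100):
    """Fit a two-component Poisson mixture to per-patient diagnosis-code
    counts by EM and return P(phenotype | count): the posterior weight of
    the higher-mean component."""
    x = np.asarray(counts, dtype=float)
    lam = np.array([0.5, 1.5]) * x.mean() + 1e-9   # initial component means
    pi = np.array([0.5, 0.5])                      # initial mixing weights
    for _ in range(iters):
        # E-step: responsibilities under Poisson likelihoods (x! cancels)
        logp = x[:, None] * np.log(lam) - lam + np.log(pi)
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate mixing weights and component means
        pi = r.mean(axis=0)
        lam = (r * x[:, None]).sum(axis=0) / r.sum(axis=0)
    return r[:, int(np.argmax(lam))]

counts = [0, 0, 1, 0, 1, 2, 0, 8, 9, 12, 7, 0, 1, 10]  # illustrative data
prob = pheprob_like(counts)    # soft phenotype probability per patient
```

Instead of the hard >=2-codes cutoff, each patient contributes a weight between 0 and 1 to the downstream association test, which is where the power gain comes from.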

  12. Optimizing a liquid propellant rocket engine with an automated combustor design code (AUTOCOM)

    NASA Technical Reports Server (NTRS)

    Hague, D. S.; Reichel, R. H.; Jones, R. T.; Glatt, C. R.

    1972-01-01

    A procedure for automatically designing a liquid propellant rocket engine combustion chamber in an optimal fashion is outlined. The procedure is contained in a digital computer code, AUTOCOM. The code is applied to an existing engine, and design modifications are generated which provide a substantial potential payload improvement over the existing design. Computer time requirements for this payload improvement were small, approximately four minutes in the CDC 6600 computer.

  13. Constructing a Pre-Emptive System Based on a Multidimensional Matrix and Autocompletion to Improve Diagnostic Coding in Acute Care Hospitals.

    PubMed

    Noussa-Yao, Joseph; Heudes, Didier; Escudie, Jean-Baptiste; Degoulet, Patrice

    2016-01-01

Short-stay MSO (Medicine, Surgery, Obstetrics) hospitalization activities in public hospitals, and in private hospitals providing public services, are funded through charges for the services provided (T2A in French). Coding must be well matched to the severity of the patient's condition to ensure that appropriate funding is provided to the hospital. We propose the use of an autocompletion process and a multidimensional matrix to help physicians improve the expression of information and optimize clinical coding. With this approach, physicians without knowledge of the encoding rules begin from a rough concept, which is gradually refined through semantic proximity, drawing on associated codes from optimized knowledge bases of diagnosis codes.
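The refinement-from-a-rough-concept idea can be sketched as a proximity-ranked autocompletion over code labels; the word-overlap score is a crude stand-in for the system's semantic-proximity measure, and the tiny codebook is illustrative:

```python
def autocomplete(query, codebook, limit=5):
    """Rank (code, label) entries by a crude proximity score: the share
    of query words that appear in the label. A toy stand-in for the
    semantic-proximity refinement described above."""
    q = query.lower().split()
    scored = []
    for code, label in codebook:
        text = label.lower()
        score = sum(w in text for w in q) / len(q)
        if score > 0:
            scored.append((score, code, label))
    scored.sort(key=lambda t: (-t[0], t[1]))  # best match first
    return scored[:limit]

codebook = [
    ("E11.9", "type 2 diabetes mellitus without complications"),
    ("E10.9", "type 1 diabetes mellitus without complications"),
    ("I10", "essential primary hypertension"),
]
hits = autocomplete("diabetes type 2", codebook)
```

As the physician types more of the rough concept, the candidate list narrows toward the fully specified code, without the physician needing to know the encoding rules.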

  14. Coding in Muscle Disease.

    PubMed

    Jones, Lyell K; Ney, John P

    2016-12-01

    Accurate coding is critically important for clinical practice and research. Ongoing changes to diagnostic and billing codes require the clinician to stay abreast of coding updates. Payment for health care services, data sets for health services research, and reporting for medical quality improvement all require accurate administrative coding. This article provides an overview of administrative coding for patients with muscle disease and includes a case-based review of diagnostic and Evaluation and Management (E/M) coding principles in patients with myopathy. Procedural coding for electrodiagnostic studies and neuromuscular ultrasound is also reviewed.

  15. Coded excitation with spectrum inversion (CEXSI) for ultrasound array imaging.

    PubMed

    Wang, Yao; Metzger, Kurt; Stephens, Douglas N; Williams, Gregory; Brownlie, Scott; O'Donnell, Matthew

    2003-07-01

In this paper, a scheme called coded excitation with spectrum inversion (CEXSI) is presented. An established optimal binary code, whose spectrum has no nulls and possesses the least variation, is transmitted as a burst. Using this optimal code, the decoding filter can be derived directly from its inverse spectrum. Various transmission techniques can be used to improve energy coupling within the system pass-band. We demonstrate the scheme's potential to achieve excellent decoding with very low (< -80 dB) side-lobes. For a 2.6 μs code and an array element with a center frequency of 10 MHz and a fractional bandwidth of 38%, range side-lobes of about -40 dB have been achieved experimentally with little compromise in range resolution. The signal-to-noise ratio (SNR) improvement has also been characterized at about 14 dB. Along with simulations and experimental data, we present a formulation of the scheme, according to which CEXSI can be extended to improve SNR in sparse array imaging in general.
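The spectrum-inversion decoding step can be sketched as follows; the Barker-7 sequence is an assumed stand-in for a binary code with a null-free spectrum (the paper's optimal code is different), and transducer effects are ignored:

```python
import numpy as np

# Barker-7 as an assumed example of a binary code whose DFT has no nulls
code = np.array([1, 1, 1, -1, -1, 1, -1], dtype=float)
C = np.fft.fft(code)
assert np.min(np.abs(C)) > 0          # no nulls: the spectrum is invertible

# decoding filter taken directly from the inverse spectrum
inv_filter = np.fft.ifft(1.0 / C).real

# circular convolution of code and filter compresses to a unit delta
decoded = np.fft.ifft(np.fft.fft(code) * np.fft.fft(inv_filter)).real
```

Because the filter is the exact spectral inverse, the compressed output is an ideal delta up to floating-point error; in the real system the transducer band-limits the spectrum, which is why the pass-band energy-coupling techniques in the paper matter.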

  16. Improved Iris Recognition through Fusion of Hamming Distance and Fragile Bit Distance.

    PubMed

    Hollingsworth, Karen P; Bowyer, Kevin W; Flynn, Patrick J

    2011-12-01

    The most common iris biometric algorithm represents the texture of an iris using a binary iris code. Not all bits in an iris code are equally consistent. A bit is deemed fragile if its value changes across iris codes created from different images of the same iris. Previous research has shown that iris recognition performance can be improved by masking these fragile bits. Rather than ignoring fragile bits completely, we consider what beneficial information can be obtained from the fragile bits. We find that the locations of fragile bits tend to be consistent across different iris codes of the same eye. We present a metric, called the fragile bit distance, which quantitatively measures the coincidence of the fragile bit patterns in two iris codes. We find that score fusion of fragile bit distance and Hamming distance works better for recognition than Hamming distance alone. To our knowledge, this is the first and only work to use the coincidence of fragile bit locations to improve the accuracy of matches.
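The fusion idea can be sketched as follows; the fusion weight, code length, and fragile-bit fraction are assumed for illustration, not the values tuned in the paper:

```python
import numpy as np

def masked_hamming(c1, c2, m1, m2):
    """Fractional Hamming distance over bits both codes mark consistent."""
    use = m1 & m2
    return np.count_nonzero((c1 ^ c2) & use) / max(np.count_nonzero(use), 1)

def fragile_bit_distance(m1, m2):
    """FBD: disagreement between the two fragile-bit (mask=False) patterns,
    normalized by the size of their union."""
    f1, f2 = ~m1, ~m2
    return np.count_nonzero(f1 ^ f2) / max(np.count_nonzero(f1 | f2), 1)

def fused_score(c1, c2, m1, m2, alpha=0.6):
    """Weighted score fusion of HD and FBD (alpha is an assumed weight)."""
    return (alpha * masked_hamming(c1, c2, m1, m2)
            + (1 - alpha) * fragile_bit_distance(m1, m2))

rng = np.random.default_rng(1)
code = rng.integers(0, 2, 256).astype(bool)
mask = rng.random(256) > 0.25                 # ~25% of bits fragile
# same eye: a few flipped bits and a similar fragile-bit pattern
code2, mask2 = code.copy(), mask.copy()
code2[:5] ^= True
mask2[:3] ^= True
genuine = fused_score(code, code2, mask, mask2)
impostor = fused_score(code, rng.integers(0, 2, 256).astype(bool),
                       mask, rng.random(256) > 0.25)
```

For a genuine pair both distances are small; for an impostor pair the fragile-bit patterns are essentially independent, so the FBD term pushes the fused score up and widens the match/non-match separation.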

  17. An evaluation of TRAC-PF1/MOD1 computer code performance during posttest simulations of Semiscale MOD-2C feedwater line break transients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Hall, D.G.; Watkins, J.C.

This report documents an evaluation of the TRAC-PF1/MOD1 reactor safety analysis computer code during computer simulations of feedwater line break transients. The experimental data base for the evaluation included the results of three bottom feedwater line break tests performed in the Semiscale Mod-2C test facility. The tests modeled 14.3% (S-FS-7), 50% (S-FS-11), and 100% (S-FS-6B) breaks. The test facility and the TRAC-PF1/MOD1 model used in the calculations are described. Evaluations of the accuracy of the calculations are presented in the form of comparisons of measured and calculated histories of selected parameters associated with the primary and secondary systems. In addition to evaluating the accuracy of the code calculations, the computational performance of the code during the simulations was assessed. A conclusion was reached that the code is capable of making feedwater line break transient calculations efficiently, but there is room for significant improvements in the simulations that were performed. Recommendations are made for follow-on investigations to determine how to improve future feedwater line break calculations and for code improvements to make the code easier to use.

  18. Demonstration of emulator-based Bayesian calibration of safety analysis codes: Theory and formulation

    DOE PAGES

    Yurko, Joseph P.; Buongiorno, Jacopo; Youngblood, Robert

    2015-05-28

System codes for simulation of safety performance of nuclear plants may contain parameters whose values are not known very accurately. New information from tests or operating experience is incorporated into safety codes by a process known as calibration, which reduces uncertainty in the output of the code and thereby improves its support for decision-making. The work reported here implements several improvements on classic calibration techniques afforded by modern analysis techniques. The key innovation has come from development of code surrogate model (or code emulator) construction and prediction algorithms. Use of a fast emulator makes the calibration processes used here with Markov Chain Monte Carlo (MCMC) sampling feasible. This study uses Gaussian Process (GP) based emulators, which have been used previously to emulate computer codes in the nuclear field. The present work describes the formulation of an emulator that incorporates GPs into a factor analysis-type or pattern recognition-type model. This “function factorization” Gaussian Process (FFGP) model allows overcoming limitations present in standard GP emulators, thereby improving both accuracy and speed of the emulator-based calibration process. Calibration of a friction-factor example using a Method of Manufactured Solution is performed to illustrate key properties of the FFGP-based process.
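A minimal GP emulator in this spirit (squared-exponential kernel, NumPy only, and a toy Blasius-style friction-factor curve standing in for the expensive code; this is not the FFGP model) can be sketched as:

```python
import numpy as np

def gp_emulator(X, y, ell=1.0, sf=1.0, noise=1e-6):
    """Fit a squared-exponential GP to code runs (X: inputs, y: outputs)
    and return a fast mean-prediction function usable inside an MCMC
    calibration loop."""
    def k(A, B):
        d2 = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
        return sf ** 2 * np.exp(-0.5 * d2 / ell ** 2)
    K = k(X, X) + noise * np.eye(len(X))      # jitter for conditioning
    alpha = np.linalg.solve(K, y)             # precomputed once at fit time
    return lambda Xs: k(Xs, X) @ alpha        # cheap per-query prediction

# "expensive" code stand-in: a smooth friction-factor-like curve (toy)
code_run = lambda x: 0.3164 * x[:, 0] ** -0.25
X = np.linspace(1.0, 5.0, 12)[:, None]        # 12 training runs of the code
y = code_run(X)
emu = gp_emulator(X, y)

Xs = np.array([[2.5], [4.2]])                 # unseen query points
err = float(np.max(np.abs(emu(Xs) - code_run(Xs))))
```

Twelve training runs suffice for near-interpolation accuracy on this smooth curve, and each emulator evaluation is a small dot product, which is what makes tens of thousands of MCMC likelihood evaluations affordable.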

  19. Status of BOUT fluid turbulence code: improvements and verification

    NASA Astrophysics Data System (ADS)

    Umansky, M. V.; Lodestro, L. L.; Xu, X. Q.

    2006-10-01

    BOUT is an electromagnetic fluid turbulence code for tokamak edge plasma [1]. BOUT performs time integration of reduced Braginskii plasma fluid equations, using spatial discretization in realistic geometry and employing a standard ODE integration package PVODE. BOUT has been applied to several tokamak experiments and in some cases calculated spectra of turbulent fluctuations compared favorably to experimental data. On the other hand, the desire to understand better the code results and to gain more confidence in it motivated investing effort in rigorous verification of BOUT. Parallel to the testing the code underwent substantial modification, mainly to improve its readability and tractability of physical terms, with some algorithmic improvements as well. In the verification process, a series of linear and nonlinear test problems was applied to BOUT, targeting different subgroups of physical terms. The tests include reproducing basic electrostatic and electromagnetic plasma modes in simplified geometry, axisymmetric benchmarks against the 2D edge code UEDGE in real divertor geometry, and neutral fluid benchmarks against the hydrodynamic code LCPFCT. After completion of the testing, the new version of the code is being applied to actual tokamak edge turbulence problems, and the results will be presented. [1] X. Q. Xu et al., Contr. Plas. Phys., 36,158 (1998). *Work performed for USDOE by Univ. Calif. LLNL under contract W-7405-ENG-48.

  20. Opinion survey on proposals for improving code stroke in Murcia Health District V, 2014.

    PubMed

    González-Navarro, M; Martínez-Sánchez, M A; Morales-Camacho, V; Valera-Albert, M; Atienza-Ayala, S V; Limiñana-Alcaraz, G

    2017-05-01

    Stroke is a time-dependent neurological disease. Health District V in the Murcia Health System has certain demographic and geographical characteristics that make it necessary to create specific improvement strategies to ensure the proper functioning of code stroke (CS). The study objectives were to assess local professionals' opinions about code stroke activation and procedure, and to share their suggestions with the regional multidisciplinary group for code stroke. This cross-sectional, descriptive study used the Delphi technique to develop a questionnaire for doctors and nurses working at all care levels in Health District V. An anonymous electronic survey was sent to 154 professionals. The analysis was performed using the SWOT method (Strengths, Weaknesses, Opportunities, and Threats). Researchers collected 51 questionnaires. The main proposals were providing training, promoting communication with the neurologist, overcoming physical distances, using diagnostic imaging tests, motivating professionals, and raising awareness in the general population. Most of the interventions proposed by the participants have been listed in the published literature. These improvement proposals were forwarded to the Regional Code Stroke Improvement Group. Copyright © 2015 Sociedad Española de Neurología. Published by Elsevier España, S.L.U. All rights reserved.

  1. Use the Bar Code System to Improve Accuracy of the Patient and Sample Identification.

    PubMed

    Chuang, Shu-Hsia; Yeh, Huy-Pzu; Chi, Kun-Hung; Ku, Hsueh-Chen

    2018-01-01

    Timely and correct sample collection is closely related to patient safety. From January to April 2016, the sample error rate was 11.1%, owing to mislabeled patient information and the use of wrong sample containers. We developed a barcode-based "Specimen Identification System" through process reengineering of TRM, using barcode scanners, added sample container instructions, and a mobile app. In conclusion, the barcode system improved patient safety and created a green (paperless) environment.

  2. Determination of Problematic ICD-9-CM Subcategories for Further Study of Coding Performance: Delphi Method

    PubMed Central

    Zeng, Xiaoming; Bell, Paul D

    2011-01-01

    In this study, we report on a qualitative method known as the Delphi method, used in the first part of a research study for improving the accuracy and reliability of ICD-9-CM coding. A panel of independent coding experts interacted methodically to determine that the three criteria to identify a problematic ICD-9-CM subcategory for further study were cost, volume, and level of coding confusion caused. The Medicare Provider Analysis and Review (MEDPAR) 2007 fiscal year data set as well as suggestions from the experts were used to identify coding subcategories based on cost and volume data. Next, the panelists performed two rounds of independent ranking before identifying Excisional Debridement as the subcategory that causes the most confusion among coders. As a result, they recommended it for further study aimed at improving coding accuracy and reducing variation. This framework can be adopted at different levels for similar studies in need of a schema for determining problematic subcategories of code sets. PMID:21796264

  3. Light element opacities of astrophysical interest from ATOMIC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colgan, J.; Kilcrease, D. P.; Magee, N. H. Jr.

    We present new calculations of local-thermodynamic-equilibrium (LTE) light element opacities from the Los Alamos ATOMIC code for systems of astrophysical interest. ATOMIC is a multi-purpose code that can generate LTE or non-LTE quantities of interest at various levels of approximation. Our calculations, which include fine-structure detail, represent a systematic improvement over previous Los Alamos opacity calculations using the LEDCOP legacy code. The ATOMIC code uses ab-initio atomic structure data computed from the CATS code, which is based on Cowan's atomic structure codes, and photoionization cross section data computed from the Los Alamos ionization code GIPPER. ATOMIC also incorporates a new equation-of-state (EOS) model based on the chemical picture. ATOMIC incorporates some physics packages from LEDCOP and also includes additional physical processes, such as improved free-free cross sections and additional scattering mechanisms. Our new calculations are made for elements of astrophysical interest and for a wide range of temperatures and densities.

  4. Object-Oriented/Data-Oriented Design of a Direct Simulation Monte Carlo Algorithm

    NASA Technical Reports Server (NTRS)

    Liechty, Derek S.

    2014-01-01

    Over the past decade, there has been much progress towards improved phenomenological modeling and algorithmic updates for the direct simulation Monte Carlo (DSMC) method, which provides a probabilistic physical simulation of gas flows. These improvements have largely been based on the work of the originator of the DSMC method, Graeme Bird. Of primary importance are improved chemistry, internal energy, and physics modeling and a reduction in time to solution. These allow for an expanded range of possible solutions in altitude and velocity space. NASA's current production code, the DSMC Analysis Code (DAC), is well-established, based on Bird's 1994 algorithms written in Fortran 77, and has proven difficult to upgrade. A new DSMC code is being developed in the C++ programming language using object-oriented and data-oriented design paradigms to facilitate the inclusion of the recent improvements and future development activities. The development efforts on the new code, the Multiphysics Algorithm with Particles (MAP), are described, and performance comparisons are made with DAC.

  5. FY17 Status Report on NEAMS Neutronics Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, C. H.; Jung, Y. S.; Smith, M. A.

    2017-09-30

    Under the U.S. DOE NEAMS program, the high-fidelity neutronics code system has been developed to support the multiphysics modeling and simulation capability named SHARP. The neutronics code system includes the high-fidelity neutronics code PROTEUS, the cross section library and preprocessing tools, the multigroup cross section generation code MC2-3, the in-house meshing generation tool, the perturbation and sensitivity analysis code PERSENT, and post-processing tools. The main objectives of the NEAMS neutronics activities in FY17 are to continue development of an advanced nodal solver in PROTEUS for use in nuclear reactor design and analysis projects, implement a simplified sub-channel based thermal-hydraulic (T/H) capability into PROTEUS to efficiently compute the thermal feedback, improve the performance of PROTEUS-MOCEX using numerical acceleration and code optimization, improve the cross section generation tools including MC2-3, and continue to perform verification and validation tests for PROTEUS.

  6. Ultrasound strain imaging using Barker code

    NASA Astrophysics Data System (ADS)

    Peng, Hui; Tie, Juhong; Guo, Dequan

    2017-01-01

    Ultrasound strain imaging is showing promise as a new way of imaging soft tissue elasticity in order to help clinicians detect lesions or cancers in tissues. In this paper, Barker code is applied to strain imaging to improve its quality. Barker code as a coded excitation signal can be used to improve the echo signal-to-noise ratio (eSNR) in an ultrasound imaging system. For the Barker code of length 13, the sidelobe level of the matched filter output is -22 dB, which is unacceptable for ultrasound strain imaging, because a high sidelobe level will cause high decorrelation noise. Instead of using the conventional matched filter, we use the Wiener filter to decode the Barker-coded echo signal to suppress the range sidelobes. We also compare the performance of the Barker code and the conventional short pulse by simulation. The simulation results demonstrate that the performance of the Wiener filter is much better than the matched filter, and the Barker code achieves a higher elastographic signal-to-noise ratio (SNRe) than the short pulse in low-eSNR or great-depth conditions due to the increased eSNR it provides.
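
    The -22 dB figure quoted above can be checked directly from the autocorrelation of the length-13 Barker sequence, which is what the matched filter output reduces to for an ideal point target; a short sketch:

```python
import numpy as np

# Peak-to-sidelobe level of the length-13 Barker code under matched
# filtering (i.e., its autocorrelation).
barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1])
acf = np.correlate(barker13, barker13, mode="full")

peak = acf.max()                                         # 13 (code length)
sidelobe = np.abs(np.delete(acf, np.argmax(acf))).max()  # 1

level = 20 * np.log10(sidelobe / peak)
print(level)  # about -22.3 dB
```

    The sidelobes of a Barker code never exceed 1 in magnitude, so the ratio is 1/13, i.e. about -22.3 dB, matching the figure in the abstract.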

  7. Wake coupling to full potential rotor analysis code

    NASA Technical Reports Server (NTRS)

    Torres, Francisco J.; Chang, I-Chung; Oh, Byung K.

    1990-01-01

    The wake information from a helicopter forward flight code is coupled with two transonic potential rotor codes. The induced velocities for the near-, mid-, and far-wake geometries are extracted from a nonlinear rigid wake of a standard performance and analysis code. These, together with the corresponding inflow angles, computation points, and azimuth angles, are then incorporated into the transonic potential codes. The coupled codes can then provide an improved prediction of rotor blade loading at transonic speeds.

  8. An optimization program based on the method of feasible directions: Theory and users guide

    NASA Technical Reports Server (NTRS)

    Belegundu, Ashok D.; Berke, Laszlo; Patnaik, Surya N.

    1994-01-01

    The theory and user instructions for an optimization code based on the method of feasible directions are presented. The code was written for wide distribution and ease of attachment to other simulation software. Although the theory of the method of feasible directions was developed in the 1960s, many considerations are involved in its actual implementation as a computer code. Included in the code are a number of features to improve robustness in optimization. The search direction is obtained by solving a quadratic program using an interior method based on Karmarkar's algorithm. The theory is discussed focusing on the important and often overlooked role played by the various parameters guiding the iterations within the program. Also discussed is a robust approach for handling infeasible starting points. The code was validated by solving a variety of structural optimization test problems that have known solutions obtained by other optimization codes. It has been observed that this code is robust: it has solved a variety of problems from different starting points. However, the code is inefficient in that it takes considerable CPU time as compared with certain other available codes. Further work is required to improve its efficiency while retaining its robustness.

  9. Coding and Billing in Surgical Education: A Systems-Based Practice Education Program.

    PubMed

    Ghaderi, Kimeya F; Schmidt, Scott T; Drolet, Brian C

    Despite increased emphasis on systems-based practice through the Accreditation Council for Graduate Medical Education core competencies, few studies have examined what surgical residents know about coding and billing. We sought to create and measure the effectiveness of a multifaceted approach to improving resident knowledge and performance of documenting and coding outpatient encounters. We identified knowledge gaps and barriers to documentation and coding in the outpatient setting. We implemented a series of educational and workflow interventions with a group of 12 residents in a surgical clinic at a tertiary care center. To measure the effect of this program, we compared billing codes for 1 year before intervention (FY2012) to prospectively collected data from the postintervention period (FY2013). All related documentation and coding were verified by study-blinded auditors. Interventions took place at the outpatient surgical clinic at Rhode Island Hospital, a tertiary-care center. A cohort of 12 plastic surgery residents ranging from postgraduate year 2 through postgraduate year 6 participated in the interventional sequence. A total of 1285 patient encounters in the preintervention group were compared with 1170 encounters in the postintervention group. Using evaluation and management codes (E&M) as a measure of documentation and coding, we demonstrated a significant and durable increase in billing with supporting clinical documentation after the intervention. For established patient visits, the monthly average E&M code level increased from 2.14 to 3.05 (p < 0.01); for new patients the monthly average E&M level increased from 2.61 to 3.19 (p < 0.01). This study describes a series of educational and workflow interventions, which improved resident coding and billing of outpatient clinic encounters. 
Using externally audited coding data, we demonstrate significantly increased rates of higher complexity E&M coding in a stable patient population based on improved documentation and billing awareness by the residents. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  10. An Object-oriented Computer Code for Aircraft Engine Weight Estimation

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.; Naylor, Bret A.

    2008-01-01

    Reliable engine-weight estimation at the conceptual design stage is critical to the development of new aircraft engines. It helps to identify the best engine concept amongst several candidates. At NASA Glenn (GRC), the Weight Analysis of Turbine Engines (WATE) computer code, originally developed by Boeing Aircraft, has been used to estimate the engine weight of various conceptual engine designs. The code, written in FORTRAN, was originally developed for NASA in 1979. Since then, substantial improvements have been made to the code to improve the weight calculations for most of the engine components. Most recently, to improve the maintainability and extensibility of WATE, the FORTRAN code has been converted into an object-oriented version. The conversion was done within NASA's NPSS (Numerical Propulsion System Simulation) framework. This enables WATE to interact seamlessly with the thermodynamic cycle model, which provides component flow data such as airflows, temperatures, and pressures that are required for sizing the components and for the weight calculations. The tighter integration between NPSS and WATE would greatly enhance system-level analysis and optimization capabilities. It would also facilitate the enhancement of the WATE code for next-generation aircraft and space propulsion systems. In this paper, the architecture of the object-oriented WATE code (or WATE++) is described. Both the FORTRAN and object-oriented versions of the code are employed to compute the dimensions and weight of a 300-passenger aircraft engine (GE90 class). Both versions of the code produce essentially identical results, as should be the case. Keywords: NASA, aircraft engine, weight, object-oriented

  11. Coded excitation ultrasonic needle tracking: An in vivo study.

    PubMed

    Xia, Wenfeng; Ginsberg, Yuval; West, Simeon J; Nikitichev, Daniil I; Ourselin, Sebastien; David, Anna L; Desjardins, Adrien E

    2016-07-01

    Accurate and efficient guidance of medical devices to procedural targets lies at the heart of interventional procedures. Ultrasound imaging is commonly used for device guidance, but determining the location of the device tip can be challenging. Various methods have been proposed to track medical devices during ultrasound-guided procedures, but widespread clinical adoption has remained elusive. With ultrasonic tracking, the location of a medical device is determined by ultrasonic communication between the ultrasound imaging probe and a transducer integrated into the medical device. The signal-to-noise ratio (SNR) of the transducer data is an important determinant of the depth in tissue at which tracking can be performed. In this paper, the authors present a new generation of ultrasonic tracking in which coded excitation is used to improve the SNR without spatial averaging. A fiber optic hydrophone was integrated into the cannula of a 20 gauge insertion needle. This transducer received transmissions from the ultrasound imaging probe, and the data were processed to obtain a tracking image of the needle tip. Excitation using Barker or Golay codes was performed to improve the SNR, and conventional bipolar excitation was performed for comparison. The performance of the coded excitation ultrasonic tracking system was evaluated in an in vivo ovine model with insertions to the brachial plexus and the uterine cavity. Coded excitation significantly increased the SNRs of the tracking images, as compared with bipolar excitation. During an insertion to the brachial plexus, the SNR was increased by factors of 3.5 for Barker coding and 7.1 for Golay coding. During insertions into the uterine cavity, these factors ranged from 2.9 to 4.2 for Barker coding and 5.4 to 8.5 for Golay coding. The maximum SNR was 670, which was obtained with Golay coding during needle withdrawal from the brachial plexus. 
Range sidelobe artifacts were observed in tracking images obtained with Barker coded excitation, and they were visually absent with Golay coded excitation. The spatial tracking accuracy was unaffected by coded excitation. Coded excitation is a viable method for improving the SNR in ultrasonic tracking without compromising spatial accuracy. This method provided SNR increases that are consistent with theoretical expectations, even in the presence of physiological motion. With the ultrasonic tracking system in this study, the SNR increases will have direct clinical implications in a broad range of interventional procedures by improving visibility of medical devices at large depths.
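
    The absence of range sidelobes with Golay coding follows from the defining property of a complementary pair: the autocorrelations of the two codes cancel everywhere except at the peak. A minimal sketch with a length-4 pair (chosen for illustration; clinical systems use longer codes):

```python
import numpy as np

# A length-4 Golay complementary pair.
a = np.array([1, 1, 1, -1])
b = np.array([1, 1, -1, 1])

# Individually, each autocorrelation has nonzero sidelobes,
# but their sum is zero everywhere except at the centre tap,
# where it equals 2 * len(a).
acf_sum = np.correlate(a, a, "full") + np.correlate(b, b, "full")
print(acf_sum)  # [0 0 0 8 0 0 0]
```

    In a coded-excitation system the two codes are transmitted on successive pulses and the two decoded echoes are summed, which is why the Golay tracking images above are free of the range sidelobe artifacts seen with a single Barker code.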

  12. Concatenated coding systems employing a unit-memory convolutional code and a byte-oriented decoding algorithm

    NASA Technical Reports Server (NTRS)

    Lee, L.-N.

    1977-01-01

    Concatenated coding systems utilizing a convolutional code as the inner code and a Reed-Solomon code as the outer code are considered. In order to obtain very reliable communications over a very noisy channel with relatively modest coding complexity, it is proposed to concatenate a byte-oriented unit-memory convolutional code with an RS outer code whose symbol size is one byte. It is further proposed to utilize a real-time minimal-byte-error-probability decoding algorithm, together with feedback from the outer decoder, in the decoder for the inner convolutional code. The performance of the proposed concatenated coding system is studied, and the improvement over conventional concatenated systems due to each additional feature is isolated.
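
    The "unit-memory, byte-oriented" structure can be sketched with a toy encoder in which each output block depends only on the current and the previous input byte, so the code's memory is exactly one byte. The rate, the mixing rule, and the decoder below are illustrative assumptions, not the specific code proposed in the paper:

```python
# Toy byte-oriented, unit-memory convolutional encoder (rate 1/2):
# each output block is a function of the current byte u and the
# previous byte only. Illustrative assumption, not the paper's code.
def encode(msg):
    prev = 0
    out = []
    for u in msg:
        out.append((u, u ^ prev))  # second symbol mixes in one byte of memory
        prev = u
    return out

def decode(blocks):
    """Recover the message, checking each redundant byte against the memory."""
    prev, msg = 0, []
    for u, p in blocks:
        assert p == (u ^ prev), "redundancy check failed: transmission error"
        msg.append(u)
        prev = u
    return msg

data = [0x48, 0x69, 0x21]
print(decode(encode(data)) == data)  # True
```

    Because the encoder works on whole bytes, its output symbols line up naturally with the one-byte symbols of the RS outer code, which is the alignment the paper exploits.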

  13. Concatenated coding systems employing a unit-memory convolutional code and a byte-oriented decoding algorithm

    NASA Technical Reports Server (NTRS)

    Lee, L. N.

    1976-01-01

    Concatenated coding systems utilizing a convolutional code as the inner code and a Reed-Solomon code as the outer code are considered. In order to obtain very reliable communications over a very noisy channel with relatively small coding complexity, it is proposed to concatenate a byte oriented unit memory convolutional code with an RS outer code whose symbol size is one byte. It is further proposed to utilize a real time minimal byte error probability decoding algorithm, together with feedback from the outer decoder, in the decoder for the inner convolutional code. The performance of the proposed concatenated coding system is studied, and the improvement over conventional concatenated systems due to each additional feature is isolated.

  14. SPAR improved structure-fluid dynamic analysis capability, phase 2

    NASA Technical Reports Server (NTRS)

    Pearson, M. L.

    1984-01-01

    An efficient and general method of analyzing a coupled dynamic system of fluid flow and elastic structures is investigated. The improvement of Structural Performance Analysis and Redesign (SPAR) code is summarized. All error codes are documented and the SPAR processor/subroutine cross reference is included.

  15. Improving Access to and Understanding of Regulations through Taxonomies

    ERIC Educational Resources Information Center

    Cheng, Chin Pang; Lau, Gloria T.; Law, Kincho H.; Pan, Jiayi; Jones, Albert

    2009-01-01

    Industrial taxonomies have the potential to automate information retrieval, facilitate interoperability and, most importantly, improve decision making - decisions that must comply with existing government regulations and codes of practice. However, it is difficult to find those regulations and codes most relevant to a particular decision, even…

  16. New GOES satellite synchronized time code generation

    NASA Technical Reports Server (NTRS)

    Fossler, D. E.; Olson, R. K.

    1984-01-01

    The TRAK Systems' GOES Satellite Synchronized Time Code Generator is described. TRAK Systems has developed this timing instrument to supply improved accuracy over most existing GOES receiver clocks. A classical time code generator is integrated with a GOES receiver.

  17. Design of convolutional tornado code

    NASA Astrophysics Data System (ADS)

    Zhou, Hui; Yang, Yao; Gao, Hongmin; Tan, Lu

    2017-09-01

    As a linear block code, the traditional tornado (tTN) code is inefficient in burst-erasure environment and its multi-level structure may lead to high encoding/decoding complexity. This paper presents a convolutional tornado (cTN) code which is able to improve the burst-erasure protection capability by applying the convolution property to the tTN code, and reduce computational complexity by abrogating the multi-level structure. The simulation results show that cTN code can provide a better packet loss protection performance with lower computation complexity than tTN code.

  18. Data compression for satellite images

    NASA Technical Reports Server (NTRS)

    Chen, P. H.; Wintz, P. A.

    1976-01-01

    An efficient data compression system is presented for satellite pictures and two grey-level pictures derived from satellite pictures. The compression techniques take advantage of the correlation between adjacent picture elements. Several source coding methods are investigated. Double delta coding is presented and shown to be the most efficient. Both the predictive differential quantizing technique and double delta coding can be significantly improved by applying a background skipping technique. An extension code is constructed. This code requires very little storage space and operates efficiently. Simulation results are presented for various coding schemes and source codes.
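
    Delta coding stores differences between adjacent picture elements, and double delta coding applies the same step twice, so that smoothly varying imagery yields mostly small values that compress well. The exact scheme of the paper is not specified here, so the following is an illustrative reconstruction showing the lossless round trip:

```python
# Illustrative delta / double-delta coding of a row of pixel values.
def delta(xs):
    """First differences (the first element is kept as-is)."""
    prev, out = 0, []
    for x in xs:
        out.append(x - prev)
        prev = x
    return out

def undelta(ds):
    """Inverse of delta: running sum."""
    total, out = 0, []
    for d in ds:
        total += d
        out.append(total)
    return out

row = [10, 12, 13, 13, 14, 20]
dd = delta(delta(row))              # double delta
print(undelta(undelta(dd)) == row)  # True: lossless round trip
```

    A background skipping technique, as mentioned above, would additionally run-length over the long stretches of near-zero differences that flat background regions produce.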

  19. One-way quantum repeaters with quantum Reed-Solomon codes

    NASA Astrophysics Data System (ADS)

    Muralidharan, Sreraman; Zou, Chang-Ling; Li, Linshu; Jiang, Liang

    2018-05-01

    We show that quantum Reed-Solomon codes constructed from classical Reed-Solomon codes can approach the capacity of the quantum erasure channel of d-level systems for large dimension d. We study the performance of one-way quantum repeaters with these codes and obtain a significant improvement in key generation rate compared to previously investigated encoding schemes with quantum parity codes and quantum polynomial codes. We also compare the three generations of quantum repeaters using quantum Reed-Solomon codes and identify parameter regimes where each generation performs the best.

  20. Optimized atom position and coefficient coding for matching pursuit-based image compression.

    PubMed

    Shoa, Alireza; Shirani, Shahram

    2009-12-01

    In this paper, we propose a new encoding algorithm for matching pursuit image coding. We show that coding performance is improved when correlations between atom positions and atom coefficients are both used in encoding. We find the optimum tradeoff between efficient atom position coding and efficient atom coefficient coding and optimize the encoder parameters. Our proposed algorithm outperforms the existing coding algorithms designed for matching pursuit image coding. Additionally, we show that our algorithm results in better rate distortion performance than JPEG 2000 at low bit rates.
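
    The atom position/coefficient pairs that such an encoder codes jointly are produced by the matching pursuit decomposition itself, which can be sketched in one dimension: greedily pick the dictionary atom most correlated with the residual and record its position and coefficient. The orthonormal toy dictionary below is an assumption for clarity; practical image coders use overcomplete dictionaries:

```python
import numpy as np

# Build a toy dictionary of 16 orthonormal atoms of length 64.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.normal(size=(64, 16)))
D = Q.T

# A signal that is exactly 2-sparse in this dictionary.
signal = 3.0 * D[5] - 2.0 * D[11]

residual = signal.copy()
picked = []
for _ in range(2):
    corr = D @ residual                 # correlation with every atom
    k = int(np.argmax(np.abs(corr)))
    picked.append((k, float(corr[k])))  # (atom position, coefficient)
    residual = residual - corr[k] * D[k]

print(sorted(k for k, _ in picked))     # [5, 11]
print(float(np.linalg.norm(residual)))  # ~0 for an orthonormal dictionary
```

    The encoder's job, as studied in the paper, is then to spend bits efficiently on the stream of (position, coefficient) pairs, exploiting the correlation between the two.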

  1. Research in Computational Aeroscience Applications Implemented on Advanced Parallel Computing Systems

    NASA Technical Reports Server (NTRS)

    Wigton, Larry

    1996-01-01

    Improvements to the numerical linear algebra routines for use in new Navier-Stokes codes, specifically Tim Barth's unstructured grid code, with spin-offs to TRANAIR, are reported. A fast distance calculation routine for Navier-Stokes codes using the new one-equation turbulence models was written. The primary focus of this work was devoted to improving matrix-iterative methods. New algorithms have been developed which activate the full potential of classical Cray-class computers as well as distributed-memory parallel computers.

  2. Current and anticipated use of thermal-hydraulic codes for BWR transient and accident analyses in Japan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arai, Kenji; Ebata, Shigeo

    1997-07-01

    This paper summarizes the current and anticipated use of the thermal-hydraulic and neutronic codes for BWR transient and accident analyses in Japan. The codes may be categorized into licensing codes and best estimate codes for BWR transient and accident analyses. Most of the licensing codes were originally developed by General Electric. Some codes have been updated based on the technical knowledge obtained in thermal-hydraulic studies in Japan, and according to BWR design changes. The best estimate codes have been used to support the licensing calculations and to obtain a phenomenological understanding of the thermal-hydraulic phenomena during a BWR transient or accident. The best estimate codes can also be applied to a design study for a next-generation BWR to which the current licensing model may not be directly applied. In order to rationalize the margin included in the current BWR design and develop a next-generation reactor with an appropriate design margin, it will be required to improve the accuracy of the thermal-hydraulic and neutronic models. In addition, regarding the current best estimate codes, improvements in the user interface and the numerics will be needed.

  3. Methods of treating complex space vehicle geometry for charged particle radiation transport

    NASA Technical Reports Server (NTRS)

    Hill, C. W.

    1973-01-01

    Current methods of treating complex geometry models for space radiation transport calculations are reviewed. The geometric techniques used in three computer codes are outlined. Evaluations of geometric capability and speed are provided for these codes. Although no code development work is included, several suggestions for significantly improving complex geometry codes are offered.

  4. AspectAssay: A Technique for Expanding the Pool of Available Aspect Mining Test Data Using Concern Seeding

    ERIC Educational Resources Information Center

    Moore, David G., Jr.

    2013-01-01

    Aspect-oriented software design (AOSD) enables better and more complete separation of concerns in software-intensive systems. By extracting aspect code and relegating crosscutting functionality to aspects, software engineers can improve the maintainability of their code by reducing code tangling and coupling of code concerns. Further, the number…

  5. 75 FR 20870 - Small Business Size Standards: Waiver of the Nonmanufacturer Rule

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-21

    ... for Improved Outer Tactical Vests and related accessories under Product Service Code (PSC) 8470 (Armor... granting a waiver of the Nonmanufacturer Rule for Improved Outer Tactical Vests. According to a request, no... Outer Tactical Vests and related accessories under NAICS code 339113, Surgical Appliance and Supplies...

  6. To make improvements in the enactment of title 41, United States Code, into a positive law title and to improve the Code.

    THOMAS, 112th Congress

    Rep. Smith, Lamar [R-TX-21

    2012-07-09

    Senate - 09/12/2012 Received in the Senate and Read twice and referred to the Committee on the Judiciary.

  7. Language Recognition via Sparse Coding

    DTIC Science & Technology

    2016-09-08

    a posteriori (MAP) adaptation scheme that further optimizes the discriminative quality of sparse-coded speech features. We empirically validate the...significantly improve the discriminative quality of sparse-coded speech features. In Section 4, we evaluate the proposed approaches against an i-vector

  8. National Combustion Code: Parallel Implementation and Performance

    NASA Technical Reports Server (NTRS)

    Quealy, A.; Ryder, R.; Norris, A.; Liu, N.-S.

    2000-01-01

    The National Combustion Code (NCC) is being developed by an industry-government team for the design and analysis of combustion systems. CORSAIR-CCD is the current baseline reacting flow solver for NCC. This is a parallel, unstructured grid code which uses a distributed memory, message passing model for its parallel implementation. The focus of the present effort has been to improve the performance of the NCC flow solver to meet combustor designer requirements for model accuracy and analysis turnaround time. Improving the performance of this code contributes significantly to the overall reduction in time and cost of the combustor design cycle. This paper describes the parallel implementation of the NCC flow solver and summarizes its current parallel performance on an SGI Origin 2000. Earlier parallel performance results on an IBM SP-2 are also included. The performance improvements which have enabled a turnaround of less than 15 hours for a 1.3 million element fully reacting combustion simulation are described.

  9. Parallelization of ARC3D with Computer-Aided Tools

    NASA Technical Reports Server (NTRS)

    Jin, Haoqiang; Hribar, Michelle; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    A series of efforts have been devoted to investigating methods of porting and parallelizing applications quickly and efficiently for new architectures, such as the SGI Origin 2000 and Cray T3E. This report presents the parallelization of a CFD application, ARC3D, using the computer-aided tools CAPTools. Steps in parallelizing this code and the requirements for achieving better performance are discussed. The generated parallel version has achieved reasonably good performance, for example, a speedup of 30 for 36 Cray T3E processors. However, this performance could not be obtained without modification of the original serial code. It is suggested that in many cases improving the serial code and performing necessary code transformations are important parts of the automated parallelization process, although user intervention in many of these parts is still necessary. Nevertheless, the development and improvement of useful software tools, such as CAPTools, can help trim down many tedious parallelization details and improve processing efficiency.
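
    As a sanity check on the figures quoted above, a speedup of 30 on 36 processors corresponds to a parallel efficiency of about 83%:

```python
# Parallel efficiency implied by the reported speedup:
# efficiency = speedup / processor count.
def parallel_efficiency(speedup, nproc):
    return speedup / nproc

print(round(parallel_efficiency(30, 36), 2))  # 0.83
```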

  10. Identifying Pediatric Severe Sepsis and Septic Shock: Accuracy of Diagnosis Codes.

    PubMed

    Balamuth, Fran; Weiss, Scott L; Hall, Matt; Neuman, Mark I; Scott, Halden; Brady, Patrick W; Paul, Raina; Farris, Reid W D; McClead, Richard; Centkowski, Sierra; Baumer-Mouradian, Shannon; Weiser, Jason; Hayes, Katie; Shah, Samir S; Alpern, Elizabeth R

    2015-12-01

    To evaluate the accuracy of 2 established administrative methods of identifying children with sepsis using a medical record review reference standard. Multicenter retrospective study at 6 US children's hospitals. Subjects were children >60 days to <19 years of age, identified in 4 groups based on International Classification of Diseases, Ninth Revision, Clinical Modification codes: (1) severe sepsis/septic shock (sepsis codes); (2) infection plus organ dysfunction (combination codes); (3) subjects without codes for infection, organ dysfunction, or severe sepsis; and (4) infection but not severe sepsis or organ dysfunction. Combination codes were allowed, but not required, within the sepsis codes group. We determined the presence of reference standard severe sepsis according to consensus criteria. Logistic regression was performed to determine whether the addition of codes for sepsis therapies improved case identification. A total of 130 out of 432 subjects met the reference standard for severe sepsis. Sepsis codes had sensitivity 73% (95% CI 70-86), specificity 92% (95% CI 87-95), and positive predictive value 79% (95% CI 70-86). Combination codes had sensitivity 15% (95% CI 9-22), specificity 71% (95% CI 65-76), and positive predictive value 18% (95% CI 11-27). Slight improvements in model characteristics were observed when codes for vasoactive medications and endotracheal intubation were added to sepsis codes (c-statistic 0.83 vs 0.87, P = .008). Sepsis-specific International Classification of Diseases, Ninth Revision, Clinical Modification codes identify pediatric patients with severe sepsis in administrative data more accurately than a combination of codes for infection plus organ dysfunction. Copyright © 2015 Elsevier Inc. All rights reserved.
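
    The reported sensitivity, specificity, and positive predictive value follow from a standard 2x2 confusion matrix. The counts below are hypothetical, chosen only to be consistent with the study's 432 subjects and 130 reference-standard cases; they reproduce the quoted point estimates:

```python
# Screening metrics from a 2x2 confusion matrix. The counts are
# hypothetical reconstructions, not the study's actual cell values.
def metrics(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)  # fraction of true cases detected
    specificity = tn / (tn + fp)  # fraction of non-cases correctly excluded
    ppv = tp / (tp + fp)          # fraction of flagged cases that are true
    return sensitivity, specificity, ppv

sens, spec, ppv = metrics(tp=95, fp=25, fn=35, tn=277)  # 95+35 = 130 cases
print(round(sens, 2), round(spec, 2), round(ppv, 2))    # 0.73 0.92 0.79
```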

  11. Auto-Coding UML Statecharts for Flight Software

    NASA Technical Reports Server (NTRS)

    Benowitz, Edward G; Clark, Ken; Watney, Garth J.

    2006-01-01

    Statecharts have been used as a means to communicate behaviors in a precise manner between system engineers and software engineers. Hand-translating a statechart to code, as done on some previous space missions, introduces the possibility of errors in the transformation from chart to code. To improve auto-coding, we have developed a process that generates flight code from UML statecharts. Our process is being used for the flight software on the Space Interferometer Mission (SIM).
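    The core artifact such an autocoder emits is table-driven transition logic. A toy sketch of that idea, with invented states and events (not the SIM flight software):

```python
# Toy table-driven state machine of the kind a statechart autocoder
# generates; the states and events here are invented for illustration.

TRANSITIONS = {
    ("IDLE", "start"): "ACQUIRING",
    ("ACQUIRING", "lock"): "TRACKING",
    ("ACQUIRING", "fault"): "SAFE",
    ("TRACKING", "fault"): "SAFE",
}

def step(state, event):
    """Advance the machine; unknown (state, event) pairs leave it unchanged."""
    return TRANSITIONS.get((state, event), state)

state = "IDLE"
for event in ["start", "lock", "fault"]:
    state = step(state, event)
print(state)  # SAFE
```

    Because the table is generated mechanically from the chart, the hand-translation errors mentioned above have no opportunity to creep in.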

  12. Additional Improvements to the NASA Lewis Ice Accretion Code LEWICE

    NASA Technical Reports Server (NTRS)

    Wright, William B.; Bidwell, Colin S.

    1995-01-01

    Due to the feedback of the user community, three major features have been added to the NASA Lewis ice accretion code LEWICE. These features include: first, further improvements to the numerics of the code so that more time steps can be run and so that the code is more stable; second, inclusion and refinement of the roughness prediction model described in an earlier paper; third, inclusion of multi-element trajectory and ice accretion capabilities to LEWICE. This paper will describe each of these advancements in full and make comparisons with the experimental data available. Further refinement of these features and inclusion of additional features will be performed as more feedback is received.

  13. Colour coding scrubs as a means of improving perioperative communication.

    PubMed

    Litak, Dominika

    2011-05-01

    Effective communication within the operating department is essential for achieving patient safety. A large part of the perioperative communication is non-verbal. One type of non-verbal communication is 'object communication', the most common form of which is clothing. The colour coding of clothing such as scrubs has the potential to optimise perioperative communication with the patients and between the staff. A colour contains a coded message, and is a visual cue for an immediate identification of personnel. This is of key importance in the perioperative environment. The idea of colour coded scrubs in the perioperative setting has not been much explored to date and, given the potential contribution towards improvement of patient outcomes, deserves consideration.

  14. Light transport feature for SCINFUL.

    PubMed

    Etaati, G R; Ghal-Eh, N

    2008-03-01

    An extended version of the scintillator response function prediction code SCINFUL has been developed by incorporating PHOTRACK, a Monte Carlo light transport code. Comparisons of calculated and experimental results for organic scintillators exposed to neutrons show that the extended code improves the predictive capability of SCINFUL.

  15. CBP TOOLBOX VERSION 2.0: CODE INTEGRATION ENHANCEMENTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, F.; Flach, G.; BROWN, K.

    2013-06-01

    This report describes enhancements made to code integration aspects of the Cementitious Barriers Project (CBP) Toolbox as a result of development work performed at the Savannah River National Laboratory (SRNL) in collaboration with Vanderbilt University (VU) in the first half of fiscal year 2013. Code integration refers to the interfacing to standalone CBP partner codes, used to analyze the performance of cementitious materials, with the CBP Software Toolbox. The most significant enhancements are: 1) Improved graphical display of model results. 2) Improved error analysis and reporting. 3) Increase in the default maximum model mesh size from 301 to 501 nodes. 4) The ability to set the LeachXS/Orchestra simulation times through the GoldSim interface. These code interface enhancements have been included in a new release (Version 2.0) of the CBP Toolbox.

  16. A New Image Encryption Technique Combining Hill Cipher Method, Morse Code and Least Significant Bit Algorithm

    NASA Astrophysics Data System (ADS)

    Nofriansyah, Dicky; Defit, Sarjon; Nurcahyo, Gunadi W.; Ganefri, G.; Ridwan, R.; Saleh Ahmar, Ansari; Rahim, Robbi

    2018-01-01

    Cybercrime is one of the most serious threats to data. One effort to reduce cybercrime is to find new techniques for securing data, such as combinations of cryptography, steganography, and watermarking. Cryptography and steganography are growing areas of data security science, and combining them is one way to improve data integrity. The new technique combines several algorithms, among them the Hill cipher method and Morse code. Morse code is one of the communication codes used in the Scouting field and consists of dots and dashes. Together these form a modern twist on a classic concept for maintaining data integrity. The combination of these three methods is expected to yield a new algorithm that improves the security of data, especially images.
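    A minimal sketch of the first two stages of such a scheme: a 2x2 Hill cipher over the 26-letter alphabet whose ciphertext is then rendered as Morse code. The key matrix is an example (any matrix invertible mod 26 works), not the authors' key; the Morse table is truncated to the letters needed here, and the LSB steganography stage is omitted.

```python
# Example only: 2x2 Hill cipher followed by Morse encoding of the
# ciphertext. Key matrix and message are illustrative, not the paper's.

KEY = [[3, 3], [2, 5]]  # invertible mod 26: det = 9, gcd(9, 26) = 1
MORSE = {"A": ".-", "H": "....", "I": "..", "T": "-"}  # truncated table

def hill_encrypt(plaintext):
    nums = [ord(c) - ord("A") for c in plaintext.upper()]
    out = []
    for i in range(0, len(nums), 2):            # encrypt letter pairs
        x, y = nums[i], nums[i + 1]
        out.append((KEY[0][0] * x + KEY[0][1] * y) % 26)
        out.append((KEY[1][0] * x + KEY[1][1] * y) % 26)
    return "".join(chr(n + ord("A")) for n in out)

cipher = hill_encrypt("HELP")                   # -> "HIAT" with this key
print(" ".join(MORSE[c] for c in cipher))       # .... .. .- -
```

    Decryption inverts the key matrix mod 26; the LSB stage would then hide the dot/dash stream in the low-order bits of image pixels.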

  17. Improved Helicopter Rotor Performance Prediction through Loose and Tight CFD/CSD Coupling

    NASA Astrophysics Data System (ADS)

    Ickes, Jacob C.

    Helicopters and other Vertical Take-Off and Landing (VTOL) vehicles exhibit an interesting combination of structural dynamic and aerodynamic phenomena which together drive the rotor performance. The combination of factors involved makes simulating the rotor a challenging and multidisciplinary effort, and one which is still an active area of interest in the industry because of the money and time it could save during design. Modern tools allow the prediction of rotorcraft physics from first principles. Analysis of the rotor system with this level of accuracy provides the understanding necessary to improve its performance. There has historically been a divide between the comprehensive codes, which perform aeroelastic rotor simulations using simplified aerodynamic models, and the very computationally intensive Navier-Stokes Computational Fluid Dynamics (CFD) solvers. As computer resources become more available, efforts have been made to replace the simplified aerodynamics of the comprehensive codes with the more accurate results from a CFD code. The objective of this work is to perform aeroelastic rotorcraft analysis using first-principles simulations for both fluid and structural predictions using tools available at the University of Toledo. Two separate codes are coupled together in both loose coupling (data exchange on a periodic interval) and tight coupling (data exchange each time step) schemes. To allow the coupling to be carried out in a reliable and efficient way, a Fluid-Structure Interaction code was developed which automatically performs the primary functions of the loose and tight coupling procedures. Flow phenomena such as transonics, dynamic stall, locally reversed flow on a blade, and Blade-Vortex Interaction (BVI) were simulated in this work. Results of the analysis show aerodynamic load improvement due to the inclusion of the CFD-based airloads in the structural dynamics analysis of the Computational Structural Dynamics (CSD) code. Improvements came in the form of improved peak/trough magnitude prediction, better phase prediction of these locations, and a predicted signal with a frequency content more like the flight test data than the CSD code acting alone. Additionally, a tight coupling analysis was performed as a demonstration of the capability and unique aspects of such an analysis. This work shows that away from the center of the flight envelope, the aerodynamic modeling of the CSD code can be replaced with a more accurate set of predictions from a CFD code with an improvement in the aerodynamic results. The better predictions come at substantially increased computational costs of between 1,000 and 10,000 processor-hours.
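    The distinction between the two exchange schedules can be sketched schematically. The two one-line "solvers" below are invented stand-ins, not the CFD/CSD codes of the thesis; the point is only the structure of the exchange loops: tight coupling trades loads and deflections every time step, loose coupling once per interval (e.g., per rotor revolution).

```python
# Schematic contrast of loose vs tight coupling; the solvers are trivial
# placeholders (linear maps) standing in for real CFD and CSD codes.

def cfd_airloads(deflection):
    return 1.0 + 0.1 * deflection          # placeholder aerodynamic solver

def csd_deflection(airload):
    return 0.5 * airload                   # placeholder structural solver

def tight_coupling(steps):
    load, defl = 1.0, 0.0
    for _ in range(steps):                 # exchange every time step
        load = cfd_airloads(defl)
        defl = csd_deflection(load)
    return defl

def loose_coupling(exchanges, steps_per_exchange):
    load, defl = 1.0, 0.0
    for _ in range(exchanges):             # exchange once per interval
        for _ in range(steps_per_exchange):
            load = cfd_airloads(defl)      # CFD marches with frozen motion
        defl = csd_deflection(load)        # CSD updated after the interval
    return defl

print(round(tight_coupling(100), 4), round(loose_coupling(10, 10), 4))
```

    With these contractive placeholder maps both schemes converge to the same coupled fixed point; in practice the schedules differ in cost and in how well they capture transient interaction.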

  18. You've Written a Cool Astronomy Code! Now What Do You Do with It?

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Accomazzi, A.; Berriman, G. B.; DuPrie, K.; Hanisch, R. J.; Mink, J. D.; Nemiroff, R. J.; Shamir, L.; Shortridge, K.; Taylor, M. B.; Teuben, P. J.; Wallin, J. F.

    2014-01-01

    Now that you've written a useful astronomy code for your soon-to-be-published research, you have to figure out what you want to do with it. Our suggestion? Share it! This presentation highlights the means and benefits of sharing your code. Make your code citable -- submit it to the Astrophysics Source Code Library and have it indexed by ADS! The Astrophysics Source Code Library (ASCL) is a free online registry of source codes of interest to astronomers and astrophysicists. With over 700 codes, it is continuing its rapid growth, with an average of 17 new codes a month. The editors seek out codes for inclusion; indexing by ADS improves the discoverability of codes and provides a way to cite codes as separate entries, especially codes without papers that describe them.

  19. LDPC coded OFDM over the atmospheric turbulence channel.

    PubMed

    Djordjevic, Ivan B; Vasic, Bane; Neifeld, Mark A

    2007-05-14

    Low-density parity-check (LDPC) coded optical orthogonal frequency division multiplexing (OFDM) is shown to significantly outperform LDPC coded on-off keying (OOK) over the atmospheric turbulence channel in terms of both coding gain and spectral efficiency. In the regime of strong turbulence at a bit-error rate of 10⁻⁵, the coding gain improvement of the LDPC coded single-sideband unclipped-OFDM system with 64 sub-carriers is larger than the coding gain of the LDPC coded OOK system by 20.2 dB for quadrature-phase-shift keying (QPSK) and by 23.4 dB for binary-phase-shift keying (BPSK).

  20. TRIAD IV: Nationwide Survey of Medical Students' Understanding of Living Wills and DNR Orders.

    PubMed

    Mirarchi, Ferdinando L; Ray, Matthew; Cooney, Timothy

    2016-12-01

    Living wills are a form of advance directives that help to protect patient autonomy. They are frequently encountered in the conduct of medicine. Because of their impact on care, it is important to understand the adequacy of current medical school training in the preparation of physicians to interpret these directives. Between April and August 2011, third- and fourth-year medical students participated in an internet survey involving the interpretation of living wills. The survey presented a standard living will as a "stand-alone," a standard living will with the addition of an emergent clinical scenario, and then variations of the standard living will that included a code status designation ("DNR," "Full Code," or "Comfort Care"). For each version/scenario, respondents were asked to assign a code status and choose interventions based on the cases presented. Four hundred twenty-five students from medical schools throughout the country responded. The majority indicated they had received some form of advance directive training and understood the concept of code status and the term "DNR." Based on a stand-alone document, 15% of respondents correctly denoted "full code" as the appropriate code status; adding a clinical scenario yielded negligible improvement. When a code designation was added to the living will, correct code status responses ranged from 68% to 93%, whereas correct treatment decisions ranged from 18% to 78%. Previous training in advance directives had no impact on these results. Our data indicate that the majority of students failed to understand the key elements of a living will; adding a code status designation improved correct responses with the exception of the term DNR. Misunderstanding of advance directives is a nationwide problem and jeopardizes patient safety. Medical school ethics curricula need to be improved to ensure competency with respect to understanding advance directives.

  1. The impact of three discharge coding methods on the accuracy of diagnostic coding and hospital reimbursement for inpatient medical care.

    PubMed

    Tsopra, Rosy; Peckham, Daniel; Beirne, Paul; Rodger, Kirsty; Callister, Matthew; White, Helen; Jais, Jean-Philippe; Ghosh, Dipansu; Whitaker, Paul; Clifton, Ian J; Wyatt, Jeremy C

    2018-07-01

    Coding of diagnoses is important for patient care, hospital management and research. However coding accuracy is often poor and may reflect methods of coding. This study investigates the impact of three alternative coding methods on the inaccuracy of diagnosis codes and hospital reimbursement. Comparisons of coding inaccuracy were made between a list of coded diagnoses obtained by a coder using (i) the discharge summary alone, (ii) case notes and discharge summary, and (iii) discharge summary with the addition of medical input. For each method, inaccuracy was determined for the primary and secondary diagnoses, Healthcare Resource Group (HRG) and estimated hospital reimbursement. These data were then compared with a gold standard derived by a consultant and coder. 107 consecutive patient discharges were analysed. Inaccuracy of diagnosis codes was highest when a coder used the discharge summary alone, and decreased significantly when the coder used the case notes (70% vs 58% respectively, p < 0.0001) or coded from the discharge summary with medical support (70% vs 60% respectively, p < 0.0001). When compared with the gold standard, the percentage of incorrect HRGs was 42% for discharge summary alone, 31% for coding with case notes, and 35% for coding with medical support. The three coding methods resulted in an annual estimated loss of hospital remuneration of between £1.8 M and £16.5 M. The accuracy of diagnosis codes and percentage of correct HRGs improved when coders used either case notes or medical support in addition to the discharge summary. Further emphasis needs to be placed on improving the standard of information recorded in discharge summaries. Copyright © 2018 Elsevier B.V. All rights reserved.

  2. Fast transform decoding of nonsystematic Reed-Solomon codes

    NASA Technical Reports Server (NTRS)

    Truong, T. K.; Cheung, K.-M.; Reed, I. S.; Shiozaki, A.

    1989-01-01

    A Reed-Solomon (RS) code is considered to be a special case of a redundant residue polynomial (RRP) code, and a fast transform decoding algorithm to correct both errors and erasures is presented. This decoding scheme is an improvement of the decoding algorithm for the RRP code suggested by Shiozaki and Nishida, and can be realized readily on very large scale integration chips.

  3. Adaptive format conversion for scalable video coding

    NASA Astrophysics Data System (ADS)

    Wan, Wade K.; Lim, Jae S.

    2001-12-01

    The enhancement layer in many scalable coding algorithms is composed of residual coding information. There is another type of information that can be transmitted instead of (or in addition to) residual coding. Since the encoder has access to the original sequence, it can utilize adaptive format conversion (AFC) to generate the enhancement layer and transmit the different format conversion methods as enhancement data. This paper investigates the use of adaptive format conversion information as enhancement data in scalable video coding. Experimental results are shown for a wide range of base layer qualities and enhancement bitrates to determine when AFC can improve video scalability. Since the amount of information needed for AFC is small compared to residual coding, AFC can provide video scalability at low enhancement layer bitrates that are not possible with residual coding. In addition, AFC can also be used in combination with residual coding to improve video scalability at higher enhancement layer bitrates. Adaptive format conversion has not been studied in detail, but many scalable applications may benefit from it. An example of an application that AFC is well-suited for is the migration path for digital television, where AFC can provide immediate video scalability as well as assist future migrations.

  4. Using Inspections to Improve the Quality of Product Documentation and Code.

    ERIC Educational Resources Information Center

    Zuchero, John

    1995-01-01

    Describes how, by adapting software inspections to assess documentation and code, technical writers can collaborate with development personnel, editors, and customers to dramatically improve both the quality of documentation and the very process of inspecting that documentation. Notes that the five steps involved in the inspection process are:…

  5. High-speed architecture for the decoding of trellis-coded modulation

    NASA Technical Reports Server (NTRS)

    Osborne, William P.

    1992-01-01

    Since 1971, when the Viterbi Algorithm was introduced as the optimal method of decoding convolutional codes, improvements in circuit technology, especially VLSI, have steadily increased its speed and practicality. Trellis-Coded Modulation (TCM) combines convolutional coding with higher level modulation (non-binary source alphabet) to provide forward error correction and spectral efficiency. For binary codes, the current state-of-the-art is a 64-state Viterbi decoder on a single CMOS chip, operating at a data rate of 25 Mbps. Recently, there has been an interest in increasing the speed of the Viterbi Algorithm by improving the decoder architecture, or by reducing the algorithm itself. Designs employing new architectural techniques are now in existence, however these techniques are currently applied to simpler binary codes, not to TCM. The purpose of this report is to discuss TCM architectural considerations in general, and to present the design, at the logic gate level, of a specific TCM decoder which applies these considerations to achieve high-speed decoding.
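    The add-compare-select recursion at the heart of such decoders can be sketched in software. The example below is a hard-decision Viterbi decoder for the standard rate-1/2, constraint-length-3 convolutional code (generator polynomials 7 and 5 octal); it illustrates the algorithm the report builds hardware for, not the report's TCM decoder design itself.

```python
# Hard-decision Viterbi decoding of the rate-1/2, K=3 convolutional code
# (generators 7, 5 octal). Toy software illustration of the algorithm.

def encode(bits):
    s1 = s2 = 0                          # two-stage shift register
    out = []
    for b in bits:
        out += [b ^ s1 ^ s2, b ^ s2]     # generator taps 111 and 101
        s1, s2 = b, s1
    return out

def viterbi_decode(received, n_bits):
    INF = float("inf")
    states = [(0, 0), (0, 1), (1, 0), (1, 1)]
    metric = {s: 0 if s == (0, 0) else INF for s in states}
    paths = {s: [] for s in states}
    for t in range(n_bits):
        r0, r1 = received[2 * t], received[2 * t + 1]
        new_metric = {s: INF for s in states}
        new_paths = {s: [] for s in states}
        for (s1, s2), m in metric.items():
            if m == INF:
                continue
            for b in (0, 1):             # hypothesize the input bit
                o0, o1 = b ^ s1 ^ s2, b ^ s2
                cand = m + (o0 != r0) + (o1 != r1)  # Hamming branch metric
                ns = (b, s1)             # add-compare-select per next state
                if cand < new_metric[ns]:
                    new_metric[ns] = cand
                    new_paths[ns] = paths[(s1, s2)] + [b]
        metric, paths = new_metric, new_paths
    best = min(metric, key=metric.get)   # survivor with the best metric
    return paths[best]

message = [1, 0, 1, 1, 0, 0]             # last two zeros flush the encoder
coded = encode(message)
coded[3] ^= 1                            # inject a single channel bit error
print(viterbi_decode(coded, len(message)) == message)  # True
```

    The hardware designs discussed in the report pipeline exactly this add-compare-select step, one unit per trellis state.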

  6. Building Energy Codes: Policy Overview and Good Practices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cox, Sadie

    2016-02-19

    Globally, 32% of total final energy consumption is attributed to the building sector. To reduce energy consumption, energy codes set minimum energy efficiency standards for the building sector. With effective implementation, building energy codes can support energy cost savings and complementary benefits associated with electricity reliability, air quality improvement, greenhouse gas emission reduction, increased comfort, and economic and social development. This policy brief seeks to support building code policymakers and implementers in designing effective building code programs.

  7. Crosstalk eliminating and low-density parity-check codes for photochromic dual-wavelength storage

    NASA Astrophysics Data System (ADS)

    Wang, Meicong; Xiong, Jianping; Jian, Jiqi; Jia, Huibo

    2005-01-01

    Multi-wavelength storage is an approach to increase the memory density, with the problem of crosstalk to be dealt with. We apply Low Density Parity Check (LDPC) codes as error-correcting codes in photochromic dual-wavelength optical storage, based on an investigation of LDPC codes in optical data storage. A proper method is applied to reduce the crosstalk, and simulation results show that this operation is useful for improving Bit Error Rate (BER) performance. At the same time we can conclude that LDPC codes outperform RS codes in the crosstalk channel.

  8. Quicklook overview of model changes in Melcor 2.2: Rev 6342 to Rev 9496

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humphries, Larry L.

    2017-05-01

    MELCOR 2.2 is a significant official release of the MELCOR code with many new models and model improvements. This report provides the code user with a quick review and characterization of new models added, changes to existing models, the effect of code changes during this code development cycle (rev 6342 to rev 9496), and a preview of validation results with this code version. More detailed information is found in the code Subversion logs as well as the User Guide and Reference Manuals.

  9. Refactoring and Its Benefits

    NASA Astrophysics Data System (ADS)

    Veerraju, R. P. S. P.; Rao, A. Srinivasa; Murali, G.

    2010-10-01

    Refactoring is a disciplined technique for restructuring an existing body of code, altering its internal structure without changing its external behavior. It improves internal code structure without altering external functionality by transforming functions and rethinking algorithms, and it is an iterative process. Refactoring techniques include reducing scope, replacing complex instructions with simpler or built-in instructions, and combining multiple statements into one statement. Code transformed with refactoring techniques is faster to change, execute, and download. Refactoring is an excellent best practice to adopt for programmers wanting to improve their productivity. It is similar to performance optimization, which is also a behavior-preserving transformation. It also helps us find bugs when we are trying to fix a bug in difficult-to-understand code: by cleaning things up, we make it easier to expose the bug. Refactoring improves the quality of application design and implementation. In general, there are three cases concerning refactoring: iterative refactoring, refactoring when necessary, and not refactoring. Martin Fowler identifies four key reasons to refactor: refactoring improves the design of software, makes software easier to understand, helps us find bugs, and helps us program faster. There is an additional benefit of refactoring: it changes the way a developer thinks about the implementation even when not refactoring. There are three types of refactoring. 1) Code refactoring: often referred to simply as refactoring, this is the refactoring of programming source code. 2) Database refactoring: a simple change to a database schema that improves its design while retaining both its behavioral and informational semantics. 3) User interface (UI) refactoring: a simple change to the UI which retains its semantics. Finally, the benefits of refactoring are that it improves the design of software, makes software easier to understand, cleans up the code, helps us find bugs, and helps us program faster.
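    A small before/after illustration of the behavior-preserving transformations described above: the two functions return identical results for every input, but the refactored version replaces the index loop and temporary flag with a built-in.

```python
# Before/after refactoring example; external behavior is unchanged.

def has_negative_before(values):
    found = False
    for i in range(0, len(values)):   # index loop and mutable flag
        v = values[i]
        if v < 0:
            found = True
    return found

def has_negative_after(values):
    return any(v < 0 for v in values)  # simpler, same external behavior

sample = [3, 0, -2, 7]
print(has_negative_before(sample), has_negative_after(sample))  # True True
```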

  11. NR-code: Nonlinear reconstruction code

    NASA Astrophysics Data System (ADS)

    Yu, Yu; Pen, Ue-Li; Zhu, Hong-Ming

    2018-04-01

    NR-code applies nonlinear reconstruction to the dark matter density field in redshift space and solves for the nonlinear mapping from the initial Lagrangian positions to the final redshift space positions; this reverses the large-scale bulk flows and improves the precision measurement of the baryon acoustic oscillations (BAO) scale.

  12. The radiation environment on the surface of Mars - Summary of model calculations and comparison to RAD data

    NASA Astrophysics Data System (ADS)

    Matthiä, Daniel; Hassler, Donald M.; de Wet, Wouter; Ehresmann, Bent; Firan, Ana; Flores-McLaughlin, John; Guo, Jingnan; Heilbronn, Lawrence H.; Lee, Kerry; Ratliff, Hunter; Rios, Ryan R.; Slaba, Tony C.; Smith, Michael; Stoffle, Nicholas N.; Townsend, Lawrence W.; Berger, Thomas; Reitz, Günther; Wimmer-Schweingruber, Robert F.; Zeitlin, Cary

    2017-08-01

    The radiation environment at the Martian surface is, apart from occasional solar energetic particle events, dominated by galactic cosmic radiation, secondary particles produced in their interaction with the Martian atmosphere and albedo particles from the Martian regolith. The highly energetic primary cosmic radiation consists mainly of fully ionized nuclei creating a complex radiation field at the Martian surface. This complex field, its formation and its potential health risk posed to astronauts on future manned missions to Mars can only be fully understood using a combination of measurements and model calculations. In this work the outcome of a workshop held in June 2016 in Boulder, CO, USA is presented: experimental results from the Radiation Assessment Detector of the Mars Science Laboratory are compared to model results from GEANT4, HETC-HEDS, HZETRN, MCNP6, and PHITS. Charged and neutral particle spectra and dose rates measured between 15 November 2015 and 15 January 2016 and model results calculated for this time period are investigated.

  13. Spallation reaction study for the long-lived fission product 107Pd

    NASA Astrophysics Data System (ADS)

    Wang, He; Otsu, Hideaki; Sakurai, Hiroyoshi; Ahn, DeukSoon; Aikawa, Masayuki; Ando, Takashi; Araki, Shouhei; Chen, Sidong; Nobuyuki, Chiga; Doornenbal, Pieter; Fukuda, Naoki; Isobe, Tadaaki; Kawakami, Shunsuke; Kawase, Shoichiro; Kin, Tadahiro; Kondo, Yosuke; Koyama, Shunpei; Kubono, Shigeru; Maeda, Yukie; Makinaga, Ayano; Matsushita, Masafumi; Matsuzaki, Teiichiro; Michimasa, Shin'ichiro; Momiyama, Satoru; Nagamine, Shunsuke; Nakamura, Takashi; Nakano, Keita; Niikura, Megumi; Ozaki, Tomoyuki; Saito, Atsumi; Saito, Takeshi; Shiga, Yoshiaki; Shikata, Mizuki; Shimizu, Yohei; Shimoura, Susumu; Sumikama, Toshiyuki; Söderström, Pär-Anders; Suzuki, Hiroshi; Takeda, Hiroyuki; Takeuchi, Satoshi; Taniuchi, Ryo; Togano, Yasuhiro; Tsubota, Junichi; Uesaka, Meiko; Watanabe, Yasushi; Watanabe, Yukinobu; Wimmer, Kathrin; Yamamoto, Tatsuya; Yoshida, Koichi

    2017-02-01

    Spallation reactions for the long-lived fission product 107Pd have been studied for the purpose of nuclear waste transmutation. The cross sections on the proton- and deuteron-induced spallation were obtained at 196 and 118 MeV/nucleon in inverse kinematics at the RIKEN Radioactive Isotope Beam Factory. Both the target and energy dependences of cross sections have been investigated systematically. It was found that the proton-induced cross sections at 196 MeV/nucleon are close to those for deuteron obtained at 118 MeV/nucleon for the light-mass products. The experimental data are compared with the SPACS semi-empirical parameterization and the PHITS calculations including both the intranuclear cascade and evaporation processes. Our data give a design goal of proton/deuteron flux for the transmutation of 107Pd using the spallation reaction. In addition, it is found that the spallation reaction at 118 MeV/nucleon may have an advantage for 107Pd transmutation because of the low production of other long-lived radioactive isotopes.

  14. SU-E-T-412: Evaluation of Tungsten-Based Functional Paper for Attenuation Device in Intraoperative Radiotherapy for Breast Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamomae, T; Monzen, H; Okudaira, K

    Purpose: Intraoperative radiotherapy (IORT) with an electron beam is one of the accelerated partial breast irradiation methods that have recently been used in early-stage breast cancer. A protective acrylic resin-copper disk is inserted between the breast tissue and the pectoralis muscle to minimize the dose received by the posterior structures. However, a problem with this protective disk is that the surgical incision must be larger than the field size because the disk is manufactured from stiff and unyielding materials. The purpose of this study was to assess the applicability of a new tungsten-based functional paper (TFP) as an alternative to the existing protective disk in IORT. Methods: The newly introduced TFP (Toppan Printing Co., Ltd., Tokyo, JP) is anticipated to become a useful device that is lead-free, light, flexible, and easily processed. The radiation shielding performance of TFP was verified by experimental measurements and Monte Carlo (MC) simulations using the PHITS code. The doses transmitted through the protective disk or TFP were measured on a Mobetron mobile accelerator. The same geometries were then reproduced, and the dose distributions were simulated by the MC method. Results: The percentages of transmitted dose relative to the absence of the existing protective disk were lower than 2% in both the measurements and MC simulations. In the experimental measurements, the percentages of transmitted dose for a 9 MeV electron beam were 48.1, 2.3, and 0.6% with TFP thicknesses of 1.9, 3.7, and 7.4 mm, respectively. The percentages for a 12 MeV beam were 76.0, 49.3, 20.0, and 5.5% with TFP thicknesses of 1.9, 3.7, 7.4, and 14.8 mm, respectively. The results of the MC simulation showed a slight dose increase at the incident surface of the TFP caused by backscattered radiation. Conclusion: The results indicate that a small-incision procedure may be possible by the use of TFP.

  15. Energy spectrum and dose enhancement due to the depth of the Lipiodol position using flattened and unflattened beams.

    PubMed

    Kawahara, Daisuke; Ozawa, Shuichi; Saito, Akito; Kimura, Tomoki; Suzuki, Tatsuhiko; Tsuneda, Masato; Tanaka, Sodai; Hioki, Kazunari; Nakashima, Takeo; Ohno, Yoshimi; Murakami, Yuji; Nagata, Yasushi

    2018-01-01

    Stereotactic body radiotherapy is combined with transarterial chemoembolization, and the Lipiodol used for tumour seeking in transarterial chemoembolization remains present during stereotactic body radiation therapy. In our previous study, we reported the dose enhancement effect in Lipiodol with 10× flattening-filter-free (FFF). The objective of our study was to evaluate the dose enhancement and energy spectrum of photons and electrons due to the Lipiodol depth with flattened (FF) and FFF beams. FF and FFF 6 MV beams from TrueBeam were used in this study. The Lipiodol (3 × 3 × 3 cm³) was located at depths of 1, 3, 5, 10, 20, and 30 cm in water. The dose enhancement factor (DEF) and the energy fluence were obtained by Monte Carlo calculations with the Particle and Heavy Ion Transport code System (PHITS). The DEFs at the centre of Lipiodol with the FF beam were 6.8, 7.3, 7.6, 7.2, 6.1, and 5.7% and those with the FFF beam were 20.6, 22.0, 21.9, 20.0, 12.3, and 12.1% at depths of 1, 3, 5, 10, 20, and 30 cm, respectively, where Lipiodol was located in water. Moreover, spectrum results showed that more low-energy photons and electrons were present at shallow depths where Lipiodol was located in water. The variation in the low-energy spectrum due to the depth of the Lipiodol position was more explicit with the FFF beam than with the FF beam. The current study revealed variations in the DEF and energy spectrum due to the depth of the Lipiodol position with the FF and FFF beams. Although the FF beam could reduce the effect of energy dependence due to the depth of the Lipiodol position, the dose enhancement was overall small. To cause a large dose enhancement, the FFF beam should be used when the distance from the patient surface to the Lipiodol is within 10 cm.

  16. WE-H-BRA-09: Application of a Modified Microdosimetric-Kinetic Model to Analyze Relative Biological Effectiveness of Ions Relevant to Light Ion Therapy Using the Particle Heavy Ion Transport System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Butkus, M; Palmer, T

    Purpose: To evaluate the dose and biological effectiveness of various ions that could potentially be used for actively scanned particle therapy. Methods: The PHITS Monte Carlo code, paired with a microscopic analytical function, was used to determine probability distribution functions of the lineal energy in 0.3 µm diameter spheres throughout a water phantom. Twenty million primary particles for 1H beams and ten million for 4He, 7Li, 10B, 12C, 14N, 16O, and 20Ne were simulated for 0.6 cm diameter pencil beams. Beam energies corresponding to Bragg peak depths of 50, 100, 150, 200, 250, and 300 mm were used and evaluated transversely every millimeter and radially in annuli with outer radii of 1.0, 2.0, 3.0, 3.2, 3.4, 3.6, 4.0, 5.0, 10.0, 15.0, 20.0, and 25.0 mm. The acquired probability distributions were reduced to dose-mean lineal energies and applied to the modified microdosimetric kinetic model for five different cell types to calculate relative biological effectiveness (RBE) relative to 60Co beams at the 10% survival threshold. The product of the calculated RBEs and the simulated physical dose gave the biological dose, and comparisons were then made between the various ions. Results: Transversely, the 10B beam minimized relative biological dose in both the constant and accelerated dose-change regions proximal to the Bragg peak for all beams traveling farther than 50 mm. For the 50 mm beam, 7Li provided the best biological dose profile. Radially, only small fluctuations (<4.2%) were seen in RBE wherever the physical dose exceeded 1% for all beams. Conclusion: Even with the growing usage of 12C, it may not be the optimal ion in all clinical situations. Boron was calculated to have slightly enhanced RBE characteristics, leading to lower relative biological doses.
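
    The microdosimetric kinetic model step described above can be sketched as follows. This is a simplified illustration, not the paper's modified model: the linear-quadratic alpha is assumed to grow linearly with the dose-mean lineal energy, and every parameter value below is a placeholder rather than a fitted cell-type constant.

```python
import math

def d10(alpha, beta):
    # Dose (Gy) giving 10% survival in the linear-quadratic model
    # S = exp(-alpha*D - beta*D**2): solve beta*D**2 + alpha*D = ln(10).
    return (-alpha + math.sqrt(alpha ** 2 + 4.0 * beta * math.log(10.0))) / (2.0 * beta)

def rbe10(y_d, alpha0=0.13, beta=0.05, d_um=1.0, alpha_ref=0.16):
    # Sketch of the MKM step: alpha grows linearly with the dose-mean lineal
    # energy y_d (keV/um) through the single-event specific energy
    # z_1d = 0.204 * y_d / d**2 (Gy) for a spherical domain of diameter d (um).
    # alpha0, beta, and alpha_ref (the reference-radiation alpha) are
    # placeholders, not the paper's fits.
    z_1d = 0.204 * y_d / d_um ** 2
    alpha_ion = alpha0 + beta * z_1d
    return d10(alpha_ref, beta) / d10(alpha_ion, beta)
```

Under this sketch, a beam with a higher dose-mean lineal energy yields a higher RBE at the 10% survival level, which is the qualitative behavior the analysis above relies on.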

  17. An efficient HZETRN (a galactic cosmic ray transport code)

    NASA Technical Reports Server (NTRS)

    Shinn, Judy L.; Wilson, John W.

    1992-01-01

    An accurate and efficient engineering code for analyzing the shielding requirements against high-energy galactic heavy ions is needed. HZETRN is a deterministic code developed at Langley Research Center that is constantly under improvement in both physics and numerical computation and is targeted for such use. One problem area connected with the space-marching technique used in this code is the propagation of the local truncation error. By improving the numerical algorithms for interpolation, integration, and the grid distribution formula, the efficiency of the code is increased by a factor of eight as the number of energy grid points is reduced. A numerical accuracy of better than 2 percent for a shield thickness of 150 g/cm² is found when a 45-point energy grid is used. The propagating step size, which is related to the perturbation theory, is also reevaluated.

  18. [ENT and head and neck surgery in the German DRG system 2007].

    PubMed

    Franz, D; Roeder, N; Hörmann, K; Alberty, J

    2007-07-01

    The German DRG system has been further developed into version 2007. For ENT and head and neck surgery, significant changes in the coding of diagnoses and medical operations as well as in the DRG structure have been made. New ICD codes for sleep apnoea and acquired tracheal stenosis have been implemented. Surgery on the acoustic meatus, removal of auricular hyaline cartilage for transplantation (e.g., rhinosurgery), and tonsillotomy have been coded in the 2007 version. In addition, the DRG structure has been improved. Case allocation of more than one significant operation has been established. The G-DRG system has gained in complexity. High demands are made on the coding of complex cases, whereas standard cases mostly require only one specific diagnosis and one specific OPS code. The quality of case allocation for ENT patients within the G-DRG system has been improved. Nevertheless, further adjustments of the G-DRG system are necessary.

  19. Benchmarking of Improved DPAC Transient Deflagration Analysis Code

    DOE PAGES

    Laurinat, James E.; Hensel, Steve J.

    2017-09-27

    The deflagration pressure analysis code (DPAC) has been upgraded for use in modeling hydrogen deflagration transients. The upgraded code is benchmarked using data from vented hydrogen deflagration tests conducted at the HYDRO-SC Test Facility at the University of Pisa. DPAC originally was written to calculate peak pressures for deflagrations in radioactive waste storage tanks and process facilities at the Savannah River Site. Upgrades include the addition of a laminar flame speed correlation for hydrogen deflagrations and a mechanistic model for turbulent flame propagation, incorporation of inertial effects during venting, and inclusion of the effect of water vapor condensation on vessel walls. In addition, DPAC has been coupled with Chemical Equilibrium with Applications (CEA), a NASA combustion chemistry code. The deflagration tests are modeled as end-to-end deflagrations. As a result, the improved DPAC code successfully predicts both the peak pressures during the deflagration tests and the times at which the pressure peaks.


  1. Improvements of the particle-in-cell code EUTERPE for petascaling machines

    NASA Astrophysics Data System (ADS)

    Sáez, Xavier; Soba, Alejandro; Sánchez, Edilberto; Kleiber, Ralf; Castejón, Francisco; Cela, José M.

    2011-09-01

    In the present work we report some performance measures and computational improvements recently carried out using the gyrokinetic code EUTERPE (Jost, 2000 [1] and Jost et al., 1999 [2]), which is based on the general particle-in-cell (PIC) method. The scalability of the code has been studied for up to sixty thousand processing elements and some steps towards a complete hybridization of the code were made. As a numerical example, non-linear simulations of Ion Temperature Gradient (ITG) instabilities have been carried out in screw-pinch geometry and the results are compared with earlier works. A parametric study of the influence of variables (step size of the time integrator, number of markers, grid size) on the quality of the simulation is presented.
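
    The particle-in-cell method the record refers to advances many marker particles through field kicks and drifts. The following is not EUTERPE itself, only a minimal sketch of the per-marker kernel (here a leapfrog push in a prescribed 1-D electric field) that such codes parallelize over millions of markers:

```python
# Not EUTERPE: a minimal PIC ingredient, the leapfrog push of one marker in
# a prescribed 1-D electric field, illustrating the per-marker kernel that
# PIC codes parallelize across many processing elements.
def leapfrog_push(x, v, efield, qm, dt, steps):
    # velocity is interpreted as staggered half a step behind position
    for _ in range(steps):
        v += qm * efield(x) * dt  # kick
        x += v * dt               # drift
    return x, v

# Constant unit field, q/m = 1: after 10 steps of dt = 0.1 the velocity
# reaches 1.0.
x, v = leapfrog_push(0.0, 0.0, lambda pos: 1.0, qm=1.0, dt=0.1, steps=10)
```
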

  2. The effect of code expanding optimizations on instruction cache design

    NASA Technical Reports Server (NTRS)

    Chen, William Y.; Chang, Pohua P.; Conte, Thomas M.; Hwu, Wen-Mei W.

    1991-01-01

    It is shown that code expanding optimizations have strong and non-intuitive implications on instruction cache design. Three types of code expanding optimizations are studied: instruction placement, function inline expansion, and superscalar optimizations. Overall, instruction placement reduces the miss ratio of small caches. Function inline expansion improves the performance for small cache sizes, but degrades the performance of medium caches. Superscalar optimizations increase the cache size required for a given miss ratio. On the other hand, they also increase the sequentiality of instruction access, so that a simple load-forward scheme effectively cancels the negative effects. Overall, it is shown that with load forwarding, the three types of code expanding optimizations jointly improve the performance of small caches and have little effect on large caches.

  3. Validation of ICD-9-CM coding algorithm for improved identification of hypoglycemia visits.

    PubMed

    Ginde, Adit A; Blanc, Phillip G; Lieberman, Rebecca M; Camargo, Carlos A

    2008-04-01

    Accurate identification of hypoglycemia cases by International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes will help to describe epidemiology, monitor trends, and propose interventions for this important complication in patients with diabetes. Prior hypoglycemia studies utilized incomplete search strategies and may be methodologically flawed. We sought to validate a new ICD-9-CM coding algorithm for accurate identification of hypoglycemia visits. This was a multicenter, retrospective cohort study using a structured medical record review at three academic emergency departments from July 1, 2005 to June 30, 2006. We prospectively derived a coding algorithm to identify hypoglycemia visits using ICD-9-CM codes (250.3, 250.8, 251.0, 251.1, 251.2, 270.3, 775.0, 775.6, and 962.3). We then confirmed by chart review the hypoglycemia cases identified by the candidate ICD-9-CM codes during the study period. The case definition for hypoglycemia was a documented blood glucose of less than 3.9 mmol/l or an emergency physician-charted diagnosis of hypoglycemia. We evaluated individual components and calculated the positive predictive value. We reviewed 636 charts identified by the candidate ICD-9-CM codes and confirmed 436 (64%) cases of hypoglycemia by chart review. Diabetes with other specified manifestations (250.8), often excluded in prior hypoglycemia analyses, identified 83% of hypoglycemia visits, and unspecified hypoglycemia (251.2) identified 13% of hypoglycemia visits. The absence of any predetermined co-diagnosis codes improved the positive predictive value of code 250.8 from 62% to 92%, while excluding only 10 (2%) true hypoglycemia visits. Although prior analyses included only the first-listed ICD-9 code, more than one-quarter of identified hypoglycemia visits were outside this primary diagnosis field. Overall, the proposed algorithm had 89% positive predictive value (95% confidence interval, 86-92) for detecting hypoglycemia visits. The proposed algorithm improves on prior strategies to identify hypoglycemia visits in administrative data sets and will enhance the ability to study the epidemiology and design interventions for this important complication of diabetes care.
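
    The screening logic described above can be sketched in a few lines. The candidate codes are taken from the record; the exclusion list and visit data below are invented placeholders, since the paper's co-diagnosis list is not reproduced here.

```python
# Sketch of the screening idea: flag a visit when any candidate ICD-9-CM
# code is present, counting 250.8 as positive only in the absence of
# predetermined co-diagnosis codes. The exclusion set and visits below are
# invented examples.
CANDIDATE_CODES = {"250.3", "250.8", "251.0", "251.1", "251.2",
                   "270.3", "775.0", "775.6", "962.3"}

def flag_hypoglycemia(visit_codes, excluded_codiagnoses=frozenset()):
    codes = set(visit_codes)
    hits = codes & CANDIDATE_CODES
    if hits == {"250.8"} and codes & set(excluded_codiagnoses):
        return False  # 250.8 alone plus an excluded co-diagnosis: not flagged
    return bool(hits)

def positive_predictive_value(confirmed, flagged):
    # e.g. chart-confirmed cases out of all flagged visits
    return confirmed / flagged

flagged_simple = flag_hypoglycemia(["251.2"])
flagged_excluded = flag_hypoglycemia(["250.8", "999.9"],
                                     excluded_codiagnoses={"999.9"})
```
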

  4. Diabetes Mellitus Coding Training for Family Practice Residents.

    PubMed

    Urse, Geraldine N

    2015-07-01

    Although physicians regularly use numeric coding systems such as the International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) to describe patient encounters, coding errors are common. One of the most complicated diagnoses to code is diabetes mellitus. The ICD-9-CM currently has 39 separate codes for diabetes mellitus; this number will be expanded to more than 50 with the introduction of ICD-10-CM in October 2015. To assess the effect of a 1-hour focused presentation on ICD-9-CM codes on diabetes mellitus coding. A 1-hour focused lecture on the correct use of diabetes mellitus codes for patient visits was presented to family practice residents at Doctors Hospital Family Practice in Columbus, Ohio. To assess resident knowledge of the topic, a pretest and posttest were given to residents before and after the lecture, respectively. Medical records of all patients with diabetes mellitus who were cared for at the hospital 6 weeks before and 6 weeks after the lecture were reviewed and compared for the use of diabetes mellitus ICD-9 codes. Eighteen residents attended the lecture and completed the pretest and posttest. The mean (SD) percentage of correct answers was 72.8% (17.1%) for the pretest and 84.4% (14.6%) for the posttest, for an improvement of 11.6 percentage points (P≤.035). The percentage of total available codes used did not substantially change from before to after the lecture, but the use of the generic ICD-9-CM code for diabetes mellitus type II controlled (250.00) declined (58 of 176 [33%] to 102 of 393 [26%]) and the use of other codes increased, indicating a greater variety in codes used after the focused lecture. After a focused lecture on diabetes mellitus coding, resident coding knowledge improved. Review of medical record data did not reveal an overall change in the number of diabetic codes used after the lecture but did reveal a greater variety in the codes used.

  5. Nuclear shell model code CRUNCHER

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Resler, D.A.; Grimes, S.M.

    1988-05-01

    A new nuclear shell model code CRUNCHER, patterned after the code VLADIMIR, has been developed. While CRUNCHER and VLADIMIR employ the techniques of an uncoupled basis and the Lanczos process, improvements in the new code allow it to handle much larger problems than the previous code and to perform them more efficiently. Tests involving a moderately sized calculation indicate that CRUNCHER running on a SUN 3/260 workstation requires approximately one-half the central processing unit (CPU) time required by VLADIMIR running on a CRAY-1 supercomputer.
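
    The Lanczos process mentioned above reduces a large symmetric Hamiltonian to a small tridiagonal matrix whose extreme eigenvalues converge quickly. The following is a pure-Python sketch of one such iteration, not CRUNCHER's implementation; the toy diagonal "Hamiltonian" is invented for illustration.

```python
def lanczos(matvec, v0, steps):
    # Plain Lanczos tridiagonalization of a symmetric operator given only as
    # a matrix-vector product; returns the diagonal (alphas) and
    # off-diagonal (betas) of the tridiagonal matrix.
    n = len(v0)
    norm = sum(x * x for x in v0) ** 0.5
    q = [x / norm for x in v0]
    q_prev = [0.0] * n
    beta = 0.0
    alphas, betas = [], []
    for _ in range(steps):
        w = matvec(q)
        alpha = sum(wi * qi for wi, qi in zip(w, q))
        alphas.append(alpha)
        w = [wi - alpha * qi - beta * pi for wi, qi, pi in zip(w, q, q_prev)]
        beta = sum(x * x for x in w) ** 0.5
        if beta < 1e-12:  # invariant subspace reached
            break
        betas.append(beta)
        q_prev, q = q, [x / beta for x in w]
    return alphas, betas

# Toy "Hamiltonian" diag(1, 2, 3): three steps tridiagonalize it exactly, so
# the diagonal entries (alphas) must sum to the trace, 6.
alphas, betas = lanczos(lambda q: [1 * q[0], 2 * q[1], 3 * q[2]],
                        [1.0, 1.0, 1.0], 3)
```
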

  6. Users manual and modeling improvements for axial turbine design and performance computer code TD2-2

    NASA Technical Reports Server (NTRS)

    Glassman, Arthur J.

    1992-01-01

    Computer code TD2 computes design point velocity diagrams and performance for multistage, multishaft, cooled or uncooled, axial flow turbines. This streamline analysis code was recently modified to upgrade modeling related to turbine cooling and to the internal loss correlation. These modifications are presented in this report along with descriptions of the code's expanded input and output. This report serves as the users manual for the upgraded code, which is named TD2-2.

  7. Defining datasets and creating data dictionaries for quality improvement and research in chronic disease using routinely collected data: an ontology-driven approach.

    PubMed

    de Lusignan, Simon; Liaw, Siaw-Teng; Michalakidis, Georgios; Jones, Simon

    2011-01-01

    The burden of chronic disease is increasing, and research and quality improvement will be less effective if case-finding strategies are suboptimal. To describe an ontology-driven approach to case finding in chronic disease and how this approach can be used to create a data dictionary and make the codes used in case finding transparent. A five-step process: (1) identifying a reference coding system or terminology; (2) using an ontology-driven approach to identify cases; (3) developing metadata that can be used to identify the extracted data; (4) mapping the extracted data to the reference terminology; and (5) creating the data dictionary. Hypertension is presented as an exemplar. A patient with hypertension can be represented by a range of codes, including diagnostic, history, and administrative codes. Metadata link the coding system and data extraction queries to the correct data mapping and translation tool, which then maps them to the equivalent code in the reference terminology. The code extracted, the term, its domain and subdomain, and the name of the data extraction query can then be automatically grouped and published online as a readily searchable data dictionary; an exemplar is online at www.clininf.eu/qickd-data-dictionary.html. Adopting an ontology-driven approach to case finding could improve the quality of disease registers and of research based on routine data. It would offer considerable advantages over using limited datasets to define cases. This approach should be considered by those involved in research and quality improvement projects which utilise routine data.
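
    Steps (3)-(5) above amount to joining extraction metadata with a code-mapping table. A minimal sketch follows; every code, term, query name, and mapping entry is an invented example, not the QICKD dictionary content.

```python
# Illustrative sketch of steps (3)-(5): metadata ties an extraction query to
# a mapping table that translates each extracted code into the reference
# terminology, and the joined rows form the data dictionary. All values
# below are invented examples.
MAP_TO_REFERENCE = {
    ("Read", "G20.."): ("SNOMED CT", "38341003"),  # hypertension example
}

def data_dictionary_rows(extracted, domain="Cardiovascular",
                         subdomain="Hypertension"):
    rows = []
    for query_name, system, code, term in extracted:
        ref_system, ref_code = MAP_TO_REFERENCE[(system, code)]
        rows.append({"query": query_name, "source_system": system,
                     "source_code": code, "term": term, "domain": domain,
                     "subdomain": subdomain, "reference_system": ref_system,
                     "reference_code": ref_code})
    return rows

rows = data_dictionary_rows(
    [("HTN_DIAG_Q", "Read", "G20..", "Essential hypertension")])
```

Grouping such rows by domain and subdomain and publishing them is then a formatting exercise, which is the sense in which the dictionary can be generated automatically.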

  8. Application of computational fluid dynamics and laminar flow technology for improved performance and sonic boom reduction

    NASA Technical Reports Server (NTRS)

    Bobbitt, Percy J.

    1992-01-01

    A discussion is given of the many factors that affect sonic booms with particular emphasis on the application and development of improved computational fluid dynamics (CFD) codes. The benefits that accrue from interference (induced) lift, distributing lift using canard configurations, the use of wings with dihedral or anhedral and hybrid laminar flow control for drag reduction are detailed. The application of the most advanced codes to a wider variety of configurations along with improved ray-tracing codes to arrive at more accurate and, hopefully, lower sonic booms is advocated. Finally, it is speculated that when all of the latest technology is applied to the design of a supersonic transport it will be found environmentally acceptable.

  9. Vectorization of a classical trajectory code on a Floating Point Systems, Inc., Model 164 attached processor.

    PubMed

    Kraus, Wayne A; Wagner, Albert F

    1986-04-01

    A triatomic classical trajectory code has been modified by extensive vectorization of the algorithms to achieve much improved performance on an FPS 164 attached processor. Extensive timings on both the FPS 164 and a VAX 11/780 with floating point accelerator are presented as a function of the number of trajectories simultaneously run. The timing tests involve a potential energy surface of the LEPS variety and trajectories with 1000 time steps. The results indicate that vectorization results in timing improvements on both the VAX and the FPS. For larger numbers of trajectories run simultaneously, up to a factor of 25 improvement in speed occurs between VAX and FPS vectorized code.

  10. An improved algorithm for evaluating trellis phase codes

    NASA Technical Reports Server (NTRS)

    Mulligan, M. G.; Wilson, S. G.

    1982-01-01

    A method is described for evaluating the minimum distance parameters of trellis phase codes, including CPFSK, partial response FM, and, more importantly, coded CPM (continuous phase modulation) schemes. The algorithm provides dramatically faster execution times and smaller memory requirements than previous algorithms. Results of sample calculations and timing comparisons are included.
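
    The quantity such algorithms compute can be illustrated on a simpler object than a CPM scheme. The sketch below is not the paper's algorithm; it brute-forces the free Hamming distance of a standard rate-1/2, memory-2 convolutional code (generators 7 and 5 octal), which is exactly the kind of exhaustive path search that faster minimum-distance algorithms improve on.

```python
from itertools import product

# For a linear trellis code, the minimum distance between any two paths that
# diverge and remerge equals the minimum weight over nonzero paths. Here we
# weigh all inputs that diverge from the all-zero path, flushing the encoder
# back to state zero.
def encode(bits):
    s1 = s2 = 0
    out = []
    for b in bits:
        out += [b ^ s1 ^ s2, b ^ s2]  # generator taps 111 and 101 (7, 5 octal)
        s1, s2 = b, s1
    return out

def free_distance(max_len=10):
    best = None
    for length in range(1, max_len + 1):
        for bits in product([0, 1], repeat=length):
            if bits[0] != 1:
                continue  # start at the divergence point only
            weight = sum(encode(list(bits) + [0, 0]))  # flush to state 0
            best = weight if best is None else min(best, weight)
    return best
```
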

  12. CHEETAH: A next generation thermochemical code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fried, L.; Souers, P.

    1994-11-01

    CHEETAH is an effort to bring the TIGER thermochemical code into the 1990s. A wide variety of improvements have been made in Version 1.0. We have improved the robustness and ease of use of TIGER. All of TIGER's solvers have been replaced by new algorithms. We find that CHEETAH solves a wider variety of problems with no user intervention (e.g., no guesses for the C-J state) than TIGER did. CHEETAH has been made simpler to use than TIGER; typical use of the code occurs with the new standard run command. CHEETAH will make the use of thermochemical codes more attractive to practical explosive formulators. We have also made an extensive effort to improve on the results of TIGER. CHEETAH's version of the BKW equation of state (BKWC) is able to accurately reproduce energies from cylinder tests, something that other BKW parameter sets have been unable to do. Calculations performed with BKWC execute very quickly; typical run times are under 10 seconds on a workstation. In the future we plan to improve the underlying science in CHEETAH. More accurate equations of state will be used in the gas and the condensed phase. A kinetics capability will be added to the code to predict reaction zone thickness. Further ease-of-use features will eventually be added; an automatic formulator that adjusts concentrations to match desired properties is planned.

  13. Maximum likelihood decoding analysis of Accumulate-Repeat-Accumulate Codes

    NASA Technical Reports Server (NTRS)

    Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung

    2004-01-01

    Repeat-Accumulate (RA) codes are the simplest turbo-like codes that achieve good performance. However, they cannot compete with turbo codes or low-density parity-check (LDPC) codes as far as performance is concerned. Accumulate-Repeat-Accumulate (ARA) codes, a subclass of LDPC codes, are obtained by adding a precoder in front of punctured RA codes, where an accumulator is chosen as the precoder. These codes are not only very simple but also achieve excellent performance with iterative decoding. In this paper, the performance of these codes under maximum likelihood (ML) decoding is analyzed and compared to random codes using very tight bounds. The weight distribution of some simple ARA codes is obtained, and through the existing tightest bounds we show that the ML SNR threshold of ARA codes closely approaches the performance of random codes. We also show that the use of the precoder improves the SNR threshold, while the interleaving gain remains unchanged with respect to the punctured RA code.
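
    The bounding machinery referred to above starts from a code's weight distribution. As a hedged illustration (the classical union bound, not the tighter bounds the paper employs, and with an invented weight enumerator), the ML word-error probability over an AWGN channel can be bounded as P(E) <= sum over w of A_w * Q(sqrt(2 w R Eb/N0)):

```python
import math

def qfunc(x):
    # Gaussian tail probability Q(x)
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def union_bound(weight_enumerator, rate, ebno):
    # weight_enumerator maps Hamming weight w to multiplicity A_w;
    # ebno is Eb/N0 as a linear ratio, rate is the code rate R.
    return sum(a_w * qfunc(math.sqrt(2.0 * w * rate * ebno))
               for w, a_w in weight_enumerator.items())

# The bound tightens (decreases) as the channel improves:
loose = union_bound({5: 1, 6: 2}, rate=0.5, ebno=2.0)
tight = union_bound({5: 1, 6: 2}, rate=0.5, ebno=4.0)
```
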

  14. Content-Based Multi-Channel Network Coding Algorithm in the Millimeter-Wave Sensor Network

    PubMed Central

    Lin, Kai; Wang, Di; Hu, Long

    2016-01-01

    With the development of wireless technology, the widespread use of 5G is already an irreversible trend, and millimeter-wave sensor networks are becoming more and more common. However, due to the high degree of complexity and bandwidth bottlenecks, the millimeter-wave sensor network still faces numerous problems. In this paper, we propose a novel content-based multi-channel network coding algorithm, which uses the functions of data fusion, multi-channel and network coding to improve the data transmission; the algorithm is referred to as content-based multi-channel network coding (CMNC). The CMNC algorithm provides a fusion-driven model based on the Dempster-Shafer (D-S) evidence theory to classify the sensor nodes into different classes according to the data content. By using the result of the classification, the CMNC algorithm also provides the channel assignment strategy and uses network coding to further improve the quality of data transmission in the millimeter-wave sensor network. Extensive simulations are carried out and compared to other methods. Our simulation results show that the proposed CMNC algorithm can effectively improve the quality of data transmission and has better performance than the compared methods. PMID:27376302
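
    The Dempster-Shafer fusion step that the CMNC classification builds on can be sketched with Dempster's rule of combination. Focal elements are subsets of the frame of discernment; the mass assignments below are invented for illustration.

```python
# Dempster's rule of combination: multiply masses of intersecting focal
# elements and renormalize by the non-conflicting mass.
def dempster_combine(m1, m2):
    combined, conflict = {}, 0.0
    for a, va in m1.items():
        for b, vb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + va * vb
            else:
                conflict += va * vb
    k = 1.0 - conflict  # normalization after discarding conflicting mass
    return {s: v / k for s, v in combined.items()}

# Two invented bodies of evidence over classes A and B:
m1 = {frozenset({"A"}): 0.6, frozenset({"A", "B"}): 0.4}
m2 = {frozenset({"A"}): 0.5, frozenset({"B"}): 0.5}
fused = dempster_combine(m1, m2)
```
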

  15. Does incorporation of a clinical support template in the electronic medical record improve capture of wound care data in a cohort of veterans with diabetic foot ulcers?

    PubMed

    Lowe, Jeanne R; Raugi, Gregory J; Reiber, Gayle E; Whitney, Joanne D

    2013-01-01

    The purpose of this cohort study was to evaluate the effect of a 1-year intervention of an electronic medical record wound care template on the completeness of wound care documentation and medical coding compared to a similar time interval for the fiscal year preceding the intervention. From October 1, 2006, to September 30, 2007, a "good wound care" intervention was implemented at a rural Veterans Affairs facility to prevent amputations in veterans with diabetes and foot ulcers. The study protocol included a template with foot ulcer variables embedded in the electronic medical record to facilitate data collection, support clinical decision making, and improve ordering and medical coding. The intervention group showed significant differences in complete documentation of good wound care compared to the historic control group (χ = 15.99, P < .001), complete documentation of coding for diagnoses and procedures (χ = 30.23, P < .001), and complete documentation of both good wound care and coding for diagnoses and procedures (χ = 14.96, P < .001). An electronic wound care template improved documentation of evidence-based interventions and facilitated coding for wound complexity and procedures.

  16. Uniform emergency codes: will they improve safety?

    PubMed

    2005-01-01

    There are pros and cons to uniform code systems, according to emergency medicine experts. Uniformity can be a benefit when ED nurses and other staff work at several facilities. It's critical that your staff understand not only what the codes stand for, but what they must do when codes are called. If your state institutes a new system, be sure to hold regular drills to familiarize your ED staff.

  17. Classification Techniques for Digital Map Compression

    DTIC Science & Technology

    1989-03-01

    classification improved the performance of the K-means classification algorithm, resulting in a compression of 8.06:1 with Lempel-Ziv coding. Run-length coding... compression performance are run-length coding [2], [8] and Lempel-Ziv coding [10], [11]. These techniques are chosen because they are most efficient when...investigated. After the classification, some standard file compression methods, such as Lempel-Ziv and run-length encoding, were applied to the
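
    Of the two coders named in the record, run-length encoding is the simpler to sketch; it is efficient exactly when classification leaves long uniform runs in the map data, which is the premise of the study above.

```python
# Minimal run-length encoder/decoder over a list of symbols.
def rle_encode(data):
    runs, i = [], 0
    while i < len(data):
        j = i
        while j < len(data) and data[j] == data[i]:
            j += 1  # extend the current run
        runs.append((data[i], j - i))
        i = j
    return runs

def rle_decode(runs):
    return [symbol for symbol, count in runs for _ in range(count)]

runs = rle_encode([1, 1, 1, 2, 2, 3])
```
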

  18. Optimal superdense coding over memory channels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shadman, Z.; Kampermann, H.; Bruss, D.

    2011-10-15

    We study the superdense coding capacity in the presence of quantum channels with correlated noise. We investigate both the cases of unitary and nonunitary encoding. Pauli channels for arbitrary dimensions are treated explicitly. The superdense coding capacity for some special channels and resource states is derived for unitary encoding. We also provide an example of a memory channel where nonunitary encoding leads to an improvement in the superdense coding capacity.

  19. Accuracy and time requirements of a bar-code inventory system for medical supplies.

    PubMed

    Hanson, L B; Weinswig, M H; De Muth, J E

    1988-02-01

    The effects of implementing a bar-code system for issuing medical supplies to nursing units at a university teaching hospital were evaluated. Data on the time required to issue medical supplies to three nursing units at a 480-bed, tertiary-care teaching hospital were collected (1) before the bar-code system was implemented (i.e., when the manual system was in use), (2) one month after implementation, and (3) four months after implementation. At the same times, the accuracy of the central supply perpetual inventory was monitored using 15 selected items. One-way analysis of variance tests were done to determine any significant differences between the bar-code and manual systems. Using the bar-code system took longer than using the manual system because of a significant difference in the time required for order entry into the computer. Multiple-use requirements of the central supply computer system made entering bar-code data a much slower process. There was, however, a significant improvement in the accuracy of the perpetual inventory. Using the bar-code system for issuing medical supplies to the nursing units takes longer than using the manual system. However, the accuracy of the perpetual inventory was significantly improved with the implementation of the bar-code system.

  20. A long-term, integrated impact assessment of alternative building energy code scenarios in China

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Sha; Eom, Jiyong; Evans, Meredydd

    2014-04-01

    China is the second largest building energy user in the world, ranking first and third in residential and commercial energy consumption, respectively. Beginning in the early 1980s, the Chinese government has developed a variety of building energy codes to improve building energy efficiency and reduce total energy demand. This paper studies the impact of building energy codes on energy use and CO2 emissions by using a detailed building energy model that represents four distinct climate zones, each with three building types, nested in the long-term integrated assessment framework GCAM. An advanced building stock module, coupled with the building energy model, is developed to reflect the characteristics of future building stock and its interaction with the development of building energy codes in China. This paper also evaluates the impacts of building codes on building energy demand in the presence of an economy-wide carbon policy. We find that building energy codes would reduce Chinese building energy use by 13%-22%, depending on the building code scenario, with a similar effect preserved even under the carbon policy. The impact of building energy codes shows regional and sectoral variation due to regionally differentiated responses of heating and cooling services to shell efficiency improvement.

  1. Detecting the borders between coding and non-coding DNA regions in prokaryotes based on recursive segmentation and nucleotide doublets statistics

    PubMed Central

    2012-01-01

    Background Detecting the borders between coding and non-coding regions is an essential step in genome annotation, and information entropy measures are useful for describing the signals in genome sequences. However, the accuracy of previous methods for finding borders based on entropy segmentation still needs to be improved. Methods In this study, we first applied a new recursive entropic segmentation method to DNA sequences to obtain preliminary significant cuts. A 22-symbol alphabet is used to capture the differential composition of nucleotide doublets and stop-codon patterns along three phases in both DNA strands. This process requires no prior training datasets. Results Compared with previous segmentation methods, the experimental results on three bacterial genomes, Rickettsia prowazekii, Borrelia burgdorferi and E. coli, show that our approach improves the accuracy of finding the borders between coding and non-coding regions in DNA sequences. Conclusions This paper presents a new segmentation method for prokaryotes based on the Jensen-Rényi divergence with a 22-symbol alphabet. For three bacterial genomes, compared to the A12_JR method, our method raised the accuracy of finding the borders between protein-coding and non-coding regions in DNA sequences. PMID:23282225
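
    The core of divergence-based segmentation can be shown in miniature: pick the cut that maximizes the divergence between the symbol distributions of the two halves. The paper recurses with the more general Jensen-Rényi divergence over a 22-symbol doublet alphabet; this sketch uses the Jensen-Shannon divergence over single nucleotides on a toy sequence.

```python
import math
from collections import Counter

def entropy(p):
    return -sum(v * math.log2(v) for v in p.values() if v > 0.0)

def distribution(seq):
    counts = Counter(seq)
    return {s: c / len(seq) for s, c in counts.items()}

def best_cut(seq, min_len=5):
    # Jensen-Shannon divergence between halves equals the entropy of the
    # whole minus the length-weighted entropies of the halves.
    best_i, best_d = None, -1.0
    whole_entropy = entropy(distribution(seq))
    for i in range(min_len, len(seq) - min_len):
        left, right = seq[:i], seq[i:]
        wl, wr = len(left) / len(seq), len(right) / len(seq)
        d = (whole_entropy
             - wl * entropy(distribution(left))
             - wr * entropy(distribution(right)))
        if d > best_d:
            best_i, best_d = i, d
    return best_i

# A perfectly segmented toy sequence: the cut lands exactly at the boundary.
cut = best_cut("A" * 20 + "G" * 20)
```
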

  2. Coding for reliable satellite communications

    NASA Technical Reports Server (NTRS)

    Gaarder, N. T.; Lin, S.

    1986-01-01

    This research project was set up to study various kinds of coding techniques for error control in satellite and space communications for NASA Goddard Space Flight Center. During the project period, researchers investigated the following areas: (1) decoding of Reed-Solomon codes in terms of dual basis; (2) concatenated and cascaded error control coding schemes for satellite and space communications; (3) use of hybrid coding schemes (error correction and detection incorporated with retransmission) to improve system reliability and throughput in satellite communications; (4) good codes for simultaneous error correction and error detection, and (5) error control techniques for ring and star networks.

  3. Context-aware and locality-constrained coding for image categorization.

    PubMed

    Xiao, Wenhua; Wang, Bin; Liu, Yu; Bao, Weidong; Zhang, Maojun

    2014-01-01

    Improving the coding strategy for BOF (Bag-of-Features)-based feature design has drawn increasing attention in recent image categorization work. However, ambiguity in the coding procedure still impedes its further development. In this paper, we introduce a Context-Aware and Locality-Constrained Coding (CALC) approach that uses context information to describe objects in a discriminative way. It is achieved by learning a word-to-word co-occurrence prior and imposing context information on locality-constrained coding. Firstly, the local context of each category is evaluated by learning a word-to-word co-occurrence matrix representing the spatial distribution of local features in a neighboring region. Then, the learned co-occurrence matrix is used to measure the context distance between local features and code words. Finally, a coding strategy that simultaneously considers locality in feature space and context space, while introducing feature weighting, is proposed. This novel coding strategy not only semantically preserves information in coding, but also alleviates the noise distortion of each class. Extensive experiments on several available datasets (Scene-15, Caltech101, and Caltech256) are conducted to validate the superiority of our algorithm by comparing it with baselines and recently published methods. Experimental results show that our method significantly improves the performance of baselines and achieves performance comparable to, and even better than, the state of the art.
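    As a rough sketch of the locality-constrained ingredient only (without the context term), the analytic solution of standard locality-constrained coding fits in a few lines of NumPy. The codebook, descriptor, and regularizer `lam` below are illustrative assumptions; the CALC variant described above would additionally fold the learned context distance into the locality penalty.

```python
import numpy as np

def llc_code(x, B, lam=1e-4):
    # Analytic locality-constrained coding of descriptor x over codebook B
    # (one code word per row): minimize ||x - c @ B||^2 + lam * ||d * c||^2
    # subject to sum(c) = 1, where d holds squared distances of code words to x.
    k = B.shape[0]
    z = B - x                             # shift code words to the descriptor
    C = z @ z.T                           # k x k "data covariance"
    d = np.sum(z ** 2, axis=1)            # locality (distance) penalties
    c = np.linalg.solve(C + lam * np.diag(d), np.ones(k))
    return c / c.sum()                    # enforce the sum-to-one constraint
```

    The locality penalty drives most of the weight onto the code words nearest the descriptor, which is what makes the resulting codes sparse and discriminative.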

  4. Automatic coding and selection of causes of death: an adaptation of Iris software for using in Brazil.

    PubMed

    Martins, Renata Cristófani; Buchalla, Cassia Maria

    2015-01-01

    To prepare a dictionary in Portuguese for use in Iris and to evaluate its completeness for coding causes of death. Initially, a dictionary with all illnesses and injuries was created based on the International Classification of Diseases - tenth revision (ICD-10) codes. This dictionary was based on two sources: the electronic file of ICD-10 volume 1 and data from the Thesaurus of the International Classification of Primary Care (ICPC-2). Then, a death certificate sample from the Program of Improvement of Mortality Information in São Paulo (PRO-AIM) was coded manually and by Iris version V4.0.34, and the causes of death were compared. Whenever Iris was unable to code the causes of death, adjustments were made to the dictionary. Iris was able to code all causes of death in 94.4% of death certificates, but only 50.6% were coded directly, without adjustments. Among the death certificates that the software was unable to fully code, 89.2% had a diagnosis of external causes (chapter XX of ICD-10). This group of causes of death showed less agreement when comparing coding by Iris to manual coding. The software performed well, but its dictionary needs adjustments and improvement. In upcoming versions of the software, its developers are trying to solve the external-causes-of-death problem.

  5. Improved accuracy of co-morbidity coding over time after the introduction of ICD-10 administrative data

    PubMed Central

    2011-01-01

    Background Co-morbidity information derived from administrative data needs to be validated to allow its regular use. We assessed evolution in the accuracy of coding for Charlson and Elixhauser co-morbidities at three time points over a 5-year period, following the introduction of the International Classification of Diseases, 10th Revision (ICD-10), coding of hospital discharges. Methods Cross-sectional time trend evaluation study of coding accuracy using hospital chart data of 3'499 randomly selected patients who were discharged in 1999, 2001 and 2003, from two teaching and one non-teaching hospital in Switzerland. We measured sensitivity, positive predictive values and Kappa values for agreement between administrative data coded with ICD-10 and chart data as the 'reference standard' for recording 36 co-morbidities. Results For the 17 Charlson co-morbidities, the sensitivity - median (min-max) - was 36.5% (17.4-64.1) in 1999, 42.5% (22.2-64.6) in 2001 and 42.8% (8.4-75.6) in 2003. For the 29 Elixhauser co-morbidities, the sensitivity was 34.2% (1.9-64.1) in 1999, 38.6% (10.5-66.5) in 2001 and 41.6% (5.1-76.5) in 2003. Between 1999 and 2003, sensitivity estimates increased for 30 co-morbidities and decreased for 6 co-morbidities. The increase in sensitivities was statistically significant for six conditions and the decrease significant for one. Kappa values increased for 29 co-morbidities and decreased for seven. Conclusions The accuracy of administrative data in recording clinical conditions improved slightly between 1999 and 2003. These findings are of relevance to all jurisdictions introducing new coding systems, because they demonstrate a phenomenon of improved administrative data accuracy that may relate to a coding 'learning curve' with the new coding system. PMID:21849089
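    For readers unfamiliar with the metrics, sensitivity, positive predictive value, and Cohen's Kappa for a single co-morbidity can all be computed from a 2x2 table of administrative coding versus chart review. The counts in the example below are made up for illustration, not taken from the study.

```python
def agreement_stats(tp, fp, fn, tn):
    # Sensitivity, positive predictive value (PPV), and Cohen's Kappa for one
    # co-morbidity, treating chart review as the reference standard.
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)              # coded among truly present
    ppv = tp / (tp + fp)                      # truly present among coded
    po = (tp + tn) / n                        # observed agreement
    pe = ((tp + fp) * (tp + fn)               # chance agreement: both positive
          + (fn + tn) * (fp + tn)) / n ** 2   # ... plus both negative
    kappa = (po - pe) / (1 - pe)
    return sensitivity, ppv, kappa
```

    For example, with tp=40, fp=10, fn=60, tn=890 the sensitivity is 0.40, the PPV 0.80, and Kappa 0.50, even though raw agreement is 93%, which is why Kappa is the preferred agreement measure here.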

  6. Improved accuracy of co-morbidity coding over time after the introduction of ICD-10 administrative data.

    PubMed

    Januel, Jean-Marie; Luthi, Jean-Christophe; Quan, Hude; Borst, François; Taffé, Patrick; Ghali, William A; Burnand, Bernard

    2011-08-18

    Co-morbidity information derived from administrative data needs to be validated to allow its regular use. We assessed evolution in the accuracy of coding for Charlson and Elixhauser co-morbidities at three time points over a 5-year period, following the introduction of the International Classification of Diseases, 10th Revision (ICD-10), coding of hospital discharges. Cross-sectional time trend evaluation study of coding accuracy using hospital chart data of 3'499 randomly selected patients who were discharged in 1999, 2001 and 2003, from two teaching and one non-teaching hospital in Switzerland. We measured sensitivity, positive predictive values and Kappa values for agreement between administrative data coded with ICD-10 and chart data as the 'reference standard' for recording 36 co-morbidities. For the 17 Charlson co-morbidities, the sensitivity - median (min-max) - was 36.5% (17.4-64.1) in 1999, 42.5% (22.2-64.6) in 2001 and 42.8% (8.4-75.6) in 2003. For the 29 Elixhauser co-morbidities, the sensitivity was 34.2% (1.9-64.1) in 1999, 38.6% (10.5-66.5) in 2001 and 41.6% (5.1-76.5) in 2003. Between 1999 and 2003, sensitivity estimates increased for 30 co-morbidities and decreased for 6 co-morbidities. The increase in sensitivities was statistically significant for six conditions and the decrease significant for one. Kappa values increased for 29 co-morbidities and decreased for seven. The accuracy of administrative data in recording clinical conditions improved slightly between 1999 and 2003. These findings are of relevance to all jurisdictions introducing new coding systems, because they demonstrate a phenomenon of improved administrative data accuracy that may relate to a coding 'learning curve' with the new coding system.

  7. Evaluating Coding Accuracy in General Surgery Residents' Accreditation Council for Graduate Medical Education Procedural Case Logs.

    PubMed

    Balla, Fadi; Garwe, Tabitha; Motghare, Prasenjeet; Stamile, Tessa; Kim, Jennifer; Mahnken, Heidi; Lees, Jason

    The Accreditation Council for Graduate Medical Education (ACGME) case log captures resident operative experience based on Current Procedural Terminology (CPT) codes and is used to track operative experience during residency. With increasing emphasis on resident operative experiences, coding is more important than ever. It has been shown in other surgical specialties at similar institutions that the residents' ACGME case log may not accurately reflect their operative experience. What barriers may influence this remains unclear. As the only objective measure of resident operative experience, an accurate case log is paramount in representing one's operative experience. This study aims to determine the accuracy of procedural coding by general surgical residents at a single institution. Data were collected from 2 consecutive graduating classes of surgical residents' ACGME case logs from 2008 to 2014. A total of 5799 entries from 7 residents were collected. The CPT codes entered by residents were compared to departmental billing records submitted by the attending surgeon for each procedure. Assigned CPT codes by institutional American Academy of Professional Coders certified abstract coders were considered the "gold standard." A total of 4356 (75.12%) of 5799 entries were identified in billing records. Excel 2010 and SAS 9.3 were used for analysis. In the event of multiple codes for the same patient, any match between resident codes and billing record codes was considered a "correct" entry. A 4-question survey was distributed to all current general surgical residents at our institution for feedback on coding habits, limitations to accurate coding, and opinions on ACGME case log representation of their operative experience. All 7 residents had a low percentage of correctly entered CPT codes. The overall accuracy proportion for all residents was 52.82% (range: 43.32%-60.07%). 
Only 1 resident showed significant improvement in accuracy during his/her training (p = 0.0043). The survey response rate was 100%. Survey results indicated that inability to find the precise code within the ACGME search interface and unfamiliarity with available CPT codes were by far the most common perceived barriers to accuracy. Survey results also indicated that most residents (74%) believe that they code accurately most of the time and agree that their case log would accurately represent their operative experience (66.6%). This is the first study to evaluate correctness of residents' ACGME case logs in general surgery. The degree of inaccuracy found here necessitates further investigation into the etiology of these discrepancies. Instruction on coding practices should also benefit the residents after graduation. Optimizing communication among attendings and residents, improving ACGME coding search interface, and implementing consistent coding practices could improve accuracy giving a more realistic view of residents' operative experience. Published by Elsevier Inc.
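    The study's "any match" scoring rule amounts to a set intersection between each resident entry's CPT codes and the billing record for the same case. The function and the case identifiers below are an illustrative sketch, not the study's actual pipeline:

```python
def log_accuracy(resident_entries, billing_records):
    # Fraction of resident case-log entries whose CPT codes share at least one
    # code with the attending's billing record for the same case ("any match"
    # rule); entries absent from billing records are excluded, as in the study.
    correct = 0
    matched = 0
    for case_id, resident_codes in resident_entries.items():
        billing = billing_records.get(case_id)
        if billing is None:
            continue                      # entry not found in billing records
        matched += 1
        if set(resident_codes) & set(billing):
            correct += 1
    return correct / matched if matched else 0.0
```

    With hypothetical cases where one resident code matches billing exactly, one differs (e.g. open versus laparoscopic CPT variants), and one is missing from billing, the function returns 0.5: one correct entry out of two matched.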

  8. Performance of MIMO-OFDM using convolution codes with QAM modulation

    NASA Astrophysics Data System (ADS)

    Astawa, I. Gede Puja; Moegiharto, Yoedy; Zainudin, Ahmad; Salim, Imam Dui Agus; Anggraeni, Nur Annisa

    2014-04-01

    The performance of an Orthogonal Frequency Division Multiplexing (OFDM) system can be improved by adding channel coding (an error correction code) to detect and correct errors that occur during data transmission; one such option is the convolution code. This paper presents the performance of OFDM using the Space Time Block Code (STBC) diversity technique with QAM modulation and code rate ½. The evaluation is done by analyzing Bit Error Rate (BER) versus Energy per Bit to Noise Power Spectral Density Ratio (Eb/No). The scheme uses 256 subcarriers transmitted over a Rayleigh multipath fading channel in an OFDM system. Achieving a BER of 10^-3 requires 10 dB SNR in the SISO-OFDM scheme. The 2×2 MIMO-OFDM scheme likewise requires 10 dB to achieve a BER of 10^-3. The 4×4 MIMO-OFDM scheme requires 5 dB, while adding convolution coding to 4×4 MIMO-OFDM improves performance down to 0 dB for the same BER. This demonstrates a power saving of 3 dB over the 4×4 MIMO-OFDM system without coding, a power saving of 7 dB over 2×2 MIMO-OFDM, and significant power savings over the SISO-OFDM system.
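    The dB savings reported above translate into linear transmit-power ratios via the standard 10·log10 relation. A small sketch using the required-SNR figures quoted in the abstract (the scheme labels are ours):

```python
def power_savings_db(required_snr_db, baseline):
    # Power saving of each scheme relative to `baseline` at a fixed target
    # BER, both in dB and as a linear transmit-power ratio (10^(dB/10)).
    base = required_snr_db[baseline]
    return {name: (base - snr, 10 ** ((base - snr) / 10))
            for name, snr in required_snr_db.items()}

# Required SNR (dB) at BER 10^-3, as quoted in the abstract above.
required = {"SISO-OFDM": 10, "2x2 MIMO-OFDM": 10,
            "4x4 MIMO-OFDM": 5, "4x4 MIMO-OFDM + conv. code": 0}
savings = power_savings_db(required, "SISO-OFDM")
```

    A 5 dB saving corresponds to roughly a 3.2x reduction in required power, and a 10 dB saving to a 10x reduction.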

  9. Upgrades of Two Computer Codes for Analysis of Turbomachinery

    NASA Technical Reports Server (NTRS)

    Chima, Rodrick V.; Liou, Meng-Sing

    2005-01-01

    Major upgrades have been made in two of the programs reported in "Five Computer Codes for Analysis of Turbomachinery". The affected programs are: Swift -- a code for three-dimensional (3D) multiblock analysis; and TCGRID, which generates a 3D grid used with Swift. Originally utilizing only a central-differencing scheme for numerical solution, Swift was augmented by addition of two upwind schemes that give greater accuracy but take more computing time. Other improvements in Swift include addition of a shear-stress-transport turbulence model for better prediction of adverse pressure gradients, addition of an H-grid capability for flexibility in modeling flows in pumps and ducts, and modification to enable simultaneous modeling of hub and tip clearances. Improvements in TCGRID include modifications to enable generation of grids for more complicated flow paths and addition of an option to generate grids compatible with the ADPAC code used at NASA and in industry. For both codes, new test cases were developed and documentation was updated. Both codes were converted to Fortran 90, with dynamic memory allocation. Both codes were also modified for ease of use in both UNIX and Windows operating systems.

  10. Audit of accuracy of clinical coding in oral surgery.

    PubMed

    Naran, S; Hudovsky, A; Antscherl, J; Howells, S; Nouraei, S A R

    2014-10-01

    We aimed to study the accuracy of clinical coding within oral surgery and to identify ways in which it can be improved. We undertook a multidisciplinary audit of a sample of 646 day case patients who had had oral surgery procedures between 2011 and 2012. We compared the codes given with their case notes and amended any discrepancies. The accuracy of coding was assessed for primary and secondary diagnoses and procedures, and for health resource groupings (HRGs). The financial impact of coding Subjectivity, Variability and Error (SVE) was assessed by reference to national tariffs. The audit resulted in 122 (19%) changes to primary diagnoses. The codes for primary procedures changed in 224 (35%) cases; 310 (48%) morbidities and complications had been missed, and 266 (41%) secondary procedures had been missed or were incorrect. This led to at least one change of coding in 496 (77%) patients, and to HRG changes in 348 (54%) patients. The financial impact of this was £114 in lost revenue per patient. There is a high incidence of coding errors in oral surgery because of the large number of day cases, a lack of awareness by clinicians of coding issues, and because clinical coders are not always familiar with the large number of highly specialised abbreviations used. The accuracy of coding can be improved through the use of a well-designed proforma, and standards can be maintained by an ongoing data quality assurance programme. Copyright © 2014. Published by Elsevier Ltd.

  11. Optimization and parallelization of the thermal–hydraulic subchannel code CTF for high-fidelity multi-physics applications

    DOE PAGES

    Salko, Robert K.; Schmidt, Rodney C.; Avramova, Maria N.

    2014-11-23

    This study describes major improvements to the computational infrastructure of the CTF subchannel code so that full-core, pincell-resolved (i.e., one computational subchannel per real bundle flow channel) simulations can now be performed in much shorter run-times, either in stand-alone mode or as part of coupled-code multi-physics calculations. These improvements support the goals of the Department of Energy Consortium for Advanced Simulation of Light Water Reactors (CASL) Energy Innovation Hub to develop high-fidelity multi-physics simulation tools for nuclear energy design and analysis.

  12. Transformation of two and three-dimensional regions by elliptic systems

    NASA Technical Reports Server (NTRS)

    Mastin, C. Wayne

    1993-01-01

    During this contract period, our work has focused on improvements to elliptic grid generation methods. There are two principal objectives in this project. One objective is to make the elliptic methods more reliable and efficient, and the other is to construct a modular code that can be incorporated into the National Grid Project (NGP), or any other grid generation code. Progress has been made in meeting both of these objectives. The two objectives are actually complementary. As the code development for the NGP progresses, we see many areas where improvements in algorithms can be made.

  13. In-network Coding for Resilient Sensor Data Storage and Efficient Data Mule Collection

    NASA Astrophysics Data System (ADS)

    Albano, Michele; Gao, Jie

    In a sensor network of n nodes in which k of them have sensed interesting data, we perform in-network erasure coding such that each node stores a linear combination of all the network data with random coefficients. This scheme greatly improves data resilience to node failures: as long as there are k nodes that survive an attack, all the data produced in the sensor network can be recovered with high probability. The in-network coding storage scheme also improves data collection rate by mobile mules and allows for easy scheduling of data mules.
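    The storage scheme described above is a form of random linear coding. A toy sketch over a prime field illustrates why any k surviving combinations suffice to recover the data (real deployments typically use GF(2^8) arithmetic; the node counts and field choice below are our assumptions):

```python
import random

P = 2_147_483_647  # prime modulus; deployments typically use GF(2^8) instead

def encode(data_vals, n, rng):
    # Each of n storage nodes keeps a random linear combination of the
    # k data values, along with its coefficient vector.
    k = len(data_vals)
    stored = []
    for _ in range(n):
        coeffs = [rng.randrange(P) for _ in range(k)]
        combo = sum(c * d for c, d in zip(coeffs, data_vals)) % P
        stored.append((coeffs, combo))
    return stored

def decode(survivors, k):
    # Recover all k data values from any k survivors by Gaussian elimination
    # over GF(P); with random coefficients the k x k system is invertible
    # with high probability.
    A = [coeffs[:] + [value] for coeffs, value in survivors[:k]]
    for col in range(k):
        piv = next(r for r in range(col, k) if A[r][col])
        A[col], A[piv] = A[piv], A[col]
        inv = pow(A[col][col], -1, P)
        A[col] = [x * inv % P for x in A[col]]
        for r in range(k):
            if r != col and A[r][col]:
                f = A[r][col]
                A[r] = [(x - f * y) % P for x, y in zip(A[r], A[col])]
    return [A[r][k] for r in range(k)]
```

    Here any k of the n stored combinations can be handed to `decode`, which is exactly the resilience property exploited for attack survival and data-mule collection.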

  14. A modified non-binary LDPC scheme based on watermark symbols in high speed optical transmission systems

    NASA Astrophysics Data System (ADS)

    Wang, Liming; Qiao, Yaojun; Yu, Qian; Zhang, Wenbo

    2016-04-01

    We introduce a watermark non-binary low-density parity-check (NB-LDPC) code scheme, which can estimate the time-varying noise variance by using prior information from watermark symbols, to improve the performance of NB-LDPC codes. Compared with the prior-art counterpart, the watermark scheme brings about a 0.25 dB improvement in net coding gain (NCG) at a bit error rate (BER) of 1e-6 and a 36.8-81% reduction in the number of iterations. The proposed scheme thus shows great potential in terms of error correction performance and decoding efficiency.
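    The core idea, estimating the time-varying noise variance from symbols known in advance, can be sketched as a mean-squared deviation at the watermark (pilot) positions; the estimate would then parameterize the decoder's channel likelihoods. The symbol values and Gaussian noise model below are assumptions for illustration, not the paper's exact setup:

```python
import numpy as np

def estimate_noise_var(received, sent):
    # Noise-variance estimate from known watermark (pilot) symbols: the mean
    # squared deviation of the received values from the transmitted ones.
    err = np.asarray(received) - np.asarray(sent)
    return float(np.mean(np.abs(err) ** 2))
```

    Re-estimating over a sliding window of watermark positions would track a noise variance that drifts over time, which is what lets the decoder adapt its likelihoods.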

  15. [The design and experiment of complementary S coding matrix based on digital micromirror spectrometer].

    PubMed

    Zhang, Zhi-Hai; Gao, Ling-Xiao; Guo, Yuan-Jun; Wang, Wei; Mo, Xiang-Xia

    2012-12-01

    Template selection is essential in the application of a digital micromirror spectrometer. The theoretically best coding matrix, the H-matrix, is not widely used because it is acyclic, complex to encode, and difficult to realize. The signal-to-noise improvement of the best practical S-matrix is slightly inferior to that of the H-matrix. We therefore designed a new type of complementary S-matrix. By studying its noise-improvement theory, the algorithm is shown to combine the advantages of both the H-matrix and the S-matrix. Experiments showed that the SNR can be increased to 2.05 times that of the S-matrix template.
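    For context, a standard (non-complementary) S-matrix can be derived from a Sylvester Hadamard matrix, and its multiplex SNR advantage over single-slit scanning follows a known formula. This sketch only illustrates the baseline the complementary design improves upon; the paper's complementary construction is not reproduced here:

```python
import numpy as np

def sylvester_hadamard(m):
    # Sylvester construction of a Hadamard matrix of order 2^m.
    H = np.array([[1]])
    for _ in range(m):
        H = np.block([[H, H], [H, -H]])
    return H

def s_matrix(m):
    # S-matrix of order n = 2^m - 1: drop the first row and column of the
    # normalized Hadamard matrix, then map +1 -> 0 and -1 -> 1 (open/closed
    # micromirror states).  It satisfies S @ S.T = ((n+1)/4) * (I + J).
    H = sylvester_hadamard(m)
    return ((1 - H[1:, 1:]) // 2).astype(int)

def multiplex_snr_gain(n):
    # Theoretical multiplex (Fellgett) SNR advantage of an n-slot S-matrix
    # measurement over single-slit scanning: (n + 1) / (2 * sqrt(n)).
    return (n + 1) / (2.0 * np.sqrt(n))
```

    The identity S @ S.T = ((n+1)/4)(I + J) is what makes the encoded measurements invertible, and the gain formula shows the advantage growing roughly as sqrt(n)/2 with the number of encoded slots.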

  16. New quantum codes constructed from quaternary BCH codes

    NASA Astrophysics Data System (ADS)

    Xu, Gen; Li, Ruihu; Guo, Luobin; Ma, Yuena

    2016-10-01

    In this paper, we first study the construction of new quantum error-correcting codes (QECCs) from three classes of quaternary imprimitive BCH codes. As a result, the improved maximal designed distance of these narrow-sense imprimitive Hermitian dual-containing quaternary BCH codes is determined to be much larger than the result given by Aly et al. (IEEE Trans Inf Theory 53:1183-1188, 2007) for each code length. Thus, families of new QECCs are obtained, and the constructed QECCs have larger distance than those in the previous literature. Secondly, we apply a combinatorial construction to the imprimitive BCH codes and their corresponding primitive counterparts and construct many new linear quantum codes with good parameters, some of which have parameters exceeding the finite Gilbert-Varshamov bound for linear quantum codes.

  17. REASSESSING MECHANISM AS A PREDICTOR OF PEDIATRIC INJURY MORTALITY

    PubMed Central

    Beck, Haley; Mittal, Sushil; Madigan, David; Burd, Randall S.

    2015-01-01

    Background The use of mechanism of injury as a predictor of injury outcome presents practical challenges because this variable may be missing or inaccurate in many databases. The purpose of this study was to determine the importance of mechanism of injury as a predictor of mortality among injured children. Methods The records of children (<15 years old) sustaining a blunt injury were obtained from the National Trauma Data Bank. Models predicting injury mortality were developed using mechanism of injury together with injury coding based on either Abbreviated Injury Scale post-dot values (low-dimensional injury coding) or injury ICD-9 codes and their two-way interactions (high-dimensional injury coding). Model performance with and without inclusion of mechanism of injury was compared for both coding schemes, and the relative importance of mechanism of injury as a variable in each model type was evaluated. Results Among 62,569 records, a mortality rate of 0.9% was observed. Inclusion of mechanism of injury improved model performance when using low-dimensional injury coding but was associated with no improvement when using high-dimensional injury coding. Mechanism of injury contributed to 28% of model variance when using low-dimensional injury coding and <1% when high-dimensional injury coding was used. Conclusions Although mechanism of injury may be an important predictor of injury mortality among children sustaining blunt trauma, its importance as a predictor of mortality depends on the approach used for injury coding. Mechanism of injury is not an essential predictor of outcome after injury when coding schemes are used that better characterize injuries sustained after blunt pediatric trauma. PMID:26197948

  18. Transversal Clifford gates on folded surface codes

    DOE PAGES

    Moussa, Jonathan E.

    2016-10-12

    Surface and color codes are two forms of topological quantum error correction in two spatial dimensions with complementary properties. Surface codes have lower-depth error detection circuits and well-developed decoders to interpret and correct errors, while color codes have transversal Clifford gates and better code efficiency in the number of physical qubits needed to achieve a given code distance. A formal equivalence exists between color codes and folded surface codes, but it does not guarantee the transferability of any of these favorable properties. However, the equivalence does imply the existence of constant-depth circuit implementations of logical Clifford gates on folded surface codes. We achieve and improve this result by constructing two families of folded surface codes with transversal Clifford gates. This construction is presented generally for qudits of any dimension. Lastly, the specific application of these codes to universal quantum computation based on qubit fusion is also discussed.

  19. Computer-Based Learning of Spelling Skills in Children with and without Dyslexia

    ERIC Educational Resources Information Center

    Kast, Monika; Baschera, Gian-Marco; Gross, Markus; Jancke, Lutz; Meyer, Martin

    2011-01-01

    Our spelling training software recodes words into multisensory representations comprising visual and auditory codes. These codes represent information about letters and syllables of a word. An enhanced version, developed for this study, contains an additional phonological code and an improved word selection controller relying on a phoneme-based…

  20. Color and Grey Scale in Sonar Displays

    NASA Technical Reports Server (NTRS)

    Kraiss, K. F.; Kuettelwesch, K. H.

    1984-01-01

    In spite of numerous publications, it is still rather unclear whether color is of any help in sonar displays. The work presented here deals with a particular type of sonar data, i.e., LOFAR-grams (low frequency analysing and recording), where acoustic sensor data are continuously written as a time-frequency plot. The question to be answered quantitatively is whether color coding improves target detection when compared with a grey scale code. The data show significant differences in receiver-operating-characteristic performance for the selected codes. In addition, it turned out that the background noise level affects performance dramatically for some color codes, while others remain stable or even improve. Generally valid rules are presented on how to generate useful color scales for this particular application.
