Sample records for diagnostic code density

  1. Synthetic Microwave Imaging Reflectometry diagnostic using 3D FDTD Simulations

    NASA Astrophysics Data System (ADS)

    Kruger, Scott; Jenkins, Thomas; Smithe, David; King, Jacob; NIMROD Team

    2017-10-01

    Microwave Imaging Reflectometry (MIR) has become a standard diagnostic for understanding tokamak edge perturbations, including the edge harmonic oscillations in QH-mode operation. These long-wavelength perturbations are larger than the normal turbulent fluctuation levels, and thus the standard analysis of synthetic signals becomes more difficult. To investigate, we construct a synthetic MIR diagnostic for exploring density fluctuation amplitudes in the tokamak plasma edge using the three-dimensional, full-wave FDTD code Vorpal. The source microwave beam for the diagnostic is generated and reflected at the cutoff surface, which is distorted by 2D density fluctuations in the edge plasma. Synthetic imaging optics at the detector can be used to understand the fluctuation and background density profiles. We apply the diagnostic to understand the fluctuations in edge plasma density during QH-mode activity in the DIII-D tokamak, as modeled by the NIMROD code. This work was funded under DOE Grant Number DE-FC02-08ER54972.
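
    The reflectometry geometry in this record can be illustrated with a short, hedged sketch (not taken from the paper): an O-mode probing beam reflects at the layer where the local plasma frequency equals the launch frequency, i.e. at the cutoff density n_c = eps0*m_e*(2*pi*f)^2/e^2. The probing frequency below is only an assumed example.

      # Minimal sketch (not from the paper): O-mode reflectometry cutoff density
      # for a given probing frequency, n_c = eps0 * m_e * (2*pi*f)^2 / e^2.
      import math

      EPS0 = 8.854187817e-12   # F/m
      M_E  = 9.10938356e-31    # kg
      Q_E  = 1.602176634e-19   # C

      def o_mode_cutoff_density(f_hz):
          """Electron density (m^-3) at which an O-mode wave of frequency f_hz reflects."""
          omega = 2.0 * math.pi * f_hz
          return EPS0 * M_E * omega**2 / Q_E**2

      # Example: a 60 GHz probing beam (assumed, typical MIR range) reflects near ~4.5e19 m^-3.
      print(f"{o_mode_cutoff_density(60e9):.3e}")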

  2. Measurement of Atmospheric Pressure Air Plasma via Pulsed Electron Beam and Sustaining Electric Field

    DTIC Science & Technology

    2007-08-29

    Microwave diagnostics quantify electron number density and optical diagnostics quantify ozone production. A particle-in-cell plasma code (MAGIC) and an air-chemistry code are used to quantify beam propagation through an electron-beam transmission window into air and to characterize the generation and maintenance of plasma in air on the timescale of 1 ms. Subject terms: Air Chemistry, Air Plasma, MAGIC Modeling, Plasma, Power, Test-Cell.

  3. Synthetic Diagnostics Platform for Fusion Plasma and a Two-Dimensional Synthetic Electron Cyclotron Emission Imaging Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, Lei

    Magnetic confinement fusion is one of the most promising approaches to achieving fusion energy. With the rapid increase of computational power over the past decades, numerical simulations have become an important tool for studying fusion plasmas. Eventually, these numerical models will be used to predict the performance of future devices, such as the International Thermonuclear Experimental Reactor (ITER) or DEMO. However, the reliability of these models needs to be carefully validated against experiments before the results can be trusted. Validation between simulations and measurements is particularly hard because the quantities directly available from the two sides are different. While the simulations calculate the plasma quantities explicitly, the measurements are usually in the form of diagnostic signals. The traditional way of making the comparison relies on the diagnosticians to interpret the measured signals as plasma quantities. The interpretation is in general very complicated and sometimes not even unique. In contrast, given the plasma quantities from the plasma simulations, we can unambiguously calculate the generation and propagation of the diagnostic signals. These calculations are called synthetic diagnostics, and they enable an alternative way to compare simulation results with measurements. In this dissertation, we present a platform for developing and applying synthetic diagnostic codes. Three diagnostics on the platform are introduced. The reflectometry and beam emission spectroscopy diagnostics measure the electron density, and the electron cyclotron emission diagnostic measures the electron temperature. The theoretical derivation and numerical implementation of a new two-dimensional Electron Cyclotron Emission Imaging code is discussed in detail. This new code has shown the potential to address many challenging aspects of present ECE measurements, such as runaway electron effects and detection of the cross phase between electron temperature and density fluctuations.

  4. A correlation electron cyclotron emission diagnostic and the importance of multifield fluctuation measurements for testing nonlinear gyrokinetic turbulence simulations.

    PubMed

    White, A E; Schmitz, L; Peebles, W A; Carter, T A; Rhodes, T L; Doyle, E J; Gourdain, P A; Hillesheim, J C; Wang, G; Holland, C; Tynan, G R; Austin, M E; McKee, G R; Shafer, M W; Burrell, K H; Candy, J; DeBoo, J C; Prater, R; Staebler, G M; Waltz, R E; Makowski, M A

    2008-10-01

    A correlation electron cyclotron emission (CECE) diagnostic has been used to measure local, turbulent fluctuations of the electron temperature in the core of DIII-D plasmas. This paper describes the hardware and testing of the CECE diagnostic and highlights the importance of measurements of multifield fluctuation profiles for the testing and validation of nonlinear gyrokinetic codes. The process of testing and validating such codes is critical for extrapolation to next-step fusion devices. For the first time, the radial profiles of electron temperature and density fluctuations are compared to nonlinear gyrokinetic simulations. The CECE diagnostic at DIII-D uses correlation radiometry to measure the rms amplitude and spectrum of the electron temperature fluctuations. Gaussian optics are used to produce a poloidal spot size with w_0 ≈ 1.75 cm in the plasma. The intermediate frequency filters and the natural linewidth of the EC emission determine the radial resolution of the CECE diagnostic, which can be less than 1 cm. Wavenumbers resolved by the CECE diagnostic are k_θ ≤ 1.8 cm^-1 and k_r ≤ 4 cm^-1, relevant for studies of long-wavelength turbulence associated with the trapped electron mode and the ion temperature gradient mode. In neutral-beam-heated L-mode plasmas, core electron temperature fluctuations in the region 0.5 < r/a < 0.9 increase with radius from approximately 0.5% to approximately 2%, similar to density fluctuations that are measured simultaneously with beam emission spectroscopy. After incorporating "synthetic diagnostics" to effectively filter the code output, the simulations reproduce the characteristics of the turbulence and transport at one radial location, r/a = 0.5, but not at a second location, r/a = 0.75. These results illustrate that measurements of the profiles of multiple fluctuating fields can provide a significant constraint on the turbulence models employed by the code.
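
    The correlation-radiometry idea can be sketched with a toy calculation (purely illustrative, not the DIII-D analysis chain): two radiometer channels viewing the same volume share the temperature fluctuation but carry independent thermal noise, so the zero-lag cross-covariance recovers the fluctuation rms that a single channel cannot resolve. All signal levels below are invented.

      # Minimal sketch: correlation radiometry recovers the rms of a fluctuation common
      # to two channels whose noise is uncorrelated, via <x1*x2> ~= <s^2>.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 1_000_000
      s = 0.01 * rng.standard_normal(n)          # common "temperature" fluctuation, rms 1% (assumed)
      x1 = s + 0.10 * rng.standard_normal(n)     # channel 1: fluctuation + radiometer noise
      x2 = s + 0.10 * rng.standard_normal(n)     # channel 2: independent noise realization

      rms_single = np.std(x1)                                            # dominated by noise
      rms_corr = np.sqrt(np.mean((x1 - x1.mean()) * (x2 - x2.mean())))   # noise rejected

      print(f"single-channel rms: {rms_single:.4f}, correlated rms: {rms_corr:.4f}")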

  5. Implementation of a multichannel soft x-ray diagnostic for electron temperature measurements in TJ-II high-density plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baiao, D.; Varandas, C.; Medina, F.

    2012-10-15

    Based on the multi-foil technique, a multichannel soft x-ray diagnostic for electron temperature measurements has recently been implemented in the TJ-II stellarator. The diagnostic system is composed of four photodiode arrays with beryllium filters of different thicknesses. An in-vacuum amplifier board is coupled to each array to prevent induced noise currents. The Thomson scattering and vacuum ultraviolet survey diagnostics are used for assessing plasma profiles and composition, with the analysis carried out using the radiation code IONEQ. The electron temperature is determined from the different signal-pair ratios with temporal and spatial resolution. The design and preliminary results from the diagnostic are presented.

  6. Multi-frequency ICRF diagnostic of Tokamak plasmas

    NASA Astrophysics Data System (ADS)

    Lafonteese, David James

    This thesis explores the diagnostic possibilities of a fast-wave-based method for measuring the ion density and temperature profiles of tokamak plasmas. In these studies, fast waves are coupled to the plasma at the second harmonic of the ion gyrofrequency, at which wave energy is absorbed by the finite-temperature ions. As the ion gyrofrequency depends on the local magnetic field, which varies as 1/R in a tokamak, this power absorption is radially localized. The simultaneous launching of multiple frequencies, all resonating at different plasma positions, allows local measurements of the ion density and temperature. To investigate the profile applications of wave damping measurements in a simulated tokamak, an in-house slab-model ICRF code is developed. A variety of analysis methods are presented, and ion density and temperature profiles are reconstructed for hydrogen plasmas in the Electric Tokamak (ET) and ITER parameter spaces. These methods achieve promising results in simulated plasmas featuring bulk ion heating, off-axis RF heating, and density ramps. The experimental results of similar studies on the Electric Tokamak, a high-aspect-ratio (R/a = 5), low-toroidal-field (2.2 kG) device, are then presented. In these studies, six fast wave frequencies were coupled to ET plasmas using a single-strap, low-field-side antenna. The frequencies were variable and could be tuned to resonate at different radii for different experiments. Four magnetic pickup loops were used to measure the toroidal component of the wave magnetic field. The expected greater eigenmode damping of center-resonant frequencies versus edge-resonant frequencies is consistently observed. Comparison of measured aspects of fast wave behavior in ET with the slab code predictions validates the code simulations under weakly damped conditions. A density profile is measured for an ET discharge through analysis of the fast wave measurements and is compared to an electron density profile derived from Thomson scattering data. The methodology behind a similar measurement of the ion temperature profile is also presented.
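
    A minimal sketch of the localization argument (B0 = 0.22 T is taken from the 2.2 kG figure in the abstract; R0 = 5 m and the launched frequencies are assumed for illustration, not values from the thesis): since B(R) ≈ B0*R0/R, a launched frequency f meets the second-harmonic hydrogen resonance f = 2*f_ci at a single major radius, so each frequency samples one radial location.

      # Minimal sketch (partly assumed parameters): second-harmonic hydrogen resonance
      # location for B(R) = B0*R0/R, i.e. R_res = e*B0*R0 / (pi * f * m_p).
      import math

      Q_E = 1.602176634e-19   # C
      M_P = 1.672621924e-27   # kg, hydrogen ion

      def second_harmonic_radius(f_hz, b0_t, r0_m):
          """Major radius (m) where the launched frequency equals 2*f_ci."""
          return Q_E * b0_t * r0_m / (math.pi * f_hz * M_P)

      # Illustrative launched frequencies; B0 = 0.22 T, R0 = 5 m (R0 assumed).
      for f in (6e6, 8e6, 10e6):
          print(f"f = {f/1e6:.0f} MHz -> R_res = {second_harmonic_radius(f, 0.22, 5.0):.2f} m")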

  7. Measurements of confined alphas and tritons in the MHD quiescent core of TFTR plasmas using the pellet charge exchange diagnostic

    NASA Astrophysics Data System (ADS)

    Medley, S. S.; Budny, R. V.; Mansfield, D. K.; Redi, M. H.; Roquemore, A. L.; Fisher, R. K.; Duong, H. H.; McChesney, J. M.; Parks, P. B.; Petrov, M. P.; Gorelenkov, N. N.

    1996-10-01

    The energy distributions and radial density profiles of the fast confined trapped alpha particles in DT experiments on TFTR are being measured in the energy range 0.5-3.5 MeV using the pellet charge exchange (PCX) diagnostic. A brief description of the measurement technique, which involves active neutral particle analysis using the ablation cloud surrounding an injected impurity pellet as the neutralizer, is presented. This paper focuses on alpha and triton measurements in the core of MHD-quiescent TFTR discharges, where the expected classical slowing-down and pitch angle scattering effects are not complicated by stochastic ripple diffusion and sawtooth activity. In particular, the first measurement of the alpha slowing-down distribution up to the birth energy, obtained using boron pellet injection, is presented. The measurements are compared with predictions from the TRANSP Monte Carlo code and/or a Fokker-Planck post-TRANSP processor code, which assumes that the alphas and tritons are well confined and slow down classically. Both the shape of the measured alpha and triton energy distributions and their density ratios are in good agreement with the code calculations. We conclude that the PCX measurements are consistent with classical thermalization of the fusion-generated alphas and tritons.
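
    For reference, a hedged sketch of the textbook classical slowing-down spectrum that the TRANSP and Fokker-Planck calculations embody in far greater detail; the critical energy used here is an assumed illustrative value, not one from the paper.

      # Minimal sketch (textbook form): classical steady-state slowing-down spectrum,
      # f(E) ~ 1 / (E^(3/2) + E_c^(3/2)) up to the 3.5 MeV alpha birth energy, where
      # E_c is the critical energy at which electron and ion drag are equal.
      import numpy as np

      E_BIRTH = 3.5e6        # eV, alpha birth energy
      E_CRIT = 0.5e6         # eV, assumed critical energy for illustration

      def slowing_down(e_ev):
          """Un-normalized classical slowing-down distribution below the birth energy."""
          e_ev = np.asarray(e_ev, dtype=float)
          f = 1.0 / (e_ev**1.5 + E_CRIT**1.5)
          return np.where(e_ev <= E_BIRTH, f, 0.0)

      energies = np.linspace(0.5e6, 3.5e6, 7)
      f = slowing_down(energies)
      print(f / f[0])        # spectrum relative to 0.5 MeV, falling with energy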

  8. AG Dra -- a high density plasma laboratory

    NASA Astrophysics Data System (ADS)

    Young, Peter

    2002-07-01

    A STIS observation of the symbiotic star AG Draconis yielding spectra in the range 1150-10 000 Angstrom is requested. AG Dra is a non-eclipsing binary that shows strong, narrow nebular emission lines that originate in the wind of a K giant, photoionized by a hot white dwarf. The density of the nebula is around 10^10 electrons/cm^3, making it the perfect laboratory for testing the plasma modeling codes cloudy and xstar at high densities. These codes are used for a wide range of astrophysical objects including stellar winds, accretion disks, active galactic nuclei and Seyfert galaxies, and calibrating them against high signal-to-noise spectra from comparatively simple systems is essential. AG Dra is the perfect high density laboratory for this work. In addition, many previously undetected emission lines will be found through the high sensitivity of STIS, which will allow new plasma diagnostics to be tested. These twin objectives are particularly pertinent as the high sensitivity of HST/COS will permit similar high resolution spectroscopy to be applied to a whole new regime of extragalactic objects. By combining far-UV data from FUSE with complementary data from STIS, we will determine ratios of emission lines from the same ion, or ions of similar ionization level. These will permit a more complete set of diagnostics than is obtainable from one instrument alone.

  9. Neutral helium beam probe

    NASA Astrophysics Data System (ADS)

    Karim, Rezwanul

    1999-10-01

    This article discusses the development of a code in which a diagnostic neutral helium beam is used as a probe. The code numerically solves the evolution of the population densities of helium atoms over several energy levels as the beam propagates through the plasma. The collisional radiative model is utilized in this numerical calculation. The spatial dependence of the metastable states of the neutral helium atom, as obtained in this numerical analysis, offers a possible diagnostic tool for tokamak plasmas. The spatial evolution was tested for several hypothetical plasma conditions. Simulation routines were also run with plasma parameters (density and temperature profiles) similar to a shot in the Princeton Beta Experiment-Modified (PBX-M) tokamak and a shot in the Tokamak Fusion Test Reactor (TFTR). A comparison between the simulation results and the experimentally obtained data for each of these two shots is presented. A good correlation in such comparisons for a number of shots can establish the accuracy and usefulness of this probe. The result can potentially be extended to other plasma machines and to various plasma conditions in those machines.
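
    A minimal sketch of the collisional-radiative bookkeeping described above, with made-up rate coefficients and an assumed plasma profile rather than the article's atomic data: the level populations evolve along the beam path according to a rate matrix that scales with the local electron density, and the metastable fraction is the quantity of diagnostic interest.

      # Minimal sketch (illustrative rates only): level populations N of the beam atoms
      # obey dN/ds = M(n_e) N / v_beam, where M collects excitation/de-excitation and
      # ionization rates.
      import numpy as np
      from scipy.integrate import solve_ivp

      V_BEAM = 1.0e6          # m/s, assumed beam speed

      def rate_matrix(n_e):
          """Toy 3-level matrix [ground, metastable, ionized sink]; rates ~ n_e (made up)."""
          k_gm, k_mg, k_gi, k_mi = 1e-14, 5e-15, 2e-15, 8e-15   # m^3/s, invented coefficients
          return n_e * np.array([
              [-(k_gm + k_gi),  k_mg,            0.0],
              [ k_gm,          -(k_mg + k_mi),   0.0],
              [ k_gi,           k_mi,            0.0],
          ])

      def rhs(s, n_pop):
          n_e = 5e19 * np.exp(-((s - 0.3) / 0.15)**2)   # assumed Gaussian density profile, m^-3
          return rate_matrix(n_e) @ n_pop / V_BEAM

      sol = solve_ivp(rhs, (0.0, 0.6), [1.0, 0.0, 0.0], t_eval=np.linspace(0, 0.6, 7))
      print(sol.y[1])   # metastable fraction along the beam path, the diagnostic quantity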

  10. Measurement of deuterium density profiles in the H-mode steep gradient region using charge exchange recombination spectroscopy on DIII-D

    DOE PAGES

    Haskey, S. R.; Grierson, B. A.; Burrell, K. H.; ...

    2016-09-26

    Recent completion of a thirty-two channel main-ion (deuterium) charge exchange recombination spectroscopy (CER) diagnostic on the DIII-D tokamak enables detailed comparisons between impurity and main-ion temperature, density, and toroidal rotation. In an H-mode DIII-D discharge, these new measurement capabilities are used to provide the deuterium density profile, demonstrate the importance of profile alignment between Thomson scattering and CER diagnostics, and aid in determining the electron temperature at the separatrix. Sixteen sightlines cover the core of the plasma and another sixteen are densely packed towards the plasma edge, providing high resolution measurements across the pedestal and steep gradient region in H-mode plasmas. Extracting useful physical quantities such as deuterium density is challenging due to multiple photoemission processes. These challenges are overcome using a detailed fitting model and by forward modeling the photoemission using the FIDASIM code, which implements a comprehensive collisional radiative model. Published by AIP Publishing.

  11. Measurement of deuterium density profiles in the H-mode steep gradient region using charge exchange recombination spectroscopy on DIII-D.

    PubMed

    Haskey, S R; Grierson, B A; Burrell, K H; Chrystal, C; Groebner, R J; Kaplan, D H; Pablant, N A; Stagner, L

    2016-11-01

    Recent completion of a thirty-two channel main-ion (deuterium) charge exchange recombination spectroscopy (CER) diagnostic on the DIII-D tokamak [J. L. Luxon, Nucl. Fusion 42, 614 (2002)] enables detailed comparisons between impurity and main-ion temperature, density, and toroidal rotation. In an H-mode DIII-D discharge, these new measurement capabilities are used to provide the deuterium density profile, demonstrate the importance of profile alignment between Thomson scattering and CER diagnostics, and aid in determining the electron temperature at the separatrix. Sixteen sightlines cover the core of the plasma and another sixteen are densely packed towards the plasma edge, providing high resolution measurements across the pedestal and steep gradient region in H-mode plasmas. Extracting useful physical quantities such as deuterium density is challenging due to multiple photoemission processes. These challenges are overcome using a detailed fitting model and by forward modeling the photoemission using the FIDASIM code, which implements a comprehensive collisional radiative model.

  12. Measurement of deuterium density profiles in the H-mode steep gradient region using charge exchange recombination spectroscopy on DIII-D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haskey, S. R.; Grierson, B. A.; Burrell, K. H.

    Recent completion of a thirty-two channel main-ion (deuterium) charge exchange recombination spectroscopy (CER) diagnostic on the DIII-D tokamak enables detailed comparisons between impurity and main-ion temperature, density, and toroidal rotation. In an H-mode DIII-D discharge, these new measurement capabilities are used to provide the deuterium density profile, demonstrate the importance of profile alignment between Thomson scattering and CER diagnostics, and aid in determining the electron temperature at the separatrix. Sixteen sightlines cover the core of the plasma and another sixteen are densely packed towards the plasma edge, providing high resolution measurements across the pedestal and steep gradient region in H-mode plasmas. Extracting useful physical quantities such as deuterium density is challenging due to multiple photoemission processes. These challenges are overcome using a detailed fitting model and by forward modeling the photoemission using the FIDASIM code, which implements a comprehensive collisional radiative model. Published by AIP Publishing.

  13. Measurement of deuterium density profiles in the H-mode steep gradient region using charge exchange recombination spectroscopy on DIII-D

    NASA Astrophysics Data System (ADS)

    Haskey, S. R.; Grierson, B. A.; Burrell, K. H.; Chrystal, C.; Groebner, R. J.; Kaplan, D. H.; Pablant, N. A.; Stagner, L.

    2016-11-01

    Recent completion of a thirty-two channel main-ion (deuterium) charge exchange recombination spectroscopy (CER) diagnostic on the DIII-D tokamak [J. L. Luxon, Nucl. Fusion 42, 614 (2002)] enables detailed comparisons between impurity and main-ion temperature, density, and toroidal rotation. In an H-mode DIII-D discharge, these new measurement capabilities are used to provide the deuterium density profile, demonstrate the importance of profile alignment between Thomson scattering and CER diagnostics, and aid in determining the electron temperature at the separatrix. Sixteen sightlines cover the core of the plasma and another sixteen are densely packed towards the plasma edge, providing high resolution measurements across the pedestal and steep gradient region in H-mode plasmas. Extracting useful physical quantities such as deuterium density is challenging due to multiple photoemission processes. These challenges are overcome using a detailed fitting model and by forward modeling the photoemission using the FIDASIM code, which implements a comprehensive collisional radiative model.

  14. Benchmark of 3D halo neutral simulation in TRANSP and FIDASIM and application to projected neutral-beam-heated NSTX-U plasmas

    NASA Astrophysics Data System (ADS)

    Liu, D.; Medley, S. S.; Gorelenkova, M. V.; Heidbrink, W. W.; Stagner, L.

    2014-10-01

    A cloud of halo neutrals is created in the vicinity of the beam footprint during neutral beam injection, and the halo neutral density can be comparable to the beam neutral density. Proper modeling of halo neutrals is critical to correctly interpret neutral particle analyzer (NPA) and fast ion D-alpha (FIDA) signals, since these signals strongly depend on the local beam and halo neutral density. A 3D halo neutral model has recently been developed and implemented inside the TRANSP code. The 3D halo neutral code uses a ``beam-in-a-box'' model that encompasses both injected beam neutrals and the resulting halo neutrals. Upon deposition by charge exchange, a subset of the full, one-half and one-third beam energy components produce thermal halo neutrals that are tracked through successive halo neutral generations until an ionization event occurs or a descendant halo exits the box. A benchmark between the 3D halo neutral model in TRANSP and the FIDA/NPA synthetic diagnostic code FIDASIM is carried out. A detailed comparison of halo neutral density profiles from the two codes will be shown. The NPA and FIDA simulations with and without 3D halos are applied to projections of plasma performance for the National Spherical Torus eXperiment-Upgrade (NSTX-U), and the effects of halo neutral density on NPA and FIDA signal amplitude and profile will be presented. Work supported by US DOE.
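
    The generation-by-generation halo tracking can be illustrated with a toy Monte Carlo (assumed outcome probabilities, not the TRANSP/FIDASIM physics): each halo neutral either charge-exchanges into the next generation, is ionized, or exits the box.

      # Minimal sketch (made-up probabilities): follow each halo neutral through
      # successive charge-exchange generations until it ionizes or leaves the box.
      import random
      from collections import Counter

      P_CX, P_ION, P_EXIT = 0.45, 0.35, 0.20   # per-generation outcome probabilities (invented)

      def follow_halo(rng, max_gen=50):
          """Return the generation at which a halo neutral chain terminates."""
          gen = 1
          while gen < max_gen:
              u = rng.random()
              if u < P_CX:                 # charge exchange -> spawns the next halo generation
                  gen += 1
              elif u < P_CX + P_ION:       # ionization event ends the chain
                  return gen
              else:                        # descendant halo exits the box
                  return gen
          return gen

      rng = random.Random(1)
      counts = Counter(follow_halo(rng) for _ in range(100_000))
      total = sum(counts.values())
      for gen in sorted(counts)[:6]:
          print(f"generation {gen}: {counts[gen] / total:.3f}")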

  15. Plasma stability analysis using Consistent Automatic Kinetic Equilibrium reconstruction (CAKE)

    NASA Astrophysics Data System (ADS)

    Roelofs, Matthijs; Kolemen, Egemen; Eldon, David; Glasser, Alex; Meneghini, Orso; Smith, Sterling P.

    2017-10-01

    Presented here is the Consistent Automatic Kinetic Equilibrium (CAKE) code. CAKE is being developed to perform real-time kinetic equilibrium reconstruction, aiming to complete a reconstruction in less than 100 ms. This is achieved by taking into account, in addition to real-time Motional Stark Effect (MSE) and magnetics data, real-time Thomson Scattering (TS) and real-time Charge Exchange Recombination (CER, still in development) data. Electron density and temperature are determined from TS, while ion density and pressure are determined using CER. Together with the temperature and density of neutrals, these form the additional pressure constraints. Extra current constraints are imposed in the core by the MSE diagnostics. The pedestal current density is estimated using the Sauter expression for the bootstrap current density. By comparing the behaviour of the ideal MHD perturbed potential energy (δW) and the linear stability index (Δ') of CAKE reconstructions to magnetics-only reconstructions, it can be seen that the use of these diagnostics to reconstruct the pedestal has a large effect on stability. Supported by U.S. DOE DE-SC0015878 and DE-FC02-04ER54698.

  16. 38 CFR 4.27 - Use of diagnostic code numbers.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2010-07-01 2010-07-01 false Use of diagnostic code... FOR RATING DISABILITIES General Policy in Rating § 4.27 Use of diagnostic code numbers. The diagnostic... residual condition is encountered, requiring rating by analogy, the diagnostic code number will be “built...

  17. 38 CFR 4.27 - Use of diagnostic code numbers.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2011-07-01 2011-07-01 false Use of diagnostic code... FOR RATING DISABILITIES General Policy in Rating § 4.27 Use of diagnostic code numbers. The diagnostic... residual condition is encountered, requiring rating by analogy, the diagnostic code number will be “built...

  18. High-fidelity plasma codes for burn physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cooley, James; Graziani, Frank; Marinak, Marty

    Accurate predictions of equation of state (EOS) and ionic and electronic transport properties are of critical importance for high-energy-density plasma science. Transport coefficients inform radiation-hydrodynamic codes and impact diagnostic interpretation, which in turn impacts our understanding of the development of instabilities, the overall energy balance of burning plasmas, and the efficacy of self-heating from charged-particle stopping. Important processes include thermal and electrical conduction, electron-ion coupling, inter-diffusion, ion viscosity, and charged-particle stopping. However, uncertainties in these coefficients are not well established. Fundamental plasma science codes, also called high-fidelity plasma codes (HFPC), are a relatively recent computational tool that augments both the experimental data and the theoretical foundations of transport coefficients. This paper addresses the current status of HFPC codes and their future development, and the potential impact they can have in improving the predictive capability of the multi-physics hydrodynamic codes used in HED design.

  19. Local variations in the timing of RSV epidemics.

    PubMed

    Noveroske, Douglas B; Warren, Joshua L; Pitzer, Virginia E; Weinberger, Daniel M

    2016-11-11

    Respiratory syncytial virus (RSV) is a primary cause of hospitalizations in children worldwide. The timing of seasonal RSV epidemics needs to be known in order to administer prophylaxis to high-risk infants at the appropriate time. We used data from the Connecticut State Inpatient Database to identify RSV hospitalizations based on ICD-9 diagnostic codes. Harmonic regression analyses were used to evaluate RSV epidemic timing at the county and ZIP code levels. Linear regression was used to investigate associations between the socioeconomic status of a locality and RSV epidemic timing. 9,740 hospitalizations coded as RSV occurred among children less than 2 years old between July 1, 1997 and June 30, 2013. The earliest ZIP code had a seasonal RSV epidemic that peaked, on average, 4.64 weeks earlier than the latest ZIP code. Earlier epidemic timing was significantly associated with demographic characteristics (higher population density and a larger fraction of the population that was black). Seasonal RSV epidemics in Connecticut occurred earlier in areas that were more urban (higher population density and a larger fraction of the population that was black). These findings could be used to better time the administration of prophylaxis to high-risk infants.
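
    A hedged sketch of the harmonic-regression step on simulated weekly counts (not the Connecticut data): annual sine and cosine terms are fit to the series, and the fitted phase gives the average peak week for a locality.

      # Minimal sketch: fit intercept + annual sin/cos terms to weekly counts and
      # convert the fitted phase into a peak week.
      import numpy as np

      weeks = np.arange(0, 52 * 5)                       # five seasons of weekly data
      true_peak = 2.0                                    # true peak at week 2 (assumed)
      lam = 20 * np.exp(2.0 * np.cos(2 * np.pi * (weeks - true_peak) / 52))
      rng = np.random.default_rng(3)
      counts = rng.poisson(lam)

      # Design matrix: intercept, sine and cosine with a 52-week period.
      X = np.column_stack([np.ones_like(weeks, dtype=float),
                           np.sin(2 * np.pi * weeks / 52),
                           np.cos(2 * np.pi * weeks / 52)])
      beta, *_ = np.linalg.lstsq(X, np.log(counts + 1.0), rcond=None)

      phase = np.arctan2(beta[1], beta[2])               # radians; counts peak at this phase
      peak_week = (phase / (2 * np.pi) * 52) % 52
      print(f"estimated peak week: {peak_week:.1f}")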

  20. Status of Artist Upgrade

    DTIC Science & Technology

    1988-09-01

    Subject terms: Autodrift, ARTIST Autoscaling, Electron Density Profiles. Listed figures: ARTIST Scaled Parameters; ARTIST ASCII Ionogram; ARTISTSV Optifont Ionogram; Autoscaling of Es Trace Before... The report also describes diagnostic programs for testing communication ports; the contract required a performance evaluation of ARTIST using manual and autoscaled data.

  1. Enormous knowledge base of disease diagnosis criteria.

    PubMed

    Xiao, Z H; Xiao, Y H; Pei, J H

    1995-01-01

    One of the problems in the development of medical knowledge systems is the limitation of the system's knowledge. It is a common expectation to increase the number of diseases contained in a system. Using a high density knowledge representation method designed by us, we have developed the Enormous Knowledge Base of Disease Diagnosis Criteria (EKBDDC). It contains diagnostic criteria for 1,001 diagnostic entities and describes nearly 4,000 items of diagnostic indicators. It is the core of a huge medical project, the Electronic-Brain Medical Erudite (EBME). This enormous knowledge base was implemented initially on a low-cost popular microcomputer, and it can aid in prompting typical diseases and in the teaching of diagnosis. The knowledge base is easy to expand. One of the main goals of EKBDDC is to increase the number of diseases included in it as far as possible using a low-cost computer with a comparatively small storage capacity. For this, we have designed a high density knowledge representation method. Criteria for the various diagnostic entities are stored in separate records of the knowledge base. Each diagnostic entity corresponds to a diagnostic criterion data set; each data set consists of diagnostic criterion data values (Table 1); each data value is composed of two parts, an integer and a decimal: the integer part is the coding number of the given diagnostic information, and the decimal part is the diagnostic value of this information for the disease indicated by the corresponding record number. For example, in 75.02 the integer 75 is the coding number of "hemorrhagic skin rash" and the decimal 0.02 is the diagnostic value of this manifestation for diagnosing allergic purpura. TABULAR DATA, SEE PUBLISHED ABSTRACT. The algebraic sum method, a special form of weighted summation, is adopted as the mathematical model. In EKBDDC, the diagnostic values, which represent the significance of the disease manifestations for diagnosing the corresponding diseases, were determined empirically. It is of great economic, practical, and technical significance to realize enormous knowledge bases of disease diagnosis criteria on a low-cost popular microcomputer. This helps developing countries popularize medical informatics. To create an enormous international computer-aided diagnosis system, unified modules of disease diagnosis criteria could be developed jointly and "inlaid" into relevant computer-aided diagnosis systems. It is just like assembling a house using prefabricated panels.
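
    The "integer.decimal" representation and the algebraic-sum scoring described above can be sketched as follows; apart from the 75.02 example quoted in the abstract, all entries are invented placeholders.

      # Minimal sketch of the paper's encoding and algebraic-sum scoring. Only 75.02
      # ("hemorrhagic skin rash" for allergic purpura) comes from the abstract; the
      # other codes and weights are hypothetical.
      def split_value(v):
          """Split e.g. 75.02 into the manifestation code (75) and its weight (0.02)."""
          code = int(v)
          weight = round(v - code, 4)
          return code, weight

      # One record per diagnostic entity: its coded manifestation/weight pairs.
      knowledge_base = {
          "allergic purpura": [75.02, 12.01, 33.015],      # 12.xx and 33.xxx are made up
      }

      def score(disease, present_codes):
          """Algebraic sum of the weights of the manifestations actually observed."""
          total = 0.0
          for v in knowledge_base[disease]:
              code, weight = split_value(v)
              if code in present_codes:
                  total += weight
          return total

      print(f"{score('allergic purpura', present_codes={75, 33}):.3f}")   # 0.02 + 0.015 = 0.035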

  2. An image filtering technique for SPIDER visible tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fonnesu, N., E-mail: nicola.fonnesu@igi.cnr.it; Agostini, M.; Brombin, M.

    2014-02-15

    The tomographic diagnostic developed for the beam generated in the SPIDER facility (the 100 keV, 50 A prototype negative ion source of the ITER neutral beam injector) will characterize the two-dimensional particle density distribution of the beam. The simulations described in the paper show that instrumental noise has a large influence on the maximum achievable resolution of the diagnostic. To reduce its impact on beam pattern reconstruction, a filtering technique has been adapted and implemented in the tomography code. This technique is applied to the simulated tomographic reconstruction of the SPIDER beam, and the main results are reported.

  3. Environmental durability diagnostic for printed identification codes of polymer insulation for distribution pipelines

    NASA Astrophysics Data System (ADS)

    Zhuravleva, G. N.; Nagornova, I. V.; Kondratov, A. P.; Bablyuk, E. B.; Varepo, L. G.

    2017-08-01

    Research and modelling of the weatherability and environmental durability of multilayer polymer insulation for both cables and pipelines with printed barcodes or color identification information were performed. It was proved that interlayer printing of identification codes in distribution pipeline insulation coatings provides high marking stability to light and atmospheric condensation. This allows remote damage monitoring to be carried out. However, microbiological fouling of the upper polymer layer hampers remote identification of pipeline damage. The color difference values and density changes of PE and PVC printed insulation due to weather and biological factors were determined.

  4. Modelling of Divertor Detachment in MAST Upgrade

    NASA Astrophysics Data System (ADS)

    Moulton, David; Carr, Matthew; Harrison, James; Meakins, Alex

    2017-10-01

    MAST Upgrade will have extensive capabilities to explore the benefits of alternative divertor configurations, such as the conventional, Super-X, X-divertor, and snowflake configurations and their variants, in a single device with closed divertors. Initial experiments will concentrate on exploring the Super-X and conventional configurations, in terms of power and particle loads to divertor surfaces, access to detachment, and its control. Simulations have been carried out with the SOLPS5.0 code validated against MAST experiments. The simulations predict that the Super-X configuration has significant advantages over the conventional one, such as a lower detachment threshold (2-3x lower in terms of upstream density and 4x higher in terms of P_SOL). Synthetic spectroscopy diagnostics from these simulations have been created using the Raysect ray tracing code to produce synthetic filtered camera images, spectra, and foil bolometer data. Forward modelling of the current set of divertor diagnostics will be presented, together with a discussion of future diagnostics and analysis to improve estimates of the plasma conditions. Work supported by the RCUK Energy Programme [Grant Number EP/P012450/1] and EURATOM.

  5. ALCBEAM - Neutral beam formation and propagation code for beam-based plasma diagnostics

    NASA Astrophysics Data System (ADS)

    Bespamyatnov, I. O.; Rowan, W. L.; Liao, K. T.

    2012-03-01

    ALCBEAM is a new three-dimensional neutral beam formation and propagation code. It was developed to support the beam-based diagnostics installed on the Alcator C-Mod tokamak. The purpose of the code is to provide reliable estimates of the local beam equilibrium parameters, such as beam energy fractions, density profiles, and excitation populations. The code effectively unifies the ion beam formation, extraction, and neutralization processes with beam attenuation and excitation in plasma and neutral gas and beam stopping by the beam apertures. This paper describes the physical processes interpreted and utilized by the code, along with the computational methods exploited. The description is concluded by an example simulation of beam penetration into an Alcator C-Mod plasma. The code is being used successfully on the Alcator C-Mod tokamak and is expected to be valuable in the support of beam-based diagnostics in most other tokamak environments.
    Program summary
    Program title: ALCBEAM
    Catalogue identifier: AEKU_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKU_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 66 459
    No. of bytes in distributed program, including test data, etc.: 7 841 051
    Distribution format: tar.gz
    Programming language: IDL
    Computer: Workstation, PC
    Operating system: Linux
    RAM: 1 GB
    Classification: 19.2
    Nature of problem: Neutral beams are commonly used to heat and/or diagnose high-temperature magnetically confined laboratory plasmas. An accurate neutral beam characterization is required for beam-based measurements of plasma properties. Beam parameters such as the density distribution, energy composition, and atomic excited populations of the beam atoms need to be known.
    Solution method: A neutral beam is initially formed as an ion beam, which is extracted from the ion source by a high voltage applied to the extraction and accelerating grids. The current distribution of a single beamlet emitted from a single pore of the IOS depends on the shape of the plasma boundary in the emission region. The total beam extracted by the IOS is calculated at every point of a 3D mesh as the sum of contributions from each grid pore. The code effectively unifies the ion beam formation, extraction, and neutralization processes with neutral beam attenuation and excitation in plasma and neutral gas and beam stopping by the beam apertures.
    Running time: 10 min for a standard run.

  6. Screening pregnant women for suicidal behavior in electronic medical records: diagnostic codes vs. clinical notes processed by natural language processing.

    PubMed

    Zhong, Qiu-Yue; Karlson, Elizabeth W; Gelaye, Bizu; Finan, Sean; Avillach, Paul; Smoller, Jordan W; Cai, Tianxi; Williams, Michelle A

    2018-05-29

    We examined the comparative performance of structured diagnostic codes vs. natural language processing (NLP) of unstructured text for screening suicidal behavior among pregnant women in electronic medical records (EMRs). Women aged 10-64 years with at least one diagnostic code related to pregnancy or delivery (N = 275,843) from Partners HealthCare were included as our "datamart." Diagnostic codes related to suicidal behavior were applied to the datamart to screen women for suicidal behavior. Among women without any diagnostic codes related to suicidal behavior (n = 273,410), 5880 women were randomly sampled, of whom 1120 had at least one mention of terms related to suicidal behavior in clinical notes. NLP was then used to process clinical notes for the 1120 women. Chart reviews were performed for subsamples of women. Using diagnostic codes, 196 pregnant women were screened positive for suicidal behavior, among whom 149 (76%) had confirmed suicidal behavior by chart review. Using NLP among those without diagnostic codes, 486 pregnant women were screened positive for suicidal behavior, among whom 146 (30%) had confirmed suicidal behavior by chart review. The use of NLP substantially improves the sensitivity of screening for suicidal behavior in EMRs. However, the prevalence of confirmed suicidal behavior was lower among women who did not have diagnostic codes for suicidal behavior but screened positive by NLP. NLP should be used together with diagnostic codes for future EMR-based phenotyping studies of suicidal behavior.
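
    A quick arithmetic recheck of the positive predictive values quoted above, using only the counts given in the abstract:

      # Recompute the PPVs reported in the abstract from its own counts.
      codes_screened, codes_confirmed = 196, 149
      nlp_screened, nlp_confirmed = 486, 146

      print(f"PPV of diagnostic codes: {codes_confirmed / codes_screened:.0%}")   # ~76%
      print(f"PPV of NLP screening:    {nlp_confirmed / nlp_screened:.0%}")       # ~30%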

  7. Development of a GPU-Accelerated 3-D Full-Wave Code for Electromagnetic Wave Propagation in a Cold Plasma

    NASA Astrophysics Data System (ADS)

    Woodbury, D.; Kubota, S.; Johnson, I.

    2014-10-01

    Computer simulations of electromagnetic wave propagation in magnetized plasmas are an important tool for both plasma heating and diagnostics. For active millimeter-wave and microwave diagnostics, accurately modeling the evolution of the beam parameters for launched, reflected or scattered waves in a toroidal plasma requires that calculations be done using the full 3-D geometry. Previously, we reported on the application of GPGPU (General-Purpose computing on Graphics Processing Units) to a 3-D vacuum Maxwell code using the FDTD (Finite-Difference Time-Domain) method. Tests were done for Gaussian beam propagation with a hard source antenna, utilizing the parallel processing capabilities of the NVIDIA K20M. In the current study, we have modified the 3-D code to include a soft source antenna and an induced current density based on the cold plasma approximation. Results from Gaussian beam propagation in an inhomogeneous anisotropic plasma, along with comparisons to ray- and beam-tracing calculations will be presented. Additional enhancements, such as advanced coding techniques for improved speedup, will also be investigated. Supported by U.S. DoE Grant DE-FG02-99-ER54527 and in part by the U.S. DoE, Office of Science, WDTS under the Science Undergraduate Laboratory Internship program.
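
    A minimal one-dimensional sketch of the cold-plasma FDTD approach the abstract describes (not the GPU code itself; grid, frequency, and density values are assumed): the plasma response enters as an auxiliary current density with dJ/dt = eps0*wp^2*E, and a soft source launches a wave that becomes evanescent where wp exceeds the drive frequency.

      # Minimal 1-D FDTD sketch with an auxiliary cold-plasma current density
      # (unmagnetized, collisionless); all numerical values are illustrative.
      import numpy as np

      C0, EPS0, MU0 = 2.998e8, 8.854e-12, 4e-7 * np.pi
      nx, dx = 800, 1e-3
      dt = 0.5 * dx / C0

      f_drive = 3.0e10                                   # 30 GHz soft source (assumed)
      wp = np.zeros(nx)
      wp[500:] = 2.0 * 2 * np.pi * f_drive               # overdense slab: wp = 2*omega

      ez, hy, jz = np.zeros(nx), np.zeros(nx), np.zeros(nx)
      for n in range(2000):
          hy[:-1] += dt / (MU0 * dx) * (ez[1:] - ez[:-1])
          ez[1:]  += dt / (EPS0 * dx) * (hy[1:] - hy[:-1]) - dt / EPS0 * jz[1:]
          jz      += dt * EPS0 * wp**2 * ez               # dJ/dt = eps0 * wp^2 * E
          ez[100] += np.sin(2 * np.pi * f_drive * n * dt) # soft source

      print(f"vacuum-side max |Ez|: {np.max(np.abs(ez[150:450])):.2e}, "
            f"deep in the overdense slab: {np.max(np.abs(ez[560:])):.2e}")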

  8. Experimental benchmark of the NINJA code for application to the Linac4 H- ion source plasma

    NASA Astrophysics Data System (ADS)

    Briefi, S.; Mattei, S.; Rauner, D.; Lettry, J.; Tran, M. Q.; Fantz, U.

    2017-10-01

    For a dedicated performance optimization of the negative hydrogen ion sources used at particle accelerators, a detailed assessment of the plasma processes is required. Due to the compact design of these sources, diagnostic access is typically limited to optical emission spectroscopy, yielding only line-of-sight integrated results. In order to allow for a spatially resolved investigation, the electromagnetic particle-in-cell Monte Carlo collision code NINJA has been developed for the Linac4 ion source at CERN. This code considers the RF field generated by the ICP coil as well as the external static magnetic fields and self-consistently calculates the resulting discharge properties. NINJA is benchmarked at the diagnostically well-accessible laboratory experiment CHARLIE (Concept studies for Helicon Assisted RF Low pressure Ion sourcEs) at varying RF power and gas pressure. Good general agreement is observed between experiment and simulation, although the simulated electron density trends for varying pressure and power, as well as the absolute electron temperature values, deviate slightly from the measured ones. This can be explained by the assumption of strong inductive coupling in NINJA, whereas the CHARLIE discharges show the characteristics of loosely coupled plasmas. For the Linac4 plasma, this assumption is valid. Accordingly, both the absolute values of the accessible plasma parameters and their trends for varying RF power agree well between measurement and simulation. At varying RF power, the H- current extracted from the Linac4 source peaks at 40 kW. For volume operation, this is well reflected by assessing the processes in front of the extraction aperture based on the simulation results, where the highest H- density is obtained at the same power level. In surface operation, the production of negative hydrogen ions at the converter surface can only be considered by specialized beam formation codes, which require plasma parameters as input. It has been demonstrated that this input can be provided reliably by the NINJA code.

  9. Density diagnostics of ionized outflows in active galactic nuclei

    NASA Astrophysics Data System (ADS)

    Mao, J.; Kaastra, J.; Mehdipour, M.; Raassen, T.; Gu, L.

    2017-10-01

    Ionized outflows in Active Galactic Nuclei are thought to influence their nuclear and local galactic environment. However, the distance of the outflows from the central engine is poorly constrained, which limits our understanding of the kinetic power carried by the outflows. Therefore, the impact of AGN outflows on their host galaxies is uncertain. Given the density of the outflows, their distance follows immediately from the definition of the ionization parameter. Here we carry out a theoretical study of density diagnostics of AGN outflows using absorption lines from metastable levels in Be-like to F-like ions. With the new self-consistent photoionization model (PION) in the SPEX code, we are able to calculate ground and metastable level populations. This enables us to determine under what physical conditions these levels are significantly populated. We then identify characteristic transitions from these metastable levels in the X-ray band. Firm detections of absorption lines from such metastable levels are challenging for current grating instruments. The next generation of spectrometers, like X-IFU onboard Athena, will certainly identify the presence or absence of these density-sensitive absorption lines, thus tightly constraining the location and the kinetic power of AGN outflows.
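
    A hedged numerical illustration of the distance argument (all values assumed, not from the paper): from the ionization parameter xi = L/(n*r^2), a measured density immediately yields the outflow distance r = sqrt(L/(n*xi)).

      # Minimal sketch: outflow distance from the ionization parameter definition.
      import math

      def outflow_distance_cm(l_ion, n_cm3, xi):
          """Distance (cm) of an ionized outflow from the ionizing source."""
          return math.sqrt(l_ion / (n_cm3 * xi))

      # Illustrative values only: L = 1e44 erg/s, xi = 100 erg cm/s.
      for n in (1e4, 1e8, 1e12):     # cm^-3, the range metastable lines help discriminate
          pc = outflow_distance_cm(1e44, n, 100.0) / 3.086e18
          print(f"n = {n:.0e} cm^-3 -> r = {pc:.2e} pc")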

  10. Validation of Diagnostic Groups Based on Health Care Utilization Data Should Adjust for Sampling Strategy.

    PubMed

    Cadieux, Geneviève; Tamblyn, Robyn; Buckeridge, David L; Dendukuri, Nandini

    2017-08-01

    Valid measurement of outcomes such as disease prevalence using health care utilization data is fundamental to the implementation of a "learning health system." Definitions of such outcomes can be complex, based on multiple diagnostic codes. The literature on validating such data demonstrates a lack of awareness of the need for a stratified sampling design and corresponding statistical methods. We propose a method for validating the measurement of diagnostic groups that have: (1) different prevalences of diagnostic codes within the group; and (2) low prevalence. We describe an estimation method whereby: (1) low-prevalence diagnostic codes are oversampled, and the positive predictive value (PPV) of the diagnostic group is estimated as a weighted average of the PPV of each diagnostic code; and (2) claims that fall within a low-prevalence diagnostic group are oversampled relative to claims that are not, and bias-adjusted estimators of sensitivity and specificity are generated. We illustrate our proposed method using an example from population health surveillance in which diagnostic groups are applied to physician claims to identify cases of acute respiratory illness. Failure to account for the prevalence of each diagnostic code within a diagnostic group leads to the underestimation of the PPV, because low-prevalence diagnostic codes are more likely to be false positives. Failure to adjust for oversampling of claims that fall within the low-prevalence diagnostic group relative to those that do not leads to the overestimation of sensitivity and underestimation of specificity.
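
    A minimal sketch of the prevalence-weighted PPV estimator proposed above, with invented counts for a hypothetical three-code group; the point is that oversampling low-prevalence codes biases the naive pooled PPV downward.

      # Minimal sketch (hypothetical counts): group PPV as the prevalence-weighted
      # average of per-code PPVs, versus the naive pooled estimate.
      # Each entry: claims in the population, validated sample size, true positives found.
      codes = {
          "code_A": {"claims": 9000, "sampled": 100, "true_pos": 95},
          "code_B": {"claims": 800,  "sampled": 100, "true_pos": 70},   # oversampled
          "code_C": {"claims": 200,  "sampled": 100, "true_pos": 40},   # oversampled
      }

      total_claims = sum(c["claims"] for c in codes.values())

      naive_ppv = (sum(c["true_pos"] for c in codes.values())
                   / sum(c["sampled"] for c in codes.values()))
      weighted_ppv = sum((c["claims"] / total_claims) * (c["true_pos"] / c["sampled"])
                         for c in codes.values())

      print(f"naive (unweighted) PPV:  {naive_ppv:.3f}")    # biased low by oversampling
      print(f"prevalence-weighted PPV: {weighted_ppv:.3f}")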

  11. Experimental investigation of stability, frequency and toroidal mode number of compressional Alfvén eigenmodes in DIII-D

    NASA Astrophysics Data System (ADS)

    Tang, S.; Thome, K.; Pace, D.; Heidbrink, W. W.; Carter, T. A.; Crocker, N. A.; NSTX-U Collaboration; DIII-D Collaboration

    2017-10-01

    An experimental investigation of the stability of Doppler-shifted cyclotron resonant compressional Alfvén eigenmodes (CAE) using the flexible DIII-D neutral beams has begun, in order to validate theoretical understanding and realize the CAE's diagnostic potential. CAEs are excited by energetic ions from neutral beams [Heidbrink, NF 2006], with frequencies and toroidal mode numbers sensitive to the fast-ion phase space distribution, making them a potentially powerful passive diagnostic. The experiment also contributes to a predictive capability for spherical tokamak temperature profiles, where CAEs may play a role in energy transport [Crocker, NF 2013]. CAE activity was observed using the recently developed Ion Cyclotron Emission diagnostic: high-bandwidth edge magnetic sensors sampled at 200 MS/s. Preliminary results show CAEs become unstable in BT ramp discharges below a critical threshold in the range 1.7-1.9 T, with the exact value increasing as density increases. The experiment will be used to validate simulations from relevant codes such as the hybrid MHD code [Belova, PRL 2015]. This work was supported by US DOE Grants DE-SC0011810 and DE-FC02-04ER54698.

  12. Single-Shot Scalar-Triplet Measurements in High-Pressure Swirl-Stabilized Flames for Combustion Code Validation

    NASA Technical Reports Server (NTRS)

    Kojima, Jun; Nguyen, Quang-Viet

    2007-01-01

    In support of NASA ARMD's code validation project, we have made significant progress by providing the first quantitative single-shot multi-scalar data from a turbulent, elevated-pressure (5 atm), swirl-stabilized, lean direct injection (LDI) type research burner operating on CH4-air, using a spatially resolved pulsed-laser spontaneous Raman diagnostic technique. The Raman diagnostics apparatus and data analysis presented here were developed over the past 6 years at Glenn Research Center. From the Raman scattering data, we produce spatially mapped probability density functions (PDFs) of the instantaneous temperature, determined using a newly developed low-resolution effective rotational bandwidth (ERB) technique. The measured three-scalar (triplet) correlations between temperature, CH4, and O2 concentrations, as well as their PDFs, also provide a high level of detail on the nature and extent of the turbulent mixing process and its impact on chemical reactions in a realistic gas turbine injector flame at elevated pressures. The multi-scalar triplet data set presented here provides a good validation case for CFD combustion codes by supplying both average and statistical values for the three measured scalars.

  13. Investigating inertial confinement fusion target fuel conditions through x-ray spectroscopy

    NASA Astrophysics Data System (ADS)

    Hansen, Stephanie B.

    2012-05-01

    Inertial confinement fusion (ICF) targets are designed to produce hot, dense fuel in a neutron-producing core that is surrounded by a shell of compressing material. The x-rays emitted from ICF plasmas can be analyzed to reveal details of the temperatures, densities, gradients, velocities, and mix characteristics of ICF targets. Such diagnostics are critical to understand the target performance and to improve the predictive power of simulation codes.

  14. Synthetic NPA diagnostic for energetic particles in JET plasmas

    NASA Astrophysics Data System (ADS)

    Varje, J.; Sirén, P.; Weisen, H.; Kurki-Suonio, T.; Äkäslompolo, S.; JET contributors

    2017-11-01

    Neutral particle analysis (NPA) is one of the few methods for diagnosing fast ions inside a plasma by measuring neutral atom fluxes emitted due to charge exchange reactions. The JET tokamak features an NPA diagnostic which measures neutral atom fluxes and energy spectra simultaneously for hydrogen, deuterium and tritium species. A synthetic NPA diagnostic has been developed and used to interpret these measurements to diagnose energetic particles in JET plasmas with neutral beam injection (NBI) heating. The synthetic NPA diagnostic performs a Monte Carlo calculation of the neutral atom fluxes in a realistic geometry. The 4D fast ion distributions, representing NBI ions, were simulated using the Monte Carlo orbit-following code ASCOT. Neutral atom density profiles were calculated using the FRANTIC neutral code in the JINTRAC modelling suite. Additionally, for rapid analysis, a scan of neutral profiles was precalculated with FRANTIC for a range of typical plasma parameters. These were taken from the JETPEAK database, which includes a comprehensive set of data from the flat-top phases of nearly all discharges in recent JET campaigns. The synthetic diagnostic was applied to various JET plasmas in the recent hydrogen campaign where different hydrogen/deuterium mixtures and NBI configurations were used. The simulated neutral fluxes from the fast ion distributions were found to agree with the measured fluxes, reproducing the slowing-down profiles for different beam isotopes and energies and quantitatively estimating the fraction of hydrogen and deuterium fast ions.
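
    The flux calculation can be caricatured with a toy sightline integral (all profiles and cross-sections below are assumed, not the ASCOT/FRANTIC results): the NPA flux at a given energy scales as the integral along the line of sight of the fast-ion distribution times the neutral density times the charge-exchange cross-section times the ion speed.

      # Minimal sketch: relative passive NPA flux from a sightline integral with toy profiles.
      import numpy as np

      M_D = 3.344e-27            # kg, deuteron mass
      Q_E = 1.602e-19            # C

      def npa_flux(e_ev, n_points=400):
          """Relative NPA flux at energy e_ev along a 1 m toy sightline."""
          l = np.linspace(0.0, 1.0, n_points)                              # m
          dl = l[1] - l[0]
          f_fast = np.exp(-((l - 0.5) / 0.2) ** 2) * np.exp(-e_ev / 40e3)  # toy fast-ion distribution
          n0 = 1e15 * np.exp(-l / 0.1)                                     # m^-3, edge-peaked neutrals (assumed)
          sigma_cx = 7e-20 * (e_ev / 1e4) ** -0.5                          # m^2, crude CX scaling (assumed)
          v = np.sqrt(2.0 * e_ev * Q_E / M_D)                              # m/s
          return np.sum(f_fast * n0 * sigma_cx * v) * dl

      for e in (20e3, 40e3, 60e3):
          print(f"E = {e/1e3:.0f} keV -> relative flux {npa_flux(e):.3e}")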

  15. Evaluating a Dental Diagnostic Terminology in an Electronic Health Record

    PubMed Central

    White, Joel M.; Kalenderian, Elsbeth; Stark, Paul C.; Ramoni, Rachel L.; Vaderhobli, Ram; Walji, Muhammad F.

    2011-01-01

    Standardized treatment procedure codes and terms are routinely used in dentistry. Utilization of a diagnostic terminology is common in medicine, but there is not a satisfactory or commonly standardized dental diagnostic terminology available at this time. Recent advances in dental informatics have provided an opportunity for inclusion of diagnostic codes and terms as part of treatment planning and documentation in the patient treatment history. This article reports the results of the use of a diagnostic coding system in a large dental school’s predoctoral clinical practice. A list of diagnostic codes and terms, called Z codes, was developed by dental faculty members. The diagnostic codes and terms were implemented into an electronic health record (EHR) for use in a predoctoral dental clinic. The utilization of diagnostic terms was quantified. The validity of Z code entry was evaluated by comparing the diagnostic term entered to the procedure performed, where valid diagnosis-procedure associations were determined by consensus among three calibrated academically based dentists. A total of 115,004 dental procedures were entered into the EHR during the year sampled. Of those, 43,053 were excluded from this analysis because they represent diagnosis or other procedures unrelated to treatments. Among the 71,951 treatment procedures, 27,973 had diagnoses assigned to them with an overall utilization of 38.9 percent. Of the 147 available Z codes, ninety-three were used (63.3 percent). There were 335 unique procedures provided and 2,127 procedure/diagnosis pairs captured in the EHR. Overall, 76.7 percent of the diagnoses entered were valid. We conclude that dental diagnostic terminology can be incorporated within an electronic health record and utilized in an academic clinical environment. Challenges remain in the development of terms and implementation and ease of use that, if resolved, would improve the utilization. PMID:21546594

  16. High-Content Optical Codes for Protecting Rapid Diagnostic Tests from Counterfeiting.

    PubMed

    Gökçe, Onur; Mercandetti, Cristina; Delamarche, Emmanuel

    2018-06-19

    Warnings and reports on counterfeit diagnostic devices are released several times a year by regulators and public health agencies. Unfortunately, mishandling, altering, and counterfeiting point-of-care diagnostics (POCDs) and rapid diagnostic tests (RDTs) is lucrative, relatively simple and can lead to devastating consequences. Here, we demonstrate how to implement optical security codes in silicon- and nitrocellulose-based flow paths for device authentication using a smartphone. The codes are created by inkjet spotting inks directly on nitrocellulose or on micropillars. Codes containing up to 32 elements per mm^2 and 8 colors can encode as many as 10^45 combinations. Codes on silicon micropillars can be erased by setting a continuous flow path across the entire array of code elements or, for nitrocellulose, by simply wicking a liquid across the code. Static or labile code elements can further be formed on nitrocellulose to create a hidden code using poly(ethylene glycol) (PEG) or glycerol additives to the inks. More advanced codes having a specific deletion sequence can also be created in silicon microfluidic devices using an array of passive routing nodes, which activate in a particular, programmable sequence. Such codes are simple to fabricate, easy to view, and efficient in coding information; they can be ideally used in combination with information on a package to protect diagnostic devices from counterfeiting.
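
    A quick combinatorial check of the encoding-capacity claim, assuming the code elements are independent; the 50-element figure is inferred for illustration, not stated in the abstract.

      # Back-of-envelope check: with 8 colors per element, about 50 elements already
      # give ~1e45 distinct codes, i.e. under 2 mm^2 of pattern at 32 elements/mm^2.
      colors = 8
      elements = 50                      # assumed count consistent with the 10^45 claim
      combinations = colors ** elements
      print(f"8^50 = {combinations:.2e}")                          # ~1.4e45
      print(f"area at 32 elements/mm^2: {elements / 32:.2f} mm^2")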

  17. HAARP-Induced Ionospheric Ducts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Milikh, Gennady; Vartanyan, Aram

    2011-01-04

    It is well known that strong electron heating by a powerful HF facility can lead to the formation of electron and ion density perturbations that stretch along the magnetic field line. Those density perturbations can serve as ducts for ELF waves, both of natural and artificial origin. This paper presents observations of the plasma density perturbations caused by HF heating of the ionosphere by the HAARP facility. The low-orbit satellite DEMETER was used as a diagnostic tool to measure the electron and ion temperature and density along the satellite orbit, overflying close to the magnetic zenith of the HF heater. Those observations are then checked against a theoretical model of duct formation due to HF heating of the ionosphere. The model is based on the modified SAMI2 code and is validated by comparison with well documented experiments.

  18. Recent Progress and Future Plans for Fusion Plasma Synthetic Diagnostics Platform

    NASA Astrophysics Data System (ADS)

    Shi, Lei; Kramer, Gerrit; Tang, William; Tobias, Benjamin; Valeo, Ernest; Churchill, Randy; Hausammann, Loic

    2015-11-01

    The Fusion Plasma Synthetic Diagnostics Platform (FPSDP) is a Python package developed at the Princeton Plasma Physics Laboratory. It is dedicated to providing an integrated programmable environment for applying a modern ensemble of synthetic diagnostics to the experimental validation of fusion plasma simulation codes. The FPSDP will allow physicists to directly compare key laboratory measurements to simulation results. This enables deeper understanding of experimental data, more realistic validation of simulation codes, quantitative assessment of existing diagnostics, and new capabilities for the design and optimization of future diagnostics. The Fusion Plasma Synthetic Diagnostics Platform now has data interfaces for the GTS and XGC-1 global particle-in-cell simulation codes, with synthetic diagnostic modules including: (i) 2D and 3D Reflectometry; (ii) Beam Emission Spectroscopy; and (iii) 1D Electron Cyclotron Emission. Results will be reported on the delivery of interfaces for the global electromagnetic PIC code GTC, the extended MHD code M3D-C1, and the electromagnetic hybrid NOVA-K eigenmode code. Progress toward development of a more comprehensive 2D Electron Cyclotron Emission module will also be discussed. This work is supported by DOE contract #DE-AC02-09CH11466.

  19. Efficient genome-wide association in biobanks using topic modeling identifies multiple novel disease loci.

    PubMed

    McCoy, Thomas H; Castro, Victor M; Snapper, Leslie A; Hart, Kamber L; Perlis, Roy H

    2017-08-31

    Biobanks and national registries represent a powerful tool for genomic discovery, but rely on diagnostic codes that may be unreliable and fail to capture the relationship between related diagnoses. We developed an efficient means of conducting genome-wide association studies using combinations of diagnostic codes from electronic health records (EHR) for 10,845 participants in a biobanking program at two large academic medical centers. Specifically, we applied latent Dirichlet allocation to fit 50 disease topics based on diagnostic codes, then conducted a genome-wide common-variant association for each topic. In sensitivity analysis, these results were contrasted with those obtained from traditional single-diagnosis phenome-wide association analysis, as well as those in which only a subset of diagnostic codes were included per topic. In meta-analysis across three biobank cohorts, we identified 23 disease-associated loci with p<1e-15, including previously associated autoimmune disease loci. In all cases, observed significant associations were of greater magnitude than for single phenome-wide diagnostic codes, and incorporation of less strongly loading diagnostic codes enhanced association. This strategy provides a more efficient means of phenome-wide association in biobanks with coded clinical data.
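
    A minimal sketch of the topic-modeling step described above (not the authors' pipeline): fit a latent Dirichlet allocation model to a patient-by-diagnostic-code count matrix and treat each patient's topic loading as a quantitative phenotype for downstream association testing. The count matrix below is random placeholder data, and the GWAS step itself is only indicated in a comment.

      # Sketch only: derive topic phenotypes from diagnostic-code counts.
      import numpy as np
      from sklearn.decomposition import LatentDirichletAllocation

      rng = np.random.default_rng(0)
      code_counts = rng.poisson(0.2, size=(1000, 500))   # patients x ICD codes (placeholder)

      lda = LatentDirichletAllocation(n_components=50, random_state=0)
      topic_loadings = lda.fit_transform(code_counts)    # shape: (patients, 50)

      # Each column of topic_loadings is a quantitative phenotype; association testing
      # would then run one regression per variant per topic in a standard GWAS tool.
      print(topic_loadings.shape)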

  20. Efficient Genome-wide Association in Biobanks Using Topic Modeling Identifies Multiple Novel Disease Loci

    PubMed Central

    McCoy, Thomas H; Castro, Victor M; Snapper, Leslie A; Hart, Kamber L; Perlis, Roy H

    2017-01-01

    Biobanks and national registries represent a powerful tool for genomic discovery, but rely on diagnostic codes that can be unreliable and fail to capture relationships between related diagnoses. We developed an efficient means of conducting genome-wide association studies using combinations of diagnostic codes from electronic health records for 10,845 participants in a biobanking program at two large academic medical centers. Specifically, we applied latent Dirichlet allocation to fit 50 disease topics based on diagnostic codes, then conducted a genome-wide common-variant association for each topic. In sensitivity analysis, these results were contrasted with those obtained from traditional single-diagnosis phenome-wide association analysis, as well as those in which only a subset of diagnostic codes were included per topic. In meta-analysis across three biobank cohorts, we identified 23 disease-associated loci with p < 1e-15, including previously associated autoimmune disease loci. In all cases, observed significant associations were of greater magnitude than single phenome-wide diagnostic codes, and incorporation of less strongly loading diagnostic codes enhanced association. This strategy provides a more efficient means of identifying phenome-wide associations in biobanks with coded clinical data. PMID:28861588

  1. 40 CFR 1033.110 - Emission diagnostics-general requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... engine operation. (d) Record and store in computer memory any diagnostic trouble codes showing a... and understand the diagnostic trouble codes stored in the onboard computer with generic tools and...

  2. Deciphering the Diagnostic Codes: A Guide for School Counselors. Practical Skills for Counselors.

    ERIC Educational Resources Information Center

    Jones, W. Paul

    Although school counselors have more contact with children and adolescents than most other human service professionals, they are frequently left out of discussions on diagnostic coding. Ways in which school counselors can use the codes in the Diagnostic and Statistical Manual of Mental Disorders IV (DSM-IV) are explored in this text. The book…

  3. Cryogenic THD and DT layer implosions with high density carbon ablators in near-vacuum hohlraums

    DOE PAGES

    Meezan, N. B.; Berzak Hopkins, L. F.; Le Pape, S.; ...

    2015-06-02

    High Density Carbon (HDC or diamond) is a promising ablator material for use in near-vacuum hohlraums, as its high density allows for ignition designs with laser pulse durations of <10 ns. A series of Inertial Confinement Fusion (ICF) experiments in 2013 on the National Ignition Facility [E. I. Moses et al., Phys. Plasmas 16, 041006 (2009)] culminated in a DT layered implosion driven by a 6.8 ns, 2-shock laser pulse. This paper describes these experiments and comparisons with ICF design code simulations. Backlit radiography of a THD layered capsule demonstrated an ablator implosion velocity of 385 km/s with a slightly oblate hot spot shape. Other diagnostics suggested an asymmetric compressed fuel layer. A streak camera-based hot spot self-emission diagnostic (SPIDER) showed a double-peaked history of the capsule self-emission. Simulations suggest that this is a signature of low quality hot spot formation. Changes to the laser pulse and pointing for a subsequent DT implosion resulted in a higher temperature, prolate hot spot and a thermonuclear yield of 1.8 × 10¹⁵ neutrons, 40% of the 1D simulated yield.

  4. X-ray absorption of a warm dense aluminum plasma created by an ultra-short laser pulse

    NASA Astrophysics Data System (ADS)

    Lecherbourg, L.; Renaudin, P.; Bastiani-Ceccotti, S.; Geindre, J.-P.; Blancard, C.; Cossé, P.; Faussurier, G.; Shepherd, R.; Audebert, P.

    2007-05-01

    Point-projection K-shell absorption spectroscopy has been used to measure absorption spectra of a transient aluminum plasma created by an ultra-short laser pulse. The 1s-2p and 1s-3p absorption lines of weakly ionized aluminum were measured for an extended range of densities in a low-temperature regime. Independent plasma characterization was obtained using a frequency-domain interferometry (FDI) diagnostic that allows the interpretation of the absorption spectra in terms of spectral opacities. A detailed opacity code using the density and temperature inferred from the FDI reproduces the measured absorption spectra except in the last stage of the recombination phase.

  5. Direct measurements of anode/cathode gap plasma in cylindrically imploding loads on the Z machine

    NASA Astrophysics Data System (ADS)

    Porwitzky, A.; Dolan, D. H.; Martin, M. R.; Laity, G.; Lemke, R. W.; Mattsson, T. R.

    2018-06-01

    By deploying a photon Doppler velocimetry-based plasma diagnostic, we have directly observed low-density plasma in the load anode/cathode gap of cylindrically converging pulsed power targets. The arrival of this plasma is temporally correlated with gross current loss and subtle power flow differences between the anode and the cathode. The density is in the range where Hall terms in the electromagnetic equations are relevant, but this physics is lacking in the magnetohydrodynamics codes commonly used to design, analyze, and optimize pulsed power experiments. This work presents evidence of the importance of physics beyond traditional resistive magnetohydrodynamics for the design of pulsed power targets and drivers.

  6. Administrative database code accuracy did not vary notably with changes in disease prevalence.

    PubMed

    van Walraven, Carl; English, Shane; Austin, Peter C

    2016-11-01

    Previous mathematical analyses of diagnostic tests based on the categorization of a continuous measure have found that test sensitivity and specificity vary significantly with disease prevalence. This study determined whether the accuracy of diagnostic codes varied with disease prevalence. We used data from two previous studies in which the true status of renal disease and primary subarachnoid hemorrhage, respectively, had been determined. In multiple stratified random samples from the two previous studies having varying disease prevalence, we measured the accuracy of diagnostic codes for each disease using sensitivity, specificity, and positive and negative predictive value. Diagnostic code sensitivity and specificity did not change notably within clinically sensible ranges of disease prevalence. In contrast, positive and negative predictive values changed significantly with disease prevalence. Disease prevalence had no important influence on the sensitivity and specificity of diagnostic codes in administrative databases.
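
    The statistical point can be illustrated with a short worked example (assumed operating characteristics, not the study's values): holding sensitivity and specificity fixed, the positive and negative predictive values still shift strongly as prevalence changes.

      # Worked illustration with assumed sensitivity/specificity, not study data.
      def predictive_values(sens, spec, prevalence):
          ppv = sens * prevalence / (sens * prevalence + (1 - spec) * (1 - prevalence))
          npv = spec * (1 - prevalence) / (spec * (1 - prevalence) + (1 - sens) * prevalence)
          return ppv, npv

      for prev in (0.01, 0.05, 0.20):
          ppv, npv = predictive_values(sens=0.80, spec=0.95, prevalence=prev)
          print(f"prevalence={prev:.2f}  PPV={ppv:.2f}  NPV={npv:.3f}")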

  7. Ionization-potential depression and other dense plasma statistical property studies - Application to spectroscopic diagnostics.

    NASA Astrophysics Data System (ADS)

    Calisti, Annette; Ferri, Sandrine; Mossé, Caroline; Talin, Bernard

    2017-02-01

    The radiative properties of an emitter surrounded by a plasma are modified through various mechanisms. For instance, the line shapes emitted by bound-bound transitions are broadened and carry useful information for plasma diagnostics. Depending on plasma conditions, the electrons occupying the upper quantum levels of radiators no longer exist as bound states because they belong to the plasma free-electron population. All the charges present in the radiator environment contribute to lowering the energy required to free an electron from the ground state. This mechanism is known as ionization potential depression (IPD). Knowledge of the IPD is useful as it affects both the radiative properties of the various ionic states and their populations. Its evaluation deals with highly complex coupled n-body systems, involving particles with different dynamics and attractive ion-electron forces. A classical molecular dynamics (MD) code, the BinGo-TCP code, has recently been developed to simulate neutral multi-component (various charge-state ions and electrons) plasmas accounting for all the charge correlations. In the present work, results on IPD and other dense plasma statistical properties obtained using the BinGo-TCP code are presented. The study focuses on aluminum plasmas at different densities and several temperatures in order to explore different plasma coupling conditions.

  8. Simplified diagnostic coding sheet for computerized data storage and analysis in ophthalmology.

    PubMed

    Tauber, J; Lahav, M

    1987-11-01

    A review of currently available diagnostic coding systems revealed that most are either too abbreviated or too detailed. We have compiled a simplified diagnostic coding sheet based on the International Classification of Diseases, 9th Revision (ICD-9), which is both complete and easy to use in a general practice. The information is transferred to a computer, which uses the relevant ICD-9 diagnoses as a database; the data can later be retrieved to display patients' problems or to analyze clinical data.

  9. Inferring physical properties of galaxies from their emission-line spectra

    NASA Astrophysics Data System (ADS)

    Ucci, G.; Ferrara, A.; Gallerani, S.; Pallottini, A.

    2017-02-01

    We present a new approach based on supervised machine learning algorithms to infer key physical properties of galaxies (density, metallicity, column density and ionization parameter) from their emission-line spectra. We introduce a numerical code (called GAME, GAlaxy Machine learning for Emission lines) implementing this method and test it extensively. GAME delivers excellent predictive performance, especially for estimates of metallicity and column densities. We compare GAME with the most widely used diagnostics (e.g. R23, [N II] λ6584/Hα indicators), showing that it provides much better accuracy and a wider applicability range. GAME is particularly suitable for use in combination with Integral Field Unit spectroscopy, both for rest-frame optical/UV nebular lines and for far-infrared/sub-millimeter lines arising from photodissociation regions. Finally, GAME can also be applied to the analysis of synthetic galaxy maps built from numerical simulations.
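
    A minimal sketch of the general approach (not the GAME code itself): train a supervised regressor on a grid of photoionization models so that it maps emission-line fluxes to physical properties, then apply it to observed spectra. The feature matrix and targets below are random placeholders standing in for such a model grid.

      # Sketch of the general approach, not the GAME implementation.
      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(1)
      X = rng.lognormal(size=(5000, 8))     # placeholder line fluxes from a model grid
      y = rng.normal(size=(5000, 2))        # placeholder targets, e.g. log Z, log n

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
      # With real grid data the held-out score gauges predictive performance;
      # here the inputs are random, so the score itself is meaningless.
      print(model.score(X_te, y_te))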

  10. Electron density diagnostics for gaseous nebulae involving the O IV intercombination lines near 1400 Å

    NASA Technical Reports Server (NTRS)

    Keenan, F. P.; Conlon, E. S.; Bowden, D. A.; Feibelman, W. A.; Pradhan, Anil K.

    1992-01-01

    Theoretical O IV electron-density-sensitive emission line ratios, determined using electron impact excitation rates calculated with the R-matrix code, are presented for R1 = I(1407.4 Å)/I(1401.2 Å), R2 = I(1404.8 Å)/I(1401.2 Å), R3 = I(1399.8 Å)/I(1401.2 Å), and R4 = I(1397.2 Å)/I(1401.2 Å). The observed values of R1-R4, measured from high-resolution spectra obtained with the International Ultraviolet Explorer (IUE) satellite, lead to electron densities that are mutually compatible and in good agreement with those deduced from line ratios in other species. This provides observational support for the accuracy of the atomic data adopted in the present calculations.
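
    In practice such a ratio is inverted by interpolating the observed value onto the theoretical R(n_e) curve. A hedged sketch of that step; the tabulated curve below is a made-up monotonic placeholder, not the R-matrix results.

      # Placeholder R(n_e) curve; a real curve would come from R-matrix calculations.
      import numpy as np

      log_ne_grid = np.linspace(2.0, 7.0, 51)                      # log10 n_e (cm^-3)
      R1_theory = 0.05 + 0.95 / (1.0 + 10 ** (4.5 - log_ne_grid))  # dummy, monotonic in n_e

      R1_observed = 0.40                                           # e.g. I(1407.4)/I(1401.2)
      log_ne = np.interp(R1_observed, R1_theory, log_ne_grid)
      print(f"inferred log10 n_e ~ {log_ne:.2f}")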

  11. Dental Faculty Accuracy When Using Diagnostic Codes: A Pilot Study.

    PubMed

    Sutton, Jeanne C; Fay, Rose-Marie; Huynh, Carolyn P; Johnson, Cleverick D; Zhu, Liang; Quock, Ryan L

    2017-05-01

    The aim of this study was to examine the accuracy of dental faculty members' utilization of diagnostic codes and resulting treatment planning based on radiographic interproximal tooth radiolucencies. In 2015, 50 full-time and part-time general dentistry faculty members at one U.S. dental school were shown a sequence of 15 bitewing radiographs; one interproximal radiolucency was highlighted on each bitewing. For each radiographic lesion, participants were asked to choose the most appropriate diagnostic code (from a concise list of five codes, corresponding to lesion progression to outer/inner halves of enamel and outer/middle/pulpal thirds of dentin), acute treatment (attempt to arrest/remineralize non-invasively, operative intervention, or no treatment), and level of confidence in choices. Diagnostic and treatment choices of participants were compared to "gold standard" correct responses, as determined by expert radiology and operative faculty members, respectively. The majority of the participants selected the correct diagnostic code for lesions in the outer one-third of dentin (p<0.0001) and the pulpal one-third of dentin (p<0.0001). For lesions in the outer and inner halves of enamel and the middle one-third of dentin, the correct rates were moderate. However, the majority of the participants chose correct treatments on all types of lesions (correct rate 63.6-100%). Faculty members' confidence in their responses was generally high for all lesions, all above 90%. Diagnostic codes were appropriately assigned by participants for the very deepest lesions, but they were not assigned accurately for more incipient lesions (limited to enamel). Paradoxically, treatment choices were generally correct, regardless of diagnostic choices. Further calibration is needed to improve faculty use and teaching of diagnostic codes.

  12. The Cloud Feedback Model Intercomparison Project (CFMIP) Diagnostic Codes Catalogue – metrics, diagnostics and methodologies to evaluate, understand and improve the representation of clouds and cloud feedbacks in climate models

    DOE PAGES

    Tsushima, Yoko; Brient, Florent; Klein, Stephen A.; ...

    2017-11-27

    The CFMIP Diagnostic Codes Catalogue assembles cloud metrics, diagnostics and methodologies, together with programs to diagnose them from general circulation model (GCM) outputs, written by various members of the CFMIP community. The catalogue aims to facilitate use of the diagnostics by the wider community studying climate and climate change. This paper describes the diagnostics and metrics currently in the catalogue, together with examples of their application to model evaluation studies and a summary of some of the insights these diagnostics have provided into the main shortcomings in current GCMs. Analysis of outputs from CFMIP and CMIP6 experiments will also be facilitated by the sharing of diagnostic codes via this catalogue. Any code which implements diagnostics relevant to analysing clouds – including cloud–circulation interactions and the contribution of clouds to estimates of climate sensitivity in models – and which is documented in peer-reviewed studies, can be included in the catalogue. We very much welcome additional contributions to further support community analysis of CMIP6 outputs.

  13. The Cloud Feedback Model Intercomparison Project (CFMIP) Diagnostic Codes Catalogue – metrics, diagnostics and methodologies to evaluate, understand and improve the representation of clouds and cloud feedbacks in climate models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsushima, Yoko; Brient, Florent; Klein, Stephen A.

    The CFMIP Diagnostic Codes Catalogue assembles cloud metrics, diagnostics and methodologies, together with programs to diagnose them from general circulation model (GCM) outputs, written by various members of the CFMIP community. The catalogue aims to facilitate use of the diagnostics by the wider community studying climate and climate change. This paper describes the diagnostics and metrics currently in the catalogue, together with examples of their application to model evaluation studies and a summary of some of the insights these diagnostics have provided into the main shortcomings in current GCMs. Analysis of outputs from CFMIP and CMIP6 experiments will also be facilitated by the sharing of diagnostic codes via this catalogue. Any code which implements diagnostics relevant to analysing clouds – including cloud–circulation interactions and the contribution of clouds to estimates of climate sensitivity in models – and which is documented in peer-reviewed studies, can be included in the catalogue. We very much welcome additional contributions to further support community analysis of CMIP6 outputs.

  14. Three-dimensional modeling of the neutron spectrum to infer plasma conditions in cryogenic inertial confinement fusion implosions

    NASA Astrophysics Data System (ADS)

    Weilacher, F.; Radha, P. B.; Forrest, C.

    2018-04-01

    Neutron-based diagnostics are typically used to infer compressed core conditions such as areal density and ion temperature in deuterium-tritium (D-T) inertial confinement fusion (ICF) implosions. Asymmetries in the observed neutron-related quantities are important to understanding failure modes in these implosions. Neutrons from fusion reactions and their subsequent interactions including elastic scattering and neutron-induced deuteron breakup reactions are tracked to create spectra. It is shown that background subtraction is important for inferring areal density from backscattered neutrons and is less important for the forward-scattered neutrons. A three-dimensional hydrodynamic simulation of a cryogenic implosion on the OMEGA Laser System [Boehly et al., Opt. Commun. 133, 495 (1997)] using the hydrodynamic code HYDRA [Marinak et al., Phys. Plasmas 8, 2275 (2001)] is post-processed using the tracking code IRIS3D. It is shown that different parts of the neutron spectrum from the view can be mapped into different regions of the implosion, enabling an inference of an areal-density map. It is also shown that the average areal-density and an areal-density map of the compressed target can be reconstructed with a finite number of detectors placed around the target chamber. Ion temperatures are inferred from the width of the D-D and D-T fusion neutron spectra. Backgrounds can significantly alter the inferred ion temperatures from the D-D reaction, whereas they insignificantly influence the inferred D-T ion temperatures for the areal densities typical of OMEGA implosions. Asymmetries resulting in fluid flow in the core are shown to influence the absolute inferred ion temperatures from both reactions, although relative inferred values continue to reflect the underlying asymmetry pattern. The work presented here is part of the wide range of the first set of studies performed with IRIS3D. This code will continue to be used for post-processing detailed hydrodynamic simulations and interpreting observed neutron spectra in ICF implosions.
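
    The ion-temperature inference mentioned above rests on the thermal Doppler broadening of the primary fusion peaks: to first order the spectral FWHM grows as the square root of the ion temperature. A minimal sketch of that inversion using the commonly quoted Brysk coefficients (about 177 keV per sqrt(keV) for D-T at 14.1 MeV and about 82.5 for D-D at 2.45 MeV); the exact constants and the example width are assumptions for illustration, and real analyses also account for the fluid-flow broadening and backgrounds discussed in the abstract.

      # First-order Doppler-width inversion; coefficients are assumed Brysk values.
      def ion_temperature_keV(fwhm_keV: float, reaction: str = "DT") -> float:
          """Invert FWHM = C * sqrt(Ti) for the thermal fusion peak width."""
          C = {"DT": 177.0, "DD": 82.5}[reaction]   # keV per sqrt(keV), assumed
          return (fwhm_keV / C) ** 2

      print(ion_temperature_keV(fwhm_keV=320.0, reaction="DT"))   # ~3.3 keV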

  15. Coding in Muscle Disease.

    PubMed

    Jones, Lyell K; Ney, John P

    2016-12-01

    Accurate coding is critically important for clinical practice and research. Ongoing changes to diagnostic and billing codes require the clinician to stay abreast of coding updates. Payment for health care services, data sets for health services research, and reporting for medical quality improvement all require accurate administrative coding. This article provides an overview of administrative coding for patients with muscle disease and includes a case-based review of diagnostic and Evaluation and Management (E/M) coding principles in patients with myopathy. Procedural coding for electrodiagnostic studies and neuromuscular ultrasound is also reviewed.

  16. Species and temperature measurement in H2/O2 rocket flow fields by means of Raman scattering diagnostics

    NASA Technical Reports Server (NTRS)

    De Groot, Wim A.; Weiss, Jonathan M.

    1992-01-01

    Validation of CFD codes developed for prediction and evaluation of rocket performance is hampered by a lack of experimental data. Nonintrusive laser based diagnostics are needed to provide spatially and temporally resolved gas dynamic and fluid dynamic measurements. This paper reports the first nonintrusive temperature and species measurements in the plume of a 110 N gaseous hydrogen/oxygen thruster at and below ambient pressures, obtained with spontaneous Raman spectroscopy. Measurements at 10 mm downstream of the exit plane are compared with predictions from a numerical solution of the axisymmetric Navier-Stokes and species transport equations with chemical kinetics, which fully model the combustor-nozzle-plume flowfield. The experimentally determined oxygen number density at the centerline at 10 mm downstream of the exit plane is four times that predicted by the model. The experimental number density data fall between those numerically predicted for the exit and 10 mm downstream planes in both magnitude and radial gradient. The predicted temperature levels are within 10 to 15 percent of measured values.
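
    The temperatures quoted above come from fitting a model function to the measured Raman spectrum. One simplified route to a Raman temperature, shown here only as an illustration and not necessarily the authors' fitting procedure, is a Boltzmann plot: the logarithm of line intensity versus upper-state energy is linear with slope -1/(kT). The energies and intensities below are synthetic placeholders, and line-strength and degeneracy factors are ignored.

      # Illustrative Boltzmann-plot temperature fit on synthetic line intensities.
      import numpy as np

      k_B = 8.617e-5                                      # Boltzmann constant (eV/K)
      E_up = np.array([0.02, 0.05, 0.10, 0.18, 0.30])     # upper-state energies (eV), placeholder
      I_line = np.exp(-E_up / (k_B * 1800.0))             # synthetic intensities at T = 1800 K

      slope, _ = np.polyfit(E_up, np.log(I_line), 1)
      print("fitted T =", -1.0 / (k_B * slope), "K")      # recovers ~1800 K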

  17. Additions and improvements to the high energy density physics capabilities in the FLASH code

    NASA Astrophysics Data System (ADS)

    Lamb, D. Q.; Flocke, N.; Graziani, C.; Tzeferacos, P.; Weide, K.

    2016-10-01

    FLASH is an open source, finite-volume Eulerian, spatially adaptive radiation magnetohydrodynamics code that has the capabilities to treat a broad range of physical processes. FLASH performs well on a wide range of computer architectures, and has a broad user base. Extensive high energy density physics (HEDP) capabilities have been added to FLASH to make it an open toolset for the academic HEDP community. We summarize these capabilities, emphasizing recent additions and improvements. In particular, we showcase the ability of FLASH to simulate the Faraday Rotation Measure produced by the presence of magnetic fields; and proton radiography, proton self-emission, and Thomson scattering diagnostics with and without the presence of magnetic fields. We also describe several collaborations with the academic HEDP community in which FLASH simulations were used to design and interpret HEDP experiments. This work was supported in part at the University of Chicago by the DOE NNSA ASC through the Argonne Institute for Computing in Science under field work proposal 57789; and the NSF under Grant PHY-0903997.

  18. ZaP-HD: High Energy Density Z-Pinch Plasmas using Sheared Flow Stabilization

    NASA Astrophysics Data System (ADS)

    Golingo, R. P.; Shumlak, U.; Nelson, B. A.; Claveau, E. L.; Doty, S. A.; Forbes, E. G.; Hughes, M. C.; Kim, B.; Ross, M. P.; Weed, J. R.

    2015-11-01

    The ZaP-HD flow Z-pinch project investigates scaling the flow Z-pinch to High Energy Density Plasma, HEDP, conditions by using sheared flow stabilization. ZaP used a single power supply to produce 100 cm long Z-pinches that were quiescent for many radial Alfven times and axial flow-through times. The flow Z-pinch concept provides an approach to achieve HED plasmas, which are dimensionally large and persist for extended durations. The ZaP-HD device replaces the single power supply from ZaP with two separate power supplies to independently control the plasma flow and current in the Z-pinch. Equilibrium is determined by diagnostic measurements of the density with interferometry and digital holography, the plasma flow and temperature with passive spectroscopy, the magnetic field with surface magnetic probes, and plasma emission with optical imaging. The diagnostics fully characterize the plasma from its initiation in the coaxial accelerator, through the pinch, and exhaust from the assembly region. The plasma evolution is modeled with high resolution codes: Mach2, WARPX, and NIMROD. Experimental results and scaling analyses are presented. This work is supported by grants from the U.S. Department of Energy and the U.S. National Nuclear Security Administration.

  19. Cryogenic tritium-hydrogen-deuterium and deuterium-tritium layer implosions with high density carbon ablators in near-vacuum hohlraums

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meezan, N. B., E-mail: meezan1@llnl.gov; Hopkins, L. F. Berzak; Pape, S. Le

    2015-06-15

    High Density Carbon (or diamond) is a promising ablator material for use in near-vacuum hohlraums, as its high density allows for ignition designs with laser pulse durations of <10 ns. A series of Inertial Confinement Fusion (ICF) experiments in 2013 on the National Ignition Facility [Moses et al., Phys. Plasmas 16, 041006 (2009)] culminated in a deuterium-tritium (DT) layered implosion driven by a 6.8 ns, 2-shock laser pulse. This paper describes these experiments and comparisons with ICF design code simulations. Backlit radiography of a tritium-hydrogen-deuterium (THD) layered capsule demonstrated an ablator implosion velocity of 385 km/s with a slightly oblate hot spot shape. Other diagnostics suggested an asymmetric compressed fuel layer. A streak camera-based hot spot self-emission diagnostic (SPIDER) showed a double-peaked history of the capsule self-emission. Simulations suggest that this is a signature of low quality hot spot formation. Changes to the laser pulse and pointing for a subsequent DT implosion resulted in a higher temperature, prolate hot spot and a thermonuclear yield of 1.8 × 10¹⁵ neutrons, 40% of the 1D simulated yield.

  20. Epidemiology of angina pectoris: role of natural language processing of the medical record

    PubMed Central

    Pakhomov, Serguei; Hemingway, Harry; Weston, Susan A.; Jacobsen, Steven J.; Rodeheffer, Richard; Roger, Véronique L.

    2007-01-01

    Background The diagnosis of angina is challenging as it relies on symptom descriptions. Natural language processing (NLP) of the electronic medical record (EMR) can provide access to such information contained in free text that may not be fully captured by conventional diagnostic coding. Objective To test the hypothesis that NLP of the EMR improves angina pectoris (AP) ascertainment over diagnostic codes. Methods Billing records of in- and out-patients were searched for ICD-9 codes for AP, chronic ischemic heart disease and chest pain. EMR clinical reports were searched electronically for 50 specific non-negated natural language synonyms to these ICD-9 codes. The two methods were compared to a standardized assessment of angina by Rose questionnaire for three diagnostic levels: unspecified chest pain, exertional chest pain, and Rose angina. Results Compared to the Rose questionnaire, the true positive rate of EMR-NLP for unspecified chest pain was 62% (95%CI:55–67) vs. 51% (95%CI:44–58) for diagnostic codes (p<0.001). For exertional chest pain, the EMR-NLP true positive rate was 71% (95%CI:61–80) vs. 62% (95%CI:52–73) for diagnostic codes (p=0.10). Both approaches had an 88% (95%CI:65–100) true positive rate for Rose angina. The EMR-NLP method consistently identified more patients with exertional chest pain over the 28-month follow-up. Conclusion The EMR-NLP method improves the detection of unspecified and exertional chest pain cases compared to diagnostic codes. These findings have implications for epidemiological and clinical studies of angina pectoris. PMID:17383310
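
    A toy sketch of the general idea (not the study's NLP system): flag a clinical note only when an angina synonym appears and is not preceded by a simple negation cue in the same sentence, in the spirit of NegEx-style negation handling. The cue and synonym lists are illustrative placeholders.

      # Toy non-negated keyword matcher; cue and synonym lists are placeholders.
      import re

      NEGATION_CUES = ("no ", "denies ", "without ", "negative for ")
      SYNONYMS = ("angina", "chest pain on exertion", "exertional chest discomfort")

      def mentions_non_negated(note: str) -> bool:
          for sentence in re.split(r"[.!?]", note.lower()):
              for term in SYNONYMS:
                  idx = sentence.find(term)
                  if idx >= 0 and not any(cue in sentence[:idx] for cue in NEGATION_CUES):
                      return True
          return False

      print(mentions_non_negated("Patient denies chest pain on exertion."))   # False
      print(mentions_non_negated("Reports exertional chest discomfort."))     # True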

  1. Non-axisymmetric equilibrium reconstruction of a current-carrying stellarator using external magnetic and soft x-ray inversion radius measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, X., E-mail: xzm0005@auburn.edu; Maurer, D. A.; Knowlton, S. F.

    2015-12-15

    Non-axisymmetric free-boundary equilibrium reconstructions of stellarator plasmas are performed for discharges in which the magnetic configuration is strongly modified by ohmically driven plasma current. These studies were performed on the compact toroidal hybrid device using the V3FIT reconstruction code with a set of 50 magnetic diagnostics external to the plasma. With the assumption of closed magnetic flux surfaces, the reconstructions using external magnetic measurements allow accurate estimates of the net toroidal flux within the last closed flux surface, the edge safety factor, and the plasma shape of these highly non-axisymmetric plasmas. The inversion radius of standard sawteeth is used to infer the current profile near the magnetic axis; with external magnetic diagnostics alone, the current density profile is imprecisely reconstructed.

  2. Non-axisymmetric equilibrium reconstruction of a current-carrying stellarator using external magnetic and soft x-ray inversion radius measurements

    NASA Astrophysics Data System (ADS)

    Ma, X.; Maurer, D. A.; Knowlton, S. F.; ArchMiller, M. C.; Cianciosa, M. R.; Ennis, D. A.; Hanson, J. D.; Hartwell, G. J.; Hebert, J. D.; Herfindal, J. L.; Pandya, M. D.; Roberds, N. A.; Traverso, P. J.

    2015-12-01

    Non-axisymmetric free-boundary equilibrium reconstructions of stellarator plasmas are performed for discharges in which the magnetic configuration is strongly modified by ohmically driven plasma current. These studies were performed on the compact toroidal hybrid device using the V3FIT reconstruction code with a set of 50 magnetic diagnostics external to the plasma. With the assumption of closed magnetic flux surfaces, the reconstructions using external magnetic measurements allow accurate estimates of the net toroidal flux within the last closed flux surface, the edge safety factor, and the plasma shape of these highly non-axisymmetric plasmas. The inversion radius of standard sawteeth is used to infer the current profile near the magnetic axis; with external magnetic diagnostics alone, the current density profile is imprecisely reconstructed.

  3. Non-axisymmetric equilibrium reconstruction of a current-carrying stellarator using external magnetic and soft x-ray inversion radius measurements

    DOE PAGES

    Ma, X.; Maurer, D. A.; Knowlton, Stephen F.; ...

    2015-12-22

    Non-axisymmetric free-boundary equilibrium reconstructions of stellarator plasmas are performed for discharges in which the magnetic configuration is strongly modified by ohmically driven plasma current. These studies were performed on the compact toroidal hybrid device using the V3FIT reconstruction code with a set of 50 magnetic diagnostics external to the plasma. With the assumption of closed magnetic flux surfaces, the reconstructions using external magnetic measurements allow accurate estimates of the net toroidal flux within the last closed flux surface, the edge safety factor, and the plasma shape of these highly non-axisymmetric plasmas. Lastly, the inversion radius of standard sawteeth is used to infer the current profile near the magnetic axis; with external magnetic diagnostics alone, the current density profile is imprecisely reconstructed.

  4. The polarization signature from the circumstellar disks of classical Be stars

    NASA Astrophysics Data System (ADS)

    Halonen, R. J.; Jones, C. E.

    2012-05-01

    The scattering of light in the nonspherical circumstellar envelopes of classical Be stars produces distinct polarimetric properties that can be used to investigate the physical nature of the scattering environment. Both the continuum and emission line polarization are potentially important diagnostic tools in the modeling of these systems. We combine the use of a new multiple scattering code with an established non-LTE radiative transfer code to study the characteristic wavelength-dependence of the intrinsic polarization of classical Be stars. We construct models using realistic chemical composition and self-consistent calculations of the thermal structure of the disk, and then determine the fraction of emergent polarized light. In particular, the aim of this theoretical research project is to investigate the effect of gas density and metallicity on the observed polarization properties of classical Be stars.

  5. A Comparative Study on Diagnostic Accuracy of Colour Coded Digital Images, Direct Digital Images and Conventional Radiographs for Periapical Lesions – An In Vitro Study

    PubMed Central

    Mubeen; K.R., Vijayalakshmi; Bhuyan, Sanat Kumar; Panigrahi, Rajat G; Priyadarshini, Smita R; Misra, Satyaranjan; Singh, Chandravir

    2014-01-01

    Objectives: The identification and radiographic interpretation of periapical bone lesions is important for accurate diagnosis and treatment. The present study was undertaken to study the feasibility and diagnostic accuracy of colour coded digital radiographs in terms of presence and size of lesion and to compare the diagnostic accuracy of colour coded digital images with direct digital images and conventional radiographs for assessing periapical lesions. Materials and Methods: Sixty human dry cadaver hemimandibles were obtained and periapical lesions were created in first and second premolar teeth at the junction of cancellous and cortical bone using a micromotor handpiece and carbide burs of sizes 2, 4 and 6. After each successive use of round burs, a conventional, RVG and colour coded image was taken for each specimen. All the images were evaluated by three observers. The diagnostic accuracy for each bur and image mode was calculated statistically. Results: Our results showed good interobserver (kappa > 0.61) agreement for the different radiographic techniques and for the different bur sizes. Conventional radiography outperformed digital radiography in diagnosing periapical lesions made with the size 2 bur. Both were equally diagnostic for lesions made with larger bur sizes. The colour coding method was the least accurate of all the techniques. Conclusion: Conventional radiography traditionally forms the backbone in the diagnosis, treatment planning and follow-up of periapical lesions. Direct digital imaging is an efficient technique in the diagnostic sense. Colour coding of digital radiography was feasible but less accurate; however, this imaging technique, like any other, needs to be studied continuously with the emphasis on safety of patients and diagnostic quality of images. PMID:25584318
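
    Interobserver agreement here is summarized with kappa (values above 0.61 are conventionally read as substantial agreement). A minimal sketch of how such a statistic is computed for two raters scoring the same specimens; the ratings below are placeholders, not study data.

      # Cohen's kappa for two raters; ratings are placeholder labels.
      from sklearn.metrics import cohen_kappa_score

      observer_a = ["lesion", "lesion", "no lesion", "lesion", "no lesion", "lesion"]
      observer_b = ["lesion", "no lesion", "no lesion", "lesion", "no lesion", "lesion"]

      print("Cohen's kappa:", round(cohen_kappa_score(observer_a, observer_b), 2))   # 0.67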

  6. Clinician's Primer to ICD-10-CM Coding for Cleft Lip/Palate Care.

    PubMed

    Allori, Alexander C; Cragan, Janet D; Della Porta, Gina C; Mulliken, John B; Meara, John G; Bruun, Richard; Shusterman, Stephen; Cassell, Cynthia H; Raynor, Eileen; Santiago, Pedro; Marcus, Jeffrey R

    2017-01-01

    On October 1, 2015, the United States required use of the Clinical Modification of the International Classification of Diseases, 10th Revision (ICD-10-CM) for diagnostic coding. This primer was written to assist the cleft care community with understanding and use of ICD-10-CM for diagnostic coding related to cleft lip and/or palate (CL/P).

  7. Three-dimensional modeling of the neutron spectrum to infer plasma conditions in cryogenic inertial confinement fusion implosions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weilacher, F.; Radha, P. B.; Forrest, C.

    Neutron-based diagnostics are typically used to infer compressed core conditions such as areal density and ion temperature in deuterium–tritium (D–T) inertial confinement fusion (ICF) implosions. Asymmetries in the observed neutron-related quantities are important to understanding failure modes in these implosions. Neutrons from fusion reactions and their subsequent interactions including elastic scattering and neutron-induced deuteron breakup reactions are tracked to create spectra. Here, it is shown that background subtraction is important for inferring areal density from backscattered neutrons and is less important for the forward-scattered neutrons. A three-dimensional hydrodynamic simulation of a cryogenic implosion on the OMEGA Laser System [T. R. Boehly et al., Opt. Commun. 133, 495 (1997)] using the hydrodynamic code HYDRA [M. M. Marinak et al., Phys. Plasmas 8, 2275 (2001)] is post-processed using the tracking code IRIS3D. It is shown that different parts of the neutron spectrum from the view can be mapped into different regions of the implosion, enabling an inference of an areal-density map. It is also shown that the average areal-density and an areal-density map of the compressed target can be reconstructed with a finite number of detectors placed around the target chamber. Ion temperatures are inferred from the width of the D–D and D–T fusion neutron spectra. Backgrounds can significantly alter the inferred ion temperatures from the D–D reaction, whereas they insignificantly influence the inferred D–T ion temperatures for the areal densities typical of OMEGA implosions. Asymmetries resulting in fluid flow in the core are shown to influence the absolute inferred ion temperatures from both reactions, although relative inferred values continue to reflect the underlying asymmetry pattern. The work presented here is part of the wide range of the first set of studies performed with IRIS3D. Finally, this code will continue to be used for post-processing detailed hydrodynamic simulations and interpreting observed neutron spectra in ICF implosions.

  8. Three-dimensional modeling of the neutron spectrum to infer plasma conditions in cryogenic inertial confinement fusion implosions

    DOE PAGES

    Weilacher, F.; Radha, P. B.; Forrest, C.

    2018-04-26

    Neutron-based diagnostics are typically used to infer compressed core conditions such as areal density and ion temperature in deuterium–tritium (D–T) inertial confinement fusion (ICF) implosions. Asymmetries in the observed neutron-related quantities are important to understanding failure modes in these implosions. Neutrons from fusion reactions and their subsequent interactions including elastic scattering and neutron-induced deuteron breakup reactions are tracked to create spectra. Here, it is shown that background subtraction is important for inferring areal density from backscattered neutrons and is less important for the forward-scattered neutrons. A three-dimensional hydrodynamic simulation of a cryogenic implosion on the OMEGA Laser System [T. R. Boehly et al., Opt. Commun. 133, 495 (1997)] using the hydrodynamic code HYDRA [M. M. Marinak et al., Phys. Plasmas 8, 2275 (2001)] is post-processed using the tracking code IRIS3D. It is shown that different parts of the neutron spectrum from the view can be mapped into different regions of the implosion, enabling an inference of an areal-density map. It is also shown that the average areal-density and an areal-density map of the compressed target can be reconstructed with a finite number of detectors placed around the target chamber. Ion temperatures are inferred from the width of the D–D and D–T fusion neutron spectra. Backgrounds can significantly alter the inferred ion temperatures from the D–D reaction, whereas they insignificantly influence the inferred D–T ion temperatures for the areal densities typical of OMEGA implosions. Asymmetries resulting in fluid flow in the core are shown to influence the absolute inferred ion temperatures from both reactions, although relative inferred values continue to reflect the underlying asymmetry pattern. The work presented here is part of the wide range of the first set of studies performed with IRIS3D. Finally, this code will continue to be used for post-processing detailed hydrodynamic simulations and interpreting observed neutron spectra in ICF implosions.

  9. 38 CFR 4.100 - Application of the evaluation criteria for diagnostic codes 7000-7007, 7011, and 7015-7020.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Ratings The Cardiovascular System § 4.100 Application of the evaluation criteria for diagnostic codes 7000... medical information does not sufficiently reflect the severity of the veteran's cardiovascular disability...

  10. 38 CFR 4.100 - Application of the evaluation criteria for diagnostic codes 7000-7007, 7011, and 7015-7020.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Ratings The Cardiovascular System § 4.100 Application of the evaluation criteria for diagnostic codes 7000... medical information does not sufficiently reflect the severity of the veteran's cardiovascular disability...

  11. 38 CFR 4.100 - Application of the evaluation criteria for diagnostic codes 7000-7007, 7011, and 7015-7020.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Ratings The Cardiovascular System § 4.100 Application of the evaluation criteria for diagnostic codes 7000... medical information does not sufficiently reflect the severity of the veteran's cardiovascular disability...

  12. 38 CFR 4.100 - Application of the evaluation criteria for diagnostic codes 7000-7007, 7011, and 7015-7020.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Ratings The Cardiovascular System § 4.100 Application of the evaluation criteria for diagnostic codes 7000... medical information does not sufficiently reflect the severity of the veteran's cardiovascular disability...

  13. 38 CFR 4.100 - Application of the evaluation criteria for diagnostic codes 7000-7007, 7011, and 7015-7020.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Ratings The Cardiovascular System § 4.100 Application of the evaluation criteria for diagnostic codes 7000... medical information does not sufficiently reflect the severity of the veteran's cardiovascular disability...

  14. An investigation into the role of metastable states on excited populations of weakly ionized argon plasmas, with applications for optical diagnostics

    NASA Astrophysics Data System (ADS)

    Arnold, Nicholas; Loch, Stuart; Ballance, Connor; Thomas, Ed

    2017-10-01

    Low temperature plasmas (Te < 10 eV) are ubiquitous in the medical, industrial, basic, and dusty plasma communities, and offer an opportunity for researchers to gain a better understanding of atomic processes in plasmas. Here, we report on a new atomic dataset for neutral and low charge states of argon, from which rate coefficients and cross-sections for the electron-impact excitation of neutral argon are determined. We benchmark by comparing with electron impact excitation cross-sections available in the literature, with very good agreement. We have used the Atomic Data and Analysis Structure (ADAS) code suite to calculate a level-resolved, generalized collisional-radiative (GCR) model for line emission in low temperature argon plasmas. By combining our theoretical model with experimental electron temperature, density, and spectral measurements from the Auburn Linear eXperiment for Instability Studies (ALEXIS), we have developed diagnostic techniques to measure metastable fraction, electron temperature, and electron density. In the future we hope to refine our methods, and extend our model to plasmas other than ALEXIS. Supported by the U.S. Department of Energy. Grant Number: DE-FG02-00ER54476.
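
    A schematic of the generalized collisional-radiative (GCR) idea described above, not the ADAS implementation: excited-level populations follow from a steady-state rate matrix built from collisional excitation (proportional to electron density) and radiative decay, and modeled line-intensity ratios then depend on Te and ne. All rates below are dummy numbers for a three-level toy system.

      # Toy three-level steady-state collisional-radiative balance (dummy rates).
      import numpy as np

      ne = 1e16                                              # electron density (m^-3)
      excit = ne * np.array([[0.0,   0.0,   0.0],
                             [2e-15, 0.0,   0.0],
                             [1e-15, 5e-16, 0.0]])           # collisional excitation, column j -> row i
      decay = np.array([[0.0, 1e6, 5e5],
                        [0.0, 0.0, 2e6],
                        [0.0, 0.0, 0.0]])                    # radiative decay rates (s^-1)
      M = excit + decay
      np.fill_diagonal(M, -M.sum(axis=0))                    # conservation: each column sums to zero

      A = M.copy()
      A[0, :] = 1.0                                          # replace one equation with sum(n) = 1
      n = np.linalg.solve(A, np.array([1.0, 0.0, 0.0]))
      ratio = (2e6 * n[2]) / (1e6 * n[1])                    # modeled line ratio (2->1)/(1->0)
      print("level populations:", n, " line ratio:", ratio)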

  15. Laser Rayleigh and Raman Diagnostics for Small Hydrogen/oxygen Rockets

    NASA Technical Reports Server (NTRS)

    Degroot, Wilhelmus A.; Zupanc, Frank J.

    1993-01-01

    Localized velocity, temperature, and species concentration measurements in rocket flow fields are needed to evaluate predictive computational fluid dynamics (CFD) codes and identify causes of poor rocket performance. Velocity, temperature, and total number density information have been successfully extracted from spectrally resolved Rayleigh scattering in the plume of small hydrogen/oxygen rockets. Light from a narrow band laser is scattered from the moving molecules with a Doppler shifted frequency. Two components of the velocity can be extracted by observing the scattered light from two directions. Thermal broadening of the scattered light provides a measure of the temperature, while the integrated scattering intensity is proportional to the number density. Spontaneous Raman scattering has been used to measure temperature and species concentration in similar plumes. Light from a dye laser is scattered by molecules in the rocket plume. Raman spectra scattered from major species are resolved by observing the inelastically scattered light with a linear array mounted to a spectrometer. Temperature and oxygen concentrations have been extracted by fitting a model function to the measured Raman spectrum. Results of measurements on small rockets mounted inside a high altitude chamber using both diagnostic techniques are reported.
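
    Conceptually, each spectrally resolved Rayleigh spectrum described above is reduced to three numbers: the Doppler shift of the peak (a velocity component), its Gaussian width (temperature), and its integrated intensity (number density). A hedged sketch of such a reduction on a synthetic spectrum; the grid, amplitudes, and noise level are placeholders, and converting shift, width, and area to velocity, temperature, and density would go through instrument and scattering-geometry calibration constants not shown here.

      # Reduce a synthetic Rayleigh spectrum to shift, width and area (placeholders).
      import numpy as np
      from scipy.optimize import curve_fit

      def gaussian(nu, area, shift, sigma):
          return area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-(nu - shift) ** 2 / (2 * sigma ** 2))

      nu = np.linspace(-5e9, 5e9, 400)                         # frequency offset grid (Hz)
      rng = np.random.default_rng(2)
      clean = gaussian(nu, 2.3e9, 1.2e9, 0.9e9)                # synthetic "measured" spectrum
      spectrum = clean + 0.02 * clean.max() * rng.normal(size=nu.size)

      (area, shift, sigma), _ = curve_fit(gaussian, nu, spectrum, p0=(2e9, 0.0, 1.0e9))
      print(shift, sigma, area)                                # -> velocity, temperature, density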

  16. Validation of Living Donor Nephrectomy Codes

    PubMed Central

    Lam, Ngan N.; Lentine, Krista L.; Klarenbach, Scott; Sood, Manish M.; Kuwornu, Paul J.; Naylor, Kyla L.; Knoll, Gregory A.; Kim, S. Joseph; Young, Ann; Garg, Amit X.

    2018-01-01

    Background: Use of administrative data for outcomes assessment in living kidney donors is increasing given the rarity of complications and challenges with loss to follow-up. Objective: To assess the validity of living donor nephrectomy in health care administrative databases compared with the reference standard of manual chart review. Design: Retrospective cohort study. Setting: 5 major transplant centers in Ontario, Canada. Patients: Living kidney donors between 2003 and 2010. Measurements: Sensitivity and positive predictive value (PPV). Methods: Using administrative databases, we conducted a retrospective study to determine the validity of diagnostic and procedural codes for living donor nephrectomies. The reference standard was living donor nephrectomies identified through the province’s tissue and organ procurement agency, with verification by manual chart review. Operating characteristics (sensitivity and PPV) of various algorithms using diagnostic, procedural, and physician billing codes were calculated. Results: During the study period, there were a total of 1199 living donor nephrectomies. Overall, the best algorithm for identifying living kidney donors was the presence of 1 diagnostic code for kidney donor (ICD-10 Z52.4) and 1 procedural code for kidney procurement/excision (1PC58, 1PC89, 1PC91). Compared with the reference standard, this algorithm had a sensitivity of 97% and a PPV of 90%. The diagnostic and procedural codes performed better than the physician billing codes (sensitivity 60%, PPV 78%). Limitations: The donor chart review and validation study was performed in Ontario and may not be generalizable to other regions. Conclusions: An algorithm consisting of 1 diagnostic and 1 procedural code can be reliably used to conduct health services research that requires the accurate determination of living kidney donors at the population level. PMID:29662679
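
    A minimal sketch of how such operating characteristics are computed for a code-based algorithm against a chart-review reference standard. The four records below are placeholders; the algorithm mirrors the best-performing one reported above (one kidney donor diagnostic code Z52.4 plus one procurement/excision procedural code).

      # Placeholder records with coded fields and a chart-review reference label.
      records = [
          {"dx": {"Z52.4"}, "proc": {"1PC58"}, "is_donor": True},
          {"dx": {"Z52.4"}, "proc": set(),     "is_donor": False},
          {"dx": set(),     "proc": {"1PC91"}, "is_donor": False},
          {"dx": {"Z52.4"}, "proc": {"1PC89"}, "is_donor": True},
      ]

      def algorithm(rec):
          # 1 donor diagnostic code AND 1 kidney procurement/excision procedural code
          return "Z52.4" in rec["dx"] and bool(rec["proc"] & {"1PC58", "1PC89", "1PC91"})

      tp = sum(algorithm(r) and r["is_donor"] for r in records)
      fp = sum(algorithm(r) and not r["is_donor"] for r in records)
      fn = sum(not algorithm(r) and r["is_donor"] for r in records)
      print("sensitivity:", tp / (tp + fn), " PPV:", tp / (tp + fp))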

  17. CHEMICAL EVOLUTION OF THE UNIVERSE AT 0.7 < z < 1.6 DERIVED FROM ABUNDANCE DIAGNOSTICS OF THE BROAD-LINE REGION OF QUASARS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sameshima, H.; Yoshii, Y.; Kawara, K., E-mail: sameshima@cc.kyoto-su.ac.jp

    2017-01-10

    We present an analysis of Mg ii λ2798 and Fe ii UV emission lines for archival Sloan Digital Sky Survey (SDSS) quasars to explore the diagnostics of the magnesium-to-iron abundance ratio in a broad-line region cloud. Our sample consists of 17,432 quasars selected from the SDSS Data Release 7 with a redshift range of 0.72 < z < 1.63. A strong anticorrelation between the Mg ii equivalent width (EW) and the Eddington ratio is found, while only a weak positive correlation is found between the Fe ii EW and the Eddington ratio. To investigate the origin of these differing behaviors of Mg ii and Fe ii emission lines, we perform photoionization calculations using the Cloudy code, where constraints from recent reverberation mapping studies are considered. We find from calculations that (1) Mg ii and Fe ii emission lines are created at different regions in a photoionized cloud, and (2) their EW correlations with the Eddington ratio can be explained by just changing the cloud gas density. These results indicate that the Mg ii/Fe ii flux ratio, which has been used as a first-order proxy for the Mg/Fe abundance ratio in chemical evolution studies with quasar emission lines, depends largely on the cloud gas density. By correcting this density dependence, we propose new diagnostics of the Mg/Fe abundance ratio for a broad-line region cloud. In comparing the derived Mg/Fe abundance ratios with chemical evolution models, we suggest that α-enrichment by mass loss from metal-poor intermediate-mass stars occurred at z ∼ 2 or earlier.

  18. Property Changes in Aqueous Solutions due to Surfactant Treatment of PCE: Implications to Geophysical Measurements

    NASA Astrophysics Data System (ADS)

    Werkema, D. D.

    2007-12-01

    Select physicochemical properties of aqueous solutions composed of surfactants, dye, and perchloroethylene (PCE) were evaluated through a response surface quadratic design model of experiment. Nine surfactants, which are conventionally used in the remediation of PCE, were evaluated with varying concentrations of PCE and indicator dyes in aqueous solutions. Two hundred forty experiments were performed using PCE as a numerical factor (coded A) from 0 to 200 parts per million (ppm), dye type (coded B) as a 3-level categorical factor, and surfactant type (coded C) as a 10-level categorical factor. Five responses were measured: temperature (°C), pH, conductivity (μS/cm), dissolved oxygen (DO, mg/L), and density (g/mL). Diagnostics proved a normally distributed predictable response for all measured responses except pH. The Box-Cox plot for transforms recommended a power transform for the conductivity response with lambda (λ) = 0.50, and for the DO response, λ =2.2. The overall mean of the temperature response proved to be a better predictor than the linear model. The conductivity response is best fitted with a linear model using significant coded terms B and C. Both DO and density also showed a linear model with coded terms A, B, and C for DO; and terms A and C for density. Some of the surfactant treatments of PCE significantly alter the conductivity, DO, and density of the aqueous solution. However, the magnitude of the density response is so small that it does not exceed the instrument tolerance. Results for the conductivity and DO responses provide predictive models for the surfactant treatment of PCE and may be useful in determining the potential for geophysically monitoring surfactant enhanced aquifer remediation (SEAR) of PCE. As the aqueous physicochemical properties change due to surfactant remediation efforts, so will the properties of the subsurface pore water which are influential factors in geophysical measurements. Geoelectrical methods are potentially the best suited to measure SEAR alterations in the subsurface because the conductivity of the pore fluid has the largest relative change. This research has provided predictive models for alterations in the physicochemical properties of the pore fluid to SEAR of PCE. Future investigations should address the contribution of the solid matrix in the subsurface and the solid-fluid interaction during SEAR of PCE contamination. Notice: Although this work was reviewed by EPA and approved for publication, it may not necessarily reflect official Agency policy. Mention of trade names or commercial products does not constitute endorsement or recommendation by EPA for use.
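
    The Box-Cox diagnostics reported above select a power transform for each response (λ ≈ 0.50 for conductivity, essentially a square root, and λ ≈ 2.2 for dissolved oxygen). A short sketch of how such a λ is estimated and applied with scipy; the response values below are random placeholders, not the experimental data.

      # Box-Cox lambda estimation/application on placeholder positive response data.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      conductivity = rng.gamma(shape=2.0, scale=50.0, size=240)   # must be > 0

      transformed, lam_hat = stats.boxcox(conductivity)           # maximum-likelihood lambda
      print("estimated lambda:", round(lam_hat, 2))

      sqrt_like = stats.boxcox(conductivity, lmbda=0.50)          # apply a fixed lambda instead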

  19. Relativity Screens for Misvalued Medical Services: Impact on Noninvasive Diagnostic Radiology.

    PubMed

    Rosenkrantz, Andrew B; Silva, Ezequiel; Hawkins, C Matthew

    2017-11-01

    In 2006, the AMA/Specialty Society Relative Value Scale Update Committee (RUC) introduced ongoing relativity screens to identify potentially misvalued medical services for payment adjustments. We assess the impact of these screens on the valuation of noninvasive diagnostic radiology services. Data regarding relativity screens and relative value unit (RVU) changes were obtained from the 2016 AMA Relativity Assessment Status Report. All global codes in the 2016 Medicare Physician Fee Schedule with associated work RVUs were classified as noninvasive diagnostic radiology services versus remaining services. The frequency of having ever undergone a screen was compared between the two groups. Screened radiology codes were further evaluated regarding the RVU impact of subsequent revaluation. Of noninvasive diagnostic radiology codes, 46.0% (201 of 437) were screened versus 22.2% (1,460 of 6,575) of remaining codes (P < .001). The most common screens for which radiology codes were identified as potentially misvalued were (1) high expenditures (27.5%) and (2) high utilization (25.6%). The modality and body region most likely to be identified in a screen were CT (82.1%) and breast (90.9%), respectively. Among screened radiology codes, work RVUs, practice expense RVUs, and nonfacility total RVUs decreased in 20.3%, 65.9%, and 75.3%, respectively. All screened CT, MRI, brain, and spine codes exhibited decreased total RVUs. Policymakers' ongoing search for potentially misvalued medical services has disproportionately impacted noninvasive diagnostic radiology services, risking the introduction of unintended or artificial shifts in physician practice.
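
    The headline comparison of screening rates (46.0% of 437 radiology codes versus 22.2% of 6,575 other codes, P < .001) is a simple 2×2 proportion comparison. A sketch of that check using the counts reported in the abstract; the choice of a chi-square test here is an assumption about how the comparison was made.

      # 2x2 test on the reported screening counts (test choice assumed).
      import numpy as np
      from scipy.stats import chi2_contingency

      table = np.array([[201, 437 - 201],        # radiology codes: screened, not screened
                        [1460, 6575 - 1460]])    # remaining codes: screened, not screened
      chi2, p, dof, _ = chi2_contingency(table)
      print(f"chi2={chi2:.1f}, p={p:.2g}")       # p is far below .001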

  20. Experimental measurements of hydrodynamic instabilities on NOVA of relevance to astrophysics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Budil, K S; Cherfils, C; Drake, R P

    1998-09-11

    Large lasers such as Nova allow the possibility of achieving regimes of high energy densities in plasmas of millimeter spatial scales and nanosecond time scales. In those plasmas where thermal conductivity and viscosity do not play a significant role, the hydrodynamic evolution is suitable for benchmarking hydrodynamics modeling in astrophysical codes. Several experiments on Nova examine hydrodynamically unstable interfaces. A typical Nova experiment uses a gold millimeter-scale hohlraum to convert the laser energy to a 200 eV blackbody source lasting about a nanosecond. The x-rays ablate a planar target, generating a series of shocks and accelerating the target. The evolving areal density is diagnosed by time-resolved radiography, using a second x-ray source. Data from several experiments are presented and diagnostic techniques are discussed.

  1. Dissemination and support of ARGUS for accelerator applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The ARGUS code is a three-dimensional code system for simulating interactions between charged particles, electric and magnetic fields, and complex structures. It is a system of modules that share common utilities for grid and structure input, data handling, memory management, diagnostics, and other specialized functions. The code includes the fields due to the space charge and current density of the particles to achieve a self-consistent treatment of the particle dynamics. The physics modules in ARGUS include three-dimensional field solvers for electrostatics and electromagnetics, a three-dimensional electromagnetic frequency-domain module, a full particle-in-cell (PIC) simulation module, and a steady-state PIC model. These are described in the Appendix to this report. This project has a primary mission of developing the capabilities of ARGUS in accelerator modeling for release to the accelerator design community. Five major activities are being pursued in parallel during the first year of the project: to improve the code and/or add new modules that provide capabilities needed for accelerator design; to produce a User's Guide that documents the use of the code for all users; to release the code and the User's Guide to accelerator laboratories for their own use, and to obtain feedback from them; to build an interactive user interface for setting up ARGUS calculations; and to explore the use of ARGUS on high-power workstation platforms.

  2. Multi-channel transport experiments at Alcator C-Mod and comparison with gyrokinetic simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, A. E.; Howard, N. T.; Greenwald, M.

    2013-05-15

    Multi-channel transport experiments have been conducted in auxiliary heated (Ion Cyclotron Range of Frequencies) L-mode plasmas at Alcator C-Mod [Marmar and Alcator C-Mod Group, Fusion Sci. Technol. 51(3), 3261 (2007)]. These plasmas provide good diagnostic coverage for measurements of kinetic profiles, impurity transport, and turbulence (electron temperature and density fluctuations). In the experiments, a steady sawtoothing L-mode plasma with 1.2 MW of on-axis RF heating is established and density is scanned by 20%. Measured rotation profiles change from peaked to hollow in shape as density is increased, but electron density and impurity profiles remain peaked. Ion or electron heat fluxes from the two plasmas are the same. The experimental results are compared directly to nonlinear gyrokinetic theory using synthetic diagnostics and the code GYRO [Candy and Waltz, J. Comput. Phys. 186, 545 (2003)]. We find good agreement with experimental ion heat flux, impurity particle transport, and trends in the fluctuation level ratio (T̃_e/T_e)/(ñ_e/n_e), but underprediction of electron heat flux. We find that changes in momentum transport (rotation profiles changing from peaked to hollow) do not correlate with changes in particle transport, and also do not correlate with changes in linear mode dominance, e.g., Ion Temperature Gradient versus Trapped Electron Mode. The new C-Mod results suggest that the drives for momentum transport differ from drives for heat and particle transport. The experimental results are inconsistent with present quasilinear models, and the strong sensitivity of core rotation to density remains unexplained.

  3. How do gut feelings feature in tutorial dialogues on diagnostic reasoning in GP traineeship?

    PubMed

    Stolper, C F; Van de Wiel, M W J; Hendriks, R H M; Van Royen, P; Van Bokhoven, M A; Van der Weijden, T; Dinant, G J

    2015-05-01

    Diagnostic reasoning is considered to be based on the interaction between analytical and non-analytical cognitive processes. Gut feelings, a specific form of non-analytical reasoning, play a substantial role in diagnostic reasoning by general practitioners (GPs) and may activate analytical reasoning. In GP traineeships in the Netherlands, trainees mostly see patients alone but regularly consult with their supervisors to discuss patients and problems, receive feedback, and improve their competencies. In the present study, we examined the discussions of supervisors and their trainees about diagnostic reasoning in these so-called tutorial dialogues and how gut feelings feature in these discussions. 17 tutorial dialogues focussing on diagnostic reasoning were video-recorded and transcribed and the protocols were analysed using a detailed bottom-up and iterative content analysis and coding procedure. The dialogues were segmented into quotes. Each quote received a content code and a participant code. The number of words per code was used as a unit of analysis to quantitatively compare the contributions to the dialogues made by supervisors and trainees, and the attention given to different topics. The dialogues were usually analytical reflections on a trainee's diagnostic reasoning. A hypothetico-deductive strategy was often used, by listing differential diagnoses and discussing what information guided the reasoning process and might confirm or exclude provisional hypotheses. Gut feelings were discussed in seven dialogues. They were used as a tool in diagnostic reasoning, inducing analytical reflection, sometimes on the entire diagnostic reasoning process. The emphasis in these tutorial dialogues was on analytical components of diagnostic reasoning. Discussing gut feelings in tutorial dialogues seems to be a good educational method to familiarize trainees with non-analytical reasoning. Supervisors need specialised knowledge about these aspects of diagnostic reasoning and how to deal with them in medical education.

  4. Understanding diagnostic variability in breast pathology: lessons learned from an expert consensus review panel

    PubMed Central

    Allison, Kimberly H; Reisch, Lisa M; Carney, Patricia A; Weaver, Donald L; Schnitt, Stuart J; O’Malley, Frances P; Geller, Berta M; Elmore, Joann G

    2015-01-01

    Aims To gain a better understanding of the reasons for diagnostic variability, with the aim of reducing the phenomenon. Methods and results In preparation for a study on the interpretation of breast specimens (B-PATH), a panel of three experienced breast pathologists reviewed 336 cases to develop consensus reference diagnoses. After independent assessment, cases coded as diagnostically discordant were discussed at consensus meetings. By the use of qualitative data analysis techniques, transcripts of 16 h of consensus meetings for a subset of 201 cases were analysed. Diagnostic variability could be attributed to three overall root causes: (i) pathologist-related; (ii) diagnostic coding/study methodology-related; and (iii) specimen-related. Most pathologist-related root causes were attributable to professional differences in pathologists’ opinions about whether the diagnostic criteria for a specific diagnosis were met, most frequently in cases of atypia. Diagnostic coding/study methodology-related root causes were primarily miscategorizations of descriptive text diagnoses, which led to the development of a standardized electronic diagnostic form (BPATH-Dx). Specimen-related root causes included artefacts, limited diagnostic material, and poor slide quality. After re-review and discussion, a consensus diagnosis could be assigned in all cases. Conclusions Diagnostic variability is related to multiple factors, but consensus conferences, standardized electronic reporting formats and comments on suboptimal specimen quality can be used to reduce diagnostic variability. PMID:24511905

  5. SPECT3D - A multi-dimensional collisional-radiative code for generating diagnostic signatures based on hydrodynamics and PIC simulation output

    NASA Astrophysics Data System (ADS)

    MacFarlane, J. J.; Golovkin, I. E.; Wang, P.; Woodruff, P. R.; Pereyra, N. A.

    2007-05-01

    SPECT3D is a multi-dimensional collisional-radiative code used to post-process the output from radiation-hydrodynamics (RH) and particle-in-cell (PIC) codes to generate diagnostic signatures (e.g. images, spectra) that can be compared directly with experimental measurements. This ability to post-process simulation code output plays a pivotal role in assessing the reliability of RH and PIC simulation codes and their physics models. SPECT3D has the capability to operate on plasmas in 1D, 2D, and 3D geometries. It computes a variety of diagnostic signatures that can be compared with experimental measurements, including: time-resolved and time-integrated spectra, space-resolved spectra and streaked spectra; filtered and monochromatic images; and X-ray diode signals. Simulated images and spectra can include the effects of backlighters, as well as the effects of instrumental broadening and time-gating. SPECT3D also includes a drilldown capability that shows where frequency-dependent radiation is emitted and absorbed as it propagates through the plasma towards the detector, thereby providing insights on where the radiation seen by a detector originates within the plasma. SPECT3D has the capability to model a variety of complex atomic and radiative processes that affect the radiation seen by imaging and spectral detectors in high energy density physics (HEDP) experiments. LTE (local thermodynamic equilibrium) or non-LTE atomic level populations can be computed for plasmas. Photoabsorption rates can be computed using either escape probability models or, for selected 1D and 2D geometries, multi-angle radiative transfer models. The effects of non-thermal (i.e. non-Maxwellian) electron distributions can also be included. To study the influence of energetic particles on spectra and images recorded in intense short-pulse laser experiments, the effects of both relativistic electrons and energetic proton beams can be simulated. SPECT3D is a user-friendly software package that runs on Windows, Linux, and Mac platforms. A parallel version of SPECT3D is supported for Linux clusters for large-scale calculations. We will discuss the major features of SPECT3D, and present example results from simulations and comparisons with experimental data.

  6. A final report to the Laboratory Directed Research and Development committee on Project 93-ERP-075: ``X-ray laser propagation and coherence: Diagnosing fast-evolving, high-density laser plasmas using X-ray lasers``

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wan, A.S.; Cauble, R.; Da Silva, L.B.

    1996-02-01

    This report summarizes the major accomplishments of this three-year Laboratory Directed Research and Development (LDRD) Exploratory Research Project (ERP) entitled "X-ray Laser Propagation and Coherence: Diagnosing Fast-evolving, High-density Laser Plasmas Using X-ray Lasers," tracking code 93-ERP-075. The most significant accomplishment of this project is the demonstration of a new laser plasma diagnostic: a soft x-ray Mach-Zehnder interferometer using a neonlike yttrium x-ray laser at 155 Å as the probe source. Detailed comparisons of absolute two-dimensional electron density profiles obtained from soft x-ray laser interferograms and profiles obtained from radiation hydrodynamics codes, such as LASNEX, will allow us to validate and benchmark complex numerical models used to study the physics of laser-plasma interactions. Thus the development of the soft x-ray interferometry technique provides a mechanism to probe the deficiencies of the numerical models and is an important tool for the high-energy-density physics and science-based stockpile stewardship programs. The authors have used the soft x-ray interferometer to study a number of high-density, fast-evolving, laser-produced plasmas, such as the dynamics of exploding foils and colliding plasmas. They are pursuing the application of the soft x-ray interferometer to study ICF-relevant plasmas, such as capsules and hohlraums, on the Nova 10-beam facility. They have also studied the development of enhanced-coherence, shorter-pulse-duration, and high-brightness x-ray lasers. The utilization of improved x-ray laser sources can ultimately enable them to obtain three-dimensional holographic images of laser-produced plasmas.
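
    As a hedged sketch of how line-integrated electron density follows from an interferogram, the snippet below applies the standard fringe-shift relation N = ∫n_e dl / (2 λ n_c), valid when n_e is much less than the critical density n_c; the fringe-shift value is illustrative and not taken from the experiments.

        import numpy as np

        wavelength_um = 155e-4                     # 155 Angstrom probe wavelength, in microns
        n_crit = 1.1e21 / wavelength_um**2         # critical density for this wavelength, cm^-3
        fringe_shift = 0.5                         # hypothetical measured fringe shift

        wavelength_cm = wavelength_um * 1e-4
        n_e_dl = 2.0 * fringe_shift * wavelength_cm * n_crit   # line-integrated density, cm^-2
        print(f"integral n_e dl ~ {n_e_dl:.2e} cm^-2")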

  7. Sub-millisecond electron density profile measurement at the JET tokamak with the fast lithium beam emission spectroscopy system

    NASA Astrophysics Data System (ADS)

    Réfy, D. I.; Brix, M.; Gomes, R.; Tál, B.; Zoletnik, S.; Dunai, D.; Kocsis, G.; Kálvin, S.; Szabolics, T.; JET Contributors

    2018-04-01

    Diagnostic alkali atom (e.g., lithium) beams are routinely used to diagnose magnetically confined plasmas, namely, to measure the plasma electron density profile in the edge and the scrape off layer region. A light splitting optics system was installed into the observation system of the lithium beam emission spectroscopy diagnostic at the Joint European Torus (JET) tokamak, which allows simultaneous measurement of the beam light emission with a spectrometer and a fast avalanche photodiode (APD) camera. The spectrometer measurement allows density profile reconstruction with ˜10 ms time resolution, absolute position calculation from the Doppler shift, spectral background subtraction as well as relative intensity calibration of the channels for each discharge. The APD system is capable of measuring light intensities on the microsecond time scale. However ˜100 μs integration is needed to have an acceptable signal to noise ratio due to moderate light levels. Fast modulation of the beam up to 30 kHz is implemented which allows background subtraction on the 100 μs time scale. The measurement covers the 0.9 < ρpol < 1.1 range with 6-10 mm optical resolution at the measurement location which translates to 3-5 mm radial resolution at the midplane due to flux expansion. An automated routine has been developed which performs the background subtraction, the relative calibration, and the comprehensive error calculation, runs a Bayesian density reconstruction code, and loads results to the JET database. The paper demonstrates the capability of the APD system by analyzing fast phenomena like pellet injection and edge localized modes.
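
    The beam-modulation background subtraction mentioned above can be pictured with the toy sketch below: passive light collected during "beam off" phases is interpolated in time and subtracted from the "beam on" phases. The sampling rate, modulation pattern, and signal values are illustrative assumptions, not the JET system's actual parameters.

        import numpy as np

        rng = np.random.default_rng(0)
        t = np.arange(2000) * 1e-6                       # 1 us samples over 2 ms
        beam_on = (np.arange(t.size) // 50) % 2 == 0     # 50-sample (50 us) on/off blocks
        signal = 0.2 + 0.8 * beam_on + rng.normal(0, 0.02, t.size)

        # Background estimated from "off" phases only, then removed from "on" phases
        background = np.interp(t, t[~beam_on], signal[~beam_on])
        active_emission = np.where(beam_on, signal - background, np.nan)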

  8. Fast-ion transport in low density L-mode plasmas at TCV using FIDA spectroscopy and the TRANSP code

    NASA Astrophysics Data System (ADS)

    Geiger, B.; Karpushov, A. N.; Duval, B. P.; Marini, C.; Sauter, O.; Andrebe, Y.; Testa, D.; Marascheck, M.; Salewski, M.; Schneider, P. A.; the TCV Team; the EUROfusion MST1 Team

    2017-11-01

    Experiments with the new neutral beam injection source of TCV have been performed with high fast-ion fractions (>20%) that exhibit a clear reduction of the loop voltage and a clear increase of the plasma pressure in on- and off-axis heating configurations. However, good quantitative agreement between the experimental data and TRANSP predictions is only found when including strong additional fast-ion losses. These losses could in part be caused by turbulence or MHD activity as, e.g., high frequency modes near the frequency of toroidicity-induced Alfvén eigenmodes are observed. In addition, a newly installed fast-ion D-alpha (FIDA) spectroscopy system measures strong passive radiation and, hence, indicates the presence of high background neutral densities such that charge-exchange losses are substantial. Also the active radiation measured with the FIDA diagnostic, as well as data from a neutral particle analyzer, suggest strong fast-ion losses and large neutral densities. The large neutral densities can be justified since high electron temperatures (3-4 keV), combined with low electron densities (about 2×10^19 m^-3), yield long mean free paths for the neutrals penetrating from the walls.

  9. 140 GHz EC waves propagation and absorption for normal/oblique injection on FTU tokamak

    NASA Astrophysics Data System (ADS)

    Nowak, S.; Airoldi, A.; Bruschi, A.; Buratti, P.; Cirant, S.; Gandini, F.; Granucci, G.; Lazzaro, E.; Panaccione, L.; Ramponi, G.; Simonetto, A.; Sozzi, C.; Tudisco, O.; Zerbini, M.

    1999-09-01

    Most of the interest in ECRH experiments is linked to the high localization of EC wave absorption in well-known portions of the plasma volume. In order to take full advantage of this capability, a reliable code has been developed for beam tracing and absorption calculations. The code is particularly important for oblique (poloidal and toroidal) injection, when the absorbing layer is not simply dependent on the position of the EC resonance only. An experimental estimate of the local heating power density is given by the jump in the time derivative of the local electron pressure at the switch-on of the gyrotron power. The evolution of the temperature profile increase (from the ECE polychromator) during the nearly adiabatic phase is also considered for ECRH profile reconstruction. An indirect estimate of the optical thickness and of the overall absorption coefficient is given by the measurement of the residual e.m. power at the tokamak walls. Beam tracing code predictions of the power deposition profile are compared with experimental estimates. The impact of the finite spatial resolution of the temperature diagnostic on profile reconstruction is also discussed.
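
    A minimal sketch of the power-deposition estimate described above: the locally absorbed power density is approximated by the jump in the time derivative of the electron pressure, (3/2) n_e dT_e/dt, at the gyrotron switch-on. The profile values and temperature slopes below are made-up numbers for illustration only.

        import numpy as np

        n_e = np.array([4.0e19, 3.5e19, 3.0e19])       # m^-3, electron density at three radii
        dTe_dt_before = np.array([0.0, 0.0, 0.0])       # keV/s just before switch-on
        dTe_dt_after = np.array([80.0, 200.0, 60.0])    # keV/s just after switch-on

        keV_to_J = 1.602e-16
        p_dep = 1.5 * n_e * (dTe_dt_after - dTe_dt_before) * keV_to_J   # W/m^3
        print(p_dep)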

  10. Status report on the 'Merging' of the Electron-Cloud Code POSINST with the 3-D Accelerator PIC CODE WARP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vay, J.-L.; Furman, M.A.; Azevedo, A.W.

    2004-04-19

    We have integrated the electron-cloud code POSINST [1] with WARP [2]--a 3-D parallel Particle-In-Cell accelerator code developed for Heavy Ion Inertial Fusion--so that the two can interoperate. Both codes are run in the same process, communicate through a Python interpreter (already used in WARP), and share certain key arrays (so far, particle positions and velocities). Currently, POSINST provides primary and secondary sources of electrons, beam bunch kicks, a particle mover, and diagnostics. WARP provides the field solvers and diagnostics. Secondary emission routines are provided by the Tech-X package CMEE.
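
    The coupling pattern described above (two codes in one process, sharing arrays through a Python driver) can be sketched as below. This is an illustration of the idea only; the module names and call signatures are hypothetical and are not the actual POSINST or WARP interfaces.

        import numpy as np

        # Shared particle arrays: both "codes" operate on the same memory, no copies
        positions = np.zeros((10_000, 3))
        velocities = np.zeros((10_000, 3))

        def electron_cloud_step(x, v, dt):        # stands in for the electron-cloud module
            v += dt * np.array([0.0, 0.0, -1.0])  # e.g. apply beam-bunch kicks

        def field_solve_and_push(x, v, dt):       # stands in for the PIC field solver/mover
            x += dt * v                           # advance the shared particle positions

        for step in range(100):
            electron_cloud_step(positions, velocities, dt=1e-9)
            field_solve_and_push(positions, velocities, dt=1e-9)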

  11. Diagnostic Coding of Abuse Related Fractures at Two Children's Emergency Departments

    ERIC Educational Resources Information Center

    Somji, Zeeshanefatema; Plint, Amy; McGahern, Candice; Al-Saleh, Ahmed; Boutis, Kathy

    2011-01-01

    Objectives: Pediatric fractures suspicious for abuse are often evaluated in emergency departments (ED), although corresponding diagnostic coding for possible abuse may be lacking. Thus, the primary objective of this study was to determine the proportion of fracture cases investigated in the ED for abuse that had corresponding International…

  12. 38 CFR 4.115a - Ratings of the genitourinary system-dysfunctions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... function; or, hypertension at least 40 percent disabling under diagnostic code 7101 60 Albumin constant or recurring with hyaline and granular casts or red blood cells; or, transient or slight edema or hypertension... nephritis; or, hypertension non-compensable under diagnostic code 7101 0 Voiding dysfunction: Rate...

  13. 38 CFR 4.115a - Ratings of the genitourinary system-dysfunctions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... function; or, hypertension at least 40 percent disabling under diagnostic code 7101 60 Albumin constant or recurring with hyaline and granular casts or red blood cells; or, transient or slight edema or hypertension... nephritis; or, hypertension non-compensable under diagnostic code 7101 0 Voiding dysfunction: Rate...

  14. 38 CFR 4.115a - Ratings of the genitourinary system-dysfunctions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... function; or, hypertension at least 40 percent disabling under diagnostic code 7101 60 Albumin constant or recurring with hyaline and granular casts or red blood cells; or, transient or slight edema or hypertension... nephritis; or, hypertension non-compensable under diagnostic code 7101 0 Voiding dysfunction: Rate...

  15. Quality of data regarding diagnoses of spinal disorders in administrative databases. A multicenter study.

    PubMed

    Faciszewski, T; Broste, S K; Fardon, D

    1997-10-01

    The purpose of the present study was to evaluate the accuracy of data regarding diagnoses of spinal disorders in administrative databases at eight different institutions. The records of 189 patients who had been managed for a disorder of the lumbar spine were independently reviewed by a physician who assigned the appropriate diagnostic codes according to the International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM). The age range of the 189 patients was seventeen to eighty-four years. The six major diagnostic categories studied were herniation of a lumbar disc, a previous operation on the lumbar spine, spinal stenosis, cauda equina syndrome, acquired spondylolisthesis, and congenital spondylolisthesis. The diagnostic codes assigned by the physician were compared with the codes that had been assigned during the ordinary course of events by personnel in the medical records department of each of the eight hospitals. The accuracy of coding was also compared among the eight hospitals, and it was found to vary depending on the diagnosis. Although there were both false-negative and false-positive codes at each institution, most errors were related to the low sensitivity of coding for previous spinal operations: only seventeen (28 per cent) of sixty-one such diagnoses were coded correctly. Other errors in coding were less frequent, but their implications for conclusions drawn from the information in administrative databases depend on the frequency of a diagnosis and its importance in an analysis. This study demonstrated that the accuracy of a diagnosis of a spinal disorder recorded in an administrative database varies according to the specific condition being evaluated. It is necessary to document the relative accuracy of specific ICD-9-CM diagnostic codes in order to improve the ability to validate the conclusions derived from investigations based on administrative databases.
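
    The headline figure above (17 of 61 previous-operation diagnoses coded correctly) is simply a coding sensitivity; a small worked example, using a hypothetical helper function, is given below.

        def sensitivity(true_positives: int, false_negatives: int) -> float:
            # fraction of truly present diagnoses that received the correct code
            return true_positives / (true_positives + false_negatives)

        tp, fn = 17, 61 - 17
        print(f"sensitivity = {sensitivity(tp, fn):.0%}")            # 28%
        print(f"missed codes = {1 - sensitivity(tp, fn):.0%}")       # 72%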

  16. Earth Global Reference Atmospheric Model 2007 (Earth-GRAM07)

    NASA Technical Reports Server (NTRS)

    Leslie, Fred W.; Justus, C. G.

    2008-01-01

    GRAM is a Fortran software package that can run on a variety of platforms including PCs. GRAM provides values of atmospheric quantities such as temperature, pressure, density, winds, constituents, etc. GRAM99 covers all global locations, all months, and heights from the surface to approximately 1000 km. Dispersions (perturbations) of these parameters are also provided and are spatially and temporally correlated. GRAM can be run in a stand-alone mode or called as a subroutine from a trajectory program. GRAM07 is diagnostic, not prognostic (i.e., it describes the atmosphere, but it does not forecast). The source code is distributed free-of-charge to eligible recipients.

  17. Energy transport in plasmas produced by a high brightness krypton fluoride laser focused to a line

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Al-Hadithi, Y.; Tallents, G.J.; Zhang, J.

    A high brightness krypton fluoride Raman laser (wavelength 0.268 μm) generating 0.3 TW, 12 ps pulses with 20 μrad beam divergence and a prepulse of less than 10^-10 has been focused to produce a 10 μm wide line focus (irradiances ~0.8-4×10^15 W cm^-2) on plastic targets with a diagnostic sodium fluoride (NaF) layer buried within the target. Axial and lateral transport of energy has been measured by analysis of x-ray images of the line focus and from x-ray spectra emitted by the layer of NaF with varying overlay thicknesses. It is shown that the ratio of the distance between the critical density surface and the ablation surface to the laser focal width controls lateral transport in a similar manner as for previous spot focus experiments. The measured axial energy transport is compared to MEDUSA [J. P. Christiansen, D. E. T. F. Ashby, and K. V. Roberts, Comput. Phys. Commun. 7, 271 (1974)] one-dimensional hydrodynamic code simulations with an average atom post-processor for predicting spectral line intensities. An energy absorption of ~10% in the code gives agreement with the experimental axial penetration. Various measured line ratios of hydrogen- and helium-like Na and F are investigated as temperature diagnostics in the NaF layer using the RATION [R. W. Lee, B. L. Whitten, and R. E. Strout, J. Quant. Spectrosc. Radiat. Transfer 32, 91 (1984)] code.

  18. Nearest Neighbor Classification Using a Density Sensitive Distance Measurement

    DTIC Science & Technology

    2009-09-01

    Both the proposed density-sensitive distance measurement and Euclidean distance are compared on the Wisconsin Diagnostic Breast Cancer (WDBC) dataset and the MNIST dataset.

  19. The Sensitivity of Adverse Event Cost Estimates to Diagnostic Coding Error

    PubMed Central

    Wardle, Gavin; Wodchis, Walter P; Laporte, Audrey; Anderson, Geoffrey M; Baker, Ross G

    2012-01-01

    Objective To examine the impact of diagnostic coding error on estimates of hospital costs attributable to adverse events. Data Sources Original and reabstracted medical records of 9,670 complex medical and surgical admissions at 11 hospital corporations in Ontario from 2002 to 2004. Patient specific costs, not including physician payments, were retrieved from the Ontario Case Costing Initiative database. Study Design Adverse events were identified among the original and reabstracted records using ICD10-CA (Canadian adaptation of ICD10) codes flagged as postadmission complications. Propensity score matching and multivariate regression analysis were used to estimate the cost of the adverse events and to determine the sensitivity of cost estimates to diagnostic coding error. Principal Findings Estimates of the cost of the adverse events ranged from $16,008 (metabolic derangement) to $30,176 (upper gastrointestinal bleeding). Coding errors caused the total cost attributable to the adverse events to be underestimated by 16 percent. The impact of coding error on adverse event cost estimates was highly variable at the organizational level. Conclusions Estimates of adverse event costs are highly sensitive to coding error. Adverse event costs may be significantly underestimated if the likelihood of error is ignored. PMID:22091908

  20. 1D analysis of Radiative Shock damping by lateral radiative losses.

    NASA Astrophysics Data System (ADS)

    Busquet, Michel; Colombier, Jean-Philippe; Stehle, Chantal

    2007-11-01

    It has been shown theoretically and experimentally [1] that the radiative precursor in front of a strong shock in high-Z material is slowed down by lateral radiative losses. The 2D simulation showed that the shock front and the precursor front remain planar, with an increase of density and a decrease of temperature close to the walls. The damping of the precursor is obviously sensitive to the fraction of self-emitted radiation reflected by the walls (the albedo). In order to perform parametric studies we include the albedo-controlled lateral radiative losses in the 1D hydro-code MULTI (created by Ramis et al [2]) both in terms of energy balance and of spectral diagnostic. [1] Gonzales et al, Laser Part. Beams 24, 1-6 (2006) ; Busquet et al, High Energy Density Physics (2007), doi: 10.1016/j.hedp.2007.01.002 [2] Ramis et al, Comp. Phys. Comm., 49 (1988), 475

  1. Crosstalk eliminating and low-density parity-check codes for photochromic dual-wavelength storage

    NASA Astrophysics Data System (ADS)

    Wang, Meicong; Xiong, Jianping; Jian, Jiqi; Jia, Huibo

    2005-01-01

    Multi-wavelength storage is an approach to increasing memory density, with the problem of crosstalk to be dealt with. We apply Low Density Parity Check (LDPC) codes as error-correcting codes in photochromic dual-wavelength optical storage based on the investigation of LDPC codes in optical data storage. A proper method is applied to reduce the crosstalk, and simulation results show that this operation improves Bit Error Rate (BER) performance. At the same time we can conclude that LDPC codes outperform RS codes in the crosstalk channel.
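
    For readers unfamiliar with parity-check decoding, the toy example below shows the syndrome test H c^T = 0 (mod 2) that any valid codeword of a parity-check code must satisfy; the matrix is far smaller and denser than a practical LDPC code and is purely illustrative.

        import numpy as np

        H = np.array([[1, 1, 0, 1, 0, 0],
                      [0, 1, 1, 0, 1, 0],
                      [1, 0, 1, 0, 0, 1]])      # toy parity-check matrix

        codeword = np.array([1, 0, 1, 1, 1, 0])
        syndrome = H @ codeword % 2             # all-zero syndrome means no detected errors
        print("valid" if not syndrome.any() else f"errors detected, syndrome = {syndrome}")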

  2. 40 CFR 86.1806-04 - On-board diagnostics.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... codes shall be consistent with SAE J2012 “Diagnostic Trouble Code Definitions—Equivalent to ISO/DIS... sent to the scan tool over a J1850 data link shall use the Cyclic Redundancy Check and the three byte..., definitions and abbreviations shall be formatted according to SAE J1930 “Electrical/Electronic Systems...

  3. 40 CFR 86.1806-04 - On-board diagnostics.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... codes shall be consistent with SAE J2012 “Diagnostic Trouble Code Definitions—Equivalent to ISO/DIS... sent to the scan tool over a J1850 data link shall use the Cyclic Redundancy Check and the three byte..., definitions and abbreviations shall be formatted according to SAE J1930 “Electrical/Electronic Systems...

  4. 40 CFR 86.1806-04 - On-board diagnostics.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... codes shall be consistent with SAE J2012 “Diagnostic Trouble Code Definitions—Equivalent to ISO/DIS... sent to the scan tool over a J1850 data link shall use the Cyclic Redundancy Check and the three byte..., definitions and abbreviations shall be formatted according to SAE J1930 “Electrical/Electronic Systems...

  5. Species and temperature measurement in H2/O2 rocket flow fields by means of Raman scattering diagnostics

    NASA Technical Reports Server (NTRS)

    Degroot, Wim A.; Weiss, Jonathan M.

    1992-01-01

    Validation of Computational Fluid Dynamics (CFD) codes developed for prediction and evaluation of rocket performance is hampered by a lack of experimental data. Non-intrusive laser based diagnostics are needed to provide spatially and temporally resolved gas dynamic and fluid dynamic measurements. This paper reports the first non-intrusive temperature and species measurements in the plume of a 110 N gaseous hydrogen/oxygen thruster at and below ambient pressures, obtained with spontaneous Raman spectroscopy. Measurements at 10 mm downstream of the exit plane are compared with predictions from a numerical solution of the axisymmetric Navier-Stokes and species transport equations with chemical kinetics, which fully model the combustor-nozzle-plume flowfield. The experimentally determined oxygen number density at the centerline at 10 mm downstream of the exit plane is four times that predicted by the model. The experimental number density data fall between those numerically predicted for the exit and 10 mm downstream planes in both magnitude and radial gradient. The predicted temperature levels are within 10 to 15 percent of measured values. Some of the discrepancies between experimental data and predictions result from not modeling the three dimensional core flow injection mixing process, facility back pressure effects, and possible diffuser-thruster interactions.

  6. Low-Density Parity-Check (LDPC) Codes Constructed from Protographs

    NASA Astrophysics Data System (ADS)

    Thorpe, J.

    2003-08-01

    We introduce a new class of low-density parity-check (LDPC) codes constructed from a template called a protograph. The protograph serves as a blueprint for constructing LDPC codes of arbitrary size whose performance can be predicted by analyzing the protograph. We apply standard density evolution techniques to predict the performance of large protograph codes. Finally, we use a randomized search algorithm to find good protographs.
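
    A hedged sketch of the protograph "lifting" construction mentioned above: each 1 in the small base matrix is replaced by an N x N permutation matrix and each 0 by an N x N zero block, yielding a large sparse parity-check matrix that copies the protograph's local structure. The base matrix and lift size below are illustrative, not a code from the article.

        import numpy as np

        rng = np.random.default_rng(1)
        base = np.array([[1, 1, 0, 1],
                         [0, 1, 1, 1]])          # protograph (base) matrix
        N = 8                                    # lifting factor

        def random_permutation(n):
            P = np.zeros((n, n), dtype=int)
            P[np.arange(n), rng.permutation(n)] = 1
            return P

        blocks = [[random_permutation(N) if b else np.zeros((N, N), dtype=int) for b in row]
                  for row in base]
        H = np.block(blocks)                     # lifted parity-check matrix
        print(H.shape)                           # (16, 32)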

  7. Dissemination and support of ARGUS for accelerator applications. Technical progress report, April 24, 1991--January 20, 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The ARGUS code is a three-dimensional code system for simulating interactions between charged particles, electric and magnetic fields, and complex structures. It is a system of modules that share common utilities for grid and structure input, data handling, memory management, diagnostics, and other specialized functions. The code includes the fields due to the space charge and current density of the particles to achieve a self-consistent treatment of the particle dynamics. The physics modules in ARGUS include three-dimensional field solvers for electrostatics and electromagnetics, a three-dimensional electromagnetic frequency-domain module, a full particle-in-cell (PIC) simulation module, and a steady-state PIC model. These are described in the Appendix to this report. This project has a primary mission of developing the capabilities of ARGUS in accelerator modeling and releasing the code to the accelerator design community. Five major activities are being pursued in parallel during the first year of the project: to improve the code and/or add new modules that provide capabilities needed for accelerator design; to produce a User's Guide that documents the use of the code for all users; to release the code and the User's Guide to accelerator laboratories for their own use, and to obtain feedback from them; to build an interactive user interface for setting up ARGUS calculations; and to explore the use of ARGUS on high-power workstation platforms.

  8. 42 CFR 414.502 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... diagnostic laboratory test for which a new or substantially revised Healthcare Common Procedure Coding System Code is assigned on or after January 1, 2005. Substantially Revised Healthcare Common Procedure Coding...

  9. 42 CFR 414.502 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... diagnostic laboratory test for which a new or substantially revised Healthcare Common Procedure Coding System Code is assigned on or after January 1, 2005. Substantially Revised Healthcare Common Procedure Coding...

  10. 42 CFR 414.502 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... diagnostic laboratory test for which a new or substantially revised Healthcare Common Procedure Coding System Code is assigned on or after January 1, 2005. Substantially Revised Healthcare Common Procedure Coding...

  11. Visual Dysfunction Following Blast-Related Traumatic Brain Injury from the Battlefield

    DTIC Science & Technology

    2011-01-01

    and visual disorders is varied, depending on the diagnostic criteria, condition and patient population, but has primarily been studied in civilian... diagnostic codes for 'disorders of the eye and adnexa' (360.0-379.9) obtained from electronic outpatient medical records (Standard Ambulatory Data Record) and... disorder diagnostic category by TBI status (TBI n = 837; other injury n = 1,417).

  12. Visual Dysfunction Following Blast-Related Traumatic Brain Injury from the Battlefield

    DTIC Science & Technology

    2010-10-27

    sequelae following a TBI [12, 13]. The occurrence of TBI-related ocular and visual disorders is varied, depending on the diagnostic criteria... measure, ocular/visual disorder, was indicated by the ICD-9-CM diagnostic codes for 'disorders of the eye and adnexa' (360.0-379.9) obtained from... Number and percentage of US service members in each ocular/visual disorder diagnostic category by TBI status.

  13. VLSI (Very Large Scale Integrated Circuits) Design with the MacPitts Silicon Compiler.

    DTIC Science & Technology

    1985-09-01

    the background. If the algorithm is not fully debugged, then issue instead macpitts basename herald so MacPitts diagnostics and Liszt diagnostics both... command interpreter. Upon compilation, however, the following LISP compiler (Liszt) diagnostic results: Error: Non-number to minus nil, where the first... language used in the MacPitts source code. The more instructive solution is to write the Franz LISP code to decide if a jumper wire is needed, and if so, to

  14. Opacity of iron, nickel, and copper plasmas in the x-ray wavelength range: Theoretical interpretation of 2p-3d absorption spectra

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blenski, T.; Loisel, G.; Poirier, M.

    2011-09-15

    This paper deals with theoretical studies on the 2p-3d absorption in iron, nickel, and copper plasmas related to LULI2000 (Laboratoire pour l'Utilisation des Lasers Intenses, 2000 J facility) measurements in which target temperatures were of the order of 20 eV and plasma densities were in the range 0.004-0.01 g/cm^3. The radiatively heated targets were close to local thermodynamic equilibrium (LTE). The structure of 2p-3d transitions has been studied with the help of the statistical superconfiguration opacity code SCO and with the fine-structure atomic physics codes HULLAC and FAC. A new mixed version of the SCO code, allowing one to treat part of the configurations by detailed calculation based on Cowan's RCG code, has been also used in these comparisons. Special attention was paid to comparisons between theory and experiment concerning the term features which cannot be reproduced by SCO. The differences in the spin-orbit splitting and the statistical (thermal) broadening of the 2p-3d transitions have been investigated as a function of the atomic number Z. It appears that at the conditions of the experiment the role of the term and configuration broadening was different in the three analyzed elements, this broadening being sensitive to the atomic number. Some effects of the temperature gradients and possible non-LTE effects have been studied with the help of the radiative-collisional code SCRIC. The sensitivity of the 2p-3d structures with respect to temperature and density in medium-Z plasmas may be helpful for diagnostics of LTE plasmas, especially in future experiments on the Δn=0 absorption in medium-Z plasmas for astrophysical applications.

  15. A Case Series of the Probability Density and Cumulative Distribution of Laryngeal Disease in a Tertiary Care Voice Center.

    PubMed

    de la Fuente, Jaime; Garrett, C Gaelyn; Ossoff, Robert; Vinson, Kim; Francis, David O; Gelbard, Alexander

    2017-11-01

    To examine the distribution of clinic and operative pathology in a tertiary care laryngology practice. Probability density and cumulative distribution analyses (Pareto analysis) were used to rank-order laryngeal conditions seen in an outpatient tertiary care laryngology practice and those requiring surgical intervention during a 3-year period. Among 3783 new clinic consultations and 1380 operative procedures, voice disorders were the most common primary diagnostic category seen in clinic (n = 3223), followed by airway (n = 374) and swallowing (n = 186) disorders. Within the voice strata, the most common primary ICD-9 code used was dysphonia (41%), followed by unilateral vocal fold paralysis (UVFP) (9%) and cough (7%). Among new voice patients, 45% were found to have a structural abnormality. The most common surgical indications were laryngotracheal stenosis (37%), followed by recurrent respiratory papillomatosis (18%) and UVFP (17%). Nearly 55% of patients presenting to a tertiary referral laryngology practice did not have an identifiable structural abnormality in the larynx on direct or indirect examination. The distribution of ICD-9 codes requiring surgical intervention was disparate from that seen in clinic. Application of the Pareto principle may improve resource allocation in laryngology, but these initial results require confirmation across multiple institutions.
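
    The Pareto (rank-ordered cumulative distribution) analysis described above can be illustrated with the short sketch below; the diagnosis counts are hypothetical stand-ins, not the clinic's actual data.

        import numpy as np

        counts = {"dysphonia": 1321, "UVFP": 290, "cough": 226, "stenosis": 180, "RRP": 95}
        labels, n = zip(*sorted(counts.items(), key=lambda kv: kv[1], reverse=True))
        fraction = np.array(n) / sum(n)
        cumulative = np.cumsum(fraction)         # Pareto curve: share of cases covered so far
        for label, f, c in zip(labels, fraction, cumulative):
            print(f"{label:10s}  {f:5.1%}  cumulative {c:5.1%}")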

  16. LLE Review Quarterly Report (January-March 1999). Volume 78

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Regan, Sean P.

    1999-03-01

    This volume of the LLE Review, covering the period January-March 1999, features two articles concerning issues relevant to 2-D SSD laser-beam smoothing on OMEGA. In the first article J. D. Zuegel and J. A. Marozas present the design of an efficient, bulk phase modulator operating at approximately 10.5 GHz, which can produce substantial phase-modulated bandwidth with modest microwave drive power. This modulator is the cornerstone of the 1-THz UV bandwidth operation planned for OMEGA this year. In the second article J. A. Marozas and J. H. Kelly describe a recently developed code -- Waasese -- that simulates the collective behavior of the optical components in the SSD driver line. The measurable signatures predicted by the code greatly enhance the diagnostic capability of the SSD driver line. Other articles in this volume are titled: Hollow-Shell Implosion Studies on the 60-Beam, UV OMEGA Laser System; Simultaneous Measurements of Fuel Areal Density, Shell Areal Density, and Fuel Temperature in D3He-Filled Imploding Capsules; The Design of Optical Pulse Shapes with an Aperture-Coupled-Stripline Pulse-Shaping System; Measurement Technique for Characterization of Rapidly Time- and Frequency-Varying Electronic Devices; and Damage to Fused-Silica, Spatial-Filter Lenses on the OMEGA Laser System.

  17. Additions and improvements to the high energy density physics capabilities in the FLASH code

    NASA Astrophysics Data System (ADS)

    Lamb, D.; Bogale, A.; Feister, S.; Flocke, N.; Graziani, C.; Khiar, B.; Laune, J.; Tzeferacos, P.; Walker, C.; Weide, K.

    2017-10-01

    FLASH is an open-source, finite-volume Eulerian, spatially-adaptive radiation magnetohydrodynamics code that has the capabilities to treat a broad range of physical processes. FLASH performs well on a wide range of computer architectures, and has a broad user base. Extensive high energy density physics (HEDP) capabilities exist in FLASH, which make it a powerful open toolset for the academic HEDP community. We summarize these capabilities, emphasizing recent additions and improvements. We describe several non-ideal MHD capabilities that are being added to FLASH, including the Hall and Nernst effects, implicit resistivity, and a circuit model, which will allow modeling of Z-pinch experiments. We showcase the ability of FLASH to simulate Thomson scattering polarimetry, which measures Faraday rotation due to the presence of magnetic fields, as well as proton radiography, proton self-emission, and Thomson scattering diagnostics. Finally, we describe several collaborations with the academic HEDP community in which FLASH simulations were used to design and interpret HEDP experiments. This work was supported in part at U. Chicago by DOE NNSA ASC through the Argonne Institute for Computing in Science under FWP 57789; DOE NNSA under NLUF Grant DE-NA0002724; DOE SC OFES Grant DE-SC0016566; and NSF Grant PHY-1619573.

  18. PARAVT: Parallel Voronoi tessellation code

    NASA Astrophysics Data System (ADS)

    González, R. E.

    2016-10-01

    In this study, we present a new open source code for massively parallel computation of Voronoi tessellations (VT hereafter) in large data sets. The code is aimed at astrophysical applications, where VT densities and neighbors are widely used. There are several serial Voronoi tessellation codes; however, no open-source, parallel implementation is available to handle the large number of particles/galaxies in current N-body simulations and sky surveys. Parallelization is implemented under MPI, and the VT is computed using the Qhull library. Domain decomposition takes into account consistent boundary computation between tasks, and includes periodic conditions. In addition, the code computes the neighbors list, Voronoi density, Voronoi cell volume, and density gradient for each particle, as well as densities on a regular grid. Code implementation and user guide are publicly available at https://github.com/regonzar/paravt.
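
    A serial illustration of the per-particle quantities PARAVT computes (the code itself runs under MPI with the Qhull library) is sketched below using scipy's Qhull bindings; density is taken as the inverse of the Voronoi cell volume, and unbounded cells are skipped.

        import numpy as np
        from scipy.spatial import Voronoi, ConvexHull

        points = np.random.default_rng(2).random((200, 3))   # toy 3D particle positions
        vor = Voronoi(points)

        densities = np.full(len(points), np.nan)
        for i, region_index in enumerate(vor.point_region):
            region = vor.regions[region_index]
            if -1 in region or len(region) == 0:             # cell extends to infinity
                continue
            volume = ConvexHull(vor.vertices[region]).volume
            densities[i] = 1.0 / volume                      # Voronoi density estimate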

  19. Tobacco outlet density and converted versus native non-daily cigarette use in a national US sample

    PubMed Central

    Kirchner, Thomas R; Anesetti-Rothermel, Andrew; Bennett, Morgane; Gao, Hong; Carlos, Heather; Scheuermann, Taneisha S; Reitzel, Lorraine R; Ahluwalia, Jasjit S

    2017-01-01

    Objective Investigate whether non-daily smokers’ (NDS) cigarette price and purchase preferences, recent cessation attempts, and current intentions to quit are associated with the density of the retail cigarette product landscape surrounding their residential address. Participants Cross-sectional assessment of N=904 converted NDS (CNDS), who previously smoked every day, and N=297 native NDS (NNDS), who only smoked non-daily, drawn from a national panel. Outcome measures Kernel density estimation was used to generate a nationwide probability surface of tobacco outlets linked to participants’ residential ZIP code. Hierarchically nested log-linear models were compared to evaluate associations between outlet density, non-daily use patterns, price sensitivity and quit intentions. Results Overall, NDS in ZIP codes with greater outlet density were less likely than NDS in ZIP codes with lower outlet density to hold 6-month quit intentions when they also reported that price affected use patterns (G2=66.1, p<0.001) and purchase locations (G2=85.2, p<0.001). CNDS were more likely than NNDS to reside in ZIP codes with higher outlet density (G2=322.0, p<0.001). Compared with CNDS in ZIP codes with lower outlet density, CNDS in high-density ZIP codes were more likely to report that price influenced the amount they smoke (G2=43.9, p<0.001), and were more likely to look for better prices (G2=59.3, p<0.001). NDS residing in high-density ZIP codes were not more likely to report that price affected their cigarette brand choice compared with those in ZIP codes with lower density. Conclusions This paper provides initial evidence that the point-of-sale cigarette environment may be differentially associated with the maintenance of CNDS versus NNDS patterns. Future research should investigate how tobacco control efforts can be optimised to both promote cessation and curb the rising tide of non-daily smoking in the USA. PMID:26969172
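
    The kernel density estimation step described above can be pictured with the minimal sketch below: outlet coordinates are smoothed into a continuous density surface that is then sampled at residential locations. The coordinates are synthetic; the study itself used a nationwide outlet database linked to ZIP codes.

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(3)
        outlets = rng.normal(0, 1, size=(2, 500))        # (x, y) coordinates of tobacco outlets
        residences = rng.normal(0, 1, size=(2, 50))      # participant residential locations

        kde = gaussian_kde(outlets)                      # smooth outlet locations into a surface
        outlet_density_at_home = kde(residences)         # density value per participant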

  20. An embedded barcode for "connected" malaria rapid diagnostic tests.

    PubMed

    Scherr, Thomas F; Gupta, Sparsh; Wright, David W; Haselton, Frederick R

    2017-03-29

    Many countries are shifting their efforts from malaria control to disease elimination. New technologies will be necessary to meet the more stringent demands of elimination campaigns, including improved quality control of malaria diagnostic tests, as well as an improved means for communicating test results among field healthcare workers, test manufacturers, and national ministries of health. In this report, we describe and evaluate an embedded barcode within standard rapid diagnostic tests as one potential solution. This information-augmented diagnostic test operates on the familiar principles of traditional lateral flow assays and simply replaces the control line with a control grid patterned in the shape of a QR (quick response) code. After the test is processed, the QR code appears on both positive and negative tests. In this report we demonstrate how this multipurpose code can be used not only to fulfill the control line role of test validation, but also to embed test manufacturing details, serve as a trigger for image capture, enable registration for image analysis, and correct for lighting effects. An accompanying mobile phone application automatically captures an image of the test when the QR code is recognized, decodes the QR code, performs image processing to determine the concentration of the malarial biomarker histidine-rich protein 2 at the test line, and transmits the test results and QR code payload to a secure web portal. This approach blends automated, sub-nanomolar biomarker detection with near real-time reporting to provide quality assurance data that will help to achieve malaria elimination.

  1. The identification of unfolding facial expressions.

    PubMed

    Fiorentini, Chiara; Schmidt, Susanna; Viviani, Paolo

    2012-01-01

    We asked whether the identification of emotional facial expressions (FEs) involves the simultaneous perception of the facial configuration or the detection of emotion-specific diagnostic cues. We recorded at high speed (500 frames s-1) the unfolding of the FE in five actors, each expressing six emotions (anger, surprise, happiness, disgust, fear, sadness). Recordings were coded every 10 frames (20 ms of real time) with the Facial Action Coding System (FACS, Ekman et al 2002, Salt Lake City, UT: Research Nexus eBook) to identify the facial actions contributing to each expression, and their intensity changes over time. Recordings were shown in slow motion (1/20 of recording speed) to one hundred observers in a forced-choice identification task. Participants were asked to identify the emotion during the presentation as soon as they felt confident to do so. Responses were recorded along with the associated response times (RTs). The RT probability density functions for both correct and incorrect responses were correlated with the facial activity during the presentation. There were systematic correlations between facial activities, response probabilities, and RT peaks, and significant differences in RT distributions for correct and incorrect answers. The results show that a reliable response is possible long before the full FE configuration is reached. This suggests that identification is reached by integrating in time individual diagnostic facial actions, and does not require perceiving the full apex configuration.

  2. Novel analysis technique for measuring edge density fluctuation profiles with reflectometry in the Large Helical Device.

    PubMed

    Creely, A J; Ida, K; Yoshinuma, M; Tokuzawa, T; Tsujimura, T; Akiyama, T; Sakamoto, R; Emoto, M; Tanaka, K; Michael, C A

    2017-07-01

    A new method for measuring density fluctuation profiles near the edge of plasmas in the Large Helical Device (LHD) has been developed utilizing reflectometry combined with pellet-induced fast density scans. Reflectometer cutoff location was calculated by proportionally scaling the cutoff location calculated with fast far infrared laser interferometer (FIR) density profiles to match the slower time resolution results of the ray-tracing code LHD-GAUSS. Plasma velocity profile peaks generated with this reflectometer mapping were checked against velocity measurements made with charge exchange spectroscopy (CXS) and were found to agree within experimental uncertainty once diagnostic differences were accounted for. Measured density fluctuation profiles were found to peak strongly near the edge of the plasma, as is the case in most tokamaks. These measurements can be used in the future to inform inversion methods of phase contrast imaging (PCI) measurements. This result was confirmed with both a fixed frequency reflectometer and calibrated data from a multi-frequency comb reflectometer, and this method was applied successfully to a series of discharges. The full width at half maximum of the turbulence layer near the edge of the plasma was found to be only 1.5-3 cm on a series of LHD discharges, less than 5% of the normalized minor radius.

  3. Novel analysis technique for measuring edge density fluctuation profiles with reflectometry in the Large Helical Device

    NASA Astrophysics Data System (ADS)

    Creely, A. J.; Ida, K.; Yoshinuma, M.; Tokuzawa, T.; Tsujimura, T.; Akiyama, T.; Sakamoto, R.; Emoto, M.; Tanaka, K.; Michael, C. A.

    2017-07-01

    A new method for measuring density fluctuation profiles near the edge of plasmas in the Large Helical Device (LHD) has been developed utilizing reflectometry combined with pellet-induced fast density scans. Reflectometer cutoff location was calculated by proportionally scaling the cutoff location calculated with fast far infrared laser interferometer (FIR) density profiles to match the slower time resolution results of the ray-tracing code LHD-GAUSS. Plasma velocity profile peaks generated with this reflectometer mapping were checked against velocity measurements made with charge exchange spectroscopy (CXS) and were found to agree within experimental uncertainty once diagnostic differences were accounted for. Measured density fluctuation profiles were found to peak strongly near the edge of the plasma, as is the case in most tokamaks. These measurements can be used in the future to inform inversion methods of phase contrast imaging (PCI) measurements. This result was confirmed with both a fixed frequency reflectometer and calibrated data from a multi-frequency comb reflectometer, and this method was applied successfully to a series of discharges. The full width at half maximum of the turbulence layer near the edge of the plasma was found to be only 1.5-3 cm on a series of LHD discharges, less than 5% of the normalized minor radius.

  4. Community Alcohol Outlet Density and Underage Drinking

    PubMed Central

    Chen, Meng-Jinn; Grube, Joel W.; Gruenewald, Paul J.

    2009-01-01

    Aim This study examined how community alcohol outlet density may be associated with drinking among youths. Methods Longitudinal data were collected from 1091 adolescents (aged 14–16 at baseline) recruited from 50 zip codes in California with varying levels of alcohol outlet density and median household income. Hierarchical linear models were used to examine the associations between zip code alcohol outlet density and frequency rates of general alcohol use and excessive drinking, taking into account zip code median household income and individual-level variables (age, gender, race/ethnicity, personal income, mobility, and perceived drinking by parents and peers). Findings When all other factors were controlled, higher initial levels of drinking and excessive drinking were observed among youths residing in zip codes with higher alcohol outlet densities. Growth in drinking and excessive drinking was on average more rapid in zip codes with lower alcohol outlet densities. The relation of zip code alcohol outlet density with drinking appeared to be mitigated by having friends with access to a car. Conclusion Alcohol outlet density may play a significant role in initiation of underage drinking during early teen ages, especially when youths have limited mobility. Youth who reside in areas with low alcohol outlet density may overcome geographic constraints through social networks that increase their mobility and the ability to seek alcohol and drinking opportunities beyond the local community. PMID:20078485

  5. Design calculations for NIF convergent ablator experiments.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Callahan, Debra; Leeper, Ramon Joe; Spears, B. K.

    2010-11-01

    Design calculations for NIF convergent ablator experiments will be described. The convergent ablator experiments measure the implosion trajectory, velocity, and ablation rate of an x-ray driven capsule and are an important component of the U.S. National Ignition Campaign at NIF. The design calculations are post-processed to provide simulations of the key diagnostics: (1) Dante measurements of hohlraum x-ray flux and spectrum, (2) streaked radiographs of the imploding ablator shell, (3) wedge range filter measurements of D-He3 proton output spectra, and (4) GXD measurements of the imploded core. The simulated diagnostics will be compared to the experimental measurements to provide an assessment of the accuracy of the design code predictions of hohlraum radiation temperature, capsule ablation rate, implosion velocity, shock flash areal density, and x-ray bang time. Post-shot versions of the design calculations are used to enhance the understanding of the experimental measurements and will assist in choosing parameters for subsequent shots and the path towards optimal ignition capsule tuning.

  6. Estimation of Electron Bernstein Emission in the TJ-II Stellarator

    NASA Astrophysics Data System (ADS)

    García-Regaña, J. M.; Cappa, A.; Castejón, F.; Tereshchenko, M.

    2009-05-01

    In order to study experimentally the viability of first harmonic EBW heating in the TJ-II stellarator by means of the O-X-B technique [1], an EBE diagnostic was recently installed [2]. In the present work, a theoretical estimate of the EBW radiation in TJ-II plasmas has been carried out making use of the ray-tracing code TRUBA [3]. The line of sight of the EBE diagnostic may be modified using an internal movable mirror and therefore, for comparison with the experimental results, the theoretical O-X-B emission window has been determined. Experimental density and temperature profiles obtained in NBI discharges are considered in the simulations. References: [1] F. Castejon et al, Nucl. Fusion 48, 075011 (2008). [2] J. Caughman et al, Proc. 15th Joint Workshop on ECE and ECRH, Yosemite, USA (2008). [3] M. A. Tereshchenko et al, Proc. 30th EPS Conference on Contr. Fusion and Plasma Phys., 27A, P-1.18 (2003).

  7. Schlieren, Phase-Contrast, and Spectroscopy Diagnostics for the LBNL HIF Plasma Channel Experiment

    NASA Astrophysics Data System (ADS)

    Ponce, D. M.; Niemann, C.; Fessenden, T. J.; Leemans, W.; Vandersloot, K.; Dahlbacka, G.; Yu, S. S.; Sharp, W. M.; Tauschwitz, A.

    1999-11-01

    The LBNL Plasma Channel experiment has demonstrated stable 42-cm Z-pinch discharge plasma channels with peak currents in excess of 50 kA for a 7 torr nitrogen, 30 kV discharge. These channels offer the possibility of transporting heavy-ion beams for inertial fusion. We postulate that the stability of these channels resides in the existence of a neutral-gas density depression created by a pre-pulse discharge before the main capacitor bank discharge is initiated. Here, we present the results and experimental diagnostic setup used for the study of the pre-pulse and main bank channels. Observation of both the plasma and neutral gas dynamics is achieved. Schlieren, Zernike's phase-contrast, and spectroscopic techniques are used. Preliminary Schlieren results show a gas shockwave moving radially at a rate of ≈10^6 mm/s as a result of the fast and localized deposited energy during the evolution of the pre-pulse channel. These data will be used to validate simulation codes (BUCKY and CYCLOPS).

  8. Direct measurements and comparisons between deuterium and impurity rotation and density profiles in the H-mode steep gradient region on DIII-D

    NASA Astrophysics Data System (ADS)

    Haskey, S. R.; Grierson, B. A.; Chrystal, C.; Stagner, L.; Burrell, K.; Groebner, R. J.; Kaplan, D. H.; Nazikian, R.

    2016-10-01

    The recently commissioned edge deuterium charge exchange recombination (CER) spectroscopy diagnostic on DIII-D is providing direct measurements of the deuterium rotation, temperature, and density in H-mode pedestals. The deuterium temperature and temperature scale length can be 50% lower than the carbon measurement in the gradient region of the pedestal, indicating that the ion pedestal pressure can deviate significantly from that inferred from carbon CER. In addition, deuterium exhibits a larger toroidal rotation in the co-Ip direction near the separatrix compared with carbon. These differences are qualitatively consistent with theory-based models that identify thermal ion orbit loss across the separatrix as a source of intrinsic angular momentum. The first direct measurements of the deuterium density pedestal profile show an inward shift of the impurity pedestal compared with the main ions, validating neoclassical predictions from the XGC0 code. Work supported by the U.S. DOE under DE-FC02-04ER54698 and DE-AC02-09CH11466.

  9. Diagnostic Coding for Epilepsy.

    PubMed

    Williams, Korwyn; Nuwer, Marc R; Buchhalter, Jeffrey R

    2016-02-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  10. Two high-density recording methods with run-length limited turbo code for holographic data storage system

    NASA Astrophysics Data System (ADS)

    Nakamura, Yusuke; Hoshizawa, Taku

    2016-09-01

    Two methods for increasing the data capacity of a holographic data storage system (HDSS) were developed. The first method is called “run-length-limited (RLL) high-density recording”. An RLL modulation has the same effect as enlarging the pixel pitch; namely, it optically reduces the hologram size. Accordingly, the method doubles the raw-data recording density. The second method is called “RLL turbo signal processing”. The RLL turbo code consists of RLL(1, ∞) trellis modulation and an optimized convolutional code. The remarkable point of the developed turbo code is that it employs the RLL modulator and demodulator as parts of the error-correction process. The turbo code improves the capability of error correction more than a conventional LDPC code, even though interpixel interference is generated. Together, these two methods increase the data density 1.78-fold. Moreover, a data density of 2.4 Tbit/in² is confirmed by simulation and experiment.
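
    As an illustrative aside (not the authors' modulator), the RLL(1, ∞) constraint simply forbids two adjacent 1s in the recorded pixel sequence, which is what produces the enlarged effective pixel pitch described above. A minimal Python checker for that d = 1 constraint:

```python
# Sketch: verify that a binary sequence satisfies the RLL(1, inf) d=1 constraint,
# i.e. any two 1s are separated by at least one 0 (illustrative only).
def satisfies_rll_1_inf(bits):
    return all(not (a == 1 and b == 1) for a, b in zip(bits, bits[1:]))

print(satisfies_rll_1_inf([1, 0, 0, 1, 0, 1, 0]))   # True
print(satisfies_rll_1_inf([1, 1, 0, 0, 1, 0, 0]))   # False: adjacent 1s
```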

  11. Oblique shock structures formed during the ablation phase of aluminium wire array z-pinches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swadling, G. F.; Lebedev, S. V.; Niasse, N.

    A series of experiments has been conducted in order to investigate the azimuthal structures formed by the interactions of cylindrically converging plasma flows during the ablation phase of aluminium wire array z-pinch implosions. These experiments were carried out using the 1.4 MA, 240 ns MAGPIE generator at Imperial College London. The main diagnostic used in this study was a two-colour, end-on, Mach-Zehnder imaging interferometer, sensitive to the axially integrated electron density of the plasma. The data collected in these experiments reveal the strongly collisional dynamics of the aluminium ablation streams. The structure of the flows is dominated by a dense network of oblique shock fronts, formed by supersonic collisions between adjacent ablation streams. An estimate for the range of the flow Mach number (M = 6.2-9.2) has been made based on an analysis of the observed shock geometry. Combining this measurement with previously published Thomson scattering measurements of the plasma flow velocity by Harvey-Thompson et al. [Physics of Plasmas 19, 056303 (2012)] allowed us to place limits on the range of ZT_e of the plasma. The detailed and quantitative nature of the dataset lends itself well as a source for model validation and code verification exercises, as the exact shock geometry is sensitive to many of the plasma parameters. Comparison of electron density data produced through numerical modelling with the Gorgon 3D MHD code demonstrates that the code is able to reproduce the collisional dynamics observed in aluminium arrays reasonably well.

  12. Development and Validation of a Natural Language Processing Tool to Identify Patients Treated for Pneumonia across VA Emergency Departments.

    PubMed

    Jones, B E; South, B R; Shao, Y; Lu, C C; Leng, J; Sauer, B C; Gundlapalli, A V; Samore, M H; Zeng, Q

    2018-01-01

    Identifying pneumonia using diagnosis codes alone may be insufficient for research on clinical decision making. Natural language processing (NLP) may enable the inclusion of cases missed by diagnosis codes. This article (1) develops an NLP tool that identifies the clinical assertion of pneumonia from physician emergency department (ED) notes, and (2) compares classification methods using diagnosis codes versus NLP against a gold standard of manual chart review to identify patients initially treated for pneumonia. Among a national population of ED visits occurring between 2006 and 2012 across the Veterans Affairs health system, we extracted 811 physician documents containing search terms for pneumonia for training, and 100 random documents for validation. Two reviewers annotated span- and document-level classifications of the clinical assertion of pneumonia. An NLP tool using a support vector machine was trained on the enriched documents. We extracted diagnosis codes assigned in the ED and upon hospital discharge and calculated performance characteristics for diagnosis codes, NLP, and NLP plus diagnosis codes against manual review in training and validation sets. Among the training documents, 51% contained clinical assertions of pneumonia; in the validation set, 9% were classified with pneumonia, of which 100% contained pneumonia search terms. After enriching with search terms, the NLP system alone demonstrated a recall/sensitivity of 0.72 (training) and 0.55 (validation), and a precision/positive predictive value (PPV) of 0.89 (training) and 0.71 (validation). ED-assigned diagnostic codes demonstrated lower recall/sensitivity (0.48 and 0.44) but higher precision/PPV (0.95 in training, 1.0 in validation); the NLP system identified more "possible-treated" cases than diagnostic coding. An approach combining NLP and ED-assigned diagnostic coding classification achieved the best performance (sensitivity 0.89 and PPV 0.80). System-wide application of NLP to clinical text can increase capture of initial diagnostic hypotheses, an important inclusion when studying diagnosis and clinical decision-making under uncertainty. Schattauer GmbH Stuttgart.
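
    As an illustrative aside, the document-level classification step described here can be sketched with a generic bag-of-words support vector machine. This is not the authors' tool; scikit-learn, the toy `notes` list, and the labels are assumptions made for the example, and the metrics mirror the recall/sensitivity and precision/PPV reported above.

```python
# Minimal sketch of a document-level pneumonia classifier (hypothetical data,
# not the study's NLP system).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.metrics import precision_score, recall_score

notes = [
    "chest xray shows right lower lobe infiltrate, start ceftriaxone for pneumonia",
    "cough and wheeze, no infiltrate on imaging, likely viral bronchitis",
]
labels = [1, 0]  # 1 = clinical assertion of pneumonia per manual chart review

vectorizer = TfidfVectorizer(ngram_range=(1, 2))
X = vectorizer.fit_transform(notes)      # unigram/bigram bag-of-words features

clf = LinearSVC()                        # support vector machine, as in the abstract
clf.fit(X, labels)

pred = clf.predict(X)
print("recall/sensitivity:", recall_score(labels, pred))
print("precision/PPV:     ", precision_score(labels, pred))
```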

  13. Analysis of a tungsten sputtering experiment in DIII-D and code/data validation of high redeposition/reduced erosion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wampler, William R.; Brooks, J. N.; Elder, J. D.

    2015-03-29

    We analyze a DIII-D tokamak experiment where two tungsten spots on the removable DiMES divertor probe were exposed to 12 s of attached plasma conditions, with moderate strike point temperature and density (~20 eV, ~4.5 × 10^19 m^-3), and 3% carbon impurity content. Both very small (1 mm diameter) and small (1 cm diameter) deposited samples were used for assessing gross and net tungsten sputtering erosion. The analysis uses a 3-D erosion/redeposition code package (REDEP/WBC), with input from a diagnostic-calibrated near-surface plasma code (OEDGE), and with focus on charge-state-resolved impinging carbon ion flux and energy. The tungsten surfaces are primarily sputtered by the carbon, in charge states +1 to +4. We predict high redeposition (~75%) of sputtered tungsten on the 1 cm spot, with consequently reduced net erosion, and this agrees well with post-exposure DiMES probe RBS analysis data. As a result, this study and recent related work are encouraging for the erosion lifetime and non-contamination performance of tokamak reactor high-Z plasma facing components.

  14. Study of shock waves and related phenomena motivated by astrophysics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drake, R. P.; Keiter, P. A.; Kuranz, C. C.

    This study discusses the recent research in High-Energy-Density Physics at our Center. Our work in complex hydrodynamics is now focused on mode coupling in the Richtmyer-Meshkov process and on the supersonic Kelvin-Helmholtz instability. These processes are believed to occur in a wide range of astrophysical circumstances. In radiation hydrodynamics, we are studying radiative reverse shocks relevant to cataclysmic variable stars. Our work on magnetized flows seeks to produce magnetized jets and study their interactions. We build the targets for all these experiments, and simulate them using our CRASH code. We also conduct diagnostic research, focused primarily on imaging x-ray spectroscopy and its applications to scattering and fluorescence.

  15. Study of shock waves and related phenomena motivated by astrophysics

    DOE PAGES

    Drake, R. P.; Keiter, P. A.; Kuranz, C. C.; ...

    2016-04-01

    This study discusses the recent research in High-Energy-Density Physics at our Center. Our work in complex hydrodynamics is now focused on mode coupling in the Richtmyer-Meshkov process and on the supersonic Kelvin-Helmholtz instability. These processes are believed to occur in a wide range of astrophysical circumstances. In radiation hydrodynamics, we are studying radiative reverse shocks relevant to cataclysmic variable stars. Our work on magnetized flows seeks to produce magnetized jets and study their interactions. We build the targets for all these experiments, and simulate them using our CRASH code. We also conduct diagnostic research, focused primarily on imaging x-ray spectroscopy and its applications to scattering and fluorescence.

  16. 76 FR 49491 - Medicare Program; Section 3113: The Treatment of Certain Complex Diagnostic Laboratory Tests...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-10

    ...] Medicare Program; Section 3113: The Treatment of Certain Complex Diagnostic Laboratory Tests Demonstration... code under the Treatment of Certain Complex Diagnostic Laboratory Tests Demonstration. The deadline for... interested parties of an opportunity to participate in the Treatment of Certain Complex Diagnostic Laboratory...

  17. Phase-space dependent critical gradient behavior of fast-ion transport due to Alfvén eigenmodes

    DOE PAGES

    Collins, C. S.; Heidbrink, W. W.; Podestà, M.; ...

    2017-06-09

    Experiments in the DIII-D tokamak show that many overlapping small-amplitude Alfvén eigenmodes (AEs) cause fast-ion transport to sharply increase above a critical threshold, leading to fast-ion density profile resilience and reduced fusion performance. The threshold is above the AE linear stability limit and varies between diagnostics that are sensitive to different parts of fast-ion phase-space. A comparison with theoretical analysis using the NOVA and ORBIT codes shows that, for the neutral particle diagnostic, the threshold corresponds to the onset of stochastic particle orbits due to wave-particle resonances with AEs in the measured region of phase space. We manipulated the bulk fast-ion distribution and instability behavior through variations in beam deposition geometry, and no significant differences in the onset threshold outside of measurement uncertainties were found, in agreement with the theoretical stochastic threshold analysis. Simulations using the 'kick model' produce beam ion density gradients consistent with the empirically measured radial critical gradient and highlight the importance of including the energy and pitch dependence of the fast-ion distribution function in critical gradient models. The addition of electron cyclotron heating changes the types of AEs present in the experiment, comparatively increasing the measured fast-ion density and radial gradient. Our studies provide the basis for understanding how to avoid AE transport that can undesirably redistribute current and cause fast-ion losses, and the measurements are being used to validate AE-induced transport models that use the critical gradient paradigm, giving greater confidence when applied to ITER.

  18. Phase-space dependent critical gradient behavior of fast-ion transport due to Alfvén eigenmodes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, C. S.; Heidbrink, W. W.; Podestà, M.

    Experiments in the DIII-D tokamak show that many overlapping small-amplitude Alfvén eigenmodes (AEs) cause fast-ion transport to sharply increase above a critical threshold, leading to fast-ion density profile resilience and reduced fusion performance. The threshold is above the AE linear stability limit and varies between diagnostics that are sensitive to different parts of fast-ion phase-space. A comparison with theoretical analysis using the NOVA and ORBIT codes shows that, for the neutral particle diagnostic, the threshold corresponds to the onset of stochastic particle orbits due to wave-particle resonances with AEs in the measured region of phase space. We manipulated the bulk fast-ion distribution and instability behavior through variations in beam deposition geometry, and no significant differences in the onset threshold outside of measurement uncertainties were found, in agreement with the theoretical stochastic threshold analysis. Simulations using the 'kick model' produce beam ion density gradients consistent with the empirically measured radial critical gradient and highlight the importance of including the energy and pitch dependence of the fast-ion distribution function in critical gradient models. The addition of electron cyclotron heating changes the types of AEs present in the experiment, comparatively increasing the measured fast-ion density and radial gradient. Our studies provide the basis for understanding how to avoid AE transport that can undesirably redistribute current and cause fast-ion losses, and the measurements are being used to validate AE-induced transport models that use the critical gradient paradigm, giving greater confidence when applied to ITER.

  19. Throughput Optimization Via Adaptive MIMO Communications

    DTIC Science & Technology

    2006-05-30

    End-to-end MATLAB packet simulation platform. * Low-density parity-check code (LDPCC). * Field trials with Silvus DSP MIMO testbed. * High mobility... incorporate advanced LDPC (low-density parity-check) codes. Realizing that the power of LDPC codes comes at the price of decoder complexity, we also... Channel coding: binary convolutional code or LDPC; packet length: 0 to 2^16-1 bytes; coding rate: 1/2, 2/3, 3/4, 5/6; MIMO channel training length: 0-4 symbols.

  20. Comparison of a 3-D GPU-Assisted Maxwell Code and Ray Tracing for Reflectometry on ITER

    NASA Astrophysics Data System (ADS)

    Gady, Sarah; Kubota, Shigeyuki; Johnson, Irena

    2015-11-01

    Electromagnetic wave propagation and scattering in magnetized plasmas are the basis of important diagnostics for high-temperature plasmas. 1-D and 2-D full-wave codes are standard tools for measurements of the electron density profile and fluctuations; however, ray tracing results have shown that beam propagation in tokamak plasmas is inherently a 3-D problem. The GPU-Assisted Maxwell Code utilizes the FDTD (Finite-Difference Time-Domain) method for solving the Maxwell equations with the cold plasma approximation in a 3-D geometry. Parallel processing with GPGPU (General-Purpose computing on Graphics Processing Units) is used to accelerate the computation. Previously, we reported on initial comparisons of the code results to 1-D numerical and analytical solutions, where the size of the computational grid was limited by the on-board memory of the GPU. In the current study, this limitation is overcome by using domain decomposition and an additional GPU. As a practical application, this code is used to study the current design of the ITER Low Field Side Reflectometer (LSFR) for the Equatorial Port Plug 11 (EPP11). A detailed examination of Gaussian beam propagation in the ITER edge plasma will be presented, as well as comparisons with ray tracing. This work was made possible by funding from the Department of Energy for the Summer Undergraduate Laboratory Internship (SULI) program. This work is supported by the US DOE under Contract Nos. DE-AC02-09CH11466 and DE-FG02-99-ER54527.
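
    As a rough illustration of the FDTD cold-plasma approach (not the GPU-Assisted Maxwell Code itself), the sketch below advances a 1-D O-mode wave through an unmagnetized cold plasma by coupling Yee updates of E and B to a plasma current with dJ/dt = ε0 ωp² E. The grid, density ramp, and 30 GHz source are arbitrary choices; with the linear ramp used here, the wave reflects near the cell where the local plasma frequency reaches the wave frequency.

```python
# 1-D FDTD sketch: O-mode wave in an unmagnetized cold plasma (illustrative only).
import numpy as np

c, eps0, e, me = 3.0e8, 8.854e-12, 1.602e-19, 9.109e-31
nx, dx = 400, 1.0e-3              # 40 cm domain, 1 mm cells (arbitrary)
dt = 0.5 * dx / c                 # Courant-limited time step
f0 = 30e9                         # 30 GHz probing wave (arbitrary)

ne = np.linspace(0.0, 4.0e19, nx)          # linear density ramp, m^-3
wp2 = ne * e**2 / (eps0 * me)              # plasma frequency squared

Ey, Bz, Jy = np.zeros(nx), np.zeros(nx), np.zeros(nx)

for n in range(3000):
    Bz[:-1] -= dt / dx * (Ey[1:] - Ey[:-1])                  # Faraday's law
    Jy += dt * eps0 * wp2 * Ey                               # cold-plasma current
    Ey[1:] += -c**2 * dt / dx * (Bz[1:] - Bz[:-1]) - dt / eps0 * Jy[1:]  # Ampere's law
    Ey[1] += np.sin(2 * np.pi * f0 * n * dt)                 # hard source at the edge

cutoff = np.argmax(wp2 > (2 * np.pi * f0) ** 2)
print("cutoff layer near cell", cutoff, "; |Ey| beyond it:", np.abs(Ey[cutoff + 20:]).max())
```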

  1. Evaluating Open-Source Full-Text Search Engines for Matching ICD-10 Codes.

    PubMed

    Jurcău, Daniel-Alexandru; Stoicu-Tivadar, Vasile

    2016-01-01

    This research presents the results of evaluating multiple free, open-source engines on matching ICD-10 diagnostic codes via full-text searches. The study investigates what it takes to get an accurate match when searching for a specific diagnostic code. For each code, the evaluation starts by extracting the words that make up its text and continues with building full-text search queries from the combinations of these words. The queries are then run against all the ICD-10 codes until the search returns the code in question as the match with the highest relative score. This method identifies the minimum number of words that must be provided in order for the search engines to choose the desired entry. The engines analyzed include a popular Java-based full-text search engine, a lightweight engine written in JavaScript which can even execute in the user's browser, and two popular open-source relational database management systems.
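
    As an illustrative aside (not the engines evaluated above), the word-combination experiment can be sketched in a few lines of Python: for each code, try combinations of its descriptor words of increasing size until the code itself comes back as the top-scoring match. The three-entry dictionary and the scoring function are stand-ins for a real ICD-10 index and a real full-text engine.

```python
# Sketch of finding the minimum number of words needed to match an ICD-10 entry.
from itertools import combinations

icd10 = {   # toy subset; a real index has tens of thousands of entries
    "J18.9": "pneumonia unspecified organism",
    "J20.9": "acute bronchitis unspecified",
    "I10":   "essential primary hypertension",
}

def score(query_words, text):
    words = text.split()
    return sum(w in words for w in query_words) / len(words)   # crude relevance score

def min_words_to_match(code):
    words = icd10[code].split()
    for k in range(1, len(words) + 1):
        for combo in combinations(words, k):
            best = max(icd10, key=lambda c: score(combo, icd10[c]))
            if best == code:
                return k, combo
    return None

print(min_words_to_match("J18.9"))   # (1, ('pneumonia',)) for this toy index
```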

  2. Tobacco outlet density and converted versus native non-daily cigarette use in a national US sample.

    PubMed

    Kirchner, Thomas R; Anesetti-Rothermel, Andrew; Bennett, Morgane; Gao, Hong; Carlos, Heather; Scheuermann, Taneisha S; Reitzel, Lorraine R; Ahluwalia, Jasjit S

    2017-01-01

    Investigate whether non-daily smokers' (NDS) cigarette price and purchase preferences, recent cessation attempts, and current intentions to quit are associated with the density of the retail cigarette product landscape surrounding their residential address. Cross-sectional assessment of N=904 converted NDS (CNDS), who previously smoked every day, and N=297 native NDS (NNDS), who only smoked non-daily, drawn from a national panel. Kernel density estimation was used to generate a nationwide probability surface of tobacco outlets linked to participants' residential ZIP code. Hierarchically nested log-linear models were compared to evaluate associations between outlet density, non-daily use patterns, price sensitivity and quit intentions. Overall, NDS in ZIP codes with greater outlet density were less likely than NDS in ZIP codes with lower outlet density to hold 6-month quit intentions when they also reported that price affected use patterns (G² = 66.1, p<0.001) and purchase locations (G² = 85.2, p<0.001). CNDS were more likely than NNDS to reside in ZIP codes with higher outlet density (G² = 322.0, p<0.001). Compared with CNDS in ZIP codes with lower outlet density, CNDS in high-density ZIP codes were more likely to report that price influenced the amount they smoke (G² = 43.9, p<0.001), and were more likely to look for better prices (G² = 59.3, p<0.001). NDS residing in high-density ZIP codes were not more likely to report that price affected their cigarette brand choice compared with those in ZIP codes with lower density. This paper provides initial evidence that the point-of-sale cigarette environment may be differentially associated with the maintenance of CNDS versus NNDS patterns. Future research should investigate how tobacco control efforts can be optimised to both promote cessation and curb the rising tide of non-daily smoking in the USA. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
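
    As an illustrative aside, the nationwide outlet-density surface described above is, in essence, a two-dimensional kernel density estimate evaluated at residential locations. A minimal sketch with scipy and hypothetical coordinates (the bandwidth and all data below are arbitrary, not the study's):

```python
# Sketch of a kernel density surface over tobacco outlet locations (hypothetical data).
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
outlets = rng.normal(loc=[[10.0], [20.0]], scale=3.0, size=(2, 500))  # x/y in km

kde = gaussian_kde(outlets, bw_method=0.3)   # Gaussian kernel, arbitrary bandwidth

zip_centroids = np.array([[9.5, 30.0],       # x coordinates of two ZIP centroids
                          [19.0, 5.0]])      # y coordinates
print("relative outlet density at each ZIP centroid:", kde(zip_centroids))
```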

  3. Talbot-Lau x-ray deflectometer electron density diagnostic for laser and pulsed power high energy density plasma experiments (invited).

    PubMed

    Valdivia, M P; Stutman, D; Stoeckl, C; Mileham, C; Begishev, I A; Theobald, W; Bromage, J; Regan, S P; Klein, S R; Muñoz-Cordovez, G; Vescovi, M; Valenzuela-Villaseca, V; Veloso, F

    2016-11-01

    Talbot-Lau X-ray deflectometry (TXD) has been developed as an electron density diagnostic for High Energy Density (HED) plasmas. The technique can deliver x-ray refraction, attenuation, elemental composition, and scatter information from a single Moiré image. An 8 keV Talbot-Lau interferometer was deployed using laser and x-pinch backlighters. Grating survival and electron density mapping were demonstrated for 25-29 J, 8-30 ps laser pulses using copper foil targets. Moiré pattern formation and grating survival were also observed using a copper x-pinch driven at 400 kA, ∼1 kA/ns. These results demonstrate the potential of TXD as an electron density diagnostic for HED plasmas.

  4. Talbot-Lau X-ray Deflectometer electron density diagnostic for laser and pulsed power high energy density plasma experiments

    DOE PAGES

    Valdivia, M. P.; Stutman, D.; Stoeckl, C.; ...

    2016-04-21

    Talbot-Lau X-ray Deflectometry has been developed as an electron density diagnostic for High Energy Density plasmas. The technique can deliver x-ray refraction, attenuation, elemental composition, and scatter information from a single Moiré image. An 8 keV Talbot-Lau interferometer was deployed using laser and x-pinch backlighters. Grating survival and electron density mapping were demonstrated for 25-29 J, 8-30 ps laser pulses using copper foil targets. Moiré pattern formation and grating survival were also observed using a copper x-pinch driven at 400 kA, ~1 kA/ns. Lastly, these results demonstrate the potential of TXD as an electron density diagnostic for HED plasmas.

  5. The reliability of diagnostic coding and laboratory data to identify tuberculosis and nontuberculous mycobacterial disease among rheumatoid arthritis patients using anti-tumor necrosis factor therapy.

    PubMed

    Winthrop, Kevin L; Baxter, Roger; Liu, Liyan; McFarland, Bentson; Austin, Donald; Varley, Cara; Radcliffe, LeAnn; Suhler, Eric; Choi, Dongsoek; Herrinton, Lisa J

    2011-03-01

    Anti-tumor necrosis factor-alpha (anti-TNF) therapies are associated with severe mycobacterial infections in rheumatoid arthritis patients. We developed and validated electronic record search algorithms for these serious infections. The study used electronic clinical, microbiologic, and pharmacy records from Kaiser Permanente Northern California (KPNC) and the Portland Veterans Affairs Medical Center (PVAMC). We identified suspect tuberculosis and nontuberculous mycobacteria (NTM) cases using inpatient and outpatient diagnostic codes, culture results, and anti-tuberculous medication dispensing. We manually reviewed records to validate our case-finding algorithms. We identified 64 potential tuberculosis cases and 367 potential NTM cases. For tuberculosis, diagnostic code positive predictive value (PPV) was 54% at KPNC and 9% at PVAMC. Adding medication dispensings improved these to 87% and 46%, respectively. Positive tuberculosis cultures had a PPV of 100% with sensitivities of 79% (KPNC) and 55% (PVAMC). For NTM, the PPV of diagnostic codes was 91% (KPNC) and 76% (PVAMC). At KPNC, ≥1 positive NTM culture was sensitive (100%) and specific (PPV, 74%) if non-pathogenic species were excluded; at PVAMC, ≥1 positive NTM culture identified 76% of cases with a PPV of 41%. Application of the American Thoracic Society NTM microbiology criteria yielded the highest PPV (100% KPNC, 78% PVAMC). The sensitivity and predictive value of electronic microbiologic data for tuberculosis and NTM infections is generally high, but varies with different facilities or models of care. Unlike NTM, tuberculosis diagnostic codes have poor PPV, and in the absence of laboratory data, should be combined with anti-tuberculous therapy dispensings for pharmacoepidemiologic research. Copyright © 2010 John Wiley & Sons, Ltd.
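
    The validation arithmetic behind figures like these reduces to a confusion matrix against the chart-review gold standard. A minimal sketch with hypothetical patient IDs, showing how adding medication dispensings to diagnostic codes changes PPV and sensitivity:

```python
# Sketch of validating case-finding algorithms against chart review (hypothetical IDs).
confirmed_tb = {"p01", "p02", "p03", "p04", "p05"}          # gold standard: chart review
dx_code_hits = {"p01", "p02", "p07", "p08", "p09", "p10"}   # TB diagnostic codes alone
anti_tb_meds = {"p01", "p02", "p03", "p08"}                 # anti-tuberculous dispensings

def ppv_sensitivity(flagged, truth):
    tp = len(flagged & truth)
    ppv = tp / len(flagged) if flagged else float("nan")
    sensitivity = tp / len(truth)
    return ppv, sensitivity

print("codes alone:  PPV=%.2f sensitivity=%.2f" % ppv_sensitivity(dx_code_hits, confirmed_tb))
print("codes + meds: PPV=%.2f sensitivity=%.2f" % ppv_sensitivity(dx_code_hits & anti_tb_meds, confirmed_tb))
```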

  6. Investigation of ion and electron heat transport of high-Te ECH heated discharges in the large helical device

    DOE PAGES

    Pablant, N. A.; Satake, S.; Yokoyama, M.; ...

    2016-01-28

    An analysis of the radial electric field and heat transport, both for ions and electrons, is presented for a high-$T_{\text{e}}$ electron cyclotron heated (ECH) discharge on the large helical device (LHD). Transport analysis is done using the TASK3D transport suite utilizing experimentally measured profiles for both ions and electrons. Ion temperature and perpendicular flow profiles are measured using the recently installed x-ray imaging crystal spectrometer diagnostic (XICS), while electron temperature and density profiles are measured using Thomson scattering. The analysis also includes calculated ECH power deposition profiles as determined through the TRAVIS ray-tracing code. This is the first time on LHD that this type of integrated transport analysis with measured ion temperature profiles has been performed without NBI, allowing the heat transport properties of plasmas with only ECH heating to be more clearly examined. For this study, a plasma discharge is chosen which develops a high central electron temperature ($T_{\text{e0}} = 9$ keV) at moderately low densities ($n_{\text{e0}} = 1.5 \times 10^{19}$ m$^{-3}$). The experimentally determined transport properties from TASK3D are compared to neoclassical predictions as calculated by the GSRAKE and FORTEC-3D codes. The predicted electron fluxes are seen to be an order of magnitude less than the measured fluxes, indicating that electron transport is largely anomalous, while the neoclassical and measured ion heat fluxes are of the same magnitude. Neoclassical predictions of a strong positive ambipolar electric field ($E_{\text{r}}$) in the plasma core are validated through comparisons to perpendicular flow measurements from the XICS diagnostic. Furthermore, this provides confidence that the predictions are producing physically meaningful results for the particle fluxes and radial electric field, which are a key component in correctly predicting plasma confinement.

  7. Synthetic diagnostics platform for fusion plasmas (invited)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, L., E-mail: lshi@pppl.gov; Valeo, E. J.; Tobias, B. J.

    A Synthetic Diagnostics Platform (SDP) for fusion plasmas has been developed which provides state of the art synthetic reflectometry, beam emission spectroscopy, and Electron Cyclotron Emission (ECE) diagnostics. Interfaces to the plasma simulation codes GTC, XGC-1, GTS, and M3D-C1 are provided, enabling detailed validation of these codes. In this paper, we give an overview of SDP’s capabilities, and introduce the synthetic diagnostic modules. A recently developed synthetic ECE Imaging module which self-consistently includes refraction, diffraction, emission, and absorption effects is discussed in detail. Its capabilities are demonstrated on two model plasmas. The importance of synthetic diagnostics in validation is shown by applying the SDP to M3D-C1 output and comparing it with measurements from an edge harmonic oscillation mode on DIII-D.

  8. Synthetic diagnostics platform for fusion plasmas (invited)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, L.; Valeo, E. J.; Tobias, B. J.

    A Synthetic Diagnostics Platform (SDP) for fusion plasmas has been developed which provides state of the art synthetic reflectometry, beam emission spectroscopy, and Electron Cyclotron Emission (ECE) diagnostics. Interfaces to the plasma simulation codes GTC, XGC-1, GTS, and M3D-C1 are provided, enabling detailed validation of these codes. In this paper, we give an overview of SDP's capabilities, and introduce the synthetic diagnostic modules. A recently developed synthetic ECE Imaging module which self-consistently includes refraction, diffraction, emission, and absorption effects is discussed in detail. Its capabilities are demonstrated on two model plasmas. Finally, the importance of synthetic diagnostics in validation is shown by applying the SDP to M3D-C1 output and comparing it with measurements from an edge harmonic oscillation mode on DIII-D.

  9. Synthetic diagnostics platform for fusion plasmas (invited)

    DOE PAGES

    Shi, L.; Valeo, E. J.; Tobias, B. J.; ...

    2016-08-26

    A Synthetic Diagnostics Platform (SDP) for fusion plasmas has been developed which provides state of the art synthetic reflectometry, beam emission spectroscopy, and Electron Cyclotron Emission (ECE) diagnostics. Interfaces to the plasma simulation codes GTC, XGC-1, GTS, and M3D-C1 are provided, enabling detailed validation of these codes. In this paper, we give an overview of SDP's capabilities, and introduce the synthetic diagnostic modules. A recently developed synthetic ECE Imaging module which self-consistently includes refraction, diffraction, emission, and absorption effects is discussed in detail. Its capabilities are demonstrated on two model plasmas. Finally, the importance of synthetic diagnostics in validation is shown by applying the SDP to M3D-C1 output and comparing it with measurements from an edge harmonic oscillation mode on DIII-D.

  10. Validated methods for identifying tuberculosis patients in health administrative databases: systematic review.

    PubMed

    Ronald, L A; Ling, D I; FitzGerald, J M; Schwartzman, K; Bartlett-Esquilant, G; Boivin, J-F; Benedetti, A; Menzies, D

    2017-05-01

    An increasing number of studies are using health administrative databases for tuberculosis (TB) research. However, there are limitations to using such databases for identifying patients with TB. To summarise validated methods for identifying TB in health administrative databases. We conducted a systematic literature search in two databases (Ovid Medline and Embase, January 1980-January 2016). We limited the search to diagnostic accuracy studies assessing algorithms derived from drug prescription, International Classification of Diseases (ICD) diagnostic code and/or laboratory data for identifying patients with TB in health administrative databases. The search identified 2413 unique citations. Of the 40 full-text articles reviewed, we included 14 in our review. Algorithms and diagnostic accuracy outcomes to identify TB varied widely across studies, with positive predictive value ranging from 1.3% to 100% and sensitivity ranging from 20% to 100%. Diagnostic accuracy measures of algorithms using out-patient, in-patient and/or laboratory data to identify patients with TB in health administrative databases vary widely across studies. Use solely of ICD diagnostic codes to identify TB, particularly when using out-patient records, is likely to lead to incorrect estimates of case numbers, given the current limitations of ICD systems in coding TB.

  11. Method of Error Floor Mitigation in Low-Density Parity-Check Codes

    NASA Technical Reports Server (NTRS)

    Hamkins, Jon (Inventor)

    2014-01-01

    A digital communication decoding method for low-density parity-check coded messages. The decoding method decodes the low-density parity-check coded messages within a bipartite graph having check nodes and variable nodes. Messages from check nodes are partially hard limited, so that every message which would otherwise have a magnitude at or above a certain level is re-assigned to a maximum magnitude.
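
    As the abstract summarizes it, the key operation is a "partial hard limit" applied to check-to-variable messages: any message whose magnitude reaches a chosen level is re-assigned the maximum magnitude while keeping its sign. A minimal sketch of that single step (not of the full belief-propagation decoder; the threshold and maximum values are arbitrary):

```python
# Sketch of partial hard limiting of LDPC check-node messages (illustrative values).
import numpy as np

def partial_hard_limit(check_messages, threshold=8.0, max_mag=15.0):
    m = np.array(check_messages, dtype=float)   # copy so the caller's array is untouched
    big = np.abs(m) >= threshold                # messages at or above the limit level
    m[big] = np.sign(m[big]) * max_mag          # re-assign to the maximum magnitude
    return m

msgs = np.array([0.7, -3.2, 9.1, -12.4, 4.8])
print(partial_hard_limit(msgs))   # -> [  0.7  -3.2  15.  -15.    4.8]
```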

  12. Glaucoma Diagnostic Ability of the Optical Coherence Tomography Angiography Vessel Density Parameters.

    PubMed

    Chung, Jae Keun; Hwang, Young Hoon; Wi, Jae Min; Kim, Mijin; Jung, Jong Jin

    2017-11-01

    To investigate the glaucoma diagnostic abilities of vessel density parameters as determined by optical coherence tomography (OCT) angiography in different stages of glaucoma. A total of 113 healthy eyes and 140 glaucomatous eyes were enrolled. Diagnostic abilities of the OCT vessel density parameters in the optic nerve head (ONH), peripapillary, and macular regions were evaluated by calculating the areas under the receiver operating characteristic curves (AUCs). AUCs of the peripapillary vessel density parameters and circumpapillary retinal nerve fiber layer (RNFL) thickness were compared. OCT angiography vessel densities in the ONH, peripapillary, and macular regions in the glaucomatous eyes were significantly lower than those in the healthy eyes (P < 0.05). Among the vessel density parameters, the average peripapillary vessel density showed a higher AUC than the ONH and macular regions (AUCs: 0.807, 0.566, and 0.651, respectively) for glaucoma detection. The peripapillary vessel density parameters showed AUCs similar to those of the corresponding sectoral RNFL thickness (P > 0.05). However, in the early stage of glaucoma, the AUCs of the inferotemporal and temporal peripapillary vessel densities were significantly lower than that of the RNFL thickness (P < 0.05). The glaucomatous eyes showed decreased vessel density as determined by OCT angiography. Although the peripapillary vessel density parameters showed glaucoma diagnostic ability similar to that of circumpapillary RNFL thickness, in the early stage the vessel density parameters showed limited clinical value.
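
    The AUC figures quoted above follow from the usual rank-based (Mann-Whitney) construction: the AUC is the probability that a randomly chosen glaucomatous eye has a lower vessel density than a randomly chosen healthy eye. A short sketch with hypothetical density values:

```python
# Sketch of a rank-based ROC AUC for a vessel-density parameter (hypothetical values, %).
healthy  = [62.1, 60.4, 59.8, 63.0, 61.5]    # peripapillary vessel density, healthy eyes
glaucoma = [55.2, 57.9, 53.4, 60.0, 52.8]    # glaucomatous eyes, lower density expected

def auc_lower_indicates_disease(healthy_vals, disease_vals):
    # P(disease eye < healthy eye), counting ties as one half
    wins = sum((d < h) + 0.5 * (d == h) for d in disease_vals for h in healthy_vals)
    return wins / (len(disease_vals) * len(healthy_vals))

print("AUC:", auc_lower_indicates_disease(healthy, glaucoma))   # 0.96 for these values
```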

  13. Protecting Against Damage from Refraction of High Power Microwaves in the DIII-D Tokamak

    NASA Astrophysics Data System (ADS)

    Lohr, John; Brambila, Rigo; Cengher, Mirela; Chen, Xi; Gorelov, Yuri; Grosnickle, William; Moeller, Charles; Ponce, Dan; Prater, Ron; Torrezan, Antonio; Austin, Max; Doyle, Edward; Hu, Xing; Dormier, Calvin

    2017-07-01

    Several new protective systems are being installed on the DIII-D tokamak to increase the safety margins for plasma operations with injected ECH power at densities approaching cutoff. Inadvertent overdense operation has previously resulted in reflection of an rf beam back into a launcher, causing extensive arcing and melt damage on one waveguide line. Damage to microwave diagnostics, which are located on the same side of the tokamak as the ECH launchers, also has occurred. Developing a reliable microwave-based interlock to protect the many vulnerable systems in DIII-D has proved to be difficult. Therefore, multiple protective steps have been taken to reduce the risk of damage in the future. Among these is a density interlock generated by the plasma control system, with setpoint determined by the ECH operators based on rf beam trajectories and plasma parameters. Also installed are enhanced video monitoring of the launchers, and an ambient light monitor on each of the waveguide systems, along with a Langmuir probe at the mouth of each launcher. Versatile rf monitors, measuring forward and reflected power in addition to the mode content of the rf beams, have been installed as the last miter bends in each waveguide line. As these systems are characterized, they are being incorporated in the interlock chains, which enable the ECH injection permits. The diagnostics most susceptible to damage from the ECH waves have also been fitted with a variety of protective devices including stripline filters, thin resonant notch filters tuned to the 110 GHz injected microwave frequency, blazed grating filters and shutters. Calculations of rf beam trajectories in the plasmas are performed using the TORAY ray tracing code with input from kinetic profile diagnostics. Using these calculations, strike points for refracted beams on the vacuum vessel are calculated, which allows evaluation of the risk of damage to sensitive diagnostics and hardware.

  14. Protecting against damage from refraction of high power microwaves in the DIII-D tokamak

    DOE PAGES

    Lohr, John; Brambila, Rigo; Cengher, Mirela; ...

    2017-07-24

    Here, several new protective systems are being installed on the DIII-D tokamak to increase the safety margins for plasma operations with injected ECH power at densities approaching cutoff. Inadvertent overdense operation has previously resulted in reflection of an rf beam back into a launcher, causing extensive arcing and melt damage on one waveguide line. Damage to microwave diagnostics, which are located on the same side of the tokamak as the ECH launchers, also has occurred. Developing a reliable microwave-based interlock to protect the many vulnerable systems in DIII-D has proved to be difficult. Therefore, multiple protective steps have been taken to reduce the risk of damage in the future. Among these is a density interlock generated by the plasma control system, with setpoint determined by the ECH operators based on rf beam trajectories and plasma parameters. Also installed are enhanced video monitoring of the launchers, and an ambient light monitor on each of the waveguide systems, along with a Langmuir probe at the mouth of each launcher. Versatile rf monitors, measuring forward and reflected power in addition to the mode content of the rf beams, have been installed as the last miter bends in each waveguide line. As these systems are characterized, they are being incorporated in the interlock chains, which enable the ECH injection permits. The diagnostics most susceptible to damage from the ECH waves have also been fitted with a variety of protective devices including stripline filters, thin resonant notch filters tuned to the 110 GHz injected microwave frequency, blazed grating filters and shutters. Calculations of rf beam trajectories in the plasmas are performed using the TORAY ray tracing code with input from kinetic profile diagnostics. Using these calculations, strike points for refracted beams on the vacuum vessel are calculated, which allows evaluation of the risk of damage to sensitive diagnostics and hardware.

  15. Protecting against damage from refraction of high power microwaves in the DIII-D tokamak

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lohr, John; Brambila, Rigo; Cengher, Mirela

    Here, several new protective systems are being installed on the DIII-D tokamak to increase the safety margins for plasma operations with injected ECH power at densities approaching cutoff. Inadvertent overdense operation has previously resulted in reflection of an rf beam back into a launcher, causing extensive arcing and melt damage on one waveguide line. Damage to microwave diagnostics, which are located on the same side of the tokamak as the ECH launchers, also has occurred. Developing a reliable microwave-based interlock to protect the many vulnerable systems in DIII-D has proved to be difficult. Therefore, multiple protective steps have been taken to reduce the risk of damage in the future. Among these is a density interlock generated by the plasma control system, with setpoint determined by the ECH operators based on rf beam trajectories and plasma parameters. Also installed are enhanced video monitoring of the launchers, and an ambient light monitor on each of the waveguide systems, along with a Langmuir probe at the mouth of each launcher. Versatile rf monitors, measuring forward and reflected power in addition to the mode content of the rf beams, have been installed as the last miter bends in each waveguide line. As these systems are characterized, they are being incorporated in the interlock chains, which enable the ECH injection permits. The diagnostics most susceptible to damage from the ECH waves have also been fitted with a variety of protective devices including stripline filters, thin resonant notch filters tuned to the 110 GHz injected microwave frequency, blazed grating filters and shutters. Calculations of rf beam trajectories in the plasmas are performed using the TORAY ray tracing code with input from kinetic profile diagnostics. Using these calculations, strike points for refracted beams on the vacuum vessel are calculated, which allows evaluation of the risk of damage to sensitive diagnostics and hardware.

  16. Bootstrap imputation with a disease probability model minimized bias from misclassification due to administrative database codes.

    PubMed

    van Walraven, Carl

    2017-04-01

    Diagnostic codes used in administrative databases cause bias due to misclassification of patient disease status. It is unclear which methods minimize this bias. Serum creatinine measures were used to determine severe renal failure status in 50,074 hospitalized patients. The true prevalence of severe renal failure and its association with covariates were measured. These were compared to results for which renal failure status was determined using surrogate measures, including: (1) diagnostic codes; (2) categorization of probability estimates of renal failure determined from a previously validated model; or (3) bootstrap imputation of disease status using model-derived probability estimates. Bias in estimates of severe renal failure prevalence and its association with covariates was minimal when bootstrap methods were used to impute renal failure status from model-based probability estimates. In contrast, biases were extensive when renal failure status was determined using codes or methods in which the model-based condition probability was categorized. Bias due to misclassification from inaccurate diagnostic codes can be minimized using bootstrap methods to impute condition status from multivariable model-derived probability estimates. Copyright © 2017 Elsevier Inc. All rights reserved.
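
    As described above, the bias-minimizing approach draws each patient's condition status from the model's predicted probability inside every bootstrap replicate and then pools the replicate estimates. A minimal sketch with hypothetical probabilities (the Beta-distributed probabilities and replicate count are stand-ins, not the study's validated model):

```python
# Sketch of bootstrap imputation of disease status from model-based probabilities.
import numpy as np

rng = np.random.default_rng(1)
p_disease = rng.beta(0.5, 8.0, size=5000)     # hypothetical model-derived probabilities

def bootstrap_prevalence(probs, n_boot=200):
    estimates = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(probs), len(probs))     # resample patients
        status = rng.random(len(probs)) < probs[idx]      # impute status ~ Bernoulli(p)
        estimates.append(status.mean())
    return np.mean(estimates), np.percentile(estimates, [2.5, 97.5])

mean_prev, ci = bootstrap_prevalence(p_disease)
print("imputed prevalence: %.3f (95%% CI %.3f-%.3f)" % (mean_prev, ci[0], ci[1]))
```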

  17. Talbot-Lau x-ray interferometry for high energy density plasma diagnostic.

    PubMed

    Stutman, D; Finkenthal, M

    2011-11-01

    High resolution density diagnostics are difficult in high energy density laboratory plasma (HEDLP) experiments due to the scarcity of probes that can penetrate above-solid-density plasmas. Hard x-rays are one possible probe for such dense plasmas. We study the possibility of applying an x-ray method recently developed for medical imaging, differential phase-contrast with Talbot-Lau interferometers, to the diagnostic of electron density and small-scale hydrodynamic instabilities in HEDLP experiments. The Talbot method uses micro-periodic gratings to measure the refraction and ultra-small angle scatter of x-rays through an object and is attractive for HEDLP diagnostics due to its capability to work with incoherent and polychromatic x-ray sources such as the laser driven backlighters used for HEDLP radiography. Our paper studies the potential of the Talbot method for HEDLP diagnostics, its adaptation to the HEDLP environment, and its extension to high x-ray energy using micro-periodic mirrors. The analysis is illustrated with experimental results obtained using a laboratory Talbot interferometer. © 2011 American Institute of Physics

  18. X-ray lines as a density diagnostic in DT plasmas near 100x solid density

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bailey, D.S.

    1977-10-19

    The use of electron-impact-broadened resonance lines for near-term high-density diagnostics is discussed. In particular, the question of how to choose seed and pusher materials so as to obtain discernible broadening effects while maintaining line visibility is addressed.

  19. High bandwidth vapor density diagnostic system

    DOEpatents

    Globig, Michael A.; Story, Thomas W.

    1992-01-01

    A high bandwidth vapor density diagnostic system for measuring the density of an atomic vapor during one or more photoionization events. The system translates the measurements from a low-frequency region to a high-frequency, relatively noise-free region of the spectrum to provide an improved signal-to-noise ratio.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scime, Earl E.

    The magnitude and spatial dependence of the neutral density in magnetic confinement fusion experiments are key physical parameters, particularly in the plasma edge. Modeling codes require precise measurements of the neutral density to calculate charge-exchange power losses and drag forces on rotating plasmas. However, direct measurements of the neutral density are problematic. In this work, we proposed to construct a laser-based diagnostic capable of providing spatially resolved measurements of the neutral density in the edge plasma of the DIII-D tokamak. The diagnostic concept is based on two-photon absorption laser induced fluorescence (TALIF). By injecting two beams of 205 nm light (co or counter propagating), ground state hydrogen (or deuterium or tritium) can be excited from the n = 1 level to the n = 3 level at the location where the two beams intersect. Individually, the beams experience no absorption, and therefore have no difficulty penetrating even dense plasmas. After excitation, a fraction of the hydrogen atoms decay from the n = 3 level to the n = 2 level and emit photons at 656 nm (the Hα line). Calculations based on the results of previous TALIF experiments in magnetic fusion devices indicated that a laser pulse energy of approximately 3 mJ delivered in 5 ns would provide sufficient signal-to-noise for detection of the fluorescence. In collaboration with the DIII-D engineering staff and experts in plasma edge diagnostics for DIII-D from Oak Ridge National Laboratory (ORNL), WVU researchers designed a TALIF system capable of providing spatially resolved measurements of neutral deuterium densities in the DIII-D edge plasma. The laser systems were specified, purchased, and assembled at WVU. The TALIF system was tested on a low-power hydrogen discharge at WVU and the plan was to move the instrument to DIII-D for installation in collaboration with ORNL researchers. After budget cuts at DIII-D, the DIII-D facility declined to support installation on their tokamak. Instead, after a no-cost extension, the apparatus was moved to the University of Washington-Seattle and successfully tested on the HIT-SI3 spheromak experiment. As a result of this project, absolutely calibrated TALIF measurements of the neutral hydrogen and deuterium density were obtained in a helicon source and in a spheromak, designs were developed for installation of a TALIF system on a tokamak, and a new, xenon-based calibration scheme was proposed and demonstrated. The xenon-calibration scheme eliminates significant problems that were identified with the standard krypton calibration scheme.

  1. Photoionization and High Density Gas

    NASA Technical Reports Server (NTRS)

    Kallman, T.; Bautista, M.; White, Nicholas E. (Technical Monitor)

    2002-01-01

    We present results of calculations using the XSTAR version 2 computer code. This code is loosely based on the XSTAR v.1 code, which has been available for public use for some time. However, it represents an improvement and update in several major respects, including atomic data, code structure, user interface, and an improved physical description of ionization/excitation. In particular, it is now applicable to high-density situations in which significant excited atomic level populations are likely to occur. We describe the computational techniques and assumptions, and present sample runs with particular emphasis on high-density situations.

  2. Measuring diagnoses: ICD code accuracy.

    PubMed

    O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M

    2005-10-01

    To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Main error sources along the "patient trajectory" include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the "paper trail" include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways.

  3. Medical Surveillance Monthly Report (MSMR). Volume 17, Number 08, August 2010

    DTIC Science & Technology

    2010-08-01

    notifiable medical event reports that included diagnostic codes (ICD-9-CM) indicative of chlamydia, gonorrhea, syphilis, herpes simplex virus (HSV... infections of interest for this report. Results: Condition / diagnostic codes: Chlamydia 099.41, 099.5; Gonorrhea 098; Herpes simplex (HSV) 054; Human... housing arrangements may also play roles and offer opportunities for targeted prevention.6 Human papillomavirus (HPV), the cause of genital warts

  4. Diagnostic labels assigned to patients with orthopedic conditions and the influence of the label on selection of interventions: a qualitative study of orthopaedic clinical specialists.

    PubMed

    Miller-Spoto, Marcia; Gombatto, Sara P

    2014-06-01

    A variety of diagnostic classification systems are used by physical therapists, but little information about how therapists assign diagnostic labels and how the labels are used to direct intervention is available. The purposes of this study were: (1) to examine the diagnostic labels assigned to patient problems by physical therapists who are board-certified Orthopaedic Clinical Specialists (OCSs) and (2) to determine whether the label influences selection of interventions. A cross-sectional survey was conducted. Two written cases were developed for patients with low back and shoulder pain. A survey was used to evaluate the diagnostic label assigned and the interventions considered important for each case. The cases and survey were sent to therapists who are board-certified OCSs. Respondents assigned a diagnostic label and rated the importance of intervention categories for each case. Each diagnostic label was coded based on the construct it represented. Percentage responses for each diagnostic label code and intervention category were calculated. Relative importance of intervention category based on diagnostic label was examined. For the low back pain and shoulder pain cases, respectively, "Combination" (48.5%, 34.9%) and "Pathology/Pathophysiology" (32.7%, 57.3%) diagnostic labels were most common. Strengthening (85.9%, 98.1%), stretching (86.8%, 84.9%), neuromuscular re-education (87.6%, 93.4%), functional training (91.4%, 88.6%), and mobilization/manipulation (85.1%, 86.8%) were considered the most important interventions. Relative importance of interventions did not differ based on diagnostic label (χ² = 0.050-1.263, P = .261-.824). The low response rate may limit the generalizability of the findings. Also, examples provided for labels may have influenced responses, and some of the label codes may have represented overlapping constructs. There is little consistency with which OCS therapists assign diagnostic labels, and the label does not seem to influence selection of interventions. © 2014 American Physical Therapy Association.

  5. Molecular Diagnostics of the Internal Motions of Massive Cores

    NASA Astrophysics Data System (ADS)

    Pineda, Jorge; Velusamy, T.; Goldsmith, P.; Li, D.; Peng, R.; Langer, W.

    2009-12-01

    We present models of the internal kinematics of massive cores in the Orion molecular cloud. We use a sample of cores studied by Velusamy et al. (2008) that show red, blue, and no asymmetry in their HCO+ line profiles in equal proportion, and which therefore may represent a sample of cores in different kinematic states. We use the radiative transfer code RATRAN (Hogerheijde & van der Tak 2000) to model several transitions of HCO+ and H13CO+, as well as the dust continuum emission, of a spherical model cloud with radial density, temperature, and velocity gradients. We find that excitation and velocity gradients are prerequisites for reproducing the observed line profiles. We use the dust continuum emission to constrain the density and temperature gradients. This allows us to narrow down the functional forms of the velocity gradient, giving us the opportunity to test several theoretical predictions of velocity gradients produced by the effect of magnetic fields (e.g. Tassis et al. 2007) and turbulence (e.g. Vázquez-Semadeni et al. 2007).

  6. Studies of supersonic, radiative plasma jet interaction with gases at the Prague Asterix Laser System facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nicolaie, Ph.; Stenz, C.; Tikhonchuk, V.

    2008-08-15

    The interaction of laser driven jets with gas puffs at various pressures is investigated experimentally and is analyzed by means of numerical tools. In the experiment, a combination of two complementary diagnostics allowed the main structures in the interaction zone to be characterized. By changing the gas composition and its density, the plasma cooling time can be controlled and one can pass from a quasi-adiabatic outflow to a strongly radiatively cooled jet. This tuning yields hydrodynamic structures very similar to those seen in astrophysical objects: the bow shock propagating through the gas, the shocked materials, the contact discontinuity, and the Mach disk. From a dimensional analysis, a scaling is made between both systems that shows the relevance of the study for the jet velocity, the Mach number, the jet-gas density ratio, and the dissipative processes. The use of a two-dimensional radiation hydrodynamic code confirms the previous analysis and provides the detailed structure of the interaction zone and the energy repartition between the jet and surrounding gases.

  7. Object-oriented controlled-vocabulary translator using TRANSOFT + HyperPAD.

    PubMed

    Moore, G W; Berman, J J

    1991-01-01

    Automated coding of surgical pathology reports is demonstrated. This public-domain translation software operates on surgical pathology files, extracting diagnoses and assigning codes in a controlled medical vocabulary, such as SNOMED. Context-sensitive translation algorithms are employed, and syntactically correct diagnostic items are produced that are matched with controlled vocabulary. English-language surgical pathology reports, accessioned over one year at the Baltimore Veterans Affairs Medical Center, were translated. With an interface to a larger hospital information system, all natural language pathology reports are automatically rendered as topography and morphology codes. This translator frees the pathologist from the time-intensive task of personally coding each report, and may be used to flag certain diagnostic categories that require specific quality assurance actions.

  8. Object-oriented controlled-vocabulary translator using TRANSOFT + HyperPAD.

    PubMed Central

    Moore, G. W.; Berman, J. J.

    1991-01-01

    Automated coding of surgical pathology reports is demonstrated. This public-domain translation software operates on surgical pathology files, extracting diagnoses and assigning codes in a controlled medical vocabulary, such as SNOMED. Context-sensitive translation algorithms are employed, and syntactically correct diagnostic items are produced that are matched with controlled vocabulary. English-language surgical pathology reports, accessioned over one year at the Baltimore Veterans Affairs Medical Center, were translated. With an interface to a larger hospital information system, all natural language pathology reports are automatically rendered as topography and morphology codes. This translator frees the pathologist from the time-intensive task of personally coding each report, and may be used to flag certain diagnostic categories that require specific quality assurance actions. PMID:1807773

  9. Iterative decoding of SOVA and LDPC product code for bit-patterned media recording

    NASA Astrophysics Data System (ADS)

    Jeong, Seongkwon; Lee, Jaejin

    2018-05-01

    The demand for high-density storage systems has increased due to the exponential growth of data. Bit-patterned media recording (BPMR) is one of the promising technologies to achieve densities of 1 Tbit/in2 and higher. To increase the areal density in BPMR, the spacing between islands needs to be reduced, yet this aggravates inter-symbol interference and inter-track interference and degrades the bit error rate performance. In this paper, we propose a decision feedback scheme using low-density parity check (LDPC) product code for BPMR. This scheme can improve the decoding performance using an iterative approach that exchanges extrinsic information and log-likelihood ratio values between an iterative soft output Viterbi algorithm and the LDPC product code. Simulation results show that the proposed LDPC product code can offer 1.8 dB and 2.3 dB gains over a single LDPC code at densities of 2.5 and 3 Tb/in2, respectively, when the bit error rate is 10^-6.

  10. 40 CFR 1033.110 - Emission diagnostics-general requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Emission diagnostics-general requirements. 1033.110 Section 1033.110 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED... engine operation. (d) Record and store in computer memory any diagnostic trouble codes showing a...

  11. Talbot-Lau x-ray deflectometer electron density diagnostic for laser and pulsed power high energy density plasma experiments (invited)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valdivia, M. P., E-mail: mpvaldivia@pha.jhu.edu; Stutman, D.; Stoeckl, C.

    2016-11-15

    Talbot-Lau X-ray deflectometry (TXD) has been developed as an electron density diagnostic for High Energy Density (HED) plasmas. The technique can deliver x-ray refraction, attenuation, elemental composition, and scatter information from a single Moiré image. An 8 keV Talbot-Lau interferometer was deployed using laser and x-pinch backlighters. Grating survival and electron density mapping were demonstrated for 25–29 J, 8–30 ps laser pulses using copper foil targets. Moiré pattern formation and grating survival were also observed using a copper x-pinch driven at 400 kA, ∼1 kA/ns. These results demonstrate the potential of TXD as an electron density diagnostic for HED plasmas.

  12. Diagnostic Concordance between DSM-5 and ICD-10 Cannabis Use Disorders.

    PubMed

    Proctor, Steven L; Williams, Daniel C; Kopak, Albert M; Voluse, Andrew C; Connolly, Kevin M; Hoffmann, Norman G

    2016-07-01

    With the recent federal mandate that all U.S. health care settings transition to ICD-10 billing codes, empirical evidence is necessary to determine if the DSM-5 designations map to their respective ICD-10 diagnostic categories/billing codes. The present study examined the concordance between DSM-5 and ICD-10 cannabis use disorder diagnoses. Data were derived from routine clinical assessments of 6871 male and 801 female inmates recently admitted to a state prison system from 2000 to 2003. DSM-5 and ICD-10 diagnostic determinations were made from algorithms corresponding to the respective diagnostic formulations. Past 12-month prevalence rates of cannabis use disorders were comparable across classification systems. The vast majority of inmates with no DSM-5 diagnosis continued to have no diagnosis per the ICD-10, and a similar proportion with a DSM-5 severe diagnosis received an ICD-10 dependence diagnosis. Most of the variation in diagnostic classifications was accounted for by those with a DSM-5 moderate diagnosis in that approximately half of these cases received an ICD-10 dependence diagnosis while the remaining cases received a harmful use diagnosis. Although there appears to be a generally high level of agreement between diagnostic classification systems for those with no diagnosis or those evincing symptoms of a more severe condition, concordance between DSM-5 moderate and ICD-10 dependence diagnoses was poor. Additional research is warranted to determine the appropriateness and implications of the current DSM-5 coding guidelines regarding the assignment of an ICD-10 dependence code for those with a DSM-5 moderate diagnosis. Copyright © 2016 Elsevier Ltd. All rights reserved.
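    The sketch below illustrates the kind of algorithmic mapping and cross-tabulation such a concordance study relies on. The DSM-5 severity bands follow the published criterion counts (2-3 mild, 4-5 moderate, 6 or more severe), while the ICD-10-style rule and the toy sample are simplifying assumptions, not the study's actual algorithms or data.

```python
from collections import Counter

def dsm5_severity(n_criteria):
    """DSM-5 cannabis use disorder severity from the number of criteria met (0-11)."""
    if n_criteria >= 6: return "severe"
    if n_criteria >= 4: return "moderate"
    if n_criteria >= 2: return "mild"
    return "none"

def icd10_category(n_dependence_criteria, harmful_pattern):
    """Simplified ICD-10-style assignment (illustrative, not the study's algorithm)."""
    if n_dependence_criteria >= 3: return "dependence"
    if harmful_pattern:            return "harmful use"
    return "none"

# Cross-tabulate the two classifications over a toy sample of
# (DSM-5 criteria met, ICD-10 dependence criteria met, harmful-use pattern) tuples.
sample = [(0, 0, False), (3, 2, True), (5, 3, False), (7, 5, True)]
table = Counter((dsm5_severity(c), icd10_category(d, h)) for c, d, h in sample)
for cell, count in table.items():
    print(cell, count)
```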

  13. A study of tungsten spectra using large helical device and compact electron beam ion trap in NIFS

    NASA Astrophysics Data System (ADS)

    Morita, S.; Dong, C. F.; Goto, M.; Kato, D.; Murakami, I.; Sakaue, H. A.; Hasuo, M.; Koike, F.; Nakamura, N.; Oishi, T.; Sasaki, A.; Wang, E. H.

    2013-07-01

    Tungsten spectra have been observed from the Large Helical Device (LHD) and the Compact electron Beam Ion Trap (CoBIT) in wavelength ranges from the visible to the EUV. The EUV spectra with unresolved transition arrays (UTA), e.g., 6g-4f, 5g-4f, 5f-4d and 5p-4d transitions of W24+-W33+, measured from LHD plasmas are compared with those measured from CoBIT with a monoenergetic electron beam (≤2 keV). The tungsten spectra from LHD are well analyzed based on the knowledge from the CoBIT tungsten spectra. A C-R model code has been developed to explain the UTA spectra in detail. Radial profiles of EUV spectra from highly ionized tungsten ions have been measured and analyzed with an impurity transport simulation code using the ADPAK atomic database to examine the ionization balance determined by the ionization and recombination rate coefficients. As a first trial, an analysis of the tungsten density in LHD plasmas is attempted from the radial profile of the Zn-like WXLV (W44+) 4p-4s transition at 60.9 Å based on the emission rate coefficient calculated with the HULLAC code. As a result, a total tungsten ion density of 3.5×10^10 cm^-3 at the plasma center is reasonably obtained. In order to observe spectra from tungsten ions in lower-ionized charge stages, which can give useful information on the tungsten influx in fusion plasmas, the ablation cloud of an impurity pellet is directly measured with visible spectroscopy. Many spectra from neutral and singly ionized tungsten are observed and some of them are identified. A magnetic forbidden line from highly ionized tungsten ions has been examined, and the Cd-like WXXVII (W26+) line at 3893.7 Å is identified as the ground-term fine-structure transition 4f^2 3H5-3H4. The possibility of α-particle diagnostics in D-T burning plasmas using the magnetic forbidden line is discussed.
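    As a rough illustration of how a charge-state density can be inferred from a line-integrated brightness and an emission rate coefficient, the back-of-envelope sketch below assumes emission from a uniform layer. All numbers are placeholders, and the real LHD analysis uses measured radial profiles and HULLAC rate coefficients, so this shows only the structure of the estimate, not the published calculation.

```python
import math

def impurity_density(brightness_ph_cm2_s_sr, n_e_cm3, pec_cm3_s, path_length_cm):
    """Rough charge-state density from a line-integrated brightness, assuming the
    emission comes from a uniform layer of length L:  B = n_e * n_z * PEC * L / (4*pi)."""
    return 4.0 * math.pi * brightness_ph_cm2_s_sr / (n_e_cm3 * pec_cm3_s * path_length_cm)

# Illustrative numbers only (not LHD measurements):
n_w44 = impurity_density(brightness_ph_cm2_s_sr=1e13,
                         n_e_cm3=2e13, pec_cm3_s=1e-10, path_length_cm=100.0)
n_w_total = n_w44 / 0.2   # divide by an assumed fractional abundance of W44+
print(f"W44+ density ~ {n_w44:.2e} cm^-3, total W density ~ {n_w_total:.2e} cm^-3")
```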

  14. The Study of Cardiovascular Health Outcomes in the Era of Claims Data: The Cardiovascular Health Study

    PubMed Central

    Psaty, Bruce M; Delaney, Joseph A; Arnold, Alice M; Curtis, Lesley H; Fitzpatrick, Annette L; Heckbert, Susan R; McKnight, Barbara; Ives, Diane; Gottdiener, John S; Kuller, Lewis H; Longstreth, W T

    2015-01-01

    Background Increasingly, the diagnostic codes from administrative claims data are being used as clinical outcomes. Methods and Results Data from the Cardiovascular Health Study (CHS) were used to compare event rates and risk-factor associations between adjudicated hospitalized cardiovascular events and claims-based methods of defining events. The outcomes of myocardial infarction (MI), stroke, and heart failure (HF) were defined in three ways: 1) the CHS adjudicated event (CHS[adj]); 2) selected ICD9 diagnostic codes only in the primary position for Medicare claims data from the Center for Medicare and Medicaid Services (CMS[1st]); and 3) the same selected diagnostic codes in any position (CMS[any]). Conventional claims-based methods of defining events had high positive predictive values (PPVs) but low sensitivities. For instance, the PPV of an ICD9 code of 410.x1 for a new acute MI in the first position was 90.6%, but this code identified only 53.8% of incident MIs. The observed event rates were low. For MI, the incidence was 14.9 events per 1000 person-years for CHS[adj] MI, 8.6 for CMS[1st] and 12.2 for CMS[any]. In general, CVD risk factor associations were similar across the three methods of defining events. Indeed, traditional CVD risk factors were also associated with all first hospitalizations not due to an MI. Conclusions The use of diagnostic codes from claims data as clinical events, especially when restricted to primary diagnoses, leads to an underestimation of event rates. Additionally, claims-based events data represent a composite endpoint that includes the outcome of interest and selected (misclassified) non-event hospitalizations. PMID:26538580

  15. Study of Cardiovascular Health Outcomes in the Era of Claims Data: The Cardiovascular Health Study.

    PubMed

    Psaty, Bruce M; Delaney, Joseph A; Arnold, Alice M; Curtis, Lesley H; Fitzpatrick, Annette L; Heckbert, Susan R; McKnight, Barbara; Ives, Diane; Gottdiener, John S; Kuller, Lewis H; Longstreth, W T

    2016-01-12

    Increasingly, the diagnostic codes from administrative claims data are being used as clinical outcomes. Data from the Cardiovascular Health Study (CHS) were used to compare event rates and risk factor associations between adjudicated hospitalized cardiovascular events and claims-based methods of defining events. The outcomes of myocardial infarction (MI), stroke, and heart failure were defined in 3 ways: the CHS adjudicated event (CHS[adj]), selected International Classification of Diseases, Ninth Edition diagnostic codes only in the primary position for Medicare claims data from the Center for Medicare & Medicaid Services (CMS[1st]), and the same selected diagnostic codes in any position (CMS[any]). Conventional claims-based methods of defining events had high positive predictive values but low sensitivities. For instance, the positive predictive value of International Classification of Diseases, Ninth Edition code 410.x1 for a new acute MI in the first position was 90.6%, but this code identified only 53.8% of incident MIs. The observed event rates for CMS[1st] were low. For MI, the incidence was 14.9 events per 1000 person-years for CHS[adj] MI, 8.6 for CMS[1st] MI, and 12.2 for CMS[any] MI. In general, cardiovascular disease risk factor associations were similar across the 3 methods of defining events. Indeed, traditional cardiovascular disease risk factors were also associated with all first hospitalizations not resulting from an MI. The use of diagnostic codes from claims data as clinical events, especially when restricted to primary diagnoses, leads to an underestimation of event rates. Additionally, claims-based events data represent a composite end point that includes the outcome of interest and selected (misclassified) nonevent hospitalizations. © 2015 American Heart Association, Inc.
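    The comparison against adjudicated events reduces to a standard positive-predictive-value and sensitivity calculation; a minimal sketch follows, with toy indicator arrays in place of the CHS data.

```python
def claims_vs_adjudicated(adjudicated, claims_flagged):
    """Positive predictive value and sensitivity of a claims-based event definition,
    treating adjudicated events as the reference standard."""
    tp = sum(a and c for a, c in zip(adjudicated, claims_flagged))
    fp = sum((not a) and c for a, c in zip(adjudicated, claims_flagged))
    fn = sum(a and (not c) for a, c in zip(adjudicated, claims_flagged))
    ppv = tp / (tp + fp) if tp + fp else float("nan")
    sensitivity = tp / (tp + fn) if tp + fn else float("nan")
    return ppv, sensitivity

# Toy example: 1 = event present, 0 = absent (values are illustrative, not CHS data)
adjudicated    = [1, 1, 1, 0, 0, 1, 0, 1]
claims_primary = [1, 0, 1, 0, 0, 0, 0, 1]
print(claims_vs_adjudicated(adjudicated, claims_primary))
```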

  16. Assessment of Literature Related to Combustion Appliance Venting Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rapp, V. H.; Less, B. D.; Singer, B. C.

    In many residential building retrofit programs, air tightening to increase energy efficiency is often constrained by safety concerns with naturally vented combustion appliances. Tighter residential buildings more readily depressurize when exhaust equipment is operated, making combustion appliances more prone to backdraft or spill combustion exhaust into the living space. Several measures, such as installation guidelines, vent sizing codes, and combustion safety diagnostics, are in place with the intent to prevent backdrafting and combustion spillage, but the diagnostics conflict with one another and their risk-mitigation objectives are inconsistent. This literature review summarizes the metrics and diagnostics used to assess combustion safety, documents their technical basis, and investigates their risk mitigations. It compiles information from the following: codes for combustion appliance venting and installation; standards and guidelines for combustion safety diagnostics; research evaluating combustion safety diagnostics; research investigating wind effects on building depressurization and venting; and software for simulating vent system performance.

  17. Laser-plasma interaction experiments and diagnostics at NRL (Naval Research Laboratory). Memorandum report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ripin, B.H.; Grun, J.; Herbst, M.J.

    Laser plasma interaction experiments have now advanced to the point where very quantitative measurements are required to elucidate the physics issues important for laser fusion and other applications. Detailed time-resolved knowledge of the plasma density, temperature, velocity gradients, spatial structure, heat flow characteristics, radiation emission, etc., is needed over tremendous ranges of plasma density and temperature. Moreover, the time scales are very short, aggravating the difficulty of the measurements further. Nonetheless, such substantial progress has been made in diagnostic development during the past few years that we are now able to perform well-diagnosed experiments. In this paper the authors review recent diagnostic developments for laser-plasma interactions, outline their regimes of applicability, and show examples of their utility. In addition to diagnostics for the high densities and temperatures characteristic of laser fusion physics studies, diagnostics designed to study the two-stream interactions of laser-created plasma flowing through an ambient low-density plasma will be described.

  18. Augmenting epidemiological models with point-of-care diagnostics data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pullum, Laura L.; Ramanathan, Arvind; Nutaro, James J.

    Although adoption of newer Point-of-Care (POC) diagnostics is increasing, there is a significant challenge using POC diagnostics data to improve epidemiological models. In this work, we propose a method to process zip-code level POC datasets and apply these processed data to calibrate an epidemiological model. We specifically develop a calibration algorithm using simulated annealing and calibrate a parsimonious equation-based model of modified Susceptible-Infected-Recovered (SIR) dynamics. The results show that parsimonious models are remarkably effective in predicting the dynamics observed in the number of infected patients and our calibration algorithm is sufficiently capable of predicting peak loads observed in POC diagnostics data while staying within reasonable and empirical parameter ranges reported in the literature. Additionally, we explore the future use of the calibrated values by testing the correlation between peak load and population density from Census data. Our results show that linearity assumptions for the relationships among various factors can be misleading; therefore, further data sources and analysis are needed to identify relationships between additional parameters and existing calibrated ones. As a result, calibration approaches such as ours can determine the values of newly added parameters along with existing ones and enable policy-makers to make better multi-scale decisions.

  19. Augmenting epidemiological models with point-of-care diagnostics data

    DOE PAGES

    Pullum, Laura L.; Ramanathan, Arvind; Nutaro, James J.; ...

    2016-04-20

    Although adoption of newer Point-of-Care (POC) diagnostics is increasing, there is a significant challenge using POC diagnostics data to improve epidemiological models. In this work, we propose a method to process zip-code level POC datasets and apply these processed data to calibrate an epidemiological model. We specifically develop a calibration algorithm using simulated annealing and calibrate a parsimonious equation-based model of modified Susceptible-Infected-Recovered (SIR) dynamics. The results show that parsimonious models are remarkably effective in predicting the dynamics observed in the number of infected patients and our calibration algorithm is sufficiently capable of predicting peak loads observed in POC diagnostics data while staying within reasonable and empirical parameter ranges reported in the literature. Additionally, we explore the future use of the calibrated values by testing the correlation between peak load and population density from Census data. Our results show that linearity assumptions for the relationships among various factors can be misleading; therefore, further data sources and analysis are needed to identify relationships between additional parameters and existing calibrated ones. As a result, calibration approaches such as ours can determine the values of newly added parameters along with existing ones and enable policy-makers to make better multi-scale decisions.
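    A minimal sketch of the calibration idea, simulated annealing over the parameters of a simple SIR model to match an observed infected time series, is given below. The model, parameter ranges, and synthetic "observed" data are illustrative assumptions and do not reproduce the authors' modified SIR formulation or their POC datasets.

```python
import math
import random

def sir_infected(beta, gamma, s0, i0, days, dt=0.1):
    """Integrate a simple SIR model (fractions of the population) and return
    the infected fraction sampled once per day."""
    s, i = s0, i0
    out = []
    steps_per_day = int(round(1.0 / dt))
    for _ in range(days):
        for _ in range(steps_per_day):
            ds = -beta * s * i
            di = beta * s * i - gamma * i
            s, i = s + ds * dt, i + di * dt
        out.append(i)
    return out

def sse(params, observed):
    beta, gamma = params
    model = sir_infected(beta, gamma, s0=0.99, i0=0.01, days=len(observed))
    return sum((m - o) ** 2 for m, o in zip(model, observed))

def anneal(observed, start=(0.5, 0.2), t0=1.0, cooling=0.95, iters=2000):
    """Calibrate (beta, gamma) to observed infected fractions by simulated annealing."""
    current, best = start, start
    e_cur = e_best = sse(start, observed)
    t = t0
    for _ in range(iters):
        cand = tuple(max(1e-3, p + random.gauss(0, 0.05)) for p in current)
        e_cand = sse(cand, observed)
        if e_cand < e_cur or random.random() < math.exp(-(e_cand - e_cur) / t):
            current, e_cur = cand, e_cand
            if e_cur < e_best:
                best, e_best = current, e_cur
        t *= cooling
    return best, e_best

# Synthetic "POC-like" daily infected fractions (illustrative, not real POC data)
observed = sir_infected(beta=0.6, gamma=0.25, s0=0.99, i0=0.01, days=30)
print(anneal(observed))
```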

  20. Coding in Stroke and Other Cerebrovascular Diseases.

    PubMed

    Korb, Pearce J; Jones, William

    2017-02-01

    Accurate coding is critical for clinical practice and research. Ongoing changes to diagnostic and billing codes require the clinician to stay abreast of coding updates. Payment for health care services, data sets for health services research, and reporting for medical quality improvement all require accurate administrative coding. This article provides an overview of coding principles for patients with strokes and other cerebrovascular diseases and includes an illustrative case as a review of coding principles in a patient with acute stroke.

  1. Quantum Kronecker sum-product low-density parity-check codes with finite rate

    NASA Astrophysics Data System (ADS)

    Kovalev, Alexey A.; Pryadko, Leonid P.

    2013-07-01

    We introduce an ansatz for quantum codes which gives the hypergraph-product (generalized toric) codes by Tillich and Zémor and generalized bicycle codes by MacKay as limiting cases. The construction allows for both the lower and the upper bounds on the minimum distance; they scale as a square root of the block length. Many thus defined codes have a finite rate and limited-weight stabilizer generators, an analog of classical low-density parity-check (LDPC) codes. Compared to the hypergraph-product codes, hyperbicycle codes generally have a wider range of parameters; in particular, they can have a higher rate while preserving the estimated error threshold.

  2. Measuring Diagnoses: ICD Code Accuracy

    PubMed Central

    O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M

    2005-01-01

    Objective To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. Data Sources/Study Setting The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. Study Design/Methods We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Principal Findings Main error sources along the “patient trajectory” include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the “paper trail” include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. Conclusions By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways. PMID:16178999

  3. Powerloads on the front end components and the duct of the heating and diagnostic neutral beam lines at ITER

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, M. J.; Boilson, D.; Hemsworth, R. S.

    2015-04-08

    The heating and current drive beam lines (HNB) at ITER are expected to deliver ∼16.7 MW power per beam line for H beams at 870 keV and D beams at 1 MeV during the H-He and the DD/DT phases of ITER operation respectively. On the other hand the diagnostic neutral beam (DNB) line shall deliver ∼2 MW power for H beams at 100 keV during both the phases. The path lengths over which the beams from the HNB and DNB beam lines need to be transported are 25.6 m and 20.7 m respectively. The transport of the beams over these path lengths results in beam losses, mainly by the direct interception of the beam with the beam line components and reionisation. The lost power is deposited on the surfaces of the various components of the beam line. In order to ensure the survival of these components over the operational life time of ITER, it is important to determine to the best possible extent the operational power loads and power densities on the various surfaces which are impacted by the beam in one way or the other during its transport. The main factors contributing to these are the divergence of the beamlets and the halo fraction in the beam, the beam aiming, the horizontal and vertical misalignment of the beam, and the gas profile along the beam path, which determines the re-ionisation loss, and the re-ionisation cross sections. The estimations have been made using a combination of the modified version of the Monte Carlo Gas Flow code (MCGF) and the BTR code. The MCGF is used to determine the gas profile in the beam line and takes into account the active gas feed into the ion source and neutraliser, the HNB-DNB cross over, the gas entering the beamline from the ITER machine, the additional gas atoms generated in the beam line due to impacting ions and the pumping speed of the cryopumps. The BTR code has been used to obtain the power loads and the power densities on the various surfaces of the front end components and the duct modules for different scenarios of ITER operation. The gas profile and the magnetic field distribution for each scenario have been considered in these evaluations. The worst case power loads and power densities for each surface have been used to study their thermo-mechanical behaviour and manufacturing feasibility. The details of these calculations and results obtained are presented and discussed.

  4. Validation of gyrokinetic simulations with measurements of electron temperature fluctuations and density-temperature phase angles on ASDEX Upgrade

    NASA Astrophysics Data System (ADS)

    Freethy, S. J.; Görler, T.; Creely, A. J.; Conway, G. D.; Denk, S. S.; Happel, T.; Koenen, C.; Hennequin, P.; White, A. E.; ASDEX Upgrade Team

    2018-05-01

    Measurements of turbulent electron temperature fluctuation amplitudes, δTe⊥/Te, frequency spectra, and radial correlation lengths, Lr(Te⊥), have been performed at ASDEX Upgrade using a newly upgraded Correlation ECE diagnostic in the range of scales k⊥ < 1.4 cm^-1, kr < 3.5 cm^-1 (k⊥ρs < 0.28 and krρs < 0.7). The phase angle between turbulent temperature and density fluctuations, αnT, has also been measured by using an ECE radiometer coupled to a reflectometer along the same line of sight. These quantities are used simultaneously to constrain a set of ion-scale non-linear gyrokinetic turbulence simulations of the outer core (ρtor = 0.75) of a low density, electron heated L-mode plasma, performed using the gyrokinetic simulation code, GENE. The ion and electron temperature gradients were scanned within uncertainties. It is found that gyrokinetic simulations are able to match simultaneously the electron and ion heat flux at this radius within the experimental uncertainties. The simulations were performed based on a reference discharge for which δTe⊥/Te measurements were available, and Lr(Te⊥) and αnT were then predicted using synthetic diagnostics prior to measurements in a repeat discharge. While temperature fluctuation amplitudes are overestimated by >50% for all simulations within the sensitivity scans performed, good quantitative agreement is found for Lr(Te⊥) and αnT. A validation metric is used to quantify the level of agreement of individual simulations with experimental measurements, and the best agreement is found close to the experimental gradient values.
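    Validation metrics of this kind typically reduce to a distance between simulated and measured observables normalized by the combined uncertainties; the sketch below shows one such form, with made-up numbers. The exact metric used in the paper may differ.

```python
import math

def agreement_metric(sim, exp, sigma_sim, sigma_exp):
    """Distance between simulated and measured values for several observables,
    normalized by the combined uncertainties, mapped to [0, 1) where values near 1 mean poor agreement."""
    d = 0.0
    for s, e, ss, se in zip(sim, exp, sigma_sim, sigma_exp):
        d += (s - e) ** 2 / (ss ** 2 + se ** 2)
    d /= len(sim)
    return d / (1.0 + d)   # saturating map so very poor agreement tends to 1

# Illustrative observables: [dTe/Te (%), Lr(Te) (cm), alpha_nT (deg)] -- numbers are made up
sim   = [1.8, 0.95, -95.0]
exp   = [1.2, 1.00, -100.0]
s_sim = [0.2, 0.10, 10.0]
s_exp = [0.2, 0.15, 10.0]
print(agreement_metric(sim, exp, s_sim, s_exp))
```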

  5. [Quality management and strategic consequences of assessing documentation and coding under the German Diagnostic Related Groups system].

    PubMed

    Schnabel, M; Mann, D; Efe, T; Schrappe, M; V Garrel, T; Gotzen, L; Schaeg, M

    2004-10-01

    The introduction of the German Diagnostic Related Groups (D-DRG) system requires redesigning administrative patient management strategies. Wrong coding leads to inaccurate grouping and endangers the reimbursement of treatment costs. This situation emphasizes the roles of documentation and coding as factors of economical success. The aims of this study were to assess the quantity and quality of initial documentation and coding (ICD-10 and OPS-301) and find operative strategies to improve efficiency and strategic means to ensure optimal documentation and coding quality. In a prospective study, documentation and coding quality were evaluated in a standardized way by weekly assessment. Clinical data from 1385 inpatients were processed for initial correctness and quality of documentation and coding. Principal diagnoses were found to be accurate in 82.7% of cases, inexact in 7.1%, and wrong in 10.1%. Effects on financial returns occurred in 16%. Based on these findings, an optimized, interdisciplinary, and multiprofessional workflow on medical documentation, coding, and data control was developed. Workflow incorporating regular assessment of documentation and coding quality is required by the DRG system to ensure efficient accounting of hospital services. Interdisciplinary and multiprofessional cooperation is recognized to be an important factor in establishing an efficient workflow in medical documentation and coding.

  6. OEDGE modeling for the planned tungsten ring experiment on DIII-D

    DOE PAGES

    Elder, J. David; Stangeby, Peter C.; Abrams, Tyler W.; ...

    2017-04-19

    The OEDGE code is used to model tungsten erosion and transport for DIII-D experiments with toroidal rings of high-Z metal tiles. Such modeling is needed for both experimental and diagnostic design to have estimates of the expected core and edge tungsten density and to understand the various factors contributing to the uncertainties in these calculations. OEDGE simulations are performed using the planned experimental magnetic geometries and plasma conditions typical of both L-mode and inter-ELM H-mode discharges in DIII-D. OEDGE plasma reconstruction based on specific representative discharges for similar geometries is used to determine the plasma conditions applied to tungsten plasma impurity simulations. We developed a new model for tungsten erosion in OEDGE which imports charge-state resolved carbon impurity fluxes and impact energies from a separate OEDGE run which models the carbon production, transport and deposition for the same plasma conditions as the tungsten simulations. Furthermore, these values are then used to calculate the gross tungsten physical sputtering due to carbon plasma impurities, which is then added to any sputtering by deuterium ions; tungsten self-sputtering is also included. The code results are found to be dependent on the following factors: divertor geometry and closure, the choice of cross-field anomalous transport coefficients, divertor plasma conditions (affecting both tungsten source strength and transport), the choice of tungsten atomic physics data used in the model (in particular sviz(Te) for W-atoms), and the model of the carbon flux and energy used for calculating the tungsten source due to sputtering. The core tungsten density is found to be of order 10^15 m^-3 (excluding effects of any core transport barrier and with significant variability depending on the other factors mentioned) with the density decaying into the scrape-off layer.
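    The gross tungsten source from impurity sputtering is essentially a sum over incident charge states of flux times an energy-dependent yield. The sketch below shows that structure only; the yield curve, fluxes, and impact-energy scaling are placeholders rather than the OEDGE model or Eckstein yield fits.

```python
def gross_w_source(impurity_fluxes, impact_energies, yield_fn):
    """Gross tungsten physical sputtering flux [m^-2 s^-1] summed over incident
    charge states: sum_q Gamma_q * Y(E_q)."""
    return sum(flux * yield_fn(e) for flux, e in zip(impurity_fluxes, impact_energies))

def toy_yield(energy_ev, e_threshold=45.0):
    """Placeholder sputtering-yield curve with a threshold (not an Eckstein fit)."""
    if energy_ev <= e_threshold:
        return 0.0
    return 1e-2 * (1.0 - (e_threshold / energy_ev) ** 2)

# Charge-state-resolved carbon fluxes [m^-2 s^-1] and impact energies [eV] (illustrative;
# a sheath-acceleration scaling of roughly 2*Ti + 3*q*Te is assumed for the energies)
c_fluxes   = [1e21, 5e20, 2e20, 5e19]     # C1+ .. C4+
c_energies = [60.0, 110.0, 160.0, 210.0]
print(f"Gross W source ~ {gross_w_source(c_fluxes, c_energies, toy_yield):.2e} m^-2 s^-1")
```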

  7. Development and application of a ray-tracing code integrating with 3D equilibrium mapping in LHD ECH experiments

    NASA Astrophysics Data System (ADS)

    Tsujimura, T., Ii; Kubo, S.; Takahashi, H.; Makino, R.; Seki, R.; Yoshimura, Y.; Igami, H.; Shimozuma, T.; Ida, K.; Suzuki, C.; Emoto, M.; Yokoyama, M.; Kobayashi, T.; Moon, C.; Nagaoka, K.; Osakabe, M.; Kobayashi, S.; Ito, S.; Mizuno, Y.; Okada, K.; Ejiri, A.; Mutoh, T.

    2015-11-01

    The central electron temperature has successfully reached up to 7.5 keV in large helical device (LHD) plasmas with a central high-ion temperature of 5 keV and a central electron density of 1.3×10^19 m^-3. This result was obtained by heating with a newly installed 154 GHz gyrotron and also by optimising the injection geometry in electron cyclotron heating (ECH). The optimisation was carried out by using the ray-tracing code ‘LHDGauss’, which was upgraded to include rapid post-processing of the three-dimensional (3D) equilibrium mapping obtained from experiments. For ray-tracing calculations, LHDGauss can automatically read the relevant data registered in the LHD database after a discharge, such as ECH injection settings (e.g. Gaussian beam parameters, target positions, polarisation and ECH power) and Thomson scattering diagnostic data along with the 3D equilibrium mapping data. The equilibrium map of the electron density and temperature profiles is then extrapolated into the region outside the last closed flux surface. Mode purity, or the ratio between the ordinary mode and the extraordinary mode, is obtained by calculating the 1D full-wave equation along the direction of the rays from the antenna to the absorption target point. Using the virtual magnetic flux surfaces, the effects of the modelled density profiles and the magnetic shear at the peripheral region with a given polarisation are taken into account. Power deposition profiles calculated for each Thomson scattering measurement timing are registered in the LHD database. The adjustment of the injection settings for the desired deposition profile from the feedback provided on a shot-by-shot basis resulted in an effective experimental procedure.

  8. Study regarding the density evolution of messages and the characteristic functions associated of a LDPC code

    NASA Astrophysics Data System (ADS)

    Drăghici, S.; Proştean, O.; Răduca, E.; Haţiegan, C.; Hălălae, I.; Pădureanu, I.; Nedeloni, M.; (Barboni Haţiegan, L.

    2017-01-01

    In this paper, a method is shown for associating a set of characteristic functions with an LDPC code, together with functions that represent the density evolution of the messages passed along the edges of a Tanner graph. Graphic representations of the density evolution are shown, and the study and simulation of the likelihood threshold, which sets the asymptotic boundary below which codes are decodable, were carried out using MathCad V14 software.
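    For the special case of a binary erasure channel, the message "density" collapses to a single erasure probability, which makes the density-evolution recursion and its decoding threshold easy to sketch. The example below uses the textbook (3,6)-regular ensemble; it illustrates the concept rather than the characteristic-function formulation of the paper.

```python
def bec_density_evolution(epsilon, dv=3, dc=6, iters=5000):
    """Track the erasure probability of variable-to-check messages for a
    (dv, dc)-regular LDPC ensemble on the binary erasure channel."""
    x = epsilon
    for _ in range(iters):
        x = epsilon * (1.0 - (1.0 - x) ** (dc - 1)) ** (dv - 1)
    return x

def bec_threshold(dv=3, dc=6, tol=1e-4):
    """Bisect for the largest channel erasure rate that still decodes to ~zero."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if bec_density_evolution(mid, dv, dc) < 1e-6:
            lo = mid
        else:
            hi = mid
    return lo

# Prints a value close to the known (3,6)-regular BEC threshold of about 0.429
print(f"(3,6)-regular BEC threshold ~ {bec_threshold():.3f}")
```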

  9. Low Density Parity Check Codes: Bandwidth Efficient Channel Coding

    NASA Technical Reports Server (NTRS)

    Fong, Wai; Lin, Shu; Maki, Gary; Yeh, Pen-Shu

    2003-01-01

    Low Density Parity Check (LDPC) Codes provide near-Shannon-capacity performance for NASA missions. These codes have high coding rates R = 0.82 and 0.875 with moderate code lengths, n = 4096 and 8176. Their decoders have inherently parallel structures which allow for high-speed implementation. Two codes based on Euclidean Geometry (EG) were selected for flight ASIC implementation. These codes are cyclic and quasi-cyclic in nature and therefore have a simple encoder structure. This results in power and size benefits. These codes also have a large minimum distance, as much as dmin = 65, giving them powerful error-correcting capabilities and very low error floors. This paper will present the development of the LDPC flight encoder and decoder, its applications and status.

  10. Data integration of structured and unstructured sources for assigning clinical codes to patient stays

    PubMed Central

    Luyckx, Kim; Luyten, Léon; Daelemans, Walter; Van den Bulcke, Tim

    2016-01-01

    Objective Enormous amounts of healthcare data are becoming increasingly accessible through the large-scale adoption of electronic health records. In this work, structured and unstructured (textual) data are combined to assign clinical diagnostic and procedural codes (specifically ICD-9-CM) to patient stays. We investigate whether integrating these heterogeneous data types improves prediction strength compared to using the data types in isolation. Methods Two separate data integration approaches were evaluated. Early data integration combines features of several sources within a single model, and late data integration learns a separate model per data source and combines these predictions with a meta-learner. This is evaluated on data sources and clinical codes from a broad set of medical specialties. Results When compared with the best individual prediction source, late data integration leads to improvements in predictive power (eg, overall F-measure increased from 30.6% to 38.3% for International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) diagnostic codes), while early data integration is less consistent. The predictive strength strongly differs between medical specialties, both for ICD-9-CM diagnostic and procedural codes. Discussion Structured data provides complementary information to unstructured data (and vice versa) for predicting ICD-9-CM codes. This can be captured most effectively by the proposed late data integration approach. Conclusions We demonstrated that models using multiple electronic health record data sources systematically outperform models using data sources in isolation in the task of predicting ICD-9-CM codes over a broad range of medical specialties. PMID:26316458
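    The two integration schemes can be sketched in a few lines: early integration concatenates the feature sets into one model, while late integration trains one model per source plus a meta-learner on their predictions. The features, labels, and logistic-regression models below are placeholders, not the study's actual feature extraction or learners.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy pre-extracted features for one ICD-9-CM code (binary label): "structured" data
# (e.g. lab/medication indicators) and "unstructured" data (e.g. text features).
# All data here are random placeholders, purely to illustrate the two integration schemes.
n = 400
y = rng.integers(0, 2, n)
X_struct = rng.normal(size=(n, 5)) + y[:, None] * 0.8
X_text   = rng.normal(size=(n, 20)) + y[:, None] * 0.3
train, test = slice(0, 300), slice(300, n)

# Early integration: concatenate feature sets into a single model.
early = LogisticRegression(max_iter=1000).fit(
    np.hstack([X_struct, X_text])[train], y[train])

# Late integration: one model per source, then a meta-learner on their predictions.
m_struct = LogisticRegression(max_iter=1000).fit(X_struct[train], y[train])
m_text   = LogisticRegression(max_iter=1000).fit(X_text[train], y[train])
meta_X = np.column_stack([m_struct.predict_proba(X_struct)[:, 1],
                          m_text.predict_proba(X_text)[:, 1]])
meta = LogisticRegression().fit(meta_X[train], y[train])

print("early :", early.score(np.hstack([X_struct, X_text])[test], y[test]))
print("late  :", meta.score(meta_X[test], y[test]))
```

    Note that, for brevity, the meta-learner here is trained on in-sample base-model predictions; in practice out-of-fold predictions would normally be used to avoid optimistic stacking.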

  11. First measurements of Dα spectrum produced by anisotropic fast ions in the gas dynamic trap

    NASA Astrophysics Data System (ADS)

    Lizunov, A.; Anikeev, A.

    2014-11-01

    Angled injection of eight deuterium beams in gas dynamic trap (GDT) plasmas builds up a population of fast ions whose distribution function conserves a high degree of the initial anisotropy in space, energy, and pitch angle. Unlike the Maxwellian case, the fast ion plasma component in GDT cannot be exhaustively characterized by a temperature and density. The instrumentation complex for studying fast ions comprises a motional Stark effect diagnostic, analyzers of charge-exchange atoms, and other instruments. The set of numerical codes used for equilibrium modeling is also an important analysis tool. In the recent campaign of summer 2014, we recorded the first signals from the new fast ion D-alpha diagnostic on GDT. This paper presents the diagnostic description and the results of pilot measurements. The diagnostic has four lines of sight, distributed across the radius of an axially symmetric plasma column in GDT. In the present setup, a line-integrated optical signal is measured in each channel. In the transverse direction, the spatial resolution is 18 mm. Collected light is sent to a grating spectrometer with a low-noise detector based on a charge-coupled device matrix. In the regime of four spectra stacked vertically on the sensor, the effective spectral resolution of the measurements is approximately 0.015 nm. Exposure timing is provided by a fast optical ferroelectric crystal shutter, allowing frames of duration down to 70 μs. This number represents the time resolution of the measurements. A large dynamic range of the camera permits measurement of relatively small light signals produced by fast ions on top of the bright background emission from the bulk plasma. The fast ion emission has a non-Gaussian spectrum featuring a characteristic width of approximately 4 nm, which can be separated from the relatively narrow Gaussian lines of D-alpha and H-alpha coming from the plasma periphery and from diagnostic beam emission. The signal to noise ratio varies from approximately ten for the central channel to approximately five for the outermost channel. We used a special set of Monte Carlo codes to fit the measured spectra. The shape of the model fit shows good agreement with the experimental fast ion D-alpha spectrum.

  12. 2D imaging X-ray diagnostic for measuring the current density distribution in a wide-area electron beam produced in a multiaperture diode with plasma cathode

    NASA Astrophysics Data System (ADS)

    Kurkuchekov, V.; Kandaurov, I.; Trunev, Y.

    2018-05-01

    A simple and inexpensive X-ray diagnostic tool was designed for measuring the cross-sectional current density distribution in a low-relativistic pulsed electron beam produced in a source based on an arc-discharge plasma cathode and multiaperture diode-type electron optical system. The beam parameters were as follows: Uacc = 50–110 kV, Ibeam = 20–100 A, τbeam = 0.1–0.3 ms. The beam effective diameter was ca. 7 cm. Based on a pinhole camera, the diagnostic allows one to obtain a 2D profile of electron beam flux distribution on a flat metal target in a single shot. The linearity of the diagnostic system response to the electron flux density was established experimentally. Spatial resolution of the diagnostic was also estimated in special test experiments. The optimal choice of the main components of the diagnostic technique is discussed.

  13. Optimising the use of electronic health records to estimate the incidence of rheumatoid arthritis in primary care: what information is hidden in free text?

    PubMed Central

    2013-01-01

    Background Primary care databases are a major source of data for epidemiological and health services research. However, most studies are based on coded information, ignoring information stored in free text. Using the early presentation of rheumatoid arthritis (RA) as an exemplar, our objective was to estimate the extent of data hidden within free text, using a keyword search. Methods We examined the electronic health records (EHRs) of 6,387 patients from the UK, aged 30 years and older, with a first coded diagnosis of RA between 2005 and 2008. We listed indicators for RA which were present in coded format and ran keyword searches for similar information held in free text. The frequency of indicator code groups and keywords from one year before to 14 days after RA diagnosis were compared, and temporal relationships examined. Results One or more keyword for RA was found in the free text in 29% of patients prior to the RA diagnostic code. Keywords for inflammatory arthritis diagnoses were present for 14% of patients whereas only 11% had a diagnostic code. Codes for synovitis were found in 3% of patients, but keywords were identified in an additional 17%. In 13% of patients there was evidence of a positive rheumatoid factor test in text only, uncoded. No gender differences were found. Keywords generally occurred close in time to the coded diagnosis of rheumatoid arthritis. They were often found under codes indicating letters and communications. Conclusions Potential cases may be missed or wrongly dated when coded data alone are used to identify patients with RA, as diagnostic suspicions are frequently confined to text. The use of EHRs to create disease registers or assess quality of care will be misleading if free text information is not taken into account. Methods to facilitate the automated processing of text need to be developed and implemented. PMID:23964710
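    The core of such a study is a keyword search over free text, conceptually as simple as the sketch below. The keyword groups and the example note are illustrative and do not reproduce the study's full RA indicator lists.

```python
import re

# Example keyword groups for RA indicators (simplified; the study's full keyword
# lists and code groups are not reproduced here).
KEYWORDS = {
    "inflammatory_arthritis": [r"rheumatoid arthritis", r"inflammatory arthritis"],
    "synovitis":              [r"synovitis"],
    "rheumatoid_factor_pos":  [r"rheumatoid factor\s*(positive|\+)", r"\bRF\s*\+"],
}

def search_free_text(note):
    """Return the indicator groups whose keywords appear in a free-text EHR note."""
    hits = set()
    for group, patterns in KEYWORDS.items():
        if any(re.search(p, note, flags=re.IGNORECASE) for p in patterns):
            hits.add(group)
    return hits

note = "Letter to rheumatology: early synovitis of MCP joints, rheumatoid factor positive."
print(search_free_text(note))
```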

  14. Medical Resource Planning: The Need to Use a Standardized Diagnostic System

    DTIC Science & Technology

    1989-12-01

    [The indexed excerpt contains only table fragments: listings of ICD-9 diagnostic codes and case counts for conditions such as migraine, meningo-encephalitis, mumps, infectious mononucleosis, trachoma, and sexually transmitted diseases, with a note that ICD-9 diagnostic codes ending in XX represent the entire range of five-digit codes.]

  15. Long non-coding RNAs in hepatocellular carcinoma: Potential roles and clinical implications

    PubMed Central

    Niu, Zhao-Shan; Niu, Xiao-Jun; Wang, Wen-Hong

    2017-01-01

    Long non-coding RNAs (lncRNAs) are a subgroup of non-coding RNA transcripts greater than 200 nucleotides in length with little or no protein-coding potential. Emerging evidence indicates that lncRNAs may play important regulatory roles in the pathogenesis and progression of human cancers, including hepatocellular carcinoma (HCC). Certain lncRNAs may be used as diagnostic or prognostic markers for HCC, a serious malignancy with increasing morbidity and high mortality rates worldwide. Therefore, elucidating the functional roles of lncRNAs in tumors can contribute to a better understanding of the molecular mechanisms of HCC and may help in developing novel therapeutic targets. In this review, we summarize the recent progress regarding the functional roles of lncRNAs in HCC and explore their clinical implications as diagnostic or prognostic biomarkers and molecular therapeutic targets for HCC. PMID:28932078

  16. Validation of administrative data used for the diagnosis of upper gastrointestinal events following nonsteroidal anti-inflammatory drug prescription.

    PubMed

    Abraham, N S; Cohen, D C; Rivers, B; Richardson, P

    2006-07-15

    To validate Veterans Affairs (VA) administrative data for the diagnosis of nonsteroidal anti-inflammatory drug (NSAID)-related upper gastrointestinal events (UGIE) and to develop a diagnostic algorithm. A retrospective study of veterans prescribed an NSAID as identified from the national pharmacy database merged with in-patient and out-patient data, followed by primary chart abstraction. Contingency tables were constructed to allow comparison with a random sample of patients prescribed an NSAID, but without UGIE. Multivariable logistic regression analysis was used to derive a predictive algorithm. Once derived, the algorithm was validated in a separate cohort of veterans. Of 906 patients, 606 had a diagnostic code for UGIE; 300 were a random subsample of 11 744 patients (control). Only 161 had a confirmed UGIE. The positive predictive value (PPV) of diagnostic codes was poor, but improved from 27% to 51% with the addition of endoscopic procedural codes. The strongest predictors of UGIE were an in-patient ICD-9 code for gastric ulcer, duodenal ulcer and haemorrhage combined with upper endoscopy. This algorithm had a PPV of 73% when limited to patients ≥65 years (c-statistic 0.79). Validation of the algorithm revealed a PPV of 80% among patients with an overlapping NSAID prescription. NSAID-related UGIE can be assessed using VA administrative data. The optimal algorithm includes an in-patient ICD-9 code for gastric or duodenal ulcer and gastrointestinal bleeding combined with a procedural code for upper endoscopy.
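    The derived algorithm amounts to a rule over administrative codes; a simplified stand-in is sketched below. The code prefixes and procedure codes shown are illustrative examples rather than the study's exact code lists.

```python
# Simplified stand-in for the derived algorithm: an inpatient ICD-9 code for gastric
# ulcer, duodenal ulcer, or GI haemorrhage combined with a procedure code for upper
# endoscopy. Code prefixes below are illustrative, not the study's exact code lists.
UGIE_DX_PREFIXES = ("531", "532", "578")       # gastric ulcer, duodenal ulcer, GI haemorrhage
ENDOSCOPY_CODES = {"45.13", "45.16", "44.43"}  # example ICD-9 procedure codes

def flag_ugie(inpatient_dx_codes, procedure_codes, age):
    """Flag a probable NSAID-related UGIE from administrative data."""
    has_dx = any(code.startswith(UGIE_DX_PREFIXES) for code in inpatient_dx_codes)
    has_scope = bool(set(procedure_codes) & ENDOSCOPY_CODES)
    return has_dx and has_scope and age >= 65   # the age restriction raised PPV in the study

print(flag_ugie(["531.40", "401.9"], ["45.13"], age=72))
```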

  17. A solar tornado observed by EIS. Plasma diagnostics

    NASA Astrophysics Data System (ADS)

    Levens, P. J.; Labrosse, N.; Fletcher, L.; Schmieder, B.

    2015-10-01

    Context. The term "solar tornadoes" has been used to describe apparently rotating magnetic structures above the solar limb, as seen in high resolution images and movies from the Atmospheric Imaging Assembly (AIA) aboard the Solar Dynamics Observatory (SDO). These often form part of the larger magnetic structure of a prominence; however, the links between them remain unclear. Here we present plasma diagnostics on a tornado-like structure and its surroundings, seen above the limb by the Extreme-ultraviolet Imaging Spectrometer (EIS) aboard the Hinode satellite. Aims: We aim to extend our view of the velocity patterns seen in tornado-like structures with EIS to a wider range of temperatures and to use density diagnostics, non-thermal line widths, and differential emission measures to provide insight into the physical characteristics of the plasma. Methods: Using Gaussian fitting to fit and de-blend the spectral lines seen by EIS, we calculated line-of-sight velocities and non-thermal line widths. Along with information from the CHIANTI database, we used line intensity ratios to calculate electron densities at each pixel. Using a regularised inversion code we also calculated the differential emission measure (DEM) at different locations in the prominence. Results: The split Doppler-shift pattern is found to be visible down to a temperature of around log T = 6.0. At temperatures lower than this, the pattern is unclear in this data set. We obtain an electron density of log ne = 8.5 when looking towards the centre of the tornado structure at a plasma temperature of log T = 6.2, as compared to the surroundings of the tornado structure where we find log ne to be nearer 9. Non-thermal line widths show broader profiles at the tornado location when compared to the surrounding corona. We discuss the differential emission measure in both the tornado and the prominence body, which suggests that there is more contribution in the tornado at temperatures below log T = 6.0 than in the prominence. A movie is available in electronic form at http://www.aanda.org
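    The first analysis step, Gaussian fitting of a spectral line to extract its intensity, centroid shift, and width, can be sketched with a synthetic spectrum as below. The line, noise level, and rest-wavelength handling are illustrative; the density diagnostic itself additionally requires CHIANTI line-ratio curves, which are not reproduced here.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(wl, amp, cen, sigma, bg):
    return amp * np.exp(-0.5 * ((wl - cen) / sigma) ** 2) + bg

# Synthetic EIS-like spectral window (all numbers illustrative)
wl = np.linspace(195.0, 195.3, 60)                       # wavelength [Angstrom]
true = (800.0, 195.12, 0.025, 50.0)
counts = gaussian(wl, *true) + np.random.default_rng(1).normal(0, 15, wl.size)

popt, pcov = curve_fit(gaussian, wl, counts, p0=(600.0, 195.1, 0.03, 40.0))
amp, cen, sigma, bg = popt
intensity = amp * sigma * np.sqrt(2.0 * np.pi)           # integrated line intensity
v_los = (cen - 195.119) / 195.119 * 2.998e5              # Doppler shift [km/s] vs assumed rest wavelength
print(f"I = {intensity:.1f}, v_los = {v_los:.1f} km/s, width = {sigma:.4f} A")
```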

  18. Depathologising gender diversity in childhood in the process of ICD revision and reform.

    PubMed

    Suess Schwend, Amets; Winter, Sam; Chiam, Zhan; Smiley, Adam; Cabral Grinspan, Mauro

    2018-01-24

    From 2007 on, the World Health Organisation (WHO) has been revising its diagnostic manual, the International Statistical Classification of Diseases and Related Health Problems (ICD), with approval of ICD-11 due in 2018. The ICD revision has prompted debates on diagnostic classifications related to gender diversity and gender development processes, and specifically on the 'Gender incongruence of childhood' (GIC) code. These debates have taken place at a time an emergent trans depathologisation movement is becoming increasingly international, and regional and international human rights bodies are recognising gender identity as a source of discrimination. With reference to the history of diagnostic classification of gender diversity in childhood, this paper conducts a literature review of academic, activist and institutional documents related to the current discussion on the merits of retaining or abandoning the GIC code. Within this broader discussion, the paper reviews in more detail recent publications arguing for the abandonment of this diagnostic code drawing upon clinical, bioethical and human rights perspectives. The review indicates that gender diverse children engaged in exploring their gender identity and expression do not benefit from diagnosis. Instead they benefit from support from their families, their schools and from society more broadly.

  19. Testing hydrodynamics schemes in galaxy disc simulations

    NASA Astrophysics Data System (ADS)

    Few, C. G.; Dobbs, C.; Pettitt, A.; Konstandin, L.

    2016-08-01

    We examine how three fundamentally different numerical hydrodynamics codes follow the evolution of an isothermal galactic disc with an external spiral potential. We compare an adaptive mesh refinement code (RAMSES), a smoothed particle hydrodynamics code (SPHNG), and a volume-discretized mesh-less code (GIZMO). Using standard refinement criteria, we find that RAMSES produces a disc that is less vertically concentrated and does not reach such high densities as the SPHNG or GIZMO runs. The gas surface density in the spiral arms increases at a lower rate for the RAMSES simulations compared to the other codes. There is also a greater degree of substructure in the SPHNG and GIZMO runs and secondary spiral arms are more pronounced. By resolving the Jeans length with a greater number of grid cells, we achieve more similar results to the Lagrangian codes used in this study. Other alterations to the refinement scheme (adding extra levels of refinement and refining based on local density gradients) are less successful in reducing the disparity between RAMSES and SPHNG/GIZMO. Although more similar, SPHNG displays different density distributions and vertical mass profiles to all modes of GIZMO (including the smoothed particle hydrodynamics version). This suggests differences also arise which are not intrinsic to the particular method but rather due to its implementation. The discrepancies between codes (in particular, the densities reached in the spiral arms) could potentially result in differences in the locations and time-scales for gravitational collapse, and therefore impact star formation activity in more complex galaxy disc simulations.
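    The refinement criterion referred to above, resolving the local Jeans length with a minimum number of grid cells, can be made concrete with a short calculation; the density, temperature, and cell count below are illustrative values, not those used in the paper.

```python
import numpy as np

G = 6.674e-8            # gravitational constant [cgs]
M_H = 1.67e-24          # hydrogen mass [g]
K_B = 1.381e-16         # Boltzmann constant [erg/K]

def jeans_length_cm(n_cm3, temperature_k, mu=2.33):
    """Jeans length for an isothermal gas of number density n and temperature T."""
    rho = mu * M_H * n_cm3
    c_s = np.sqrt(K_B * temperature_k / (mu * M_H))
    return c_s * np.sqrt(np.pi / (G * rho))

def required_cell_size(n_cm3, temperature_k, cells_per_jeans=4):
    """Maximum grid cell size that resolves the Jeans length with N cells (Truelove-style criterion)."""
    return jeans_length_cm(n_cm3, temperature_k) / cells_per_jeans

pc = 3.086e18
dx = required_cell_size(n_cm3=100.0, temperature_k=50.0)
print(f"cell size <= {dx / pc:.2f} pc to resolve the Jeans length with 4 cells")
```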

  20. Fibromyalgia: prevalence, course, and co-morbidities in hospitalized patients in the United States, 1999-2007.

    PubMed

    Haviland, M G; Banta, J E; Przekop, P

    2011-01-01

    To evaluate hospitalisation data for patients with a primary or secondary fibromyalgia (FM) diagnosis. We estimated the number of men and women with an FM diagnostic code and compared them across a number of demographic and hospitalisation characteristics; examined age-specific, population-based FM hospitalisation rates; and determined the most common co-morbid diagnoses when FM was either the primary or secondary diagnostic code. Hospital discharge data from the Nationwide Inpatient Sample (NIS) were used. Records were evaluated between 1999 and 2007 that contained the International Classification of Diseases, 9th Revision, Clinical Modification FM diagnostic code (729.1, Myositis and Myalgia, unspecified), the FM criterion used in large-scale health services studies. There were 1,727,765 discharges with a 729.1 diagnostic code (FM) during this nine-year span, 213,034 men (12.3%) and 1,513,995 women (87.6%). Discharges coded for FM increased steadily each year. The population-based rate of male FM discharges rose gradually across the lifespan; the rate for women rose sharply but then declined after age 64. Few differences between men and women across demographic and hospitalisation characteristics were evident. The most common co-morbidities with FM as the primary diagnosis were non-specific chest pain, mood disorders, and Spondylosis/intervertebral disc disorders/other back problems. Most common primary diagnoses, with FM as a secondary diagnosis, were essential hypertension, disorders of lipid metabolism, coronary atherosclerosis/other heart disease, and mental disorders. A substantial number of U.S. residents with FM were hospitalised over the study period. Further analysis of hospitalisation data from patients with FM may provide guidance for both research and treatment, with the goal of improved care for FM patients.

  1. Synthetic neutron camera and spectrometer in JET based on AFSI-ASCOT simulations

    NASA Astrophysics Data System (ADS)

    Sirén, P.; Varje, J.; Weisen, H.; Koskela, T.; contributors, JET

    2017-09-01

    The ASCOT Fusion Source Integrator (AFSI) has been used to calculate neutron production rates and spectra corresponding to the JET 19-channel neutron camera (KN3) and the time-of-flight spectrometer (TOFOR) as ideal diagnostics, without detector-related effects. AFSI calculates fusion product distributions in 4D, based on Monte Carlo integration from arbitrary reactant distribution functions. The distribution functions were calculated by the ASCOT Monte Carlo particle orbit following code for thermal, NBI and ICRH particle reactions. Fusion cross-sections were defined based on the Bosch-Hale model and both DD and DT reactions have been included. Neutrons generated by AFSI-ASCOT simulations have already been applied as a neutron source of the Serpent neutron transport code in ITER studies. Additionally, AFSI has been selected to be a main tool as the fusion product generator in the complete analysis calculation chain: ASCOT - AFSI - SERPENT (neutron and gamma transport Monte Carlo code) - APROS (system and power plant modelling code), which encompasses the plasma as an energy source, heat deposition in plant structures as well as cooling and balance-of-plant in DEMO applications and other reactor relevant analyses. This conference paper presents the first results and validation of the AFSI DD fusion model for different auxiliary heating scenarios (NBI, ICRH) with very different fast particle distribution functions. Both calculated quantities (production rates and spectra) have been compared with experimental data from KN3 and synthetic spectrometer data from ControlRoom code. No unexplained differences have been observed. In future work, AFSI will be extended for synthetic gamma diagnostics and additionally, AFSI will be used as part of the neutron transport calculation chain to model real diagnostics instead of ideal synthetic diagnostics for quantitative benchmarking.
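    The structure of a Monte Carlo fusion-source calculation, sampling reactant velocities, forming relative energies, and averaging cross-section times relative speed, is sketched below. It uses isotropic Maxwellian sampling and a placeholder cross-section instead of ASCOT distribution functions and the Bosch-Hale parameterization, so it only illustrates the method, not AFSI itself.

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_maxwellian_velocities(n, temperature_kev, mass_amu):
    """Sample 3D velocities [m/s] from an isotropic Maxwellian."""
    m = mass_amu * 1.6605e-27
    sigma = np.sqrt(temperature_kev * 1e3 * 1.602e-19 / m)
    return rng.normal(0.0, sigma, size=(n, 3))

def toy_cross_section(e_rel_kev):
    """Placeholder DD-like cross-section [m^2]; NOT the Bosch-Hale parameterization."""
    return 1e-31 * np.exp(-44.4 / np.sqrt(np.clip(e_rel_kev, 1e-3, None)))

def mc_reaction_rate(n1, n2, t1_kev, t2_kev, mass_amu=2.014, samples=200_000):
    """Volumetric reaction rate n1*n2*<sigma*v> for two distinct reactant populations
    (e.g. beam and thermal deuterium), estimated by Monte Carlo [reactions / m^3 / s]."""
    v1 = sample_maxwellian_velocities(samples, t1_kev, mass_amu)
    v2 = sample_maxwellian_velocities(samples, t2_kev, mass_amu)
    vrel = np.linalg.norm(v1 - v2, axis=1)
    mu = 0.5 * mass_amu * 1.6605e-27                         # reduced mass for like-mass species
    e_rel_kev = 0.5 * mu * vrel**2 / 1.602e-19 / 1e3
    sigma_v = np.mean(toy_cross_section(e_rel_kev) * vrel)   # <sigma*v> [m^3/s]
    return n1 * n2 * sigma_v

print(f"~{mc_reaction_rate(5e19, 5e19, 5.0, 5.0):.2e} reactions m^-3 s^-1")
```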

  2. The Proposed MACRA/MIPS Threshold for Patient-Facing Encounters: What It Means for Radiologists.

    PubMed

    Rosenkrantz, Andrew B; Hirsch, Joshua A; Allen, Bibb; Wang, Wenyi; Hughes, Danny R; Nicola, Gregory N

    2017-03-01

    In implementing the Merit-Based Incentive Payment System (MIPS), CMS will provide special considerations to physicians with infrequent face-to-face patient encounters by reweighting MIPS performance categories to account for the unique circumstances facing these providers. The aim of this study was to determine the impact of varying criteria on the fraction of radiologists who are likely to receive special considerations for performance assessment under MIPS. Data from the 2014 Medicare Physician and Other Supplier file for 28,710 diagnostic radiologists were used to determine the fraction of radiologists meeting various proposed criteria for receiving special considerations. For each definition, the fraction of patient-facing encounters among all billed codes was determined for those radiologists not receiving special considerations. When using the criterion proposed by CMS that physicians will receive special considerations if billing ≤25 evaluation and management services or surgical codes, 72.0% of diagnostic radiologists would receive special considerations, though such encounters would represent only 2.1% of billed codes among remaining diagnostic radiologists without special considerations. If CMS were to apply an alternative criterion of billing ≤100 evaluation and management codes exclusively, 98.8% of diagnostic radiologists would receive special considerations. At this threshold, patient-facing encounters would represent approximately 10% of billed codes among remaining radiologists without special considerations. The current CMS proposed criterion for special considerations would result in a considerable fraction of radiologists being evaluated on the basis of measures that are not reflective of their practice and beyond their direct control. Alternative criteria could help ensure that radiologists are provided a fair opportunity for success in performance review under the MIPS. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  3. Assessment of radiological protection systems among diagnostic radiology facilities in North East India.

    PubMed

    Singh, Thokchom Dewan; Jayaraman, T; Arunkumar Sharma, B

    2017-03-01

    This study aims to assess the adequacy level of radiological protection systems available in the diagnostic radiology facilities located in three capital cities of North East (NE) India. It further attempts to understand, using a multi-disciplinary approach, how the safety codes/standards in diagnostic radiology framed by the Atomic Energy Regulatory Board (AERB) and the International Atomic Energy Agency (IAEA) to achieve adequate radiological protection in facilities have been perceived, conceptualized, and applied accordingly in these facilities. About 30 diagnostic radiology facilities were randomly selected from three capitals of states in NE India, namely Imphal (Manipur), Shillong (Meghalaya), and Guwahati (Assam). A semi-structured questionnaire developed based on a multi-disciplinary approach was used for this study. It was observed that radiological practices undertaken in these facilities were not exactly in line with safety codes/standards in diagnostic radiology of the AERB and the IAEA. About 50% of the facilities had registered/licensed x-ray equipment with the AERB. More than 80% of the workers did not use radiation protective devices, although these devices were available in the facilities. About 85% of facilities had no institutional risk management system. About 70% of the facilities did not carry out periodic quality assurance testing of their x-ray equipment or surveys of radiation leakage around the x-ray room, and did not display radiation safety indicators in the x-ray rooms. Workers in these facilities exhibited low risk perception about the risks associated with these practices. The majority of diagnostic radiology facilities in NE India did not comply with the radiological safety codes/standards framed by the AERB and IAEA. The study found inadequate levels of radiological protection systems in the majority of facilities. This study suggests a need to establish firm measures that comply with the radiological safety codes/standards of the AERB and IAEA to protect patients, workers and the public of this region.

  4. 40 CFR 1048.110 - How must my engines diagnose malfunctions?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., the MIL may stay off during later engine operation. (d) Store trouble codes in computer memory. Record and store in computer memory any diagnostic trouble codes showing a malfunction that should illuminate...

  5. Non-coding RNAs in lung cancer

    PubMed Central

    Ricciuti, Biagio; Mecca, Carmen; Crinò, Lucio; Baglivo, Sara; Cenci, Matteo; Metro, Giulio

    2014-01-01

    The discovery that protein-coding genes represent less than 2% of the human genome, together with the evidence that more than 90% of it is actively transcribed, changed the classical view of the central dogma of molecular biology, which was long based on the assumption that RNA functions mainly as an intermediate between DNA sequences and the protein synthesis machinery. Accumulating data indicate that non-coding RNAs are involved in different physiological processes and contribute to the maintenance of cellular homeostasis. They are important regulators of gene expression, cellular differentiation, proliferation, migration, apoptosis, and stem cell maintenance. Alterations and disruptions of their expression or activity have increasingly been associated with pathological changes in cancer cells. This evidence, together with the prospect of using these molecules as diagnostic markers and therapeutic targets, currently makes non-coding RNAs among the most relevant molecules in cancer research. In this paper we provide an overview of non-coding RNA function and disruption in lung cancer biology, focusing also on their potential as diagnostic, prognostic and predictive biomarkers. PMID:25593996

  6. First ERO2.0 modeling of Be erosion and non-local transport in JET ITER-like wall

    NASA Astrophysics Data System (ADS)

    Romazanov, J.; Borodin, D.; Kirschner, A.; Brezinsek, S.; Silburn, S.; Huber, A.; Huber, V.; Bufferand, H.; Firdaouss, M.; Brömmel, D.; Steinbusch, B.; Gibbon, P.; Lasa, A.; Borodkina, I.; Eksaeva, A.; Linsmeier, Ch; Contributors, JET

    2017-12-01

    ERO is a Monte-Carlo code for modeling plasma-wall interaction and 3D plasma impurity transport for applications in fusion research. The code has undergone a significant upgrade (ERO2.0) that enlarges the simulation volume to cover the entire plasma edge of a fusion device, allowing a more self-consistent treatment of impurity transport and comparison with a larger number and variety of experimental diagnostics. In this contribution, the physics-relevant technical innovations of the new code version are described and discussed. The new capabilities of the code are demonstrated by modeling of beryllium (Be) erosion of the main wall during JET limiter discharges. Results for erosion patterns along the limiter surfaces and global Be transport, including incident particle distributions, are presented. A novel synthetic diagnostic, which mimics experimental wide-angle 2D camera images, is presented and used for validating various aspects of the code, including erosion, magnetic shadowing, non-local impurity transport, and light emission simulation.

  7. Using a short-pulse diffraction-limited laser beam to probe filamentation of a random phase plate smoothed beam.

    PubMed

    Kline, J L; Montgomery, D S; Flippo, K A; Johnson, R P; Rose, H A; Shimada, T; Williams, E A

    2008-10-01

    A short-pulse (few picoseconds) laser probe provides high temporal resolution measurements to elucidate details of fast dynamic phenomena not observable with typical longer laser pulse probes and gated diagnostics. Such a short-pulse laser probe (SPLP) has been used to measure filamentation of a random phase plate (RPP) smoothed laser beam in a gas-jet plasma. The plasma index of refraction, modified by the density and temperature fluctuations driven by the RPP beam, perturbs the phase front of an SPLP propagating at a 90 degree angle with respect to the RPP interaction beam. The density and temperature fluctuations are quasistatic on the time scale of the SPLP (approximately 2 ps). The transmitted near-field intensity distribution of the SPLP provides a measure of the phase front perturbation. At low plasma densities, the transmitted intensity pattern is asymmetric, with striations across the entire probe beam in the direction of the RPP smoothed beam. As the plasma density increases, the striations break up into smaller sizes along the direction of the RPP beam propagation. The breakup of the intensity pattern is consistent with self-focusing of the RPP smoothed interaction beam. Simulations of the experiment using the wave propagation code PF3D are in qualitative agreement, demonstrating that the asymmetric striations can be attributed to the RPP-driven density fluctuations. Quantification of the beam breakup measured by the transmitted SPLP could lead to a new method for measuring self-focusing of lasers in underdense plasmas.

  8. Measurement and simulation of passive fast-ion D-alpha emission from the DIII-D tokamak

    DOE PAGES

    Bolte, Nathan G.; Heidbrink, William W.; Pace, David; ...

    2016-09-14

    Spectra of passive fast-ion D-alpha (FIDA) light from beam ions that charge exchange with background neutrals are measured and simulated. The fast ions come from three sources: ions that pass through the diagnostic sightlines on their first full orbit, an axisymmetric confined population, and ions that are expelled into the edge region by instabilities. A passive FIDA simulation (P-FIDASIM) is developed as a forward model for the spectra of the first-orbit fast ions and consists of an experimentally-validated beam deposition model, an ion orbit-following code, a collisional-radiative model, and a synthetic spectrometer. Model validation consists of the simulation of 86 experimental spectra that are obtained using 6 different neutral beam fast-ion sources and 13 different lines of sight. Calibrated spectra are used to estimate the neutral density throughout the cross-section of the tokamak. The resulting 2D neutral density shows the expected increase toward each X-point, with average neutral densities of 8 × 10⁹ cm⁻³ at the plasma boundary and 1 × 10¹¹ cm⁻³ near the wall. Fast ions that are on passing orbits are expelled by the sawtooth instability more readily than trapped ions. In a sample discharge, approximately 1% of the fast-ion population is ejected into the high neutral density region per sawtooth crash.

  9. Phonological Codes Constrain Output of Orthographic Codes via Sublexical and Lexical Routes in Chinese Written Production

    PubMed Central

    Wang, Cheng; Zhang, Qingfang

    2015-01-01

    To what extent do phonological codes constrain orthographic output in handwritten production? We investigated how phonological codes constrain the selection of orthographic codes via sublexical and lexical routes in Chinese written production. Participants wrote down picture names in a picture-naming task in Experiment 1 or response words in a symbol-word associative writing task in Experiment 2. A sublexical phonological property of picture names (phonetic regularity: regular vs. irregular) in Experiment 1 and a lexical phonological property of response words (homophone density: dense vs. sparse) in Experiment 2, as well as word frequency of the targets in both experiments, were manipulated. A facilitatory effect of word frequency was found in both experiments, in which words with high frequency were produced faster than those with low frequency. More importantly, we observed an inhibitory phonetic regularity effect, in which low-frequency picture names with regular first characters were slower to write than those with irregular ones, and an inhibitory homophone density effect, in which characters with dense homophone density were produced more slowly than those with sparse homophone density. Results suggested that phonological codes constrain handwritten production via lexical and sublexical routes. PMID:25879662

  10. Thermal imaging diagnostics of high-current electron beams.

    PubMed

    Pushkarev, A; Kholodnaya, G; Sazonov, R; Ponomarev, D

    2012-10-01

    A thermal imaging diagnostic for measuring the energy density of a pulsed electron beam is presented. It provides control of the electron energy spectrum and measures the density distribution over the electron beam cross section, the spatial distribution of electrons with energies in a selected range, and the total energy of the electron beam. The diagnostic is based on thermal-imager registration of the heat print left by the electron beam in a material with low bulk density and low thermal conductivity. Testing of the thermal imaging diagnostic was conducted on the pulsed electron accelerator TEU-500. The energy of the electrons was 300-500 keV, the electron current density was 0.1-0.4 kA/cm², the pulse duration (at half-height) was 60 ns, and the energy per pulse was up to 100 J. To register the thermal print, a Fluke Ti10 thermal imager was used. Testing showed that the sensitivity of a typical thermal imager allows the heat pattern of a pulsed electron beam to be registered within one pulse with energy density over 0.1 J/cm² (or with current density over 10 A/cm², pulse duration of 60 ns and electron energy of 400 keV) with a spatial resolution of 0.9-1 mm. In contrast to methods using radiosensitive (dosimetric) materials, thermal imaging diagnostics does not require expensive consumables or lengthy processing time.

  11. Validation of a coding algorithm for intra-abdominal surgeries and adhesion-related complications in an electronic medical records database

    PubMed Central

    Scott, Frank I; Mamtani, Ronac; Haynes, Kevin; Goldberg, David S; Mahmoud, Najjia N.; Lewis, James D

    2016-01-01

    PURPOSE Epidemiological data on adhesion-related complications following intra-abdominal surgery are limited. We tested the accuracy of recording of these surgeries and complications within The Health Improvement Network (THIN), a primary care database within the United Kingdom. METHODS Individuals within THIN from 1995–2011 with an incident intra-abdominal surgery and subsequent bowel obstruction (SBO) or adhesiolysis were identified using diagnostic codes. To compute positive predictive values (PPVs), requests were sent to treating physicians of patients with these diagnostic codes to confirm the surgery, SBO, or adhesiolysis code. Completeness of recording was estimated by comparing observed surgical rates within THIN to expected rates derived from the Hospital Episode Statistics (HES) dataset within England. Cumulative incidence rates of adhesion-related complications at 5 years were compared to a previously published cohort within Scotland. RESULTS 217 of 245 (89%) questionnaires were returned (180 SBO and 37 adhesiolysis). The PPV of codes for surgery was 94.5% (95%CI: 91–97%). 88.8% of procedure types were correctly coded. The PPV for SBO and adhesiolysis was 86.1% (95% CI: 80–91%) and 89.2% (95% CI: 75–97%), respectively. Colectomy, appendectomy, and cholecystectomy rates within THIN were 99%, 95%, and 84% of rates observed in national HES data, respectively. Cumulative incidence rates of adhesion related complications following colectomy, appendectomy, and small bowel surgery were similar to those published previously. CONCLUSIONS Surgical procedures, SBO, and adhesiolysis can be accurately identified within THIN using diagnostic codes. THIN represents a new tool for assessing patient-specific risk factors for adhesion-related complications and long term outcomes. PMID:26860870
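    As a rough illustration of the validation arithmetic above, the sketch below computes a positive predictive value with an exact (Clopper-Pearson) 95% confidence interval; the 155/180 split is a hypothetical count chosen only because it reproduces a PPV near 86%, not a figure taken from the study.

    from scipy.stats import beta

    def ppv_with_ci(confirmed: int, total: int, alpha: float = 0.05):
        """PPV of a diagnostic code with an exact (Clopper-Pearson) confidence interval."""
        ppv = confirmed / total
        lo = beta.ppf(alpha / 2, confirmed, total - confirmed + 1) if confirmed > 0 else 0.0
        hi = beta.ppf(1 - alpha / 2, confirmed + 1, total - confirmed) if confirmed < total else 1.0
        return ppv, lo, hi

    # Hypothetical: 155 of 180 returned SBO questionnaires confirmed -> PPV ~ 0.86
    print(ppv_with_ci(155, 180))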

  12. [An update of the diagnostic coding system by the Spanish Society of Pediatric Emergencies].

    PubMed

    Benito Fernández, J; Luaces Cubells, C; Gelabert Colomé, G; Anso Borda, I

    2015-06-01

    The Quality Working Group of the Spanish Society of Pediatric Emergencies (SEUP) presents an update of its diagnostic coding list. The original list was prepared and published in Anales de Pediatría in 2000 and was based on the version of the International Classification of Diseases, ICD-9-CM, current at that time. Following the same methodology and based on the 2014 edition of the ICD-9-CM, 35 new codes have been added to the list, 15 have been updated, and a list of the most frequent trauma diagnoses in pediatrics has been provided. In the current list of diagnoses, SEUP reflects the significant changes that have taken place in pediatric emergency services in the last decade. Copyright © 2014 Asociación Española de Pediatría. Published by Elsevier España, S.L.U. All rights reserved.

  13. An FPGA design of generalized low-density parity-check codes for rate-adaptive optical transport networks

    NASA Astrophysics Data System (ADS)

    Zou, Ding; Djordjevic, Ivan B.

    2016-02-01

    Forward error correction (FEC) is one of the key technologies enabling next-generation high-speed fiber optical communications. In this paper, we propose a rate-adaptive scheme using a class of generalized low-density parity-check (GLDPC) codes with a Hamming code as the local code. We show that, with the proposed unified GLDPC decoder architecture, variable net coding gains (NCGs) can be achieved with no error floor at BER down to 10⁻¹⁵, making it a viable solution for next-generation high-speed fiber optical communications.
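    To make the "Hamming code as the local code" idea concrete, the sketch below implements the single-error syndrome decoder of a Hamming(7,4) component code, which is the kind of local decoding a GLDPC constraint node can perform. It illustrates the principle only and is not the decoder architecture proposed in the paper.

    import numpy as np

    # Parity-check matrix of Hamming(7,4): column j is the binary representation of j,
    # so the syndrome of a single-bit error directly encodes the error position.
    H = np.array([[(j >> i) & 1 for j in range(1, 8)] for i in range(3)])

    def hamming74_correct(r: np.ndarray) -> np.ndarray:
        """Correct a single bit error in a received 7-bit word r (0/1 entries)."""
        syndrome = H @ r % 2
        pos = int(syndrome[0] + 2 * syndrome[1] + 4 * syndrome[2])  # 1-based error position
        if pos:
            r = r.copy()
            r[pos - 1] ^= 1
        return r

    # A valid codeword has zero syndrome; flip one bit and recover it.
    c = np.zeros(7, dtype=int)            # the all-zero word is always a codeword
    r = c.copy(); r[4] ^= 1               # single error in position 5
    assert np.array_equal(hamming74_correct(r), c)
    print("syndrome decoding recovered the codeword")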

  14. Long Scalelength Plasmas for LPI Studies at the Nike Laser

    NASA Astrophysics Data System (ADS)

    Weaver, J. L.; Oh, J.; Bates, J. W.; Schmitt, A. J.; Kehne, D. M.; Wolford, M. F.; Obenschain, S. P.; Serlin, V.; Lehmberg, R. H.; Follett, R. K.; Shaw, J. G.; Myatt, J. F.; McKenty, P. W.; Wei, M. S.; Reynolds, H.; Williams, J.; Tsung, F.

    2017-10-01

    Studies of laser plasma instabilities (LPI) at the Nike laser have mainly used short pulses, small focal spots, and solid plastic (CH) targets that have yielded maximum gradient scalelengths below 200 microns. The current experimental effort aims to produce larger volume plasmas with 5-10x reduction in the density and velocity gradients as a platform for SBS, SRS, and TPD studies. The next campaign will concentrate on the effects of wavelength shifting and bandwidth changes on CBET in low density (5-10 mg/cm3) CH foam targets. This poster will discuss the development of this new LPI target platform based on modelling with the LPSE code developed at LLE. The presentation will also discuss alternative target schemes (e.g. exploding foils) and improvements to the LPI diagnostic suite and laser operations; for example, a new set of etalons will be available for the next campaign that should double the range of available wavelength shifting. Upgrades to the scattered light spectrometers in general use for LPI studies will also be presented. Work supported by DoE/NNSA.

  15. Tracking Filament Evolution in the Low Solar Corona Using Remote Sensing and In Situ Observations

    NASA Astrophysics Data System (ADS)

    Kocher, Manan; Landi, Enrico; Lepri, Susan T.

    2018-06-01

    In the present work, we analyze a filament eruption associated with an interplanetary coronal mass ejection that arrived at L1 on 2011 August 5. In multiwavelength Solar Dynamics Observatory/Atmospheric Imaging Assembly (AIA) images, three plasma parcels within the filament were tracked at high cadence along the solar corona. A novel absorption diagnostic technique was applied to the filament material traveling along the three chosen trajectories to compute the column density and temperature evolution in time. Kinematics of the filamentary material were estimated using STEREO/Extreme Ultraviolet Imager and STEREO/COR1 observations. The Michigan Ionization Code used these density, temperature, and speed profiles as inputs to compute ionization profiles of the filament plasma. Based on these measurements, we conclude that the core plasma was in near ionization equilibrium, and the ionization states were still evolving at the altitudes where they were visible in absorption in AIA images. Additionally, we report that the filament plasma was heterogeneous, and the filamentary material was continuously heated as it expanded in the low solar corona.

  16. Experimental determination of the correlation properties of plasma turbulence using 2D BES systems

    NASA Astrophysics Data System (ADS)

    Fox, M. F. J.; Field, A. R.; van Wyk, F.; Ghim, Y.-c.; Schekochihin, A. A.; the MAST Team

    2017-04-01

    A procedure is presented to map from the spatial correlation parameters of a turbulent density field (the radial and binormal correlation lengths and wavenumbers, and the fluctuation amplitude) to correlation parameters that would be measured by a beam emission spectroscopy (BES) diagnostic. The inverse mapping is also derived, which results in resolution criteria for recovering correct correlation parameters, depending on the spatial response of the instrument quantified in terms of point-spread functions (PSFs). Thus, a procedure is presented that allows for a systematic comparison between theoretical predictions and experimental observations. This procedure is illustrated using the Mega-Ampere Spherical Tokamak BES system and the validity of the underlying assumptions is tested on fluctuating density fields generated by direct numerical simulations using the gyrokinetic code GS2. The measurement of the correlation time, by means of the cross-correlation time-delay method, is also investigated and is shown to be sensitive to the fluctuating radial component of velocity, as well as to small variations in the spatial properties of the PSFs.
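    The sketch below gives a minimal numerical illustration of the PSF effect discussed above: a synthetic one-dimensional density field with a known Gaussian correlation length is smoothed by a Gaussian point-spread function, and the apparent correlation length recovered from the autocorrelation is broadened. The grid, correlation length and PSF width are arbitrary choices, and the 1D treatment is a simplification of the 2D mapping derived in the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    n, dx = 4096, 0.1                   # grid points and spacing [cm], illustrative
    ell_true, psf_width = 2.0, 1.5      # true correlation length and PSF 1/e width [cm]

    # Build a Gaussian-correlated field by filtering white noise in k-space.
    k = np.fft.fftfreq(n, d=dx) * 2 * np.pi
    field = np.fft.ifft(np.fft.fft(rng.standard_normal(n)) *
                        np.exp(-(k * ell_true) ** 2 / 8)).real

    def correlation_length(sig: np.ndarray) -> float:
        """1/e width of the normalized autocorrelation function."""
        ac = np.correlate(sig - sig.mean(), sig - sig.mean(), mode="full")
        ac = ac[ac.size // 2:] / ac[ac.size // 2]
        return np.argmax(ac < np.exp(-1)) * dx

    # Smooth with a Gaussian PSF (a simple model of the instrument response).
    x = (np.arange(n) - n // 2) * dx
    psf = np.exp(-(x / psf_width) ** 2)
    measured = np.convolve(field, psf / psf.sum(), mode="same")

    print(f"true field 1/e correlation length : {correlation_length(field):.2f} cm")
    print(f"PSF-smoothed correlation length   : {correlation_length(measured):.2f} cm")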

  17. Overview of LH experiments in JET with an ITER-like wall

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kirov, K. K.; Baranov, Yu.; Brix, M.

    2014-02-12

    An overview of the recent results of Lower Hybrid (LH) experiments at JET with the ITER-like wall (ILW) is presented. Topics relevant to LH wave coupling are addressed as well as issues related to ILW and LH system protections. LH wave coupling was studied in conditions determined by ILW recycling and operational constraints. It was concluded that LH wave coupling was not significantly affected and the pre-ILW performance could be recovered after optimising the launcher position and local gas puffing. SOL density measurements were performed using a Li-beam diagnostic. Dependencies on the D₂ injection rate from the dedicated gas valve, the LH power and the LH launcher position were analysed. SOL density modifications due to LH were modelled by the EDGE2D code assuming SOL heating by collisional dissipation of the LH wave and/or possible E×B drifts in the SOL. The simulations matched reasonably well the measured SOL profiles. Observations of arcs and hotspots with visible and IR cameras viewing the LH launcher are presented.

  18. Characterisation of a MeV Bremsstrahlung x-ray source produced from a high intensity laser for high areal density object radiography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Courtois, C.; Compant La Fontaine, A.; Bazzoli, S.

    2013-08-15

    Results of an experiment to characterise MeV Bremsstrahlung x-ray emission created by a short (<10 ps) pulse, high intensity (1.4 × 10¹⁹ W/cm²) laser are presented. The x-ray emission is characterized using several diagnostics: nuclear activation measurements, a calibrated hard x-ray spectrometer, and dosimeters. Results from the reconstructed x-ray energy spectra are consistent with numerical simulations using the PIC and Monte Carlo codes between 0.3 and 30 MeV. The intense Bremsstrahlung x-ray source is used to radiograph an image quality indicator (IQI) heavily filtered with thick tungsten absorbers. Observations suggest that internal features of the IQI can be resolved up to an external areal density of 85 g/cm². The x-ray source size, inferred by the radiography of a thick resolution grid, is estimated to be approximately 400 μm (full width at half maximum of the x-ray source point spread function).

  19. Low Density Parity Check Codes Based on Finite Geometries: A Rediscovery and More

    NASA Technical Reports Server (NTRS)

    Kou, Yu; Lin, Shu; Fossorier, Marc

    1999-01-01

    Low density parity check (LDPC) codes with iterative decoding based on belief propagation achieve astonishing error performance close to the Shannon limit. No algebraic or geometric method for constructing these codes has been reported, and they are largely generated by computer search. As a result, encoding of long LDPC codes is in general very complex. This paper presents two classes of high rate LDPC codes whose constructions are based on finite Euclidean and projective geometries, respectively. These classes of codes are cyclic and have good constraint parameters and minimum distances. The cyclic structure allows the use of linear feedback shift registers for encoding. These finite geometry LDPC codes achieve very good error performance with either soft-decision iterative decoding based on belief propagation or Gallager's hard-decision bit flipping algorithm. The codes can be punctured or extended to obtain other good LDPC codes. A generalization of these codes is also presented.
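    The encoding advantage of the cyclic structure mentioned above can be illustrated with a linear-feedback shift register. The sketch below performs systematic cyclic encoding with the small (7,4) code generated by g(x) = x^3 + x + 1; finite-geometry LDPC codes are far longer, but the shift-register mechanism is the same.

    def lfsr_encode(msg_bits, gen=(1, 1, 0, 1)):
        """Systematic cyclic encoding: returns message bits followed by parity bits.

        `gen` lists the coefficients of g(x) from x^0 up to x^r (here r = 3), so the
        register computes the remainder of x^r * m(x) modulo g(x).
        """
        r = len(gen) - 1
        reg = [0] * r                       # shift-register contents
        for b in msg_bits:                  # feed message bits in, highest power first
            feedback = b ^ reg[-1]
            # shift and feed back through the taps g_1..g_{r-1} (g_0 = 1)
            reg = [feedback] + [reg[i - 1] ^ (feedback & gen[i]) for i in range(1, r)]
        return list(msg_bits) + reg[::-1]   # codeword = message + parity

    # Example: message 1011 encodes to the codeword 1011000 for this generator.
    print(lfsr_encode([1, 0, 1, 1]))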

  20. Low-density parity-check codes for volume holographic memory systems.

    PubMed

    Pishro-Nik, Hossein; Rahnavard, Nazanin; Ha, Jeongseok; Fekri, Faramarz; Adibi, Ali

    2003-02-10

    We investigate the application of low-density parity-check (LDPC) codes in volume holographic memory (VHM) systems. We show that a carefully designed irregular LDPC code has a very good performance in VHM systems. We optimize high-rate LDPC codes for the nonuniform error pattern in holographic memories to reduce the bit error rate extensively. The prior knowledge of noise distribution is used for designing as well as decoding the LDPC codes. We show that these codes have a superior performance to that of Reed-Solomon (RS) codes and regular LDPC counterparts. Our simulation shows that we can increase the maximum storage capacity of holographic memories by more than 50 percent if we use irregular LDPC codes with soft-decision decoding instead of conventionally employed RS codes with hard-decision decoding. The performance of these LDPC codes is close to the information theoretic capacity.

  1. Electron Temperature Fluctuation Measurements and Transport Model Validation at Alcator C-Mod

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Anne

    The tokamak is a type of toroidal device used to confine a fusion plasma with large magnetic fields. Tokamaks and stellarators are the leading devices for confining plasmas for fusion, and the capability to predict performance in these magnetically confined plasmas is essential for developing a sustainable fusion energy source. The magnetic configuration of tokamaks and stellarators does not exist in nature, yet the fundamental processes governing transport in fusion plasmas are universal: turbulence and instabilities, driven by inhomogeneity and asymmetry in the plasma, conspire to transport heat and particles across magnetic field lines and can play critical roles in impurity confinement and the generation of intrinsic rotation. Turbulence exists in all plasmas, and in neutral fluids as well, and its study is essential to developing a fundamental understanding of the nature of the fourth state of matter. Experimental studies of turbulence in tokamaks date back to early scattering observations from the late 1970s. Since that time, great advances in turbulence diagnostics have been made, all of which have significantly enhanced our knowledge and understanding of turbulence in tokamaks. Through comparisons with advanced gyrokinetic theory and turbulent-transport models, a great deal of evidence implicates turbulence-driven transport as an important mechanism determining transport in all channels: heat, particle and momentum. However, prediction and control of turbulence-driven transport remain elusive. Key to the development of predictive transport models for magnetically confined fusion plasmas is validation of the nonlinear gyrokinetic transport model, which describes transport due to turbulence. Validation of gyrokinetic codes must include detailed and quantitative comparisons with measured turbulence characteristics, in addition to comparisons with inferred transport levels and equilibrium profiles. For this reason, advanced plasma diagnostics for studying core turbulence are needed in order to assess the accuracy of gyrokinetic models for turbulence-driven particle, heat and momentum transport. New core turbulence diagnostics at the world-class tokamaks Alcator C-Mod at MIT and ASDEX Upgrade at the Max Planck Institute for Plasma Physics have been designed, developed, and operated over the course of this project. These new instruments are capable of measuring electron temperature fluctuations, and the phase angle between density and temperature fluctuations, locally and quantitatively. The new data sets from Alcator C-Mod and ASDEX Upgrade are being used to fill key gaps in our understanding of turbulent transport in tokamaks. In particular, this project has produced new results on the transport shortfall, the role of ETG turbulence in tokamak plasmas, profile stiffness, the LOC/SOC transition, and intrinsic rotation reversals. These data are used in a rigorous process of transport model validation, and this group is a world leader in using turbulence models to design new hardware and new experiments at tokamaks. A correlation electron cyclotron emission (CECE) diagnostic is an instrument used to measure micro-scale fluctuations (mm-scale, compared to the machine size of meters) of electron temperature in magnetically confined fusion plasmas, such as those in tokamaks and stellarators.
    These micro-scale fluctuations are associated with drift-wave type turbulence, which leads to enhanced cooling and mixing of particles in fusion plasmas and limits achieving the temperatures and densities required for self-sustained fusion reactions. A CECE system can also be coupled with a reflectometer system that measures micro-scale density fluctuations; from these simultaneous measurements one can extract the phase between the density (n) and temperature (T) fluctuations, creating an nT phase diagnostic. Measurements of the fluctuations and the phase angle between them are extremely useful for testing and validating predictive models for the turbulence-driven transport of heat and particles in fusion plasmas. Once validated, the models are used to predict performance in ITER and other burning plasmas, such as the MIT ARC design. Most recently, data from the newly developed CECE diagnostic [Cima 1995, White 2008] and nT phase angle measurements [Haese 1999, White 2010] will be combined with data from density fluctuation diagnostics at ASDEX Upgrade to support a long-term program of physics research in turbulence and transport that will allow for more stringent testing and validation of gyrokinetic turbulent-transport codes. This work directly impacts the development of predictive transport models in the U.S. FES program, such as TGLF, developed by General Atomics, which are used to predict performance in ITER and other burning plasma devices as part of advancing the development of fusion energy sciences.
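    As a minimal illustration of the nT phase idea described above, the sketch below recovers the phase angle between two synthetic narrow-band signals (stand-ins for the reflectometer density signal and the CECE temperature signal) from their cross-spectral density; the sample rate, frequency and noise level are arbitrary.

    import numpy as np
    from scipy.signal import csd

    fs = 1e6                                   # sample rate [Hz]
    t = np.arange(0, 0.05, 1 / fs)
    f0, phase_true = 50e3, np.deg2rad(60)      # common frequency and imposed phase offset
    rng = np.random.default_rng(1)

    n_sig = np.sin(2 * np.pi * f0 * t) + 0.5 * rng.standard_normal(t.size)
    T_sig = np.sin(2 * np.pi * f0 * t - phase_true) + 0.5 * rng.standard_normal(t.size)

    # Welch estimate of the cross-spectral density; the angle at f0 is the cross-phase.
    f, Pnt = csd(n_sig, T_sig, fs=fs, nperseg=4096)
    idx = np.argmin(np.abs(f - f0))
    # The magnitude of the recovered angle should be close to the imposed 60 degrees;
    # its sign depends on the cross-spectrum convention and on which signal leads.
    print(f"cross-phase at {f[idx] / 1e3:.0f} kHz: {np.degrees(np.angle(Pnt[idx])):.1f} deg")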

  2. Additional extensions to the NASCAP computer code, volume 3

    NASA Technical Reports Server (NTRS)

    Mandell, M. J.; Cooke, D. L.

    1981-01-01

    The ION computer code is designed to calculate charge exchange ion densities, electric potentials, plasma temperatures, and current densities external to a neutralized ion engine in R-Z geometry. The present version assumes the beam ion current and density to be known and specified, and the neutralizing electrons to originate from a hot-wire ring surrounding the beam orifice. The plasma is treated as being resistive, with an electron relaxation time comparable to the plasma frequency. Together with the thermal and electrical boundary conditions described below and other straightforward engine parameters, these assumptions suffice to determine the required quantities. The ION code, written in ASCII FORTRAN for UNIVAC 1100 series computers, is designed to be run interactively, although it can also be run in batch mode. The input is free-format, and the output is mainly graphical, using the machine-independent graphics developed for the NASCAP code. The executive routine calls the code's major subroutines in user-specified order, and the code allows great latitude for restart and parameter change.

  3. Metalloid Aluminum Clusters with Fluorine

    DTIC Science & Technology

    2016-12-01

    Subject terms: molecular dynamics, binding energy, SIESTA code, density of states, projected density of states. ...high energy density compared to explosives, but typically release this energy slowly via diffusion-limited combustion. There is recent interest in using...examine the cluster binding energy and electronic structure. Partial fluorine substitution in a prototypical aluminum-cyclopentadienyl cluster results

  4. Secure web-based invocation of large-scale plasma simulation codes

    NASA Astrophysics Data System (ADS)

    Dimitrov, D. A.; Busby, R.; Exby, J.; Bruhwiler, D. L.; Cary, J. R.

    2004-12-01

    We present our design and initial implementation of a web-based system for running, both in parallel and serial, Particle-In-Cell (PIC) codes for plasma simulations with automatic post processing and generation of visual diagnostics.

  5. Recent gyrokinetic turbulence insights with GENE and direct comparison with experimental measurements

    NASA Astrophysics Data System (ADS)

    Goerler, Tobias

    2017-10-01

    Over the last years, direct comparisons between gyrokinetic turbulence simulations and experimental measurements have intensified substantially. Such studies are largely motivated by the urgent need for reliable transport predictions for future burning plasma devices and the associated necessity of validating the numerical tools. They can also help assess the way a particular diagnostic experiences turbulence, and provide ideas for further optimization and for physics that may not yet be accessible. Here, synthetic diagnostics, i.e. models that mimic the spatial and sometimes temporal response of the experimental diagnostic, play an important role. In this contribution, we focus on recent gyrokinetic GENE simulations dedicated to ASDEX Upgrade L-mode plasmas and comparison with various turbulence measurements. Particular emphasis is given to density fluctuation spectra, which are experimentally accessible via Doppler reflectometry. A sophisticated synthetic diagnostic involving a full-wave code has recently been established and resolves the long-standing question of the different spectral roll-overs in gyrokinetic and measured spectra, as well as the potentially different power laws in the O- and X-mode signals. The demonstrated agreement furthermore extends the validation database deep into spectral space and confirms a proper coverage of the turbulence cascade physics. The flux-matched GENE simulations are then used to study the sensitivity of the latter to the main microinstability drive and to investigate the energetics at the various scales. Additionally, electron-scale-turbulence-based modifications of the high-k power law spectra in such plasmas will be presented and their visibility in measurable signals discussed.

  6. Solar Prominence Modelling and Plasma Diagnostics at ALMA Wavelengths

    NASA Astrophysics Data System (ADS)

    Rodger, Andrew; Labrosse, Nicolas

    2017-09-01

    Our aim is to test potential solar prominence plasma diagnostics as obtained with the new solar capability of the Atacama Large Millimeter/submillimeter Array (ALMA). We investigate the thermal and plasma diagnostic potential of ALMA for solar prominences through the computation of brightness temperatures at ALMA wavelengths. The brightness temperature, for a chosen line of sight, is calculated using the densities of electrons, hydrogen, and helium obtained from a radiative transfer code under non-local thermodynamic equilibrium (non-LTE) conditions, as well as the input internal parameters of the prominence model in consideration. Two distinct sets of prominence models were used: isothermal-isobaric fine-structure threads, and large-scale structures with radially increasing temperature distributions representing the prominence-to-corona transition region. We compute brightness temperatures over the range of wavelengths in which ALMA is capable of observing (0.32-9.6 mm); however, we focus in particular on the bands available to solar observers in ALMA cycles 4 and 5, namely 2.6-3.6 mm (Band 3) and 1.1-1.4 mm (Band 6). We show how the computed brightness temperatures and optical thicknesses in our models vary with the plasma parameters (temperature and pressure) and the wavelength of observation. We then study how ALMA observables such as the ratio of brightness temperatures at two frequencies can be used to estimate the optical thickness and the emission measure for isothermal and non-isothermal prominences. From this study we conclude that for both sets of models, ALMA presents a strong thermal diagnostic capability, provided that the interpretation of observations is supported by the use of non-LTE simulation results.
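    A hedged sketch of the two-frequency idea mentioned above: for an isothermal slab the brightness temperature behaves roughly as T_b = T_e (1 - exp(-τ)), and the free-free optical thickness scales approximately as ν⁻², so a measured Band 3 / Band 6 brightness-temperature ratio constrains the optical thickness. The frequencies, scaling exponent and measured ratio below are illustrative assumptions, not values from the models in the paper.

    import numpy as np
    from scipy.optimize import brentq

    nu3, nu6 = 100e9, 230e9          # representative Band 3 and Band 6 frequencies [Hz]

    def tb_ratio(tau3: float) -> float:
        """Brightness-temperature ratio T_b(Band 3)/T_b(Band 6) for an isothermal slab."""
        tau6 = tau3 * (nu3 / nu6) ** 2          # approximate nu**-2 opacity scaling
        return (1 - np.exp(-tau3)) / (1 - np.exp(-tau6))

    measured_ratio = 2.5              # hypothetical observed ratio
    tau3 = brentq(lambda t: tb_ratio(t) - measured_ratio, 1e-4, 50.0)
    print(f"inferred Band 3 optical thickness: tau3 = {tau3:.2f}")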

  7. Risk of preterm birth by subtype among Medi-Cal participants with mental illness.

    PubMed

    Baer, Rebecca J; Chambers, Christina D; Bandoli, Gretchen; Jelliffe-Pawlowski, Laura L

    2016-10-01

    Previous studies have demonstrated an association between mental illness and preterm birth (before 37 weeks). However, these investigations have not simultaneously considered the gestation of preterm birth, the indication (e.g., spontaneous or medically indicated), and specific mental illness classifications. The objective of this study was to examine the likelihood of preterm birth across gestational lengths and indications among Medi-Cal (California's Medicaid program) participants with a diagnostic code for mental illness. Mental illnesses were studied by specific illness classification. The study population was drawn from singleton live births in California from 2007 through 2011 in the birth cohort file maintained by the California Office of Statewide Health Planning and Development, which includes birth certificate and hospital discharge records. The sample was restricted to women with Medi-Cal coverage for prenatal care. Women with mental illness were identified using International Classification of Diseases, ninth revision, codes from their hospital discharge record. Women without a mental illness International Classification of Diseases, ninth revision, code were randomly selected at a 4:1 ratio. Adjusting for maternal characteristics and obstetric complications, relative risks and 95% confidence intervals were calculated for preterm birth, comparing women with a mental illness diagnostic code with women without such a code. We identified 6198 women with a mental illness diagnostic code and selected 24,792 women with no such code. The risk of preterm birth in women with a mental illness was 1.2 times higher than in women without a mental illness (adjusted relative risk, 1.2; 95% confidence interval, 1.1-1.3). Among the specific mental illnesses, schizophrenia, major depression, and personality disorders had the strongest associations with preterm birth (adjusted relative risks, 2.0, 2.0 and 3.3, respectively). Women receiving prenatal care through California's low-income health insurance who had at least 1 mental illness diagnostic code were 1.2-3.3 times more likely to have a preterm birth than women without a mental illness, and these risks persisted across most illness classifications. Although it cannot be determined from these data whether specific treatments for mental illness contribute to the observed associations, elevated risk across different diagnoses suggests that some aspects of mental illness itself may confer risk. Copyright © 2016 Elsevier Inc. All rights reserved.
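    The building block behind the reported estimates is a relative risk with a 95% confidence interval; the sketch below computes an unadjusted version from a 2x2 table. The counts are invented (chosen so that the ratio comes out near 1.2), and the published estimates were additionally adjusted for maternal characteristics and obstetric complications.

    import math

    def relative_risk(a, n1, b, n0, z=1.96):
        """Unadjusted relative risk: a/n1 = risk in exposed, b/n0 = risk in unexposed."""
        rr = (a / n1) / (b / n0)
        se_log = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n0)   # SE of log(RR)
        lo, hi = (rr * math.exp(s * z * se_log) for s in (-1, 1))
        return rr, lo, hi

    # Illustrative counts: 620 preterm births among 6198 exposed, 2070 among 24792 unexposed.
    print(relative_risk(620, 6198, 2070, 24792))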

  8. The specific purpose Monte Carlo code McENL for simulating the response of epithermal neutron lifetime well logging tools

    NASA Astrophysics Data System (ADS)

    Prettyman, T. H.; Gardner, R. P.; Verghese, K.

    1993-08-01

    A new specific purpose Monte Carlo code called McENL for modeling the time response of epithermal neutron lifetime tools is described. The weight windows technique, employing splitting and Russian roulette, is used with an automated importance function based on the solution of an adjoint diffusion model to improve the code efficiency. Complete composition and density correlated sampling is also included in the code, and can be used to study the effect on tool response of small variations in the formation, borehole, or logging tool composition and density. An illustration of the latter application is given for the density of a thermal neutron filter. McENL was benchmarked against test-pit data for the Mobil pulsed neutron porosity tool and was found to be very accurate. Results of the experimental validation and details of code performance are presented.
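    A generic illustration of the weight-window game (splitting plus Russian roulette) referred to above: particles whose statistical weight leaves the window are split or killed in a way that conserves the expected weight. The window parameters are arbitrary and unrelated to McENL's automated importance function.

    import random

    def apply_weight_window(weight, w_low=0.5, w_high=2.0, w_survive=1.0):
        """Return a list of particle weights after the weight-window game."""
        if weight > w_high:                       # split into n copies of equal weight
            n = int(weight / w_survive + 0.5)
            return [weight / n] * n
        if weight < w_low:                        # Russian roulette
            if random.random() < weight / w_survive:
                return [w_survive]                # survivor carries the surviving weight
            return []                             # killed
        return [weight]                           # inside the window: unchanged

    random.seed(0)
    # The expected total weight is conserved: check with many low-weight particles.
    trials = 100000
    total = sum(sum(apply_weight_window(0.1)) for _ in range(trials))
    print(f"mean surviving weight per 0.1-weight particle: {total / trials:.3f} (expect 0.1)")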

  9. The Diagnostic Potential of Fe Lines Applied to Protostellar Jets

    NASA Astrophysics Data System (ADS)

    Giannini, T.; Nisini, B.; Antoniucci, S.; Alcalá, J. M.; Bacciotti, F.; Bonito, R.; Podio, L.; Stelzer, B.; Whelan, E. T.

    2013-11-01

    We investigate the diagnostic capabilities of iron lines for tracing the physical conditions of shock-excited gas in jets driven by pre-main sequence stars. We have analyzed the 3000-25000 Å X-shooter spectra of two jets driven by the pre-main sequence stars ESO-Hα 574 and Par-Lup 3-4. Both spectra are very rich in [Fe II] lines over the whole spectral range; in addition, lines from [Fe III] are detected in the ESO-Hα 574 spectrum. Non-local thermal equilibrium codes solving the equations of statistical equilibrium, along with codes for the ionization equilibrium, are used to derive the gas excitation conditions of electron temperature and density and fractional ionization. An estimate of the iron gas-phase abundance is provided by comparing the iron line emissivity with that of neutral oxygen at 6300 Å. The [Fe II] line analysis indicates that the jet driven by ESO-Hα 574 is, on average, colder (T_e ~ 9000 K), less dense (n_e ~ 2 × 10⁴ cm⁻³), and more ionized (x_e ~ 0.7) than the Par-Lup 3-4 jet (T_e ~ 13,000 K, n_e ~ 6 × 10⁴ cm⁻³, x_e < 0.4), even if the existence of a higher density component (n_e ~ 2 × 10⁵ cm⁻³) is probed by the [Fe III] and [Fe II] ultraviolet lines. The physical conditions derived from the iron lines are compared with shock models, suggesting that the shock at work in ESO-Hα 574 is faster and likely more energetic than the Par-Lup 3-4 shock. This latter feature is confirmed by the high percentage of gas-phase iron measured in ESO-Hα 574 (50%-60% of its solar abundance, in comparison with less than 30% in Par-Lup 3-4), which testifies that the ESO-Hα 574 shock is powerful enough to partially destroy the dust present inside the jet. This work demonstrates that a multiline Fe analysis can be effectively used to probe the excitation and ionization conditions of the gas in a jet without any assumption on ionic abundances. The main limitation on the diagnostics resides in the large uncertainties of the atomic data, which, however, can be overcome through a statistical approach involving many lines. Based on observations collected with X-shooter at the Very Large Telescope on Cerro Paranal (Chile), operated by the European Southern Observatory (ESO). Program ID: 085.C-0238(A).

  10. ABSORPTION MEASURE DISTRIBUTION IN Mrk 509

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adhikari, T. P.; Różańska, A.; Sobolewska, M.

    2015-12-20

    In this paper we model the observed absorption measure distribution (AMD) in Mrk 509, which spans three orders of magnitude in ionization level, with a single-zone absorber in pressure equilibrium. The AMD is usually constructed from observations of narrow absorption lines in radio-quiet active galaxies with warm absorbers. We study the properties of the warm absorber in Mrk 509 using the recently published broadband spectral energy distribution observed with different instruments. This spectrum is an input to radiative transfer computations with full photoionization treatment using the titan code. We show that the simplest way to fully reproduce the shape of the AMD is to assume that the warm absorber is a single zone under constant total pressure. With this assumption, we found a theoretical AMD that matches the observed AMD determined from the 600 ks reflection grating spectrometer XMM-Newton spectrum of Mrk 509. The softness of the source spectrum and the important role of the free-free emission break the usual degeneracy in the ionization state calculations, and the explicit dependence of the depths of the AMD dips on density opens a new path to density diagnostics for the warm absorber. In Mrk 509, the implied density is of the order of 10⁸ cm⁻³.

  11. A new dump system design for stray light reduction of Thomson scattering diagnostic system on EAST.

    PubMed

    Xiao, Shumei; Zang, Qing; Han, Xiaofeng; Wang, Tengfei; Yu, Jin; Zhao, Junyu

    2016-07-01

    Thomson scattering (TS) is an important diagnostic for measuring electron temperature and density during plasma discharges. However, the measured TS signal is easily disturbed by stray light. The stray light sources in the Experimental Advanced Superconducting Tokamak (EAST) TS diagnostic system were analyzed with a simulation model of the diagnostic system, and the simulation results show that the dump system is the primary stray light source. Based on optics theory and the simulation analysis, a novel dump system including an improved beam trap was proposed and installed. The measurement results indicate that the new dump system reduces the stray light in the diagnostic system by more than 60%, and the influence of stray light on the error of the measured density decreases accordingly.

  12. Visualization and analysis of pulsed ion beam energy density profile with infrared imaging

    NASA Astrophysics Data System (ADS)

    Isakova, Y. I.; Pushkarev, A. I.

    2018-03-01

    An infrared imaging technique was used as a surface temperature-mapping tool to characterize the energy density distribution of intense pulsed ion beams on a thin metal target. The technique enables measurement of the total ion beam energy and of the energy density distribution over the beam cross section, and allows the operation of an ion diode to be optimized and the target irradiation mode to be controlled. The diagnostic was tested on the TEMP-4M accelerator at TPU, Tomsk, Russia and on the TEMP-6 accelerator at DUT, Dalian, China, and was applied in studies of the dynamics of target cooling in vacuum after irradiation and in experiments with target ablation. Errors caused by target ablation and target cooling during the measurement have been analyzed. For Fluke Ti10 and Fluke Ti400 infrared cameras, the technique achieves a surface energy density sensitivity of 0.05 J/cm² and a spatial resolution of 1-2 mm. The thermal imaging diagnostic does not require expensive consumable materials, and the measurement time does not exceed 0.1 s; therefore, it can be used for prompt evaluation of the energy density distribution of a pulsed ion beam and for automation of the irradiation process.
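    The basic relation behind this diagnostic, for a thin target heated uniformly through its thickness, is E = ρ c_p d ΔT, i.e. the deposited surface energy density follows directly from the measured temperature rise. The sketch below evaluates it for an assumed thin stainless-steel foil; the material data, thickness and temperature rise are illustrative, not values from the TEMP-4M or TEMP-6 experiments.

    # Assumed thin-foil parameters (illustrative only)
    rho = 7.9e3        # density [kg/m^3]
    c_p = 500.0        # specific heat [J/(kg K)]
    d = 100e-6         # foil thickness [m]
    dT = 120.0         # measured temperature rise [K]

    energy_density = rho * c_p * d * dT          # deposited energy per unit area [J/m^2]
    print(f"surface energy density: {energy_density / 1e4:.2f} J/cm^2")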

  13. Low-Density Parity-Check Code Design Techniques to Simplify Encoding

    NASA Astrophysics Data System (ADS)

    Perez, J. M.; Andrews, K.

    2007-11-01

    This work describes a method for encoding low-density parity-check (LDPC) codes based on the accumulate-repeat-4-jagged-accumulate (AR4JA) scheme, using the low-density parity-check matrix H instead of the dense generator matrix G. Using the H matrix for encoding allows a significant reduction in memory consumption and gives the encoder design great flexibility. Also described are new hardware-efficient codes, based on the same kind of protographs, which require less memory storage and area while also reducing the encoding delay.
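    The sketch below illustrates the general idea of encoding from the parity-check matrix rather than a dense generator: split H = [Hs | Hp] into message and parity columns and solve Hp p = Hs s over GF(2) for the parity bits. A tiny dense example with generic Gaussian elimination is used; the AR4JA hardware described in the paper instead exploits the structure of Hp, which is not reproduced here.

    import numpy as np

    def gf2_solve(A: np.ndarray, b: np.ndarray) -> np.ndarray:
        """Solve A x = b over GF(2) by Gauss-Jordan elimination (A square, invertible)."""
        A = A.copy() % 2
        b = b.copy() % 2
        n = A.shape[0]
        for col in range(n):
            pivot = col + np.argmax(A[col:, col])    # first row at or below `col` with a 1
            A[[col, pivot]] = A[[pivot, col]]
            b[[col, pivot]] = b[[pivot, col]]
            for row in range(n):
                if row != col and A[row, col]:
                    A[row] ^= A[col]
                    b[row] ^= b[col]
        return b

    # Small example parity-check matrix: 3 checks, 6 bits (3 message + 3 parity).
    H = np.array([[1, 0, 1, 1, 1, 1],
                  [0, 1, 1, 0, 1, 1],
                  [1, 1, 0, 0, 0, 1]], dtype=np.uint8)
    k = 3
    s = np.array([1, 0, 1], dtype=np.uint8)          # message bits
    Hs, Hp = H[:, :k], H[:, k:]
    p = gf2_solve(Hp, (Hs @ s) % 2)                  # parity bits
    codeword = np.concatenate([s, p])
    assert not np.any(H @ codeword % 2), "H * c must be zero"
    print("codeword:", codeword)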

  14. Time-dependent analysis of visible helium line-ratios for electron temperature and density diagnostic using synthetic simulations on NSTX-U

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muñoz Burgos, J. M.; Barbui, T.; Schmitz, O.

    Helium line-ratios for electron temperature (T_e) and density (n_e) plasma diagnostic in the Scrape-Off-Layer (SOL) and edge regions of tokamaks are widely used. Due to their intensities and proximity of wavelengths, the singlet 667.8 and 728.1 nm, and triplet 706.5 nm visible lines have been typically preferred. Time-dependency of the triplet line (706.5 nm) has been previously analyzed in detail by including transient effects on line-ratios during gas-puff diagnostic applications. In this work, several line-ratio combinations within each of the two spin systems are analyzed with the purpose of eliminating transient effects to extend the application of this powerful diagnostic to high temporal resolution characterization of plasmas. The analysis is done using synthetic emission modeling and diagnostic for low electron density NSTX SOL plasma conditions by several visible lines. Quasi-static equilibrium, and time-dependent models are employed to evaluate transient effects of the atomic population levels that may affect the derived electron temperatures and densities as the helium gas-puff penetrates the plasma. Ultimately, the analysis of a wider range of spectral lines will help to extend this powerful diagnostic to experiments where the wavelength range of the measured spectra may be constrained either by limitations of the spectrometer, or by other conflicting lines from different ions.

  15. Time-dependent analysis of visible helium line-ratios for electron temperature and density diagnostic using synthetic simulations on NSTX-U

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muñoz Burgos, J. M., E-mail: jmunozbu@pppl.gov; Stutman, D.; Tritz, K.

    Helium line-ratios for electron temperature (T_e) and density (n_e) plasma diagnostic in the Scrape-Off-Layer (SOL) and edge regions of tokamaks are widely used. Due to their intensities and proximity of wavelengths, the singlet, 667.8 and 728.1 nm, and triplet, 706.5 nm, visible lines have been typically preferred. Time-dependency of the triplet line (706.5 nm) has been previously analyzed in detail by including transient effects on line-ratios during gas-puff diagnostic applications. In this work, several line-ratio combinations within each of the two spin systems are analyzed with the purpose of eliminating transient effects to extend the application of this powerful diagnostic to high temporal resolution characterization of plasmas. The analysis is done using synthetic emission modeling and diagnostic for low electron density NSTX SOL plasma conditions by several visible lines. Quasi-static equilibrium and time-dependent models are employed to evaluate transient effects of the atomic population levels that may affect the derived electron temperatures and densities as the helium gas-puff penetrates the plasma. The analysis of a wider range of spectral lines will help to extend this powerful diagnostic to experiments where the wavelength range of the measured spectra may be constrained either by limitations of the spectrometer or by other conflicting lines from different ions.

  16. Time-dependent analysis of visible helium line-ratios for electron temperature and density diagnostic using synthetic simulations on NSTX-U

    DOE PAGES

    Muñoz Burgos, J. M.; Barbui, T.; Schmitz, O.; ...

    2016-07-11

    Helium line-ratios for electron temperature (T_e) and density (n_e) plasma diagnostic in the Scrape-Off-Layer (SOL) and edge regions of tokamaks are widely used. Due to their intensities and proximity of wavelengths, the singlet 667.8 and 728.1 nm, and triplet 706.5 nm visible lines have been typically preferred. Time-dependency of the triplet line (706.5 nm) has been previously analyzed in detail by including transient effects on line-ratios during gas-puff diagnostic applications. In this work, several line-ratio combinations within each of the two spin systems are analyzed with the purpose of eliminating transient effects to extend the application of this powerful diagnostic to high temporal resolution characterization of plasmas. The analysis is done using synthetic emission modeling and diagnostic for low electron density NSTX SOL plasma conditions by several visible lines. Quasi-static equilibrium, and time-dependent models are employed to evaluate transient effects of the atomic population levels that may affect the derived electron temperatures and densities as the helium gas-puff penetrates the plasma. Ultimately, the analysis of a wider range of spectral lines will help to extend this powerful diagnostic to experiments where the wavelength range of the measured spectra may be constrained either by limitations of the spectrometer, or by other conflicting lines from different ions.

  17. Community-based benchmarking of the CMIP DECK experiments

    NASA Astrophysics Data System (ADS)

    Gleckler, P. J.

    2015-12-01

    A diversity of community-based efforts are independently developing "diagnostic packages" with little or no coordination between them. A short list of examples include NCAR's Climate Variability Diagnostics Package (CVDP), ORNL's International Land Model Benchmarking (ILAMB), LBNL's Toolkit for Extreme Climate Analysis (TECA), PCMDI's Metrics Package (PMP), the EU EMBRACE ESMValTool, the WGNE MJO diagnostics package, and CFMIP diagnostics. The full value of these efforts cannot be realized without some coordination. As a first step, a WCRP effort has initiated a catalog to document candidate packages that could potentially be applied in a "repeat-use" fashion to all simulations contributed to the CMIP DECK (Diagnostic, Evaluation and Characterization of Klima) experiments. Some coordination of community-based diagnostics has the additional potential to improve how CMIP modeling groups analyze their simulations during model development. The fact that most modeling groups now maintain a "CMIP compliant" data stream means that in principle, without much effort, they could readily adopt a set of well organized diagnostic capabilities specifically designed to operate on CMIP DECK experiments. Ultimately, a detailed listing of and access to analysis codes that are demonstrated to work "out of the box" with CMIP data could enable model developers (and others) to select those codes they wish to implement in-house, potentially enabling more systematic evaluation during the model development process.

  18. Diagnostic x-ray dosimetry using Monte Carlo simulation.

    PubMed

    Ioppolo, J L; Price, R I; Tuchyna, T; Buckley, C E

    2002-05-21

    An Electron Gamma Shower version 4 (EGS4) based user code was developed to simulate the absorbed dose in humans during routine diagnostic radiological procedures. Measurements of absorbed dose using thermoluminescent dosimeters (TLDs) were compared directly with EGS4 simulations of absorbed dose in homogeneous, heterogeneous and anthropomorphic phantoms. Realistic voxel-based models characterizing the geometry of the phantoms were used as input to the EGS4 code. The voxel geometry of the anthropomorphic Rando phantom was derived from a CT scan of Rando. The 100 kVp diagnostic energy x-ray spectra of the apparatus used to irradiate the phantoms were measured, and provided as input to the EGS4 code. The TLDs were placed at evenly spaced points symmetrically about the central beam axis, which was perpendicular to the cathode-anode x-ray axis at a number of depths. The TLD measurements in the homogeneous and heterogeneous phantoms were on average within 7% of the values calculated by EGS4. Estimates of effective dose with errors less than 10% required fewer photon histories (1 × 10⁷) than were required for the calculation of dose profiles (1 × 10⁹). The EGS4 code was able to satisfactorily predict and thereby provide an instrument for reducing patient and staff effective dose imparted during radiological investigations.

  19. Diagnostic x-ray dosimetry using Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Ioppolo, J. L.; Price, R. I.; Tuchyna, T.; Buckley, C. E.

    2002-05-01

    An Electron Gamma Shower version 4 (EGS4) based user code was developed to simulate the absorbed dose in humans during routine diagnostic radiological procedures. Measurements of absorbed dose using thermoluminescent dosimeters (TLDs) were compared directly with EGS4 simulations of absorbed dose in homogeneous, heterogeneous and anthropomorphic phantoms. Realistic voxel-based models characterizing the geometry of the phantoms were used as input to the EGS4 code. The voxel geometry of the anthropomorphic Rando phantom was derived from a CT scan of Rando. The 100 kVp diagnostic energy x-ray spectra of the apparatus used to irradiate the phantoms were measured, and provided as input to the EGS4 code. The TLDs were placed at evenly spaced points symmetrically about the central beam axis, which was perpendicular to the cathode-anode x-ray axis at a number of depths. The TLD measurements in the homogeneous and heterogeneous phantoms were on average within 7% of the values calculated by EGS4. Estimates of effective dose with errors less than 10% required fewer photon histories (1 × 10⁷) than were required for the calculation of dose profiles (1 × 10⁹). The EGS4 code was able to satisfactorily predict and thereby provide an instrument for reducing patient and staff effective dose imparted during radiological investigations.

  20. The rotating movement of three immiscible fluids - A benchmark problem

    USGS Publications Warehouse

    Bakker, M.; Oude, Essink G.H.P.; Langevin, C.D.

    2004-01-01

    A benchmark problem involving the rotating movement of three immiscible fluids is proposed for verifying the density-dependent flow component of groundwater flow codes. The problem consists of a two-dimensional strip in the vertical plane filled with three fluids of different densities separated by interfaces. Initially, the interfaces between the fluids make a 45° angle with the horizontal. Over time, the fluids rotate to the stable position whereby the interfaces are horizontal; all flow is caused by density differences. Two cases of the problem are presented, one resulting in a symmetric flow field and one resulting in an asymmetric flow field. An exact analytical solution for the initial flow field is presented by application of the vortex theory and complex variables. Numerical results are obtained using three variable-density groundwater flow codes (SWI, MOCDENS3D, and SEAWAT). Initial horizontal velocities of the interfaces, as simulated by the three codes, compare well with the exact solution. The three codes are used to simulate the positions of the interfaces at two times; the three codes produce nearly identical results. The agreement between the results is evidence that the specific rotational behavior predicted by the models is correct. It also shows that the proposed problem may be used to benchmark variable-density codes. It is concluded that the three models can be used to model accurately the movement of interfaces between immiscible fluids, and have little or no numerical dispersion. © 2003 Elsevier B.V. All rights reserved.

  1. A study of tungsten spectra using large helical device and compact electron beam ion trap in NIFS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morita, S.; Goto, M.; Murakami, I.

    2013-07-11

    Tungsten spectra have been observed from the Large Helical Device (LHD) and the Compact electron Beam Ion Trap (CoBIT) in wavelength ranges from the visible to the EUV. The EUV spectra with unresolved transition arrays (UTA), e.g., the 6g-4f, 5g-4f, 5f-4d and 5p-4d transitions of W²⁴⁺ to W³³⁺, measured from LHD plasmas are compared with those measured from CoBIT with a monoenergetic electron beam (≤2 keV). The tungsten spectra from LHD are well analyzed based on the knowledge from the CoBIT tungsten spectra. A collisional-radiative (C-R) model code has been developed to explain the UTA spectra in detail. Radial profiles of EUV spectra from highly ionized tungsten ions have been measured and analyzed by an impurity transport simulation code with the ADPAK atomic database code to examine the ionization balance determined by ionization and recombination rate coefficients. As a first trial, analysis of the tungsten density in LHD plasmas is attempted from the radial profile of the Zn-like W XLV (W⁴⁴⁺) 4p-4s transition at 60.9 Å based on the emission rate coefficient calculated with the HULLAC code. As a result, a total tungsten ion density of 3.5 × 10¹⁰ cm⁻³ at the plasma center is reasonably obtained. In order to observe the spectra from tungsten ions in lower-ionized charge stages, which can give useful information on the tungsten influx in fusion plasmas, the ablation cloud of an impurity pellet is directly measured with visible spectroscopy. Many spectra from neutral and singly ionized tungsten are observed and some of them are identified. A magnetic forbidden line from highly ionized tungsten ions has been examined, and the Cd-like W XXVII (W²⁶⁺) line at 3893.7 Å is identified as the ground-term fine-structure transition 4f² ³H₅-³H₄. The possibility of α-particle diagnostics in D-T burning plasmas using the magnetic forbidden line is discussed.
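    The density estimate quoted above rests on the standard collisional-radiative relation ε = n_e n_Z PEC(T_e, n_e) between a line's local emissivity, the electron density and the photon emissivity coefficient, so the ion density follows by division. The numbers in the sketch below are illustrative placeholders, not LHD measurements or HULLAC coefficients.

    # Infer an impurity ion density from a measured line emissivity (all values assumed).
    eps_line = 1.0e13      # measured line emissivity [photons cm^-3 s^-1]
    n_e = 2.0e13           # electron density [cm^-3]
    pec = 1.5e-11          # photon emissivity coefficient [photons cm^3 s^-1]

    n_w44 = eps_line / (n_e * pec)
    print(f"inferred W44+ density: {n_w44:.2e} cm^-3")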

  2. Statistical mechanics of broadcast channels using low-density parity-check codes.

    PubMed

    Nakamura, Kazutaka; Kabashima, Yoshiyuki; Morelos-Zaragoza, Robert; Saad, David

    2003-03-01

    We investigate the use of Gallager's low-density parity-check (LDPC) codes in a degraded broadcast channel, one of the fundamental models in network information theory. Combining linear codes is a standard technique in practical network communication schemes and is known to provide better performance than simple time sharing methods when algebraic codes are used. The statistical physics based analysis shows that the practical performance of the suggested method, achieved by employing the belief propagation algorithm, is superior to that of LDPC based time sharing codes while the best performance, when received transmissions are optimally decoded, is bounded by the time sharing limit.
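
    As a minimal illustration of the LDPC ingredient discussed above, the sketch below (Python with NumPy assumed) builds a tiny parity-check matrix H and applies the syndrome test H·c = 0 (mod 2) that any valid codeword must satisfy. The matrix and codeword are illustrative toys, not codes from the paper, and the belief-propagation decoder itself is not shown.

        # Toy LDPC-style parity check: a sparse H and the syndrome test a codeword must pass.
        import numpy as np

        H = np.array([[1, 1, 0, 1, 0, 0],
                      [0, 1, 1, 0, 1, 0],
                      [1, 0, 0, 0, 1, 1]])   # hypothetical 3x6 parity-check matrix

        codeword = np.array([1, 0, 1, 1, 1, 0])
        syndrome = H @ codeword % 2           # all-zero syndrome means every check is satisfied
        print("valid codeword:", not syndrome.any())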

  3. Determination of current and rotational transform profiles in a current-carrying stellarator using soft x-ray emissivity measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, X.; Cianciosa, M. R.; Ennis, D. A.

    In this research, collimated soft X-ray (SXR) emissivity measurements from multi-channel cameras on the Compact Toroidal Hybrid (CTH) tokamak/torsatron device are incorporated in the 3D equilibrium reconstruction code V3FIT to reconstruct the shape of flux surfaces and infer the current distribution within the plasma. Equilibrium reconstructions of sawtoothing plasmas that use data from both SXR and external magnetic diagnostics show the central safety factor to be near unity under the assumption that SXR iso-emissivity contours lie on magnetic flux surfaces. The reconstruction results are consistent with those using the external magnetic data and a constraint on the location of q = 1 surfaces determined from the sawtooth inversion surface extracted from SXR brightness profiles. The agreement justifies the use of approximating SXR emission as a flux function in CTH, at least within the core of the plasma, subject to the spatial resolution of the SXR diagnostics. Lastly, this improved reconstruction of the central current density indicates that the current profile peakedness decreases with increasing external transform and that the internal inductance is not a relevant measure of how peaked the current profile is in hybrid discharges.

  4. Determination of current and rotational transform profiles in a current-carrying stellarator using soft x-ray emissivity measurements

    NASA Astrophysics Data System (ADS)

    Ma, X.; Cianciosa, M. R.; Ennis, D. A.; Hanson, J. D.; Hartwell, G. J.; Herfindal, J. L.; Howell, E. C.; Knowlton, S. F.; Maurer, D. A.; Traverso, P. J.

    2018-01-01

    Collimated soft X-ray (SXR) emissivity measurements from multi-channel cameras on the Compact Toroidal Hybrid (CTH) tokamak/torsatron device are incorporated in the 3D equilibrium reconstruction code V3FIT to reconstruct the shape of flux surfaces and infer the current distribution within the plasma. Equilibrium reconstructions of sawtoothing plasmas that use data from both SXR and external magnetic diagnostics show the central safety factor to be near unity under the assumption that SXR iso-emissivity contours lie on magnetic flux surfaces. The reconstruction results are consistent with those using the external magnetic data and a constraint on the location of q = 1 surfaces determined from the sawtooth inversion surface extracted from SXR brightness profiles. The agreement justifies the use of approximating SXR emission as a flux function in CTH, at least within the core of the plasma, subject to the spatial resolution of the SXR diagnostics. This improved reconstruction of the central current density indicates that the current profile peakedness decreases with increasing external transform and that the internal inductance is not a relevant measure of how peaked the current profile is in hybrid discharges.

  5. Vanadium fine-structure K-shell electron impact ionization cross sections for fast-electron diagnostic in laser–solid experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmeri, P., E-mail: patrick.palmeri@umons.ac.be; Quinet, P., E-mail: pascal.quinet@umons.ac.be; IPNAS, Université de Liège, B-4000 Liège

    2015-09-15

    The K-shell electron impact ionization (EII) cross section, along with the K-shell fluorescence yield, is one of the key atomic parameters for fast-electron diagnostic in laser–solid experiments through the K-shell emission cross section. In addition, in a campaign dedicated to the modeling of the K lines of astrophysical interest (Palmeri et al., 2012), the K-shell fluorescence yields for the K-vacancy fine-structure atomic levels of all the vanadium isonuclear ions have been calculated. In this study, the K-shell EII cross sections connecting the ground and metastable levels of the parent vanadium ions to the daughter-ion K-vacancy levels considered in Palmeri et al. (2012) have been determined. The relativistic distorted-wave (DW) approximation implemented in the FAC atomic code has been used for incident electron kinetic energies up to 20 times the K-shell threshold energies. Moreover, the resulting DW cross sections have been extrapolated at higher energies using the asymptotic behavior of the modified relativistic binary encounter Bethe model (MRBEB) of Guerra et al. (2012) with the density-effect correction proposed by Davies et al. (2013).

  6. Determination of current and rotational transform profiles in a current-carrying stellarator using soft x-ray emissivity measurements

    DOE PAGES

    Ma, X.; Cianciosa, M. R.; Ennis, D. A.; ...

    2018-01-31

    In this research, collimated soft X-ray (SXR) emissivity measurements from multi-channel cameras on the Compact Toroidal Hybrid (CTH) tokamak/torsatron device are incorporated in the 3D equilibrium reconstruction code V3FIT to reconstruct the shape of flux surfaces and infer the current distribution within the plasma. Equilibrium reconstructions of sawtoothing plasmas that use data from both SXR and external magnetic diagnostics show the central safety factor to be near unity under the assumption that SXR iso-emissivity contours lie on magnetic flux surfaces. The reconstruction results are consistent with those using the external magnetic data and a constraint on the location of q = 1 surfaces determined from the sawtooth inversion surface extracted from SXR brightness profiles. The agreement justifies the use of approximating SXR emission as a flux function in CTH, at least within the core of the plasma, subject to the spatial resolution of the SXR diagnostics. Lastly, this improved reconstruction of the central current density indicates that the current profile peakedness decreases with increasing external transform and that the internal inductance is not a relevant measure of how peaked the current profile is in hybrid discharges.

  7. Modeling experimental plasma diagnostics in the FLASH code: Thomson scattering

    NASA Astrophysics Data System (ADS)

    Weide, Klaus; Flocke, Norbert; Feister, Scott; Tzeferacos, Petros; Lamb, Donald

    2017-10-01

    Spectral analysis of the Thomson scattering of laser light sent into a plasma provides an experimental method to quantify plasma properties in laser-driven plasma experiments. We have implemented such a synthetic Thomson scattering diagnostic unit in the FLASH code, to emulate the probe-laser propagation, scattering and spectral detection. User-defined laser rays propagate into the FLASH simulation region and experience scattering (change in direction and frequency) based on plasma parameters. After scattering, the rays propagate out of the interaction region and are spectrally characterized. The diagnostic unit can be used either during a physics simulation or in post-processing of simulation results. FLASH is publicly available at flash.uchicago.edu. U.S. DOE NNSA, U.S. DOE NNSA ASC, U.S. DOE Office of Science and NSF.

  8. A comparison of different methods to implement higher order derivatives of density functionals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    van Dam, Hubertus J.J.

    Density functional theory is the dominant approach in electronic structure methods today. To calculate properties, higher order derivatives of the density functionals are required. These derivatives might be implemented manually, by automatic differentiation, or by symbolic algebra programs. Different authors have cited different reasons for using the particular method of their choice. This paper presents work in which all three approaches were used, and the strengths and weaknesses of each approach are considered. It is found that all three methods produce code that is sufficiently performant for practical applications, despite the fact that our symbolic-algebra-generated code and our automatic differentiation code still have scope for significant optimization. The automatic differentiation approach is the best option for producing readable and maintainable code.

  9. Linear calculations of edge current driven kink modes with BOUT++ code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, G. Q., E-mail: ligq@ipp.ac.cn; Xia, T. Y.; Lawrence Livermore National Laboratory, Livermore, California 94550

    This work extends previous BOUT++ work to systematically study the impact of edge current density on edge localized modes, and to benchmark with the GATO and ELITE codes. Using the CORSICA code, a set of equilibria was generated with different edge current densities by keeping total current and pressure profile fixed. Based on these equilibria, the effects of the edge current density on the MHD instabilities were studied with the 3-field BOUT++ code. For the linear calculations, with increasing edge current density, the dominant modes are changed from intermediate-n and high-n ballooning modes to low-n kink modes, and the linear growth rate becomes smaller. The edge current provides stabilizing effects on ballooning modes due to the increase of local shear at the outer mid-plane with the edge current. For edge kink modes, however, the edge current does not always provide a destabilizing effect; with increasing edge current, the linear growth rate first increases, and then decreases. In benchmark calculations for BOUT++ against the linear results with the GATO and ELITE codes, the vacuum model has important effects on the edge kink mode calculations. By setting a realistic density profile and Spitzer resistivity profile in the vacuum region, the resistivity was found to have a destabilizing effect on both the kink mode and on the ballooning mode. With diamagnetic effects included, the intermediate-n and high-n ballooning modes can be totally stabilized for finite edge current density.

  10. ORBIT: A Code for Collective Beam Dynamics in High-Intensity Rings

    NASA Astrophysics Data System (ADS)

    Holmes, J. A.; Danilov, V.; Galambos, J.; Shishlo, A.; Cousineau, S.; Chou, W.; Michelotti, L.; Ostiguy, J.-F.; Wei, J.

    2002-12-01

    We are developing a computer code, ORBIT, specifically for beam dynamics calculations in high-intensity rings. Our approach allows detailed simulation of realistic accelerator problems. ORBIT is a particle-in-cell tracking code that transports bunches of interacting particles through a series of nodes representing elements, effects, or diagnostics that occur in the accelerator lattice. At present, ORBIT contains detailed models for strip-foil injection, including painting and foil scattering; rf focusing and acceleration; transport through various magnetic elements; longitudinal and transverse impedances; longitudinal, transverse, and three-dimensional space charge forces; collimation and limiting apertures; and the calculation of many useful diagnostic quantities. ORBIT is an object-oriented code, written in C++ and utilizing a scripting interface for the convenience of the user. Ongoing improvements include the addition of a library of accelerator maps, BEAMLINE/MXYZPTLK; the introduction of a treatment of magnet errors and fringe fields; the conversion of the scripting interface to the standard scripting language, Python; and the parallelization of the computations using MPI. The ORBIT code is an open source, powerful, and convenient tool for studying beam dynamics in high-intensity rings.

  11. Density diagnostics of ionized outflows in active galactic nuclei. X-ray and UV absorption lines from metastable levels in Be-like to C-like ions

    NASA Astrophysics Data System (ADS)

    Mao, Junjie; Kaastra, J. S.; Mehdipour, M.; Raassen, A. J. J.; Gu, Liyi; Miller, J. M.

    2017-11-01

    Context. Ionized outflows in active galactic nuclei (AGNs) are thought to influence their nuclear and local galactic environment. However, the distance of the outflows with respect to the central engine is poorly constrained, which limits our understanding of their kinetic power as a cosmic feedback channel. Therefore, the impact of AGN outflows on their host galaxies is uncertain. However, when the density of the outflows is known, their distance can be immediately obtained from their modeled ionization parameters. Aims: We perform a theoretical study of density diagnostics of ionized outflows using absorption lines from metastable levels in Be-like to C-like cosmic abundant ions. Methods: With the new self-consistent PhotoIONization (PION) model in the SPEX code, we are able to calculate detailed level populations, including the ground and metastable levels. This enables us to determine under what physical conditions the metastable levels are significantly populated. We then identify characteristic lines from these metastable levels in the 1-2000 Å wavelength range. Results: In the broad density range of nH ∈ (106, 1020) m-3, the metastable levels 2s2p (3P0-2) in Be-like ions can be significantly populated. For B-like ions, merely the first excited level 2s22p (2P3/2) can be used as a density probe. For C-like ions, the first two excited levels 2s22p2 (3P1 and 3P2) are better density probes than the next two excited levels 2s22p2 (1S0 and 1D2). Different ions in the same isoelectronic sequence cover not only a wide range of ionization parameters, but also a wide range of density values. On the other hand, within the same isonuclear sequence, those less ionized ions probe lower density and smaller ionization parameters. Finally, we reanalyzed the high-resolution grating spectra of NGC 5548 observed with Chandra in January 2002 using a set of PION components to account for the ionized outflow. We derive lower (or upper) limits of plasma density in five out of six PION components based on the presence (or absence) of the metastable absorption lines. Once atomic data from N-like to F-like ions are available, combined with the next generation of spectrometers that cover both X-ray and UV wavelength ranges with higher spectral resolution and larger effective areas, tight constraints on the density and thus the location and kinetic power of AGN outflows can be obtained.

  12. A new dump system design for stray light reduction of Thomson scattering diagnostic system on EAST

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiao, Shumei; Zang, Qing, E-mail: zangq@ipp.ac.cn; Han, Xiaofeng

    The Thomson scattering (TS) diagnostic is an important diagnostic for measuring electron temperature and density during plasma discharge. However, the measurement of the Thomson scattering signal is easily disturbed by stray light. The stray light sources in the Experimental Advanced Superconducting Tokamak (EAST) TS diagnostic system were analyzed by a simulation model of the diagnostic system, and the simulation results show that the dump system is the primary stray light source. Based on optics theory and the simulation analysis, a novel dump system including an improved beam trap was proposed and installed. The measurement results indicate that the new dump system can reduce more than 60% of the stray light for the diagnostic system, and the influence of stray light on the error of the measured density decreases.

  13. Overview of HIT-SI3 experiment: Simulations, Diagnostics, and Summary of Current Results

    NASA Astrophysics Data System (ADS)

    Penna, James; Jarboe, Thomas; Nelson, Brian; Hossack, Aaron; Sutherland, Derek; Morgan, Kyle; Hansen, Chris; Benedett, Thomas; Everson, Chris; Victor, Brian

    2016-10-01

    The Helicity Injected Torus - Steady Inductive 3 (HIT-SI3) experiment forms and maintains spheromaks via Steady Inductive Helicity Injection (SIHI). Three injector units allow for continuous injection of helicity into a copper flux conserver in order to sustain a spheromak. Firing the injectors with a phase difference allows a finite rotation of the plasma that provides a stabilizing effect. Simulations in the MHD code NIMROD and the fluid-model code PSI-TET provide validation and a basis for interpretation of the observed experimental data. Thomson Scattering (TS) and Far Infrared (FIR) interferometer systems allow temperature and line-averaged density measurements to be taken. An Ion Doppler Spectroscopy (IDS) system allows measurement of the plasma rotation and velocity. HIT-SI3 data have been used for validation of IDCD predictions, in particular the projected impedance of helicity injectors according to the theory. The experimental impedances have been calculated here for the first time for different HIT-SI3 regimes. Such experimental evidence will contribute to the design of future experiments employing IDCD as a current-drive mechanism. Work supported by the D.O.E., Office of Science, Office of Fusion Science.

  14. Numerical studies and metric development for validation of magnetohydrodynamic models on the HIT-SI experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, C., E-mail: hansec@uw.edu; Columbia University, New York, New York 10027; Victor, B.

    We present application of three scalar metrics derived from the Biorthogonal Decomposition (BD) technique to evaluate the level of agreement between macroscopic plasma dynamics in different data sets. BD decomposes large data sets, as produced by distributed diagnostic arrays, into principal mode structures without assumptions on spatial or temporal structure. These metrics have been applied to validation of the Hall-MHD model using experimental data from the Helicity Injected Torus with Steady Inductive helicity injection experiment. Each metric provides a measure of correlation between mode structures extracted from experimental data and simulations for an array of 192 surface-mounted magnetic probes. Numerical validation studies have been performed using the NIMROD code, where the injectors are modeled as boundary conditions on the flux conserver, and the PSI-TET code, where the entire plasma volume is treated. Initial results from a comprehensive validation study of high performance operation with different injector frequencies are presented, illustrating application of the BD method. Using a simplified (constant, uniform density and temperature) Hall-MHD model, simulation results agree with experimental observation for two of the three defined metrics when the injectors are driven with a frequency of 14.5 kHz.

  15. Validity of International Classification of Diseases (ICD) coding for dengue infections in hospital discharge records in Malaysia.

    PubMed

    Woon, Yuan-Liang; Lee, Keng-Yee; Mohd Anuar, Siti Fatimah Zahra; Goh, Pik-Pin; Lim, Teck-Onn

    2018-04-20

    Hospitalization due to dengue illness is an important measure of dengue morbidity. However, few studies are based on administrative databases because the validity of the diagnosis codes is unknown. We validated the International Classification of Diseases, 10th revision (ICD) diagnosis coding for dengue infections in the Malaysian Ministry of Health's (MOH) hospital discharge database. This validation study involved a retrospective review of available hospital discharge records and hand-searched medical records for the years 2010 and 2013. We randomly selected 3219 hospital discharge records coded with dengue and non-dengue infections as their discharge diagnoses from the national hospital discharge database. We then randomly sampled 216 and 144 records for patients with and without codes for dengue, respectively, in keeping with their relative frequency in the MOH database, for chart review. The ICD codes for dengue were validated against a lab-based diagnostic standard (NS1 or IgM). The ICD-10-CM codes for dengue had a sensitivity of 94%, a modest specificity of 83%, a positive predictive value of 87%, and a negative predictive value of 92%. These results were stable between 2010 and 2013. However, specificity decreased substantially when patients manifested with bleeding or a low platelet count. The diagnostic performance of the ICD codes for dengue in the MOH's hospital discharge database is adequate for use in health services research on dengue.
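
    For readers reproducing this kind of validation, the short Python sketch below recomputes sensitivity, specificity, PPV and NPV from a 2x2 confusion matrix. The counts are hypothetical placeholders chosen only to be consistent with the sampled record numbers and approximate metrics quoted above; they are not the study's actual chart-review table.

        def diagnostic_metrics(tp, fp, fn, tn):
            """Return sensitivity, specificity, PPV and NPV for a 2x2 validation table."""
            sensitivity = tp / (tp + fn)   # coded as dengue among lab-confirmed cases
            specificity = tn / (tn + fp)   # not coded among lab-negative cases
            ppv = tp / (tp + fp)           # lab-confirmed among records coded as dengue
            npv = tn / (tn + fn)           # lab-negative among records not coded as dengue
            return sensitivity, specificity, ppv, npv

        # Hypothetical counts (216 coded and 144 non-coded records, matching the sampling above)
        sens, spec, ppv, npv = diagnostic_metrics(tp=188, fp=28, fn=12, tn=132)
        print(f"sensitivity={sens:.2f} specificity={spec:.2f} ppv={ppv:.2f} npv={npv:.2f}")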

  16. Inversion of Zeeman polarization for solar magnetic field diagnostics

    NASA Astrophysics Data System (ADS)

    Derouich, M.

    2017-05-01

    The topic of magnetic field diagnostics with the Zeeman effect is currently vividly discussed. There are some testable inversion codes available to the spectropolarimetry community, and their application has allowed for a better understanding of the magnetism of the solar atmosphere. In this context, we propose an inversion technique associated with a new numerical code. The inversion procedure is promising and particularly successful for interpreting the Stokes profiles in a quick and sufficiently precise way. In our inversion, we fit a part of each Stokes profile around a target wavelength, and then determine the magnetic field as a function of the wavelength, which is equivalent to obtaining the magnetic field as a function of the height of line formation. To test the performance of the new numerical code, we employed a "hare and hound" approach by comparing an exact solution (called input) with the solution obtained by the code (called output). The precision of the code is also checked by comparing our results to the ones obtained with the HAO MERLIN code. The inversion code has been applied to synthetic Stokes profiles of the Na D1 line available in the literature. We investigated the limitations in recovering the input field in the case of noisy data. As an application, we applied our inversion code to the polarization profiles of the Fe I λ 6302.5 Å line observed at IRSOL in Locarno.

  17. Transition to international classification of disease version 10, clinical modification: the impact on internal medicine and internal medicine subspecialties.

    PubMed

    Caskey, Rachel N; Abutahoun, Angelos; Polick, Anne; Barnes, Michelle; Srivastava, Pavan; Boyd, Andrew D

    2018-05-04

    The US health care system uses diagnostic codes for billing and reimbursement as well as for quality assessment and measuring clinical outcomes. The US transitioned to the International Classification of Diseases, 10th Revision, Clinical Modification (ICD-10-CM) in October 2015. Little is known about the impact of ICD-10-CM on internal medicine and medicine subspecialists. We used a state-wide data set from Illinois Medicaid specific to internal medicine providers and subspecialists. A total of 3191 ICD-9-CM codes were used for 51,078 patient encounters, for a total cost of US $26,022,022 for all internal medicine. We categorized all of the ICD-9-CM codes based on the complexity of mapping to ICD-10-CM, as codes with complex mapping could result in billing or administrative errors during the transition. Codes found to have complex mapping and frequently used codes (n = 295) were analyzed for clinical accuracy of mapping to ICD-10-CM. Each subspecialty was analyzed for the complexity of codes used and the proportion of reimbursement associated with complex codes. Twenty-five percent of internal medicine codes have convoluted mapping to ICD-10-CM, representing 22% of Illinois Medicaid patients and 30% of reimbursements. Rheumatology and endocrinology had the greatest proportions of visits and reimbursement associated with complex codes. We found that 14.5% of the ICD-9-CM codes used by internists, when mapped to ICD-10-CM, resulted in potential clinical inaccuracies. We also identified that 43% of the diagnostic codes evaluated and used by internists, accounting for 14% of internal medicine reimbursements, are associated with mappings that could result in administrative errors.
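
    A rough sketch of how ICD-9 codes with complex forward mappings might be flagged is given below; the GEM-style rows are hypothetical placeholders, and the study's definition of convoluted mapping is richer than the simple one-to-many test used here.

        # Flag ICD-9 codes that map to more than one ICD-10-CM target in a GEM-style table.
        from collections import defaultdict

        gem_rows = [                      # (icd9, icd10) pairs -- illustrative, not real GEM content
            ("250.00", "E11.9"),
            ("250.00", "E11.65"),
            ("401.9", "I10"),
            ("719.46", "M25.561"),
            ("719.46", "M25.562"),
            ("719.46", "M25.569"),
        ]

        targets = defaultdict(set)
        for icd9, icd10 in gem_rows:
            targets[icd9].add(icd10)

        for icd9, icd10s in sorted(targets.items()):
            label = "complex" if len(icd10s) > 1 else "simple"
            print(icd9, label, sorted(icd10s))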

  18. Nonlinear wave vacillation in the atmosphere

    NASA Technical Reports Server (NTRS)

    Antar, Basil N.

    1987-01-01

    The problem of vacillation in a baroclinically unstable flow field is studied through the time evolution of a single nonlinearly unstable wave. To this end, a computer code is being developed to solve numerically for the time evolution of the amplitude of such a wave. The final working code will be the end product of the development of a hierarchy of codes with increasing complexity. The first code in this series was completed and is undergoing several diagnostic analyses to verify its validity. The development of this code is detailed.

  19. Assessment of Optical Coherence Tomography Color Probability Codes in Myopic Glaucoma Eyes After Applying a Myopic Normative Database.

    PubMed

    Seol, Bo Ram; Kim, Dong Myung; Park, Ki Ho; Jeoung, Jin Wook

    2017-11-01

    To evaluate the optical coherence tomography (OCT) color probability codes based on a myopic normative database and to investigate whether the implementation of the myopic normative database can improve the OCT diagnostic ability in myopic glaucoma. Comparative validity study. In this study, 305 eyes (154 myopic healthy eyes and 151 myopic glaucoma eyes) were included. A myopic normative database was obtained based on myopic healthy eyes. We evaluated the agreement between OCT color probability codes after applying the built-in and myopic normative databases, respectively. Another 120 eyes (60 myopic healthy eyes and 60 myopic glaucoma eyes) were included and the diagnostic performance of OCT color codes using a myopic normative database was investigated. The mean weighted kappa (Kw) coefficients for quadrant retinal nerve fiber layer (RNFL) thickness, clock-hour RNFL thickness, and ganglion cell-inner plexiform layer (GCIPL) thickness were 0.636, 0.627, and 0.564, respectively. The myopic normative database showed a higher specificity than did the built-in normative database in quadrant RNFL thickness, clock-hour RNFL thickness, and GCIPL thickness (P < .001, P < .001, and P < .001, respectively). The receiver operating characteristic curve values increased when using the myopic normative database in quadrant RNFL thickness, clock-hour RNFL thickness, and GCIPL thickness (P = .011, P = .004, P < .001, respectively). The diagnostic ability of OCT color codes for detection of myopic glaucoma significantly improved after application of the myopic normative database. The implementation of a myopic normative database is needed to allow more precise interpretation of OCT color probability codes when used in myopic eyes. Copyright © 2017 Elsevier Inc. All rights reserved.
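
    The agreement statistic reported above (weighted kappa, Kw) can be recomputed for any pair of color-code classifications with a short script such as the one below; the label vectors are fabricated, with 0/1/2 standing in for the green/yellow/red probability codes, and scikit-learn is assumed to be available.

        # Weighted kappa between color codes from the built-in and myopic normative databases.
        from sklearn.metrics import cohen_kappa_score

        builtin_codes = [0, 0, 1, 2, 2, 0, 1, 1, 2, 0]   # hypothetical per-sector codes
        myopic_codes  = [0, 0, 1, 1, 2, 0, 0, 1, 2, 0]

        kw = cohen_kappa_score(builtin_codes, myopic_codes, weights="linear")
        print(f"weighted kappa = {kw:.3f}")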

  20. Hydrocarbon-Fueled Rocket Engine Plume Diagnostics: Analytical Developments and Experimental Results

    NASA Technical Reports Server (NTRS)

    Tejwani, Gopal D.; McVay, Gregory P.; Langford, Lester A.; St. Cyr, William W.

    2006-01-01

    A viewgraph presentation describing experimental results and analytical developments about plume diagnostics for hydrocarbon-fueled rocket engines is shown. The topics include: 1) SSC Plume Diagnostics Background; 2) Engine Health Monitoring Approach; 3) Rocket Plume Spectroscopy Simulation Code; 4) Spectral Simulation for 10 Atomic Species and for 11 Diatomic Molecular Electronic Bands; 5) "Best" Lines for Plume Diagnostics for Hydrocarbon-Fueled Rocket Engines; 6) Experimental Set Up for the Methane Thruster Test Program and Experimental Results; and 7) Summary and Recommendations.

  1. Entanglement-assisted quantum quasicyclic low-density parity-check codes

    NASA Astrophysics Data System (ADS)

    Hsieh, Min-Hsiu; Brun, Todd A.; Devetak, Igor

    2009-03-01

    We investigate the construction of quantum low-density parity-check (LDPC) codes from classical quasicyclic (QC) LDPC codes with girth greater than or equal to 6. We have shown that the classical codes in the generalized Calderbank-Shor-Steane construction do not need to satisfy the dual-containing property as long as preshared entanglement is available to both sender and receiver. We can use this to avoid the many four-cycles that typically arise in dual-containing LDPC codes. The advantage of such quantum codes comes from the use of efficient decoding algorithms such as the sum-product algorithm (SPA). It is well known that in the SPA, cycles of length 4 make successive decoding iterations highly correlated and hence limit the decoding performance. We show the principle of constructing quantum QC-LDPC codes which require only small amounts of initial shared entanglement.

  2. [Population density, age distribution and urbanisation as factors influencing the frequency of home visits--an analysis for Mecklenburg-West Pomerania].

    PubMed

    Heymann, R; Weitmann, K; Weiss, S; Thierfelder, D; Flessa, S; Hoffmann, W

    2009-07-01

    This study examines and compares the frequency of home visits by general practitioners in regions with a lower population density and regions with a higher population density. The discussion centres on the hypothesis whether the number of home visits in rural and remote areas with a low population density is, in fact, higher than in urbanised areas with a higher population density. The average age of the population has been considered in both cases. The communities of Mecklenburg West-Pomerania were aggregated into postal code regions. The analysis is based on these postal code regions. The average frequency of home visits per 100 inhabitants/km2 has been calculated via a bivariate, linear regression model with the population density and the average age for the postal code region as independent variables. The results are based on billing data of the year 2006 as provided by the Association of Statutory Health Insurance Physicians of Mecklenburg-Western Pomerania. In a second step a variable which clustered the postal codes of urbanised areas was added to a multivariate model. The hypothesis of a negative correlation between the frequency of home visits and the population density of the areas examined cannot be confirmed for Mecklenburg-Western Pomerania. Following the dichotomisation of the postal code regions into sparsely and densely populated areas, only the very sparsely populated postal code regions (less than 100 inhabitants/km2) show a tendency towards a higher frequency of home visits. Overall, the frequency of home visits in sparsely populated postal code regions is 28.9% higher than in the densely populated postal code regions (more than 100 inhabitants/km2), although the number of general practitioners is approximately the same in both groups. In part this association seems to be confirmed by a positive correlation between the average age in the individual postal code regions and the number of home visits carried out in the area. As calculated on the basis of the data at hand, only the very sparsely populated areas with a still gradually decreasing population show a tendency towards a higher frequency of home visits. According to the data of 2006, the number of home visits remains high in sparsely populated areas. It may increase in the near future as the number of general practitioners in these areas will gradually decrease while the number of immobile and older inhabitants will increase.

  3. 2D electron density profile measurement in tokamak by laser-accelerated ion-beam probe.

    PubMed

    Chen, Y H; Yang, X Y; Lin, C; Wang, L; Xu, M; Wang, X G; Xiao, C J

    2014-11-01

    A new concept for a Heavy Ion Beam Probe (HIBP) diagnostic has been proposed, the key of which is to replace the electrostatic accelerator of a traditional HIBP with a laser-driven ion accelerator. Due to the large energy spread of the ions, the laser-accelerated HIBP can measure the two-dimensional (2D) electron density profile of a tokamak plasma. In a preliminary simulation, a 2D density profile was reconstructed with a spatial resolution of about 2 cm and with an error below 15% in the core region. Diagnostics of 2D density fluctuations are also discussed.

  4. A comparison of data interoperability approaches of fusion codes with application to synthetic diagnostics

    NASA Astrophysics Data System (ADS)

    Kruger, Scott; Shasharina, S.; Vadlamani, S.; McCune, D.; Holland, C.; Jenkins, T. G.; Candy, J.; Cary, J. R.; Hakim, A.; Miah, M.; Pletzer, A.

    2010-11-01

    As various efforts to integrate fusion codes proceed worldwide, standards for sharing data have emerged. In the U.S., the SWIM project has pioneered the development of the Plasma State, which has a flat hierarchy and is dominated by its use within 1.5D transport codes. The European Integrated Tokamak Modeling effort has developed a more ambitious data interoperability effort organized around the concept of Consistent Physical Objects (CPOs). CPOs have deep hierarchies as needed by an effort that seeks to encompass all of fusion computing. Here, we discuss ideas for implementing data interoperability that are complementary to both the Plasma State and CPOs. By making use of attributes within the netCDF and HDF5 binary file formats, the goals of data interoperability can be achieved with a more informal approach. In addition, a file can be simultaneously interoperable with several standards at once. As an illustration of this approach, we discuss its application to the development of synthetic diagnostics that can be used for multiple codes.
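
    A minimal sketch of the attribute-based idea described above is shown below, assuming the h5py and NumPy packages are available; the attribute names (plasma_state_name, cpo_path) are invented for illustration and do not correspond to an established standard.

        # Tag one dataset with attributes pointing at several interoperability conventions at once.
        import h5py
        import numpy as np

        with h5py.File("synthetic_diagnostic.h5", "w") as f:
            ne = f.create_dataset("electron_density", data=np.linspace(1e19, 5e19, 64))
            ne.attrs["units"] = "m^-3"
            ne.attrs["plasma_state_name"] = "ns"        # hypothetical Plasma State-style alias
            ne.attrs["cpo_path"] = "coreprof/ne/value"  # hypothetical CPO-style path

        with h5py.File("synthetic_diagnostic.h5", "r") as f:
            d = f["electron_density"]
            print(dict(d.attrs), d[:4])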

  5. Decoding the Emerging Patterns Exhibited in Non-coding RNAs Characteristic of Lung Cancer with Regard to their Clinical Significance.

    PubMed

    Sonea, Laura; Buse, Mihail; Gulei, Diana; Onaciu, Anca; Simon, Ioan; Braicu, Cornelia; Berindan-Neagoe, Ioana

    2018-05-01

    Lung cancer continues to be the leading contributor to the global cancer mortality rate; it needs to be further investigated to reduce these dramatically unfavorable statistics. Non-coding RNAs (ncRNAs) have been shown to be important cellular regulatory factors, and the alteration of their expression levels has been correlated with an extensive number of pathologies. Specifically, their expression profiles are correlated with the development and progression of lung cancer, generating great interest for further investigation. This review focuses on the complex role of non-coding RNAs, namely miRNAs, piwi-interacting RNAs, small nucleolar RNAs, long non-coding RNAs and circular RNAs, in the process of developing novel biomarkers for diagnostic and prognostic factors that can then be utilized for personalized therapies toward this devastating disease. To support the concept of personalized medicine, we will focus on the roles of miRNAs in lung cancer tumorigenesis, their use as diagnostic and prognostic biomarkers and their application for patient therapy.

  6. Experimental investigation of adiabatic compression and heating using collision of an MHD-driven jet with a gas target cloud for magnetized target fusion

    NASA Astrophysics Data System (ADS)

    Seo, Byonghoon; Li, Hui; Bellan, Paul

    2017-10-01

    We are studying magnetized target fusion using an experimental method in which an imploding liner compressing a plasma is simulated by a high-speed MHD-driven plasma jet colliding with a gas target cloud. This has the advantage of being non-destructive, so orders of magnitude more shots are possible. Since the actual density and temperature are much more modest than fusion-relevant values, the goal is to determine the scaling of the increase in density and temperature when an actual experimental plasma is adiabatically compressed. Two newly developed diagnostics are operating and providing data. The first new diagnostic is a fiber-coupled interferometer which measures line-integrated electron density not only as a function of time, but also as a function of position along the jet. The second new diagnostic is laser Thomson scattering, which measures electron density and temperature at the location where the jet collides with the cloud. These diagnostics show that when the jet collides with a target cloud the jet slows down substantially and both the electron density and temperature increase. The experimental measurements are being compared with 3D MHD and hybrid kinetic numerical simulations that model the actual experimental geometry.

  7. The impacts of marijuana dispensary density and neighborhood ecology on marijuana abuse and dependence

    PubMed Central

    Mair, Christina; Freisthler, Bridget; Ponicki, William R.; Gaidus, Andrew

    2015-01-01

    Background As an increasing number of states liberalize cannabis use and develop laws and local policies, it is essential to better understand the impacts of neighborhood ecology and marijuana dispensary density on marijuana use, abuse, and dependence. We investigated associations between marijuana abuse/dependence hospitalizations and community demographic and environmental conditions from 2001–2012 in California, as well as cross-sectional associations between local and adjacent marijuana dispensary densities and marijuana hospitalizations. Methods We analyzed panel population data relating hospitalizations coded for marijuana abuse or dependence and assigned to residential ZIP codes in California from 2001 through 2012 (20,219 space-time units) to ZIP code demographic and ecological characteristics. Bayesian space-time misalignment models were used to account for spatial variations in geographic unit definitions over time, while also accounting for spatial autocorrelation using conditional autoregressive priors. We also analyzed cross-sectional associations between marijuana abuse/dependence and the density of dispensaries in local and spatially adjacent ZIP codes in 2012. Results An additional one dispensary per square mile in a ZIP code was cross-sectionally associated with a 6.8% increase in the number of marijuana hospitalizations (95% credible interval 1.033, 1.105) with a marijuana abuse/dependence code. Other local characteristics, such as the median household income and age and racial/ethnic distributions, were associated with marijuana hospitalizations in cross-sectional and panel analyses. Conclusions Prevention and intervention programs for marijuana abuse and dependence may be particularly essential in areas of concentrated disadvantage. Policy makers may want to consider regulations that limit the density of dispensaries. PMID:26154479
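
    As a loose illustration of the kind of count-data model behind an estimate such as "a 6.8% increase per additional dispensary per square mile", the sketch below fits a Poisson regression of simulated ZIP-level hospitalization counts on dispensary density with a population offset (statsmodels and NumPy assumed). The published analysis used Bayesian space-time misalignment models with conditional autoregressive priors, which this toy example does not attempt to reproduce.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 200
        density = rng.gamma(2.0, 1.0, n)                    # dispensaries per square mile (simulated)
        pop = rng.integers(5_000, 60_000, n).astype(float)  # ZIP population (simulated)
        rate = np.exp(-7.5 + 0.066 * density)               # exp(0.066) ~ 1.068, i.e. ~6.8% per unit
        counts = rng.poisson(rate * pop)                    # hospitalizations per ZIP

        X = sm.add_constant(density)
        fit = sm.GLM(counts, X, family=sm.families.Poisson(), offset=np.log(pop)).fit()
        print(fit.summary().tables[1])                      # density coefficient should be near 0.066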

  8. Documentation of a numerical code for the simulation of variable density ground-water flow in three dimensions

    USGS Publications Warehouse

    Kuiper, L.K.

    1985-01-01

    A numerical code is documented for the simulation of variable density time dependent groundwater flow in three dimensions. The groundwater density, although variable with distance, is assumed to be constant in time. The Integrated Finite Difference grid elements in the code follow the geologic strata in the modeled area. If appropriate, the determination of hydraulic head in confining beds can be deleted to decrease computation time. The strongly implicit procedure (SIP), successive over-relaxation (SOR), and eight different preconditioned conjugate gradient (PCG) methods are used to solve the approximating equations. The use of the computer program that performs the calculations in the numerical code is emphasized. Detailed instructions are given for using the computer program, including input data formats. An example simulation and the Fortran listing of the program are included. (USGS)
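
    Among the solvers listed above, successive over-relaxation (SOR) is the simplest to illustrate. The Python sketch below relaxes a toy two-dimensional steady-state head field with fixed-head boundaries; it shows only the iteration idea and is not the documented variable-density code, which works on an integrated finite difference grid that follows the geologic strata.

        import numpy as np

        def sor_laplace(h, omega=1.7, tol=1e-6, max_iter=5000):
            """Relax interior heads of a 2D grid toward the discrete Laplace solution."""
            for _ in range(max_iter):
                max_change = 0.0
                for i in range(1, h.shape[0] - 1):
                    for j in range(1, h.shape[1] - 1):
                        new = (1 - omega) * h[i, j] + omega * 0.25 * (
                            h[i + 1, j] + h[i - 1, j] + h[i, j + 1] + h[i, j - 1]
                        )
                        max_change = max(max_change, abs(new - h[i, j]))
                        h[i, j] = new
                if max_change < tol:
                    break
            return h

        heads = np.zeros((20, 20))
        heads[0, :] = 10.0    # fixed high-head boundary
        heads[-1, :] = 0.0    # fixed low-head boundary
        print(sor_laplace(heads)[10, 10])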

  9. Advanced density profile reflectometry; the state-of-the-art and measurement prospects for ITER

    NASA Astrophysics Data System (ADS)

    Doyle, E. J.

    2006-10-01

    Dramatic progress in millimeter-wave technology has allowed the realization of a key goal for ITER diagnostics, the routine measurement of the plasma density profile from millimeter-wave radar (reflectometry) measurements. In reflectometry, the measured round-trip group delay of a probe beam reflected from a plasma cutoff is used to infer the density distribution in the plasma. Reflectometer systems implemented by UCLA on a number of devices employ frequency-modulated continuous-wave (FM-CW), ultrawide-bandwidth, high-resolution radar systems. One such system on DIII-D has routinely demonstrated measurements of the density profile over a range of electron density of 0-6.4x10^19 m^-3, with ~25 μs time and ~4 mm radial resolution, meeting key ITER requirements. This progress in performance was made possible by multiple advances in the areas of millimeter-wave technology, novel measurement techniques, and improved understanding, including: (i) fast-sweep, solid-state, wide-bandwidth sources and power amplifiers, (ii) dual polarization measurements to expand the density range, (iii) adaptive radar-based data analysis with parallel processing on a Unix cluster, (iv) high memory depth data acquisition, and (v) advances in full wave code modeling. The benefits of advanced system performance will be illustrated using measurements from a wide range of phenomena, including ELM and fast-ion driven mode dynamics, L-H transition studies and plasma-wall interaction. The measurement capabilities demonstrated by these systems provide a design basis for the development of the main ITER profile reflectometer system. This talk will explore the extent to which these reflectometer system designs, results and experience can be translated to ITER, and will identify what new studies and experimental tests are essential.

  10. Calibrations of the LHD Thomson scattering system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yamada, I., E-mail: yamadai@nifs.ac.jp; Funaba, H.; Yasuhara, R.

    2016-11-15

    The Thomson scattering diagnostic systems are widely used for the measurements of absolute local electron temperatures and densities of fusion plasmas. In order to obtain accurate and reliable temperature and density data, careful calibrations of the system are required. We have tried several calibration methods since the second LHD experiment campaign in 1998. We summarize the current status of the calibration methods for the electron temperature and density measurements by the LHD Thomson scattering diagnostic system. Future plans are briefly discussed.

  11. Calibrations of the LHD Thomson scattering system.

    PubMed

    Yamada, I; Funaba, H; Yasuhara, R; Hayashi, H; Kenmochi, N; Minami, T; Yoshikawa, M; Ohta, K; Lee, J H; Lee, S H

    2016-11-01

    The Thomson scattering diagnostic systems are widely used for the measurements of absolute local electron temperatures and densities of fusion plasmas. In order to obtain accurate and reliable temperature and density data, careful calibrations of the system are required. We have tried several calibration methods since the second LHD experiment campaign in 1998. We summarize the current status of the calibration methods for the electron temperature and density measurements by the LHD Thomson scattering diagnostic system. Future plans are briefly discussed.

  12. A novel urinary long non-coding RNA transcript improves diagnostic accuracy in patients undergoing prostate biopsy.

    PubMed

    Zhang, Wei; Ren, Shan-Cheng; Shi, Xiao-Lei; Liu, Ya-Wei; Zhu, Ya-Sheng; Jing, Tai-Le; Wang, Fu-Bo; Chen, Rui; Xu, Chuan-Liang; Wang, Hui-Qing; Wang, Hai-Feng; Wang, Yan; Liu, Bing; Li, Yao-Ming; Fang, Zi-Yu; Guo, Fei; Lu, Xin; Shen, Dan; Gao, Xu; Hou, Jian-Guo; Sun, Ying-Hao

    2015-05-01

    Long non-coding RNA (LncRNA) PCA3 has been a well-established urine biomarker for the detection of prostate cancer (PCa). Our previous study showed that a novel LncRNA, FR0348383, is up-regulated in over 70% of PCa compared with matched benign tissues. The aim of this study was to evaluate the diagnostic value of urinary FR0348383 for men undergoing prostate biopsy due to elevated PSA (PSA > 4.0 ng/ml) and/or abnormal digital rectal examination (DRE). Post-DRE first-catch urine specimens prior to prostate biopsies were prospectively collected. After whole transcriptome amplification, quantitative real-time polymerase chain reaction was applied to quantify urine FR0348383 and PSA levels. The FR0348383 score was calculated as the ratio of PSA and FR0348383 mRNA (PSA mRNA/FR0348383 mRNA × 1000). The diagnostic value of the FR0348383 score was evaluated by logistic regression and decision curve analysis. 213 cases with urine samples containing sufficient mRNA were included; 94 cases had a serum PSA level of 4.0-10.0 ng/ml. PCa was identified in 72 cases. An increasing FR0348383 score was correlated with an increasing probability of a positive biopsy (P < 0.001). Multivariable logistic analysis indicated that FR0348383 score (P < 0.001), PSA (P = 0.004), age (P = 0.007), and prostate volume (P < 0.001) were independent predictors of PCa. ROC analysis demonstrated that the FR0348383 score outperformed PSA, %free PSA, and PSA density in the prediction of PCa in the subgroup of patients with grey-zone PSA (AUC: 0.815 vs. 0.562 vs. 0.599 vs. 0.645). When using a probability threshold of 30% in the grey-zone cohort, the FR0348383 score would save 52.0% of avoidable biopsies without missing any high-grade cancers. The FR0348383 transcript in post-DRE urine may be a novel biomarker for the detection of PCa with great diagnostic value, especially in the grey-zone cohort. The application of the FR0348383 score in clinical practice might avoid unnecessary prostate biopsies and increase the specificity of PCa diagnosis. © 2015 Wiley Periodicals, Inc.
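
    The score definition given above (PSA mRNA / FR0348383 mRNA x 1000) and its ROC evaluation can be written compactly as in the sketch below; the expression values and biopsy outcomes are fabricated for illustration, and scikit-learn is assumed to be available.

        import numpy as np
        from sklearn.metrics import roc_auc_score

        psa_mrna = np.array([12.0, 30.0, 8.0, 25.0, 40.0, 5.0])   # hypothetical urine PSA mRNA levels
        fr_mrna = np.array([4.0, 2.5, 6.0, 3.0, 2.0, 7.0])        # hypothetical FR0348383 mRNA levels
        biopsy_positive = np.array([0, 1, 0, 1, 1, 0])            # hypothetical biopsy outcomes

        score = psa_mrna / fr_mrna * 1000.0                       # FR0348383 score as defined above
        print("FR0348383 scores:", np.round(score, 1))
        print("AUC:", roc_auc_score(biopsy_positive, score))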

  13. Identification of novel diagnostic biomarkers for thyroid carcinoma.

    PubMed

    Wang, Xiliang; Zhang, Qing; Cai, Zhiming; Dai, Yifan; Mou, Lisha

    2017-12-19

    Thyroid carcinoma (THCA) is the most common endocrine malignancy worldwide. Unfortunately, only a limited number of large-scale analyses have been performed to identify biomarkers for THCA. Here, we conducted a meta-analysis using 505 THCA patients and 59 normal controls from The Cancer Genome Atlas. After identifying differentially expressed long non-coding RNAs (lncRNA) and protein-coding genes (PCG), we found vast differences in various lncRNA-PCG co-expressed pairs in THCA. A dysregulation network with scale-free topology was constructed. Four molecules (LA16c-380H5.2, RP11-203J24.8, MLF1 and SDC4) could potentially serve as diagnostic biomarkers of THCA with high sensitivity and specificity. We further present a diagnostic panel with expression cutoff values. Our results demonstrate the potential application of these four molecules as novel independent biomarkers for THCA diagnosis.

  14. Immunochromatographic diagnostic test analysis using Google Glass.

    PubMed

    Feng, Steve; Caire, Romain; Cortazar, Bingen; Turan, Mehmet; Wong, Andrew; Ozcan, Aydogan

    2014-03-25

    We demonstrate a Google Glass-based rapid diagnostic test (RDT) reader platform capable of qualitative and quantitative measurements of various lateral flow immunochromatographic assays and similar biomedical diagnostics tests. Using a custom-written Glass application and without any external hardware attachments, one or more RDTs labeled with Quick Response (QR) code identifiers are simultaneously imaged using the built-in camera of the Google Glass that is based on a hands-free and voice-controlled interface and digitally transmitted to a server for digital processing. The acquired JPEG images are automatically processed to locate all the RDTs and, for each RDT, to produce a quantitative diagnostic result, which is returned to the Google Glass (i.e., the user) and also stored on a central server along with the RDT image, QR code, and other related information (e.g., demographic data). The same server also provides a dynamic spatiotemporal map and real-time statistics for uploaded RDT results accessible through Internet browsers. We tested this Google Glass-based diagnostic platform using qualitative (i.e., yes/no) human immunodeficiency virus (HIV) and quantitative prostate-specific antigen (PSA) tests. For the quantitative RDTs, we measured activated tests at various concentrations ranging from 0 to 200 ng/mL for free and total PSA. This wearable RDT reader platform running on Google Glass combines a hands-free sensing and image capture interface with powerful servers running our custom image processing codes, and it can be quite useful for real-time spatiotemporal tracking of various diseases and personal medical conditions, providing a valuable tool for epidemiology and mobile health.

  15. Immunochromatographic Diagnostic Test Analysis Using Google Glass

    PubMed Central

    2014-01-01

    We demonstrate a Google Glass-based rapid diagnostic test (RDT) reader platform capable of qualitative and quantitative measurements of various lateral flow immunochromatographic assays and similar biomedical diagnostics tests. Using a custom-written Glass application and without any external hardware attachments, one or more RDTs labeled with Quick Response (QR) code identifiers are simultaneously imaged using the built-in camera of the Google Glass that is based on a hands-free and voice-controlled interface and digitally transmitted to a server for digital processing. The acquired JPEG images are automatically processed to locate all the RDTs and, for each RDT, to produce a quantitative diagnostic result, which is returned to the Google Glass (i.e., the user) and also stored on a central server along with the RDT image, QR code, and other related information (e.g., demographic data). The same server also provides a dynamic spatiotemporal map and real-time statistics for uploaded RDT results accessible through Internet browsers. We tested this Google Glass-based diagnostic platform using qualitative (i.e., yes/no) human immunodeficiency virus (HIV) and quantitative prostate-specific antigen (PSA) tests. For the quantitative RDTs, we measured activated tests at various concentrations ranging from 0 to 200 ng/mL for free and total PSA. This wearable RDT reader platform running on Google Glass combines a hands-free sensing and image capture interface with powerful servers running our custom image processing codes, and it can be quite useful for real-time spatiotemporal tracking of various diseases and personal medical conditions, providing a valuable tool for epidemiology and mobile health. PMID:24571349

  16. Pediatric complex chronic conditions classification system version 2: updated for ICD-10 and complex medical technology dependence and transplantation

    PubMed Central

    2014-01-01

    Background The pediatric complex chronic conditions (CCC) classification system, developed in 2000, requires revision to accommodate the International Classification of Disease 10th Revision (ICD-10). To update the CCC classification system, we incorporated ICD-9 diagnostic codes that had been either omitted or incorrectly specified in the original system, and then translated between ICD-9 and ICD-10 using General Equivalence Mappings (GEMs). We further reviewed all codes in the ICD-9 and ICD-10 systems to include both diagnostic and procedural codes indicative of technology dependence or organ transplantation. We applied the provisional CCC version 2 (v2) system to death certificate information and 2 databases of health utilization, reviewed the resulting CCC classifications, and corrected any misclassifications. Finally, we evaluated performance of the CCC v2 system by assessing: 1) the stability of the system between ICD-9 and ICD-10 codes using data which included both ICD-9 codes and ICD-10 codes; 2) the year-to-year stability before and after ICD-10 implementation; and 3) the proportions of patients classified as having a CCC in both the v1 and v2 systems. Results The CCC v2 classification system consists of diagnostic and procedural codes that incorporate a new neonatal CCC category as well as domains of complexity arising from technology dependence or organ transplantation. CCC v2 demonstrated close comparability between ICD-9 and ICD-10 and did not detect significant discontinuity in temporal trends of death in the United States. Compared to the original system, CCC v2 resulted in a 1.0% absolute (10% relative) increase in the number of patients identified as having a CCC in national hospitalization dataset, and a 0.4% absolute (24% relative) increase in a national emergency department dataset. Conclusions The updated CCC v2 system is comprehensive and multidimensional, and provides a necessary update to accommodate widespread implementation of ICD-10. PMID:25102958

  17. Improved Correction of Misclassification Bias With Bootstrap Imputation.

    PubMed

    van Walraven, Carl

    2018-07-01

    Diagnostic codes used in administrative database research can create bias due to misclassification. Quantitative bias analysis (QBA) can correct for this bias and requires only code sensitivity and specificity, but it may return invalid results. Bootstrap imputation (BI) can also address misclassification bias but traditionally requires multivariate models to accurately estimate disease probability. This study compared misclassification bias correction using QBA and BI. Serum creatinine measures were used to determine severe renal failure status in 100,000 hospitalized patients. The prevalence of severe renal failure in 86 patient strata and its association with 43 covariates were determined and compared with results in which renal failure status was determined using diagnostic codes (sensitivity 71.3%, specificity 96.2%). Differences in results (misclassification bias) were then corrected with QBA or BI (using progressively more complex methods to estimate disease probability). In total, 7.4% of patients had severe renal failure. Imputing disease status with diagnostic codes exaggerated prevalence estimates [median relative change (range), 16.6% (0.8%-74.5%)] and their association with covariates [median (range) exponentiated absolute parameter estimate difference, 1.16 (1.01-2.04)]. QBA produced invalid results 9.3% of the time and increased bias in estimates of both disease prevalence and covariate associations. BI decreased misclassification bias with increasingly accurate disease probability estimates. QBA can produce invalid results and increase misclassification bias. BI avoids invalid results and can importantly decrease misclassification bias when accurate disease probability estimates are used.
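
    The simplest form of the QBA correction referred to above, using only code sensitivity and specificity, is the Rogan-Gladen estimator. The sketch below applies it with the reported code accuracy and shows how it can return out-of-range (invalid) prevalence estimates when the apparent coded prevalence is low; the stratum prevalences are hypothetical.

        def qba_corrected_prevalence(apparent, sensitivity, specificity):
            """Rogan-Gladen corrected prevalence; the result may fall outside [0, 1]."""
            return (apparent + specificity - 1.0) / (sensitivity + specificity - 1.0)

        # Reported code accuracy: sensitivity 71.3%, specificity 96.2%
        for apparent in (0.10, 0.05, 0.03):   # hypothetical coded prevalences in three strata
            corrected = qba_corrected_prevalence(apparent, 0.713, 0.962)
            print(f"apparent={apparent:.2f} corrected={corrected:.3f} "
                  f"valid={0.0 <= corrected <= 1.0}")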

  18. Asymmetry of Peak Thicknesses between the Superior and Inferior Retinal Nerve Fiber Layers for Early Glaucoma Detection: A Simple Screening Method.

    PubMed

    Bae, Hyoung Won; Lee, Sang Yeop; Kim, Sangah; Park, Chan Keum; Lee, Kwanghyun; Kim, Chan Yun; Seong, Gong Je

    2018-01-01

    To assess whether the asymmetry in the peripapillary retinal nerve fiber layer (pRNFL) thickness between the superior and inferior hemispheres on optical coherence tomography (OCT) is useful for early detection of glaucoma. The patient population consisted of a Training set (a total of 60 subjects with early glaucoma and 59 normal subjects) and a Validation set (30 subjects with early glaucoma and 30 normal subjects). Two ratios were employed to measure the asymmetry between the superior and inferior pRNFL thickness using OCT. One was the ratio of the superior to inferior peak thicknesses (peak pRNFL thickness ratio; PTR), and the other was the ratio of the superior to inferior average thickness (average pRNFL thickness ratio; ATR). The diagnostic abilities of the PTR and ATR were compared to the color code classification in OCT. Using the optimal cut-off values of the PTR and ATR obtained from the Training set, the two ratios were independently validated for diagnostic capability. For the Training set, the sensitivities/specificities of the PTR, ATR, quadrant color code classification, and clock-hour color code classification were 81.7%/93.2%, 71.7%/74.6%, 75.0%/93.2%, and 75.0%/79.7%, respectively. The PTR showed better diagnostic performance for early glaucoma detection than the ATR and the clock-hour color code classification in terms of areas under the receiver operating characteristic curves (AUCs) (0.898, 0.765, and 0.773, respectively). For the Validation set, the PTR also showed the best sensitivity and AUC. The PTR is a simple method with considerable diagnostic ability for early glaucoma detection. It can, therefore, be widely used as a new screening method for early glaucoma. © Copyright: Yonsei University College of Medicine 2018
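
    The two asymmetry ratios defined above reduce to simple arithmetic on the superior- and inferior-hemisphere thickness profiles, as in the sketch below; the thickness values (in microns) are made up for illustration.

        import numpy as np

        superior = np.array([95, 110, 142, 128, 104, 90], dtype=float)   # hypothetical superior pRNFL profile
        inferior = np.array([98, 118, 155, 140, 112, 95], dtype=float)   # hypothetical inferior pRNFL profile

        ptr = superior.max() / inferior.max()    # peak pRNFL thickness ratio (PTR)
        atr = superior.mean() / inferior.mean()  # average pRNFL thickness ratio (ATR)
        print(f"PTR={ptr:.3f}  ATR={atr:.3f}")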

  19. Comparing Methods for Estimating Direct Costs of Adverse Drug Events.

    PubMed

    Gyllensten, Hanna; Jönsson, Anna K; Hakkarainen, Katja M; Svensson, Staffan; Hägg, Staffan; Rehnberg, Clas

    2017-12-01

    To estimate how direct health care costs resulting from adverse drug events (ADEs) and cost distribution are affected by methodological decisions regarding identification of ADEs, assigning relevant resource use to ADEs, and estimating costs for the assigned resources. ADEs were identified from medical records and diagnostic codes for a random sample of 4970 Swedish adults during a 3-month study period in 2008 and were assessed for causality. Results were compared for five cost evaluation methods, including different methods for identifying ADEs, assigning resource use to ADEs, and for estimating costs for the assigned resources (resource use method, proportion of registered cost method, unit cost method, diagnostic code method, and main diagnosis method). Different levels of causality for ADEs and ADEs' contribution to health care resource use were considered. Using the five methods, the maximum estimated overall direct health care costs resulting from ADEs ranged from Sk10,000 (Sk = Swedish krona; ~€1,500 in 2016 values) using the diagnostic code method to more than Sk3,000,000 (~€414,000) using the unit cost method in our study population. The most conservative definitions for ADEs' contribution to health care resource use and the causality of ADEs resulted in average costs per patient ranging from Sk0 using the diagnostic code method to Sk4066 (~€500) using the unit cost method. The estimated costs resulting from ADEs varied considerably depending on the methodological choices. The results indicate that costs for ADEs need to be identified through medical record review and by using detailed unit cost data. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  20. Validation of chronic obstructive pulmonary disease recording in the Clinical Practice Research Datalink (CPRD-GOLD)

    PubMed Central

    Quint, Jennifer K; Müllerova, Hana; DiSantostefano, Rachael L; Forbes, Harriet; Eaton, Susan; Hurst, John R; Davis, Kourtney; Smeeth, Liam

    2014-01-01

    Objectives The optimal method of identifying people with chronic obstructive pulmonary disease (COPD) from electronic primary care records is not known. We assessed the accuracy of different approaches using the Clinical Practice Research Datalink, a UK electronic health record database. Setting 951 participants registered with a CPRD practice in the UK between 1 January 2004 and 31 December 2012. Individuals were selected by ≥1 of 8 algorithms designed to identify people with COPD. General practitioners were sent a brief questionnaire and additional evidence to support a COPD diagnosis was requested. All information received was reviewed independently by two respiratory physicians whose opinion was taken as the gold standard. Primary outcome measure The primary measure of accuracy was the positive predictive value (PPV), the proportion of people identified by each algorithm for whom COPD was confirmed. Results 951 questionnaires were sent and 738 (78%) returned. After quality control, 696 (73.2%) patients were included in the final analysis. All four algorithms including a specific COPD diagnostic code performed well. Using a diagnostic code alone, the PPV was 86.5% (77.5–92.3%); requiring a diagnosis plus spirometry plus specific medication gave a slightly higher PPV of 89.4% (80.7–94.5%) but reduced case numbers by 10%. Algorithms without specific diagnostic codes had low PPVs (range 12.2–44.4%). Conclusions Patients with COPD can be accurately identified from UK primary care records using specific diagnostic codes. Requiring spirometry or COPD medications only marginally improved accuracy. The high accuracy applies since the introduction of an incentivised disease register for COPD as part of the Quality and Outcomes Framework in 2004. PMID:25056980
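
    A minimal sketch of the primary accuracy measure used above; the counts are illustrative, chosen only to show how a PPV near the reported 86.5% arises from confirmed versus algorithm-identified cases.

        def positive_predictive_value(confirmed_cases, identified_cases):
            """PPV: fraction of algorithm-identified patients in whom COPD was confirmed."""
            return confirmed_cases / identified_cases

        print(positive_predictive_value(confirmed_cases=160, identified_cases=185))  # ~0.865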

  1. Improving statistical inference on pathogen densities estimated by quantitative molecular methods: malaria gametocytaemia as a case study.

    PubMed

    Walker, Martin; Basáñez, María-Gloria; Ouédraogo, André Lin; Hermsen, Cornelus; Bousema, Teun; Churcher, Thomas S

    2015-01-16

    Quantitative molecular methods (QMMs) such as quantitative real-time polymerase chain reaction (q-PCR), reverse-transcriptase PCR (qRT-PCR) and quantitative nucleic acid sequence-based amplification (QT-NASBA) are increasingly used to estimate pathogen density in a variety of clinical and epidemiological contexts. These methods are often classified as semi-quantitative, yet estimates of reliability or sensitivity are seldom reported. Here, a statistical framework is developed for assessing the reliability (uncertainty) of pathogen densities estimated using QMMs and the associated diagnostic sensitivity. The method is illustrated with quantification of Plasmodium falciparum gametocytaemia by QT-NASBA. The reliability of pathogen (e.g. gametocyte) densities, and the accompanying diagnostic sensitivity, estimated by two contrasting statistical calibration techniques, are compared; a traditional method and a mixed model Bayesian approach. The latter accounts for statistical dependence of QMM assays run under identical laboratory protocols and permits structural modelling of experimental measurements, allowing precision to vary with pathogen density. Traditional calibration cannot account for inter-assay variability arising from imperfect QMMs and generates estimates of pathogen density that have poor reliability, are variable among assays and inaccurately reflect diagnostic sensitivity. The Bayesian mixed model approach assimilates information from replica QMM assays, improving reliability and inter-assay homogeneity, providing an accurate appraisal of quantitative and diagnostic performance. Bayesian mixed model statistical calibration supersedes traditional techniques in the context of QMM-derived estimates of pathogen density, offering the potential to improve substantially the depth and quality of clinical and epidemiological inference for a wide variety of pathogens.
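
    The sketch below is not the authors' Bayesian model; it illustrates the core idea of mixed-model calibration advocated above, regressing a synthetic assay readout on log pathogen density with a random intercept per assay run so that inter-assay variability is modelled rather than ignored. The data, column names, and the frequentist statsmodels fit are assumptions made for illustration.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n_assays, n_standards = 5, 8
        log_density = np.tile(np.linspace(0.0, 4.0, n_standards), n_assays)
        assay = np.repeat(np.arange(n_assays), n_standards)
        assay_offset = rng.normal(0.0, 0.2, size=n_assays)[assay]        # inter-assay variability
        readout = 2.0 + 1.5 * log_density + assay_offset + rng.normal(0.0, 0.3, size=assay.size)

        df = pd.DataFrame({"readout": readout, "log_density": log_density, "assay": assay})
        fit = smf.mixedlm("readout ~ log_density", df, groups=df["assay"]).fit()
        print(fit.summary())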

  2. Diagnostic ability of peripapillary vessel density measurements of optical coherence tomography angiography in primary open-angle and angle-closure glaucoma.

    PubMed

    Rao, Harsha L; Kadambi, Sujatha V; Weinreb, Robert N; Puttaiah, Narendra K; Pradhan, Zia S; Rao, Dhanaraj A S; Kumar, Rajesh S; Webers, Carroll A B; Shetty, Rohit

    2017-08-01

    To evaluate the diagnostic ability of peripapillary vessel density measurements on optical coherence tomography angiography (OCTA) in primary open-angle glaucoma (POAG) and primary angle-closure glaucoma (PACG), and to compare these with peripapillary retinal nerve fibre layer (RNFL) thickness measurements. In a cross-sectional study, 48 eyes of 33 healthy control subjects, 63 eyes of 39 patients with POAG and 49 eyes of 32 patients with PACG underwent OCTA (RTVue-XR, Optovue, Fremont, California, USA) and RNFL imaging with spectral domain OCT. Diagnostic abilities of vessel density and RNFL parameters were evaluated using area under receiver operating characteristic curves (AUC) and sensitivities at fixed specificities. AUCs of peripapillary vessel density ranged between 0.48 for the temporal sector and 0.88 for the inferotemporal sector in POAG. The same in PACG ranged between 0.57 and 0.86. Sensitivities at 95% specificity ranged from 13% to 70% in POAG, and from 10% to 67% in PACG. AUCs of peripapillary RNFL thickness ranged between 0.51 for the temporal sector and 0.91 for the inferonasal sector in POAG. The same in PACG ranged between 0.61 and 0.87. Sensitivities at 95% specificity ranged from 8% to 68% in POAG, and from 2% to 67% in PACG. AUCs of all peripapillary vessel density measurements were comparable (p>0.05) to the corresponding RNFL thickness measurements in both POAG and PACG. Diagnostic ability of peripapillary vessel density parameters of OCTA, especially the inferotemporal sector measurement, was good in POAG and PACG. Diagnostic abilities of vessel density measurements were comparable to RNFL measurements in both POAG and PACG. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
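
    A minimal sketch of the diagnostic-ability metrics used above (AUC and sensitivity at a fixed 95% specificity) for a single vessel density parameter; the data are synthetic, not the study's measurements.

        import numpy as np
        from sklearn.metrics import roc_auc_score, roc_curve

        rng = np.random.default_rng(1)
        labels = np.r_[np.zeros(48), np.ones(63)]                        # 0 = control eye, 1 = POAG eye
        vessel_density = np.r_[rng.normal(60, 4, 48), rng.normal(52, 6, 63)]

        # Lower vessel density indicates disease, so score by the negative value.
        auc = roc_auc_score(labels, -vessel_density)
        fpr, tpr, _ = roc_curve(labels, -vessel_density)
        sens_at_95_spec = tpr[fpr <= 0.05].max()
        print(f"AUC = {auc:.2f}, sensitivity at 95% specificity = {sens_at_95_spec:.2f}")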

  3. 77 FR 72409 - Manufacturer of Controlled Substances; Notice of Application: Siemens Healthcare Diagnostics, Inc.

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-05

    ... DEPARTMENT OF JUSTICE Drug Enforcement Administration Manufacturer of Controlled Substances; Notice of Application: Siemens Healthcare Diagnostics, Inc. Pursuant to Sec. 1301.33(a) Title 21 of the Code of Federal Regulations (CFR), this is notice that on November 7, 2012, Siemens Healthcare...

  4. Research and Diagnostic Applications of Monoclonal Antibodies to Coccidioides immitis.

    DTIC Science & Technology

    1985-01-01

    Report documentation excerpt (truncated): presentations on research and diagnostic applications of monoclonal antibodies to Coccidioides immitis at the IX Congress of the International Society for Human and Animal Mycology, Atlanta, GA, May 1985.

  5. Auditory brainstem response (ABR) profiling tests as diagnostic support for schizophrenia and adult attention-deficit hyperactivity disorder (ADHD).

    PubMed

    Juselius Baghdassarian, Eva; Nilsson Markhed, Maria; Lindström, Eva; Nilsson, Björn M; Lewander, Tommy

    2018-06-01

    To evaluate the performances of two auditory brainstem response (ABR) profiling tests as potential biomarkers and diagnostic support for schizophrenia and adult attention-deficit hyperactivity disorder (ADHD), respectively, in an investigator-initiated blinded study design. Male and female patients with schizophrenia (n=26) and adult ADHD (n=24) meeting Diagnostic and Statistical Manual of Mental Disorders Fourth Edition (DSM IV) diagnostic criteria and healthy controls (n=58) comprised the analysis set (n=108) of the total number of study participants (n=119). Coded sets of randomized ABR recordings were analysed by an independent party blinded to clinical diagnoses before a joint code-breaking session. The ABR profiling test for schizophrenia identified schizophrenia patients versus controls with a sensitivity of 84.6% and a specificity of 93.1%. The ADHD test identified patients with adult ADHD versus controls with a sensitivity of 87.5% and a specificity of 91.4%. The ABR profiling tests discriminated schizophrenia and ADHD versus healthy controls with high sensitivity and specificity. The methods deserve to be further explored in larger clinical studies including a broad range of psychiatric disorders to determine their utility as potential diagnostic biomarkers.

  6. DXRaySMCS: a user-friendly interface developed for prediction of diagnostic radiology X-ray spectra produced by Monte Carlo (MCNP-4C) simulation.

    PubMed

    Bahreyni Toossi, M T; Moradi, H; Zare, H

    2008-01-01

    In this work, the general purpose Monte Carlo N-particle radiation transport computer code (MCNP-4C) was used for the simulation of X-ray spectra in diagnostic radiology. The electron's path in the target was followed until its energy was reduced to 10 keV. A user-friendly interface named 'diagnostic X-ray spectra by Monte Carlo simulation (DXRaySMCS)' was developed to facilitate the application of MCNP-4C code for diagnostic radiology spectrum prediction. The program provides a user-friendly interface for: (i) modifying the MCNP input file, (ii) launching the MCNP program to simulate electron and photon transport and (iii) processing the MCNP output file to yield a summary of the results (relative photon number per energy bin). In this article, the development and characteristics of DXRaySMCS are outlined. As part of the validation process, output spectra for 46 diagnostic radiology system settings produced by DXRaySMCS were compared with the corresponding IPEM78. Generally, there is a good agreement between the two sets of spectra. No statistically significant differences have been observed between IPEM78 reported spectra and the simulated spectra generated in this study.

  7. Electron density measurements in STPX plasmas

    NASA Astrophysics Data System (ADS)

    Clark, Jerry; Williams, R.; Titus, J. B.; Mezonlin, E. D.; Akpovo, C.; Thomas, E.

    2017-10-01

    Diagnostics have been installed to measure the electron density of Spheromak Turbulent Physics Experiment (STPX) plasmas at Florida A. & M. University. An insertable probe, provided by Auburn University, combining a triple-tipped Langmuir probe with a radial array of three ion saturation current / floating potential rings, has been installed to measure instantaneous plasma density, temperature, and plasma potential. As the ramp-up of the experimental program commences, initial electron density measurements from the triple probe show that the electron density is on the order of 10^19 particles/m^3. For a passive measurement, a CO2 interferometer system has been designed and installed for measuring line-averaged densities and to corroborate the Langmuir measurements. We describe the design, calibration, and performance of these diagnostic systems on large volume STPX plasmas.
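
    A hedged sketch of standard triple-probe relations (not necessarily the exact STPX analysis): T_e ≈ V_d2 / ln 2, where V_d2 is the potential difference between the positively biased tip and the floating tip, and n_e from the Bohm ion saturation current. The probe area, ion mass, and input values below are assumptions.

        import numpy as np

        e = 1.602e-19        # elementary charge, C
        m_i = 1.67e-27       # ion mass, kg (hydrogen assumed)

        def triple_probe_Te(v_d2_volts):
            """Electron temperature in eV from the standard triple-probe relation."""
            return v_d2_volts / np.log(2.0)

        def density_from_isat(i_sat, probe_area, te_ev):
            """Electron density from the Bohm ion saturation current, n_e ~ I_sat / (0.61 e A c_s)."""
            c_s = np.sqrt(e * te_ev / m_i)               # ion sound speed, m/s
            return i_sat / (0.61 * e * probe_area * c_s)

        te = triple_probe_Te(7.0)                        # illustrative 7 V tip-to-floating difference
        print(te, density_from_isat(5e-3, 1e-6, te))     # order 10^18-10^19 m^-3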

  8. A study of high density bit transition requirements versus the effects on BCH error correcting coding

    NASA Technical Reports Server (NTRS)

    Ingels, F.; Schoggen, W. O.

    1981-01-01

    The various methods of high bit transition density encoding are presented, and their relative performance is compared with respect to error propagation characteristics, transition properties, and system constraints. A computer simulation of the system using the specific PN code recommended is included.

  9. STELLTRANS: A Transport Analysis Suite for Stellarators

    NASA Astrophysics Data System (ADS)

    Mittelstaedt, Joseph; Lazerson, Samuel; Pablant, Novimir; Weir, Gavin; W7-X Team

    2016-10-01

    The stellarator transport code STELLTRANS allows us to better analyze the power balance in W7-X. Although profiles of temperature and density are measured experimentally, geometrical factors are needed in conjunction with these measurements to properly analyze heat flux densities in stellarators. The STELLTRANS code interfaces with VMEC to find an equilibrium flux surface configuration and with TRAVIS to determine the RF heating and current drive in the plasma. Stationary transport equations are then solved using a boundary value differential equation solver. The equations and quantities are averaged over flux surfaces to reduce the system to an essentially one-dimensional problem. We have applied this code to data from W7-X and were able to calculate the heat flux coefficients. We will also present extensions of the code to a predictive capability that would use DKES to find neoclassical transport coefficients and update the temperature and density profiles.
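
    A hedged sketch of the kind of flux-surface-averaged, stationary power balance such an analysis solves, written generically (not necessarily in STELLTRANS's exact form), with V'(\rho) = dV/d\rho the geometric factor from the VMEC equilibrium, \vec{q}_e the electron heat flux, and S_e the net local source from RF heating and losses:

        \frac{1}{V'(\rho)}\,\frac{d}{d\rho}\Big[ V'(\rho)\,\big\langle \vec{q}_e \cdot \nabla\rho \big\rangle \Big] = S_e(\rho),
        \qquad \vec{q}_e = -\,n_e\,\chi_e\,\nabla T_e .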

  10. Trends in Gastroenteritis-associated Mortality in the United States 1985-2005: Variations by ICD-9 and ICD-10 Codes

    EPA Science Inventory

    BackgroundTrends in gastroenteritis-associated mortality are changing over time with development of antibiotic resistant strains of certain pathogens, improved diagnostic methods, and changing healthcare. In 1999, ICD-10 coding was introduced for mortality records which can also ...

  11. Numerical Studies of Impurities in Fusion Plasmas

    DOE R&D Accomplishments Database

    Hulse, R. A.

    1982-09-01

    The coupled partial differential equations used to describe the behavior of impurity ions in magnetically confined controlled fusion plasmas require numerical solution for cases of practical interest. Computer codes developed for impurity modeling at the Princeton Plasma Physics Laboratory are used as examples of the types of codes employed for this purpose. These codes solve for the impurity ionization state densities and associated radiation rates using atomic physics appropriate for these low-density, high-temperature plasmas. The simpler codes solve local equations in zero spatial dimensions while more complex cases require codes which explicitly include transport of the impurity ions simultaneously with the atomic processes of ionization and recombination. Typical applications are discussed and computational results are presented for selected cases of interest.
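
    In their simplest form, the coupled equations referred to above have the standard ionization-balance structure sketched below (a generic form, not a specific Princeton code's notation), where n_Z is the density of charge state Z, S_Z and \alpha_Z are the electron-impact ionization and recombination rate coefficients, and the transport term \Gamma_Z is dropped in the zero-dimensional, local codes:

        \frac{\partial n_Z}{\partial t} + \nabla \cdot \Gamma_Z
          = n_e \Big[ S_{Z-1}\, n_{Z-1} - \big( S_Z + \alpha_Z \big)\, n_Z + \alpha_{Z+1}\, n_{Z+1} \Big]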

  12. Correlates of residential wiring code used in studies of health effects of residential electromagnetic fields.

    PubMed

    Bracken, M B; Belanger, K; Hellenbrand, K; Addesso, K; Patel, S; Triche, E; Leaderer, B P

    1998-09-01

    The home wiring code is the most widely used metric for studies of residential electromagnetic field (EMF) exposure and health effects. Despite the fact that wiring code often shows stronger correlations with disease outcome than more direct EMF home assessments, little is known about potential confounders of the wiring code association. In a study carried out in southern Connecticut in 1988-1991, the authors used strict and widely used criteria to assess the wiring codes of 3,259 homes in which respondents lived. They also collected other home characteristics from the tax assessor's office, estimated traffic density around the home from state data, and interviewed each subject (2,967 mothers of reproductive age) for personal characteristics. Women who lived in very high current configuration wiring coded homes were more likely to be in manual jobs and their homes were older (built before 1949, odds ratio (OR) = 73.24, 95% confidence interval (CI) 29.53-181.65) and had lower assessed value and higher traffic densities (highest density quartile, OR = 3.99, 95% CI 1.17-13.62). Because some of these variables have themselves been associated with health outcomes, the possibility of confounding of the wiring code associations must be rigorously evaluated in future EMF research.

  13. Numerical simulation of inductive method for determining spatial distribution of critical current density

    NASA Astrophysics Data System (ADS)

    Kamitani, A.; Takayama, T.; Tanaka, A.; Ikuno, S.

    2010-11-01

    The inductive method for measuring the critical current density j_C in a high-temperature superconducting (HTS) thin film has been investigated numerically. In order to simulate the method, a non-axisymmetric numerical code has been developed for analyzing the time evolution of the shielding current density. In the code, the governing equation of the shielding current density is spatially discretized with the finite element method and the resulting first-order ordinary differential system is solved by using the 5th-order Runge-Kutta method with an adaptive step-size control algorithm. By using the code, the threshold current I_T is evaluated for various positions of a coil. The results of computations show that, near a film edge, the accuracy of the estimating formula for j_C is remarkably degraded. Moreover, even the proportional relationship between j_C and I_T will be lost there. Hence, the critical current density near a film edge cannot be estimated by using the inductive method.

  14. Python Radiative Transfer Emission code (PyRaTE): non-LTE spectral lines simulations

    NASA Astrophysics Data System (ADS)

    Tritsis, A.; Yorke, H.; Tassis, K.

    2018-05-01

    We describe PyRaTE, a new, non-local thermodynamic equilibrium (non-LTE) line radiative transfer code developed specifically for post-processing astrochemical simulations. Population densities are estimated using the escape probability method. When computing the escape probability, the optical depth is calculated towards all directions with density, molecular abundance, temperature and velocity variations all taken into account. A very easy-to-use interface, capable of importing data from simulations outputs performed with all major astrophysical codes, is also developed. The code is written in PYTHON using an "embarrassingly parallel" strategy and can handle all geometries and projection angles. We benchmark the code by comparing our results with those from RADEX (van der Tak et al. 2007) and against analytical solutions and present case studies using hydrochemical simulations. The code will be released for public use.
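
    A minimal sketch of the escape-probability closure named above, in its basic uniform-medium form beta(tau) = (1 - exp(-tau)) / tau; PyRaTE's actual averaging over directions, density, abundance, temperature, and velocity fields is more involved than this.

        import numpy as np

        def escape_probability(tau):
            """Basic escape probability (1 - exp(-tau)) / tau, with the tau -> 0 limit handled."""
            tau = np.asarray(tau, dtype=float)
            small = np.abs(tau) < 1e-6
            safe_tau = np.where(small, 1.0, tau)
            return np.where(small, 1.0 - tau / 2.0, (1.0 - np.exp(-tau)) / safe_tau)

        print(escape_probability([0.0, 0.1, 1.0, 10.0]))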

  15. The impacts of marijuana dispensary density and neighborhood ecology on marijuana abuse and dependence.

    PubMed

    Mair, Christina; Freisthler, Bridget; Ponicki, William R; Gaidus, Andrew

    2015-09-01

    As an increasing number of states liberalize cannabis use and develop laws and local policies, it is essential to better understand the impacts of neighborhood ecology and marijuana dispensary density on marijuana use, abuse, and dependence. We investigated associations between marijuana abuse/dependence hospitalizations and community demographic and environmental conditions from 2001 to 2012 in California, as well as cross-sectional associations between local and adjacent marijuana dispensary densities and marijuana hospitalizations. We analyzed panel population data relating hospitalizations coded for marijuana abuse or dependence and assigned to residential ZIP codes in California from 2001 through 2012 (20,219 space-time units) to ZIP code demographic and ecological characteristics. Bayesian space-time misalignment models were used to account for spatial variations in geographic unit definitions over time, while also accounting for spatial autocorrelation using conditional autoregressive priors. We also analyzed cross-sectional associations between marijuana abuse/dependence and the density of dispensaries in local and spatially adjacent ZIP codes in 2012. An additional one dispensary per square mile in a ZIP code was cross-sectionally associated with a 6.8% increase in the number of marijuana hospitalizations (95% credible interval 1.033, 1.105) with a marijuana abuse/dependence code. Other local characteristics, such as the median household income and age and racial/ethnic distributions, were associated with marijuana hospitalizations in cross-sectional and panel analyses. Prevention and intervention programs for marijuana abuse and dependence may be particularly essential in areas of concentrated disadvantage. Policy makers may want to consider regulations that limit the density of dispensaries. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  16. A cross-sectional prevalence study of ethnically targeted and general audience outdoor obesity-related advertising.

    PubMed

    Yancey, Antronette K; Cole, Brian L; Brown, Rochelle; Williams, Jerome D; Hillier, Amy; Kline, Randolph S; Ashe, Marice; Grier, Sonya A; Backman, Desiree; McCarthy, William J

    2009-03-01

    Commercial marketing is a critical but understudied element of the sociocultural environment influencing Americans' food and beverage preferences and purchases. This marketing also likely influences the utilization of goods and services related to physical activity and sedentary behavior. A growing literature documents the targeting of racial/ethnic and income groups in commercial advertisements in magazines, on billboards, and on television that may contribute to sociodemographic disparities in obesity and chronic disease risk and protective behaviors. This article examines whether African Americans, Latinos, and people living in low-income neighborhoods are disproportionately exposed to advertisements for high-calorie, low nutrient-dense foods and beverages and for sedentary entertainment and transportation and are relatively underexposed to advertising for nutritious foods and beverages and goods and services promoting physical activities. Outdoor advertising density and content were compared in zip code areas selected to offer contrasts by area income and ethnicity in four cities: Los Angeles, Austin, New York City, and Philadelphia. Large variations were observed in the amount, type, and value of advertising in the selected zip code areas. Living in an upper-income neighborhood, regardless of its residents' predominant ethnicity, is generally protective against exposure to most types of obesity-promoting outdoor advertising (food, fast food, sugary beverages, sedentary entertainment, and transportation). The density of advertising varied by zip code area race/ethnicity, with African American zip code areas having the highest advertising densities, Latino zip code areas having slightly lower densities, and white zip code areas having the lowest densities. The potential health and economic implications of differential exposure to obesity-related advertising are substantial. Although substantive legal questions remain about the government's ability to regulate advertising, the success of limiting tobacco advertising offers lessons for reducing the marketing contribution to the obesigenicity of urban environments.

  17. Low complexity Reed-Solomon-based low-density parity-check design for software defined optical transmission system based on adaptive puncturing decoding algorithm

    NASA Astrophysics Data System (ADS)

    Pan, Xiaolong; Liu, Bo; Zheng, Jianglong; Tian, Qinghua

    2016-08-01

    We propose and demonstrate a low complexity Reed-Solomon-based low-density parity-check (RS-LDPC) code with adaptive puncturing decoding algorithm for elastic optical transmission system. Partial received codes and the relevant column in parity-check matrix can be punctured to reduce the calculation complexity by adaptive parity-check matrix during decoding process. The results show that the complexity of the proposed decoding algorithm is reduced by 30% compared with the regular RS-LDPC system. The optimized code rate of the RS-LDPC code can be obtained after five times iteration.

  18. Structured Low-Density Parity-Check Codes with Bandwidth Efficient Modulation

    NASA Technical Reports Server (NTRS)

    Cheng, Michael K.; Divsalar, Dariush; Duy, Stephanie

    2009-01-01

    In this work, we study the performance of structured Low-Density Parity-Check (LDPC) codes together with bandwidth-efficient modulations. We consider protograph-based LDPC codes that facilitate high-speed hardware implementations and have minimum distances that grow linearly with block sizes. We cover various higher-order modulations such as 8-PSK, 16-APSK, and 16-QAM. During demodulation, a demapper transforms the received in-phase and quadrature samples into reliability information that feeds the binary LDPC decoder. We will compare various low-complexity demappers and provide simulation results for assorted coded-modulation combinations on the additive white Gaussian noise and independent Rayleigh fading channels.
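
    A minimal sketch of the demapper step described above: a max-log approximation of bit log-likelihood ratios from one received complex sample for an 8-PSK constellation, which would then feed a binary LDPC decoder. The natural-binary bit labeling, noise variance, and sample value are illustrative assumptions (practical systems typically use Gray mapping).

        import numpy as np

        M = 8
        constellation = np.exp(2j * np.pi * np.arange(M) / M)                 # unit-energy 8-PSK points
        labels = np.array([[int(b) for b in format(i, "03b")] for i in range(M)])

        def maxlog_llrs(received_sample, noise_var):
            """Per-bit LLRs via the max-log approximation over the constellation."""
            metrics = -np.abs(received_sample - constellation) ** 2 / noise_var
            return np.array([metrics[labels[:, bit] == 0].max() - metrics[labels[:, bit] == 1].max()
                             for bit in range(labels.shape[1])])

        print(maxlog_llrs(0.9 + 0.2j, noise_var=0.1))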

  19. Principles of Billing for Diagnostic Ultrasound in the Office and Operating Room.

    PubMed

    Grasu, Beatrice L; Wolock, Bruce S; Sedgley, Matthew D; Murphy, Michael S

    2018-05-08

    Ultrasound is becoming more prevalent as physicians gain comfort in its diagnostic and therapeutic uses. It allows for both static and dynamic evaluation of conditions and assists in therapeutic injections of joints and tendons. Proper technique is necessary for successful use of this modality. Appropriate coding for physician reimbursement is required. We discuss common wrist and hand pathology for which ultrasound may be useful as an adjunct to diagnosis and treatment and provide an overview of technique and reimbursement codes when using ultrasound in a variety of situations. Copyright © 2018 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.

  20. Current Challenges in the First Principle Quantitative Modelling of the Lower Hybrid Current Drive in Tokamaks

    NASA Astrophysics Data System (ADS)

    Peysson, Y.; Bonoli, P. T.; Chen, J.; Garofalo, A.; Hillairet, J.; Li, M.; Qian, J.; Shiraiwa, S.; Decker, J.; Ding, B. J.; Ekedahl, A.; Goniche, M.; Zhai, X.

    2017-10-01

    The Lower Hybrid (LH) wave is widely used in existing tokamaks for tailoring current density profile or extending pulse duration to steady-state regimes. Its high efficiency makes it particularly attractive for a fusion reactor, leading to consider it for this purpose in ITER tokamak. Nevertheless, if basics of the LH wave in tokamak plasma are well known, quantitative modeling of experimental observations based on first principles remains a highly challenging exercise, despite considerable numerical efforts achieved so far. In this context, a rigorous methodology must be carried out in the simulations to identify the minimum number of physical mechanisms that must be considered to reproduce experimental shot to shot observations and also scalings (density, power spectrum). Based on recent simulations carried out for EAST, Alcator C-Mod and Tore Supra tokamaks, the state of the art in LH modeling is reviewed. The capability of fast electron bremsstrahlung, internal inductance li and LH driven current at zero loop voltage to constrain all together LH simulations is discussed, as well as the needs of further improvements (diagnostics, codes, LH model), for robust interpretative and predictive simulations.

  1. Directed high-power THz radiation from transverse laser wakefield excited in an electron density filament

    NASA Astrophysics Data System (ADS)

    Kalmykov, Serge; Englesbe, Alexander; Elle, Jennifer; Domonkos, Matthew; Schmitt-Sody, Andreas

    2017-10-01

    A tightly focused femtosecond, weakly relativistic laser pulse partially ionizes the ambient gas, creating a string (a ``filament'') of electron density, locally reducing the nonlinear index and compensating for the self-focusing effect caused by bound electrons. While maintaining the filament over many Rayleigh lengths, the pulse drives inside it a three-dimensional (3D) wave of charge separation - the plasma wake. If the pulse waist size is much smaller than the Langmuir wavelength, electron current in the wake is mostly transverse. Electrons, driven by the wake across the sharp radial boundary of the filament, lose coherence within 2-3 periods of wakefield oscillations, and the wake decays. The laser pulse is thus accompanied by a short-lived, almost aperiodic electron current coupled to the sharp index gradient. The comprehensive 3D hydrodynamic model shows that this structure emits a broad-band THz radiation, with the highest power emitted in the near-forward direction. The THz radiation pattern contains information on wake currents surrounding the laser pulse, thus serving as an all-optical diagnostic tool. The results are tested in cylindrical and full 3D PIC simulations using codes WAKE and EPOCH.

  2. Tracking Energetics of a CME Core in the Low Solar Corona

    NASA Astrophysics Data System (ADS)

    Kocher, M.; Landi, E.; Lepri, S. T.

    2017-12-01

    In order to understand the processes that generate CMEs, and develop the ability to predict their evolution and geoeffectiveness, it is very important to determine how the plasma properties within coronal mass ejections (CME) evolve through their journey from the low corona through the solar environment. This study uses a combination of remote-sensing and in-situ observations of a filament eruption (that later formed the core of the CME) that left the Sun on August 4th, 2011 - shortly after an M-class flare. Separate absorption and emission diagnostic techniques are utilized to compute time-evolution estimates of the density and temperature of multiple plasma parcels within the filament using SDO/AIA EUV images. Twin STEREO spacecraft observations are used to estimate the height, speed, and acceleration of the CME at corresponding times. These observation-based densities, temperatures, and speeds allowed us to use the Michigan Ionization Code to compute the ionization history of this CME in the low solar corona. Along with the thermal and kinetic properties of this CME, we present a comparison with existing CME evolution models and draw inferences on its heating and acceleration.

  3. Accumulate repeat accumulate codes

    NASA Technical Reports Server (NTRS)

    Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung

    2004-01-01

    In this paper we propose an innovative channel coding scheme called 'Accumulate Repeat Accumulate codes' (ARA). This class of codes can be viewed as serial turbo-like codes, or as a subclass of Low Density Parity Check (LDPC) codes, so belief propagation can be used for iterative decoding of ARA codes on a graph. The encoder structure for this class can be viewed as a precoded Repeat Accumulate (RA) code or as a precoded Irregular Repeat Accumulate (IRA) code, where simply an accumulator is chosen as the precoder. Thus ARA codes have a simple and very fast encoder structure when represented as LDPC codes. Based on density evolution for LDPC codes, we show through some examples of ARA codes that for a maximum variable node degree of 5, a minimum bit SNR as low as 0.08 dB from channel capacity can be achieved for rate 1/2 as the block size goes to infinity. Thus, for a fixed low maximum variable node degree, the threshold outperforms not only RA and IRA codes but also the best known LDPC codes with the same maximum node degree. Furthermore, by puncturing the accumulators, any desired high-rate code close to rate 1 can be obtained with thresholds that stay uniformly close to the channel capacity thresholds. Iterative decoding simulation results are provided. ARA codes also have a projected graph or protograph representation that allows for high-speed decoder implementation.
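
    A minimal sketch of the serial encoder chain described above (precode with an accumulator, repeat, interleave, accumulate) over GF(2); the random interleaver and the tiny input are illustrative, and puncturing of the accumulator outputs is omitted.

        import numpy as np

        def accumulate(bits):
            """GF(2) accumulator, i.e. a running XOR (the 1/(1+D) convolution)."""
            return np.cumsum(bits) % 2

        def ara_encode(info_bits, repeat=3, seed=2):
            pre = accumulate(info_bits)                  # precoder: accumulate
            rep = np.repeat(pre, repeat)                 # repeat each precoded bit
            perm = np.random.default_rng(seed).permutation(rep.size)
            return accumulate(rep[perm])                 # interleave, then accumulate again

        print(ara_encode(np.array([1, 0, 1, 1, 0])))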

  4. Experimental studies of toroidal correlations of plasma density fluctuations along the magnetic field lines in the T-10 tokamak and first results of numerical modeling

    NASA Astrophysics Data System (ADS)

    Buldakov, M. A.; Vershkov, V. A.; Isaev, M. Yu; Shelukhin, D. A.

    2017-10-01

    The antenna system of the reflectometry diagnostics at the T-10 tokamak allows the study of long-range toroidal correlations of plasma density fluctuations along the magnetic field lines. The antenna systems are installed in two poloidal cross-sections of the vacuum chamber separated by a 90° angle in the toroidal direction. The experiments, which were conducted at the low field side, showed that a high level of toroidal correlation is observed only for quasi-coherent fluctuations, whereas broadband and stochastic low-frequency fluctuations are not correlated. Numerical modeling of the plasma turbulence structure in the T-10 tokamak was conducted to interpret the experimental results and to take into account the non-locality of reflectometry measurements. In the model used, it was assumed that the magnitudes of the density fluctuations are constant along the magnetic field lines. The 2D full-wave Tamic-RTH code was used to model the reflectometry signals. A high level of correlation for quasi-coherent fluctuations was obtained in the modeling, which agrees with the experimental observations. However, the modeling also predicts a high level of correlation for broadband fluctuations, which contradicts the experimental data. The modeling showed that the effective reflection radius, from which the information on quasi-coherent plasma turbulence is obtained, is shifted outwards from the reflection radius by approximately 7 mm.

  5. Mean flows and blob velocities in scrape-off layer (SOLT) simulations of an L-mode discharge on Alcator C-Mod

    DOE PAGES

    Russell, D. A.; Myra, J. R.; D'Ippolito, D. A.; ...

    2016-06-10

    Two-dimensional scrape-off layer turbulence (SOLT) code simulations are compared with an L-mode discharge on the Alcator C-Mod tokamak [M. Greenwald, et al., Phys. Plasmas 21, 110501 (2014)]. Density and temperature profiles for the simulations were obtained by smoothly fitting Thomson scattering and mirror Langmuir probe (MLP) data from the shot. Simulations differing in turbulence intensity were obtained by varying a dissipation parameter. Mean flow profiles and density fluctuation amplitudes are consistent with those measured by MLP in the experiment and with a Fourier space diagnostic designed to measure poloidal phase velocity. Blob velocities in the simulations were determined from the correlation function for density fluctuations, as in the analysis of gas-puff-imaging (GPI) blobs in the experiment. In the simulations, it was found that larger blobs moved poloidally with the ExB flow velocity, v_E, in the near-SOL, while smaller fluctuations moved with the group velocity of the dominant linear (interchange) mode, v_E + (1/2) v_di, where v_di is the ion diamagnetic drift velocity. Comparisons are made with the measured GPI correlation velocity for the discharge. The saturation mechanisms operative in the simulation of the discharge are also discussed. In conclusion, it is found that neither sheared flow nor pressure gradient modification can be excluded as saturation mechanisms.
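
    A minimal sketch of the correlation-based velocity estimate mentioned above: the poloidal velocity follows from the time lag that maximizes the cross-correlation of density fluctuations at two poloidally separated points. The helper below is generic (signal lengths, separation, and the sign convention of the lag are assumptions), not the SOLT or GPI analysis code.

        import numpy as np
        from scipy.signal import correlate, correlation_lags

        def correlation_velocity(sig_a, sig_b, separation_m, dt_s):
            """Velocity from the lag of peak cross-correlation between two fluctuation signals."""
            a = sig_a - sig_a.mean()
            b = sig_b - sig_b.mean()
            xcorr = correlate(a, b, mode="full")
            lags = correlation_lags(a.size, b.size, mode="full")
            lag_s = lags[np.argmax(xcorr)] * dt_s
            return separation_m / lag_s if lag_s != 0 else float("inf")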

  6. Performance of JT-60SA divertor Thomson scattering diagnostics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kajita, Shin, E-mail: kajita.shin@nagoya-u.jp; Hatae, Takaki; Tojo, Hiroshi

    2015-08-15

    For the satellite tokamak JT-60 Super Advanced (JT-60SA), a divertor Thomson scattering measurement system is planned to be installed. In this study, we improved the design of the collection optics based on the previous one, in which it was found that the solid angle of the collection optics became very small, mainly because of poor accessibility to the measurement region. By this improvement, the solid angle was increased by up to approximately five times. To accurately assess the measurement performance, background noise was assessed using the plasma parameters in two typical discharges in JT-60SA calculated from the SONIC code. Moreover, the influence of the reflection of bremsstrahlung radiation by the wall is simulated by using a ray tracing simulation. The errors in the temperature and the density are assessed based on the simulation results for three typical fields of view.

  7. Performance of JT-60SA divertor Thomson scattering diagnostics.

    PubMed

    Kajita, Shin; Hatae, Takaki; Tojo, Hiroshi; Enokuchi, Akito; Hamano, Takashi; Shimizu, Katsuhiro; Kawashima, Hisato

    2015-08-01

    For the satellite tokamak JT-60 Super Advanced (JT-60SA), a divertor Thomson scattering measurement system is planned to be installed. In this study, we improved the design of the collection optics based on the previous one, in which it was found that the solid angle of the collection optics became very small, mainly because of poor accessibility to the measurement region. By this improvement, the solid angle was increased by up to approximately five times. To accurately assess the measurement performance, background noise was assessed using the plasma parameters in two typical discharges in JT-60SA calculated from the SONIC code. Moreover, the influence of the reflection of bremsstrahlung radiation by the wall is simulated by using a ray tracing simulation. The errors in the temperature and the density are assessed based on the simulation results for three typical fields of view.

  8. Identification of novel diagnostic biomarkers for thyroid carcinoma

    PubMed Central

    Wang, Xiliang; Zhang, Qing; Cai, Zhiming; Dai, Yifan; Mou, Lisha

    2017-01-01

    Thyroid carcinoma (THCA) is the most common endocrine malignancy worldwide. Unfortunately, a limited number of large-scale analyses have been performed to identify biomarkers for THCA. Here, we conducted a meta-analysis using 505 THCA patients and 59 normal controls from The Cancer Genome Atlas. After identifying differentially expressed long non-coding RNAs (lncRNA) and protein coding genes (PCG), we found vast differences in various lncRNA-PCG co-expressed pairs in THCA. A dysregulation network with scale-free topology was constructed. Four molecules (LA16c-380H5.2, RP11-203J24.8, MLF1 and SDC4) could potentially serve as diagnostic biomarkers of THCA with high sensitivity and specificity. We further present a diagnostic panel with expression cutoff values. Our results demonstrate the potential application of these four molecules as novel independent biomarkers for THCA diagnosis. PMID:29340074

  9. A comparison of cosmological hydrodynamic codes

    NASA Technical Reports Server (NTRS)

    Kang, Hyesung; Ostriker, Jeremiah P.; Cen, Renyue; Ryu, Dongsu; Hernquist, Lars; Evrard, August E.; Bryan, Greg L.; Norman, Michael L.

    1994-01-01

    We present a detailed comparison of the simulation results of various hydrodynamic codes. Starting with identical initial conditions based on the cold dark matter scenario for the growth of structure, with parameters h = 0.5, Omega = Omega_b = 1, and sigma_8 = 1, we integrate from redshift z = 20 to z = 0 to determine the physical state within a representative volume of size L^3 where L = 64 h^-1 Mpc. Five independent codes are compared: three of them Eulerian mesh-based and two variants of the smooth particle hydrodynamics (SPH) Lagrangian approach. The Eulerian codes were run at N^3 = 32^3, 64^3, 128^3, and 256^3 cells, the SPH codes at N^3 = 32^3 and 64^3 particles. Results were then rebinned to a 16^3 grid with the expectation that the rebinned data should converge, by all techniques, to a common and correct result as N approaches infinity. We find that global averages of various physical quantities do, as expected, tend to converge in the rebinned model, but that uncertainties in even primitive quantities such as <T> and <rho^2>^(1/2) persist at the 3%-17% level. The codes achieve comparable and satisfactory accuracy for comparable computer time in their treatment of the high-density, high-temperature regions as measured in the rebinned data; the variance among the five codes (at highest resolution) for the mean temperature (as weighted by rho^2) is only 4.5%. Examined at high resolution, we suspect that the density resolution is better in the SPH codes and the thermal accuracy in low-density regions better in the Eulerian codes. In the low-density, low-temperature regions the SPH codes have poor accuracy due to statistical effects, and the Jameson code gives temperatures which are too high, due to overuse of artificial viscosity in these high Mach number regions. Overall the comparison allows us to better estimate errors; it points to ways of improving this current generation of hydrodynamic codes and of suiting their use to problems which exploit their best individual features.

  10. Axial deformed solution of the Skyrme-Hartree-Fock-Bogolyubov equations using the transformed harmonic oscillator Basis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perez, R. Navarro; Schunck, N.; Lasseri, R.

    2017-03-09

    HFBTHO is a physics computer code that is used to model the structure of the nucleus. It is an implementation of nuclear energy Density Functional Theory (DFT), where the energy of the nucleus is obtained by integration over space of some phenomenological energy density, which is itself a functional of the neutron and proton densities. In HFBTHO, the energy density derives either from the zero-range Skyrme or the finite-range Gogny effective two-body interaction between nucleons. Nuclear superfluidity is treated at the Hartree-Fock-Bogoliubov (HFB) approximation, and axial symmetry of the nuclear shape is assumed. This version is the 3rd release of the program; the two previous versions were published in Computer Physics Communications [1,2]. The previous version was released at LLNL under the GPL 3 Open Source License and was given release code LLNL-CODE-573953.

  11. LDPC product coding scheme with extrinsic information for bit patterned media recording

    NASA Astrophysics Data System (ADS)

    Jeong, Seongkwon; Lee, Jaejin

    2017-05-01

    Since the density limit of the current perpendicular magnetic storage system will soon be reached, bit patterned media recording (BPMR) is a promising candidate for the next generation storage system to achieve an areal density beyond 1 Tb/in^2. Each recording bit is stored in a fabricated magnetic island and the space between the magnetic islands is nonmagnetic in BPMR. To approach recording densities of 1 Tb/in^2, the spacing of the magnetic islands must be less than 25 nm. Consequently, severe inter-symbol interference (ISI) and inter-track interference (ITI) occur. ITI and ISI degrade the performance of BPMR. In this paper, we propose a low-density parity check (LDPC) product coding scheme that exploits extrinsic information for BPMR. This scheme shows an improved bit error rate performance compared to that in which one LDPC code is used.

  12. Electron density increases due to Lightning activity as deduced from LWPC code and VLF signal perturbations.

    NASA Astrophysics Data System (ADS)

    Samir, Nait Amor; Bouderba, Yasmina

    VLF signal perturbations in association with thunderstorm activity appear as changes in the signal amplitude and phase. Several papers have reported on the characteristics of these perturbations and their connection to the lightning stroke amplitude and polarity. In this contribution, we quantified the electron density increases due to lightning activity by using the LWPC code and the VLF signal perturbation parameters. The method is similar to that used to study the effect of solar eruptions. The results showed that the reference height (h') decreased to lower altitudes (between 70 and 80 km). The maximum of the electron density was then deduced from the LWPC code results. A numerical simulation of the time dependence of the atmospheric species was then performed to study the recovery times of the electron density at different heights. The results showed that the recovery time lasts for several minutes, which explains the observation of long-recovery Early signal perturbations.

  13. Importance of Proper Utilization of International Classification of Diseases 10th Revision and Clinical Documentation in Modern Payment Models.

    PubMed

    Nichols, Joseph C; Osmani, Feroz A; Sayeed, Yousuf

    2016-05-01

    Health care payment models are changing rapidly, and the measurement of outcomes and costs is increasing. With the implementation of International Classification of Diseases 10th revision (ICD-10) codes, providers now have the ability to introduce a precise array of diagnoses for their patients. More specific diagnostic codes do not eliminate the potential for vague application, as was seen with the utility of ICD-9. Complete, accurate, and consistent data that reflect the risk, severity, and complexity of care are becoming critically important in this new environment. Orthopedic specialty organizations must be actively involved in influencing the definition of value and risk in the patient population. Now is the time to use the ICD-10 diagnostic codes to improve the management of patient conditions in data. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Laser Blow-Off Impurity Injection Experiments at the HSX Stellarator

    NASA Astrophysics Data System (ADS)

    Castillo, J. F.; Bader, A.; Likin, K. M.; Anderson, D. T.; Anderson, F. S. B.; Kumar, S. T. A.; Talmadge, J. N.

    2017-10-01

    Results from the HSX laser blow-off experiment are presented and compared to a synthetic diagnostic implemented in the STRAHL impurity transport modeling code in order to measure the impurity transport diffusivity and convective velocity. A laser blow-off impurity injection system is used to rapidly deposit a small, controlled quantity of aluminum into the confinement volume. Five AXUV photodiode arrays are used to take time-resolved measurements of the impurity radiation. The spatially one-dimensional impurity transport code STRAHL is used to calculate a time-dependent plasma emissivity profile. Modeled intensity signals calculated from a synthetic diagnostic code provide direct comparison between plasma simulation and experimental results. An optimization algorithm with impurity transport coefficients acting as free parameters is used to fit the model to experimental data. This work is supported by US DOE Grant DE-FG02-93ER54222.
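
    A hedged sketch of the fitting step described above: treat the diffusivity D and convective velocity v as free parameters and minimize the mismatch between modeled and measured chord-integrated intensities. The function strahl_forward_model is a hypothetical stand-in for a STRAHL run plus the synthetic-diagnostic line integration, not a real API, and the signal shape and noise are invented for illustration.

        import numpy as np
        from scipy.optimize import least_squares

        def strahl_forward_model(D, v, times):
            """Hypothetical stand-in for STRAHL + synthetic diagnostic: a decaying chord signal."""
            return np.exp(-times / (0.01 * (1.0 + D))) * (1.0 + 0.05 * v)

        def residuals(params, times, measured):
            D, v = params
            return strahl_forward_model(D, v, times) - measured

        times = np.linspace(0.0, 0.05, 200)
        rng = np.random.default_rng(3)
        measured = strahl_forward_model(1.2, -3.0, times) + 0.01 * rng.normal(size=times.size)
        fit = least_squares(residuals, x0=[0.5, 0.0], args=(times, measured))
        print(fit.x)  # recovered (D, v), close to the values used to generate the synthetic data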

  15. Diagnostic Suite for HyperV Coaxial Plasma Gun Development for the PLX- α Project

    NASA Astrophysics Data System (ADS)

    Case, Andrew; Brockington, Sam; Witherspoon, F. Douglas

    2015-11-01

    We present the diagnostic suite to be used during development of the coaxial guns HyperV will deliver to LANL in support of the ARPA-E Accelerating Low-Cost Plasma Heating And Assembly (ALPHA) program. For plasma jet diagnostics this includes fast photodiodes for velocimetry, a ballistic pendulum for measuring total plasmoid momentum, interferometry for line integrated plasma density, deflectometry for line integrated perpendicular density gradient measurements, and spectroscopy, both time resolved high resolution spectroscopy using a novel detector developed by HyperV and time integrated survey spectroscopy, for measurements of velocity and temperature as well as impurities. In addition, we plan to use fast pressure probes for stagnation pressure, a Faraday cup for density, fast imaging for plume geometry and time integrated imaging for overall light emission. A novel low resolution long record length camera developed by HyperV will also be used for plume diagnostics. For diagnostics of gun operation, we will use Rogowski coils to measure current, voltage dividers for voltages, B-dot probes for magnetic field, and time resolved fast photodiodes to measure plasmoid velocity inside the accelerator. This work supported by the ARPA-E ALPHA program.

  16. Qualification and implementation of line ratio spectroscopy on helium as plasma edge diagnostic at ASDEX Upgrade

    NASA Astrophysics Data System (ADS)

    Griener, M.; Muñoz Burgos, J. M.; Cavedon, M.; Birkenmeier, G.; Dux, R.; Kurzan, B.; Schmitz, O.; Sieglin, B.; Stroth, U.; Viezzer, E.; Wolfrum, E.; the ASDEX Upgrade Team

    2018-02-01

    A new thermal helium beam diagnostic has been implemented as a plasma edge diagnostic at the ASDEX Upgrade (AUG) tokamak. The helium beam is built to measure the electron density n_e and temperature T_e simultaneously with high spatial and temporal resolution in order to investigate steady-state as well as fast transport processes in the plasma edge region. For the thermal helium beam emission line ratio spectroscopy, neutral helium is locally injected into the plasma by a piezo valve. This enabled the measurement of the line resolved emission intensities of seven He I lines for different plasma scenarios in AUG. The different line ratios can be used together with a collisional-radiative model (CRM) to reconstruct the underlying electron temperature and density. Ratios from the same spin species are used for the electron density reconstruction, whereas spin-mixed ratios are sensitive to electron temperature changes. The different line ratios as well as different CRMs are tested for their suitability for diagnostic applications. Furthermore, their consistency in calculating identical parameters is validated and the resulting profiles are compared to other available diagnostics at AUG.

  17. The Thomson scattering diagnostic at Wendelstein 7-X and its performance in the first operation phase

    NASA Astrophysics Data System (ADS)

    Bozhenkov, S. A.; Beurskens, M.; Dal Molin, A.; Fuchert, G.; Pasch, E.; Stoneking, M. R.; Hirsch, M.; Höfel, U.; Knauer, J.; Svensson, J.; Trimino Mora, H.; Wolf, R. C.

    2017-10-01

    The optimized stellarator Wendelstein 7-X started operation in December 2015 with a 10 week limiter campaign. Divertor experiments will begin in the second half of 2017. The W7-X Thomson scattering system is an essential diagnostic for electron density and temperature profiles. In this paper the Thomson scattering diagnostic is described in detail, including its design, calibration, data evaluation and first experimental results. Plans for further development are also presented. The W7-X Thomson system is a Nd:YAG setup with up to five lasers, two sets of light collection lenses viewing the entire plasma cross-section, fiber bundles and filter based polychromators. To reduce hardware costs, two or three scattering volumes are measured with a single polychromator. The relative spectral calibration is carried out with the aid of a broadband supercontinuum light source. The absolute calibration is performed by observing Raman scattering in nitrogen. The electron temperatures and densities are recovered by Bayesian modelling. In the first campaign, the diagnostic was equipped for 10 scattering volumes. It provided temperature profiles comparable to those measured using an electron cyclotron emission diagnostic and line integrated densities within 10% of those from a dispersion interferometer.

  18. Magnetohydrodynamic modelling of exploding foil initiators

    NASA Astrophysics Data System (ADS)

    Neal, William

    2015-06-01

    Magnetohydrodynamic (MHD) codes are currently being developed, and used, to predict the behaviour of electrically-driven flyer-plates. These codes are of particular interest to the design of exploding foil initiator (EFI) detonators, but there is a distinct lack of comparison with high-fidelity experimental data. This study aims to compare an MHD code with a collection of temporally and spatially resolved diagnostics including PDV, dual-axis imaging and streak imaging. The results show the code's excellent representation of the flyer-plate launch and highlight features within the experiment that the model fails to capture.

  19. High density harp or wire scanner for particle beam diagnostics

    DOEpatents

    Fritsche, C.T.; Krogh, M.L.

    1996-05-21

    Disclosed is a diagnostic detector head harp used to detect and characterize high energy particle beams using an array of closely spaced detector wires, typically carbon wires spaced less than 0.1 cm (0.040 inch) apart, connected to a hybrid microcircuit formed on a ceramic substrate. Also disclosed is a method to fabricate harps, utilizing hybrid microcircuit technology, that achieves carbon wire spacing and density not previously available. The hybrid microcircuit disposed on the ceramic substrate connects electrically between the detector wires and diagnostic equipment which analyzes pulses generated in the detector wires by the high energy particle beams. 6 figs.

  20. Error Correction using Quantum Quasi-Cyclic Low-Density Parity-Check (LDPC) Codes

    NASA Astrophysics Data System (ADS)

    Jing, Lin; Brun, Todd; Quantum Research Team

    Quasi-cyclic LDPC codes can approach the Shannon capacity and have efficient decoders. Hagiwara et al. (2007) presented a method to calculate parity check matrices with high girth. Two distinct, orthogonal matrices Hc and Hd are used. Using submatrices obtained from Hc and Hd by deleting rows, we can alter the code rate. The submatrix of Hc is used to correct Pauli X errors, and the submatrix of Hd to correct Pauli Z errors. We simulated this system for depolarizing noise on USC's High Performance Computing Cluster, and obtained the block error rate (BER) as a function of the error weight and code rate. From the rates of uncorrectable errors under different error weights we can extrapolate the BER to any small error probability. Our results show that this code family can perform reasonably well even at high code rates, thus considerably reducing the overhead compared to concatenated and surface codes. This makes these codes promising as storage blocks in fault-tolerant quantum computation.

  1. ARA type protograph codes

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush (Inventor); Abbasfar, Aliazam (Inventor); Jones, Christopher R. (Inventor); Dolinar, Samuel J. (Inventor); Thorpe, Jeremy C. (Inventor); Andrews, Kenneth S. (Inventor); Yao, Kung (Inventor)

    2008-01-01

    An apparatus and method for encoding low-density parity check codes. Together with a repeater, an interleaver and an accumulator, the apparatus comprises a precoder, thus forming accumulate-repeat-accumulate (ARA) codes. Protographs representing various types of ARA codes, including AR3A, AR4A and ARJA codes, are described. High performance is obtained when compared to the performance of current repeat-accumulate (RA) or irregular-repeat-accumulate (IRA) codes.

  2. Efficient Signal, Code, and Receiver Designs for MIMO Communication Systems

    DTIC Science & Technology

    2003-06-01

    Concatenation of a tilted-QAM inner code with an LDPC outer code with a two-component iterative soft-decision decoder. ... Coding for AWGN channels has long been studied. There are well-known soft-decision codes, such as turbo codes and LDPC codes, that can approach capacity ... 1. The information bits are encoded with a low-density parity-check (LDPC) code. 2. The coded bits are randomly interleaved so that nearby bits go through different sub-channels, and are ...

  3. A code-aided carrier synchronization algorithm based on improved nonbinary low-density parity-check codes

    NASA Astrophysics Data System (ADS)

    Bai, Cheng-lin; Cheng, Zhi-hui

    2016-09-01

    In order to further improve the carrier synchronization estimation range and accuracy at low signal-to-noise ratio (SNR), this paper proposes a code-aided carrier synchronization algorithm based on improved nonbinary low-density parity-check (NB-LDPC) codes to study the performance of the polarization-division-multiplexing coherent optical orthogonal frequency division multiplexing (PDM-CO-OFDM) system in the quadrature phase shift keying (QPSK) and 16 quadrature amplitude modulation (16-QAM) modes. The simulation results indicate that this algorithm can enlarge the frequency and phase offset estimation ranges and greatly enhance the accuracy of the system, and that the bit error rate (BER) performance of the system is effectively improved compared with that of a system employing the traditional NB-LDPC code-aided carrier synchronization algorithm.

  4. Nebular and auroral emission lines of [Cl iii] in the optical spectra of planetary nebulae

    PubMed Central

    Keenan, Francis P.; Aller, Lawrence H.; Ramsbottom, Catherine A.; Bell, Kenneth L.; Crawford, Fergal L.; Hyung, Siek

    2000-01-01

    Electron impact excitation rates in Cl III, recently determined with the R-matrix code, are used to calculate electron temperature (Te) and density (Ne) emission line ratios involving both the nebular (5517.7, 5537.9 Å) and auroral (8433.9, 8480.9, 8500.0 Å) transitions. A comparison of these results with observational data for a sample of planetary nebulae, obtained with the Hamilton Echelle Spectrograph on the 3-m Shane Telescope, reveals that the R1 = I(5518 Å)/I(5538 Å) intensity ratio provides estimates of Ne in excellent agreement with the values derived from other line ratios in the echelle spectra. This agreement indicates that R1 is a reliable density diagnostic for planetary nebulae, and it also provides observational support for the accuracy of the atomic data adopted in the line ratio calculations. However the [Cl iii] 8433.9 Å line is found to be frequently blended with a weak telluric emission feature, although in those instances when the [Cl iii] intensity may be reliably measured, it provides accurate determinations of Te when ratioed against the sum of the 5518 and 5538 Å line fluxes. Similarly, the 8500.0 Å line, previously believed to be free of contamination by the Earth's atmosphere, is also shown to be generally blended with a weak telluric emission feature. The [Cl iii] transition at 8480.9 Å is found to be blended with the He i 8480.7 Å line, except in planetary nebulae that show a relatively weak He i spectrum, where it also provides reliable estimates of Te when ratioed against the nebular lines. Finally, the diagnostic potential of the near-UV [Cl iii] lines at 3344 and 3354 Å is briefly discussed. PMID:10759562

  5. Why are alcohol-related emergency department presentations under-detected? An exploratory study using nursing triage text.

    PubMed

    Indig, Devon; Copeland, Jan; Conigrave, K M; Rotenko, Irene

    2008-11-01

    This study examined two methods of detecting alcohol-related emergency department (ED) presentations, provisional medical diagnosis and nursing triage text, and compared patient and service delivery characteristics to determine which patients are being missed from formal diagnosis in order to explore why alcohol-related ED presentations are under-detected. Data were reviewed for all ED presentations from 2004 to 2006 (n = 118,881) for a major teaching hospital in Sydney, Australia. Each record included two nursing triage free-text fields, which were searched for over 60 alcohol-related terms and coded for a range of issues. Adjusted odds ratios were used to compare diagnostically coded alcohol-related presentations to those detected using triage text. Approximately 4.5% of ED presentations were identified as alcohol-related, with 24% of these identified through diagnostic codes and the remainder identified by triage text. Diagnostic coding was more likely if the patient arrived by ambulance [odds ratio (OR) = 2.35] or showed signs of aggression (OR = 1.86). Failure to code alcohol-related issues was more than three times (OR = 3.23) more likely for patients with injuries. Alcohol-related presentations place a high demand on ED staff and less than one-quarter have an alcohol-related diagnosis recorded by their treating doctor. In order for routine ED data to be more effective for detecting alcohol-related ED presentations, it is recommended that additional resources such as an alcohol health worker be employed in Australian hospitals. These workers can educate and support ED staff to identify more clearly and record the clinical signs of alcohol and directly provide brief interventions.

  6. Implementation of a 3D halo neutral model in the TRANSP code and application to projected NSTX-U plasmas

    NASA Astrophysics Data System (ADS)

    Medley, S. S.; Liu, D.; Gorelenkova, M. V.; Heidbrink, W. W.; Stagner, L.

    2016-02-01

    A 3D halo neutral code developed at the Princeton Plasma Physics Laboratory and implemented for analysis using the TRANSP code is applied to projected National Spherical Torus eXperiment-Upgrade (NSTX-U) plasmas. The legacy TRANSP code did not handle halo neutrals properly since they were distributed over the plasma volume rather than remaining in the vicinity of the neutral beam footprint as is actually the case. The 3D halo neutral code uses a ‘beam-in-a-box’ model that encompasses both injected beam neutrals and resulting halo neutrals. Upon deposition by charge exchange, a subset of the full, one-half and one-third beam energy components produce first generation halo neutrals that are tracked through successive generations until an ionization event occurs or the descendant halos exit the box. The 3D halo neutral model and neutral particle analyzer (NPA) simulator in the TRANSP code have been benchmarked with the Fast-Ion D-Alpha simulation (FIDAsim) code, which provides Monte Carlo simulations of beam neutral injection, attenuation, halo generation, halo spatial diffusion, and photoemission processes. When using the same atomic physics database, TRANSP and FIDAsim simulations achieve excellent agreement on the spatial profile and magnitude of beam and halo neutral densities and the NPA energy spectrum. The simulations show that the halo neutral density can be comparable to the beam neutral density. These halo neutrals can double the NPA flux, but they have minor effects on the NPA energy spectrum shape. The TRANSP and FIDAsim simulations also suggest that the magnitudes of beam and halo neutral densities are relatively sensitive to the choice of the atomic physics databases.
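
    The generation-by-generation bookkeeping described above (a deposited beam neutral seeds halo neutrals by charge exchange, which are followed until ionization or escape from the box) is easy to illustrate with a toy Monte Carlo. The sketch below is not the TRANSP or FIDAsim model: the box size, mean free paths, isotropic flight directions, and uniform plasma are all hypothetical stand-ins for the real geometry and atomic-physics data.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # toy "beam-in-a-box" parameters (illustrative only)
    half_widths = np.array([0.3, 0.3, 0.3])    # box half-widths (m)
    mfp_cx, mfp_ion = 0.05, 0.08               # charge-exchange / ionization mean free paths (m)
    inv_total = 1.0 / mfp_cx + 1.0 / mfp_ion   # total inverse mean free path
    p_ionize = (1.0 / mfp_ion) / inv_total     # probability that a collision event is an ionization

    def track_halo_chain(start, max_generations=10):
        """Follow the halo-neutral chain seeded by one deposited beam neutral until an
        ionization event occurs or a descendant halo leaves the box; return the generation count."""
        pos = np.array(start, dtype=float)
        for gen in range(1, max_generations + 1):
            direction = rng.normal(size=3)
            direction /= np.linalg.norm(direction)
            pos = pos + rng.exponential(1.0 / inv_total) * direction  # free flight to the next event
            if np.any(np.abs(pos) > half_widths):
                return gen                      # chain ends: the neutral escaped the box
            if rng.random() < p_ionize:
                return gen                      # chain ends: ionization event
            # otherwise the event was a charge exchange that launched the next halo generation
        return max_generations

    chains = [track_halo_chain(rng.uniform(-0.05, 0.05, 3)) for _ in range(10000)]
    print("mean halo generations per deposited beam neutral:", np.mean(chains))
    ```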

  7. Implementation of a 3D halo neutral model in the TRANSP code and application to projected NSTX-U plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Medley, S. S.; Liu, D.; Gorelenkova, M. V.

    2016-01-12

    A 3D halo neutral code developed at the Princeton Plasma Physics Laboratory and implemented for analysis using the TRANSP code is applied to projected National Spherical Torus eXperiment-Upgrade (NSTX-U) plasmas. The legacy TRANSP code did not handle halo neutrals properly since they were distributed over the plasma volume rather than remaining in the vicinity of the neutral beam footprint as is actually the case. The 3D halo neutral code uses a 'beam-in-a-box' model that encompasses both injected beam neutrals and resulting halo neutrals. Upon deposition by charge exchange, a subset of the full, one-half and one-third beam energy components produce first generation halo neutrals that are tracked through successive generations until an ionization event occurs or the descendant halos exit the box. The 3D halo neutral model and neutral particle analyzer (NPA) simulator in the TRANSP code have been benchmarked with the Fast-Ion D-Alpha simulation (FIDAsim) code, which provides Monte Carlo simulations of beam neutral injection, attenuation, halo generation, halo spatial diffusion, and photoemission processes. When using the same atomic physics database, TRANSP and FIDAsim simulations achieve excellent agreement on the spatial profile and magnitude of beam and halo neutral densities and the NPA energy spectrum. The simulations show that the halo neutral density can be comparable to the beam neutral density. These halo neutrals can double the NPA flux, but they have minor effects on the NPA energy spectrum shape. The TRANSP and FIDAsim simulations also suggest that the magnitudes of beam and halo neutral densities are relatively sensitive to the choice of the atomic physics databases.

  8. Predictive values of diagnostic codes for identifying serious hypocalcemia and dermatologic adverse events among women with postmenopausal osteoporosis in a commercial health plan database.

    PubMed

    Wang, Florence T; Xue, Fei; Ding, Yan; Ng, Eva; Critchlow, Cathy W; Dore, David D

    2018-04-10

    Post-marketing safety studies of medicines often rely on administrative claims databases to identify adverse outcomes following drug exposure. Valid ascertainment of outcomes is essential for accurate results. We aim to quantify the validity of diagnostic codes for serious hypocalcemia and dermatologic adverse events from insurance claims data among women with postmenopausal osteoporosis (PMO). We identified potential cases of serious hypocalcemia and dermatologic events through ICD-9 diagnosis codes among women with PMO within claims from a large US healthcare insurer (June 2005-May 2010). A physician adjudicated potential hypocalcemic and dermatologic events identified from the primary position on emergency department (ED) or inpatient claims through medical record review. Positive predictive values (PPVs) and 95% confidence intervals (CIs) quantified the fraction of potential cases that were confirmed. Among 165,729 patients with PMO, medical charts were obtained for 40 of 55 (73%) potential hypocalcemia cases; 16 were confirmed (PPV 40%, 95% CI 25-57%). The PPV was higher for ED than inpatient claims (82 vs. 24%). Among 265 potential dermatologic events (primarily urticaria or rash), we obtained 184 (69%) charts and confirmed 128 (PPV 70%, 95% CI 62-76%). The PPV was higher for ED than inpatient claims (77 vs. 39%). Diagnostic codes for hypocalcemia and dermatologic events may be sufficient to identify events giving rise to emergency care, but are less accurate for identifying events within hospitalizations.
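
    The headline estimates quoted above (e.g., 16 of 40 adjudicated hypocalcemia cases confirmed, PPV 40%, 95% CI 25-57%) can be reproduced with a standard exact binomial interval. The sketch below is a minimal illustration assuming Clopper-Pearson intervals, which match the quoted ranges to within rounding; the paper does not state which interval method was actually used.

    ```python
    from scipy.stats import beta

    def ppv_with_ci(confirmed: int, adjudicated: int, alpha: float = 0.05):
        """Positive predictive value with an exact (Clopper-Pearson) confidence interval."""
        ppv = confirmed / adjudicated
        lo = beta.ppf(alpha / 2, confirmed, adjudicated - confirmed + 1) if confirmed > 0 else 0.0
        hi = beta.ppf(1 - alpha / 2, confirmed + 1, adjudicated - confirmed) if confirmed < adjudicated else 1.0
        return ppv, lo, hi

    # hypocalcemia: 16 of 40 adjudicated cases confirmed -> ~40% (95% CI ~25-57%)
    print(ppv_with_ci(16, 40))
    # dermatologic events: 128 of 184 adjudicated cases confirmed -> ~70% (95% CI ~62-76%)
    print(ppv_with_ci(128, 184))
    ```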

  9. Rayleigh Scattering Diagnostic for Measurement of Velocity and Density Fluctuation Spectra

    NASA Technical Reports Server (NTRS)

    Seasholtz, Richard G.; Panda, Jayanta; Elam, Kristie A.

    2002-01-01

    A new molecular Rayleigh-scattering-based flow diagnostic is used for the first time to measure the power spectra of gas density and the radial velocity component in the plumes of high speed jets. The technique is based on analyzing the Rayleigh scattered light with a Fabry-Perot interferometer used in the static, imaging mode. The PC-based data acquisition system is capable of simultaneously sampling velocity and density at rates up to 100 kHz with data record lengths up to 10 million samples. Velocity and density power spectra and velocity-density cross spectra are presented for a subsonic jet, an underexpanded screeching jet, and for Mach 1.4 and Mach 1.8 supersonic jets. Software and hardware interfaces were developed to allow computer control of all aspects of the experiment and data acquisition.

  10. Recommendations for the standardization and interpretation of the electrocardiogram. Part II: Electrocardiography diagnostic statement list. A scientific statement from the American Heart Association Electrocardiography and Arrhythmias Committee, Council on Clinical Cardiology; the American College of Cardiology Foundation; and the Heart Rhythm Society.

    PubMed

    Mason, Jay W; Hancock, E William; Gettes, Leonard S

    2007-03-01

    This statement provides a concise list of diagnostic terms for ECG interpretation that can be shared by students, teachers, and readers of electrocardiography. This effort was motivated by the existence of multiple automated diagnostic code sets containing imprecise and overlapping terms. An intended outcome of this statement list is greater uniformity of ECG diagnosis and a resultant improvement in patient care. The lexicon includes primary diagnostic statements, secondary diagnostic statements, modifiers, and statements for the comparison of ECGs. This diagnostic lexicon should be reviewed and updated periodically.

  11. Magnetic Levitation Coupled with Portable Imaging and Analysis for Disease Diagnostics.

    PubMed

    Knowlton, Stephanie M; Yenilmez, Bekir; Amin, Reza; Tasoglu, Savas

    2017-02-19

    Currently, many clinical diagnostic procedures are complex, costly, inefficient, and inaccessible to a large population in the world. The requirements for specialized equipment and trained personnel mean that many diagnostic tests must be performed at remote, centralized clinical laboratories. Magnetic levitation is a simple yet powerful technique and can be applied to levitate cells, which are suspended in a paramagnetic solution and placed in a magnetic field, at a position determined by equilibrium between a magnetic force and a buoyancy force. Here, we present a versatile platform technology designed for point-of-care diagnostics which uses magnetic levitation coupled to microscopic imaging and automated analysis to determine the density distribution of a patient's cells as a useful diagnostic indicator. We present two platforms operating on this principle: (i) a smartphone-compatible version of the technology, where the built-in smartphone camera is used to image cells in the magnetic field and a smartphone application processes the images to measure the density distribution of the cells, and (ii) a self-contained version where a camera board is used to capture images and an embedded processing unit with attached thin-film-transistor (TFT) screen measures and displays the results. Demonstrated applications include: (i) measuring the altered distribution of a cell population with a disease phenotype compared to a healthy phenotype, which is applied to sickle cell disease diagnosis, and (ii) separation of different cell types based on their characteristic densities, which is applied to separate white blood cells from red blood cells for white blood cell cytometry. These applications, as well as future extensions of the essential density-based measurements enabled by this portable, user-friendly platform technology, will significantly enhance disease diagnostic capabilities at the point of care.
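
    The levitation position mentioned above follows from a force balance that is standard in the magnetic levitation literature, though it is not written out in the abstract: a cell of density $\rho_c$ and magnetic susceptibility $\chi_c$ suspended in a paramagnetic medium ($\rho_m$, $\chi_m$) settles approximately where

    $$\frac{\chi_c - \chi_m}{\mu_0}\,(\mathbf{B}\cdot\nabla)\mathbf{B} = (\rho_c - \rho_m)\,\mathbf{g},$$

    so, for a fixed magnet geometry and medium, the equilibrium height is a monotonic function of the cell density $\rho_c$. This is what allows the imaged levitation-height distribution to be read out as a density distribution.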

  12. Comparison of Danish dichotomous and BI-RADS classifications of mammographic density.

    PubMed

    Hodge, Rebecca; Hellmann, Sophie Sell; von Euler-Chelpin, My; Vejborg, Ilse; Andersen, Zorana Jovanovic

    2014-06-01

    In the Copenhagen mammography screening program from 1991 to 2001, mammographic density was classified either as fatty or mixed/dense. This dichotomous mammographic density classification system is unique internationally and has not been validated before. The aim was to compare the Danish dichotomous mammographic density classification system used from 1991 to 2001 with the BI-RADS density classification, in an attempt to validate the Danish system. The study sample consisted of 120 mammograms taken in Copenhagen in 1991-2001, which tested false positive and which were re-assessed in 2012 and classified according to the BI-RADS classification system. We calculated inter-rater agreement between the Danish dichotomous classification (fatty or mixed/dense) and the four-level BI-RADS classification using the linearly weighted kappa statistic. Of the 120 women, 32 (26.7%) were classified as having fatty and 88 (73.3%) as having mixed/dense mammographic density according to the Danish dichotomous classification. According to the BI-RADS density classification, 12 (10.0%) women were classified as having predominantly fatty (BI-RADS code 1), 46 (38.3%) as having scattered fibroglandular (BI-RADS code 2), 57 (47.5%) as having heterogeneously dense (BI-RADS code 3), and five (4.2%) as having extremely dense (BI-RADS code 4) mammographic density. The inter-rater agreement assessed by the weighted kappa statistic was substantial (0.75). The dichotomous mammographic density classification system utilized in the early years of Copenhagen's mammographic screening program (1991-2001) agreed well with the BI-RADS density classification system.
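
    The abstract does not state how the two-level Danish scale and the four-level BI-RADS scale were aligned for the linearly weighted kappa, so the sketch below simply assumes one plausible recoding (fatty mapped to BI-RADS 1, mixed/dense mapped to BI-RADS 3) and uses invented ratings; it only illustrates the computation, not the study's data.

    ```python
    from sklearn.metrics import cohen_kappa_score

    # hypothetical per-film ratings (values are illustrative only)
    danish_recode = [1, 1, 3, 3, 3, 1, 3, 3, 3, 1]   # fatty -> 1, mixed/dense -> 3 (assumed mapping)
    birads        = [1, 2, 3, 3, 4, 1, 2, 3, 3, 2]   # BI-RADS density codes 1-4

    # linear weights penalise disagreements in proportion to their distance on the ordinal scale
    kappa = cohen_kappa_score(danish_recode, birads, labels=[1, 2, 3, 4], weights="linear")
    print(f"linearly weighted kappa = {kappa:.2f}")
    ```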

  13. Design and implementation of a channel decoder with LDPC code

    NASA Astrophysics Data System (ADS)

    Hu, Diqing; Wang, Peng; Wang, Jianzong; Li, Tianquan

    2008-12-01

    Because Toshiba withdrew from the competition, there is only one high-density optical disc standard, Blu-ray Disc (BD), which satisfies the demands of high-density video programs. However, almost all of the associated patents are held by large companies such as Sony and Philips, so substantial licensing fees must be paid when products use BD. Next-Generation Versatile Disc (NVD), our own high-density optical disc storage system, proposes a new data format and error correction code with independent intellectual property rights and high cost performance; it offers higher coding efficiency than DVD and a 12 GB capacity, which can meet the demands of playing high-density video programs. In this paper, we develop a low-density parity-check (LDPC) channel code: a new channel encoding process and application scheme using a Q-matrix based on LDPC encoding is applied in NVD's channel decoder. Combined with the portable, embedded-system features of the SOPC platform, we have implemented all the decoding modules in an FPGA. Tests were performed in the NVD experimental environment. Although there are conflicts between LDPC and the run-length-limited (RLL) modulation codes frequently used in optical storage systems, the system is provided as a suitable solution. At the same time, it overcomes the instability and inextensibility of NVD's former decoding system, which was implemented purely in hardware.

  14. Accumulate Repeat Accumulate Coded Modulation

    NASA Technical Reports Server (NTRS)

    Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung

    2004-01-01

    In this paper we propose an innovative coded modulation scheme called 'Accumulate Repeat Accumulate Coded Modulation' (ARA coded modulation). This class of codes can be viewed as serial turbo-like codes, or as a subclass of Low Density Parity Check (LDPC) codes, combined with high-level modulation. Thus, at the decoder, belief propagation can be used for iterative decoding of ARA coded modulation on a graph, provided a demapper transforms the received in-phase and quadrature samples into bit reliabilities.

  15. Optimising Use of Electronic Health Records to Describe the Presentation of Rheumatoid Arthritis in Primary Care: A Strategy for Developing Code Lists

    PubMed Central

    Nicholson, Amanda; Ford, Elizabeth; Davies, Kevin A.; Smith, Helen E.; Rait, Greta; Tate, A. Rosemary; Petersen, Irene; Cassell, Jackie

    2013-01-01

    Background: Research using electronic health records (EHRs) relies heavily on coded clinical data. Due to variation in coding practices, it can be difficult to aggregate the codes for a condition in order to define cases. This paper describes a methodology to develop ‘indicator markers’ found in patients with early rheumatoid arthritis (RA); these are a broader range of codes which may allow a probabilistic case definition to use in cases where no diagnostic code is yet recorded. Methods: We examined EHRs of 5,843 patients in the General Practice Research Database, aged ≥30y, with a first coded diagnosis of RA between 2005 and 2008. Lists of indicator markers for RA were developed initially by panels of clinicians drawing up code-lists and then modified based on scrutiny of available data. The prevalence of indicator markers, and their temporal relationship to RA codes, was examined in patients from 3y before to 14d after recorded RA diagnosis. Findings: Indicator markers were common throughout the EHRs of RA patients, with 83.5% having 2 or more markers. 34% of patients received a disease-specific prescription before RA was coded; 42% had a referral to rheumatology, and 63% had a test for rheumatoid factor. 65% had at least one joint symptom or sign recorded, and in 44% this was at least 6 months before the recorded RA diagnosis. Conclusion: Indicator markers of RA may be valuable for case definition in cases which do not yet have a diagnostic code. The clinical diagnosis of RA is likely to occur some months before it is coded, shown by markers frequently occurring ≥6 months before recorded diagnosis. It is difficult to differentiate delay in diagnosis from delay in recording. Information concealed in free text may be required for the accurate identification of patients and to assess the quality of care in general practice. PMID:23451024

  16. Cosmology in one dimension: Vlasov dynamics.

    PubMed

    Manfredi, Giovanni; Rouet, Jean-Louis; Miller, Bruce; Shiozawa, Yui

    2016-04-01

    Numerical simulations of self-gravitating systems are generally based on N-body codes, which solve the equations of motion of a large number of interacting particles. This approach suffers from poor statistical sampling in regions of low density. In contrast, Vlasov codes, by meshing the entire phase space, can reach higher accuracy irrespective of the density. Here, we perform one-dimensional Vlasov simulations of a long-standing cosmological problem, namely, the fractal properties of an expanding Einstein-de Sitter universe in Newtonian gravity. The N-body results are confirmed for high-density regions and extended to regions of low matter density, where the N-body approach usually fails.
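
    To make concrete what "meshing the entire phase space" means in practice, here is a minimal one-dimensional electrostatic Vlasov-Poisson sketch using a standard split semi-Lagrangian update on a fixed (x, v) grid. It is only a generic illustration, not the cosmological code used in the paper (which evolves perturbations on an expanding Einstein-de Sitter background); the grid sizes, the initial perturbation, and the sign convention noted in the comments are assumptions.

    ```python
    import numpy as np

    # phase-space grid (sizes and parameters are illustrative)
    nx, nv = 64, 128
    L, vmax = 4 * np.pi, 6.0
    x = np.linspace(0, L, nx, endpoint=False)
    v = np.linspace(-vmax, vmax, nv)
    dx, dv = x[1] - x[0], v[1] - v[0]
    dt, nsteps = 0.1, 200
    k = 2 * np.pi * np.fft.fftfreq(nx, d=dx)

    # weakly perturbed Maxwellian initial condition
    f = (1 + 0.01 * np.cos(0.5 * x))[:, None] * np.exp(-v**2 / 2)[None, :] / np.sqrt(2 * np.pi)

    def advect_x(f, dt):
        """Shift f along x by v*dt for every velocity column (periodic in x)."""
        out = np.empty_like(f)
        for j in range(nv):
            out[:, j] = np.interp(x - v[j] * dt, x, f[:, j], period=L)
        return out

    def efield(f):
        """Solve dE/dx = rho in Fourier space, with rho = integral(f dv) - 1 (uniform background).
        Convention: the Vlasov equation is written with +E df/dv, absorbing the charge sign into E."""
        rho = f.sum(axis=1) * dv - 1.0
        rho_k = np.fft.fft(rho)
        E_k = np.zeros_like(rho_k)
        E_k[1:] = rho_k[1:] / (1j * k[1:])
        return np.fft.ifft(E_k).real

    def advect_v(f, E, dt):
        """Shift f along v by E*dt for every spatial row (f -> 0 off the velocity grid)."""
        out = np.empty_like(f)
        for i in range(nx):
            out[i, :] = np.interp(v - E[i] * dt, v, f[i, :], left=0.0, right=0.0)
        return out

    # Strang splitting: half x-advection, full v-advection, half x-advection
    for _ in range(nsteps):
        f = advect_x(f, dt / 2)
        f = advect_v(f, efield(f), dt)
        f = advect_x(f, dt / 2)

    print("total mass (should stay close to the box length L):", f.sum() * dx * dv)
    ```

    Because the distribution function is stored on the grid everywhere, it remains equally well resolved in low-density regions, which is the advantage over N-body sampling highlighted in the abstract.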

  17. MODFLOW-2000, the U.S. Geological Survey Modular Ground-Water Model--Documentation of the SEAWAT-2000 Version with the Variable-Density Flow Process (VDF) and the Integrated MT3DMS Transport Process (IMT)

    USGS Publications Warehouse

    Langevin, Christian D.; Shoemaker, W. Barclay; Guo, Weixing

    2003-01-01

    SEAWAT-2000 is the latest release of the SEAWAT computer program for simulation of three-dimensional, variable-density, transient ground-water flow in porous media. SEAWAT-2000 was designed by combining a modified version of MODFLOW-2000 and MT3DMS into a single computer program. The code was developed using the MODFLOW-2000 concept of a process, which is defined as "part of the code that solves a fundamental equation by a specified numerical method." SEAWAT-2000 contains all of the processes distributed with MODFLOW-2000 and also includes the Variable-Density Flow Process (as an alternative to the constant-density Ground-Water Flow Process) and the Integrated MT3DMS Transport Process. Processes may be active or inactive, depending on simulation objectives; however, not all processes are compatible. For example, the Sensitivity and Parameter Estimation Processes are not compatible with the Variable-Density Flow and Integrated MT3DMS Transport Processes. The SEAWAT-2000 computer code was tested with the common variable-density benchmark problems and also with problems representing evaporation from a salt lake and rotation of immiscible fluids.

  18. Modeling Laboratory Astrophysics Experiments in the High-Energy-Density Regime Using the CRASH Radiation-Hydrodynamics Model

    NASA Astrophysics Data System (ADS)

    Grosskopf, M. J.; Drake, R. P.; Trantham, M. R.; Kuranz, C. C.; Keiter, P. A.; Rutter, E. M.; Sweeney, R. M.; Malamud, G.

    2012-10-01

    The radiation hydrodynamics code developed by the Center for Radiative Shock Hydrodynamics (CRASH) at the University of Michigan has been used to model experimental designs for high-energy-density physics campaigns on OMEGA and other high-energy laser facilities. This code is an Eulerian, block-adaptive AMR hydrodynamics code with implicit multigroup radiation transport and electron heat conduction. CRASH model results have shown good agreement with experimental results from a variety of applications, including radiative shock, Kelvin-Helmholtz and Rayleigh-Taylor experiments on the OMEGA laser, as well as laser-driven ablative plumes in experiments by the Astrophysical Collisionless Shocks Experiments with Lasers (ACSEL) collaboration. We report a series of results with the CRASH code in support of design work for upcoming high-energy-density physics experiments, as well as comparisons between existing experimental data and simulation results. This work is funded by the Predictive Sciences Academic Alliances Program in NNSA-ASC via grant DEFC52-08NA28616, by the NNSA-DS and SC-OFES Joint Program in High-Energy-Density Laboratory Plasmas, grant number DE-FG52-09NA29548, and by the National Laser User Facility Program, grant number DE-NA0000850.

  19. Intelligent Data Visualization for Cross-Checking Spacecraft System Diagnosis

    NASA Technical Reports Server (NTRS)

    Ong, James C.; Remolina, Emilio; Breeden, David; Stroozas, Brett A.; Mohammed, John L.

    2012-01-01

    Any reasoning system is fallible, so crew members and flight controllers must be able to cross-check automated diagnoses of spacecraft or habitat problems by considering alternate diagnoses and analyzing related evidence. Cross-checking improves diagnostic accuracy because people can apply information processing heuristics, pattern recognition techniques, and reasoning methods that the automated diagnostic system may not possess. Over time, cross-checking also enables crew members to become comfortable with how the diagnostic reasoning system performs, so the system can earn the crew's trust. We developed intelligent data visualization software that helps users cross-check automated diagnoses of system faults more effectively. The user interface displays scrollable arrays of timelines and time-series graphs, which are tightly integrated with an interactive, color-coded system schematic to show important spatial-temporal data patterns. Signal processing and rule-based diagnostic reasoning automatically identify alternate hypotheses and data patterns that support or rebut the original and alternate diagnoses. A color-coded matrix display summarizes the supporting or rebutting evidence for each diagnosis, and a drill-down capability enables crew members to quickly view graphs and timelines of the underlying data. This system demonstrates that modest amounts of diagnostic reasoning, combined with interactive, information-dense data visualizations, can accelerate system diagnosis and cross-checking.

  20. Accounting for overdispersion when determining primary care outliers for the identification of chronic kidney disease: learning from the National Chronic Kidney Disease Audit.

    PubMed

    Kim, Lois G; Caplin, Ben; Cleary, Faye; Hull, Sally A; Griffith, Kathryn; Wheeler, David C; Nitsch, Dorothea

    2017-04-01

    Early diagnosis of chronic kidney disease (CKD) facilitates best management in primary care. Testing coverage of those at risk and translation into subsequent diagnostic coding will impact on observed CKD prevalence. Using initial data from 915 general practitioner (GP) practices taking part in a UK national audit, we seek to apply appropriate methods to identify outlying practices in terms of CKD stages 3-5 prevalence and diagnostic coding. We estimate expected numbers of CKD stages 3-5 cases in each practice, adjusted for key practice characteristics, and further inflate the control limits to account for overdispersion related to unobserved factors (including unobserved risk factors for CKD, and between-practice differences in coding and testing). GP practice prevalence of coded CKD stages 3-5 ranges from 0.04 to 7.8%. Practices differ considerably in coding of CKD in individuals where CKD is indicated following testing (ranging from 0 to 97% of those with a glomerular filtration rate <60 mL/min/1.73 m²). After adjusting for risk factors and overdispersion, the number of 'extreme' practices is reduced from 29 to 2.6% for the low coded CKD prevalence outcome, from 21 to 1% for high uncoded CKD, and from 22 to 2.4% for low total (coded and uncoded) CKD prevalence. Thirty-one practices are identified as outliers for at least one of these outcomes. These can then be categorized into practices needing to address testing, coding or data storage/transfer issues. GP practice prevalence of coded CKD shows wide variation. Accounting for overdispersion is crucial in providing useful information about outlying practices for CKD prevalence. © The Author 2017. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
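
    One common way to implement the approach described above is a funnel-plot style comparison with a multiplicative overdispersion factor estimated from winsorised z-scores. The sketch below is only a schematic of that generic method with synthetic data; the audit's actual model adjustment, winsorisation limits, and control-limit levels may differ.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_practices = 915
    expected = rng.uniform(20, 400, n_practices)   # risk-factor-adjusted expected coded-CKD counts (synthetic)
    observed = rng.poisson(expected * rng.gamma(20, 1 / 20, n_practices))  # extra-Poisson variation

    # Poisson z-scores, winsorised so extreme practices do not dominate the dispersion estimate
    z = (observed - expected) / np.sqrt(expected)
    lo, hi = np.percentile(z, [10, 90])
    z_w = np.clip(z, lo, hi)

    # multiplicative overdispersion factor; values > 1 widen the control limits
    phi = max(1.0, np.mean(z_w**2))

    # approximately 99.8% (3-sigma) control limits, inflated by sqrt(phi)
    z_crit = 3.09
    lower = expected - z_crit * np.sqrt(phi * expected)
    upper = expected + z_crit * np.sqrt(phi * expected)
    outliers = (observed < lower) | (observed > upper)
    print(f"overdispersion factor = {phi:.2f}, flagged practices = {outliers.sum()}")
    ```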

  1. Accumulate-Repeat-Accumulate-Accumulate-Codes

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush; Dolinar, Sam; Thorpe, Jeremy

    2004-01-01

    Inspired by recently proposed Accumulate-Repeat-Accumulate (ARA) codes [15], in this paper we propose a channel coding scheme called Accumulate-Repeat-Accumulate-Accumulate (ARAA) codes. These codes can be seen as serial turbo-like codes or as a subclass of Low Density Parity Check (LDPC) codes, and they have a projected graph or protograph representation; this allows for a high-speed iterative decoder implementation using belief propagation. An ARAA code can be viewed as a precoded Repeat-and-Accumulate (RA) code with puncturing in concatenation with another accumulator, where simply an accumulator is chosen as the precoder; thus ARAA codes have a very fast encoder structure. Using density evolution on their associated protographs, we find examples of rate-1/2 ARAA codes with maximum variable node degree 4 for which a minimum bit-SNR as low as 0.21 dB from the channel capacity limit can be achieved as the block size goes to infinity. Such a low threshold cannot be achieved by RA or Irregular RA (IRA) or unstructured irregular LDPC codes with the same constraint on the maximum variable node degree. Furthermore, by puncturing the accumulators we can construct families of higher rate ARAA codes with thresholds that stay uniformly close to their respective channel capacity thresholds. Iterative decoding simulation results show comparable performance with the best-known LDPC codes but with very low error floor even at moderate block sizes.

  2. 40 CFR 86.099-17 - Emission control diagnostic system for 1999 and later light-duty vehicles and light-duty trucks.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... of computer codes. The emission control diagnostic system shall record and store in computer memory..., shall be stored in computer memory to identify correctly functioning emission control systems and those... in computer memory. Should a subsequent fuel system or misfire malfunction occur, any previously...

  3. 40 CFR 86.099-17 - Emission control diagnostic system for 1999 and later light-duty vehicles and light-duty trucks.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... of computer codes. The emission control diagnostic system shall record and store in computer memory..., shall be stored in computer memory to identify correctly functioning emission control systems and those... in computer memory. Should a subsequent fuel system or misfire malfunction occur, any previously...

  4. Modeling Laboratory Astrophysics Experiments using the CRASH code

    NASA Astrophysics Data System (ADS)

    Trantham, Matthew; Drake, R. P.; Grosskopf, Michael; Bauerle, Matthew; Kuranz, Carolyn; Keiter, Paul; Malamud, Guy; Crash Team

    2013-10-01

    The understanding of high energy density systems can be advanced by laboratory astrophysics experiments. Computer simulations can assist in the design and analysis of these experiments. The Center for Radiative Shock Hydrodynamics (CRASH) at the University of Michigan developed a code that has been used to design and analyze high-energy-density experiments on OMEGA, NIF, and other large laser facilities. This Eulerian code uses block-adaptive mesh refinement (AMR) with implicit multigroup radiation transport and electron heat conduction. This poster/talk will demonstrate some of the experiments the CRASH code has helped design or analyze, including radiative shock experiments, Kelvin-Helmholtz experiments, Rayleigh-Taylor experiments, plasma sheet experiments, and interacting jet experiments. This work is funded by the Predictive Sciences Academic Alliances Program in NNSA-ASC via grant DEFC52-08NA28616, by the NNSA-DS and SC-OFES Joint Program in High-Energy-Density Laboratory Plasmas, grant number DE-FG52-09NA29548, and by the National Laser User Facility Program, grant number DE-NA0000850.

  5. Speech processing using conditional observable maximum likelihood continuity mapping

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogden, John; Nix, David

    A computer implemented method enables the recognition of speech and speech characteristics. Parameters are initialized of first probability density functions that map between the symbols in the vocabulary of one or more sequences of speech codes that represent speech sounds and a continuity map. Parameters are also initialized of second probability density functions that map between the elements in the vocabulary of one or more desired sequences of speech transcription symbols and the continuity map. The parameters of the probability density functions are then trained to maximize the probabilities of the desired sequences of speech-transcription symbols. A new sequence of speech codes is then input to the continuity map having the trained first and second probability function parameters. A smooth path is identified on the continuity map that has the maximum probability for the new sequence of speech codes. The probability of each speech transcription symbol for each input speech code can then be output.

  6. Numerical experiment to estimate the validity of negative ion diagnostic using photo-detachment combined with Langmuir probing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oudini, N.; Sirse, N.; Ellingboe, A. R.

    2015-07-15

    This paper presents a critical assessment of the theory of the photo-detachment diagnostic method used to probe the negative ion density and electronegativity α = n₋/nₑ. In this method, a laser pulse is used to photo-detach all negative ions located within the electropositive channel (laser spot region). The negative ion density is estimated based on the assumption that the increase of the current collected by an electrostatic probe biased positively to the plasma is a result of only the creation of photo-detached electrons. In parallel, the background electron density and temperature are assumed to remain constant during the measurement. However, the numerical experiments performed here show that the background electron density and temperature increase due to the formation of an electrostatic potential barrier around the electropositive channel. The time scale of the potential barrier rise is about 2 ns, which is comparable to the time required to completely photo-detach the negative ions in the electropositive channel (∼3 ns). We find that neglecting the effect of the potential barrier on the background plasma leads to an erroneous determination of the negative ion density. Moreover, the background electron velocity distribution function within the electropositive channel is not Maxwellian. This is due to the acceleration of these electrons through the electrostatic potential barrier. In this work, the validity of the photo-detachment diagnostic assumptions is questioned and our results illustrate the weakness of these assumptions.
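
    For reference, the conventional analysis that the paper is testing extracts the electronegativity from the probe signal as follows: if the laser photo-detaches essentially all negative ions in the probed channel and the background electrons are unperturbed, the prompt rise $\Delta I_e$ of the electron current collected by the positively biased probe, relative to its pre-pulse value $I_e$, gives

    $$\alpha = \frac{n_-}{n_e} \approx \frac{\Delta I_e}{I_e}.$$

    The numerical experiments summarized above indicate that the potential barrier forming around the electropositive channel perturbs the background electron density and temperature on the same few-nanosecond timescale, which is why this simple ratio can misestimate the negative ion density.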

  7. Laser-Based Diagnostics for Transient Species in Hydrocarbon Flames

    DTIC Science & Technology

    1989-12-01

    ... chemical mechanism is to apply species-specific diagnostic methods directly to the combustion system of interest. In the past, most optical diagnostic ... of the two important radicals HCO and C2H by LIF or absorption methods is several years in the future and will require further basic studies.

  8. Spheromak reactor-design study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Les, J.M.

    1981-06-30

    A general overview of spheromak reactor characteristics, such as MHD stability, start-up, and plasma geometry, is presented. In addition, comparisons are made between spheromaks, tokamaks and field-reversed mirrors. The computer code Sphero is also discussed. Sphero is a zero-dimensional, time-independent transport code that uses particle confinement times and profile parameters as input, since they are not known with certainty at the present time. More specifically, Sphero numerically solves a given set of transport equations whose solutions include such variables as fuel ion (deuterium and tritium) density, electron density, alpha particle density, and ion and electron temperatures.

  9. Polarization-correlation investigation of biotissue multifractal structure and diagnostics of its pathological change

    NASA Astrophysics Data System (ADS)

    Angelsky, Oleg V.; Pishak, Vasyl P.; Ushenko, Alexander G.; Burkovets, Dimitry N.; Pishak, Olga V.

    2001-05-01

    The paper presents the results of a polarization-correlation investigation of the multifractal collagen structure of physiologically normal and pathologically changed tissues of the female reproductive system and of skin. A technique of polarization selection of coherent biotissue images, followed by determination of their autocorrelation functions and spectral densities, is suggested. Correlation-optical criteria for the early diagnosis of emerging pathological changes in the myometrium (formation of the germ of a fibromyoma) and in skin (psoriasis) are determined. The paper also examines the possibility of diagnosing pathological changes in the morphological structure of biotissues by determining polarizationally filtered autocorrelation functions (ACFs) and the corresponding spectral densities of their coherent images.

  10. High density harp or wire scanner for particle beam diagnostics

    DOEpatents

    Fritsche, Craig T.; Krogh, Michael L.

    1996-05-21

    A diagnostic detector head harp (23) used to detect and characterize high energy particle beams using an array of closely spaced detector wires (21), typically carbon wires, spaced less than 0.1 cm (0.040 inch) apart and connected to a hybrid microcircuit (25) formed on a ceramic substrate (26). Also described is a method, utilizing hybrid microcircuit technology, to fabricate harps (23) with carbon wire spacing and density not previously available. The hybrid microcircuit (25) disposed on the ceramic substrate (26) connects electrically between the detector wires (21) and diagnostic equipment (37) which analyzes pulses generated in the detector wires (21) by the high energy particle beams.

  11. Real-time interferometric diagnostics of rubidium plasma

    NASA Astrophysics Data System (ADS)

    Djotyan, G. P.; Bakos, J. S.; Kedves, M. Á.; Ráczkevi, B.; Dzsotjan, D.; Varga-Umbrich, K.; Sörlei, Zs.; Szigeti, J.; Ignácz, P.; Lévai, P.; Czitrovszky, A.; Nagy, A.; Dombi, P.; Rácz, P.

    2018-03-01

    A method of interferometric real-time diagnostics is developed and applied to rubidium plasma created by strong laser pulses in the femtosecond duration range at different initial rubidium vapor densities using a Michelson-type interferometer. A cosine fit with an exponentially decaying relative phase is applied to the obtained time-dependent interferometry signals to measure the density-length product of the created plasma and its recombination time constant. The presented technique may be applicable for real-time measurements of rubidium plasma dynamics in the AWAKE experiment at CERN, as well as for real-time diagnostics of plasmas created in different gaseous media and on surfaces of solid targets.
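
    The fitting model named in the abstract (a cosine whose relative phase decays exponentially) can be applied with a generic least-squares routine. The sketch below fits synthetic data to S(t) = A cos(phi0 * exp(-t/tau)) + C; the functional form follows the abstract's description, while the parameter values, time axis, and noise level are hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def model(t, amp, phi0, tau, offset):
        """Cosine with an exponentially decaying relative phase."""
        return amp * np.cos(phi0 * np.exp(-t / tau)) + offset

    # synthetic interferometer signal (all parameters are illustrative)
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 200e-9, 2000)                 # 200 ns record
    signal = model(t, 1.0, 12.0, 40e-9, 0.5) + 0.05 * rng.normal(size=t.size)

    popt, _ = curve_fit(model, t, signal, p0=[0.8, 10.0, 50e-9, 0.4])
    amp, phi0, tau, offset = popt
    print(f"fitted initial phase = {phi0:.2f} rad (related to the density-length product)")
    print(f"fitted decay time constant = {tau * 1e9:.1f} ns (recombination time)")
    ```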

  12. Postdeployment Hospitalizations among Service Members Deployed in Support of the Operations in Iraq and Afghanistan

    DTIC Science & Technology

    2009-09-01

    [Table excerpt: hospitalizations by diagnostic code category, n (%)] ... of the lung 329 (17.5); 493 Asthma 297 (15.8); 486 Pneumonia, organism unspecified 209 (11.1). Digestive system diseases (codes 520–579): 540 Acute ... fallopian tube, pelvic cellular tissue, and peritoneum 122 (6.2). Skin diseases (codes 680–709): 682 Other cellulitis and abscess 559 (53.5); 685 ... true measure of morbidity for categories such as mental health (31, 32). Further, the use of diagnostic coding for conditions such as cancer with a ...

  13. Radiation effects in IFMIF Li target diagnostic systems

    NASA Astrophysics Data System (ADS)

    Molla, J.; Vila, R.; Shikama, T.; Horiike, H.; Simakov, S.; Ciotti, M.; Ibarra, A.

    2009-04-01

    Diagnostics for the lithium target will be crucial for the operation of IFMIF. Several parameters, such as the lithium temperature, target thickness or wave pattern, must be monitored during operation. Radiation effects may produce malfunctioning in any of these diagnostics due to the exposure to high radiation fields. The main diagnostic systems proposed for the operation of IFMIF are reviewed in this paper from the point of view of radiation damage. The main tools for assessing the performance of these diagnostics are neutronics calculations using specialised codes and the information accumulated over recent decades on radiation effects in functional materials, components and diagnostics for ITER. This analysis leads to the conclusion that the design of some of the diagnostic systems must be revised to assure the high availability required for the target system.

  14. Diagnostics and results from coaxial plasma gun development for the PLX-α project

    NASA Astrophysics Data System (ADS)

    Case, A.; Brockington, S.; Cruz, E.; Witherspoon, F. D.

    2016-10-01

    We present results from the diagnostics used during development of the contoured gap coaxial plasma guns for the PLX-α project at LANL. Plasma-jet diagnostics include fast photodiodes for velocimetry, a ballistic pendulum for total plasmoid momentum, and interferometry for line integrated density. Deflectometry will be used for line integrated perpendicular density gradients. Time-resolved high-resolution spectroscopy using a novel detector and time-integrated survey spectroscopy are used for measurements of velocity and temperature, as well as impurities. We will also use a Faraday cup for density, fast imaging for plume geometry, and time-integrated imaging for overall light emission. Experimental results are compared to the desired target parameters for the plasma jets (up to n ≈ 2×10¹⁶ cm⁻³, v ≈ 50 km/s, mass ≈ 5 gm, radius = 4 cm, and length ≈ 10 cm). This work is supported by the ARPA-E ALPHA Program.

  15. A temporally and spatially resolved electron density diagnostic method for the edge plasma based on Stark broadening

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zafar, A., E-mail: zafara@ornl.gov; Oak Ridge National Laboratory, Oak Ridge, Tennessee 37830; Martin, E. H.

    2016-11-15

    An electron density diagnostic (≥10¹⁰ cm⁻³) capable of high temporal (ms) and spatial (mm) resolution is currently under development at Oak Ridge National Laboratory. The diagnostic is based on measuring the Stark broadened, Doppler-free spectral line profile of the n = 6–2 hydrogen Balmer series transition. The profile is then fit to a fully quantum mechanical model including the appropriate electric and magnetic field operators. The quasi-static approach used to calculate the Doppler-free spectral line profile is outlined here and the results from the model are presented for H-δ spectra for electron densities of 10¹⁰–10¹³ cm⁻³. The profile shows complex behavior due to the interaction between the magnetic substates of the atom.
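
    For context, and not as part of the paper's fully quantum mechanical treatment, the widely used quasi-static estimate for hydrogen Balmer lines relates the full width at half maximum of the Stark-broadened profile to the electron density through an approximately $n_e^{2/3}$ scaling, e.g. in Griem's formulation

    $$\Delta\lambda_{1/2} \approx 2.5\times 10^{-9}\,\alpha_{1/2}(n_e, T_e)\, n_e^{2/3} \quad [\Delta\lambda_{1/2}\ \text{in Å},\ n_e\ \text{in cm}^{-3}],$$

    where $\alpha_{1/2}$ is a tabulated reduced width. A measured width can therefore be inverted for $n_e$; the diagnostic described here instead fits the Doppler-free profile to a quantum mechanical line-shape model that captures the magnetic-substate structure at these low densities.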

  16. Status of Real-Time Laser Based Ion Engine Diagnostics at NASA Glenn Research Center

    NASA Technical Reports Server (NTRS)

    Domonkos, Matthew T.; Williams, George J., Jr.

    2001-01-01

    The development status of laser based erosion diagnostics for ion engines at the NASA Glenn Research Center is discussed. The diagnostics are being developed to enhance component life-prediction capabilities. A direct measurement of the erosion product density using laser induced fluorescence (LIF) is described. Erosion diagnostics based upon evaluation of the ion dynamics are also under development, and the basic approach is presented. The planned implementation of the diagnostics is discussed.

  17. Advanced Concept Exploration for Fast Ignition Science Program, Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stephens, Richard Burnite; McLean, Harry M.; Theobald, Wolfgang

    The Fast Ignition (FI) Concept for Inertial Confinement Fusion (ICF) has the potential to provide a significant advance in the technical attractiveness of Inertial Fusion Energy reactors. FI differs from conventional “central hot spot” (CHS) target ignition by decoupling compression from heating: using a laser (or heavy ion beam or Z pinch) drive pulse (tens of nanoseconds) to create a dense fuel and a second, much shorter (~10 picoseconds) high intensity pulse to ignite a small volume within the dense fuel. The physics of the fast ignition process was the focus of our Advanced Concept Exploration (ACE) program. Ignition depends critically on two major issues involving Relativistic High Energy Density (RHED) physics: the laser-induced creation of fast electrons and their propagation in high-density plasmas. Our program has developed new experimental platforms, diagnostic packages, and computer modeling analyses, and taken advantage of the increasing energy available at laser facilities to advance understanding of the fundamental physics underlying these issues. Our program had three thrust areas:
    • Understand the production and characteristics of fast electrons resulting from FI relevant laser-plasma interactions and their dependence on laser prepulse and laser pulse length.
    • Investigate the subsequent fast electron transport in solid and through hot (FI-relevant) plasmas.
    • Conduct and understand integrated core-heating experiments by comparison to simulations.
    Over the whole period of this project (three years for this contract), we have greatly advanced our fundamental understanding of the underlying properties in all three areas:
    • Comprehensive studies on fast electron source characteristics have shown that they are controlled by the laser intensity distribution and the topology and plasma density gradient. Laser pre-pulse induced pre-plasma in front of a solid surface results in increased stand-off distances from the electron origin to the high density target, as well as large and erratic spread of the electron beam with increasing short pulse duration. We have demonstrated, using newly available higher contrast lasers, an improved energy coupling, painting a promising picture for FI feasibility.
    • Our detailed experiments and analyses of the dependence of fast electron transport on target material have shown that it is feasible to collimate the fast electron beam by self-generated resistive magnetic fields in engineered targets with a rather simple geometry. A stable and collimated electron beam with a spot size as small as 50 μm after >100 μm propagation distance (an angular divergence of 20°) in solid density plasma targets has been demonstrated with FI-relevant (10-ps, >1-kJ) laser pulses. Such a collimated beam would meet the required heating beam size for FI.
    • Our new experimental platforms developed for the OMEGA laser (i.e., (i) a high resolution 8 keV backlighter platform for cone-in-shell implosions and (ii) 8 keV imaging with Cu-doped shell targets for detailed transport characterization) have enabled us to experimentally confirm fuel assembly from cone-in-shell implosions with record-high areal density. We have also made the first direct measurement of fast electron transport and spatial energy deposition in integrated FI experiments, enabling the first experiment-based benchmarking of integrated simulation codes.
    Executing this program required a large team. It was managed as a collaboration between General Atomics (GA), Lawrence Livermore National Laboratory (LLNL), and the Laboratory for Laser Energetics (LLE). GA fulfills its responsibilities jointly with the University of California, San Diego (UCSD), The Ohio State University (OSU) and the University of Nevada at Reno (UNR). The division of responsibility was as follows: (1) LLE had primary leadership for channeling studies and the integrated energy transfer, (2) LLNL led the development of measurement methods, analysis, and deployment of diagnostics, and (3) GA together with UCSD, OSU and UNR studied the detailed energy-transfer physics. The experimental program was carried out using the Titan laser at the Jupiter Laser Facility at LLNL, the OMEGA and OMEGA EP lasers at LLE, and the Texas Petawatt laser at the University of Texas, Austin. Modeling has been pursued on large computing facilities at LLNL, OSU, and UCSD using codes developed (by us and others) within the HEDLP program, commercial codes, and by leveraging existing simulation codes developed by the National Nuclear Security Administration ICF program. One important aspect of this program was the involvement and training of young scientists, including postdoctoral fellows and graduate students. This project generated an impressive forty articles in high quality journals, including nine (two under review) in Physical Review Letters, during the three years of this grant, and five graduate students completed their doctoral dissertations.

  18. A Cross-Sectional Prevalence Study of Ethnically Targeted and General Audience Outdoor Obesity-Related Advertising

    PubMed Central

    Yancey, Antronette K; Cole, Brian L; Brown, Rochelle; Williams, Jerome D; Hillier, Amy; Kline, Randolph S; Ashe, Marice; Grier, Sonya A; Backman, Desiree; McCarthy, William J

    2009-01-01

    Context: Commercial marketing is a critical but understudied element of the sociocultural environment influencing Americans' food and beverage preferences and purchases. This marketing also likely influences the utilization of goods and services related to physical activity and sedentary behavior. A growing literature documents the targeting of racial/ethnic and income groups in commercial advertisements in magazines, on billboards, and on television that may contribute to sociodemographic disparities in obesity and chronic disease risk and protective behaviors. This article examines whether African Americans, Latinos, and people living in low-income neighborhoods are disproportionately exposed to advertisements for high-calorie, low nutrient–dense foods and beverages and for sedentary entertainment and transportation and are relatively underexposed to advertising for nutritious foods and beverages and goods and services promoting physical activities. Methods: Outdoor advertising density and content were compared in zip code areas selected to offer contrasts by area income and ethnicity in four cities: Los Angeles, Austin, New York City, and Philadelphia. Findings: Large variations were observed in the amount, type, and value of advertising in the selected zip code areas. Living in an upper-income neighborhood, regardless of its residents' predominant ethnicity, is generally protective against exposure to most types of obesity-promoting outdoor advertising (food, fast food, sugary beverages, sedentary entertainment, and transportation). The density of advertising varied by zip code area race/ethnicity, with African American zip code areas having the highest advertising densities, Latino zip code areas having slightly lower densities, and white zip code areas having the lowest densities. Conclusions: The potential health and economic implications of differential exposure to obesity-related advertising are substantial. Although substantive legal questions remain about the government's ability to regulate advertising, the success of limiting tobacco advertising offers lessons for reducing the marketing contribution to the obesigenicity of urban environments. PMID:19298419

  19. Recommendations for the standardization and interpretation of the electrocardiogram: part II: Electrocardiography diagnostic statement list: a scientific statement from the American Heart Association Electrocardiography and Arrhythmias Committee, Council on Clinical Cardiology; the American College of Cardiology Foundation; and the Heart Rhythm Society: endorsed by the International Society for Computerized Electrocardiology.

    PubMed

    Mason, Jay W; Hancock, E William; Gettes, Leonard S; Bailey, James J; Childers, Rory; Deal, Barbara J; Josephson, Mark; Kligfield, Paul; Kors, Jan A; Macfarlane, Peter; Pahlm, Olle; Mirvis, David M; Okin, Peter; Rautaharju, Pentti; Surawicz, Borys; van Herpen, Gerard; Wagner, Galen S; Wellens, Hein

    2007-03-13

    This statement provides a concise list of diagnostic terms for ECG interpretation that can be shared by students, teachers, and readers of electrocardiography. This effort was motivated by the existence of multiple automated diagnostic code sets containing imprecise and overlapping terms. An intended outcome of this statement list is greater uniformity of ECG diagnosis and a resultant improvement in patient care. The lexicon includes primary diagnostic statements, secondary diagnostic statements, modifiers, and statements for the comparison of ECGs. This diagnostic lexicon should be reviewed and updated periodically.

  20. Recommendations for the standardization and interpretation of the electrocardiogram: part II: electrocardiography diagnostic statement list a scientific statement from the American Heart Association Electrocardiography and Arrhythmias Committee, Council on Clinical Cardiology; the American College of Cardiology Foundation; and the Heart Rhythm Society Endorsed by the International Society for Computerized Electrocardiology.

    PubMed

    Mason, Jay W; Hancock, E William; Gettes, Leonard S; Bailey, James J; Childers, Rory; Deal, Barbara J; Josephson, Mark; Kligfield, Paul; Kors, Jan A; Macfarlane, Peter; Pahlm, Olle; Mirvis, David M; Okin, Peter; Rautaharju, Pentti; Surawicz, Borys; van Herpen, Gerard; Wagner, Galen S; Wellens, Hein

    2007-03-13

    This statement provides a concise list of diagnostic terms for ECG interpretation that can be shared by students, teachers, and readers of electrocardiography. This effort was motivated by the existence of multiple automated diagnostic code sets containing imprecise and overlapping terms. An intended outcome of this statement list is greater uniformity of ECG diagnosis and a resultant improvement in patient care. The lexicon includes primary diagnostic statements, secondary diagnostic statements, modifiers, and statements for the comparison of ECGs. This diagnostic lexicon should be reviewed and updated periodically.

  1. The AGORA High-resolution Galaxy Simulations Comparison Project II: Isolated disk test

    DOE PAGES

    Kim, Ji-hoon; Agertz, Oscar; Teyssier, Romain; ...

    2016-12-20

    Using an isolated Milky Way-mass galaxy simulation, we compare results from 9 state-of-the-art gravito-hydrodynamics codes widely used in the numerical community. We utilize the infrastructure we have built for the AGORA High-resolution Galaxy Simulations Comparison Project. This includes the common disk initial conditions, common physics models (e.g., radiative cooling and UV background by the standardized package Grackle) and common analysis toolkit yt, all of which are publicly available. Subgrid physics models such as Jeans pressure floor, star formation, supernova feedback energy, and metal production are carefully constrained across code platforms. With numerical accuracy that resolves the disk scale height, we find that the codes overall agree well with one another in many dimensions including: gas and stellar surface densities, rotation curves, velocity dispersions, density and temperature distribution functions, disk vertical heights, stellar clumps, star formation rates, and Kennicutt-Schmidt relations. Quantities such as velocity dispersions are very robust (agreement within a few tens of percent at all radii) while measures like newly-formed stellar clump mass functions show more significant variation (difference by up to a factor of ~3). Systematic differences exist, for example, between mesh-based and particle-based codes in the low density region, and between more diffusive and less diffusive schemes in the high density tail of the density distribution. Yet intrinsic code differences are generally small compared to the variations in numerical implementations of the common subgrid physics such as supernova feedback. Lastly, our experiment reassures that, if adequately designed in accordance with our proposed common parameters, results of a modern high-resolution galaxy formation simulation are more sensitive to input physics than to intrinsic differences in numerical schemes.
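
    Since the comparison relies on the publicly available yt toolkit for its common analysis, a minimal example of the kind of measurement listed above (a face-on gas surface density map plus a simple bulk quantity) could look like the sketch below. The snapshot path, plot width, and field names are assumptions (field aliases vary between yt versions and code frontends), and the project's actual analysis scripts are considerably more involved.

    ```python
    import yt

    # load a snapshot from one of the participating codes (path is hypothetical);
    # yt abstracts over the different on-disk formats used by the nine codes
    ds = yt.load("isolated_disk_run/snapshot_0500")

    # face-on projected gas density, one of the quantities compared across codes
    proj = yt.ProjectionPlot(ds, "z", ("gas", "density"), width=(30, "kpc"))
    proj.save("gas_surface_density.png")

    # a simple derived quantity over the whole domain: total gas mass
    ad = ds.all_data()
    print(ad.quantities.total_quantity(("gas", "mass")))
    ```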

  2. The AGORA High-resolution Galaxy Simulations Comparison Project II: Isolated disk test

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Ji-hoon; Agertz, Oscar; Teyssier, Romain

    Using an isolated Milky Way-mass galaxy simulation, we compare results from 9 state-of-the-art gravito-hydrodynamics codes widely used in the numerical community. We utilize the infrastructure we have built for the AGORA High-resolution Galaxy Simulations Comparison Project. This includes the common disk initial conditions, common physics models (e.g., radiative cooling and UV background by the standardized package Grackle) and common analysis toolkit yt, all of which are publicly available. Subgrid physics models such as Jeans pressure floor, star formation, supernova feedback energy, and metal production are carefully constrained across code platforms. With numerical accuracy that resolves the disk scale height, we find that the codes overall agree well with one another in many dimensions including: gas and stellar surface densities, rotation curves, velocity dispersions, density and temperature distribution functions, disk vertical heights, stellar clumps, star formation rates, and Kennicutt-Schmidt relations. Quantities such as velocity dispersions are very robust (agreement within a few tens of percent at all radii) while measures like newly-formed stellar clump mass functions show more significant variation (difference by up to a factor of ~3). Systematic differences exist, for example, between mesh-based and particle-based codes in the low density region, and between more diffusive and less diffusive schemes in the high density tail of the density distribution. Yet intrinsic code differences are generally small compared to the variations in numerical implementations of the common subgrid physics such as supernova feedback. Lastly, our experiment reassures that, if adequately designed in accordance with our proposed common parameters, results of a modern high-resolution galaxy formation simulation are more sensitive to input physics than to intrinsic differences in numerical schemes.

  3. THE AGORA HIGH-RESOLUTION GALAXY SIMULATIONS COMPARISON PROJECT. II. ISOLATED DISK TEST

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Ji-hoon; Agertz, Oscar; Teyssier, Romain

    Using an isolated Milky Way-mass galaxy simulation, we compare results from nine state-of-the-art gravito-hydrodynamics codes widely used in the numerical community. We utilize the infrastructure we have built for the AGORA High-resolution Galaxy Simulations Comparison Project. This includes the common disk initial conditions, common physics models (e.g., radiative cooling and UV background by the standardized package Grackle) and common analysis toolkit yt, all of which are publicly available. Subgrid physics models such as Jeans pressure floor, star formation, supernova feedback energy, and metal production are carefully constrained across code platforms. With numerical accuracy that resolves the disk scale height, we find that the codes overall agree well with one another in many dimensions including: gas and stellar surface densities, rotation curves, velocity dispersions, density and temperature distribution functions, disk vertical heights, stellar clumps, star formation rates, and Kennicutt–Schmidt relations. Quantities such as velocity dispersions are very robust (agreement within a few tens of percent at all radii) while measures like newly formed stellar clump mass functions show more significant variation (difference by up to a factor of ∼3). Systematic differences exist, for example, between mesh-based and particle-based codes in the low-density region, and between more diffusive and less diffusive schemes in the high-density tail of the density distribution. Yet intrinsic code differences are generally small compared to the variations in numerical implementations of the common subgrid physics such as supernova feedback. Our experiment reassures that, if adequately designed in accordance with our proposed common parameters, results of a modern high-resolution galaxy formation simulation are more sensitive to input physics than to intrinsic differences in numerical schemes.
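
    The quantities compared above (gas surface densities, rotation curves, clump mass functions) ultimately come from radially binning particle or cell data. A minimal, hypothetical numpy sketch of an azimuthally averaged gas surface density profile is shown below; the array names and the mock exponential disk are illustrative and are not AGORA data or the project's yt-based analysis toolkit.

      import numpy as np

      def gas_surface_density_profile(x_kpc, y_kpc, mass_msun, r_max=20.0, n_bins=40):
          # Azimuthally averaged Sigma(R) [Msun/kpc^2] from positions projected
          # onto the disk plane and the mass carried by each particle/cell.
          r = np.hypot(x_kpc, y_kpc)
          edges = np.linspace(0.0, r_max, n_bins + 1)
          mass_in_bin, _ = np.histogram(r, bins=edges, weights=mass_msun)
          ring_area = np.pi * (edges[1:]**2 - edges[:-1]**2)
          r_mid = 0.5 * (edges[1:] + edges[:-1])
          return r_mid, mass_in_bin / ring_area

      # Mock data: an exponential disk with a 2 kpc scale length.
      rng = np.random.default_rng(0)
      r = rng.exponential(scale=2.0, size=100_000)
      phi = rng.uniform(0.0, 2.0 * np.pi, size=r.size)
      r_mid, sigma = gas_surface_density_profile(r * np.cos(phi), r * np.sin(phi),
                                                 np.full(r.size, 1.0e4))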

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kilkenny, J.; Richau, G.; Sangster, C.

    A major goal of the Stockpile Stewardship Program (SSP) is to deliver validated numerical models, benchmarked against experiments that address relevant and important issues and provide data that stress the codes and our understanding. DOE/NNSA has made significant investments in major facilities and high-performance computing to successfully execute the SSP. The more information obtained about the physical state of the plasmas produced, the more stringent the test of theories, models, and codes can be, leading to increased confidence in our predictive capability. To fully exploit the world-leading capabilities of the ICF program, a multi-year program to develop and deploy advanced diagnostics has been developed by the expert scientific community. To formalize these activities, NNSA’s Acting Director for the Inertial Confinement Fusion Program directed the formation and duties of the National Diagnostics Working Group (NDWG) in a Memorandum 11/3/16 (Appendix A). The NDWG identified eight transformational diagnostics, shown in Table 1, that will provide unprecedented information from experiments in support of the SSP at NIF, Z and OMEGA. Table 1 shows how the missions of the SSP experiments including materials, complex hydrodynamics, radiation flow and effects and thermonuclear burn and boost will produce new observables, which will be measured using a variety of largely new diagnostic technologies used in the eight transformational diagnostics. The data provided by these diagnostics will validate and improve the physics contained within the SSP’s simulations and both uncover and quantify important phenomena that lie beyond our present understanding.

  5. Development and Benchmarking of a Hybrid PIC Code For Dense Plasmas and Fast Ignition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Witherspoon, F. Douglas; Welch, Dale R.; Thompson, John R.

    Radiation processes play an important role in the study of both fast ignition and other inertial confinement schemes, such as plasma jet driven magneto-inertial fusion, both in their effect on energy balance, and in generating diagnostic signals. In the latter case, warm and hot dense matter may be produced by the convergence of a plasma shell formed by the merging of an assembly of high Mach number plasma jets. This innovative approach has the potential advantage of creating matter of high energy densities in voluminous amounts compared with high power lasers or particle beams. An important application of this technology is as a plasma liner for the flux compression of magnetized plasma to create ultra-high magnetic fields and burning plasmas. HyperV Technologies Corp. has been developing plasma jet accelerator technology in both coaxial and linear railgun geometries to produce plasma jets of sufficient mass, density, and velocity to create such imploding plasma liners. An enabling tool for the development of this technology is the ability to model the plasma dynamics, not only in the accelerators themselves, but also in the resulting magnetized target plasma and within the merging/interacting plasma jets during transport to the target. Welch pioneered numerical modeling of such plasmas (including for fast ignition) using the LSP simulation code. LSP is an electromagnetic, parallelized, plasma simulation code under development since 1995. It has a number of innovative features making it uniquely suitable for modeling high energy density plasmas, including a hybrid fluid model for electrons that allows electrons in dense plasmas to be modeled with a kinetic or fluid treatment as appropriate. In addition to in-house use at Voss Scientific, several groups carrying out research in Fast Ignition (LLNL, SNL, UCSD, AWE (UK), and Imperial College (UK)) also use LSP. A collaborative team consisting of HyperV Technologies Corp., Voss Scientific LLC, FAR-TECH, Inc., Prism Computational Sciences, Inc. and Advanced Energy Systems Inc. joined efforts to develop new physics and numerical models for LSP in several key areas to enhance the ability of LSP to model high energy density plasmas (HEDP). This final report details those efforts. Areas addressed in this research effort include: adding radiation transport to LSP, first in 2D and then fully 3D, extending the EMHD model to 3D, implementing more advanced radiation and electrode plasma boundary conditions, and installing more efficient implicit numerical algorithms to speed complex 2-D and 3-D computations. The new capabilities allow modeling of the dominant processes in high energy density plasmas, and further assist the development and optimization of plasma jet accelerators, with particular attention to MHD instabilities and plasma/wall interaction (based on physical models for ion drag friction and ablation/erosion of the electrodes). In the first funding cycle we implemented a solver for the radiation diffusion equation. To solve this equation in 2-D, we used finite-differencing and applied the parallelized sparse-matrix solvers in the PETSc library (Argonne National Laboratory) to the resulting system of equations. A database of the necessary coefficients for materials of interest was assembled using the PROPACEOS and ATBASE codes from Prism. The model was benchmarked against Prism's 1-D radiation hydrodynamics code HELIOS, and against experimental data obtained from HyperV's separately funded plasma jet accelerator development program.
Work in the second funding cycle focused on extending the radiation diffusion model to full 3-D, continuing development of the EMHD model, optimizing the direct-implicit model to speed up calculations, adding multiply ionized atoms, and improving the way boundary conditions are handled in LSP. These new LSP capabilities were then used, along with analytic calculations and Mach2 runs, to investigate plasma jet merging, plasma detachment and transport, restrike and advanced jet accelerator design. In addition, a strong linkage to diagnostic measurements was made by modeling plasma jet experiments on PLX to support benchmarking of the code. A large number of upgrades and improvements advancing hybrid PIC algorithms were implemented in LSP during the second funding cycle. These include development of fully 3D radiation transport algorithms, new boundary conditions for plasma-electrode interactions, and a charge conserving equation of state that permits multiply ionized high-Z ions. The final funding cycle focused on 1) mitigating the effects of a slow-growing grid instability which is most pronounced in plasma jet frame expansion problems using the two-fluid Eulerian remap algorithm, 2) extension of the Eulerian Smoothing Algorithm to allow EOS/Radiation modeling, 3) simulations of collisionless shocks formed by jet merging, 4) simulations of merging jets using high-Z gases, 5) generation of PROPACEOS EOS/Opacity databases, 6) simulations of plasma jet transport experiments, 7) simulations of plasma jet penetration through transverse magnetic fields, and 8) GPU PIC code development. The tools developed during this project are applicable not only to the study of plasma jets, but also to a wide variety of HEDP plasmas of interest to DOE, including plasmas created in short-pulse laser experiments performed to study fast ignition concepts for inertial confinement fusion.
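
    The report describes finite-differencing the radiation diffusion equation and handing the resulting sparse system to parallel solvers in PETSc. The fragment below is only a serial, 1-D analogue in Python with scipy.sparse, not the LSP implementation: one backward-Euler step of dE/dt = d/dx(D dE/dx) with zero-flux boundaries, with placeholder grid, coefficients, and initial condition.

      import numpy as np
      import scipy.sparse as sp
      import scipy.sparse.linalg as spla

      def diffusion_step(E, D_face, dx, dt):
          # One backward-Euler step of dE/dt = d/dx(D dE/dx) with zero-flux ends.
          # E: energy density on n nodes; D_face: coefficient on the n-1 faces.
          n = E.size
          alpha = dt * D_face / dx**2
          main = np.ones(n)
          main[:-1] += alpha            # right-face contribution D_{i+1/2}
          main[1:] += alpha             # left-face contribution  D_{i-1/2}
          A = sp.diags([-alpha, main, -alpha], offsets=[-1, 0, 1], format="csr")
          return spla.spsolve(A, E)

      # Placeholder setup: a Gaussian hot spot relaxing in a uniform medium.
      n = 200
      x = np.linspace(0.0, 1.0, n)
      E = np.exp(-((x - 0.5) / 0.05) ** 2)
      D = np.full(n - 1, 1.0e-3)
      for _ in range(100):
          E = diffusion_step(E, D, dx=x[1] - x[0], dt=1.0e-3)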

  6. Tranexamic Acid Use in US Children’s Hospitals

    PubMed Central

    Nishijima, Daniel K.; Monuteaux, Michael C.; Faraoni, David; Goobie, Susan M.; Lee, Lois; Galante, Joseph; Holmes, James F.; Kuppermann, Nathan

    2016-01-01

    Background The prevalence of tranexamic acid (TXA) use for trauma and other conditions in children is unknown. Objective The objective of this study was to describe the use of TXA in US children’s hospitals for children in general, and specifically for trauma. Methods We conducted a secondary analysis of a large, administrative database of 36 US children’s hospitals. We included children younger than 18 years who received TXA (based on pharmacy charge codes) from 2009 to 2013. Patients were grouped into the following diagnostic categories: trauma, congenital heart surgery, scoliosis surgery, craniosynostosis/craniofacial surgery, and other, based on ICD-9 principal procedure and diagnostic codes. TXA administration and dosage, in-hospital clinical variables, and diagnostic and procedure codes were documented. Results A total of 35,478 pediatric encounters with a TXA charge were included in the study cohort. The proportions of children who received TXA were similar across the years 2009–2013. Only 110 encounters (0.31%) were for traumatic conditions. Congenital heart surgery accounted for more than one-half of the encounters (22,863; 64%). Overall, the median estimated weight-based dose of TXA was 22.4 mg/kg (IQR 7.3 to 84.9 mg/kg). Conclusions We identified a wide frequency of use and range of doses of TXA for several diagnostic conditions in children. The use of TXA among injured children, however, appears to be rare despite its common use and efficacy among injured adults. Further work is needed to recommend appropriate indications for TXA, and provide dosage guidelines among children with a variety of conditions, including trauma. PMID:27017532

  7. Laser program annual report, 1979

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coleman, L.W.; Strack, J.R.

    1980-03-01

    This volume contains four sections that cover the areas of target design, target fabrication, diagnostics, and experiments. Section 3 reports on target design activities, plasma theory and simulation, code development, and atomic theory. Section 4 presents the accomplishments of the target fabrication group, and Section 5 presents results of diagnostic developments and applications for the year. The results of laser-target experiments are presented. (MOW)

  8. 40 CFR 86.1 - Reference materials.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ....007-17, 86.1806-01, 86.1806-04, 86.1806-05. (xxxii) SAE J1979, Revised May 2007, (R) E/E Diagnostic... J2012, Revised April 2002, (R) Diagnostic Trouble Code Definitions Equivalent to ISO/DIS 15031-6: April... Vehicle Programming, IBR approved for §§ 86.096-38, 86.004-38, 86.007-38, 86.010-38, 86.1808-01, 86.1808...

  9. Development of an EMC3-EIRENE Synthetic Imaging Diagnostic

    NASA Astrophysics Data System (ADS)

    Meyer, William; Allen, Steve; Samuell, Cameron; Lore, Jeremy

    2017-10-01

    2D and 3D flow measurements are critical for validating numerical codes such as EMC3-EIRENE. Toroidal symmetry assumptions preclude tomographic reconstruction of 3D flows from single camera views. In addition, the resolution of the grids utilized in numerical code models can easily surpass the resolution of physical camera diagnostic geometries. For these reasons we have developed a Synthetic Imaging Diagnostic capability for forward projection comparisons of EMC3-EIRENE model solutions with the line integrated images from the Doppler Coherence Imaging diagnostic on DIII-D. The forward projection matrix is 2.8 Mpixel by 6.4 Mcells for the non-axisymmetric case we present. For flow comparisons, both simple line-integral and field-aligned component matrices must be calculated. The calculation of these matrices is a massive, embarrassingly parallel problem and is performed with a custom dispatcher that allows processing platforms to join mid-problem as they become available, or drop out if resources are needed for higher priority tasks. The matrices are handled using standard sparse matrix techniques. Prepared by LLNL under Contract DE-AC52-07NA27344. This material is based upon work supported by the U.S. DOE, Office of Science, Office of Fusion Energy Sciences. LLNL-ABS-734800.
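
    As a toy illustration of the forward-projection step described above (not the LLNL dispatcher or the 2.8 Mpixel by 6.4 Mcell matrices), the sketch below applies a sparse pixel-by-cell path-length matrix to a per-cell emissivity vector with scipy.sparse; the chord geometry and emissivities are invented.

      import numpy as np
      import scipy.sparse as sp

      def synthetic_image(projection, emissivity_per_cell):
          # Line-integrated synthetic signal: (n_pixels, n_cells) path-length
          # matrix [m] times per-cell emissivity [photons / (m^3 s sr)].
          return projection @ emissivity_per_cell

      # Toy geometry: 3 pixels viewing 5 cells, with known chord lengths.
      rows = [0, 0, 1, 1, 2, 2, 2]
      cols = [0, 1, 1, 2, 2, 3, 4]
      path = [0.02, 0.05, 0.04, 0.03, 0.01, 0.06, 0.02]   # metres per cell
      P = sp.csr_matrix((path, (rows, cols)), shape=(3, 5))

      emissivity = np.array([1.0, 2.0, 0.5, 3.0, 1.5]) * 1e18
      image = synthetic_image(P, emissivity)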

  10. Evaluation of thermal helium beam and line-ratio fast diagnostic on the National Spherical Torus Experiment-Upgrade

    DOE PAGES

    Munoz Burgos, Jorge M.; Agostini, Matteo; Scarin, Paolo; ...

    2015-05-06

    A 1-D kinetic collisional radiative model (CRM) with state-of-the-art atomic data is developed and employed to simulate line emission to evaluate the Thermal Helium Beam (THB) diagnostic on NSTX-U. This diagnostic is currently in operation on RFX-mod, and it is proposed to be installed on NSTX-U. The THB system uses the intensity ratios of neutral helium lines 667.8, 706.5, and 728.1 nm to derive electron temperature (eV) and density (cm^-3) profiles. The purpose of the present analysis is to evaluate the applications of this diagnostic for determining fast (~4 μs) electron temperature and density radial profiles in the scrape-off layer (SOL) and edge regions of NSTX-U that are needed in turbulence studies. The diagnostic is limited by the level of detection of the 728.1 nm line, which is the weakest of the three. In conclusion, this study will also aid in the future design of a similar 2-D diagnostic system on the divertor.
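
    Conceptually, the THB analysis inverts the measured 667.8/706.5/728.1 nm intensity ratios against CRM predictions tabulated on a (Te, ne) grid. The sketch below shows that inversion as a brute-force nearest-grid match; the ratio tables are arbitrary placeholder functions, not output of the 1-D kinetic CRM described above.

      import numpy as np

      # Placeholder "CRM" ratio tables on a (Te, ne) grid (illustrative only).
      te_grid = np.linspace(5.0, 100.0, 40)              # eV
      ne_grid = np.logspace(12.0, 14.0, 40)              # cm^-3
      TE, NE = np.meshgrid(te_grid, ne_grid, indexing="ij")
      ratio_te_table = 0.1 + 0.01 * TE                   # Te-sensitive ratio
      ratio_ne_table = 0.5 + 0.2 * np.log10(NE / 1e12)   # ne-sensitive ratio

      def invert_ratios(r_te, r_ne):
          # Return the (Te, ne) grid point whose modeled ratios best match the
          # measured pair (simple relative least-squares over the whole grid).
          cost = ((ratio_te_table - r_te) / r_te) ** 2 \
               + ((ratio_ne_table - r_ne) / r_ne) ** 2
          i, j = np.unravel_index(np.argmin(cost), cost.shape)
          return te_grid[i], ne_grid[j]

      te, ne = invert_ratios(r_te=0.45, r_ne=0.75)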

  11. Diagnostic and Demographic Differences Between Incarcerated and Nonincarcerated Youth (Ages 6-15) With ADHD in South Carolina.

    PubMed

    Soltis, Samuel L; Probst, Janice; Xirasagar, Sudha; Martin, Amy B; Smith, Bradley H

    2017-05-01

    The objective was to analyze diagnostic and demographic factors to identify predictors of delinquency resulting in incarceration within a group of children/adolescents diagnosed with ADHD. The study followed a cohort of 15,472 Medicaid-covered children/adolescents with ADHD, ages 6 to 15 inclusive, between January 1, 2003, and December 31, 2006. Diagnostic and Statistical Manual of Mental Disorders (4th ed., text rev.; DSM-IV-TR; 2000) codes were used as the qualifying diagnosis codes. Available demographic characteristics included race, sex, and residence. The outcome was incarceration at the South Carolina Department of Juvenile Justice during 2005-2006. Among youth with ADHD, incarceration was more likely among black, male, and urban youth. Children/adolescents with comorbid ODD and/or CD were at greater risk compared with those with ADHD alone. Within ADHD-diagnosed youth, comorbid conditions and demographic characteristics increase the risk of incarceration. Intervention and treatment strategies that address behavior among youth with these characteristics are needed to reduce incarceration.

  12. Fusion programs in applied plasma physics

    NASA Astrophysics Data System (ADS)

    1992-07-01

    The Applied Plasma Physics (APP) program at General Atomics (GA) described here includes four major elements: (1) Applied Plasma Physics Theory Program, (2) Alpha Particle Diagnostic, (3) Edge and Current Density Diagnostic, and (4) Fusion User Service Center (USC). The objective of the APP theoretical plasma physics research at GA is to support the DIII-D and other tokamak experiments and to significantly advance our ability to design a commercially-attractive fusion reactor. We categorize our efforts in three areas: magnetohydrodynamic (MHD) equilibria and stability; plasma transport with emphasis on H-mode, divertor, and boundary physics; and radio frequency (RF). The objective of the APP alpha particle diagnostic is to develop diagnostics of fast confined alpha particles using the interactions with the ablation cloud surrounding injected pellets and to develop diagnostic systems for reacting and ignited plasmas. The objective of the APP edge and current density diagnostic is to first develop a lithium beam diagnostic system for edge fluctuation studies on the Texas Experimental Tokamak (TEXT). The objective of the Fusion USC is to continue to provide maintenance and programming support to computer users in the GA fusion community. The detailed progress of each separate program covered in this report period is described.

  13. Spatial panel analyses of alcohol outlets and motor vehicle crashes in California: 1999–2008

    PubMed Central

    Ponicki, William R.; Gruenewald, Paul J.; Remer, Lillian G.

    2014-01-01

    Although past research has linked alcohol outlet density to higher rates of drinking and many related social problems, there is conflicting evidence of density’s association with traffic crashes. An abundance of local alcohol outlets simultaneously encourages drinking and reduces driving distances required to obtain alcohol, leading to an indeterminate expected impact on alcohol-involved crash risk. This study separately investigates the effects of outlet density on (1) the risk of injury crashes relative to population and (2) the likelihood that any given crash is alcohol-involved, as indicated by police reports and single-vehicle nighttime status of crashes. Alcohol outlet density effects are estimated using Bayesian misalignment Poisson analyses of all California ZIP codes over the years 1999–2008. These misalignment models allow panel analysis of ZIP-code data despite frequent redefinition of postal-code boundaries, while also controlling for overdispersion and the effects of spatial autocorrelation. Because models control for overall retail density, estimated alcohol-outlet associations represent the extra effect of retail establishments selling alcohol. The results indicate a number of statistically well-supported associations between retail density and crash behavior, but the implied effects on crash risks are relatively small. Alcohol-serving restaurants have a greater impact on overall crash risks than on the likelihood that those crashes involve alcohol, whereas bars primarily affect the odds that crashes are alcohol-involved. Off-premise outlet density is negatively associated with risks of both crashes and alcohol involvement, while the presence of a tribal casino in a ZIP code is linked to higher odds of police-reported drinking involvement. Alcohol outlets in a given area are found to influence crash risks both locally and in adjacent ZIP codes, and significant spatial autocorrelation also suggests important relationships across geographical units. These results suggest that each type of alcohol outlet can have differing impacts on risks of crashing as well as the alcohol involvement of those crashes. PMID:23537623
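
    A simplified, non-spatial analogue of the count models described above can be fit with statsmodels: a Poisson GLM of crash counts on outlet densities with a log-population offset. This is only a sketch with simulated data; it omits the Bayesian misalignment, panel, and spatial-autocorrelation structure of the actual study, and the variable names are hypothetical.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      # Simulated ZIP-code records: outlet densities, population, crash counts.
      rng = np.random.default_rng(1)
      n = 500
      df = pd.DataFrame({
          "bars_per_km2": rng.gamma(2.0, 0.5, n),
          "offpremise_per_km2": rng.gamma(2.0, 0.8, n),
          "population": rng.integers(2_000, 60_000, n),
      })
      rate = 0.002 * np.exp(0.10 * df["bars_per_km2"] - 0.05 * df["offpremise_per_km2"])
      df["crashes"] = rng.poisson(rate * df["population"])

      # Poisson regression of counts with log(population) as the exposure offset.
      X = sm.add_constant(df[["bars_per_km2", "offpremise_per_km2"]])
      model = sm.GLM(df["crashes"], X, family=sm.families.Poisson(),
                     offset=np.log(df["population"]))
      print(model.fit().summary())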

  14. SPAMCART: a code for smoothed particle Monte Carlo radiative transfer

    NASA Astrophysics Data System (ADS)

    Lomax, O.; Whitworth, A. P.

    2016-10-01

    We present a code for generating synthetic spectral energy distributions and intensity maps from smoothed particle hydrodynamics simulation snapshots. The code is based on the Lucy Monte Carlo radiative transfer method, i.e., it follows discrete luminosity packets as they propagate through a density field, and then uses their trajectories to compute the radiative equilibrium temperature of the ambient dust. The sources can be extended and/or embedded, and discrete and/or diffuse. The density is not mapped onto a grid, and therefore the calculation is performed at exactly the same resolution as the hydrodynamics. We present two example calculations using this method. First, we demonstrate that the code strictly adheres to Kirchhoff's law of radiation. Secondly, we present synthetic intensity maps and spectra of an embedded protostellar multiple system. The algorithm uses data structures that are already constructed for other purposes in modern particle codes. It is therefore relatively simple to implement.
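
    The heart of the Lucy method is sampling an optical depth for each luminosity packet and depositing energy where that depth is reached. The fragment below is a deliberately reduced 1-D, absorption-only version for a uniform slab, meant only to illustrate that sampling step; it is not the SPAMCART smoothed-particle path integration.

      import numpy as np

      def propagate_packets(n_packets, kappa, rho, slab_depth, seed=None):
          # Each packet draws tau ~ Exp(1); it is absorbed where the accumulated
          # optical depth kappa * rho * s reaches that value, or escapes the slab.
          rng = np.random.default_rng(seed)
          tau = -np.log(rng.random(n_packets))
          s_abs = tau / (kappa * rho)
          escaped = s_abs > slab_depth
          return escaped.mean(), s_abs[~escaped]

      escape_fraction, absorption_points = propagate_packets(
          n_packets=100_000, kappa=1.0, rho=0.5, slab_depth=2.0)
      # Analytic check: escape fraction -> exp(-kappa * rho * depth) = exp(-1).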

  15. Recent Updates to the MELCOR 1.8.2 Code for ITER Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merrill, Brad J

    This report documents recent changes made to the MELCOR 1.8.2 computer code for application to the International Thermonuclear Experimental Reactor (ITER), as required by ITER Task Agreement ITA 81-18. There are four areas of change documented by this report. The first area is the addition to this code of a model for transporting HTO. The second area is the updating of the material oxidation correlations to match those specified in the ITER Safety Analysis Data List (SADL). The third area replaces a modification to an aerosol transport subroutine that specified the nominal aerosol density internally with one that now allows the user to specify this density through user input. The fourth area corrected an error that existed in an air condensation subroutine of previous versions of this modified MELCOR code. The appendices of this report contain FORTRAN listings of the coding for these modifications.

  16. A novel data processing technique for image reconstruction of penumbral imaging

    NASA Astrophysics Data System (ADS)

    Xie, Hongwei; Li, Hongyun; Xu, Zeping; Song, Guzhou; Zhang, Faqiang; Zhou, Lin

    2011-06-01

    A CT image reconstruction technique was applied to the data processing of penumbral imaging. Compared with traditional processing techniques for penumbral coded-pinhole images, such as the Wiener, Lucy-Richardson, and blind techniques, this approach is new. In this method, the coded aperture processing is, for the first time, independent of the point spread function of the imaging diagnostic system. In this way, the technical obstacles in traditional coded-pinhole image processing caused by the uncertainty of the point spread function were overcome. Based on a theoretical study, simulations of penumbral imaging and image reconstruction were carried out and provided fairly good results. In the visible-light experiment, a point source of light was used to irradiate a 5 mm × 5 mm object after diffuse scattering and volume scattering. The penumbral imaging was performed with an aperture size of ~20 mm. Finally, the CT image reconstruction technique was used for image reconstruction and provided a fairly good reconstruction result.
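
    For readers unfamiliar with the CT step, the generic filtered back-projection workflow can be sketched with scikit-image's radon/iradon pair on a synthetic phantom. This illustrates the class of reconstruction applied, under the assumption of a recent scikit-image API (the filter_name keyword); it is not the authors' penumbral-specific processing chain.

      import numpy as np
      from skimage.transform import radon, iradon

      # Synthetic 128x128 phantom: a bright square "source" on a dark background.
      phantom = np.zeros((128, 128))
      phantom[54:74, 54:74] = 1.0

      # Forward projection over 180 angles, then filtered back-projection.
      angles = np.linspace(0.0, 180.0, 180, endpoint=False)
      sinogram = radon(phantom, theta=angles)
      reconstruction = iradon(sinogram, theta=angles, filter_name="ramp")

      rms_error = np.sqrt(np.mean((reconstruction - phantom) ** 2))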

  17. Spectroscopic diagnostics of tungsten-doped CH plasmas

    NASA Astrophysics Data System (ADS)

    Klapisch, M.; Colombant, D.; Lehecka, T.

    1998-11-01

    Spectra of CH with different concentrations of W dopant and laser intensities (2.5-10 × 10^12 W/cm^2) were obtained at NRL with the Nike Laser. They were recorded in the 100-500 eV range with an XUV grating spectrometer. The hydrodynamic simulations are performed with the 1D code FAST1D (J. H. Gardner et al., Phys. Plasmas, 5, May (1998)), where non-LTE effects are introduced by Busquet's model (M. Busquet, Phys. Fluids B, 5, 4191 (1993); M. Klapisch, A. Bar-Shalom, J. Oreg and D. Colombant, Phys. Plasmas, 5, May (1998)). They are then post-processed with TRANSPEC (O. Peyrusse, J. Quant. Spectrosc. Radiat. Transfer, 51, 281 (1994)), a time-dependent collisional radiative code with radiation coupling. The necessary atomic data are obtained from the HULLAC code (M. Klapisch and A. Bar-Shalom, J. Quant. Spectrosc. Radiat. Transfer, 58, 687 (1997)). The post-processing and diagnostics were performed on carbon lines and the results are compared with the experimental data.

  18. Laboratory testing for cytomegalovirus among pregnant women in the United States: a retrospective study using administrative claims data

    PubMed Central

    2012-01-01

    Background Routine cytomegalovirus (CMV) screening during pregnancy is not recommended in the United States and the extent to which it is performed is unknown. Using a medical claims database, we computed rates of CMV-specific testing among pregnant women. Methods We used medical claims from the 2009 Truven Health MarketScan® Commercial databases. We computed CMV-specific testing rates using CPT codes. Results We identified 77,773 pregnant women, of whom 1,668 (2%) had a claim for CMV-specific testing. CMV-specific testing was significantly associated with older age, Northeast or urban residence, and a diagnostic code for mononucleosis. We identified 44 women with a diagnostic code for mononucleosis, of whom 14% had CMV-specific testing. Conclusions Few pregnant women had CMV-specific testing, suggesting that screening for CMV infection during pregnancy is not commonly performed. In the absence of national surveillance for CMV infections during pregnancy, healthcare claims are a potential source for monitoring practices of CMV-specific testing. PMID:23198949
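
    The headline rate in such a claims analysis is simply the share of the pregnancy cohort with at least one CMV-specific test claim. A hypothetical pandas sketch is shown below; the claim rows and the CPT code set are placeholders, not the study's actual cohort definition or code list.

      import pandas as pd

      # Mock claims extract: one row per claim for an already-identified cohort.
      claims = pd.DataFrame({
          "patient_id": [1, 1, 2, 3, 3, 4, 5],
          "cpt_code":   ["59400", "86644", "59400", "59400", "80055", "59400", "86645"],
      })
      cmv_cpt_codes = {"86644", "86645"}        # placeholder CMV-specific tests

      cohort = claims["patient_id"].unique()
      tested = claims.loc[claims["cpt_code"].isin(cmv_cpt_codes), "patient_id"].unique()
      testing_rate = len(tested) / len(cohort)  # fraction with CMV-specific testing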

  19. High-Risk Series: An Update

    DTIC Science & Technology

    2015-02-01

    monitoring of veterans with major depressive disorder (MDD) and whether those who are prescribed an antidepressant receive recommended care, we...determined that VA data may underestimate the prevalence of major depressive disorder among veterans and that a lack of training for VA clinicians on...not always appropriately coded encounters with veterans they diagnosed as having MDD, instead using a less specific diagnostic code for “depression”

  20. Hydrodynamics simulations of 2ω laser propagation in underdense gasbag plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meezan, N.B.; Divol, L.; Marinak, M.M.

    2004-12-01

    Recent 2ω laser propagation and stimulated Raman backscatter (SRS) experiments performed on the Helen laser have been analyzed using the radiation-hydrodynamics code HYDRA [M. M. Marinak, G. D. Kerbel, N. A. Gentile, O. Jones, D. Munro, S. Pollaine, T. R. Dittrich, and S. W. Haan, Phys. Plasmas 8, 2275 (2001)]. These experiments utilized two diagnostics sensitive to the hydrodynamics of gasbag targets: a fast x-ray framing camera (FXI) and an SRS streak spectrometer. With a newly implemented nonlocal thermal transport model, HYDRA is able to reproduce many features seen in the FXI images and the SRS streak spectra. Experimental and simulated side-on FXI images suggest that propagation can be explained by classical laser absorption and the resulting hydrodynamics. Synthetic SRS spectra generated from the HYDRA results reproduce the details of the experimental SRS streak spectra. Most features in the synthetic spectra can be explained solely by axial density and temperature gradients. The total SRS backscatter increases with initial gasbag fill density up to ≈0.08 times the critical density, then decreases. Data from a near-backscatter imaging camera show that severe beam spray is not responsible for the trend in total backscatter. Filamentation does not appear to be a significant factor in gasbag hydrodynamics. The simulation and analysis techniques established here can be used in ongoing experimental campaigns on the Omega laser facility and the National Ignition Facility.

  1. Simulated performance of the optical Thomson scattering diagnostic designed for the National Ignition Facility.

    PubMed

    Ross, J S; Datte, P; Divol, L; Galbraith, J; Froula, D H; Glenzer, S H; Hatch, B; Katz, J; Kilkenny, J; Landen, O; Manuel, A M; Molander, W; Montgomery, D S; Moody, J D; Swadling, G; Weaver, J

    2016-11-01

    An optical Thomson scattering diagnostic has been designed for the National Ignition Facility to characterize under-dense plasmas. We report on the design of the system and the expected performance for different target configurations. The diagnostic is designed to spatially and temporally resolve the Thomson scattered light from laser driven targets. The diagnostic will collect scattered light from a 50 × 50 × 200 μm volume. The optical design allows operation with different probe laser wavelengths. A deep-UV probe beam (λ0 = 210 nm) will be used to Thomson scatter from electron plasma densities of ∼5 × 10^20 cm^-3, while a 3ω probe will be used for plasma densities of ∼1 × 10^19 cm^-3. The diagnostic package contains two spectrometers: the first to resolve Thomson scattering from ion acoustic wave fluctuations and the second to resolve scattering from electron plasma wave fluctuations. Expected signal levels relative to background will be presented for typical target configurations (hohlraums and a planar foil).
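
    Whether such a probe operates in the collective regime is usually judged by the scattering parameter α = 1/(k λ_D). The sketch below evaluates α for the two probe/density combinations quoted above, assuming an illustrative electron temperature and a 90° scattering angle; both are assumptions for the example, not design values from the paper.

      import numpy as np

      EPS0 = 8.8541878128e-12   # vacuum permittivity [F/m]
      QE = 1.602176634e-19      # elementary charge [C]

      def scattering_alpha(lambda0_nm, ne_cm3, te_ev, theta_deg=90.0):
          # alpha = 1 / (k * lambda_D) with k = (4 pi / lambda0) * sin(theta / 2).
          ne = ne_cm3 * 1e6                               # convert to m^-3
          lambda_d = np.sqrt(EPS0 * te_ev / (ne * QE))    # Debye length [m], Te in eV
          k = 4.0 * np.pi / (lambda0_nm * 1e-9) * np.sin(np.radians(theta_deg) / 2.0)
          return 1.0 / (k * lambda_d)

      # Assumed Te = 2 keV, 90-degree scattering (illustrative values only).
      alpha_deep_uv = scattering_alpha(210.0, 5e20, 2000.0)   # deep-UV probe
      alpha_3w = scattering_alpha(351.0, 1e19, 2000.0)        # 3-omega probe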

  2. Alfven resonance mode conversion in the Phaedrus-T current drive experiments: Modelling and density fluctuations measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vukovic, M.; Harper, M.; Breun, R.

    1995-12-31

    Current drive experiments on the Phaedrus-T tokamak performed with a low field side two-strap fast wave antenna at frequencies below ω_cH show loop voltage drops of up to 30% with strap phasing (0, π/2). RF induced density fluctuations in the plasma core have also been observed with a microwave reflectometer. It is believed that they are caused by kinetic Alfven waves generated by mode conversion of fast waves at the Alfven resonance. Correlation of the observed density fluctuations with the magnitude of ΔV_loop suggests that the ΔV_loop is attributable to current drive/heating due to mode-converted kinetic Alfven waves. The toroidal cold plasma wave code LION is used to model the Alfven resonance mode conversion surfaces in the experiments while the cylindrical hot plasma kinetic wave code ISMENE is used to model the behavior of kinetic Alfven waves at the Alfven resonance location. Initial results obtained from limited density, magnetic field, antenna phase, and impurity scans show good agreement between the RF induced density fluctuations and the predicted behavior of the kinetic Alfven waves. Detailed comparisons between the density fluctuations and the code predictions are presented.

  3. Design of the radiation shielding for the time of flight enhanced diagnostics neutron spectrometer at Experimental Advanced Superconducting Tokamak

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Du, T. F.; Chen, Z. J.; Peng, X. Y.

    Radiation shielding has been designed to reduce scattered neutrons and background gamma rays for the new double-ring Time Of Flight Enhanced Diagnostics (TOFED) spectrometer. The shielding was designed based on simulations with the Monte Carlo code MCNP5. A dedicated model of the EAST tokamak has been developed, together with the neutron emission source profile and spectrum; the latter were simulated with the Nubeam and GENESIS codes. A significant reduction of the background radiation at the detector can be achieved, which satisfies the requirement of TOFED. The intensities of the scattered and direct neutrons in the line of sight of the TOFED neutron spectrometer at EAST are studied for future data interpretation.

  4. 3D Vectorial Time Domain Computational Integrated Photonics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kallman, J S; Bond, T C; Koning, J M

    2007-02-16

    The design of integrated photonic structures poses considerable challenges. 3D-Time-Domain design tools are fundamental in enabling technologies such as all-optical logic, photonic bandgap sensors, THz imaging, and fast radiation diagnostics. Such technologies are essential to LLNL and WFO sponsors for a broad range of applications: encryption for communications and surveillance sensors (NSA, NAI and IDIV/PAT); high density optical interconnects for high-performance computing (ASCI); high-bandwidth instrumentation for NIF diagnostics; micro-sensor development for weapon miniaturization within the Stockpile Stewardship and DNT programs; and applications within HSO for CBNP detection devices. While there exist a number of photonics simulation tools on the market, they primarily model devices of interest to the communications industry. We saw the need to extend our previous software to match the Laboratory's unique emerging needs. These include modeling novel material effects (such as those of radiation induced carrier concentrations on refractive index) and device configurations (RadTracker bulk optics with radiation induced details, Optical Logic edge emitting lasers with lateral optical inputs). In addition we foresaw significant advantages to expanding our own internal simulation codes: parallel supercomputing could be incorporated from the start, and the simulation source code would be accessible for modification and extension. This work addressed Engineering's Simulation Technology Focus Area, specifically photonics. Problems addressed from the Engineering roadmap of the time included modeling the Auston switch (an important THz source/receiver), modeling Vertical Cavity Surface Emitting Lasers (VCSELs, which had been envisioned as part of fast radiation sensors), and multi-scale modeling of optical systems (for a variety of applications). We proposed to develop novel techniques to numerically solve the 3D multi-scale propagation problem for both the microchip laser logic devices as well as devices characterized by electromagnetic (EM) propagation in nonlinear materials with time-varying parameters. The deliverables for this project were extended versions of the laser logic device code Quench2D and the EM propagation code EMsolve with new modules containing the novel solutions incorporated by taking advantage of the existing software interface and structured computational modules. Our approach was multi-faceted since no single methodology can always satisfy the tradeoff between model runtime and accuracy requirements. We divided the problems to be solved into two main categories: those that required Full Wave Methods and those that could be modeled using Approximate Methods. Full Wave techniques are useful in situations where Maxwell's equations are not separable (or the problem is small in space and time), while approximate techniques can treat many of the remaining cases.
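
    As a minimal illustration of the time-domain EM method this class of tools extends (and not of EMsolve or Quench2D themselves), the following lines implement the textbook 1-D Yee leapfrog update in normalized units with a soft Gaussian source and perfect-conductor ends.

      import numpy as np

      nz, nt = 400, 800
      courant = 0.5                 # Courant number (stable for <= 1 in 1-D)
      ez = np.zeros(nz)             # E-field on integer grid nodes
      hy = np.zeros(nz - 1)         # H-field on half-integer grid nodes

      for n in range(nt):
          # Leapfrog: update H from curl E, then E from curl H.
          hy += courant * np.diff(ez)
          ez[1:-1] += courant * np.diff(hy)
          # Soft Gaussian source injected at the middle of the grid.
          ez[nz // 2] += np.exp(-((n - 60) / 20.0) ** 2)
          # ez[0] and ez[-1] stay zero: perfect-electric-conductor boundaries.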

  5. A recoverable gas-cell diagnostic for the National Ignition Facility.

    PubMed

    Ratkiewicz, A; Berzak Hopkins, L; Bleuel, D L; Bernstein, L A; van Bibber, K; Cassata, W S; Goldblum, B L; Siem, S; Velsko, C A; Wiedeking, M; Yeamans, C B

    2016-11-01

    The high-fluence neutron spectrum produced by the National Ignition Facility (NIF) provides an opportunity to measure the activation of materials by fast-spectrum neutrons. A new large-volume gas-cell diagnostic has been designed and qualified to measure the activation of gaseous substances at the NIF. This in-chamber diagnostic is recoverable, reusable, and has been successfully fielded. Data from the qualification of the diagnostic have been used to benchmark a Monte Carlo N-Particle (MCNP) transport code simulation describing the downscattered neutron spectrum seen by the gas cell. We present early results from the use of this diagnostic to measure the activation of natXe and discuss future work to study the strength of interactions between plasma and nuclei.

  6. NR-code: Nonlinear reconstruction code

    NASA Astrophysics Data System (ADS)

    Yu, Yu; Pen, Ue-Li; Zhu, Hong-Ming

    2018-04-01

    NR-code applies nonlinear reconstruction to the dark matter density field in redshift space and solves for the nonlinear mapping from the initial Lagrangian positions to the final redshift space positions; this reverses the large-scale bulk flows and improves the precision measurement of the baryon acoustic oscillations (BAO) scale.

  7. Simulation of density fluctuations before the L-H transition for Hydrogen and Deuterium plasmas in the DIII-D tokamak using the BOUT++ code

    NASA Astrophysics Data System (ADS)

    Wang, Y. M.; Xu, X. Q.; Yan, Z.; Mckee, G. R.; Grierson, B. A.; Xia, T. Y.; Gao, X.

    2018-02-01

    A six-field two-fluid model has been used to simulate density fluctuations. The equilibrium is generated from experimental measurements for both deuterium (D) and hydrogen (H) plasmas at the lowest densities of the DIII-D low-to-high confinement (L-H) transition experiments. In linear simulations, the unstable modes are found to be resistive ballooning modes with the most unstable mode number n = 30, or k_θ ρ_i ~ 0.12. The ion diamagnetic drift and E×B convection flow are balanced when the radial electric field (E_r) is calculated from the pressure profile without net flow. The curvature drift plays an important role in this stage. Two poloidally counter-propagating modes are found in the nonlinear simulation of the D plasma at electron density n_e ~ 1.5×10^19 m^-3 near the separatrix, while a single ion mode is found in the H plasma at a similar low density; these results are consistent with the experimental results measured by the beam emission spectroscopy (BES) diagnostic on the DIII-D tokamak. The frequencies of the electron modes and the ion modes are about 40 kHz and 10 kHz, respectively. The poloidal wave number k_θ is about 0.2 cm^-1 (k_θ ρ_i ~ 0.05) for both ion and electron modes. The particle flux and the ion and electron heat fluxes are ~3.5-6 times larger for the H plasma than for the D plasma, which makes it harder to achieve H-mode for the same heating power. Changing the atomic mass number A from 2 to 1 while using the D plasma equilibrium makes little difference to the fluxes. Increasing the electric field suppresses the density fluctuations. The electric field scan and ion mass scan results show that the dual-mode behavior results primarily from differences in the profiles rather than from the ion mass.
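
    Mode frequencies and poloidal wavenumbers like those quoted above are commonly extracted from two poloidally separated channels via the cross-spectrum. The sketch below does this for two synthetic signals with scipy.signal.csd; the sampling rate, channel spacing, and mode parameters are invented, and the sign of the recovered wavenumber depends on the cross-spectral phase convention.

      import numpy as np
      from scipy.signal import csd

      fs = 1.0e6                        # 1 MHz sampling (assumed)
      t = np.arange(0.0, 0.02, 1.0 / fs)
      dz = 1.0                          # cm, poloidal channel separation (assumed)
      k_theta, f_mode = 0.2, 40e3       # cm^-1 and Hz, illustrative mode
      rng = np.random.default_rng(2)
      s1 = np.sin(2 * np.pi * f_mode * t) + 0.5 * rng.standard_normal(t.size)
      s2 = np.sin(2 * np.pi * f_mode * t - k_theta * dz) + 0.5 * rng.standard_normal(t.size)

      # Cross-spectrum: the peak gives the mode frequency, its phase the
      # channel-to-channel delay, hence |k_theta| = |phase| / dz.
      f, Pxy = csd(s1, s2, fs=fs, nperseg=4096)
      peak = np.argmax(np.abs(Pxy))
      f_estimate = f[peak]
      k_estimate = abs(np.angle(Pxy[peak])) / dz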

  8. Embedding CLIPS in a database-oriented diagnostic system

    NASA Technical Reports Server (NTRS)

    Conway, Tim

    1990-01-01

    This paper describes the integration of C Language Production Systems (CLIPS) into a powerful portable maintenance aid (PMA) system used for flightline diagnostics. The current diagnostic target of the system is the Garrett GTCP85-180L, a gas turbine engine used as an Auxiliary Power Unit (APU) on some C-130 military transport aircraft. This project is a database oriented approach to a generic diagnostic system. CLIPS is used for 'many-to-many' pattern matching within the diagnostics process. Patterns are stored in database format, and CLIPS code is generated by a 'compilation' process on the database. Multiple CLIPS rule sets and working memories (in sequence) are supported and communication between the rule sets is achieved via the export and import commands. Work is continuing on using CLIPS in other portions of the diagnostic system and in re-implementing the diagnostic system in the Ada language.

  9. Analysis of the Effect of Electron Density Perturbations Generated by Gravity Waves on HF Communication Links

    NASA Astrophysics Data System (ADS)

    Fagre, M.; Elias, A. G.; Chum, J.; Cabrera, M. A.

    2017-12-01

    In the present work, ray tracing of high frequency (HF) signals in disturbed ionospheric conditions is analyzed, particularly in the presence of electron density perturbations generated by gravity waves (GWs). The three-dimensional numerical ray tracing code by Jones and Stephenson, based on Hamilton's equations, which is commonly used to study radio propagation through the ionosphere, is used. An electron density perturbation model is implemented in this code, based on atmospheric GWs generated at a height of 150 km in the thermosphere and propagating up into the ionosphere. The motion of the neutral gas at these altitudes induces disturbances in the background plasma which affect HF signal propagation. To obtain a realistic model of GWs in order to analyze the propagation and dispersion characteristics, a GW ray tracing method with kinematic viscosity and thermal diffusivity was applied. The IRI-2012, HWM14 and NRLMSISE-00 models were incorporated to assess electron density, wind velocities, neutral temperature and total mass density needed for the ray tracing codes. Preliminary results of gravity wave effects on ground range and reflection height are presented for the low- to mid-latitude ionosphere.
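
    A simple way to see how a GW density perturbation moves an HF reflection point is to compare the altitude where the plasma frequency first exceeds the wave frequency with and without the perturbation. The sketch below does this for a Chapman-like profile with invented parameters; it is a toy vertical-incidence picture, not the Jones and Stephenson 3D ray tracing driven by IRI-2012/HWM14/NRLMSISE-00.

      import numpy as np

      def plasma_frequency_hz(ne_m3):
          # f_p [Hz] ~= 8.98 * sqrt(n_e [m^-3])
          return 8.98 * np.sqrt(ne_m3)

      def reflection_height(f_wave_hz, z_km, ne_m3):
          # Lowest altitude where f_p >= f_wave (None if the wave penetrates).
          idx = np.where(plasma_frequency_hz(ne_m3) >= f_wave_hz)[0]
          return z_km[idx[0]] if idx.size else None

      # Chapman-like background profile (illustrative parameters only).
      z = np.linspace(90.0, 400.0, 2000)
      zp = (z - 300.0) / 50.0
      ne0 = 1e12 * np.exp(0.5 * (1.0 - zp - np.exp(-zp)))

      # Gravity-wave-like perturbation: 5% amplitude, 30 km vertical wavelength.
      ne_gw = ne0 * (1.0 + 0.05 * np.sin(2.0 * np.pi * z / 30.0))

      h_quiet = reflection_height(7.0e6, z, ne0)     # 7 MHz, unperturbed
      h_gw = reflection_height(7.0e6, z, ne_gw)      # 7 MHz, with GW perturbation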

  10. Brain profiling and clinical-neuroscience.

    PubMed

    Peled, Avi

    2006-01-01

    The current psychiatric diagnostic system, the Diagnostic and Statistical Manual (DSM), has recently come under increasing criticism. The major reason for the shortcomings of the current psychiatric diagnosis is the lack of scientific, brain-related etiological knowledge about mental disorders. The advancement toward such knowledge is further hampered by the lack of a theoretical framework or "language" that translates clinical findings of mental disorders to brain disturbances and insufficiencies. Here such a theoretical construct is proposed based on insights from neuroscience and neural-computation models. Correlates between clinical manifestations and presumed neuronal network disturbances are proposed in the form of a practical diagnostic system titled "Brain Profiling". Three dimensions make up brain profiling: "neural complexity disorders", "neuronal resilience insufficiency", and "context-sensitive processing decline". The first dimension relates to disturbances occurring to fast neuronal activations in the millisecond range; it incorporates connectivity and hierarchical imbalances appertaining typically to psychotic and schizophrenic clinical manifestations. The second dimension relates to disturbances that alter slower changes, namely long-term synaptic modulations, and incorporates disturbances to optimization and constraint satisfactions within relevant neuronal circuitry. Finally, the level of internal representations related to personality disorders is presented by a "context-sensitive process decline" as the third dimension. For practical use of brain profiling diagnosis, a consensual list of psychiatric clinical manifestations provides a "diagnostic input vector": clinical findings are coded 1 for "detection", 0 for "non-detection", and 0.5 for "questionable". The entries are clustered according to their presumed neuronal dynamic relationships, and coefficients determine their relevance to the specific related brain disturbance. Relevant equations calculate and normalize the different values attributed to relevant brain disturbances, culminating in a three-digit estimation representing the three diagnostic dimensions. Brain profiling holds promise for a future brain-related diagnosis. It offers testable predictions about the etiology of mental disorders because, being brain-related, it lends itself readily to brain imaging investigations. Because each diagnosis is also a single point in a three-dimensional space, multiple follow-up diagnoses trace a trajectory representing an easy-to-see clinical history of the patient. Additional, more immediate advantages include reduced stigma, because the disorder is related to the brain rather than the person; in addition, the three-digit diagnostic code is clinically informative, unlike the DSM codes, which have no clinical relevance. To conclude, brain profiling diagnosis of mental disorders could be a bold new step toward a "clinical neuroscience" replacing "psychiatry".
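
    The coding scheme described above (1 = detected, 0 = not detected, 0.5 = questionable, weighted onto three dimensions) amounts to a small linear mapping. The sketch below is purely schematic: the findings vector, coefficient matrix, and normalization are invented to show the shape of the computation, not the consensual list or coefficients proposed in the paper.

      import numpy as np

      # Hypothetical diagnostic input vector of clinical findings.
      findings = np.array([1.0, 0.0, 0.5, 1.0, 0.0, 0.5])

      # Hypothetical coefficients mapping findings onto the three dimensions:
      # rows = (complexity, resilience, context-sensitivity), columns = findings.
      weights = np.array([
          [0.8, 0.1, 0.0, 0.4, 0.0, 0.0],
          [0.0, 0.6, 0.7, 0.0, 0.2, 0.0],
          [0.1, 0.0, 0.0, 0.3, 0.5, 0.9],
      ])

      raw = weights @ findings
      profile = raw / weights.sum(axis=1)              # normalize each dimension
      three_digit_code = "".join(str(int(round(9 * p))) for p in profile)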

  11. Characterization of microwave plasma in a multicusp using 2D emission based tomography: Bessel modes and wave absorption

    NASA Astrophysics Data System (ADS)

    Rathore, Kavita; Bhattacharjee, Sudeep; Munshi, Prabhat

    2017-06-01

    A tomographic method based on the Fourier transform is used for characterizing a microwave plasma in a multicusp (MC), in order to obtain 2D distribution of plasma emissions, plasma (electron) density (Ne) and temperature (Te). The microwave plasma in the MC is characterized as a function of microwave power, gas pressure, and axial distance. The experimentally obtained 2D emission profiles show that the plasma emissions are generated in a circular ring shape. There are usually two bright rings, one at the plasma core and another near the boundary. The experimental results are validated using a numerical code that solves Maxwell's equations inside a waveguide filled with a plasma in a magnetic field, with collisions included. It is inferred that the dark and bright circular ring patterns are a result of superposition of Bessel modes (TE11 and TE21) of the wave electric field inside the plasma filled MC, which are in reasonable agreement with the plasma emission profiles. The tomographically obtained Ne and Te profiles indicate higher densities in the plasma core (~10^10 cm^-3) and enhanced electron temperature in the ECR region (~13 eV), which are in agreement with earlier results using a Langmuir probe and optical emission spectroscopy (OES) diagnostics.

  12. Laser Ionization Studies of Hydrocarbon Flames.

    NASA Astrophysics Data System (ADS)

    Bernstein, Jeffrey Scott

    Resonance-enhanced multiphoton ionization (REMPI) and laser induced fluorescence (LIF) are applied as laser based flame diagnostics for studies of hydrocarbon combustion chemistry. CH_4/O_2, C_2H_4/O_2, and C_2H_6/O_2 low-pressure (~20 Torr), stoichiometric, burner-stabilized flat flames are studied. Density profiles of intermediate flame species, existing at ppm concentrations, are mapped out as a function of distance from the burner head. Profiles resulting from REMPI and LIF detection are obtained for HCO, CH_3, H, O, OH, CH, and CO flame radicals. The above flame systems are computer modeled against currently accepted combustion mechanisms using the Chemkin and Premix flame codes developed at Sandia National Laboratories. The modeled profile densities show good agreement with the experimental results of the CH_4/O_2 flame system, thus confirming the current C1 kinetic flame mechanism. Discrepancies between experimental and modeled results are found with the C2 flames. These discrepancies are partially amended by modifying the rate constant of the C_2H_3 + O_2 → H_2CO + HCO reaction. The modeled results computed with the modified rate constant strongly suggest that the kinetics of several or possibly many reactions in the C2 mechanism need refinement.

  13. New detection system and signal processing for the tokamak ISTTOK heavy ion beam diagnostic.

    PubMed

    Henriques, R B; Nedzelskiy, I S; Malaquias, A; Fernandes, H

    2012-10-01

    The tokamak ISTTOK heavy ion beam diagnostic (HIBD) operates with a multiple cell array detector (MCAD) that allows simultaneous measurements of the plasma density and the plasma density fluctuations at different sampling volumes across the plasma. To improve the capability for plasma density fluctuation investigations, a new detection system and a new signal conditioning amplifier have been designed and tested. The improvements in the MCAD design are presented, which allow for nearly complete suppression of the spurious plasma background signal by applying a biasing potential to special electrodes incorporated into the MCAD. The new low-cost and small-size transimpedance amplifiers are described, with parameters of 400 kHz bandwidth, 10^7 V/A gain, and 0.4 nA of RMS noise, adequate for the plasma density fluctuation measurements.

  14. Error floor behavior study of LDPC codes for concatenated codes design

    NASA Astrophysics Data System (ADS)

    Chen, Weigang; Yin, Liuguo; Lu, Jianhua

    2007-11-01

    Error floor behavior of low-density parity-check (LDPC) codes using quantized decoding algorithms is statistically studied with experimental results on a hardware evaluation platform. The results present the distribution of the residual errors after decoding failure and reveal that the number of residual error bits in a codeword is usually very small using quantized sum-product (SP) algorithm. Therefore, LDPC code may serve as the inner code in a concatenated coding system with a high code rate outer code and thus an ultra low error floor can be achieved. This conclusion is also verified by the experimental results.

  15. Evidence-based rules from family practice to inform family practice; the learning healthcare system case study on urinary tract infections.

    PubMed

    Soler, Jean K; Corrigan, Derek; Kazienko, Przemyslaw; Kajdanowicz, Tomasz; Danger, Roxana; Kulisiewicz, Marcin; Delaney, Brendan

    2015-05-16

    Analysis of encounter data relevant to the diagnostic process sourced from routine electronic medical record (EMR) databases represents a classic example of the concept of a learning healthcare system (LHS). By collecting International Classification of Primary Care (ICPC) coded EMR data as part of the Transition Project from Dutch and Maltese databases (using the EMR TransHIS), data mining algorithms can empirically quantify the relationships of all presenting reasons for encounter (RfEs) and recorded diagnostic outcomes. We have specifically looked at new episodes of care (EoC) for two urinary system infections: simple urinary tract infection (UTI, ICPC code: U71) and pyelonephritis (ICPC code: U70). Participating family doctors (FDs) recorded details of all their patient contacts in an EoC structure using the ICPC, including RfEs presented by the patient, and the FDs' diagnostic labels. The relationships between RfEs and episode titles were studied using probabilistic and data mining methods as part of the TRANSFoRm project. The Dutch data indicated that the presence of the RfEs "Cystitis/Urinary Tract Infection", "Dysuria", "Fear of UTI", "Urinary frequency/urgency", "Haematuria", "Urine symptom/complaint, other" are all strong, reliable predictors for the diagnosis "Cystitis/Urinary Tract Infection". The Maltese data indicated that the presence of the RfEs "Dysuria", "Urinary frequency/urgency", "Haematuria" are all strong, reliable predictors for the diagnosis "Cystitis/Urinary Tract Infection". The Dutch data indicated that the presence of the RfEs "Flank/axilla symptom/complaint", "Dysuria", "Fever", "Cystitis/Urinary Tract Infection", "Abdominal pain/cramps general" are all strong, reliable predictors for the diagnosis "Pyelonephritis". The Maltese data set did not present any clinically and statistically significant predictors for pyelonephritis. We describe clinically and statistically significant diagnostic associations observed between UTIs and pyelonephritis presenting as a new problem in family practice, and all associated RfEs, and demonstrate that the significant diagnostic cues obtained are consistent with the literature. We conclude that it is possible to generate clinically meaningful diagnostic evidence from electronic sources of patient data.
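
    The core quantity in such an analysis is how much a given RfE raises the probability of an episode title. A minimal pandas sketch is shown below; the encounter rows are mock data, and apart from U70/U71 (taken from the abstract) the ICPC codes used are placeholders.

      import pandas as pd

      # Mock episodes of care: presenting RfE codes and the final episode title.
      episodes = pd.DataFrame({
          "rfe_codes": [["U01", "U02"], ["U02"], ["U06"], ["U01"],
                        ["U02", "U06"], ["A03"]],
          "episode_title": ["U71", "U71", "U70", "U71", "U70", "A77"],
      })

      def predictive_value(df, rfe, diagnosis):
          # P(diagnosis | RfE present) versus P(diagnosis | RfE absent).
          has_rfe = df["rfe_codes"].apply(lambda codes: rfe in codes)
          p_with = (df.loc[has_rfe, "episode_title"] == diagnosis).mean()
          p_without = (df.loc[~has_rfe, "episode_title"] == diagnosis).mean()
          return p_with, p_without

      p_present, p_absent = predictive_value(episodes, rfe="U02", diagnosis="U71")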

  16. Frame-Transfer Gating Raman Spectroscopy for Time-Resolved Multiscalar Combustion Diagnostics

    NASA Technical Reports Server (NTRS)

    Nguyen, Quang-Viet; Fischer, David G.; Kojima, Jun

    2011-01-01

    Accurate experimental measurement of spatially and temporally resolved variations in chemical composition (species concentrations) and temperature in turbulent flames is vital for characterizing the complex phenomena occurring in most practical combustion systems. These diagnostic measurements are called multiscalar because they are capable of acquiring multiple scalar quantities simultaneously. Multiscalar diagnostics also play a critical role in the area of computational code validation. In order to improve the design of combustion devices, computational codes for modeling turbulent combustion are often used to speed up and optimize the development process. The experimental validation of these codes is a critical step in accepting their predictions for engine performance in the absence of cost-prohibitive testing. One of the most critical aspects of setting up a time-resolved stimulated Raman scattering (SRS) diagnostic system is the temporal optical gating scheme. A short optical gate is necessary in order for weak SRS signals to be detected with a good signal-to-noise ratio (SNR) in the presence of strong background optical emissions. This time-synchronized optical gating is a classical problem even to other spectroscopic techniques such as laser-induced fluorescence (LIF) or laser-induced breakdown spectroscopy (LIBS). Traditionally, experimenters have had basically two options for gating: (1) an electronic means of gating using an image intensifier before the charge-coupled-device (CCD), or (2) a mechanical optical shutter (a rotary chopper/mechanical shutter combination). A new diagnostic technology has been developed at the NASA Glenn Research Center that utilizes a frame-transfer CCD sensor, in conjunction with a pulsed laser and multiplex optical fiber collection, to realize time-resolved Raman spectroscopy of turbulent flames that is free from optical background noise (interference). The technology permits not only shorter temporal optical gating (down to <1 s, in principle), but also higher optical throughput, thus resulting in a substantial increase in measurement SNR.

  17. Recent improvements of the JET lithium beam diagnostic

    NASA Astrophysics Data System (ADS)

    Brix, M.; Dodt, D.; Dunai, D.; Lupelli, I.; Marsen, S.; Melson, T. F.; Meszaros, B.; Morgan, P.; Petravich, G.; Refy, D. I.; Silva, C.; Stamp, M.; Szabolics, T.; Zastrow, K.-D.; Zoletnik, S.; JET-EFDA Contributors

    2012-10-01

    A 60 kV neutral lithium diagnostic beam probes the edge plasma of JET for the measurement of electron density profiles. This paper describes recent enhancements of the diagnostic setup, new calibration procedures, and protection measures for the lithium ion gun during massive gas puffs for disruption mitigation. New light-splitting optics allow beam emission measurements in parallel with a new double-entrance-slit CCD spectrometer (spectrally resolved) and a new interference-filter avalanche photodiode camera (fast density and fluctuation studies).

  18. Utilization Trends in Diagnostic Imaging for a Commercially Insured Population: A Study of Massachusetts Residents 2009 to 2013.

    PubMed

    Flaherty, Stephen; Mortele, Koenraad J; Young, Gary J

    2018-06-01

    To report utilization trends in diagnostic imaging among commercially insured Massachusetts residents from 2009 to 2013. Current Procedural Terminology codes were used to identify diagnostic imaging claims in the Massachusetts All-Payer Claims Database for the years 2009 to 2013. We reported utilization and spending annually by imaging modality using total claims, claims per 1,000 individuals, total expenditures, and average per claim payments. The number of diagnostic imaging claims per insured MA resident increased only 0.6% from 2009 to 2013, whereas nonradiology claims increased by 6% annually. Overall diagnostic imaging expenditures, adjusted for inflation, were 27% lower in 2013 than in 2009, compared with an 18% increase in nonimaging expenditures. Average payments per claim were lower in 2013 than 2009 for all modalities except nuclear medicine. Imaging procedure claims per 1,000 MA residents increased from 2009 to 2013 by 13% in MRI, from 147 to 166; by 17% in ultrasound, from 453 to 530; and by 12% in radiography (x-ray), from 985 to 1,100. However, CT claims per 1,000 fell by 37%, from 341 to 213, and nuclear medicine declined 57%, from 89 claims per 1,000 to 38. Diagnostic imaging utilization exhibited negligible growth over the study period. Diagnostic imaging expenditures declined, largely the result of falling payments per claim in most imaging modalities, in contrast with increased utilization and spending on nonimaging services. Utilization of MRI, ultrasound, and x-ray increased from 2009 to 2013, whereas CT and nuclear medicine use decreased sharply, although CT was heavily impacted by billing code changes. Copyright © 2018 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  19. FORTRAN Automated Code Evaluation System (faces) system documentation, version 2, mod 0. [error detection codes/user manuals (computer programs)]

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A system is presented which processes FORTRAN based software systems to surface potential problems before they become execution malfunctions. The system complements the diagnostic capabilities of compilers, loaders, and execution monitors rather than duplicating these functions. Also, it emphasizes frequent sources of FORTRAN problems which require inordinate manual effort to identify. The principal value of the system is the extraction of small sections of unusual code from the bulk of normal sequences. Code structures likely to cause immediate or future problems are brought to the user's attention. These messages stimulate timely corrective action of solid errors and promote identification of 'tricky' code. Corrective action may require recoding or simply extending software documentation to explain the unusual technique.

  20. Modification and benchmarking of MCNP for low-energy tungsten spectra.

    PubMed

    Mercier, J R; Kopp, D T; McDavid, W D; Dove, S B; Lancaster, J L; Tucker, D M

    2000-12-01

    The MCNP Monte Carlo radiation transport code was modified for diagnostic medical physics applications. In particular, the modified code was thoroughly benchmarked for the production of polychromatic tungsten x-ray spectra in the 30-150 kV range. Validating the modified code for coupled electron-photon transport with benchmark spectra was supplemented with independent electron-only and photon-only transport benchmarks. Major revisions to the code included the proper treatment of characteristic K x-ray production and scoring, new impact ionization cross sections, and new bremsstrahlung cross sections. Minor revisions included updated photon cross sections, electron-electron bremsstrahlung production, and K x-ray yield. The modified MCNP code is benchmarked to electron backscatter factors, x-ray spectra production, and primary and scatter photon transport.

  1. Constructing a Pre-Emptive System Based on a Multidimensional Matrix and Autocompletion to Improve Diagnostic Coding in Acute Care Hospitals.

    PubMed

    Noussa-Yao, Joseph; Heudes, Didier; Escudie, Jean-Baptiste; Degoulet, Patrice

    2016-01-01

    Short-stay MSO (Medicine, Surgery, Obstetrics) hospitalization activities in public and private hospitals providing public services are funded through charges for the services provided (T2A in French). Coding must be well matched to the severity of the patient's condition, to ensure that appropriate funding is provided to the hospital. We propose the use of an autocompletion process and a multidimensional matrix to help physicians improve the expression of information and optimize clinical coding. With this approach, physicians without knowledge of the encoding rules begin from a rough concept, which is gradually refined through semantic proximity and information on associated codes drawn from optimized knowledge bases of diagnosis codes.
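
    A minimal sketch of the autocompletion idea, assuming a tiny hand-made ICD-10 label dictionary and a crude token-overlap score as a stand-in for the paper's semantic proximity and multidimensional matrix; none of the codes, labels, or scores below come from the actual system.

```python
# Minimal sketch of diagnosis-code autocompletion: a rough free-text concept is
# matched against code labels by token overlap and prefix hits, then refined as
# the physician types.  The tiny ICD-10 dictionary and scoring are illustrative.

ICD10 = {
    "E11.9": "type 2 diabetes mellitus without complications",
    "E10.9": "type 1 diabetes mellitus without complications",
    "I10":   "essential (primary) hypertension",
    "N39.0": "urinary tract infection, site not specified",
    "J18.9": "pneumonia, unspecified organism",
}

def suggest(query, dictionary=ICD10, k=3):
    """Rank codes by a crude 'semantic proximity': shared tokens plus prefix hits."""
    q_tokens = set(query.lower().split())
    scored = []
    for code, label in dictionary.items():
        l_tokens = set(label.lower().split())
        overlap = len(q_tokens & l_tokens)
        prefix = sum(1 for q in q_tokens for t in l_tokens if t.startswith(q))
        score = 2 * overlap + prefix
        if score:
            scored.append((score, code, label))
    return sorted(scored, reverse=True)[:k]

for hit in suggest("diab type 2"):
    print(hit)   # best match first: E11.9, then E10.9
```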

  2. Temperature Measurements in Compressed and Uncompressed SPECTOR Plasmas at General Fusion

    NASA Astrophysics Data System (ADS)

    Young, William; Carter, Neil; Howard, Stephen; Carle, Patrick; O'Shea, Peter; Fusion Team, General

    2017-10-01

    Accurate temperature measurements are critical to establishing the behavior of General Fusion's SPECTOR plasma injector, both before and during compression. As compression tests impose additional constraints on diagnostic access to the plasma, a two-color, filter-based soft x-ray electron temperature diagnostic has been implemented. Ion Doppler spectroscopy measurements also provide impurity ion temperatures on compression tests. The soft x-ray and ion Doppler spectroscopy measurements are being validated against a Thomson scattering system on an uncompressed version of SPECTOR with more diagnostic access. The multipoint Thomson scattering diagnostic also provides up to a six-point temperature and density profile, with the density measurements validated against a far infrared interferometer. Temperatures above 300 eV have been demonstrated to be sustained for over 500 microseconds in uncompressed plasmas. Optimization of soft x-ray filters is ongoing, in order to balance blocking of impurity line radiation with signal strength.

  3. UCLA-LANL Reanalysis Project

    NASA Astrophysics Data System (ADS)

    Shprits, Y.; Chen, Y.; Friedel, R.; Kondrashov, D.; Ni, B.; Subbotin, D.; Reeves, G.; Ghil, M.

    2009-04-01

    We present first results of the UCLA-LANL Reanalysis Project. Radiation belt relativistic electron phase space density is obtained using the data-assimilative VERB code combined with observations from GEO, CRRES, and Akebono. Reanalysis of the data shows pronounced peaks in the phase space density and pronounced dropouts of fluxes during the main phase of a storm. The results of the reanalysis are discussed and compared to simulations with the recently developed VERB 3D code.
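
    The reanalysis blends model output with observations through data assimilation. A scalar Kalman update illustrates the elementary step; the numbers, error variances, and identity observation operator are assumptions of this sketch, not values from the VERB-based reanalysis.

```python
# Illustrative scalar Kalman update: blend a model forecast of log10 phase space
# density with a satellite observation, weighted by their error variances.
# All numbers and the identity observation operator are assumptions of this sketch.

def kalman_update(x_forecast, p_forecast, y_obs, r_obs):
    """Return analysis state and variance for a scalar state with H = 1."""
    k_gain = p_forecast / (p_forecast + r_obs)            # Kalman gain
    x_analysis = x_forecast + k_gain * (y_obs - x_forecast)
    p_analysis = (1.0 - k_gain) * p_forecast
    return x_analysis, p_analysis

# Example: model predicts log10(PSD) = -6.0 with variance 0.5; a GEO observation
# gives -5.4 with variance 0.2 -> the analysis is pulled toward the observation.
print(kalman_update(-6.0, 0.5, -5.4, 0.2))
```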

  4. Modeling Laser-Driven Laboratory Astrophysics Experiments Using the CRASH Code

    NASA Astrophysics Data System (ADS)

    Grosskopf, Michael; Keiter, P.; Kuranz, C. C.; Malamud, G.; Trantham, M.; Drake, R.

    2013-06-01

    Laser-driven, laboratory astrophysics experiments can provide important insight into the physical processes relevant to astrophysical systems. The radiation hydrodynamics code developed by the Center for Radiative Shock Hydrodynamics (CRASH) at the University of Michigan has been used to model experimental designs for high-energy-density laboratory astrophysics campaigns on OMEGA and other high-energy laser facilities. This code is an Eulerian, block-adaptive AMR hydrodynamics code with implicit multigroup radiation transport and electron heat conduction. The CRASH model has been used on many applications including: radiative shocks, Kelvin-Helmholtz and Rayleigh-Taylor experiments on the OMEGA laser; as well as laser-driven ablative plumes in experiments by the Astrophysical Collisionless Shocks Experiments with Lasers (ACSEL) collaboration. We report a series of results with the CRASH code in support of design work for upcoming high-energy-density physics experiments, as well as comparison between existing experimental data and simulation results. This work is funded by the Predictive Sciences Academic Alliances Program in NNSA-ASC via grant DEFC52- 08NA28616, by the NNSA-DS and SC-OFES Joint Program in High-Energy-Density Laboratory Plasmas, grant number DE-FG52-09NA29548, and by the National Laser User Facility Program, grant number DE-NA0000850.

  5. Mapping of hydrothermally altered rocks using airborne multispectral scanner data, Marysvale, Utah, mining district

    USGS Publications Warehouse

    Podwysocki, M.H.; Segal, D.B.; Jones, O.D.

    1983-01-01

    Multispectral data covering an area near Marysvale, Utah, collected with the airborne National Aeronautics and Space Administration (NASA) 24-channel Bendix multispectral scanner, were analyzed to detect areas of hydrothermally altered, potentially mineralized rocks. Spectral bands were selected for analysis that approximate those of the Landsat 4 Thematic Mapper and which are diagnostic of the presence of hydrothermally derived products. Hydrothermally altered rocks, particularly volcanic rocks affected by solutions rich in sulfuric acid, are commonly characterized by concentrations of argillic minerals such as alunite and kaolinite. These minerals are important for identifying hydrothermally altered rocks in multispectral images because they have intense absorption bands centered near a wavelength of 2.2 μm. Unaltered volcanic rocks commonly do not contain these minerals and hence do not have the absorption bands. A color-composite image was constructed using the following spectral band ratios: 1.6 μm/2.2 μm, 1.6 μm/0.48 μm, and 0.67 μm/1.0 μm. The particular bands were chosen to emphasize the spectral contrasts that exist for argillic versus non-argillic rocks, limonitic versus nonlimonitic rocks, and rocks versus vegetation, respectively. The color-ratio composite successfully distinguished most types of altered rocks from unaltered rocks. Some previously unrecognized areas of hydrothermal alteration were mapped. The altered rocks included those having high alunite and/or kaolinite content, siliceous rocks containing some kaolinite, and ash-fall tuffs containing zeolitic minerals. The color-ratio-composite image allowed further division of these rocks into limonitic and nonlimonitic phases. The image did not allow separation of highly siliceous or hematitically altered rocks containing no clays or alunite from unaltered rocks. A color-coded density slice image of the 1.6 μm/2.2 μm band ratio allowed further discrimination among the altered units. Areas containing zeolites and some ash-fall tuffs containing montmorillonite were readily recognized on the color-coded density slice as having less intense 2.2-μm absorption than areas of highly altered rocks. The areas of most intense absorption, as depicted in the color-coded density slice, are dominated by highly altered rocks containing large amounts of alunite and kaolinite. These areas form an annulus, approximately 10 km in diameter, which surrounds a quartz monzonite intrusive body of Miocene age. The patterns of most intense alteration are interpreted as the remnants of paleohydrothermal convective cells set into motion during the emplacement of the central intrusive body. © 1983.
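
    The band-ratio compositing described above can be sketched in a few lines, assuming calibrated band images are already available as arrays; the random arrays and percentile stretch below are stand-ins for the actual scanner data and processing.

```python
# Sketch of the band-ratio colour composite: ratio images highlighting argillic
# minerals (1.6/2.2 um), limonite (1.6/0.48 um) and vegetation (0.67/1.0 um)
# are stretched and stacked into an RGB image.  Band arrays here are random
# stand-ins for calibrated scanner radiances.
import numpy as np

rng = np.random.default_rng(0)
bands = {w: rng.uniform(0.05, 1.0, size=(256, 256)) for w in (0.48, 0.67, 1.0, 1.6, 2.2)}

def ratio(num, den, eps=1e-6):
    return bands[num] / (bands[den] + eps)

def stretch(img):
    """Linear 2-98 percentile stretch to 0-255."""
    lo, hi = np.percentile(img, (2, 98))
    return np.clip((img - lo) / (hi - lo) * 255.0, 0, 255).astype(np.uint8)

composite = np.dstack([stretch(ratio(1.6, 2.2)),    # red: argillic alteration
                       stretch(ratio(1.6, 0.48)),   # green: limonitic rocks
                       stretch(ratio(0.67, 1.0))])  # blue: rocks vs vegetation
print(composite.shape, composite.dtype)             # (256, 256, 3) uint8
```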

  6. Accumulate-Repeat-Accumulate-Accumulate Codes

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush; Dolinar, Samuel; Thorpe, Jeremy

    2007-01-01

    Accumulate-repeat-accumulate-accumulate (ARAA) codes have been proposed, inspired by the recently proposed accumulate-repeat-accumulate (ARA) codes. These are error-correcting codes suitable for use in a variety of wireless data-communication systems that include noisy channels. ARAA codes can be regarded as serial turbo-like codes or as a subclass of low-density parity-check (LDPC) codes, and, like ARA codes, they have projected graph or protograph representations; these characteristics make it possible to design high-speed iterative decoders that utilize belief-propagation algorithms. The objective in proposing ARAA codes as a subclass of ARA codes was to enhance the error-floor performance of ARA codes while maintaining simple encoding structures and low maximum variable node degree.

  7. Synthetic observations of molecular clouds in a galactic centre environment - I. Studying maps of column density and integrated intensity

    NASA Astrophysics Data System (ADS)

    Bertram, Erik; Glover, Simon C. O.; Clark, Paul C.; Ragan, Sarah E.; Klessen, Ralf S.

    2016-02-01

    We run numerical simulations of molecular clouds, adopting properties similar to those found in the central molecular zone (CMZ) of the Milky Way. For this, we employ the moving mesh code AREPO and perform simulations which account for a simplified treatment of time-dependent chemistry and the non-isothermal nature of gas and dust. We perform simulations using an initial density of n0 = 10^3 cm^-3 and a mass of 1.3 × 10^5 M⊙. Furthermore, we vary the virial parameter, defined as the ratio of kinetic and potential energy, α = E_kin/|E_pot|, by adjusting the velocity dispersion. We set it to α = 0.5, 2.0 and 8.0, in order to analyse the impact of the kinetic energy on our results. We account for the extreme conditions in the CMZ and increase both the interstellar radiation field (ISRF) and the cosmic ray flux (CRF) by a factor of 1000 compared to the values found in the solar neighbourhood. We use the radiative transfer code RADMC-3D to compute synthetic images in various diagnostic lines. These are [C II] at 158 μm, [O I] (145 μm), [O I] (63 μm), 12CO (J = 1 → 0) and 13CO (J = 1 → 0) at 2600 and 2720 μm, respectively. When α is large, the turbulence disperses much of the gas in the cloud, reducing its mean density and allowing the ISRF to penetrate more deeply into the cloud's interior. This significantly alters the chemical composition of the cloud, leading to the dissociation of a significant amount of the molecular gas. On the other hand, when α is small, the cloud remains compact, allowing more of the molecular gas to survive. We show that in each case the atomic tracers accurately reflect most of the physical properties of both the H2 and the total gas of the cloud and that they provide a useful alternative to molecular lines when studying the interstellar medium in the CMZ.
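
    A rough bookkeeping of the virial parameter under stated assumptions (the common uniform-sphere form α ≈ 5σ²R/(GM), an assumed mean molecular weight, and a radius derived from the quoted mass and density) shows roughly what velocity dispersions the three α values correspond to; the simulation setup itself is not reproduced here.

```python
# Rough bookkeeping for the virial parameter alpha = Ekin/|Epot| of the cloud.
# The uniform-sphere form alpha ~= 5 sigma^2 R / (G M), the mean molecular
# weight and the derived radius are assumptions of this sketch.
import numpy as np

G_PC = 4.301e-3          # G in pc (km/s)^2 / Msun
M_SUN_G = 1.989e33       # solar mass [g]
PC_CM = 3.086e18         # parsec [cm]
MU, M_H = 2.35, 1.67e-24 # assumed mean molecular weight, hydrogen mass [g]

def cloud_radius_pc(mass_msun, n0_cm3):
    """Radius of a uniform sphere with the quoted mass and number density."""
    rho = n0_cm3 * MU * M_H                       # g cm^-3
    volume = mass_msun * M_SUN_G / rho            # cm^3
    return (3.0 * volume / (4.0 * np.pi)) ** (1.0 / 3.0) / PC_CM

def sigma_for_alpha(alpha, mass_msun, radius_pc):
    """Velocity dispersion (km/s) giving the requested virial parameter."""
    return np.sqrt(alpha * G_PC * mass_msun / (5.0 * radius_pc))

M, n0 = 1.3e5, 1.0e3
R = cloud_radius_pc(M, n0)
for alpha in (0.5, 2.0, 8.0):
    print(f"alpha = {alpha:3.1f}: R ~ {R:.1f} pc, sigma ~ {sigma_for_alpha(alpha, M, R):.1f} km/s")
```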

  8. Economic incentives and diagnostic coding in a public health care system.

    PubMed

    Anthun, Kjartan Sarheim; Bjørngaard, Johan Håkon; Magnussen, Jon

    2017-03-01

    We analysed the association between economic incentives and diagnostic coding practice in the Norwegian public health care system. Data included 3,180,578 hospital discharges in Norway covering the period 1999-2008. For reimbursement purposes, all discharges are grouped in diagnosis-related groups (DRGs). We examined pairs of DRGs where the addition of one or more specific diagnoses places the patient in a complicated rather than an uncomplicated group, yielding higher reimbursement. The economic incentive was measured as the potential gain in income by coding a patient as complicated, and we analysed the association between this gain and the share of complicated discharges within the DRG pairs. Using multilevel linear regression modelling, we estimated both differences between hospitals for each DRG pair and changes within hospitals for each DRG pair over time. Over the whole period, a one-DRG-point difference in price was associated with an increased share of complicated discharges of 14.2 (95 % confidence interval [CI] 11.2-17.2) percentage points. However, a one-DRG-point change in prices between years was only associated with a 0.4 (95 % CI [Formula: see text] to 1.8) percentage point change of discharges into the most complicated diagnostic category. Although there was a strong increase in complicated discharges over time, this was not as closely related to price changes as expected.
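
    A hedged sketch of the kind of multilevel (random-intercept) model the study describes, using statsmodels on simulated data; the column names, simulated effect sizes, and the simple specification are placeholders for the study's actual model.

```python
# Sketch of a multilevel (random-intercept) model relating the share of
# complicated discharges in a DRG pair to the potential price gain, with
# hospitals as the grouping factor.  Data and column names are placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = []
for h in range(20):                                        # 20 hypothetical hospitals
    hosp_effect = rng.normal(0, 0.05)                      # hospital-level intercept shift
    for p in range(15):                                    # 15 hypothetical DRG pairs
        gain = rng.uniform(0.1, 2.0)                       # DRG-point price difference
        share = 0.30 + 0.14 * gain + hosp_effect + rng.normal(0, 0.05)
        rows.append({"hospital": h, "drg_pair": p,
                     "price_gain": gain, "share_complicated": share})
df = pd.DataFrame(rows)

model = smf.mixedlm("share_complicated ~ price_gain", df, groups=df["hospital"])
result = model.fit()
print(result.params["price_gain"])   # slope ~ change in complicated share per DRG point
```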

  9. Accuracy and Completeness of Clinical Coding Using ICD-10 for Ambulatory Visits

    PubMed Central

    Horsky, Jan; Drucker, Elizabeth A.; Ramelson, Harley Z.

    2017-01-01

    This study describes a simulation of diagnostic coding using an EHR. Twenty-three ambulatory clinicians were asked to enter appropriate codes for six standardized scenarios with two different EHRs. Their interactions with the query interface were analyzed for patterns and variations in search strategies, and the resulting sets of entered codes were assessed for accuracy and completeness. Just over half of the entered codes were appropriate for a given scenario and about a quarter were omitted. Crohn's disease and diabetes scenarios had the highest rates of inappropriate coding and code variation. The omission rate was higher for secondary than for primary visit diagnoses. Codes for immunization, dialysis dependence and nicotine dependence were the most often omitted. We also found a high rate of variation in the search terms used to query the EHR for the same diagnoses. Changes to the training of clinicians and improved design of EHR query modules may lower the rate of inappropriate and omitted codes. PMID:29854158

  10. High Order Modulation Protograph Codes

    NASA Technical Reports Server (NTRS)

    Nguyen, Thuy V. (Inventor); Nosratinia, Aria (Inventor); Divsalar, Dariush (Inventor)

    2014-01-01

    Digital communication coding methods are presented for designing protograph-based bit-interleaved coded modulation that is general and applies to any modulation. The general coding framework can support not only multiple rates but also adaptive modulation. The method is a two-stage lifting approach. In the first stage, an original protograph is lifted to a slightly larger intermediate protograph. The intermediate protograph is then lifted via a circulant matrix to the expected codeword length to form a protograph-based low-density parity-check code.
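
    The second, circulant lifting stage can be sketched as the familiar quasi-cyclic LDPC expansion: each base-matrix entry becomes a Z × Z all-zero block or a cyclically shifted identity. The base matrix and shift values below are arbitrary illustrations, not the construction claimed in the patent.

```python
# Sketch of circulant lifting: each entry of a small base (protograph) matrix is
# expanded into a Z x Z block -- an all-zero block or a cyclically shifted
# identity -- forming a quasi-cyclic LDPC parity-check matrix.
import numpy as np

def lift(base_shifts, Z):
    """base_shifts[i][j] = -1 for a zero block, otherwise the circulant shift."""
    rows = []
    for row in base_shifts:
        blocks = []
        for s in row:
            blocks.append(np.zeros((Z, Z), dtype=int) if s < 0
                          else np.roll(np.eye(Z, dtype=int), s, axis=1))
        rows.append(np.hstack(blocks))
    return np.vstack(rows)

base = [[0, 1, -1, 2],     # illustrative shift values only
        [-1, 0, 3, 1]]
H = lift(base, Z=4)
print(H.shape)             # (8, 16) parity-check matrix
print(H.sum(axis=0))       # column weights follow the protograph degrees
```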

  11. Thomson scattering diagnostic on the Compact Toroidal Hybrid Experiment

    NASA Astrophysics Data System (ADS)

    Traverso, Peter; Maurer, D. A.; Ennis, D. A.; Hartwell, G. J.

    2016-10-01

    A Thomson scattering system is being commissioned for the non-axisymmetric plasmas of the Compact Toroidal Hybrid (CTH), a five-field period current-carrying torsatron. The system takes a single point measurement at the magnetic axis to both calibrate the two-color soft x-ray Te system and serve as an additional diagnostic for the V3FIT 3D equilibrium reconstruction code. A single point measurement will reduce the uncertainty in the reconstructed peak pressure by an order of magnitude for both current-carrying plasmas and future gyrotron-heated stellarator plasmas. The beam, generated by a frequency doubled Continuum 2 J, Nd:YAG laser, is passed vertically through an entrance Brewster window and a two-aperture optical baffle system to minimize stray light. The beam line propagates 8 m to the CTH device mid-plane with the beam diameter < 3 mm inside the plasma volume. Thomson scattered light is collected by two adjacent f/2 plano-convex condenser lenses and focused onto a custom fiber bundle. The fiber is then re-bundled and routed to a Holospec f/1.8 spectrograph to collect the red-shifted scattered light from 535-565 nm. The system has been designed to measure plasmas with core Te of 100 to 200 eV and densities of 5 × 10^18 to 5 × 10^19 m^-3. Work supported by USDOE Grant DE-FG02-00ER54610.

  12. Thomson scattering diagnostic on the Compact Toroidal Hybrid Experiment

    NASA Astrophysics Data System (ADS)

    Traverso, P. J.; Ennis, D. A.; Hartwell, G. J.; Kring, J. D.; Maurer, D. A.

    2017-10-01

    A Thomson scattering system is being commissioned for the non-axisymmetric plasmas of the Compact Toroidal Hybrid (CTH), a five-field period current-carrying torsatron. The system takes a single point measurement at the magnetic axis to both calibrate the two-color soft x-ray Te system and serve as an additional diagnostic for the V3FIT 3D equilibrium reconstruction code. A single point measurement will reduce the uncertainty in the reconstructed peak pressure by an order of magnitude for both current-carrying plasmas and future gyrotron-heated stellarator plasmas. The beam, generated by a frequency doubled Continuum 2 J, Nd:YAG laser, is passed vertically through an entrance Brewster window and a two-aperture optical baffle system to minimize stray light. Thomson scattered light is collected by two adjacent f/2 plano-convex condenser lenses and routed via a fiber bundle through a Holospec f/1.8 spectrograph. The red-shifted scattered light from 533-563 nm will be collected by an array of Hamamatsu H11706-40 PMTs. The system has been designed to measure plasmas with core Te of 100 to 200 eV and densities of 5 × 10^18 to 5 × 10^19 m^-3. Stray light and calibration data for a single wavelength channel will be presented. This work is supported by U.S. Department of Energy Grant No. DE-FG02-00ER54610.

  13. Determining the forsterite abundance of the dust around asymptotic giant branch stars

    NASA Astrophysics Data System (ADS)

    de Vries, B. L.; Min, M.; Waters, L. B. F. M.; Blommaert, J. A. D. L.; Kemper, F.

    2010-06-01

    Aims: We present a diagnostic tool to determine the abundance of the crystalline silicate forsterite in AGB stars surrounded by a thick shell of silicate dust. Using six infrared spectra of high mass-loss oxygen rich AGB stars we obtain the forsterite abundance of their dust shells. Methods: We use a Monte Carlo radiative transfer code to calculate infrared spectra of dust enshrouded AGB stars. We vary the dust composition, mass-loss rate and outer radius. We focus on the strength of the 11.3 and the 33.6 μm forsterite bands, which probe the most recent (11.3 μm) and older (33.6 μm) mass-loss history of the star. Simple diagnostic diagrams are derived, allowing direct comparison to observed band strengths. Results: Our analysis shows that the 11.3 μm forsterite band is a robust indicator of the forsterite abundance of the current mass-loss period for AGB stars with an optically thick dust shell. The 33.6 μm band of forsterite is sensitive to changes in the density and the geometry of the emitting dust shell, and is therefore a less robust indicator. Applying our method to six high mass-loss rate AGB stars shows that AGB stars can have forsterite abundances of 12% by mass and higher, which is more than the previously found maximum abundance of 5%.

  14. High-Pressure Gaseous Burner (HPGB) Facility Completed for Quantitative Laser Diagnostics Calibration

    NASA Technical Reports Server (NTRS)

    Nguyen, Quang-Viet

    2002-01-01

    A gas-fueled high-pressure combustion facility with optical access, which was developed over the last 2 years, has just been completed. The High Pressure Gaseous Burner (HPGB) rig at the NASA Glenn Research Center can operate at sustained pressures up to 60 atm with a variety of gaseous fuels and liquid jet fuel. The facility is unique as it is the only continuous-flow, hydrogen-capable, 60-atm rig in the world with optical access. It will provide researchers with new insights into flame conditions that simulate the environment inside the ultra-high-pressure-ratio combustion chambers of tomorrow's advanced aircraft engines. The facility provides optical access to the flame zone, enabling the calibration of nonintrusive optical diagnostics to measure chemical species and temperature. The data from the HPGB rig enables the validation of numerical codes that simulate gas turbine combustors, such as the National Combustor Code (NCC). The validation of such numerical codes is often best achieved with nonintrusive optical diagnostic techniques that meet these goals: information-rich (multispecies) and quantitative while providing good spatial and time resolution. Achieving these goals is a challenge for most nonintrusive optical diagnostic techniques. Raman scattering is a technique that meets these challenges. Raman scattering occurs when intense laser light interacts with molecules to radiate light at a shifted wavelength (known as the Raman shift). This shift in wavelength is unique to each chemical species and provides a "fingerprint" of the different species present. The facility will first be used to gather a comprehensive data base of laser Raman spectra at high pressures. These calibration data will then be used to quantify future laser Raman measurements of chemical species concentration and temperature in this facility and other facilities that use Raman scattering.

  15. Coding of Barrett's oesophagus with high-grade dysplasia in national administrative databases: a population-based cohort study.

    PubMed

    Chadwick, Georgina; Varagunam, Mira; Brand, Christian; Riley, Stuart A; Maynard, Nick; Crosby, Tom; Michalowski, Julie; Cromwell, David A

    2017-06-09

    The International Classification of Diseases 10th Revision (ICD-10) system used in the English hospital administrative database (Hospital Episode Statistics (HES)) does not contain a specific code for oesophageal high-grade dysplasia (HGD). The aim of this paper was to examine how patients with HGD were coded in HES and whether it was done consistently. National population-based cohort study of patients with newly diagnosed with HGD in England. The study used data collected prospectively as part of the National Oesophago-Gastric Cancer Audit (NOGCA). These records were linked to HES to investigate the pattern of ICD-10 codes recorded for these patients at the time of diagnosis. All patients with a new diagnosis of HGD between 1 April 2013 and 31 March 2014 in England, who had data submitted to the NOGCA. The main outcome assessed was the pattern of primary and secondary ICD-10 diagnostic codes recorded in the HES records at endoscopy at the time of diagnosis of HGD. Among 452 patients with a new diagnosis of HGD between 1 April 2013 and 31 March 2014, Barrett's oesophagus was the only condition coded in 200 (44.2%) HES records. Records for 59 patients (13.1%) contained no oesophageal conditions. The remaining 193 patients had various diagnostic codes recorded, 93 included a diagnosis of Barrett's oesophagus and 57 included a diagnosis of oesophageal/gastric cardia cancer. HES is not suitable to support national studies looking at the management of HGD. This is one reason for the UK to adopt an extended ICD system (akin to ICD-10-CM). © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  16. A recoverable gas-cell diagnostic for the National Ignition Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ratkiewicz, A., E-mail: ratkiewicz1@llnl.gov; Berzak Hopkins, L.; Bleuel, D. L.

    2016-11-15

    The high-fluence neutron spectrum produced by the National Ignition Facility (NIF) provides an opportunity to measure the activation of materials by fast-spectrum neutrons. A new large-volume gas-cell diagnostic has been designed and qualified to measure the activation of gaseous substances at the NIF. This in-chamber diagnostic is recoverable, reusable and has been successfully fielded. Data from the qualification of the diagnostic have been used to benchmark a Monte Carlo N-Particle Transport Code simulation describing the downscattered neutron spectrum seen by the gas cell. We present early results from the use of this diagnostic to measure the activation of natXe and discuss future work to study the strength of interactions between plasma and nuclei.

  17. A recoverable gas-cell diagnostic for the National Ignition Facility

    DOE PAGES

    Ratkiewicz, A.; Hopkins, L. Berzak; Bleuel, D. L.; ...

    2016-08-22

    Here, the high-fluence neutron spectrum produced by the National Ignition Facility (NIF) provides an opportunity to measure the activation of materials by fast-spectrum neutrons. A new large-volume gas-cell diagnostic has been designed and qualified to measure the activation of gaseous substances at the NIF. This in-chamber diagnostic is recoverable, reusable and has been successfully fielded. Data from the qualification of the diagnostic have been used to benchmark a Monte Carlo N-Particle Transport Code simulation describing the downscattered neutron spectrum seen by the gas cell. We present early results from the use of this diagnostic to measure the activation of natXe and discuss future work to study the strength of interactions between plasma and nuclei.

  18. Progress on the Development of Low Pressure High Density Plasmas on the Helicon Plasma Experiment (HPX)

    NASA Astrophysics Data System (ADS)

    James, R. W.; Chamberlin, A.; Azzari, P.; Crilly, P.; Emami, T.; Hopson, J.; Karama, J.; Green, A.; Paolino, R. N.; Sandri, E.; Turk, J.; Wicke, M.; Cgapl Team

    2017-10-01

    The small Helicon Plasma Experiment (HPX) at the Coast Guard Academy Plasma Lab (CGAPL) continues to progress toward utilizing the reputed high densities (10^13 cm^-3 and higher) at low pressure (0.01 Torr) [1] of helicons, for eventual high temperature and density diagnostic development in future laboratory investigations. HPX is designed to create repeatedly stable plasmas (20-30 ns) induced by an RF frequency in the 10 to 70 MHz range. HPX has constructed a protected Langmuir probe where raw data will be collected, compared to the RF compensated probe, and used to measure the plasma's density, temperature, and behavior during experiments. Our 2.5 J YAG laser Thomson scattering system, backed by a 32-channel data acquisition (DAQ) system capable of 12-bit sampling precision at 2 MS/s for HPX plasma property investigations, is being integrated into the existing diagnostics and control architecture. Progress on the construction of the RF coupling system, helicon mode development, and magnetic coils, along with observations from the Thomson scattering, particle, and electromagnetic scattering diagnostics, will be reported. Supported by U.S. DEPS Grant [HEL-JTO] PRWJFY17.

  19. Plasma kinetic effects on atomistic mix in one dimension and at structured interfaces (II)

    NASA Astrophysics Data System (ADS)

    Albright, Brian; Yin, Lin; Cooley, James; Haack, Jeffrey; Douglas, Melissa

    2017-10-01

    The Marble campaign seeks to develop a platform for studying mix evolution in turbulent, inhomogeneous, high-energy-density plasmas at the NIF. Marble capsules contain engineered CD foams, the pores of which are filled with hydrogen and tritium. During implosion, hydrodynamic stirring and plasma diffusivity mix tritium fuel into the surrounding CD plasma, leading to both DD and DT fusion neutron production. In this presentation, building upon prior work, kinetic particle-in-cell simulations using the VPIC code are used to examine kinetic effects on thermonuclear burn in Marble-like settings. Departures from Maxwellian distributions are observed near the interface and TN burn rates and inferred temperatures from synthetic neutron time of flight diagnostics are compared with those from treating the background species as Maxwellian. Work performed under the auspices of the U.S. DOE by the Los Alamos National Security, LLC Los Alamos National Laboratory and supported by the ASC and Science programs.

  20. Observation of quasi-coherent edge fluctuations in Ohmic plasmas on National Spherical Torus Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Banerjee, Santanu; Diallo, A.; Zweben, S. J.

    A quasi-coherent edge density mode with frequency f_mode ∼ 40 kHz is observed in Ohmic plasmas in the National Spherical Torus Experiment using the gas puff imaging diagnostic. This mode is located predominantly just inside the separatrix, with a maximum fluctuation amplitude significantly higher than that of the broadband turbulence in the same frequency range. The quasi-coherent mode has a poloidal wavelength λ_pol ∼ 16 cm and a poloidal phase velocity of V_pol ∼ 4.9 ± 0.3 km s^-1 in the electron diamagnetic direction, which are similar to the characteristics expected from a linear drift-wave-like mode in the edge. This is the first observation of a quasi-coherent edge mode in an Ohmic diverted tokamak, and so may be useful for validating tokamak edge turbulence codes.

  1. The STPX Spheromak System: Recent Measurements and Observations

    NASA Astrophysics Data System (ADS)

    Williams, R. L.; Clark, J.; Richardson, M.; Williams, R. E.

    2016-10-01

    We present results of recent measurements made to characterize the plasma formed in the STPX* Spheromak plasma device installed at Florida A&M University. The toroidal plasma is formed using a pulsed cylindrical gun discharge and, when fully operational, is designed to approach a density of 10^21 m^-3 and electron temperatures in the range of 100-350 eV. The diagnostic devices used for these recent measurements include Langmuir probes, electrostatic triple probes, optical spectrometers, CCD detectors, laser probes and magnetic field coils. These probes have been tested using both static and pulsed discharges created in the device, and we report the latest measurements. The voltage and current profiles of the pulsed discharge as well as the pulsed magnetic field coils are discussed. Progress in modeling this spheromak using NIMROD and other simulation codes will be discussed. Our recent results of an ongoing study of the topology of magnetic helicity are presented in a separate poster. *Spheromak Turbulent Physics Experiment.

  2. Defining hip fracture with claims data: outpatient and provider claims matter.

    PubMed

    Berry, S D; Zullo, A R; McConeghy, K; Lee, Y; Daiello, L; Kiel, D P

    2017-07-01

    Medicare claims are commonly used to identify hip fractures, but there is no universally accepted definition. We found that a definition using inpatient claims identified fewer fractures than a definition including outpatient and provider claims. Few additional fractures were identified by including inconsistent diagnostic and procedural codes at contiguous sites. Medicare claims data is commonly used in research studies to identify hip fractures, but there is no universally accepted definition of fracture. Our purpose was to describe potential misclassification when hip fractures are defined using Medicare Part A (inpatient) claims without considering Part B (outpatient and provider) claims and when inconsistent diagnostic and procedural codes occur at contiguous fracture sites (e.g., femoral shaft or pelvic). Participants included all long-stay nursing home residents enrolled in Medicare Parts A and B fee-for-service between 1/1/2008 and 12/31/2009 with follow-up through 12/31/2011. We compared the number of hip fractures identified using only Part A claims to (1) Part A plus Part B claims and (2) Part A and Part B claims plus discordant codes at contiguous fracture sites. Among 1,257,279 long-stay residents, 40,932 (3.2%) met the definition of hip fracture using Part A claims, and 41,687 residents (3.3%) met the definition using Part B claims. 4566 hip fractures identified using Part B claims would not have been captured using Part A claims. An additional 227 hip fractures were identified after considering contiguous fracture sites. When ascertaining hip fractures, a definition using outpatient and provider claims identified 11% more fractures than a definition with only inpatient claims. Future studies should publish their definition of fracture and specify if diagnostic codes from contiguous fracture sites were used.

  3. A Novel Method for Estimating Transgender Status Using Electronic Medical Records

    PubMed Central

    Roblin, Douglas; Barzilay, Joshua; Tolsma, Dennis; Robinson, Brandi; Schild, Laura; Cromwell, Lee; Braun, Hayley; Nash, Rebecca; Gerth, Joseph; Hunkeler, Enid; Quinn, Virginia P.; Tangpricha, Vin; Goodman, Michael

    2016-01-01

    Background We describe a novel algorithm for identifying transgender people and determining their male-to-female (MTF) or female-to-male (FTM) identity in electronic medical records (EMR) of an integrated health system. Methods A SAS program scanned Kaiser Permanente Georgia EMR from January 2006 through December 2014 for relevant diagnostic codes, and presence of specific keywords (e.g., “transgender” or “transsexual”) in clinical notes. Eligibility was verified by review of de-identified text strings containing targeted keywords, and if needed, by an additional in-depth review of records. Once transgender status was confirmed, FTM or MTF identity was assessed using a second SAS program and another round of text string reviews. Results Of 813,737 members, 271 were identified as possibly transgender: 137 through keywords only, 25 through diagnostic codes only, and 109 through both codes and keywords. Of these individuals, 185 (68%, 95% confidence interval [CI]: 62-74%) were confirmed as definitely transgender. The proportions (95% CIs) of definite transgender status among persons identified via keywords, diagnostic codes, and both were 45% (37-54%), 56% (35-75%), and 100% (96-100%), respectively. Of the 185 definitely transgender people, 99 (54%, 95% CI: 46-61%) were MTF, 84 (45%, 95% CI: 38-53%) were FTM. For two persons, gender identity remained unknown. Prevalence of transgender people (per 100,000 members) was 4.4 (95% CI: 2.6-7.4) in 2006 and 38.7 (95% CI: 32.4-46.2) in 2014. Conclusions The proposed method of identifying candidates for transgender health studies is low cost and relatively efficient. It can be applied in other similar health care systems. PMID:26907539
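
    A simplified sketch of the case-finding step: flag members whose records carry either a targeted diagnostic code or a keyword match in free-text notes, and record how each candidate was found. The code list, keyword pattern, and record fields are hypothetical; the study used SAS programs followed by manual review of de-identified text strings.

```python
# Simplified case-finding sketch: flag records containing either a targeted
# diagnostic code or a keyword in free-text notes, and note how each candidate
# was identified (codes only, keywords only, or both).  Codes, keywords and
# record fields are hypothetical.
import re

TARGET_CODES = {"302.85", "302.6", "F64.0", "F64.9"}        # illustrative only
KEYWORD_RE = re.compile(r"\btrans(gender|sexual)", re.IGNORECASE)

records = [
    {"member_id": 1, "dx_codes": {"F64.0"},  "notes": "follow-up visit"},
    {"member_id": 2, "dx_codes": {"I10"},    "notes": "patient identifies as transgender"},
    {"member_id": 3, "dx_codes": {"302.85"}, "notes": "transsexualism, counselling discussed"},
    {"member_id": 4, "dx_codes": {"E11.9"},  "notes": "routine diabetes check"},
]

def classify(record):
    by_code = bool(record["dx_codes"] & TARGET_CODES)
    by_text = bool(KEYWORD_RE.search(record["notes"]))
    if by_code and by_text:
        return "codes+keywords"
    if by_code:
        return "codes only"
    if by_text:
        return "keywords only"
    return None

for r in records:
    status = classify(r)
    if status:
        print(r["member_id"], status)   # candidates then go to manual text review
```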

  4. Diagnosing collisions of magnetized, high energy density plasma flows using a combination of collective Thomson scattering, Faraday rotation, and interferometry (invited).

    PubMed

    Swadling, G F; Lebedev, S V; Hall, G N; Patankar, S; Stewart, N H; Smith, R A; Harvey-Thompson, A J; Burdiak, G C; de Grouchy, P; Skidmore, J; Suttle, L; Suzuki-Vidal, F; Bland, S N; Kwek, K H; Pickworth, L; Bennett, M; Hare, J D; Rozmus, W; Yuan, J

    2014-11-01

    A suite of laser based diagnostics is used to study interactions of magnetised, supersonic, radiatively cooled plasma flows produced using the Magpie pulse power generator (1.4 MA, 240 ns rise time). Collective optical Thomson scattering measures the time-resolved local flow velocity and temperature across 7-14 spatial positions. The scattering spectrum is recorded from multiple directions, allowing more accurate reconstruction of the flow velocity vectors. The areal electron density is measured using 2D interferometry; optimisation and analysis are discussed. The Faraday rotation diagnostic, operating at 1053 nm, measures the magnetic field distribution in the plasma. Measurements obtained simultaneously by these diagnostics are used to constrain analysis, increasing the accuracy of interpretation.

  5. Aeroacoustic Codes for Rotor Harmonic and BVI Noise. CAMRAD.Mod1/HIRES: Methodology and Users' Manual

    NASA Technical Reports Server (NTRS)

    Boyd, D. Douglas, Jr.; Brooks, Thomas F.; Burley, Casey L.; Jolly, J. Ralph, Jr.

    1998-01-01

    This document details the methodology and use of the CAMRAD.Mod1/HIRES codes, which were developed at NASA Langley Research Center for the prediction of helicopter harmonic and Blade-Vortex Interaction (BVI) noise. CAMRAD.Mod1 is a substantially modified version of the performance/trim/wake code CAMRAD. High resolution blade loading is determined in post-processing by HIRES and an associated indicial aerodynamics code. Extensive capabilities of importance to noise prediction accuracy are documented, including a new multi-core tip vortex roll-up wake model, higher harmonic and individual blade control, tunnel and fuselage correction input, diagnostic blade motion input, and interfaces for acoustic and CFD aerodynamics codes. Modifications and new code capabilities are documented with examples. A users' job preparation guide and listings of variables and namelists are given.

  6. [Relevance of long non-coding RNAs in tumour biology].

    PubMed

    Nagy, Zoltán; Szabó, Diána Rita; Zsippai, Adrienn; Falus, András; Rácz, Károly; Igaz, Péter

    2012-09-23

    The discovery of the biological relevance of non-coding RNA molecules represents one of the most significant advances in contemporary molecular biology. It has turned out that a major fraction of the non-coding part of the genome is transcribed. Besides small RNAs (including microRNAs), more and more data are emerging on long non-coding RNAs of 200 nucleotides to 100 kb in length that are implicated in the regulation of several basic molecular processes (cell proliferation, chromatin functioning, microRNA-mediated effects, etc.). Some of these long non-coding RNAs have been associated with human tumours, including H19, HOTAIR, and MALAT1, the differential expression of which has been noted in various neoplasms relative to healthy tissues. Long non-coding RNAs may represent novel markers for molecular diagnostics, and they might even turn out to be targets of therapeutic intervention.

  7. CLUMPY: A code for γ-ray signals from dark matter structures

    NASA Astrophysics Data System (ADS)

    Charbonnier, Aldée; Combet, Céline; Maurin, David

    2012-03-01

    We present the first public code for semi-analytical calculation of the γ-ray flux astrophysical J-factor from dark matter annihilation/decay in the Galaxy, including dark matter substructures. The core of the code is the calculation of the line of sight integral of the dark matter density squared (for annihilations) or density (for decaying dark matter). The code can be used in three modes: i) to draw skymaps from the Galactic smooth component and/or the substructure contributions, ii) to calculate the flux from a specific halo (that is not the Galactic halo, e.g. dwarf spheroidal galaxies) or iii) to perform simple statistical operations from a list of allowed DM profiles for a given object. Extragalactic contributions and other tracers of DM annihilation (e.g. positrons, anti-protons) will be included in a second release.
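
    The core quantity is the line-of-sight integral of the dark matter density squared. A minimal numerical sketch for an NFW profile, with assumed (illustrative) profile parameters and no unit conversion or substructure, shows the shape of the calculation that CLUMPY performs in full.

```python
# Sketch of the core calculation: the line-of-sight integral of the dark matter
# density squared for an NFW halo, as a function of the angle from the Galactic
# centre.  Profile parameters are assumed, illustrative values; substructure,
# solid-angle integration and unit conversion are omitted.
import numpy as np

RHO_S, R_S, D_GC = 0.4, 20.0, 8.0   # scale density, scale radius [kpc], Sun-GC distance [kpc]

def rho_nfw(r):
    x = r / R_S
    return RHO_S / (x * (1.0 + x) ** 2)

def j_factor(psi_deg, l_max=100.0, n=20000):
    """Integrate rho^2 along the line of sight at angle psi from the GC (arbitrary units)."""
    psi = np.radians(psi_deg)
    l = np.linspace(1e-3, l_max, n)                       # distance along the LOS [kpc]
    r = np.sqrt(l**2 + D_GC**2 - 2.0 * l * D_GC * np.cos(psi))
    dl = l[1] - l[0]
    return np.sum(rho_nfw(r) ** 2) * dl                   # simple Riemann sum

for psi in (0.5, 1.0, 5.0, 30.0):
    print(f"psi = {psi:5.1f} deg  ->  J ~ {j_factor(psi):.3e}")
```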

  8. Periodontitis as a Modifiable Risk Factor for Dementia: A Nationwide Population-Based Cohort Study.

    PubMed

    Lee, Yao-Tung; Lee, Hsin-Chien; Hu, Chaur-Jongh; Huang, Li-Kai; Chao, Shu-Ping; Lin, Chia-Pei; Su, Emily Chia-Yu; Lee, Yi-Chen; Chen, Chu-Chieh

    2017-02-01

    To determine whether periodontitis is a modifiable risk factor for dementia. Prospective cohort study. National Health Insurance Research Database in Taiwan. Individuals aged 65 and older with periodontitis (n = 3,028) and an age- and sex-matched control group (n = 3,028). Individuals with periodontitis were compared with age- and sex-matched controls for incidence density and hazard ratio (HR) of new-onset dementia. Periodontitis was defined according to International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes 523.3-5 diagnosed by dentists. To ensure diagnostic validity, only those who had concurrently received antibiotic therapies, periodontal treatment other than scaling, or scaling more than twice per year performed by certified dentists were included. Dementia was defined according to ICD-9-CM codes 290.0-290.4, 294.1, 331.0-331.2. After adjustment for confounding factors, the risk of developing dementia was calculated to be higher for participants with periodontitis (HR = 1.16, 95% confidence interval = 1.01-1.32, P = .03) than for those without. Periodontitis is associated with greater risk of developing dementia. Periodontal infection is treatable, so it might be a modifiable risk factor for dementia. Clinicians must devote greater attention to this potential association in an effort to develop new preventive and therapeutic strategies for dementia. © 2016, Copyright the Authors Journal compilation © 2016, The American Geriatrics Society.
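
    A back-of-the-envelope version of the incidence-density comparison, with invented event counts and person-years; the study itself reports hazard ratios adjusted for confounders.

```python
# Incidence density (events per 1,000 person-years) and crude rate ratio for an
# exposed vs. unexposed group.  The counts below are invented for illustration.
def incidence_density(events, person_years):
    return 1000.0 * events / person_years

exposed   = incidence_density(events=260, person_years=18000)   # periodontitis group
unexposed = incidence_density(events=225, person_years=18500)   # matched controls

print(f"exposed:   {exposed:.1f} per 1,000 PY")
print(f"unexposed: {unexposed:.1f} per 1,000 PY")
print(f"crude rate ratio: {exposed / unexposed:.2f}")
```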

  9. Atomic Data and Spectral Line Intensities for Ni XVII

    NASA Technical Reports Server (NTRS)

    Bhatia, A. K.; Landi, E.

    2011-01-01

    Electron impact collision strengths, energy levels, oscillator strengths, and spontaneous radiative decay rates are calculated for Ni XVII. We include in the calculations the 23 lowest configurations, corresponding to 159 fine-structure levels: 3l3l', 3l4l'', and 3s5l''', with l, l' = s, p, d; l'' = s, p, d, f; and l''' = s, p, d. Collision strengths are calculated at five incident energies for all transitions at varying energies above the threshold of each transition. One additional energy, very close to the threshold of each transition, has also been included. Calculations have been carried out using the Flexible Atomic Code in the distorted wave approximation. Additional calculations have been performed with the University College London suite of codes for comparison. Excitation rate coefficients are calculated as a function of electron temperature by assuming a Maxwellian electron velocity distribution. Using the excitation rate coefficients and the radiative transition rates of the present work, statistical equilibrium equations for level populations are solved at electron densities covering the range of 10^8 - 10^14 cm^-3 and at an electron temperature of log T_e(K) = 6.5, corresponding to the maximum abundance of Ni XVII. Spectral line intensities are calculated, and their diagnostic relevance is discussed. This dataset will be made available in the next version of the CHIANTI database.

  10. The effect of density fluctuations on electron cyclotron beam broadening and implications for ITER

    NASA Astrophysics Data System (ADS)

    Snicker, A.; Poli, E.; Maj, O.; Guidi, L.; Köhn, A.; Weber, H.; Conway, G.; Henderson, M.; Saibene, G.

    2018-01-01

    We present state-of-the-art computations of propagation and absorption of electron cyclotron waves, retaining the effects of scattering due to electron density fluctuations. In ITER, injected microwaves are foreseen to suppress neoclassical tearing modes (NTMs) by driving current at the q=2 and q=3/2 resonant surfaces. Scattering of the beam can spoil the good localization of the absorption and thus impair NTM control capabilities. A novel tool, the WKBeam code, has been employed here in order to investigate this issue. The code is a Monte Carlo solver for the wave kinetic equation and retains diffraction, full axisymmetric tokamak geometry, determination of the absorption profile and an integral form of the scattering operator which describes the effects of turbulent density fluctuations within the limits of the Born scattering approximation. The approach has been benchmarked against the paraxial WKB code TORBEAM and the full-wave code IPF-FDMC. In particular, the Born approximation is found to be valid for ITER parameters. In this paper, we show that the radiative transport of EC beams due to wave scattering in ITER is diffusive unlike in present experiments, thus causing up to a factor of 2-4 broadening in the absorption profile. However, the broadening depends strongly on the turbulence model assumed for the density fluctuations, which still has large uncertainties.

  11. Three-dimensional holoscopic image coding scheme using high-efficiency video coding with kernel-based minimum mean-square-error estimation

    NASA Astrophysics Data System (ADS)

    Liu, Deyang; An, Ping; Ma, Ran; Yang, Chao; Shen, Liquan; Li, Kai

    2016-07-01

    Three-dimensional (3-D) holoscopic imaging, also known as integral imaging, light field imaging, or plenoptic imaging, can provide natural and fatigue-free 3-D visualization. However, a large amount of data is required to represent the 3-D holoscopic content. Therefore, efficient coding schemes for this particular type of image are needed. A 3-D holoscopic image coding scheme with kernel-based minimum mean square error (MMSE) estimation is proposed. In the proposed scheme, the coding block is predicted by an MMSE estimator under statistical modeling. In order to obtain the signal statistical behavior, kernel density estimation (KDE) is utilized to estimate the probability density function of the statistical modeling. As bandwidth estimation (BE) is a key issue in the KDE problem, we also propose a BE method based on the kernel trick. The experimental results demonstrate that the proposed scheme can achieve better rate-distortion performance and better visual rendering quality.
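
    A minimal Gaussian kernel density estimate with Silverman's rule-of-thumb bandwidth, shown as a stand-in for the statistical-modeling step; the paper's kernel-trick bandwidth estimator and the MMSE block predictor are not reproduced here.

```python
# Minimal Gaussian KDE with Silverman's rule-of-thumb bandwidth, illustrating
# how a probability density can be estimated from samples (e.g. prediction
# residuals of a coding block).  This stands in for, and is simpler than, the
# paper's kernel-trick bandwidth estimation.
import numpy as np

def silverman_bandwidth(x):
    return 1.06 * np.std(x, ddof=1) * len(x) ** (-1.0 / 5.0)

def kde(x_samples, grid, h=None):
    """Evaluate the Gaussian KDE of x_samples on the given grid."""
    h = silverman_bandwidth(x_samples) if h is None else h
    diffs = (grid[:, None] - x_samples[None, :]) / h
    return np.exp(-0.5 * diffs**2).sum(axis=1) / (len(x_samples) * h * np.sqrt(2 * np.pi))

rng = np.random.default_rng(2)
residuals = rng.normal(0.0, 4.0, size=500)       # hypothetical block residuals
grid = np.linspace(-15, 15, 7)
print(np.round(kde(residuals, grid), 4))
```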

  12. LDPC coded OFDM over the atmospheric turbulence channel.

    PubMed

    Djordjevic, Ivan B; Vasic, Bane; Neifeld, Mark A

    2007-05-14

    Low-density parity-check (LDPC) coded optical orthogonal frequency division multiplexing (OFDM) is shown to significantly outperform LDPC coded on-off keying (OOK) over the atmospheric turbulence channel in terms of both coding gain and spectral efficiency. In the regime of strong turbulence at a bit-error rate of 10^-5, the coding gain improvement of the LDPC coded single-side band unclipped-OFDM system with 64 sub-carriers is larger than the coding gain of the LDPC coded OOK system by 20.2 dB for quadrature-phase-shift keying (QPSK) and by 23.4 dB for binary-phase-shift keying (BPSK).

  13. Fixed-point Design of the Lattice-reduction-aided Iterative Detection and Decoding Receiver for Coded MIMO Systems

    DTIC Science & Technology

    2011-01-01

    Wireless systems improve reliability with error-correcting codes (ECC) such as Turbo codes [2] and low-density parity-check (LDPC) codes [3]; the challenge is to apply both MIMO and ECC in practical receivers. This report describes the fixed-point design of a lattice-reduction-aided iterative detection and decoding receiver for coded MIMO systems and illustrates the performance of coded LR-aided detectors.

  14. Bounded-Angle Iterative Decoding of LDPC Codes

    NASA Technical Reports Server (NTRS)

    Dolinar, Samuel; Andrews, Kenneth; Pollara, Fabrizio; Divsalar, Dariush

    2009-01-01

    Bounded-angle iterative decoding is a modified version of conventional iterative decoding, conceived as a means of reducing undetected-error rates for short low-density parity-check (LDPC) codes. For a given code, bounded-angle iterative decoding can be implemented by means of a simple modification of the decoder algorithm, without redesigning the code. Bounded-angle iterative decoding is based on a representation of received words and code words as vectors in an n-dimensional Euclidean space (where n is an integer).

  15. Simulated performance of the optical Thomson scattering diagnostic designed for the National Ignition Facility

    DOE PAGES

    Ross, J. S.; Datte, P.; Divol, L.; ...

    2016-07-28

    An optical Thomson scattering diagnostic has been designed for the National Ignition Facility to characterize under-dense plasmas. Here, we report on the design of the system and the expected performance for different target configurations. The diagnostic is designed to spatially and temporally resolve the Thomson scattered light from laser driven targets. The diagnostic will collect scattered light from a 50 × 50 × 200 μm volume. The optical design allows operation with different probe laser wavelengths. A deep-UV probe beam (λ_0 = 210 nm) will be used to Thomson scatter from electron plasma densities of ~5 × 10^20 cm^-3 while a 3ω probe will be used for plasma densities of ~1 × 10^19 cm^-3. The diagnostic package contains two spectrometers: the first to resolve Thomson scattering from ion acoustic wave fluctuations and the second to resolve scattering from electron plasma wave fluctuations. Expected signal levels relative to background will be presented for typical target configurations (hohlraums and a planar foil).

  16. Simulated performance of the optical Thomson scattering diagnostic designed for the National Ignition Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ross, J. S., E-mail: ross36@llnl.gov; Datte, P.; Divol, L.

    2016-11-15

    An optical Thomson scattering diagnostic has been designed for the National Ignition Facility to characterize under-dense plasmas. We report on the design of the system and the expected performance for different target configurations. The diagnostic is designed to spatially and temporally resolve the Thomson scattered light from laser driven targets. The diagnostic will collect scattered light from a 50 × 50 × 200 μm volume. The optical design allows operation with different probe laser wavelengths. A deep-UV probe beam (λ_0 = 210 nm) will be used to Thomson scatter from electron plasma densities of ∼5 × 10^20 cm^-3 while a 3ω probe will be used for plasma densities of ∼1 × 10^19 cm^-3. The diagnostic package contains two spectrometers: the first to resolve Thomson scattering from ion acoustic wave fluctuations and the second to resolve scattering from electron plasma wave fluctuations. Expected signal levels relative to background will be presented for typical target configurations (hohlraums and a planar foil).

  17. Comparison of measured and modelled negative hydrogen ion densities at the ECR-discharge HOMER

    NASA Astrophysics Data System (ADS)

    Rauner, D.; Kurutz, U.; Fantz, U.

    2015-04-01

    As the negative hydrogen ion density nH- is a key parameter for the investigation of negative ion sources, its diagnostic quantification is essential in source development and operation as well as for fundamental research. By utilizing the photodetachment process of negative ions, generally two different diagnostic methods can be applied: via laser photodetachment, the density of negative ions is measured locally, but only relative to the electron density. To obtain absolute densities, the electron density has to be measured additionally, which introduces further uncertainties. Via cavity ring-down spectroscopy (CRDS), the absolute density of H- is measured directly, however LOS-averaged over the plasma length. At the ECR discharge HOMER, where H- is produced in the plasma volume, laser photodetachment is applied as the standard method to measure nH-. The additional application of CRDS provides the possibility to directly obtain absolute values of nH-, thereby successfully benchmarking the laser photodetachment system, as both diagnostics are in good agreement. In the investigated pressure range from 0.3 to 3 Pa, the measured negative hydrogen ion density shows a maximum at 1 to 1.5 Pa and an approximately linear response to increasing input microwave powers from 200 up to 500 W. Additionally, the volume production of negative ions is 0-dimensionally modelled by balancing H- production and destruction processes. The modelled densities are adapted to the absolute measurements of nH- via CRDS, allowing collisions of H- with hydrogen atoms (associative and non-associative detachment) to be identified as the dominant loss process of H- in the plasma volume at HOMER. Furthermore, the characteristic peak of nH- observed at 1 to 1.5 Pa is identified to be caused by a comparable behaviour of the electron density with varying pressure, as ne determines the volume production rate via dissociative electron attachment to vibrationally excited hydrogen molecules.
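
    How CRDS turns a change in ring-down time into an absolute, line-of-sight-averaged H- density can be sketched directly; the photodetachment cross section and the example ring-down times and lengths below are assumed, illustrative values rather than HOMER measurements.

```python
# Sketch: cavity ring-down spectroscopy yields a LOS-averaged H- density from
# the change in ring-down time.  The cross section and example numbers are
# assumed, illustrative values.
C = 2.998e10                 # speed of light [cm/s]
SIGMA_PD = 3.5e-17           # approx. H- photodetachment cross section near 1064 nm [cm^2]

def h_minus_density(tau_plasma, tau_empty, cavity_length, plasma_length):
    """LOS-averaged n(H-) [cm^-3] from ring-down times [s] and lengths [cm]."""
    alpha = (1.0 / C) * (1.0 / tau_plasma - 1.0 / tau_empty)   # absorption coefficient
    return alpha * cavity_length / (SIGMA_PD * plasma_length)

# Example: 10 us ring-down without plasma, 9.9 us with plasma, 80 cm cavity, 10 cm plasma
print(f"{h_minus_density(9.9e-6, 10.0e-6, 80.0, 10.0):.2e} cm^-3")
```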

  18. Recipes for Reading: A Teacher's Handbook for Diagnostic and Prescriptive Teaching, or the Reading Teacher's Cookbook.

    ERIC Educational Resources Information Center

    Moody, Barbara J., Ed.; And Others

    A coding system for categorizing reading skills was developed in order to provide manuals for each grade level (preprimer through 6) that would aid teachers in locating materials on a particular skill by page number in a specific text. A skill code key of the skills usually taught at a given reading grade level is based on specific basal test…

  19. Recipes for Reading: A Teacher's Handbook for Diagnostic and Prescriptive Teaching, or the Reading Teacher's "Cookbook."

    ERIC Educational Resources Information Center

    Moody, Barbara J., Ed.; And Others

    A coding system for categorizing reading skills was developed in order to provide manuals for each grade level (preprimer through 6) that would aid teachers in locating materials on a particular skill by page number in a specific text. A skill code key of the skills usually taught at a given reading grade level is based on specific basal test…

  20. Diagnostic practice of psychogenic nonepileptic seizures (PNES) in the pediatric setting.

    PubMed

    Wichaidit, Bianca T; Østergaard, John R; Rask, Charlotte U

    2015-01-01

    No formal guidelines for diagnosing psychogenic nonepileptic seizures (PNES) in children exist, and little is known about the clinical practice of diagnosing PNES in the pediatric setting. We therefore performed a national survey as a first step to document pediatricians' current diagnostic practice for PNES. A questionnaire was distributed to all pediatricians (n=64) working in the field of neuropediatrics and/or social pediatrics in the Danish hospital setting to uncover their use of terminology and of the International Classification of Diseases, 10th Revision (ICD-10) codes as well as their clinical diagnostic approach to pediatric PNES. The questionnaire included questions on 18 history and 24 paroxysmal event characteristics. The response rate was 95% (61/64). There was no consensus on which terminology and diagnostic codes to use. Five history characteristics (psychosocial stressors/trauma, sexual abuse, paroxysmal events typically occur in stressful situations, no effect of antiepileptic drugs, and physical abuse) and six paroxysmal event characteristics (resisted eyelid opening, avoidance/guarding behavior, paroxysmal events occur in the presence of others, closed eyes, rarely injury related to paroxysmal event, and absence of postictal change) were agreed to be very predictive of PNES by at least 50% of the pediatricians. Supplementary diagnostic tests such as blood chemistry measurements (e.g., blood glucose or acute phase reactants; i.e., white blood cell count and C-reactive protein) and electrocardiography were inconsistently used. Only 49% of the respondents reported to use video-electroencephalography (VEEG) frequently as part of their diagnostic procedure. To our knowledge, this is the first national survey that offers a systematic insight into the diagnostic practices for children with PNES in the hospital setting. The results demonstrate a need for clinical guidelines to improve and systematize the diagnostic approach for PNES in children. Wiley Periodicals, Inc. © 2014 International League Against Epilepsy.

  1. Validating diagnostic information on the Minimum Data Set in Ontario Hospital-based long-term care.

    PubMed

    Wodchis, Walter P; Naglie, Gary; Teare, Gary F

    2008-08-01

    Over 20 countries currently use the Minimum Data Set Resident Assessment Instrument (MDS) in long-term care settings for care planning, policy, and research purposes. A full assessment of the quality of the diagnostic information recorded on the MDS is lacking. The primary goal of this study was to examine the quality of diagnostic coding on the MDS. Subjects for this study were admitted to Ontario Complex Continuing Care Hospitals (CCC) directly from acute hospitals between April 1, 1997 and March 31, 2005 (n = 80,664). Encrypted unique identifiers, common across acute and CCC administrative databases, were used to link administrative records for patients in the sample. After linkage, each resident had 2 sources of diagnostic information: the acute discharge abstract database and the MDS. Using the discharge abstract database as the reference standard, we calculated the sensitivity for each of 43 MDS diagnoses. Compared with primary diagnoses coded in acute care abstracts, 12 of 43 MDS diagnoses attained a sensitivity of at least 0.80, including 7 of the 10 diagnoses with the highest prevalence as an acute care primary diagnosis before CCC admission. Although the sensitivity was high for many of the most prevalent conditions, important diagnostic information is missed increasing the potential for suboptimal clinical care. Emphasis needs to be put on improving information flow across care settings during patient transitions. Researchers should exercise caution when using MDS diagnoses to identify patient populations, particularly those shown to have low sensitivity in this study.

  2. Changing Utilization of Noninvasive Diagnostic Imaging Over 2 Decades: An Examination Family-Focused Analysis of Medicare Claims Using the Neiman Imaging Types of Service Categorization System.

    PubMed

    Rosman, David A; Duszak, Richard; Wang, Wenyi; Hughes, Danny R; Rosenkrantz, Andrew B

    2018-02-01

    The objective of our study was to use a new modality and body region categorization system to assess changing utilization of noninvasive diagnostic imaging in the Medicare fee-for-service population over a recent 20-year period (1994-2013). All Medicare Part B Physician Fee Schedule services billed between 1994 and 2013 were identified using Physician/Supplier Procedure Summary master files. Billed codes for diagnostic imaging were classified using the Neiman Imaging Types of Service (NITOS) coding system by both modality and body region. Utilization rates per 1000 beneficiaries were calculated for families of services. Among all diagnostic imaging modalities, growth was greatest for MRI (+312%) and CT (+151%) and was lower for ultrasound, nuclear medicine, and radiography and fluoroscopy (range, +1% to +31%). Among body regions, service growth was greatest for brain (+126%) and spine (+74%) imaging; showed milder growth (range, +18% to +67%) for imaging of the head and neck, breast, abdomen and pelvis, and extremity; and showed slight declines (range, -2% to -7%) for cardiac and chest imaging overall. The following specific imaging service families showed massive (> +100%) growth: cardiac CT, cardiac MRI, and breast MRI. NITOS categorization permits identification of temporal shifts in noninvasive diagnostic imaging by specific modality- and region-focused families, providing a granular understanding and reproducible analysis of global changes in imaging overall. Service family-level perspectives may help inform ongoing policy efforts to optimize imaging utilization and appropriateness.

  3. Hot prominence detected in the core of a coronal mass ejection. II. Analysis of the C III line detected by SOHO/UVCS

    NASA Astrophysics Data System (ADS)

    Jejčič, S.; Susino, R.; Heinzel, P.; Dzifčáková, E.; Bemporad, A.; Anzer, U.

    2017-11-01

    Context. We study the physics of erupting prominences in the core of coronal mass ejections (CMEs) and present a continuation of a previous analysis. Aims: We determine the kinetic temperature and microturbulent velocity of an erupting prominence embedded in the core of a CME that occurred on August 2, 2000, using Ultraviolet Coronagraph Spectrometer (UVCS) observations on board the Solar and Heliospheric Observatory (SOHO), taken simultaneously in the hydrogen Lα and C III lines. We develop the non-LTE (departures from the local thermodynamic equilibrium - LTE) spectral diagnostics based on Lα and Lβ measured integrated intensities to derive other physical quantities of the hot erupting prominence. Based on this, we synthesize the C III line intensity to compare it with observations. Methods: Our method is based on non-LTE modeling of eruptive prominences. We used a general non-LTE radiative-transfer code only for optically thin prominence points because optically thick points do not allow the direct determination of the kinetic temperature and microturbulence from the line profiles. The input parameters of the code were the kinetic temperature and microturbulent velocity derived from the Lα and C III line widths, as well as the integrated intensity of the Lα and Lβ lines. The code runs in three loops to compute the radial flow velocity, electron density, and effective thickness as the best fit to the Lα and Lβ integrated intensities within the accuracy defined by the absolute radiometric calibration of UVCS data. Results: We analyzed 39 observational points along the whole erupting prominence because for these points we found a solution for the kinetic temperature and microturbulent velocity. For these points we ran the non-LTE code to determine best-fit models. All models with τ0(Lα) ≤ 0.3 and τ0(C III) ≤ 0.3 were analyzed further, for which we computed the integrated intensity of the C III line using a two-level atom. The best agreement between computed and observed integrated intensity led to 30 optically thin points along the prominence. The results are presented as histograms of the kinetic temperature, microturbulent velocity, effective thickness, radial flow velocity, electron density, and gas pressure. We also show the relation between the microturbulence and kinetic temperature together with a scatter plot of computed versus observed C III integrated intensities and the ratio of the computed to observed C III integrated intensities versus kinetic temperature. Conclusions: The erupting prominence embedded in the CME is relatively hot with a low electron density, a wide range of effective thicknesses, a rather narrow range of radial flow velocities, and a microturbulence of about 25 km s-1. This analysis shows a disagreement between observed and synthetic intensities of the C III line, most probably because photoionization is neglected in the calculations of the ionization equilibrium. Alternatively, the disagreement might be due to non-equilibrium processes.

  4. Use of an electronic medical record for the identification of research subjects with diabetes mellitus.

    PubMed

    Wilke, Russell A; Berg, Richard L; Peissig, Peggy; Kitchner, Terrie; Sijercic, Bozana; McCarty, Catherine A; McCarty, Daniel J

    2007-03-01

    Diabetes mellitus is a rapidly increasing and costly public health problem. Large studies are needed to understand the complex gene-environment interactions that lead to diabetes and its complications. The Marshfield Clinic Personalized Medicine Research Project (PMRP) represents one of the largest population-based DNA biobanks in the United States. As part of an effort to begin phenotyping common diseases within the PMRP, we now report on the construction of a diabetes case-finding algorithm using electronic medical record data from adult subjects aged ≥50 years living in one of the target PMRP ZIP codes. Based upon diabetic diagnostic codes alone, we observed a false positive case rate ranging from 3.0% (in subjects with the highest glycosylated hemoglobin values) to 44.4% (in subjects with the lowest glycosylated hemoglobin values). We therefore developed an improved case-finding algorithm that utilizes diabetic diagnostic codes in combination with clinical laboratory data and medication history. This algorithm yielded an estimated prevalence of 24.2% for diabetes mellitus in adult subjects aged ≥50 years.
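    A hedged sketch of a combined case-finding rule of the type described above (diagnostic codes plus laboratory values plus medications). The field names, thresholds, and code lists below are hypothetical illustrations, not the Marshfield algorithm.

      # Combine a diagnostic code with at least one corroborating source of evidence.
      DIABETES_CODES = {"250.00", "250.02"}        # example ICD-9 codes (illustrative)
      DIABETES_DRUGS = {"metformin", "insulin"}    # example medication names (illustrative)
      HBA1C_THRESHOLD = 6.5                        # % glycosylated hemoglobin (illustrative)

      def is_probable_case(record):
          """Classify one record summary as a probable diabetes case.

          `record` is a dict with keys 'dx_codes', 'max_hba1c', 'medications'.
          """
          has_code = bool(DIABETES_CODES & set(record["dx_codes"]))
          has_lab  = record["max_hba1c"] is not None and record["max_hba1c"] >= HBA1C_THRESHOLD
          has_drug = bool(DIABETES_DRUGS & {m.lower() for m in record["medications"]})
          return has_code and (has_lab or has_drug)

      example = {"dx_codes": ["250.00"], "max_hba1c": 7.2, "medications": ["Metformin"]}
      print(is_probable_case(example))   # True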

  5. A Pilot Study of a Computerized Decision Support System to Detect Invasive Fungal Infection in Pediatric Hematology/Oncology Patients.

    PubMed

    Bartlett, Adam; Goeman, Emma; Vedi, Aditi; Mostaghim, Mona; Trahair, Toby; O'Brien, Tracey A; Palasanthiran, Pamela; McMullan, Brendan

    2015-11-01

    Computerized decision support systems (CDSSs) can provide indication-specific antimicrobial recommendations and approvals as part of hospital antimicrobial stewardship (AMS) programs. The aim of this study was to assess the performance of a CDSS for surveillance of invasive fungal infections (IFIs) in an inpatient hematology/oncology cohort. Between November 1, 2012, and October 31, 2013, pediatric hematology/oncology inpatients diagnosed with an IFI were identified through an audit of the CDSS and confirmed by medical record review. The results were compared to hospital diagnostic-related group (DRG) coding for IFI throughout the same period. A total of 83 patients were prescribed systemic antifungals according to the CDSS for the 12-month period. The CDSS correctly identified 19 patients with IFI on medical record review, compared with 10 patients identified by DRG coding, of whom 9 were confirmed to have IFI on medical record review. CDSS was superior to diagnostic coding in detecting IFI in an inpatient pediatric hematology/oncology cohort. The functionality of CDSS lends itself to inpatient infectious diseases surveillance but depends on prescriber adherence.

  6. StarSmasher: Smoothed Particle Hydrodynamics code for smashing stars and planets

    NASA Astrophysics Data System (ADS)

    Gaburov, Evghenii; Lombardi, James C., Jr.; Portegies Zwart, Simon; Rasio, F. A.

    2018-05-01

    Smoothed Particle Hydrodynamics (SPH) is a Lagrangian particle method that approximates a continuous fluid as discrete nodes, each carrying various parameters such as mass, position, velocity, pressure, and temperature. In an SPH simulation the resolution scales with the particle density; StarSmasher is able to handle both equal-mass and equal number-density particle models. StarSmasher solves for hydro forces by calculating the pressure for each particle as a function of the particle's properties - density, internal energy, and internal properties (e.g. temperature and mean molecular weight). The code implements variational equations of motion and libraries to calculate the gravitational forces between particles using direct summation on NVIDIA graphics cards. Using a direct summation instead of a tree-based algorithm for gravity increases the accuracy of the gravity calculations at the cost of speed. The code uses a cubic spline for the smoothing kernel and an artificial viscosity prescription coupled with a Balsara Switch to prevent unphysical interparticle penetration. The code also implements an artificial relaxation force to the equations of motion to add a drag term to the calculated accelerations during relaxation integrations. Initially called StarCrash, StarSmasher was developed originally by Rasio.
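    For illustration only, a standard 3D cubic-spline (M4) smoothing kernel of the kind the abstract refers to, together with the usual SPH density sum; this sketch is independent of StarSmasher's own implementation, and the masses and distances are arbitrary.

      import numpy as np

      def w_cubic_spline(r, h):
          """Cubic spline kernel W(r, h) in 3D, with compact support at r = 2h."""
          q = np.asarray(r, dtype=float) / h
          sigma = 1.0 / (np.pi * h**3)                     # 3D normalization
          w = np.where(q < 1.0, 1.0 - 1.5*q**2 + 0.75*q**3,
              np.where(q < 2.0, 0.25*(2.0 - q)**3, 0.0))
          return sigma * w

      # density estimate at one particle: rho_i = sum_j m_j W(|r_ij|, h)
      m_j  = np.array([1.0, 1.0, 1.0])   # neighbour masses (arbitrary units)
      r_ij = np.array([0.3, 0.8, 1.5])   # distances in units where h = 1
      print(np.sum(m_j * w_cubic_spline(r_ij, h=1.0)))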

  7. Polymerization of non-complementary RNA: systematic symmetric nucleotide exchanges mainly involving uracil produce mitochondrial RNA transcripts coding for cryptic overlapping genes.

    PubMed

    Seligmann, Hervé

    2013-03-01

    Usual DNA→RNA transcription exchanges T→U. Assuming different systematic symmetric nucleotide exchanges during translation, some GenBank RNAs match exactly human mitochondrial sequences (exchange rules listed in decreasing transcript frequencies): C↔U, A↔U, A↔U+C↔G (two nucleotide pairs exchanged), G↔U, A↔G, C↔G, none for A↔C, A↔G+C↔U, and A↔C+G↔U. Most unusual transcripts involve exchanging uracil. Independent measures of rates of rare replicational enzymatic DNA nucleotide misinsertions predict frequencies of RNA transcripts systematically exchanging the corresponding misinserted nucleotides. Exchange transcripts self-hybridize less than other gene regions, self-hybridization increases with length, suggesting endoribonuclease-limited elongation. Blast detects stop codon depleted putative protein coding overlapping genes within exchange-transcribed mitochondrial genes. These align with existing GenBank proteins (mainly metazoan origins, prokaryotic and viral origins underrepresented). These GenBank proteins frequently interact with RNA/DNA, are membrane transporters, or are typical of mitochondrial metabolism. Nucleotide exchange transcript frequencies increase with overlapping gene densities and stop densities, indicating finely tuned counterbalancing regulation of expression of systematic symmetric nucleotide exchange-encrypted proteins. Such expression necessitates combined activities of suppressor tRNAs matching stops, and nucleotide exchange transcription. Two independent properties confirm predicted exchanged overlap coding genes: discrepancy of third codon nucleotide contents from replicational deamination gradients, and codon usage according to circular code predictions. Predictions from both properties converge, especially for frequent nucleotide exchange types. Nucleotide exchanging transcription apparently increases coding densities of protein coding genes without lengthening genomes, revealing unsuspected functional DNA coding potential. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
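    A minimal illustration of what a "systematic symmetric nucleotide exchange" does to an RNA string, e.g. the C↔U rule mentioned above. The example sequence is invented; this only shows the transformation itself, not the matching against GenBank.

      def exchange(seq, pair):
          """Swap the two nucleotides in `pair` everywhere in `seq` (symmetric exchange)."""
          a, b = pair
          return seq.translate(str.maketrans({a: b, b: a}))

      def double_exchange(seq, pair1, pair2):
          """Apply two symmetric exchanges at once, e.g. A<->U together with C<->G."""
          table = str.maketrans({pair1[0]: pair1[1], pair1[1]: pair1[0],
                                 pair2[0]: pair2[1], pair2[1]: pair2[0]})
          return seq.translate(table)

      rna = "AUGCUUCCAG"
      print(exchange(rna, ("C", "U")))                  # ACGUCCUUAG
      print(double_exchange(rna, ("A", "U"), ("C", "G")))  # UACGAAGGUC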

  8. Malaria rapid diagnostic tests in elimination settings—can they find the last parasite?

    PubMed Central

    McMorrow, M. L.; Aidoo, M.; Kachur, S. P.

    2016-01-01

    Rapid diagnostic tests (RDTs) for malaria have improved the availability of parasite-based diagnosis throughout the malaria-endemic world. Accurate malaria diagnosis is essential for malaria case management, surveillance, and elimination. RDTs are inexpensive, simple to perform, and provide results in 15–20 min. Despite high sensitivity and specificity for Plasmodium falciparum infections, RDTs have several limitations that may reduce their utility in low-transmission settings: they do not reliably detect low-density parasitaemia (≤200 parasites/μL), many are less sensitive for Plasmodium vivax infections, and their ability to detect Plasmodium ovale and Plasmodium malariae is unknown. Therefore, in elimination settings, alternative tools with higher sensitivity for low-density infections (e.g. nucleic acid-based tests) are required to complement field diagnostics, and new highly sensitive and specific field-appropriate tests must be developed to ensure accurate diagnosis of symptomatic and asymptomatic carriers. As malaria transmission declines, the proportion of low-density infections among symptomatic and asymptomatic persons is likely to increase, which may limit the utility of RDTs. Monitoring malaria in elimination settings will probably depend on the use of more than one diagnostic tool in clinical-care and surveillance activities, and the combination of tools utilized will need to be informed by regular monitoring of test performance through effective quality assurance. PMID:21910780

  9. High-density digital recording

    NASA Technical Reports Server (NTRS)

    Kalil, F. (Editor); Buschman, A. (Editor)

    1985-01-01

    The problems associated with high-density digital recording (HDDR) are discussed. Five independent users of HDDR systems and their problems, solutions, and insights are provided as guidance for other users of HDDR systems. Various pulse code modulation coding techniques are reviewed. An introduction to error detection and correction, head optimization theory, and perpendicular recording is provided. Competitive tape recorder manufacturers apply all of the above theories and techniques and present their offerings. The methodology used by the HDDR Users Subcommittee of THIC to evaluate parallel HDDR systems is presented.

  10. Photonic entanglement-assisted quantum low-density parity-check encoders and decoders.

    PubMed

    Djordjevic, Ivan B

    2010-05-01

    I propose encoder and decoder architectures for entanglement-assisted (EA) quantum low-density parity-check (LDPC) codes suitable for all-optical implementation. I show that two basic gates needed for EA quantum error correction, namely, controlled-NOT (CNOT) and Hadamard gates can be implemented based on Mach-Zehnder interferometer. In addition, I show that EA quantum LDPC codes from balanced incomplete block designs of unitary index require only one entanglement qubit to be shared between source and destination.

  11. Parallel Subspace Subcodes of Reed-Solomon Codes for Magnetic Recording Channels

    ERIC Educational Resources Information Center

    Wang, Han

    2010-01-01

    Read channel architectures based on a single low-density parity-check (LDPC) code are being considered for the next generation of hard disk drives. However, LDPC-only solutions suffer from the error floor problem, which may compromise reliability, if not handled properly. Concatenated architectures using an LDPC code plus a Reed-Solomon (RS) code…

  12. Interpretation of two compact planetary nebulae, IC 4997 and NGC 6572, with aid of theoretical models.

    PubMed Central

    Hyung, S; Aller, L H

    1993-01-01

    Observations of two dense compact planetary nebulae secured with the Hamilton Echelle spectrograph at Lick Observatory, combined with previously published UV spectra secured with the International Ultraviolet Explorer, enable us to probe the electron densities and temperatures (plasma diagnostics) and ionic concentrations in these objects. The diagnostic diagrams show that no homogeneous model will work for these nebulae. NGC 6572 may consist of an inner toroidal ring of density 25,000 atoms/cm3 and an outer conical shell of density 10,000 atoms/cm3. The simplest model of IC 4997 suggests a thick inner shell with a density of about 10^7 atoms/cm3 and an outer envelope of density 10,000 atoms/cm3. The abundances of all elements heavier than He appear to be less than the solar values in NGC 6572, whereas He, C, N, and O may be more abundant in IC 4997 than in the sun. IC 4997 presents puzzling problems. PMID:11607347

  13. Implementation of the new multichannel X-mode edge density profile reflectometer for the ICRF antenna on ASDEX Upgrade

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aguiam, D. E., E-mail: daguiam@ipfn.tecnico.ulisboa.pt; Silva, A.; Carvalho, P. J.

    A new multichannel frequency modulated continuous-wave reflectometry diagnostic has been successfully installed and commissioned on ASDEX Upgrade to measure the plasma edge electron density profile evolution in front of the Ion Cyclotron Range of Frequencies (ICRF) antenna. The design of the new three-strap ICRF antenna integrates ten pairs (sending and receiving) of microwave reflectometry antennas. The multichannel reflectometer can use three of these to measure the edge electron density profiles up to 2 × 10{sup 19} m{sup −3}, at different poloidal locations, allowing the direct study of the local plasma layers in front of the ICRF antenna. ICRF power coupling, operational effects, and poloidal variations of the plasma density profile can be consistently studied for the first time. In this work the diagnostic hardware architecture is described and the obtained density profile measurements were used to track outer radial plasma position and plasma shape.

  14. High energy density Z-pinch plasmas using flow stabilization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shumlak, U., E-mail: shumlak@uw.edu; Golingo, R. P., E-mail: shumlak@uw.edu; Nelson, B. A., E-mail: shumlak@uw.edu

    The ZaP Flow Z-Pinch research project[1] at the University of Washington investigates the effect of sheared flows on MHD instabilities. Axially flowing Z-pinch plasmas are produced that are 100 cm long with a 1 cm radius. The plasma remains quiescent for many radial Alfvén times and axial flow times. The quiescent periods are characterized by low magnetic mode activity measured at several locations along the plasma column and by stationary visible plasma emission. Plasma evolution is modeled with high-resolution simulation codes – Mach2, WARPX, NIMROD, and HiFi. Plasma flow profiles are experimentally measured with a multi-chord ion Doppler spectrometer. A sheared flow profile is observed to be coincident with the quiescent period, and is consistent with classical plasma viscosity. Equilibrium is determined by diagnostic measurements: interferometry for density; spectroscopy for ion temperature, plasma flow, and density[2]; Thomson scattering for electron temperature; Zeeman splitting for internal magnetic field measurements[3]; and fast framing photography for global structure. Wall stabilization has been investigated computationally and experimentally by removing 70% of the surrounding conducting wall to demonstrate no change in stability behavior.[4] Experimental evidence suggests that the plasma lifetime is only limited by plasma supply and current waveform. The flow Z-pinch concept provides an approach to achieve high energy density plasmas,[5] which are large, easy to diagnose, and persist for extended durations. A new experiment, ZaP-HD, has been built to investigate this approach by separating the flow Z-pinch formation from the radial compression using a triaxial-electrode configuration. This innovation allows more detailed investigations of the sheared flow stabilizing effect, and it allows compression to much higher densities than previously achieved on ZaP by reducing the linear density and increasing the pinch current. Experimental results and scaling analyses will be presented. In addition to studying fundamental plasma science and high energy density physics, the ZaP and ZaP-HD experiments can be applied to laboratory astrophysics.

  15. DensToolKit: A comprehensive open-source package for analyzing the electron density and its derivative scalar and vector fields

    NASA Astrophysics Data System (ADS)

    Solano-Altamirano, J. M.; Hernández-Pérez, Julio M.

    2015-11-01

    DensToolKit is a suite of cross-platform, optionally parallelized, programs for analyzing the molecular electron density (ρ) and several fields derived from it. Scalar and vector fields, such as the gradient of the electron density (∇ρ), electron localization function (ELF) and its gradient, localized orbital locator (LOL), region of slow electrons (RoSE), reduced density gradient, localized electrons detector (LED), information entropy, molecular electrostatic potential, kinetic energy densities K and G, among others, can be evaluated on zero, one, two, and three dimensional grids. The suite includes a program for searching critical points and bond paths of the electron density, under the framework of Quantum Theory of Atoms in Molecules. DensToolKit also evaluates the momentum space electron density on spatial grids, and the reduced density matrix of order one along lines joining two arbitrary atoms of a molecule. The source code is distributed under the GNU-GPLv3 license, and we release the code with the intent of establishing an open-source collaborative project. The style of DensToolKit's code follows some of the guidelines of an object-oriented program. This allows us to provide the user with a simple means of implementing new scalar or vector fields, provided they are derived from any of the fields already implemented in the code. In this paper, we present some of the most salient features of the programs contained in the suite, some examples of how to run them, and the mathematical definitions of the implemented fields along with hints of how we optimized their evaluation. We benchmarked our suite against both a freely-available program and a commercial package. Speed-ups of ∼2× and up to 12× were obtained using a non-parallel compilation of DensToolKit for the evaluation of fields. DensToolKit takes similar times for finding critical points, compared to a commercial package. Finally, we present some perspectives for the future development and growth of the suite.
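    The reduced density gradient is one of the scalar fields listed above; as a minimal sketch of its textbook definition, s = |∇ρ| / (2 (3π²)^(1/3) ρ^(4/3)), evaluated here on a hydrogen-like 1s density. This is independent of DensToolKit's implementation.

      import numpy as np

      def reduced_density_gradient(rho, grad_rho):
          """s(r) from the electron density and the magnitude of its gradient (atomic units)."""
          rho = np.asarray(rho, dtype=float)
          grad_rho = np.asarray(grad_rho, dtype=float)
          return grad_rho / (2.0 * (3.0 * np.pi**2) ** (1.0 / 3.0) * rho ** (4.0 / 3.0))

      # example: hydrogen-like 1s density rho = exp(-2r)/pi, for which |grad rho| = 2 rho
      r = np.linspace(0.1, 4.0, 5)
      rho = np.exp(-2.0 * r) / np.pi
      print(reduced_density_gradient(rho, 2.0 * rho))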

  16. Self-Configuration and Localization in Ad Hoc Wireless Sensor Networks

    DTIC Science & Technology

    2010-08-31

    Goddard I. SUMMARY OF CONTRIBUTIONS We explored the error mechanisms of iterative decoding of low-density parity-check (LDPC) codes. This work has resulted...important problems in the area of channel coding, as their unpredictable behavior has impeded the deployment of LDPC codes in many real-world applications. We...tree-based decoders of LDPC codes, including the extrinsic tree decoder, and an investigation into their performance and bounding capabilities [5], [6

  17. Topological entanglement entropy with a twist.

    PubMed

    Brown, Benjamin J; Bartlett, Stephen D; Doherty, Andrew C; Barrett, Sean D

    2013-11-27

    Defects in topologically ordered models have interesting properties that are reminiscent of the anyonic excitations of the models themselves. For example, dislocations in the toric code model are known as twists and possess properties that are analogous to Ising anyons. We strengthen this analogy by using the topological entanglement entropy as a diagnostic tool to identify properties of both defects and excitations in the toric code. Specifically, we show, through explicit calculation, that the toric code model including twists and dyon excitations has the same quantum dimensions, the same total quantum dimension, and the same fusion rules as an Ising anyon model.

  18. Matrix-Product-State Algorithm for Finite Fractional Quantum Hall Systems

    NASA Astrophysics Data System (ADS)

    Liu, Zhao; Bhatt, R. N.

    2015-09-01

    Exact diagonalization is a powerful tool to study fractional quantum Hall (FQH) systems. However, its capability is limited by the exponentially increasing computational cost. In order to overcome this difficulty, density-matrix-renormalization-group (DMRG) algorithms were developed for much larger system sizes. Very recently, it was realized that some model FQH states have exact matrix-product-state (MPS) representation. Motivated by this, here we report a MPS code, which is closely related to, but different from traditional DMRG language, for finite FQH systems on the cylinder geometry. By representing the many-body Hamiltonian as a matrix-product-operator (MPO) and using single-site update and density matrix correction, we show that our code can efficiently search the ground state of various FQH systems. We also compare the performance of our code with traditional DMRG. The possible generalization of our code to infinite FQH systems and other physical systems is also discussed.

  19. Simulations of Laboratory Astrophysics Experiments using the CRASH code

    NASA Astrophysics Data System (ADS)

    Trantham, Matthew; Kuranz, Carolyn; Manuel, Mario; Keiter, Paul; Drake, R. P.

    2014-10-01

    Computer simulations can assist in the design and analysis of laboratory astrophysics experiments. The Center for Radiative Shock Hydrodynamics (CRASH) at the University of Michigan developed a code that has been used to design and analyze high-energy-density experiments on OMEGA, NIF, and other large laser facilities. This Eulerian code uses block-adaptive mesh refinement (AMR) with implicit multigroup radiation transport, electron heat conduction and laser ray tracing. This poster/talk will demonstrate some of the experiments the CRASH code has helped design or analyze including: Kelvin-Helmholtz, Rayleigh-Taylor, imploding bubbles, and interacting jet experiments. This work is funded by the Predictive Sciences Academic Alliances Program in NNSA-ASC via Grant DEFC52-08NA28616, by the NNSA-DS and SC-OFES Joint Program in High-Energy-Density Laboratory Plasmas, Grant Number DE-NA0001840, and by the National Laser User Facility Program, Grant Number DE-NA0000850.

  20. Coupling of laser energy into plasma channels

    NASA Astrophysics Data System (ADS)

    Dimitrov, D. A.; Giacone, R. E.; Bruhwiler, D. L.; Busby, R.; Cary, J. R.; Geddes, C. G. R.; Esarey, E.; Leemans, W. P.

    2007-04-01

    Diffractive spreading of a laser pulse imposes severe limitations on the acceleration length and maximum electron energy in the laser wake field accelerator (LWFA). Optical guiding of a laser pulse via plasma channels can extend the laser-plasma interaction distance over many Rayleigh lengths. Energy efficient coupling of laser pulses into and through plasma channels is very important for optimal LWFA performance. Results from simulation parameter studies on channel guiding using the particle-in-cell (PIC) code VORPAL [C. Nieter and J. R. Cary, J. Comput. Phys. 196, 448 (2004)] are presented and discussed. The effects that density ramp length and the position of the laser pulse focus have on coupling into channels are considered. Moreover, the effect of laser energy leakage out of the channel domain and the effects of tunneling ionization of a neutral gas on the guided laser pulse are also investigated. Power spectral diagnostics were developed and used to separate pump depletion from energy leakage. The results of these simulations show that increasing the density ramp length decreases the efficiency of coupling a laser pulse to a channel and increases the energy loss when the pulse is vacuum focused at the channel entrance. Then, large spot size oscillations result in increased energy leakage. To further analyze the coupling, a differential equation is derived for the laser spot size evolution in the plasma density ramp and channel profiles are simulated. From the numerical solution of this equation, the optimal spot size and location for coupling into a plasma channel with a density ramp are determined. This result is confirmed by the PIC simulations. They show that specifying a vacuum focus location of the pulse in front of the top of the density ramp leads to an actual focus at the top of the ramp due to plasma focusing, resulting in reduced spot size oscillations. In this case, the leakage is significantly reduced and is negligibly affected by ramp length, allowing for efficient use of channels with long ramps.
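    The abstract's starting point is diffractive spreading over Rayleigh lengths. As a generic illustration (not the VORPAL setup), the vacuum spot size of a Gaussian beam follows w(z) = w0 sqrt(1 + (z/Z_R)^2) with Z_R = π w0² / λ; the wavelength and focal spot below are assumed typical values.

      import numpy as np

      lam = 0.8e-6            # laser wavelength [m]  (assumed, typical Ti:sapphire)
      w0 = 10e-6              # focal spot size  [m]  (assumed)

      Z_R = np.pi * w0**2 / lam                # Rayleigh length
      z = np.linspace(0.0, 5.0 * Z_R, 6)
      w = w0 * np.sqrt(1.0 + (z / Z_R) ** 2)   # vacuum spot size evolution

      print(f"Rayleigh length Z_R = {Z_R*1e3:.2f} mm")
      for zi, wi in zip(z, w):
          print(f"z = {zi*1e3:6.2f} mm   w = {wi*1e6:6.1f} um")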

  1. Applications of digital processing for noise removal from plasma diagnostics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kane, R.J.; Candy, J.V.; Casper, T.A.

    1985-11-11

    The use of digital signal techniques for removal of noise components present in plasma diagnostic signals is discussed, particularly with reference to diamagnetic loop signals. These signals contain noise due to power supply ripple in addition to plasma characteristics. The application of noise canceling techniques, such as adaptive noise canceling and model-based estimation, will be discussed. The use of computer codes such as SIG is described. 19 refs., 5 figs.

  2. Validity of the International Classification of Diseases 10th revision code for hospitalisation with hyponatraemia in elderly patients

    PubMed Central

    Gandhi, Sonja; Shariff, Salimah Z; Fleet, Jamie L; Weir, Matthew A; Jain, Arsh K; Garg, Amit X

    2012-01-01

    Objective To evaluate the validity of the International Classification of Diseases, 10th Revision (ICD-10) diagnosis code for hyponatraemia (E87.1) in two settings: at presentation to the emergency department and at hospital admission. Design Population-based retrospective validation study. Setting Twelve hospitals in Southwestern Ontario, Canada, from 2003 to 2010. Participants Patients aged 66 years and older with serum sodium laboratory measurements at presentation to the emergency department (n=64 581) and at hospital admission (n=64 499). Main outcome measures Sensitivity, specificity, positive predictive value and negative predictive value comparing various ICD-10 diagnostic coding algorithms for hyponatraemia to serum sodium laboratory measurements (reference standard). Median serum sodium values comparing patients who were code positive and code negative for hyponatraemia. Results The sensitivity of hyponatraemia (defined by a serum sodium ≤132 mmol/l) for the best-performing ICD-10 coding algorithm was 7.5% at presentation to the emergency department (95% CI 7.0% to 8.2%) and 10.6% at hospital admission (95% CI 9.9% to 11.2%). Both specificities were greater than 99%. In the two settings, the positive predictive values were 96.4% (95% CI 94.6% to 97.6%) and 82.3% (95% CI 80.0% to 84.4%), while the negative predictive values were 89.2% (95% CI 89.0% to 89.5%) and 87.1% (95% CI 86.8% to 87.4%). In patients who were code positive for hyponatraemia, the median (IQR) serum sodium measurements were 123 (119–126) mmol/l and 125 (120–130) mmol/l in the two settings. In code negative patients, the measurements were 138 (136–140) mmol/l and 137 (135–139) mmol/l. Conclusions The ICD-10 diagnostic code for hyponatraemia differentiates between two groups of patients with distinct serum sodium measurements at both presentation to the emergency department and at hospital admission. However, these codes underestimate the true incidence of hyponatraemia due to low sensitivity. PMID:23274673
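    For reference, a sketch of the standard 2×2 validation arithmetic used in studies like this one (coded hyponatraemia versus the serum-sodium reference standard). The counts below are arbitrary examples, not the study data.

      def validation_metrics(tp, fp, fn, tn):
          """Sensitivity, specificity, PPV and NPV from a 2x2 confusion table."""
          return {
              "sensitivity": tp / (tp + fn),
              "specificity": tn / (tn + fp),
              "ppv": tp / (tp + fp),
              "npv": tn / (tn + fn),
          }

      # example: code-positive/negative vs. sodium <= 132 mmol/l (made-up counts)
      print(validation_metrics(tp=750, fp=30, fn=9250, tn=54000))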

  3. Development of ITER non-activation phase operation scenarios

    DOE PAGES

    Kim, S. H.; Poli, F. M.; Koechl, F.; ...

    2017-06-29

    Non-activation phase operations in ITER in hydrogen (H) and helium (He) will be important for commissioning of tokamak systems, such as diagnostics, heating and current drive (HCD) systems, coils and plasma control systems, and for validation of techniques necessary for establishing operations in DT. The assessment of feasible HCD schemes at various toroidal fields (2.65–5.3 T) has revealed that the previously applied assumptions need to be refined for the ITER non-activation phase H/He operations. A study of the ranges of plasma density and profile shape using the JINTRAC suite of codes has indicated that the hydrogen pellet fuelling into He plasmas should be utilized taking the optimization of IC power absorption, neutral beam shine-through density limit and H-mode access into account. The EPED1 estimation of the edge pedestal parameters has been extended to various H operation conditions, and the combined EPED1 and SOLPS estimation has provided guidance for modelling the edge pedestal in H/He operations. The availability of ITER HCD schemes, ranges of achievable plasma density and profile shape, and estimation of the edge pedestal parameters for H/He plasmas have been integrated into various time-dependent tokamak discharge simulations. In this paper, various H/He scenarios at a wide range of plasma current (7.5–15 MA) and field (2.65–5.3 T) have been developed for the ITER non-activation phase operation, and the sensitivity of the developed scenarios to the used assumptions has been investigated to provide guidance for further development.

  4. Development of ITER non-activation phase operation scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, S. H.; Poli, F. M.; Koechl, F.

    Non-activation phase operations in ITER in hydrogen (H) and helium (He) will be important for commissioning of tokamak systems, such as diagnostics, heating and current drive (HCD) systems, coils and plasma control systems, and for validation of techniques necessary for establishing operations in DT. The assessment of feasible HCD schemes at various toroidal fields (2.65–5.3 T) has revealed that the previously applied assumptions need to be refined for the ITER non-activation phase H/He operations. A study of the ranges of plasma density and profile shape using the JINTRAC suite of codes has indicated that the hydrogen pellet fuelling into He plasmas should be utilized taking the optimization of IC power absorption, neutral beam shine-through density limit and H-mode access into account. The EPED1 estimation of the edge pedestal parameters has been extended to various H operation conditions, and the combined EPED1 and SOLPS estimation has provided guidance for modelling the edge pedestal in H/He operations. The availability of ITER HCD schemes, ranges of achievable plasma density and profile shape, and estimation of the edge pedestal parameters for H/He plasmas have been integrated into various time-dependent tokamak discharge simulations. In this paper, various H/He scenarios at a wide range of plasma current (7.5–15 MA) and field (2.65–5.3 T) have been developed for the ITER non-activation phase operation, and the sensitivity of the developed scenarios to the used assumptions has been investigated to provide guidance for further development.

  5. Quantum error correcting codes and 4-dimensional arithmetic hyperbolic manifolds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guth, Larry, E-mail: lguth@math.mit.edu; Lubotzky, Alexander, E-mail: alex.lubotzky@mail.huji.ac.il

    2014-08-15

    Using 4-dimensional arithmetic hyperbolic manifolds, we construct some new homological quantum error correcting codes. They are low density parity check codes with linear rate and distance n{sup ε}. Their rate is evaluated via Euler characteristic arguments and their distance using Z{sub 2}-systolic geometry. This construction answers a question of Zémor [“On Cayley graphs, surface codes, and the limits of homological coding for quantum error correction,” in Proceedings of Second International Workshop on Coding and Cryptology (IWCC), Lecture Notes in Computer Science Vol. 5557 (2009), pp. 259–273], who asked whether homological codes with such parameters could exist at all.

  6. FSCATT: Angular Dependence and Filter Options.

    DTIC Science & Technology

    The input routines to the code have been completely rewritten to allow for a free-form input format. The input routines now provide self-consistency checks and diagnostics for the user’s edification.

  7. 40 CFR 85.2207 - On-board diagnostics test standards.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Code Definitions, (MAR92). This incorporation by reference was approved by the Director of the Federal... Society of Automotive Engineers, Inc., 400 Commonwealth Drive, Warrendale, PA 15096-0001. Copies may be...

  8. CXSFIT Code Application to Process Charge-Exchange Recombination Spectroscopy Data at the T-10 Tokamak

    NASA Astrophysics Data System (ADS)

    Serov, S. V.; Tugarinov, S. N.; Klyuchnikov, L. A.; Krupin, V. A.; von Hellermann, M.

    2017-12-01

    The applicability of the CXSFIT code to process experimental data from Charge-eXchange Recombination Spectroscopy (CXRS) diagnostics at the T-10 tokamak is studied with a view to its further use for processing experimental data at the ITER facility. The design and operating principle of the CXRS diagnostics are described. The main methods for processing the CXRS spectra of the 5291-Å line of C5+ ions at the T-10 tokamak (with and without subtraction of parasitic emission from the edge plasma) are analyzed. The method of averaging the CXRS spectra over several shots, which is used at the T-10 tokamak to increase the signal-to-noise ratio, is described. The approximation of the spectrum by a set of Gaussian components is used to identify the active CXRS line in the measured spectrum. Using the CXSFIT code, the ion temperature in ohmic discharges and discharges with auxiliary electron cyclotron resonance heating (ECRH) at the T-10 tokamak is calculated from the CXRS spectra of the 5291-Å line. The time behavior of the ion temperature profile in different ohmic heating modes is studied. The temperature profile dependence on the ECRH power is measured, and the dynamics of ECR removal of carbon nuclei from the T-10 plasma is described. Experimental data from the CXRS diagnostics at T-10 substantially contribute to the implementation of physical programs of studies on heat and particle transport in tokamak plasmas and investigation of geodesic acoustic mode properties.
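    A minimal sketch of the "sum of Gaussian components" decomposition mentioned above, used to separate an active line from passive emission and background. The synthetic spectrum, wavelength grid, and starting values are invented for illustration and have nothing to do with the actual T-10 calibration or the CXSFIT internals.

      import numpy as np
      from scipy.optimize import curve_fit

      def two_gaussians(x, a1, mu1, s1, a2, mu2, s2, bg):
          return (a1 * np.exp(-0.5 * ((x - mu1) / s1) ** 2)
                  + a2 * np.exp(-0.5 * ((x - mu2) / s2) ** 2) + bg)

      wl = np.linspace(5286.0, 5296.0, 200)                     # wavelength grid [A]
      truth = two_gaussians(wl, 100.0, 5291.0, 0.8, 40.0, 5290.5, 0.3, 5.0)
      spectrum = truth + np.random.default_rng(0).normal(0.0, 2.0, wl.size)

      p0 = [80.0, 5291.0, 1.0, 30.0, 5290.5, 0.4, 0.0]          # initial guesses
      popt, _ = curve_fit(two_gaussians, wl, spectrum, p0=p0)

      # the broad component stands in for the active CX line; its width relates to T_i
      a1, mu1, s1 = popt[0:3]
      print(f"fitted active-line centre {mu1:.2f} A, Gaussian sigma {s1:.2f} A")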

  9. JDFTx: Software for joint density-functional theory

    DOE PAGES

    Sundararaman, Ravishankar; Letchworth-Weaver, Kendra; Schwarz, Kathleen A.; ...

    2017-11-14

    Density-functional theory (DFT) has revolutionized computational prediction of atomic-scale properties from first principles in physics, chemistry and materials science. Continuing development of new methods is necessary for accurate predictions of new classes of materials and properties, and for connecting to nano- and mesoscale properties using coarse-grained theories. JDFTx is a fully-featured open-source electronic DFT software designed specifically to facilitate rapid development of new theories, models and algorithms. Using an algebraic formulation as an abstraction layer, compact C++11 code automatically performs well on diverse hardware including GPUs (Graphics Processing Units). This code hosts the development of joint density-functional theory (JDFT) that combines electronic DFT with classical DFT and continuum models of liquids for first-principles calculations of solvated and electrochemical systems. In addition, the modular nature of the code makes it easy to extend and interface with, facilitating the development of multi-scale toolkits that connect to ab initio calculations, e.g. photo-excited carrier dynamics combining electron and phonon calculations with electromagnetic simulations.

  10. Development code for sensitivity and uncertainty analysis of input on the MCNPX for neutronic calculation in PWR core

    NASA Astrophysics Data System (ADS)

    Hartini, Entin; Andiwijayakusuma, Dinan

    2014-09-01

    This research concerns the development of a code for uncertainty analysis based on a statistical approach to assessing uncertain input parameters. In the burn-up calculation of the fuel, the uncertainty analysis is performed for the input parameters fuel density, coolant density, and fuel temperature. The calculation is performed during irradiation using the Monte Carlo N-Particle Transport code. The uncertainty method is based on probability density functions. The developed code is a Python script that couples with MCNPX for criticality and burn-up calculations. The simulation models the PWR core geometry with MCNPX at a power of 54 MW with UO2 pellet fuel. The calculation uses the continuous-energy cross-section data library ENDF/B-VI. MCNPX requires nuclear data in ACE format, so interfaces were developed to obtain ENDF nuclear data in ACE format through dedicated NJOY processing for temperature changes over a certain range.
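    A hedged sketch of the statistical sampling step described above: draw the uncertain inputs from assumed probability density functions and write one perturbed input deck per sample. The nominal values, uncertainties, and deck template are placeholders, and no actual MCNPX run is performed here.

      import numpy as np

      rng = np.random.default_rng(42)
      n_samples = 10

      nominal = {"fuel_density": 10.4,       # g/cm^3  (illustrative)
                 "coolant_density": 0.72,    # g/cm^3  (illustrative)
                 "fuel_temperature": 900.0}  # K       (illustrative)
      rel_sigma = 0.02                       # assumed 2% relative standard deviation

      decks = []
      for i in range(n_samples):
          sample = {k: rng.normal(v, rel_sigma * v) for k, v in nominal.items()}
          deck = (f"c perturbed MCNPX input {i}\n"
                  f"c fuel density      {sample['fuel_density']:.4f} g/cm3\n"
                  f"c coolant density   {sample['coolant_density']:.4f} g/cm3\n"
                  f"c fuel temperature  {sample['fuel_temperature']:.1f} K\n")
          decks.append(deck)

      print(decks[0])
      # each deck would then be run through MCNPX and the spread of k-eff collected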

  11. JDFTx: Software for joint density-functional theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sundararaman, Ravishankar; Letchworth-Weaver, Kendra; Schwarz, Kathleen A.

    Density-functional theory (DFT) has revolutionized computational prediction of atomic-scale properties from first principles in physics, chemistry and materials science. Continuing development of new methods is necessary for accurate predictions of new classes of materials and properties, and for connecting to nano- and mesoscale properties using coarse-grained theories. JDFTx is a fully-featured open-source electronic DFT software designed specifically to facilitate rapid development of new theories, models and algorithms. Using an algebraic formulation as an abstraction layer, compact C++11 code automatically performs well on diverse hardware including GPUs (Graphics Processing Units). This code hosts the development of joint density-functional theory (JDFT) that combines electronic DFT with classical DFT and continuum models of liquids for first-principles calculations of solvated and electrochemical systems. In addition, the modular nature of the code makes it easy to extend and interface with, facilitating the development of multi-scale toolkits that connect to ab initio calculations, e.g. photo-excited carrier dynamics combining electron and phonon calculations with electromagnetic simulations.

  12. Development code for sensitivity and uncertainty analysis of input on the MCNPX for neutronic calculation in PWR core

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hartini, Entin, E-mail: entin@batan.go.id; Andiwijayakusuma, Dinan, E-mail: entin@batan.go.id

    2014-09-30

    This research concerns the development of a code for uncertainty analysis based on a statistical approach to assessing uncertain input parameters. In the burn-up calculation of the fuel, the uncertainty analysis is performed for the input parameters fuel density, coolant density, and fuel temperature. The calculation is performed during irradiation using the Monte Carlo N-Particle Transport code. The uncertainty method is based on probability density functions. The developed code is a Python script that couples with MCNPX for criticality and burn-up calculations. The simulation models the PWR core geometry with MCNPX at a power of 54 MW with UO2 pellet fuel. The calculation uses the continuous-energy cross-section data library ENDF/B-VI. MCNPX requires nuclear data in ACE format, so interfaces were developed to obtain ENDF nuclear data in ACE format through dedicated NJOY processing for temperature changes over a certain range.

  13. Development and validation of a low-frequency modeling code for high-moment transmitter rod antennas

    NASA Astrophysics Data System (ADS)

    Jordan, Jared Williams; Sternberg, Ben K.; Dvorak, Steven L.

    2009-12-01

    The goal of this research is to develop and validate a low-frequency modeling code for high-moment transmitter rod antennas to aid in the design of future low-frequency TX antennas with high magnetic moments. To accomplish this goal, a quasi-static modeling algorithm was developed to simulate finite-length, permeable-core, rod antennas. This quasi-static analysis is applicable for low frequencies where eddy currents are negligible, and it can handle solid or hollow cores with winding insulation thickness between the antenna's windings and its core. The theory was programmed in Matlab, and the modeling code has the ability to predict the TX antenna's gain, maximum magnetic moment, saturation current, series inductance, and core series loss resistance, provided the user enters the corresponding complex permeability for the desired core magnetic flux density. In order to utilize the linear modeling code to model the effects of nonlinear core materials, it is necessary to use the correct complex permeability for a specific core magnetic flux density. In order to test the modeling code, we demonstrated that it can accurately predict changes in the electrical parameters associated with variations in the rod length and the core thickness for antennas made out of low carbon steel wire. These tests demonstrate that the modeling code was successful in predicting the changes in the rod antenna characteristics under high-current nonlinear conditions due to changes in the physical dimensions of the rod provided that the flux density in the core was held constant in order to keep the complex permeability from changing.
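    Rough, textbook-level estimates of two of the quantities the modeling code predicts (magnetic moment and series inductance), shown only for orientation: the rod is approximated as a long prolate spheroid to obtain a demagnetizing factor and an apparent permeability. All dimensions, the relative permeability, and the winding parameters are assumed example values, and the simple inductance formula ignores flux leakage.

      import numpy as np

      mu0 = 4.0e-7 * np.pi

      length = 1.0          # rod length [m]            (assumed)
      diameter = 0.02       # rod diameter [m]          (assumed)
      mu_r = 2000.0         # core relative permeability at the operating flux density (assumed)
      n_turns = 500         # winding turns             (assumed)
      current = 2.0         # drive current [A]         (assumed)

      aspect = length / diameter
      area = np.pi * (diameter / 2.0) ** 2

      # long-prolate-spheroid demagnetizing factor and the resulting apparent permeability
      N_d = (np.log(2.0 * aspect) - 1.0) / aspect**2
      mu_app = mu_r / (1.0 + N_d * (mu_r - 1.0))

      moment = mu_app * n_turns * current * area               # magnetic moment [A m^2]
      inductance = mu_app * mu0 * n_turns**2 * area / length   # crude series inductance [H]

      print(f"apparent permeability ~ {mu_app:.0f}")
      print(f"magnetic moment       ~ {moment:.1f} A m^2")
      print(f"series inductance     ~ {inductance*1e3:.1f} mH")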

  14. Study of ablation and implosion stages in wire arrays using coupled ultraviolet and X-ray probing diagnostics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, A. A.; Ivanov, V. V.; Astanovitskiy, A. L.

    2015-11-15

    Star and cylindrical wire arrays were studied using laser probing and X-ray radiography at the 1-MA Zebra pulse power generator at the University of Nevada, Reno. The Leopard laser provided backlighting, producing a laser plasma from a Si target which emitted an X-ray probing pulse at the wavelength of 6.65 Å. A spherically bent quartz crystal imaged the backlit wires onto X-ray film. Laser probing diagnostics at the wavelength of 266 nm included a 3-channel polarimeter for Faraday rotation diagnostic and two-frame laser interferometry with two shearing interferometers to study the evolution of the plasma electron density at the ablation and implosionmore » stages. Dynamics of the plasma density profile in Al wire arrays at the ablation stage were directly studied with interferometry, and expansion of wire cores was measured with X-ray radiography. The magnetic field in the imploding plasma was measured with the Faraday rotation diagnostic, and current was reconstructed.« less

  15. Rotational and vibrational Raman spectroscopy for thermochemistry measurements in supersonic flames

    NASA Astrophysics Data System (ADS)

    Bayeh, Alexander Christian

    High speed chemically reacting flows are important in a variety of aerospace applications, namely ramjets, scramjets, afterburners, and rocket exhausts. To study flame extinction under similar high Mach number conditions, we need access to thermochemistry measurements in supersonic environments. In the current work a two-stage miniaturized combustor has been designed that can produce open supersonic methane-air flames amenable to laser diagnostics. The first stage is a vitiation burner, and was inspired by well-known principles of jet combustors. We explored the salient parameters of operation experimentally, and verified flame holding computationally using a well-stirred reactor model. The second stage of the burner generates an external supersonic flame, operating in premixed and partially premixed modes. The very high Mach numbers present in the supersonic flames should provide a useful test bed for the examination of flame suppression and extinction using laser diagnostics. We also present the development of new line imaging diagnostics for thermochemistry measurements in high speed flows. A novel combination of vibrational and rotational Raman scattering is used to measure major species densities (O2, N2, CH4, H2O, CO2, CO, and H2) and temperature. Temperature is determined by the rotational Raman technique by comparing measured rotational spectra to simulated spectra based on the measured chemical composition. Pressure is calculated from density and temperature measurements through the ideal gas law. The independent assessment of density and temperature allows for measurements in environments where the pressure is not known a priori. In the present study we applied the diagnostics to laboratory scale supersonic air and vitiation jets, and examined the feasibility of such measurements in reacting supersonic flames. Results of full thermochemistry were obtained for the air and vitiation jets that reveal the expected structure of an under-expanded jet. Centerline traces of density, temperature, and pressure of the air jet agree well with computations, while measurements of chemical composition for the vitiation flow also agree well with predicted equilibrium values. Finally, we apply the new diagnostics to the exhaust of the developed burner, and show the first ever results for density, temperature, and pressure, as well as chemical composition in a supersonic flame.
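    The pressure retrieval described above is simply the ideal gas law applied to the measured total number density and temperature, p = n k_B T. The numbers below are arbitrary examples, not measurements from the burner.

      k_B = 1.380649e-23          # Boltzmann constant [J/K]

      n_total = 2.5e25            # measured total number density [m^-3]  (example)
      T = 1800.0                  # measured temperature [K]              (example)

      p = n_total * k_B * T       # pressure [Pa]
      print(f"p = {p/1e3:.1f} kPa = {p/101325:.2f} atm")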

  16. Quantum image pseudocolor coding based on the density-stratified method

    NASA Astrophysics Data System (ADS)

    Jiang, Nan; Wu, Wenya; Wang, Luo; Zhao, Na

    2015-05-01

    Pseudocolor processing is a branch of image enhancement. It dyes grayscale images to color images to make the images more beautiful or to highlight some parts of the images. This paper proposes a quantum image pseudocolor coding scheme based on the density-stratified method, which defines a colormap and changes the density values from gray to color in parallel according to the colormap. Firstly, two data structures, the quantum image GQIR and the quantum colormap QCR, are reviewed or proposed. Then, the quantum density-stratified algorithm is presented. Based on them, the quantum realization in the form of circuits is given. The main advantages of the quantum version for pseudocolor processing over the classical approach are that it needs less memory and can speed up the computation. Two kinds of examples help us to describe the scheme further. Finally, future work is analyzed.
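    A classical (non-quantum) sketch of density stratification: gray levels are binned into strata and each stratum is dyed with one colormap entry. The colormap and thresholds are arbitrary; this only mirrors the basic idea, not the quantum circuit construction in the paper.

      import numpy as np

      def pseudocolor(gray, thresholds, colormap):
          """Map a grayscale image (0-255) to RGB by stratifying the density values."""
          strata = np.digitize(gray, thresholds)        # stratum index per pixel
          return colormap[strata]                       # one RGB triple per stratum

      thresholds = np.array([64, 128, 192])             # three cuts -> four strata
      colormap = np.array([[0, 0, 255],                 # darkest stratum -> blue
                           [0, 255, 0],                 #                 -> green
                           [255, 255, 0],               #                 -> yellow
                           [255, 0, 0]], dtype=np.uint8)  # brightest       -> red

      gray = np.array([[10, 70], [150, 230]], dtype=np.uint8)
      print(pseudocolor(gray, thresholds, colormap))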

  17. Varying impacts of alcohol outlet densities on violent assaults: explaining differences across neighborhoods.

    PubMed

    Mair, Christina; Gruenewald, Paul J; Ponicki, William R; Remer, Lillian

    2013-01-01

    Groups of potentially violent drinkers may frequent areas of communities with large numbers of alcohol outlets, especially bars, leading to greater rates of alcohol-related assaults. This study assessed direct and moderating effects of bar densities on assaults across neighborhoods. We analyzed longitudinal population data relating alcohol outlet densities (total outlet density, proportion bars/pubs, proportion off-premise outlets) to hospitalizations for assault injuries in California across residential ZIP code areas from 1995 through 2008 (23,213 space-time units). Because few ZIP codes were consistently defined over 14 years and these units are not independent, corrections for unit misalignment and spatial autocorrelation were implemented using Bayesian space-time conditional autoregressive models. Assaults were related to outlet densities in local and surrounding areas, the mix of outlet types, and neighborhood characteristics. The addition of one outlet per square mile was related to a small 0.23% increase in assaults. A 10% greater proportion of bars in a ZIP code was related to 7.5% greater assaults, whereas a 10% greater proportion of bars in surrounding areas was related to 6.2% greater assaults. The impacts of bars were much greater in areas with low incomes and dense populations. The effect of bar density on assault injuries was well supported and positive, and the magnitude of the effect varied by neighborhood characteristics. Posterior distributions from these models enabled the identification of locations most vulnerable to problems related to alcohol outlets.

  18. Information theoretical assessment of image gathering and coding for digital restoration

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.; John, Sarah; Reichenbach, Stephen E.

    1990-01-01

    The process of image-gathering, coding, and restoration is presently treated in its entirety rather than as a catenation of isolated tasks, on the basis of the relationship between the spectral information density of a transmitted signal and the restorability of images from the signal. This 'information-theoretic' assessment accounts for the information density and efficiency of the acquired signal as a function of the image-gathering system's design and radiance-field statistics, as well as for the information efficiency and data compression that are obtainable through the combination of image gathering with coding to reduce signal redundancy. It is found that high information efficiency is achievable only through minimization of image-gathering degradation as well as signal redundancy.

  19. Variable Coded Modulation software simulation

    NASA Astrophysics Data System (ADS)

    Sielicki, Thomas A.; Hamkins, Jon; Thorsen, Denise

    This paper reports on the design and performance of a new Variable Coded Modulation (VCM) system. This VCM system comprises eight of NASA's recommended codes from the Consultative Committee for Space Data Systems (CCSDS) standards, including four turbo and four AR4JA/C2 low-density parity-check codes, together with six modulation types (BPSK, QPSK, 8-PSK, 16-APSK, 32-APSK, 64-APSK). The signaling protocol for the transmission mode is based on a CCSDS recommendation. The coded modulation may be dynamically chosen, block to block, to optimize throughput.
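
    The block-to-block selection logic can be pictured with a small hedged sketch: pick the highest-throughput combination of code rate and bits per symbol whose required signal-to-noise ratio is met. The mode table and thresholds below are illustrative placeholders, not CCSDS performance numbers.

        # Hedged sketch of per-block VCM mode selection (thresholds are made up).
        MODES = [
            # (name, code_rate, bits_per_symbol, required_Es_N0_dB)
            ("turbo-1/2 + BPSK",   1/2, 1,  1.0),
            ("LDPC-1/2 + QPSK",    1/2, 2,  4.0),
            ("LDPC-2/3 + 8-PSK",   2/3, 3,  9.0),
            ("LDPC-4/5 + 16-APSK", 4/5, 4, 13.0),
        ]

        def select_mode(es_n0_dB):
            """Return the feasible mode with the most information bits per symbol."""
            feasible = [m for m in MODES if es_n0_dB >= m[3]]
            if not feasible:
                return MODES[0]  # fall back to the most robust mode
            return max(feasible, key=lambda m: m[1] * m[2])

        print(select_mode(10.0))  # -> ('LDPC-2/3 + 8-PSK', ...)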

  20. Product code optimization for determinate state LDPC decoding in robust image transmission.

    PubMed

    Thomos, Nikolaos; Boulgouris, Nikolaos V; Strintzis, Michael G

    2006-08-01

    We propose a novel scheme for error-resilient image transmission. The proposed scheme employs a product coder consisting of low-density parity check (LDPC) codes and Reed-Solomon codes in order to deal effectively with bit errors. The efficiency of the proposed scheme is based on the exploitation of determinate symbols in Tanner graph decoding of LDPC codes and a novel product code optimization technique based on error estimation. Experimental evaluation demonstrates the superiority of the proposed system in comparison to recent state-of-the-art techniques for image transmission.

  1. Rate-Compatible Protograph LDPC Codes

    NASA Technical Reports Server (NTRS)

    Nguyen, Thuy V. (Inventor); Nosratinia, Aria (Inventor); Divsalar, Dariush (Inventor)

    2014-01-01

    Digital communication coding methods resulting in rate-compatible low density parity-check (LDPC) codes built from protographs. Described digital coding methods start with a desired code rate and a selection of the numbers of variable nodes and check nodes to be used in the protograph. Constraints are set to satisfy a linear minimum distance growth property for the protograph. All possible edges in the graph are searched for the minimum iterative decoding threshold and the protograph with the lowest iterative decoding threshold is selected. Protographs designed in this manner are used in decode and forward relay channels.
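
    The construction described here fixes a small protograph and then expands ("lifts") it into a full code. A hedged sketch of the copy-and-permute lifting step is shown below; the 2x4 base matrix is purely illustrative and is not one of the AR4JA protographs, and parallel edges are not handled.

        import numpy as np

        def lift_protograph(base, Z, rng=np.random.default_rng(0)):
            """Copy-and-permute lifting: every '1' in the protograph base matrix is
            replaced by a random Z x Z permutation matrix and every '0' by a zero
            block, giving a (rows*Z) x (cols*Z) binary parity-check matrix."""
            rows, cols = base.shape
            H = np.zeros((rows * Z, cols * Z), dtype=np.uint8)
            for i in range(rows):
                for j in range(cols):
                    if base[i, j]:
                        perm = np.eye(Z, dtype=np.uint8)[rng.permutation(Z)]
                        H[i*Z:(i+1)*Z, j*Z:(j+1)*Z] = perm
            return H

        # Illustrative 2-check x 4-variable protograph (rate ~ 1/2), lifted by Z = 8:
        base = np.array([[1, 1, 1, 0],
                         [0, 1, 1, 1]])
        print(lift_protograph(base, Z=8).shape)  # (16, 32)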

  2. Prospects for Nonlinear Laser Diagnostics in the Jet Noise Laboratory

    NASA Technical Reports Server (NTRS)

    Herring, Gregory C.; Hart, Roger C.; Fletcher, Mark T.; Balla, R. Jeffrey; Henderson, Brenda S.

    2007-01-01

    Two experiments were conducted to test whether optical methods, which rely on laser beam coherence, would be viable for off-body flow measurement in high-density, compressible-flow wind tunnels. These tests measured the effects of large, unsteady density gradients on laser diagnostics like laser-induced thermal acoustics (LITA). The first test was performed in the Low Speed Aeroacoustics Wind Tunnel (LSAWT) of NASA Langley Research Center's Jet Noise Laboratory (JNL). This flow facility consists of a dual-stream jet engine simulator (with electric heat and propane burners) exhausting into a simulated flight stream, reaching Mach numbers up to 0.32. A laser beam transited the LSAWT flow field and was imaged with a high-speed gated camera to measure beam steering and transverse mode distortion. A second, independent test was performed on a smaller laboratory jet (Mach number < 1.2 and mass flow rate < 0.1 kg/sec). In this test, time-averaged LITA velocimetry and thermometry were performed at the jet exit plane, where the effect of unsteady density gradients is observed on the LITA signal. Both experiments show that LITA (and other diagnostics relying on beam overlap or coherence) faces significant hurdles in the high-density, compressible, and turbulent flow environments similar to those of the JNL.

  3. Performance of Low-Density Parity-Check Coded Modulation

    NASA Astrophysics Data System (ADS)

    Hamkins, J.

    2011-02-01

    This article presents the simulated performance of a family of nine AR4JA low-density parity-check (LDPC) codes when used with each of five modulations. In each case, the decoder inputs are code-bit log-likelihood ratios computed from the received (noisy) modulation symbols using a general formula which applies to arbitrary modulations. Suboptimal soft-decision and hard-decision demodulators are also explored. Bit-interleaving and various mappings of bits to modulation symbols are considered. A number of subtle decoder algorithm details are shown to affect performance, especially in the error floor region. Among these are quantization dynamic range and step size, clipping of degree-one variable nodes, "Jones clipping" of variable nodes, approximations of the min* function, and partial hard-limiting of messages from check nodes. Using these decoder optimizations, all coded modulations simulated here are free of error floors down to codeword error rates below 10^-6. The purpose of generating this performance data is to aid system engineers in determining an appropriate code and modulation to use under specific power and bandwidth constraints, and to provide information needed to design a variable/adaptive coded modulation (VCM/ACM) system using the AR4JA codes.
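
    The "general formula which applies to arbitrary modulations" is, in essence, a per-bit log-likelihood ratio computed over the constellation points whose labels carry a 0 or a 1 in that position. The sketch below uses the common max-log approximation for an AWGN channel; the QPSK constellation and labeling are illustrative and are not taken from the article.

        import numpy as np

        def bit_llrs_maxlog(r, constellation, labels, noise_var):
            """Max-log per-bit LLRs for one received complex symbol r.
            constellation: complex points; labels: bit tuples of equal length;
            noise_var: complex-noise variance. The exact LLR uses a log-sum-exp
            over each bit set; max-log keeps only the nearest point per set."""
            metrics = -np.abs(r - constellation)**2 / noise_var
            nbits = len(labels[0])
            llrs = []
            for b in range(nbits):
                m0 = max(m for m, lab in zip(metrics, labels) if lab[b] == 0)
                m1 = max(m for m, lab in zip(metrics, labels) if lab[b] == 1)
                llrs.append(m0 - m1)  # log P(bit=0 | r) / P(bit=1 | r), max-log
            return llrs

        # Illustrative Gray-labeled QPSK:
        const = np.array([1+1j, -1+1j, -1-1j, 1-1j]) / np.sqrt(2)
        labs = [(0, 0), (0, 1), (1, 1), (1, 0)]
        print(bit_llrs_maxlog(0.9+0.8j, const, labs, noise_var=0.5))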

  4. Direct G-code manipulation for 3D material weaving

    NASA Astrophysics Data System (ADS)

    Koda, S.; Tanaka, H.

    2017-04-01

    The process of conventional 3D printing begins by first building a 3D model, converting the model to G-code via slicer software, feeding the G-code to the printer, and finally starting the print. The simplest and most popular 3D printing technique is Fused Deposition Modeling. However, in this method, the printing path that the printer head can take is restricted by the G-code. Therefore, printed 3D models with complex patterns have structural errors such as holes or gaps between the printed material lines. In addition, the structural density and the material's position in the printed model are difficult to control. We realized a G-code editing method, Fabrix, for making more precise and functional printed models with both single and multiple materials. Models with different stiffness are fabricated by controlling the printing density of the filament materials with our method. In addition, multi-material 3D printing has the possibility to expand the physical properties through material combinations and their G-code editing. These results show that the new printing method provides more creative and functional 3D printing techniques.
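
    The kind of direct manipulation involved can be sketched generically (this is not the Fabrix implementation itself): the snippet below rescales the E (extrusion) value on G1 moves, which changes how densely material is deposited along the existing tool paths.

        import re

        def scale_extrusion(gcode_lines, factor):
            """Generic illustration of direct G-code editing: multiply the E
            (extrusion) value of every G1 move by `factor` to change the
            deposition density along the existing printing paths."""
            out = []
            for line in gcode_lines:
                if line.startswith("G1") and " E" in line:
                    line = re.sub(r"E(-?\d+\.?\d*)",
                                  lambda m: "E{:.5f}".format(float(m.group(1)) * factor),
                                  line)
                out.append(line)
            return out

        sample = ["G1 X10.0 Y5.0 E0.42 F1800", "G0 X0 Y0"]
        print(scale_extrusion(sample, 1.2))  # E0.42 -> E0.50400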

  5. Study of edge turbulence in dimensionally similar laboratory plasmas

    NASA Astrophysics Data System (ADS)

    Stroth, Ulrich

    2003-10-01

    In recent years, the numerical simulation of turbulence has made considerable progress. Predictions are made for large plasma volumes taking into account realistic magnetic geometries. Because of diagnostic limitations, the means of experimentally testing the models in fusion plasmas are rather limited. Toroidal low-temperature plasmas offer the possibility for detailed comparisons between experiment and simulation. Due to the reduced plasma parameters, the relevant quantities can be measured in the entire plasma. At the same time, the relevant non-dimensional parameters can be comparable to those in the edge of fusion plasmas. This presentation reports on results from the torsatron TJ-K [1,2] operated with a low-temperature plasma. The data are compared with simulations using the drift-Alfvén-wave code DALF3 [3]. Langmuir probe arrays with 64 tips are used to measure the spatial structure of the turbulence. The same analysis techniques are applied to experimental and numerical data. The measured properties of spectra and probability density functions are reproduced by the code. Although the plasma in experiment and simulation does not exhibit critical pressure gradients, the radial transport fluctuations are strongly intermittent in both cases. Using hydrogen, helium, and argon as working gases, the scale parameter ρ_s could be varied by more than a factor of ten. As predicted by theory, the size of the turbulent eddies increases with ρ_s. The measured cross-phase between density and potential fluctuations is small, indicating the importance of drift-wave dynamics for the turbulence in toroidal plasmas. The wave number spectra decay with an exponent of -3, as one would expect for the enstrophy cascade in 2D turbulence. [1] N. Krause et al., Rev. Sci. Instrum. 73, 3474 (2002) [2] C. Lechte et al., New J. Phys. 4, 34 (2002) [3] B. Scott, Plasma Phys. Control. Fusion 39, 1635 (1997)
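
    As an illustration of the cross-phase analysis mentioned above, the hedged sketch below estimates the phase of the cross-spectral density between two fluctuation time series; the synthetic signals and sampling rate are made up for demonstration.

        import numpy as np
        from scipy.signal import csd

        def cross_phase(n_tilde, phi_tilde, fs):
            """Cross-phase between density and potential fluctuation time series,
            estimated from the Welch-averaged cross-spectral density."""
            f, Pxy = csd(n_tilde, phi_tilde, fs=fs, nperseg=1024)
            return f, np.angle(Pxy)  # radians, per frequency bin

        # Synthetic test: two 10 kHz signals with a known 0.3 rad offset.
        fs = 1e6
        t = np.arange(2**16) / fs
        n = np.sin(2*np.pi*10e3*t) + 0.1*np.random.randn(t.size)
        phi = np.sin(2*np.pi*10e3*t - 0.3) + 0.1*np.random.randn(t.size)
        f, alpha = cross_phase(n, phi, fs)
        print(alpha[np.argmin(np.abs(f - 10e3))])  # ~0.3 rad (sign per csd convention)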

  6. Fishing diseased abalone to promote yield and conservation

    PubMed Central

    Ben-Horin, Tal; Bidegain, Gorka; Lenihan, Hunter S.

    2016-01-01

    Past theoretical models suggest fishing disease-impacted stocks can reduce parasite transmission, but this is a good management strategy only when the exploitation required to reduce transmission does not overfish the stock. We applied this concept to a red abalone fishery so impacted by an infectious disease (withering syndrome) that stock densities plummeted and managers closed the fishery. In addition to the non-selective fishing strategy considered by past disease-fishing models, we modelled targeting (culling) infected individuals, which is plausible in red abalone because modern diagnostic tools can determine infection without harming landed abalone and the diagnostic cost is minor relative to the catch value. The non-selective abalone fishing required to eradicate parasites exceeded thresholds for abalone sustainability, but targeting infected abalone allowed the fishery to generate yield and reduce parasite prevalence while maintaining stock densities at or above the densities attainable if the population was closed to fishing. The effect was strong enough that stock and yield increased even when the catch was one-third uninfected abalone. These results could apply to other fisheries as the diagnostic costs decline relative to catch value. PMID:26880843
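
    The modelling argument can be illustrated with a deliberately minimal susceptible-infected sketch (this is not the authors' abalone model): harvesting only infected hosts lowers prevalence while the susceptible stock is maintained. All parameter values are invented for demonstration.

        import numpy as np
        from scipy.integrate import odeint

        def si_with_culling(y, t, r, K, beta, mu, f_inf):
            """Toy SI host model: logistic recruitment of susceptibles S,
            transmission to infecteds I, disease mortality mu, and a harvest
            rate f_inf applied to infected individuals only."""
            S, I = y
            dS = r * S * (1 - (S + I) / K) - beta * S * I
            dI = beta * S * I - mu * I - f_inf * I
            return [dS, dI]

        t = np.linspace(0, 50, 500)
        no_cull = odeint(si_with_culling, [0.8, 0.05], t, args=(0.5, 1.0, 1.2, 0.2, 0.0))
        culled  = odeint(si_with_culling, [0.8, 0.05], t, args=(0.5, 1.0, 1.2, 0.2, 0.5))
        print(no_cull[-1], culled[-1])  # culling ends with fewer infecteds and more stock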

  7. Fishing diseased abalone to promote yield and conservation.

    PubMed

    Ben-Horin, Tal; Lafferty, Kevin D; Bidegain, Gorka; Lenihan, Hunter S

    2016-03-05

    Past theoretical models suggest fishing disease-impacted stocks can reduce parasite transmission, but this is a good management strategy only when the exploitation required to reduce transmission does not overfish the stock. We applied this concept to a red abalone fishery so impacted by an infectious disease (withering syndrome) that stock densities plummeted and managers closed the fishery. In addition to the non-selective fishing strategy considered by past disease-fishing models, we modelled targeting (culling) infected individuals, which is plausible in red abalone because modern diagnostic tools can determine infection without harming landed abalone and the diagnostic cost is minor relative to the catch value. The non-selective abalone fishing required to eradicate parasites exceeded thresholds for abalone sustainability, but targeting infected abalone allowed the fishery to generate yield and reduce parasite prevalence while maintaining stock densities at or above the densities attainable if the population was closed to fishing. The effect was strong enough that stock and yield increased even when the catch was one-third uninfected abalone. These results could apply to other fisheries as the diagnostic costs decline relative to catch value. © 2016 The Author(s).

  8. Fishing diseased abalone to promote yield and conservation

    USGS Publications Warehouse

    Ben-Horin, Tal; Lafferty, Kevin D.; Bidegain, Gorka; Lenihan, Hunter S.

    2016-01-01

    Past theoretical models suggest fishing disease-impacted stocks can reduce parasite transmission, but this is a good management strategy only when the exploitation required to reduce transmission does not overfish the stock. We applied this concept to a red abalone fishery so impacted by an infectious disease (withering syndrome) that stock densities plummeted and managers closed the fishery. In addition to the non-selective fishing strategy considered by past disease-fishing models, we modelled targeting (culling) infected individuals, which is plausible in red abalone because modern diagnostic tools can determine infection without harming landed abalone and the diagnostic cost is minor relative to the catch value. The non-selective abalone fishing required to eradicate parasites exceeded thresholds for abalone sustainability, but targeting infected abalone allowed the fishery to generate yield and reduce parasite prevalence while maintaining stock densities at or above the densities attainable if the population was closed to fishing. The effect was strong enough that stock and yield increased even when the catch was one-third uninfected abalone. These results could apply to other fisheries as the diagnostic costs decline relative to catch value.

  9. Quantum-dot-tagged microbeads for multiplexed optical coding of biomolecules.

    PubMed

    Han, M; Gao, X; Su, J Z; Nie, S

    2001-07-01

    Multicolor optical coding for biological assays has been achieved by embedding different-sized quantum dots (zinc sulfide-capped cadmium selenide nanocrystals) into polymeric microbeads at precisely controlled ratios. Their novel optical properties (e.g., size-tunable emission and simultaneous excitation) render these highly luminescent quantum dots (QDs) ideal fluorophores for wavelength-and-intensity multiplexing. The use of 10 intensity levels and 6 colors could theoretically code one million nucleic acid or protein sequences. Imaging and spectroscopic measurements indicate that the QD-tagged beads are highly uniform and reproducible, yielding bead identification accuracies as high as 99.99% under favorable conditions. DNA hybridization studies demonstrate that the coding and target signals can be simultaneously read at the single-bead level. This spectral coding technology is expected to open new opportunities in gene expression studies, high-throughput screening, and medical diagnostics.
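
    The quoted coding capacity follows from simple counting: m colors, each at one of n intensity levels, give n^m combinations, and dropping the all-dark code leaves n^m - 1 usable barcodes (the exclusion of the all-dark code is an assumption of this check, not a statement from the abstract).

        # 10 intensity levels and 6 colors -> about one million optical codes.
        n_levels, m_colors = 10, 6
        print(n_levels**m_colors - 1)  # 999999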

  10. Diagnostic classification of macular ganglion cell and retinal nerve fiber layer analysis: differentiation of false-positives from glaucoma.

    PubMed

    Kim, Ko Eun; Jeoung, Jin Wook; Park, Ki Ho; Kim, Dong Myung; Kim, Seok Hwan

    2015-03-01

    To investigate the rate and associated factors of false-positive diagnostic classification of ganglion cell analysis (GCA) and retinal nerve fiber layer (RNFL) maps, and characteristic false-positive patterns on optical coherence tomography (OCT) deviation maps. Prospective, cross-sectional study. A total of 104 healthy eyes of 104 normal participants. All participants underwent peripapillary and macular spectral-domain (Cirrus-HD, Carl Zeiss Meditec Inc, Dublin, CA) OCT scans. False-positive diagnostic classification was defined as yellow or red color-coded areas for GCA and RNFL maps. Univariate and multivariate logistic regression analyses were used to determine associated factors. Eyes with abnormal OCT deviation maps were categorized on the basis of the shape and location of the abnormal color-coded areas. Differences in clinical characteristics among the subgroups were compared. (1) The rate and associated factors of false-positive OCT maps; (2) patterns of false-positive, color-coded areas on the GCA deviation map and associated clinical characteristics. Of the 104 healthy eyes, 42 (40.4%) and 32 (30.8%) showed abnormal diagnostic classifications on any of the GCA and RNFL maps, respectively. Multivariate analysis revealed that false-positive GCA diagnostic classification was associated with longer axial length and larger fovea-disc angle, whereas longer axial length and smaller disc area were associated with abnormal RNFL maps. Eyes with an abnormal GCA deviation map were categorized as group A (donut-shaped round area around the inner annulus), group B (island-like isolated area), and group C (diffuse, circular area with an irregular inner margin in either). The axial length showed a significant increasing trend from group A to C (P=0.001), and likewise, the refractive error was more myopic in group C than in groups A (P=0.015) and B (P=0.014). Group C had thinner average ganglion cell-inner plexiform layer thickness compared with the other groups (group A=B>C, P=0.004). Abnormal OCT diagnostic classification should be interpreted with caution, especially in eyes with long axial lengths, large fovea-disc angles, and small optic discs. Our findings suggest that the characteristic patterns of the OCT deviation map can provide useful clues to distinguish glaucomatous changes from false-positive findings. Copyright © 2015 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.

  11. Internal Carotid Artery Hypoplasia: Role of Color-Coded Carotid Duplex Sonography.

    PubMed

    Chen, Pei-Ya; Liu, Hung-Yu; Lim, Kun-Eng; Lin, Shinn-Kuang

    2015-10-01

    The purpose of this study was to determine the role of color-coded carotid duplex sonography for diagnosis of internal carotid artery hypoplasia. We retrospectively reviewed 25,000 color-coded carotid duplex sonograms in our neurosonographic database to establish more diagnostic criteria for internal carotid artery hypoplasia. A definitive diagnosis of internal carotid artery hypoplasia was made in 9 patients. Diagnostic findings on color-coded carotid duplex imaging include a long segmental small-caliber lumen (52% diameter) with markedly decreased flow (13% flow volume) in the affected internal carotid artery relative to the contralateral side but without intraluminal lesions. Indirect findings included markedly increased total flow volume (an increase of 133%) in both vertebral arteries, antegrade ipsilateral ophthalmic arterial flow, and a reduced vessel diameter with increased flow resistance in the ipsilateral common carotid artery. Ten patients with distal internal carotid artery dissection showed a similar color-coded duplex pattern, but the reductions in the internal and common carotid artery diameters and increase in collateral flow from the vertebral artery were less prominent than those in hypoplasia. The ipsilateral ophthalmic arterial flow was retrograde in 40% of patients with distal internal carotid artery dissection. In addition, thin-section axial and sagittal computed tomograms of the skull base could show the small diameter of the carotid canal in internal carotid artery hypoplasia and help distinguish hypoplasia from distal internal carotid artery dissection. Color-coded carotid duplex sonography provides important clues for establishing a diagnosis of internal carotid artery hypoplasia. A hypoplastic carotid canal can be shown by thin-section axial and sagittal skull base computed tomography to confirm the final diagnosis. © 2015 by the American Institute of Ultrasound in Medicine.

  12. An interactive toolbox for atlas-based segmentation and coding of volumetric images

    NASA Astrophysics Data System (ADS)

    Menegaz, G.; Luti, S.; Duay, V.; Thiran, J.-Ph.

    2007-03-01

    Medical imaging poses the great challenge of having compression algorithms that are lossless for diagnostic and legal reasons and yet provide high compression rates for reduced storage and transmission time. The images usually consist of a region of interest (ROI) representing the part of the body under investigation, surrounded by a "background" which is often noisy and not of diagnostic interest. In this paper, we propose an ROI-based 3D coding system integrating both the segmentation and the compression tools. The ROI is extracted by an atlas-based 3D segmentation method combining active contours with information theoretic principles, and the resulting segmentation map is exploited for ROI-based coding. The system is equipped with a GUI allowing medical doctors to supervise the segmentation process and eventually reshape the detected contours at any point. The process is initiated by the user through the selection of either one pre-defined reference image or one image of the volume to be used as the 2D "atlas". The object contour is successively propagated from one frame to the next, where it is used as the initial border estimation. In this way, the entire volume is segmented based on a unique 2D atlas. The resulting 3D segmentation map is exploited for adaptive coding of the different image regions. Two coding systems were considered: the JPEG3D standard and 3D-SPIHT. The evaluation of the performance with respect to both segmentation and coding proved the high potential of the proposed system in providing an integrated, low-cost and computationally effective solution for CAD and PACS systems.
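
    The ROI-driven coding idea can be illustrated with a deliberately simple sketch (it is not the paper's JPEG3D/3D-SPIHT pipeline): keep ROI voxels losslessly, coarsely quantize the background, and entropy-code the result.

        import numpy as np
        import zlib

        def roi_adaptive_code(volume, roi_mask, bg_shift=4):
            """Toy ROI-based coding: voxels inside the ROI are kept exactly,
            background voxels are quantized by dropping `bg_shift` bits, and the
            whole volume is then entropy-coded with zlib."""
            quantized = (volume >> bg_shift) << bg_shift
            coded = np.where(roi_mask, volume, quantized)
            return zlib.compress(coded.astype(np.uint16).tobytes(), level=9)

        vol = np.random.randint(0, 4096, size=(32, 64, 64), dtype=np.uint16)  # stand-in volume
        mask = np.zeros_like(vol, dtype=bool)
        mask[:, 16:48, 16:48] = True  # hypothetical ROI
        print(len(roi_adaptive_code(vol, mask)), "bytes vs", vol.nbytes, "raw")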

  13. Tokamak plasma high field side response to an n = 3 magnetic perturbation: a comparison of 3D equilibrium solutions from seven different codes

    NASA Astrophysics Data System (ADS)

    Reiman, A.; Ferraro, N. M.; Turnbull, A.; Park, J. K.; Cerfon, A.; Evans, T. E.; Lanctot, M. J.; Lazarus, E. A.; Liu, Y.; McFadden, G.; Monticello, D.; Suzuki, Y.

    2015-06-01

    In comparing equilibrium solutions for a DIII-D shot that is amenable to analysis by both stellarator and tokamak three-dimensional (3D) equilibrium codes, a significant disagreement has been seen between solutions of the VMEC stellarator equilibrium code and solutions of tokamak perturbative 3D equilibrium codes. The source of that disagreement has been investigated, and that investigation has led to new insights into the domain of validity of the different equilibrium calculations, and to a finding that the manner in which localized screening currents at low order rational surfaces are handled can affect global properties of the equilibrium solution. The perturbative treatment has been found to break down at surprisingly small perturbation amplitudes due to overlap of the calculated perturbed flux surfaces, and that treatment is not valid in the pedestal region of the DIII-D shot studied. The perturbative treatment is valid, however, further into the interior of the plasma, and flux surface overlap does not account for the disagreement investigated here. Calculated equilibrium solutions for simple model cases and comparison of the 3D equilibrium solutions with those of other codes indicate that the disagreement arises from a difference in handling of localized currents at low order rational surfaces, with such currents being absent in VMEC and present in the perturbative codes. The significant differences in the global equilibrium solutions associated with the presence or absence of very localized screening currents at rational surfaces suggests that it may be possible to extract information about localized currents from appropriate measurements of global equilibrium plasma properties. That would require improved diagnostic capability on the high field side of the tokamak plasma, a region difficult to access with diagnostics.

  14. Use of a microwave diagnostics technique to measure the temperature of an axisymmetric ionized gas flow

    NASA Astrophysics Data System (ADS)

    Tsel'Sov, Iu. G.; Kondrat'ev, A. S.

    1990-12-01

    A method is developed for determining the temperature of an ionized gas on the basis of electron-density sounding. This technique is used to measure the cross-sectional temperature distribution of an axisymmetric ionized gas flow using microwave diagnostics.

  15. Prenatal Genetic Testing Chart

    MedlinePlus

    ... www.acog.org/Patients/FAQs/Prenatal-Genetic-Diagnostic-Tests

  16. Implementation of an Unequal Path Length, Heterodyne Interferometer on the MOCHI LabJet Experiment

    NASA Astrophysics Data System (ADS)

    Card, Alexander Harrison

    The MOCHI LabJet experiment aims to explore the stability of magnetic flux tubes through the medium of laboratory astrophysical plasmas. The boundary conditions of large gravitational bodies, namely accretion disks, are replicated and allowed to influence a plasma over short timescales. Observation of the plasma is enabled through the use of a variety of fast diagnostics, including an unequal path length, heterodyne, quadrature phase differential interferometer, the development and implementation of which is described in detail. The LabJet gun, a triple-electrode planar plasma gun featuring azimuthally symmetric gas injection, achieves a new, long-duration, highly stabilized jet plasma formation. The line-integrated density in this new LabJet formation is found to be ne = (6 +/- 3)x10^20 [m^-2]. By observing the axial expansion rate of the jet over multiple chord locations (all perpendicular to the propagation axis), the interferometer provides an Alfvén velocity measurement of vA = 41.3 +/- 5.4 [km/s], which at the observed jet density indicates an axial magnetic field strength of Bz = 0.15 +/- 0.04 [T]. Various other laboratory components are also detailed, such as a shot-based MDSplus data storage architecture implemented in the LabVIEW experiment control code, and the production and performance of ten fast neutral gas injection valves which, when fired in unison, provide a total particle inventory of (7.8 +/- 0.6)x10^23 [HI particles].
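
    The field-strength inference quoted above can be checked with the Alfvén relation B = v_A * sqrt(mu0 * rho). The sketch below does this under two explicit assumptions not stated in the abstract: a chord length of roughly 0.1 m to convert the line-integrated density to a volume density, and a hydrogen ion mass.

        import math

        MU0 = 4e-7 * math.pi      # vacuum permeability, H/m
        M_H = 1.67e-27            # hydrogen ion mass, kg

        ne_line = 6e20            # quoted line-integrated density, m^-2
        L = 0.1                   # assumed chord length, m (not given in the abstract)
        v_A = 41.3e3              # quoted Alfven velocity, m/s

        rho = (ne_line / L) * M_H           # mass density, assuming singly ionized hydrogen
        B_z = v_A * math.sqrt(MU0 * rho)
        print(B_z)                          # ~0.15 T, consistent with the quoted field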

  17. Modeling ICF With RAGE, BHR, And The New Laser Package

    NASA Astrophysics Data System (ADS)

    Cliche, Dylan; Welser-Sherrill, Leslie; Haines, Brian; Mancini, Roberto

    2017-10-01

    Inertial Confinement Fusion (ICF) is one method used to obtain thermonuclear burn through either direct or indirect ablation of a millimeter-scale capsule with several lasers. Although progress has been made in theory, experiment, and diagnostics, the community has yet to reach ignition. One way of investigating this is through the use of high performance computer simulations of the implosion. RAGE is an advanced 1D, 2D, and 3D radiation adaptive grid Eulerian code used to simulate the hydrodynamics of a system. Due to the unstable nature of two fluids of unequal density accelerating into one another, it is important to include a turbulence model. BHR is a turbulence model which uses Reynolds-averaged Navier-Stokes (RANS) equations to model the mixing that occurs between the shell and the fusion fuel material. Until recently, it was difficult to model direct drive experiments because there was no laser energy deposition model in RAGE. Recently, a new laser energy deposition model has been implemented using the same ray tracing method as the Mazinisin laser package used at the OMEGA laser facility at the Laboratory for Laser Energetics (LLE) in Rochester, New York. Using the new laser package along with BHR for mixing allows us to more accurately simulate ICF implosions and obtain spatially and temporally resolved information (e.g., position, temperature, density, and mix concentrations) that gives insight into what is happening inside the implosion.

  18. Thin Shell Model for NIF capsule stagnation studies

    NASA Astrophysics Data System (ADS)

    Hammer, J. H.; Buchoff, M.; Brandon, S.; Field, J. E.; Gaffney, J.; Kritcher, A.; Nora, R. C.; Peterson, J. L.; Spears, B.; Springer, P. T.

    2015-11-01

    We adapt the thin shell model of Ott et al. to asymmetric ICF capsule implosions on NIF. Through much of an implosion, the shell aspect ratio is large, so the thin shell approximation is well satisfied. Asymmetric pressure drive is applied using an analytic form for ablation pressure as a function of the x-ray flux, as well as time-dependent 3D drive asymmetry from hohlraum calculations. Since deviations from a sphere are small through peak velocity, we linearize the equations, decompose them by spherical harmonics, and solve ODEs for the coefficients. The model gives the shell position, velocity, and areal mass variations at the time of peak velocity, near 250 microns radius. These variables are used to initialize 3D rad-hydro calculations with the HYDRA and ARES codes. At link time, the cold fuel shell and ablator are each characterized by a density, adiabat, and mass. The thickness, position, and velocity of each point are taken from the thin shell model. The interior of the shell is filled with a uniform gas density and temperature consistent with the (3/2)PV energy found from 1D rad-hydro calculations. 3D linked simulations compare favorably with integrated simulations of the entire implosion. By generating synthetic diagnostic data, the model offers a method for quickly testing hypothetical sources of asymmetry and comparing with experiment. Prepared by LLNL under Contract DE-AC52-07NA27344.
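
    The decomposition step can be sketched generically: project a drive asymmetry defined on the sphere onto spherical-harmonic coefficients by quadrature. The sketch below is not the thin-shell model itself, only the projection step, and the test asymmetry is invented.

        import numpy as np
        from scipy.special import sph_harm

        def sph_harm_coeffs(f_grid, theta, phi, n_max):
            """Project f(theta, phi) onto coefficients a_{n m} = integral of
            conj(Y_{n m}) * f over solid angle, by simple quadrature.
            SciPy convention: sph_harm(m, n, theta, phi), theta azimuthal in
            [0, 2*pi) and phi polar in [0, pi]."""
            TH, PH = np.meshgrid(theta, phi, indexing="ij")
            dOmega = (theta[1] - theta[0]) * (phi[1] - phi[0]) * np.sin(PH)
            return {(n, m): np.sum(np.conj(sph_harm(m, n, TH, PH)) * f_grid * dOmega)
                    for n in range(n_max + 1) for m in range(-n, n + 1)}

        # Invented test: a small P2 (Legendre) asymmetry shows up in the (2, 0) coefficient.
        theta = np.linspace(0, 2*np.pi, 128, endpoint=False)
        phi = np.linspace(1e-3, np.pi - 1e-3, 128)
        TH, PH = np.meshgrid(theta, phi, indexing="ij")
        drive = 1.0 + 0.05 * 0.5 * (3*np.cos(PH)**2 - 1)
        print(abs(sph_harm_coeffs(drive, theta, phi, 2)[(2, 0)]))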

  19. Study of laser preheating dependence on laser wavelength and intensity for MagLIF

    NASA Astrophysics Data System (ADS)

    Wei, M. S.; Harvey-Thompson, A. J.; Glinsky, M.; Nagayama, T.; Weis, M.; Geissel, M.; Peterson, K.; Fooks, J.; Krauland, C.; Giraldez, E.; Davies, J.; Campbell, E. M.; Bahr, R.; Edgell, D.; Stoeckl, C.; Glebov, V.; Emig, J.; Heeter, R.; Strozzi, D.

    2017-10-01

    The magnetized liner inertial fusion (MagLIF) scheme requires preheating the underdense fuel to temperatures of hundreds of eV with a TW-scale long pulse laser via collisional absorption. To better understand how laser preheat scales with laser wavelength and intensity, as well as to provide data for code validation, we have conducted a well-characterized experiment on OMEGA to directly compare laser propagation, energy deposition, and laser plasma instabilities (LPI) using 2ω (527 nm) and 3ω (351 nm) lasers with intensity in the range of (1-5)x10^14 W/cm^2. The laser beam (1-1.5 ns square pulse) enters the gas-filled plastic liner through a 2-µm thick polyimide window to heat an underdense Ar-doped deuterium gas with an electron density of 5.5% of the critical density. Laser propagation and plasma temperature are diagnosed by time-resolved 2D x-ray images and Ar emission spectroscopy, respectively. LPI is monitored by backscattering and hard x-ray diagnostics. The 2ω beam propagation shows a noticeably larger lateral spread than the 3ω beam, indicating laser spray due to filamentation. LPI is observed to increase with laser intensity, and the 2ω beam produces more hot electrons compared with the 3ω beam under similar conditions. Results will be compared with radiation hydrodynamic simulations. Work supported by the U.S. DOE ARPA-E and NNSA.
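
    To put "5.5% of the critical density" in absolute units for the two wavelengths mentioned, the critical density n_c = eps0*m_e*omega^2/e^2 can be evaluated directly; this short check is an addition for orientation, not a number quoted in the abstract.

        import math

        EPS0, M_E, E_CH, C = 8.854e-12, 9.109e-31, 1.602e-19, 2.998e8  # SI units

        def n_crit(lambda_m):
            """Critical electron density for light of vacuum wavelength lambda_m."""
            omega = 2 * math.pi * C / lambda_m
            return EPS0 * M_E * omega**2 / E_CH**2  # m^-3

        for lam in (527e-9, 351e-9):
            print(lam, n_crit(lam), 0.055 * n_crit(lam))
        # n_c ~ 4.0e27 m^-3 at 527 nm and ~ 9.0e27 m^-3 at 351 nm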

  20. SPIDER beam dump as diagnostic of the particle beam

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zaupa, M., E-mail: matteo.zaupa@igi.cnr.it; Sartori, E.; Consorzio RFX, Corso Stati Uniti 4, Padova 35127

    The beam power produced by the negative ion source, which generates deuterium ions extracted from an RF plasma, is mainly absorbed by the beam dump, a component that has also been designed to measure the temperatures on the dumping panels for beam diagnostics. A finite element code has been developed to characterize, by thermo-hydraulic analysis, the sensitivity of the beam dump to the different beam parameters. The results prove the capability of diagnosing the beam divergence and the horizontal misalignment, while the halo fraction appears hardly detectable without considering the other foreseen diagnostics, such as tomography and beam emission spectroscopy.
