Sample records for LER screening algorithm

  1. Development and Testing of the New Surface LER Climatology for OMI UV Aerosol Retrievals

    NASA Technical Reports Server (NTRS)

    Gupta, Pawan; Torres, Omar; Jethva, Hiren; Ahn, Changwoo

    2014-01-01

    The Ozone Monitoring Instrument (OMI) onboard the Aura satellite retrieves aerosol properties using the UV part of the solar spectrum. The OMI near-UV aerosol algorithm (OMAERUV) is a global inversion scheme which retrieves aerosol properties over both ocean and land. The current version of the algorithm makes use of a TOMS-derived Lambertian Equivalent Reflectance (LER) climatology. A new monthly climatology of surface LER at 354 and 388 nm has been developed; it will replace the TOMS LER (380 nm and 354 nm) climatology in the OMI near-UV aerosol retrieval algorithm. The main objectives of this study are to produce high-resolution (quarter-degree) surface LER data sets, as compared to the existing one-degree TOMS surface LERs, and to produce an instrument- and wavelength-consistent surface climatology. Nine years of OMI observations have been used to derive the monthly climatology of surface LER. MODIS-derived aerosol optical depth (AOD) has been used to apply aerosol corrections at the OMI wavelengths. The MODIS-derived BRDF-adjusted reflectance product has also been used to capture seasonal changes in surface characteristics. Finally, spatial and temporal averaging techniques have been used to fill gaps around the globe, especially in regions with persistent cloud cover such as the Amazon. After implementation of the new surface data in the research version of the algorithm, comparisons of AOD and single scattering albedo (SSA) have been performed over global AERONET sites for the year 2007. Preliminary results show improvements in AOD retrievals globally, with more significant improvements observed over desert and bright locations. We will present the methodology of deriving the surface data sets and discuss the observed changes in retrieved aerosol properties with respect to reference AERONET measurements.

  2. Accounting for the effects of surface BRDF on satellite cloud and trace-gas retrievals: a new approach based on geometry-dependent Lambertian equivalent reflectivity applied to OMI algorithms

    NASA Astrophysics Data System (ADS)

    Vasilkov, Alexander; Qin, Wenhan; Krotkov, Nickolay; Lamsal, Lok; Spurr, Robert; Haffner, David; Joiner, Joanna; Yang, Eun-Su; Marchenko, Sergey

    2017-01-01

    Most satellite nadir ultraviolet and visible cloud, aerosol, and trace-gas algorithms make use of climatological surface reflectivity databases. For example, cloud and NO2 retrievals for the Ozone Monitoring Instrument (OMI) use monthly gridded surface reflectivity climatologies that do not depend upon the observation geometry. In reality, reflection of incoming direct and diffuse solar light from land or ocean surfaces is sensitive to the sun-sensor geometry. This dependence is described by the bidirectional reflectance distribution function (BRDF). To account for the BRDF, we propose to use a new concept of geometry-dependent Lambertian equivalent reflectivity (LER). Implementation within the existing OMI cloud and NO2 retrieval infrastructure requires changes only to the input surface reflectivity database. The geometry-dependent LER is calculated using a vector radiative transfer model with high spatial resolution BRDF information from the Moderate Resolution Imaging Spectroradiometer (MODIS) over land and the Cox-Munk slope distribution over ocean with a contribution from water-leaving radiance. We compare the geometry-dependent and climatological LERs for two wavelengths, 354 and 466 nm, that are used in OMI cloud algorithms to derive cloud fractions. A detailed comparison of the cloud fractions and pressures derived with climatological and geometry-dependent LERs is carried out. Geometry-dependent LER and corresponding retrieved cloud products are then used as inputs to our OMI NO2 algorithm. We find that replacing the climatological OMI-based LERs with geometry-dependent LERs can increase NO2 vertical columns by up to 50 % in highly polluted areas; the differences include both BRDF effects and biases between the MODIS and OMI-based surface reflectance data sets. Only minor changes to NO2 columns (within 5 %) are found over unpolluted and overcast areas.
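
    A minimal sketch of the LER idea itself (an editor's illustration, not the authors' code): the Lambertian model for top-of-atmosphere radiance, I = I0 + T*R/(1 - Sb*R), is inverted for the scene reflectivity R. The path radiance I0, transmittance T, and spherical albedo Sb below are placeholders; in a geometry-dependent LER calculation they would come from a vector radiative-transfer run for the actual sun-sensor geometry.

      def equivalent_reflectivity(i_toa, i_path, transmittance, sph_albedo):
          # Invert I = I_path + T*R / (1 - Sb*R) for the Lambertian
          # equivalent reflectivity R of the observed scene.
          diff = i_toa - i_path
          return diff / (transmittance + sph_albedo * diff)

      # Placeholder numbers; real inputs come from a radiative-transfer model.
      print(equivalent_reflectivity(i_toa=0.30, i_path=0.25,
                                    transmittance=0.60, sph_albedo=0.20))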

  3. Accounting for the Effects of Surface BRDF on Satellite Cloud and Trace-Gas Retrievals: A New Approach Based on Geometry-Dependent Lambertian-Equivalent Reflectivity Applied to OMI Algorithms

    NASA Technical Reports Server (NTRS)

    Vasilkov, Alexander; Qin, Wenhan; Krotkov, Nickolay; Lamsal, Lok; Spurr, Robert; Haffner, David; Joiner, Joanna; Yang, Eun-Su; Marchenko, Sergey

    2017-01-01

    Most satellite nadir ultraviolet and visible cloud, aerosol, and trace-gas algorithms make use of climatological surface reflectivity databases. For example, cloud and NO2 retrievals for the Ozone Monitoring Instrument (OMI) use monthly gridded surface reflectivity climatologies that do not depend upon the observation geometry. In reality, reflection of incoming direct and diffuse solar light from land or ocean surfaces is sensitive to the sun-sensor geometry. This dependence is described by the bidirectional reflectance distribution function (BRDF). To account for the BRDF, we propose to use a new concept of geometry-dependent Lambertian equivalent reflectivity (LER). Implementation within the existing OMI cloud and NO2 retrieval infrastructure requires changes only to the input surface reflectivity database. The geometry-dependent LER is calculated using a vector radiative transfer model with high spatial resolution BRDF information from the Moderate Resolution Imaging Spectroradiometer (MODIS) over land and the Cox-Munk slope distribution over ocean with a contribution from water-leaving radiance. We compare the geometry-dependent and climatological LERs for two wavelengths, 354 and 466 nm, that are used in OMI cloud algorithms to derive cloud fractions. A detailed comparison of the cloud fractions and pressures derived with climatological and geometry-dependent LERs is carried out. Geometry-dependent LER and corresponding retrieved cloud products are then used as inputs to our OMI NO2 algorithm. We find that replacing the climatological OMI-based LERs with geometry-dependent LERs can increase NO2 vertical columns by up to 50% in highly polluted areas; the differences include both BRDF effects and biases between the MODIS and OMI-based surface reflectance data sets. Only minor changes to NO2 columns (within 5 %) are found over unpolluted and overcast areas.

  4. Measurement of pattern roughness and local size variation using CD-SEM: current status

    NASA Astrophysics Data System (ADS)

    Fukuda, Hiroshi; Kawasaki, Takahiro; Kawada, Hiroki; Sakai, Kei; Kato, Takashi; Yamaguchi, Satoru; Ikota, Masami; Momonoi, Yoshinori

    2018-03-01

    Measurement of line edge roughness (LER) is discussed from four aspects: edge detection, PSD prediction, sampling strategy, and noise mitigation, and general guidelines and practical solutions for LER measurement today are introduced. Advanced edge detection algorithms such as the wave-matching method are shown to be effective for robustly detecting edges in low-SNR images, while a conventional algorithm with weak filtering is still effective in suppressing SEM noise and aliasing. Advanced PSD prediction methods such as the multi-taper method are effective in suppressing sampling noise within a single line edge, while a large number of lines is still required for suppressing line-to-line variation. Two types of SEM noise mitigation methods, "apparent noise floor" subtraction and LER-noise decomposition using regression analysis, are verified to successfully remove SEM noise from PSD curves. These results are extended to LCDU measurement to clarify the impact of SEM noise and sampling noise on LCDU.
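
    As a rough illustration of the "apparent noise floor" subtraction mentioned above (an editor's sketch, not the authors' implementation), the following estimates white SEM noise from the flat high-frequency tail of a one-sided PSD and removes it before integrating back to an unbiased roughness; the normalization and the 20% floor fraction are assumptions of this toy example.

      import numpy as np

      def unbiased_sigma_from_psd(edge_x, dx, floor_fraction=0.2):
          # One-sided PSD of the mean-subtracted edge; white SEM noise shows
          # up as a flat floor at high frequencies, estimated here from the
          # top `floor_fraction` of frequencies and subtracted.
          x = edge_x - edge_x.mean()
          n = x.size
          psd = (np.abs(np.fft.rfft(x)) ** 2) * dx / n
          floor = psd[int((1 - floor_fraction) * psd.size):].mean()
          df = 1.0 / (n * dx)
          sigma2 = 2.0 * np.clip(psd[1:] - floor, 0.0, None).sum() * df
          return np.sqrt(sigma2)

      rng = np.random.default_rng(0)
      true_edge = np.cumsum(rng.normal(0.0, 0.2, 512))    # smooth "true" edge
      noisy_edge = true_edge + rng.normal(0.0, 1.0, 512)  # add white SEM noise
      print(unbiased_sigma_from_psd(noisy_edge, dx=1.0))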

  5. Using Neural Networks to Improve the Performance of Radiative Transfer Modeling Used for Geometry Dependent Surface Lambertian-Equivalent Reflectivity Calculations

    NASA Technical Reports Server (NTRS)

    Fasnacht, Zachary; Qin, Wenhan; Haffner, David P.; Loyola, Diego; Joiner, Joanna; Krotkov, Nickolay; Vasilkov, Alexander; Spurr, Robert

    2017-01-01

    Surface Lambertian-equivalent reflectivity (LER) is important for trace gas retrievals in the direct calculation of cloud fractions and the indirect calculation of the air mass factor. Current trace gas retrievals use climatological surface LERs. Surface properties that impact the bidirectional reflectance distribution function (BRDF), as well as varying satellite viewing geometry, can be important for retrieval of trace gases. Geometry-Dependent LER (GLER) captures these effects with its calculation of sun-normalized radiances (I/F) and can be used in current LER algorithms (Vasilkov et al. 2016). Pixel-by-pixel radiative transfer calculations are computationally expensive for large datasets. Modern satellite missions such as the Tropospheric Monitoring Instrument (TROPOMI) produce very large datasets as they take measurements at much higher spatial and spectral resolutions. Look-up table (LUT) interpolation improves the speed of radiative transfer calculations, but its complexity increases for non-linear functions. Neural networks perform fast calculations and can accurately predict both non-linear and linear functions with little effort.
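
    A toy version of the surrogate idea (editor's sketch): replace a per-pixel radiative-transfer evaluation with a small neural network trained on precomputed geometry/radiance pairs. The "RT model" here is just a stand-in nonlinear function of the sun-sensor angles, and scikit-learn is an assumed dependency.

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      def toy_rt(sza, vza, raa):
          # Stand-in for an expensive vector RT calculation of sun-normalized
          # radiance I/F as a function of geometry (angles in degrees).
          mu0, mu = np.cos(np.radians(sza)), np.cos(np.radians(vza))
          return mu0 * mu * (1.0 + 0.3 * np.cos(np.radians(raa)))

      rng = np.random.default_rng(1)
      X = rng.uniform([0, 0, 0], [80, 70, 180], size=(20000, 3))
      y = toy_rt(*X.T)

      net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=500, random_state=0)
      net.fit(X[:15000], y[:15000])
      err = net.predict(X[15000:]) - y[15000:]
      print("RMSE on held-out geometries:", np.sqrt(np.mean(err ** 2)))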

  6. The line roughness improvement with plasma coating and cure treatment for 193nm lithography and beyond

    NASA Astrophysics Data System (ADS)

    Zheng, Erhu; Huang, Yi; Zhang, Haiyang

    2017-03-01

    As CMOS technology reaches the 14 nm node and beyond, one of the key challenges for the extension of 193 nm immersion lithography is how to control line edge and line width roughness (LER/LWR). For Self-aligned Multiple Patterning (SaMP), LER becomes larger while LWR becomes smaller as the process proceeds [1]. This means the plasma etch process becomes increasingly dominant for LER reduction. In this work, we mainly focus on the core etch solution, including an extra plasma coating process introduced before the bottom anti-reflective coating (BARC) open step and an extra plasma cure process applied right after the BARC-open step. First, we leveraged the optimal design experiment (ODE) to investigate the impact of the plasma coating step on LER and identified the optimal condition. ODE is an appropriate method for screening experiments of non-linear parameters in dynamic process models, especially in high-cost-intensive industries [2]. We obtained a plasma coating treatment condition that has been proven to achieve a 32% LER improvement compared with the standard process. Furthermore, the plasma cure scheme was also optimized with the ODE method to compensate for the LWR degradation induced by the plasma coating treatment.

  7. Earth's UV Reflectivity Data from the Ozone Monitoring Instrument on EOS-Aura

    NASA Astrophysics Data System (ADS)

    Larko, D. E.; Mao, J.; Herman, J. R.; Huang, L.; Qin, W.; Labow, G. J.; Lloyd, S. A.; DeLand, M. T.

    2011-12-01

    The Lambert Equivalent Reflectivity (LER), derived from satellite ultraviolet (UV) radiance measurements, represents the equivalent scene reflectivity of the Earth's surface and atmosphere without Rayleigh scattering. It provides a good opportunity to quantify variations of the planetary reflectance and albedo associated with snow/ice, atmospheric aerosols, and clouds, since UV reflectance is very low over most land surfaces and water. LER values at 340 nm from the Ozone Monitoring Instrument (OMI) on EOS-Aura have been generated as a new product from the OMI TO3 ozone retrieval algorithm and provided to users in HDF format. The wide field of view of OMI (~2200 km) provides complete global coverage every day with 13 km x 24 km resolution at nadir. These data are then mapped to a 1 degree x 1 degree latitude-longitude grid as daily and monthly means for weather and climate studies. The OMI LER data set has been used to validate other UV LER data sets from NOAA and NASA polar-orbiting satellites, and has been combined with these data sets to construct a continuous long-term data record of terrestrial UV reflectivity. This paper presents details about the data processing and format of the OMI LER product; applications of this data set in global climate studies will also be demonstrated and discussed.
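
    A minimal sketch of the 1 degree x 1 degree gridding step (editor's illustration; array names and values are hypothetical):

      import numpy as np

      def grid_1deg_mean(lat, lon, ler):
          # Bin pixel-level LER values into 1 x 1 degree cells and average.
          i = np.clip((lat + 90.0).astype(int), 0, 179)
          j = np.clip((lon + 180.0).astype(int), 0, 359)
          total = np.zeros((180, 360))
          count = np.zeros((180, 360))
          np.add.at(total, (i, j), ler)
          np.add.at(count, (i, j), 1.0)
          with np.errstate(invalid="ignore"):
              return np.where(count > 0, total / count, np.nan)

      rng = np.random.default_rng(2)
      lat = rng.uniform(-90, 90, 10000)
      lon = rng.uniform(-180, 180, 10000)
      daily_grid = grid_1deg_mean(lat, lon, rng.uniform(0.0, 1.0, 10000))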

  8. How to measure a-few-nanometer-small LER occurring in EUV lithography processed feature

    NASA Astrophysics Data System (ADS)

    Kawada, Hiroki; Kawasaki, Takahiro; Kakuta, Junichi; Ikota, Masami; Kondo, Tsuyoshi

    2018-03-01

    For EUV lithography features, we want to decrease the dose and/or energy of the CD-SEM probe beam, because severe shrinkage of the resist material otherwise decreases the measured LER. Under such conditions, however, the measured LER increases above the true LER due to LER bias, a spurious LER contribution caused by random noise in the SEM image; a gap error also occurs between the right- and left-edge LERs. In this work we propose new procedures to obtain the true LER by excluding the LER bias from the measured LER, and to verify them we propose a reference metrology for LER using TEM.
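
    The unbiasing described here is commonly written as a subtraction in quadrature; a minimal sketch under that assumption (editor's illustration, not the authors' exact procedure):

      def true_ler(measured_3sigma, noise_3sigma):
          # Assuming random SEM noise adds to the true roughness in
          # quadrature: sigma_meas^2 = sigma_true^2 + sigma_noise^2.
          return max(measured_3sigma ** 2 - noise_3sigma ** 2, 0.0) ** 0.5

      # Hypothetical numbers: 4.0 nm measured 3-sigma LER, 2.5 nm noise bias.
      print(true_ler(4.0, 2.5))   # -> ~3.12 nm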

  9. Blue light hazard optimization for white light-emitting diode sources with high luminous efficacy of radiation and high color rendering index

    NASA Astrophysics Data System (ADS)

    Zhang, Jingjing; Guo, Weihong; Xie, Bin; Yu, Xingjian; Luo, Xiaobing; Zhang, Tao; Yu, Zhihua; Wang, Hong; Jin, Xing

    2017-09-01

    Blue light hazard of white light-emitting diodes (LEDs) is a hidden risk for human photobiological safety. Recent spectral optimization methods focus on maximizing luminous efficacy and improving the color performance of LEDs, but few of them take the blue hazard into account. Therefore, for healthy lighting, it is urgent to propose a spectral optimization method for white LED sources that exhibits low blue light hazard, high luminous efficacy of radiation (LER), and high color performance. In this study, a genetic algorithm with penalty functions was proposed for realizing white spectra with low blue hazard, maximal LER, and high color rendering index (CRI) values. In simulations, white LED spectra with low blue hazard, high LER (≥297 lm/W), and high CRI (≥90) were achieved at correlated color temperatures (CCTs) from 2013 K to 7845 K. Thus, the spectral optimization method can be used to guide the fabrication of LED sources in line with photobiological safety. It is also found that the maximum permissible exposure duration of the optimized spectra is 14.9% longer than that of bichromatic phosphor-converted LEDs with equal CCT.
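
    A schematic of a genetic algorithm with penalty functions (editor's sketch): the "LER", "CRI", and blue-hazard terms below are simple surrogate functions, not the photometric models used by the authors.

      import numpy as np

      rng = np.random.default_rng(3)

      def fitness(w):
          # Surrogates standing in for quantities computed from the spectrum.
          ler = 300.0 - 50.0 * np.sum((w - 0.5) ** 2)   # luminous efficacy proxy
          cri = 95.0 - 40.0 * abs(w[0] - w[1])          # color rendering proxy
          blue_hazard = w[0]                            # blue-content proxy
          penalty = 1000.0 * max(0.0, 90.0 - cri) + 200.0 * blue_hazard
          return ler - penalty

      pop = rng.uniform(0.0, 1.0, (60, 4))              # 4 LED channel weights
      for _ in range(200):
          scores = np.array([fitness(w) for w in pop])
          parents = pop[np.argsort(scores)[-20:]]       # truncation selection
          pop = np.clip(parents[rng.integers(0, 20, 60)]
                        + rng.normal(0.0, 0.05, (60, 4)), 0.0, 1.0)
      print("best penalized fitness:", max(fitness(w) for w in pop))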

  10. Experimental determination of the impact of polysilicon LER on sub-100-nm transistor performance

    NASA Astrophysics Data System (ADS)

    Patterson, Kyle; Sturtevant, John L.; Alvis, John R.; Benavides, Nancy; Bonser, Douglas; Cave, Nigel; Nelson-Thomas, Carla; Taylor, William D.; Turnquest, Karen L.

    2001-08-01

    Photoresist line edge roughness (LER) has long been feared as a potential limitation to the application of various patterning technologies to actual devices. While this concern seems reasonable, experimental verification has proved elusive, and thus LER specifications typically lack a solid parametric rationale. We report here the transistor device performance impact of deliberate variations of polysilicon gate LER. LER magnitude was attenuated by more than a factor of 5 by altering the photoresist type and thickness, substrate reflectivity, masking approach, and etch process. The polysilicon gate LER for nominally 70-150 nm devices was quantified using digital image processing of SEM images and compared to gate leakage and drive current for variable-length and variable-width transistors. With such comparisons, realistic LER specifications can be made for a given transistor. It was found that subtle cosmetic LER differences are often not discernible electrically, providing hope that LER will not limit transistor performance as the industry migrates to sub-100 nm patterning.
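
    LER from SEM edge traces is conventionally reported as 3 sigma of the edge-position residuals about a best-fit line; a minimal sketch with hypothetical edge data (editor's illustration):

      import numpy as np

      def ler_3sigma(edge_x, y):
          # Residuals of detected edge positions about a fitted straight line.
          slope, intercept = np.polyfit(y, edge_x, 1)
          residuals = edge_x - (slope * y + intercept)
          return 3.0 * residuals.std(ddof=1)

      rng = np.random.default_rng(4)
      y = np.arange(200.0)                              # along the gate (pixels)
      edge_x = 0.01 * y + rng.normal(0.0, 1.2, y.size)  # hypothetical edge trace
      print(ler_3sigma(edge_x, y), "pixels (3-sigma)")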

  11. Retrieval Of Cloud Pressure And Chlorophyll Content Using Raman Scattering In GOME Ultraviolet Spectra

    NASA Technical Reports Server (NTRS)

    Atlas, Robert (Technical Monitor); Joiner, Joanna; Vasilkov, Alexander; Flittner, David; Gleason, James; Bhartia, P. K.

    2002-01-01

    Reliable cloud pressure estimates are needed for accurate retrieval of ozone and other trace gases using satellite-borne backscatter ultraviolet (BUV) instruments such as the Global Ozone Monitoring Experiment (GOME). Cloud pressure can be derived from BUV instruments by utilizing the properties of rotational-Raman scattering (RRS) and absorption by O2-O2. In this paper we estimate cloud pressure from GOME observations in the 355-400 nm spectral range using the concept of a Lambertian-equivalent reflectivity (LER) surface. GOME has full spectral coverage in this range at relatively high spectral resolution with a very high signal-to-noise ratio. This allows for much more accurate estimates of cloud pressure than were possible with its predecessors SBUV and TOMS. We also demonstrate the potential capability to retrieve chlorophyll content with full-spectral BUV instruments. We compare our retrieved LER cloud pressure with cloud-top pressures derived from the infrared ATSR instrument on the same satellite. The findings confirm results from previous studies showing that retrieved LER cloud pressures from BUV observations are systematically higher than IR-derived cloud-top pressures. Simulations using Mie-scattering radiative transfer algorithms that include O2-O2 absorption and RRS show that these differences can be explained by increased photon path length within and below clouds.

  12. Experimental methodology of contact edge roughness on sub-100-nm pattern

    NASA Astrophysics Data System (ADS)

    Lee, Tae Yong; Ihm, Dongchul; Kang, Hyo Chun; Lee, Jun Bum; Lee, Byoung-Ho; Chin, Soo-Bok; Cho, Do-Hyun; Kim, Yang Hyong; Yang, Ho Dong; Yang, Kyoung Mo

    2004-05-01

    The measurement of edge roughness has become a hot issue in the semiconductor industry. Major vendors offer a variety of features to measure the edge roughness in their CD-SEMs. However, most of the features are limited by the applicable pattern types. For the line and space patterns, features such as Line Edge Roughness (LER) and Line Width Roughness (LWR) are available in current CD-SEMs. The edge roughness is more critical in contact process. However the measurement of contact edge roughness (CER) or contact space roughness (CSR) is more complicated than that of LER or LWR. So far, no formal standard measurement algorithm or definition of contact roughness measurement exists. In this article, currently available features are investigated to assess their representability for CER or CSR. Some new ideas to quantify CER and CSR were also suggested with preliminary experimental results.
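
    One simple candidate metric for CER (an editor's sketch, not a vendor algorithm): the 3-sigma spread of the contact-edge radius about its mean, sampled versus angle around the contact.

      import numpy as np

      def cer_3sigma(edge_xy, center=None):
          # Contact edge roughness as 3x the standard deviation of the
          # edge radius measured from the contact center.
          pts = np.asarray(edge_xy, dtype=float)
          c = pts.mean(axis=0) if center is None else np.asarray(center, float)
          radii = np.hypot(pts[:, 0] - c[0], pts[:, 1] - c[1])
          return 3.0 * radii.std(ddof=1)

      rng = np.random.default_rng(5)
      theta = np.linspace(0.0, 2.0 * np.pi, 360, endpoint=False)
      r = 45.0 + rng.normal(0.0, 1.0, theta.size)       # noisy ~45 nm contact
      edge = np.column_stack([r * np.cos(theta), r * np.sin(theta)])
      print(cer_3sigma(edge), "nm (3-sigma)")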

  13. Arabidopsis thaliana DM2h (R8) within the Landsberg RPP1-like Resistance Locus Underlies Three Different Cases of EDS1-Conditioned Autoimmunity

    PubMed Central

    Garcia, Ana V.; Wagner, Christine; Choudhury, Sayan R.; Wang, Yiming; James, Geo Velikkakam; Griebel, Thomas; Alcázar, Ruben; Tsuda, Kenichi; Schneeberger, Korbinian; Parker, Jane E.

    2016-01-01

    Plants have a large panel of nucleotide-binding/leucine rich repeat (NLR) immune receptors which monitor host interference by diverse pathogen molecules (effectors) and trigger disease resistance pathways. NLR receptor systems are necessarily under tight control to mitigate the trade-off between induced defenses and growth. Hence, mis-regulated NLRs often cause autoimmunity associated with stunting and, in severe cases, necrosis. Nucleocytoplasmic ENHANCED DISEASE SUSCEPTIBILITY1 (EDS1) is indispensable for effector-triggered and autoimmune responses governed by a family of Toll-Interleukin1-Receptor-related NLR receptors (TNLs). EDS1 operates coincidently or immediately downstream of TNL activation to transcriptionally reprogram cells for defense. We show here that low levels of nuclear-enforced EDS1 are sufficient for pathogen resistance in Arabidopsis thaliana, without causing negative effects. Plants expressing higher nuclear EDS1 amounts have the genetic, phenotypic and transcriptional hallmarks of TNL autoimmunity. In a screen for genetic suppressors of nuclear EDS1 autoimmunity, we map multiple, independent mutations to one gene, DM2h, lying within the polymorphic DANGEROUS MIX2 cluster of TNL RPP1-like genes from A. thaliana accession Landsberg erecta (Ler). The DM2 locus is a known hotspot for deleterious epistatic interactions leading to immune-related incompatibilities between A. thaliana natural accessions. We find that DM2h (Ler) underlies two further genetic incompatibilities involving the RPP1-like (Ler) locus and EDS1. We conclude that the DM2h (Ler) TNL protein and nuclear EDS1 cooperate, directly or indirectly, to drive cells into an immune response at the expense of growth. A further conclusion is that regulating the available EDS1 nuclear pool is fundamental for maintaining homeostatic control of TNL immune pathways. PMID:27082651

  14. Research in Network Management Techniques for Tactical Data Communications Network.

    DTIC Science & Technology

    1982-09-01

    the control period. Research areas include Packet Network modelling, adaptive network routing, network design algorithms, network design techniques... controllers are designed to perform their limited tasks optimally. For the dynamic routing problem considered here, the local controllers are node... feedback to finding an optimum steady-state routing (static strategies) under non-congested control which can be easily implemented in real time.

  15. Shot noise, LER, and quantum efficiency of EUV photoresists

    NASA Astrophysics Data System (ADS)

    Brainard, Robert L.; Trefonas, Peter; Lammers, Jeroen H.; Cutler, Charlotte A.; Mackevich, Joseph F.; Trefonas, Alexander; Robertson, Stewart A.

    2004-05-01

    The shot noise, line edge roughness (LER), and quantum efficiency of EUV interaction with seven resists related to EUV-2D (SP98248B) are studied. These resists were identical to EUV-2D except that they were prepared with seven levels of added base while keeping all other resist variables constant. These seven resists were patterned with EUV lithography, and LER was measured on 100-200 nm dense lines. Similarly, the resists were also imaged using DUV lithography, and LER was determined for 300-500 nm dense lines. LER results for both wavelengths were plotted against Esize. Both curves show very similar LER behavior: the resists requiring low doses have poor LER, whereas the resists requiring high doses have good LER. One possible explanation for the observed LER response is that the added base improves LER by reacting with the photogenerated acid to control the lateral spread of acid, leading to better chemical contrast at the line edge. An alternative explanation for the observed relationship between LER and Esize is that shot-noise-generated LER decreases as the number of photons absorbed at the line edge increases. We present an analytical model for the influence of shot noise, based on Poisson statistics, that predicts that the LER is proportional to Esize^(-1/2). Indeed, both sets of data give straight lines when plotted this way (DUV r² = 0.94; EUV r² = 0.97). We decided to further evaluate this interpretation by constructing a simulation model for shot noise resulting from exposure and acid diffusion at the mask edge. In order to acquire the data for this model, we used the base titration method developed by Szmanda et al. to determine C-parameters and hence the quantum efficiency for producing photogenerated acid. This information, together with film absorptivity, allows the calculation of the number and location of acid molecules generated at the mask edge by assuming a stochastic distribution of individual photons corresponding to the aerial image function. The edge "roughness" of the acid molecule distribution in the film at the mask edge is then simulated as a function of acid diffusion length and compared to the experimental data. In addition, comparison of the number of acid molecules generated with the number of photons consumed leads to values of the quantum efficiency of these EUV resists.
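
    The Poisson shot-noise model quoted above predicts LER proportional to Esize^(-1/2), i.e. a straight line when LER is plotted against Esize^(-1/2). A minimal fit of that form with hypothetical numbers (editor's illustration, not the paper's data):

      import numpy as np

      esize = np.array([2.0, 4.0, 8.0, 16.0, 32.0])   # dose to size, mJ/cm^2
      ler = np.array([9.8, 7.1, 5.0, 3.6, 2.4])       # measured LER, nm

      x = esize ** -0.5                               # shot-noise abscissa
      slope, intercept = np.polyfit(x, ler, 1)
      pred = slope * x + intercept
      r2 = 1.0 - np.sum((ler - pred) ** 2) / np.sum((ler - ler.mean()) ** 2)
      print(f"slope = {slope:.1f} nm*(mJ/cm^2)^0.5, r^2 = {r2:.3f}")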

  16. Mask roughness induced LER: a rule of thumb -- paper

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McClinton, Brittany; Naulleau, Patrick

    2010-03-12

    Much work has already been done on how both the resist and line-edge roughness (LER) on the mask affect the final printed LER. What is poorly understood, however, is the extent to which system-level effects such as mask surface roughness, illumination conditions, and defocus couple to speckle at the image plane and currently factor into LER limits. Here, we propose a 'rule-of-thumb' simplified solution that provides a fast and powerful method to obtain mask roughness induced LER. We present modeling data on an older generation mask with a roughness of 230 pm as well as the ultimate target roughness of 50 pm. Moreover, we consider feature sizes of 50 nm and 22 nm, and show that, as a function of correlation length, the LER peaks when the correlation length is approximately equal to the resolution of the imaging optic.

  17. Impact of line edge roughness on the performance of 14-nm FinFET: Device-circuit Co-design

    NASA Astrophysics Data System (ADS)

    Rathore, Rituraj Singh; Rana, Ashwani K.

    2018-01-01

    With the evolution of sub-20 nm FinFET technology, line edge roughness (LER) has been identified as a critical problem that may cause critical device parameter variation and limit performance in future VLSI circuit applications. In the present work, an analytical model of fin-LER is presented, which shows the impact of correlated and uncorrelated LER on the FinFET structure. Further, the influence of correlated and uncorrelated fin-LER on all electrical performance parameters is thoroughly investigated using three-dimensional (3-D) Technology Computer Aided Design (TCAD) simulations for the 14-nm technology node. Moreover, the impact of all possible fin shapes on threshold voltage (VTH), drain-induced barrier lowering (DIBL), on-current (ION), and off-current (IOFF) is compared with a well-calibrated rectangular FinFET structure. In addition, the influence of all possible fin geometries on the read stability of a six-transistor (6-T) Static Random Access Memory (SRAM) cell is investigated. The study reveals that fin-LER plays a vital role as it directly governs the electrostatics of the FinFET structure. There is a high degree of fluctuation in all performance parameters for FinFETs with uncorrelated fin-LER as compared to correlated fin-LER, with respect to the rectangular FinFET structure. This paper gives physical insight into FinFET design, especially at sub-20 nm technology nodes, concluding that the impact of LER on electrical parameters is minimal for correlated LER.
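
    The correlated/uncorrelated distinction can be illustrated numerically (editor's sketch): two fin edges drawn from the same roughness spectrum either move together (correlated, constant fin width) or fluctuate independently (uncorrelated, fin-width variation).

      import numpy as np

      def rough_edge(n, sigma, corr_len, dx, rng):
          # Gaussian edge with a Lorentzian-shaped PSD via spectral filtering.
          f = np.fft.rfftfreq(n, dx)
          amp = np.sqrt(1.0 / (1.0 + (2.0 * np.pi * f * corr_len) ** 2))
          spec = amp * np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, f.size))
          edge = np.fft.irfft(spec, n)
          return sigma * edge / edge.std()

      rng = np.random.default_rng(6)
      left = rough_edge(1024, 1.0, 20.0, 1.0, rng)
      right_corr = left + 10.0                        # edges move together
      right_uncorr = rough_edge(1024, 1.0, 20.0, 1.0, rng) + 10.0
      print("fin-width 3-sigma, correlated  :", 3.0 * (right_corr - left).std())
      print("fin-width 3-sigma, uncorrelated:", 3.0 * (right_uncorr - left).std())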

  18. Effects of Surface BRDF on the OMI Cloud and NO2 Retrievals: A New Approach Based on Geometry-Dependent Lambertian Equivalent Reflectivity (GLER) Derived from MODIS

    NASA Technical Reports Server (NTRS)

    Vasilkov, Alexander; Qin, Wenhan; Krotkov, Nickolay; Lamsal, Lok; Spurr, Robert; Haffner, David; Joiner, Joanna; Yang, Eun-Su; Marchenko, Sergey

    2017-01-01

    The Ozone Monitoring Instrument (OMI) cloud and NO2 algorithms use a monthly gridded surface reflectivity climatology that does not depend upon the observation geometry. In reality, reflection of incoming direct and diffuse solar light from land or ocean surfaces is sensitive to the sun-sensor geometry. This dependence is described by the bidirectional reflectance distribution function (BRDF). To account for the BRDF, we propose to use a new concept of geometry-dependent Lambertian equivalent reflectivity (GLER). Implementation within the existing OMI cloud and NO2 retrieval infrastructure requires changes only to the input surface reflectivity database. GLER is calculated using a vector radiative transfer model with high spatial resolution BRDF information from MODIS over land and the Cox-Munk slope distribution over ocean with a contribution from water-leaving radiance. We compare GLER and climatological LER at 466 nm, which is used in the OMI O2-O2 cloud algorithm to derive effective cloud fractions. A detailed comparison of the cloud fractions and pressures derived with climatological LERs and GLERs is carried out. GLER and the corresponding retrieved cloud products are then used as input to the OMI NO2 algorithm. We find that replacing the climatological OMI-based LERs with GLERs can increase NO2 vertical columns by up to 50% in highly polluted areas; the differences include both BRDF effects and biases between the MODIS and OMI-based surface reflectance data sets. Only minor changes to NO2 columns (within 5%) are found over unpolluted and overcast areas.

  19. NLR mutations suppressing immune hybrid incompatibility and their effects on disease resistance.

    PubMed

    Atanasov, Kostadin Evgeniev; Liu, Changxin; Erban, Alexander; Kopka, Joachim; Parker, Jane E; Alcázar, Rubén

    2018-05-23

    Genetic divergence between populations can lead to reproductive isolation. Hybrid incompatibilities (HI) represent intermediate points along a continuum towards speciation. In plants, genetic variation in disease resistance (R) genes underlies several cases of HI. The progeny of a cross between Arabidopsis (Arabidopsis thaliana) accessions Landsberg (Ler, Poland) and Kashmir-2 (Kas-2, central Asia) exhibits immune-related HI. This incompatibility is due to a genetic interaction between a cluster of eight TNL (TOLL/INTERLEUKIN1 RECEPTOR-NUCLEOTIDE BINDING-LEUCINE RICH REPEAT) RPP1 (RECOGNITION OF PERONOSPORA PARASITICA 1)-like genes (R1-R8) from Ler and central Asian alleles of a Strubbelig-family receptor-like kinase (SRF3) from Kas-2. In characterizing mutants altered in Ler/Kas-2 HI, we mapped multiple mutations to the RPP1-like Ler locus. Analysis of these suppressor of Ler/Kas-2 incompatibility (sulki) mutants reveals complex, additive and epistatic interactions underlying RPP1-like Ler locus activity. The effects of these mutations were measured on basal defense, global gene expression, primary metabolism, and disease resistance to a local Hyaloperonospora arabidopsidis isolate (Hpa Gw) collected from Gorzów (Gw), where the Landsberg accession originated. Gene expression sectors and metabolic hallmarks identified for HI are both dependent on and independent of RPP1-like Ler members. We establish that mutations suppressing immune-related Ler/Kas-2 HI do not compromise resistance to Hpa Gw. QTL mapping analysis of Hpa Gw resistance points to RPP7 as the causal locus. This work provides insight into the complex genetic architecture of the RPP1-like Ler locus and immune-related HI in Arabidopsis, and into the contributions of RPP1-like genes to HI and defense. © 2018 American Society of Plant Biologists. All rights reserved.

  20. Upfront boost Gamma Knife "leading-edge" radiosurgery to FLAIR MRI-defined tumor migration pathways in 174 patients with glioblastoma multiforme: a 15-year assessment of a novel therapy.

    PubMed

    Duma, Christopher M; Kim, Brian S; Chen, Peter V; Plunkett, Marianne E; Mackintosh, Ralph; Mathews, Marlon S; Casserly, Ryan M; Mendez, Gustavo A; Furman, Daniel J; Smith, Garrett; Oh, Nathan; Caraway, Chad A; Sanathara, Ami R; Dillman, Robert O; Riley, Azzurra-Sky; Weiland, David; Stemler, Lian; Cannell, Ruslana; Abrams, Daniela Alexandru; Smith, Alexa; Owen, Christopher M; Eisenberg, Burton; Brant-Zawadzki, Michael

    2016-12-01

    OBJECTIVE Glioblastoma multiforme (GBM) is composed of cells that migrate through the brain along predictable white matter pathways. Targeting white matter pathways adjacent to, and leading away from, the original contrast-enhancing tumor site (termed leading-edge radiosurgery [LERS]) with single-fraction stereotactic radiosurgery as a boost to standard therapy could limit the spread of glioma cells and improve clinical outcomes. METHODS Between December 2000 and May 2016, after an initial diagnosis of GBM and prior to or during standard radiation therapy and carmustine or temozolomide chemotherapy, 174 patients treated with radiosurgery to the leading edge (LE) of tumor cell migration were reviewed. The LE was defined as a region outside the contrast-enhancing tumor nidus, defined by FLAIR MRI. The median age of patients was 59 years (range 22-87 years). Patients underwent LERS a median of 18 days from original diagnosis. The median target volume of 48.5 cm³ (range 2.5-220.0 cm³) of LE tissue was targeted using a median dose of 8 Gy (range 6-14 Gy) at the 50% isodose line. RESULTS The median overall survival was 23 months (mean 43 months) from diagnosis. The 2-, 3-, 5-, 7-, and 10-year actual overall survival rates after LERS were 39%, 26%, 16%, 10%, and 4%, respectively. Nine percent of patients developed treatment-related imaging-documented changes due to LERS. Nineteen percent of patients were hospitalized for management of edema, 22% for resection of a tumor cyst or new tumor bulk, and 2% for shunting to treat hydrocephalus throughout the course of their disease. Of the patients still alive, Karnofsky Performance Scale scores remained stable in 90% of patients and decreased by 1-3 grades in 10% due to symptomatic treatment-related imaging changes. CONCLUSIONS LERS is a safe and effective upfront adjunctive therapy for patients with newly diagnosed GBM. Limitations of this study include a single-center experience and single-institution determination of the LE tumor target. Use of a leading-edge calculation algorithm will be described to achieve a consistent approach to defining the LE target for general use. A multicenter trial will further elucidate its value in the treatment of GBM.

  1. Indirect DNA Readout by an H-NS Related Protein: Structure of the DNA Complex of the C-Terminal Domain of Ler

    PubMed Central

    Cordeiro, Tiago N.; Schmidt, Holger; Madrid, Cristina; Juárez, Antonio; Bernadó, Pau; Griesinger, Christian; García, Jesús; Pons, Miquel

    2011-01-01

    Ler, a member of the H-NS protein family, is the master regulator of the LEE pathogenicity island in virulent Escherichia coli strains. Here, we determined the structure of a complex between the DNA-binding domain of Ler (CT-Ler) and a 15-mer DNA duplex. CT-Ler recognizes a preexisting structural pattern in the DNA minor groove formed by two consecutive regions which are narrower and wider, respectively, compared with standard B-DNA. The compressed region, associated with an AT-tract, is sensed by the side chain of Arg90, whose mutation abolishes the capacity of Ler to bind DNA. The expanded groove allows the approach of the loop in which Arg90 is located. This is the first report of an experimental structure of a DNA complex that includes a protein belonging to the H-NS family. The indirect readout mechanism not only explains the capacity of H-NS and other H-NS family members to modulate the expression of a large number of genes but also the origin of the specificity displayed by Ler. Our results point to a general mechanism by which horizontally acquired genes may be specifically recognized by members of the H-NS family. PMID:22114557

  2. Indirect DNA readout by an H-NS related protein: structure of the DNA complex of the C-terminal domain of Ler.

    PubMed

    Cordeiro, Tiago N; Schmidt, Holger; Madrid, Cristina; Juárez, Antonio; Bernadó, Pau; Griesinger, Christian; García, Jesús; Pons, Miquel

    2011-11-01

    Ler, a member of the H-NS protein family, is the master regulator of the LEE pathogenicity island in virulent Escherichia coli strains. Here, we determined the structure of a complex between the DNA-binding domain of Ler (CT-Ler) and a 15-mer DNA duplex. CT-Ler recognizes a preexisting structural pattern in the DNA minor groove formed by two consecutive regions which are narrower and wider, respectively, compared with standard B-DNA. The compressed region, associated with an AT-tract, is sensed by the side chain of Arg90, whose mutation abolishes the capacity of Ler to bind DNA. The expanded groove allows the approach of the loop in which Arg90 is located. This is the first report of an experimental structure of a DNA complex that includes a protein belonging to the H-NS family. The indirect readout mechanism not only explains the capacity of H-NS and other H-NS family members to modulate the expression of a large number of genes but also the origin of the specificity displayed by Ler. Our results point to a general mechanism by which horizontally acquired genes may be specifically recognized by members of the H-NS family.

  3. Computational nanometrology of line-edge roughness: noise effects, cross-line correlations and the role of etch transfer

    NASA Astrophysics Data System (ADS)

    Constantoudis, Vassilios; Papavieros, George; Lorusso, Gian; Rutigliani, Vito; Van Roey, Frieda; Gogolides, Evangelos

    2018-03-01

    The aim of this paper is to investigate the role of etch transfer in two challenges of LER metrology raised by recent evolutions in lithography: the effects of SEM noise, and cross-line and cross-edge correlations. The first comes from the ongoing scaling down of linewidths, which dictates SEM imaging with fewer scanning frames to reduce specimen damage and hence with more noise. During the last decade, it has been shown that image noise can contribute an important part of the measured LER budget and systematically alters the PSD curve of LER at high frequencies. A recent method for unbiased LER measurement is based on systematic Fourier or correlation analysis to separate the effects of noise from the true LER (the Fourier-Correlation filtering method). The success of the method depends on the PSD and HHCF curves. Previous experimental and modeling works have revealed that etch transfer affects the PSD of LER, reducing its high-frequency values. In this work, we estimate the noise contribution to the biased LER through the flat PSD floor at high frequencies and relate it to the differences between the PSDs of lithography and etched LER. Based on this comparison, we propose an improvement of the PSD/HHCF-based method for noise-free LER measurement to include the missed high-frequency real LER. The second issue is related to the increased density of lithographic patterns and the special characteristics that DSA and MP lithography patterns exhibit. In a previous work, we presented an enlarged LER characterization methodology for such patterns, which includes updated versions of the old metrics along with new metrics defined and developed to capture cross-edge and cross-line correlations. The fundamental concepts have been the Line Center Roughness (LCR), the edge c-factor and the line c-factor correlation functions and lengths, quantifying the line fluctuations and the extent of cross-edge and cross-line correlations. In this work, we focus on the role of etch steps in the cross-edge and cross-line correlation metrics in SAQP data. We find that the spacer etch steps reduce edge correlations, while etch steps with pattern transfer increase them. Furthermore, density doubling and quadrupling increase edge correlations as well as cross-line correlations.
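
    A schematic of two of the cross-correlation metrics named above (editor's sketch; the published c-factor definitions are more elaborate): line-center roughness from the midline of two edges, and a simple cross-edge correlation coefficient.

      import numpy as np

      def lcr_and_edge_correlation(left, right):
          # Line-center roughness (3-sigma of the midline) and a schematic
          # cross-edge correlation: +1 means the edges move together.
          center = 0.5 * (left + right)
          lcr = 3.0 * center.std(ddof=1)
          c_edge = np.corrcoef(left, right)[0, 1]
          return lcr, c_edge

      rng = np.random.default_rng(7)
      left = np.cumsum(rng.normal(0.0, 0.1, 512))
      right = 0.7 * left + rng.normal(0.0, 0.2, 512) + 30.0  # partly correlated
      print(lcr_and_edge_correlation(left, right))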

  4. LEE-encoded regulator (Ler) mutants elicit serotype-specific protection, but not cross protection, against attaching and effacing E. coli strains.

    PubMed

    Zhu, C; Feng, S; Yang, Z; Davis, K; Rios, H; Kaper, J B; Boedeker, E C

    2007-02-26

    We previously showed that single-dose orogastric immunization with an attenuated regulatory LEE-encoded regulator (ler) mutant of the rabbit enteropathogenic Escherichia coli (REPEC) strain E22 (O103:H2) protected rabbits from fatal infection with the highly virulent parent strain. In the current study we assessed the degree of homologous (serotype-specific) and heterologous (cross-serotype) protection induced by immunization with REPEC ler mutant strains of differing serotypes, or with the prototype strain RDEC-1 (O15:H-), which expresses a full array of ler-upregulated proteins. We constructed an additional ler mutant using RDEC-1, thus permitting immunization with a ler mutant of either serotype, O15 or O103, followed by challenge with a virulent REPEC strain of the same or a different serotype. Consistent with our previous data, the current study demonstrated that rabbits immunized with an RDEC-1 ler mutant were protected from challenge with virulent RDEC-H19A (RDEC-1 transduced with the Shiga toxin-producing phage H19A) of the same serotype. Rabbits immunized with RDEC-1 or E22 derivative ler mutants demonstrated significant increases in serum antibody titers to the respective whole bacterial cells expressing O antigen, but not to the LEE-encoded proteins. However, immunization with the ler mutants of either E22 or RDEC-1 failed to protect rabbits from infection with virulent organisms belonging to different serotypes. In contrast, rabbits immunized with the prototype RDEC-1 were cross-protected against challenge with the heterologous E22 strain, as shown by normal weight gain and the absence of clinical signs of disease or characteristic attaching and effacing (A/E) lesions. Immunization with RDEC-1 induced significantly elevated serum IgG titers to LEE-encoded proteins. We thus demonstrated homologous protection induced by the REPEC ler mutants and heterologous protection by RDEC-1. The observed correlation between elevated immune responses to the LEE-encoded proteins and protection against challenge with a heterologous virulent REPEC strain suggests that serotype-non-specific cross protection requires the expression of, and induction of antibody to, LEE-encoded virulence factors.

  5. Life Design-Ethics-Religion Studies: Non-Confessional RE in Brandenburg (Germany)

    ERIC Educational Resources Information Center

    Kenngott, Eva-Maria

    2017-01-01

    "Life Design-Ethics-Religion Studies" (LER) is the only non-confessional form of religious education (RE) in Germany. Six years after German reunification, the federal state of Brandenburg introduced LER with its dimension of non-confessional RE into the school curriculum. In this contribution, LER will be elucidated in three steps.…

  6. Multifractal analysis of line-edge roughness

    NASA Astrophysics Data System (ADS)

    Constantoudis, Vassilios; Papavieros, George; Lorusso, Gian; Rutigliani, Vito; van Roey, Frieda; Gogolides, Evangelos

    2018-03-01

    In this paper, we propose to rethink the issue of LER characterization on the basis of the fundamental concept of symmetries. In LER one can consider two kinds of symmetry: (a) translational symmetry, characterized by periodicity, and (b) scaling symmetry, quantified by the fractal dimension. Up to now, most work has addressed the first symmetry, since the Power Spectral Density (PSD), which has been extensively studied recently, is a decomposition of the LER signal into periodic edges and a quantification of the 'power' of each periodicity in the real LER. The aim of this paper is to focus on the second symmetry, scaling invariance. Similarly to the PSD, we introduce the multifractal approach to LER analysis, which generalizes the scaling analysis of standard (mono)fractal theory and decomposes LER into fractal edges characterized by specific fractal dimensions. The main benefit of multifractal analysis is that it enables the characterization of the multi-scaling contributions of the different mechanisms involved in LER formation. In the first part of our work, we present concisely the multifractal theory of line edges and utilize the box counting method for its implementation and the extraction of the multifractal spectrum. Special emphasis is given to explaining the physical meaning of the obtained multifractal spectrum, whose asymmetry quantifies the degree of multifractality. In addition, we propose a distinction between peak-based and valley-based multifractality, according to whether the asymmetry of the multifractal spectrum comes from the sharp peaks of line material extending into space regions or from the cavities of the line material (edge valleys). In the second part, we systematically study the evolution of the LER multifractal spectrum during the first successive steps of a multiple (quadruple) patterning lithography technique and find an interesting transition from peak-based multifractal behavior in the first litho resist LER to valley-based multifractality caused mainly by the effects of the etch pattern transfer steps.
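
    A minimal box-counting sketch for a single edge profile (editor's illustration; a full multifractal spectrum generalizes this by weighting the box occupancies with different moment orders q):

      import numpy as np

      def box_counting_dimension(profile):
          # Rescale the profile onto an n x n grid, count occupied boxes at
          # several box sizes, and fit: dimension = -d log(N) / d log(size).
          n = profile.size
          x = np.arange(n, dtype=float)
          y = (profile - profile.min()) / np.ptp(profile) * (n - 1)
          sizes = [4, 8, 16, 32, 64, 128]
          counts = [len(set(zip((x // s).astype(int).tolist(),
                                (y // s).astype(int).tolist())))
                    for s in sizes]
          slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
          return -slope

      rng = np.random.default_rng(8)
      edge = np.cumsum(rng.normal(0.0, 1.0, 4096))  # Brownian-like edge, D ~ 1.5
      print(box_counting_dimension(edge))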

  7. Low Earth Orbit Raider (LER) winged air launch vehicle concept

    NASA Technical Reports Server (NTRS)

    Feaux, Karl; Jordan, William; Killough, Graham; Miller, Robert; Plunk, Vonn

    1989-01-01

    The need to launch small payloads into low Earth orbit has increased dramatically during the past several years. The Low Earth Orbit Raider (LER) is an answer to this need. The LER is an air-launched, winged vehicle designed to carry a 1500-pound payload into a 250-nautical-mile orbit. The LER is launched from the back of a 747-100B at 35,000 feet and a Mach number of 0.8. Three staged solid-propellant motors offer safe ground and flight handling, reliable operation, and decreased fabrication cost. The wing provides lift for 747 separation and during the first-stage burn. Aerodynamic controls are also provided to simplify first-stage maneuvers. The air-launch concept offers many advantages to the consumer compared to conventional methods. Launching at 35,000 feet considerably lowers atmospheric drag and other loads on the vehicle. Since the 747 is a mobile launch pad, flexibility in orbit selection and launch time is unparalleled; even polar orbits are accessible with a decreased payload. Most importantly, the LER launch service can come to the customer: satellites and experiments need not be transported to ground-based launch facilities. The LER is designed to offer increased consumer freedom at a lower cost than existing launch systems. A simple design emphasizing reliability at low cost allows for the light payloads of the LER.

  8. High-Temperature Tolerance of Photosynthesis Can Be Linked to Local Electrical Responses in Leaves of Pea

    PubMed Central

    Sukhov, Vladimir; Gaspirovich, Vladimir; Mysyagin, Sergey; Vodeneev, Vladimir

    2017-01-01

    It is known that numerous stimuli induce electrical signals which can increase a plant's tolerance to stressors, including high temperature. However, the physiological role of local electrical responses (LERs), i.e., responses in the zone of stimulus action, in the plant's tolerance has not been sufficiently investigated. The aim of the current work is to analyze the connection between parameters of LERs and the thermal tolerance of photosynthetic processes in pea. Electrical activity and photosynthetic parameters in pea leaves were recorded during transitions of air temperature in a measurement head (from 23 to 30°C, from 30 to 40°C, from 40 to 45°C, and from 45 to 23°C). This stepped heating decreased photosynthetic assimilation of CO2 and induced generation of LERs in the heated leaf. The amplitudes of LERs, the number of responses during heating, and the number of the temperature transition that induced the first generation of LERs varied among different pea plants. Parameters of LERs were weakly connected with photosynthetic assimilation of CO2 during heating; however, residual photosynthetic activity after treatment with high temperatures increased with growing amplitudes and numbers of LERs and with a lower number of the heating transition inducing the first electrical response. The effect was not connected with photosynthetic activity before heating; similar dependences were also observed for the effective and maximal quantum yields of photosystem II after heating. We believe that the observed effect can reflect a positive influence of LERs on the thermal tolerance of photosynthesis. It is possible that this process participates in a plant's adaptation to stressors. PMID:29033854

  9. Modelled and measured effects of clouds on UV Aerosol Indices on a local, regional, and global scale

    NASA Astrophysics Data System (ADS)

    Penning de Vries, M.; Wagner, T.

    2011-12-01

    The UV Aerosol Indices (UVAI) form one of the very few available tools in satellite remote sensing that provide information on aerosol absorption. The UVAI are also quite insensitive to surface type and are determined in the presence of clouds - situations where most aerosol retrieval algorithms do not work. The UVAI are most sensitive to elevated layers of absorbing aerosols, such as mineral dust and smoke, but they can also be used to study non-absorbing aerosols, such as sulphate and secondary organic aerosols. Although UVAI are determined for cloud-contaminated pixels, clouds do affect the value of UVAI in several ways: (1) they shield the underlying scene (potentially containing aerosols) from view, (2) they enhance the apparent surface albedo of an elevated aerosol layer, and (3) clouds unpolluted by aerosols also yield non-zero UVAI, here referred to as "cloudUVAI". The main purpose of this paper is to demonstrate that clouds can cause significant UVAI and that this cloudUVAI can be well modelled using simple assumptions on cloud properties. To this end, we modelled cloudUVAI by using measured cloud optical parameters - either with low spatial resolution from SCIAMACHY, or high resolution from MERIS - as input. The modelled cloudUVAI were compared with UVAI determined from SCIAMACHY reflectances on different spatial (local, regional and global) and temporal scales (single measurement, daily means and seasonal means). The general dependencies of UVAI on cloud parameters were quite well reproduced, but several issues remain unclear: compared to the modelled cloudUVAI, measured UVAI show a bias, in particular for large cloud fractions. Also, the spread in measured UVAI is larger than in modelled cloudUVAI. In addition to the original, Lambert Equivalent Reflector (LER)-based UVAI algorithm, we have also investigated the effects of clouds on UVAI determined using the so-called Modified LER (MLER) algorithm (currently applied to TOMS and OMI data). For medium-sized clouds the MLER algorithm performs better (UVAI are closer to 0), but like LER UVAI, MLER UVAI can become as large as -1.2 for small clouds and deviate significantly from zero for cloud fractions near 1. The effects of clouds should therefore also be taken into account when MLER UVAI data are used. Because the effects of clouds and aerosols on UVAI are not independent, a simple subtraction of modelled cloudUVAI from measured UVAI does not yield a UVAI representative of a cloud-free scene when aerosols are present. We here propose a first, simple approach for the correction of cloud effects on UVAI. The method is shown to work reasonably well for small to medium-sized clouds located above aerosols.

  10. Characteristics of high and low energy reporting teenagers and their relationship to low energy reporting mothers.

    PubMed

    Vågstrand, Karin; Lindroos, Anna Karin; Linné, Yvonne

    2009-02-01

    Objective: To describe the differences in socio-economic characteristics and body measurements between low, adequate and high energy reporting (LER, AER and HER) teenagers, and to investigate the relationship to misreporting mothers. Design: Cross-sectional study. Habitual dietary intake was reported in a questionnaire; classification into LER, AER and HER used the Goldberg equation within three activity groups based on a physical activity questionnaire and calculated BMR. Setting: Stockholm, Sweden. Subjects: Four hundred and forty-one 16-17-year-old teenagers (57% girls) and their mothers. Results: Of the teenagers, 17-19% were classified as HER and 13-16% as LER. There was a highly significant trend from HER to LER in BMI (P < 0.001) and body fat % (P < 0.001). There was also a trend in the number of working hours of the mother (P = 0.01), family income (P = 0.008) and number of siblings (among boys only) (P = 0.02), but not in the educational level of either father or mother. HER teenagers were lean, had mothers working fewer hours with lower income, and had siblings. It was more likely that an LER girl had an LER mother than an AER mother (OR = 3.32; P = 0.002). Conclusions: The reasons for the high number of over-reporters could be many: misclassification due to growth, a lack of established eating patterns due to young age, or method-specific factors. Nevertheless, the inverted characteristics of HER compared to LER indicate that this is a specific group, worth further investigation.
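
    The Goldberg-style classification can be sketched as a cut-off test on the ratio of reported energy intake to basal metabolic rate (editor's illustration; published cut-offs depend on study duration, sample size, and the assumed physical activity level, so the values below are placeholders):

      def classify_reporting(ei_kcal, bmr_kcal, pal, lower=0.76, upper=1.24):
          # Compare the EI:BMR ratio against Goldberg-style cut-offs around
          # the assumed physical activity level (PAL). Cut-off factors here
          # are placeholders, not the study's exact values.
          ratio = ei_kcal / bmr_kcal
          if ratio < pal * lower:
              return "LER"
          if ratio > pal * upper:
              return "HER"
          return "AER"

      print(classify_reporting(ei_kcal=1500.0, bmr_kcal=1550.0, pal=1.6))  # LER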

  11. Root Secreted Metabolites and Proteins Are Involved in the Early Events of Plant-Plant Recognition Prior to Competition

    PubMed Central

    Badri, Dayakar V.; De-la-Peña, Clelia; Lei, Zhentian; Manter, Daniel K.; Chaparro, Jacqueline M.; Guimarães, Rejane L.; Sumner, Lloyd W.; Vivanco, Jorge M.

    2012-01-01

    The mechanism whereby organisms interact and differentiate between others has been at the forefront of scientific inquiry, particularly in humans and certain animals. It is widely accepted that plants also interact, but the degree of this interaction has been constricted to competition for space, nutrients, water and light. Here, we analyzed the root secreted metabolites and proteins involved in early plant neighbor recognition by using the Arabidopsis thaliana Col-0 ecotype (Col) as our focal plant, co-cultured in vitro with different neighbors [A. thaliana Ler ecotype (Ler) or Capsella rubella (Cap)]. Principal component and cluster analyses revealed that both root secreted secondary metabolites and proteins clustered separately between the plants grown individually (Col-0, Ler and Cap grown alone) and the plants co-cultured with two homozygous individuals (Col-Col, Ler-Ler and Cap-Cap) or with different individuals (Col-Ler and Col-Cap). In particular, we observed that a greater number of defense- and stress-related proteins were secreted when our control plant, Col, was grown alone as compared to when it was co-cultured with another homozygous individual (Col-Col) or with a different individual (Col-Ler and Col-Cap). However, the total amount of defense proteins in the exudates of the co-cultures was higher than in the plant alone. The opposite pattern of expression was identified for stress-related proteins. These data suggest that plants can sense and respond to the presence of different plant neighbors and that the level of relatedness is perceived upon initial interaction. Furthermore, the role of secondary metabolites and defense- and stress-related proteins widely involved in plant-microbe associations and abiotic responses warrants reassessment for plant-plant interactions. PMID:23056382

  12. Melting Penetration Simulation of Fe-U System at High Temperature Using MPS_LER

    NASA Astrophysics Data System (ADS)

    Mustari, A. P. A.; Yamaji, A.; Irwanto, Dwi

    2016-08-01

    Melting penetration information for the Fe-U system is necessary for simulating molten core behavior during severe accidents in nuclear power plants. For the Fe-U system, this information is mainly obtained from experiments, i.e., the TREAT experiment. However, there are no reported data on SS304 at temperatures above 1350°C. The MPS_LER code has been developed and validated to simulate melting penetration in the Fe-U system. MPS_LER models the eutectic phenomenon by solving the diffusion process and applying binary phase diagram criteria. This study simulates the melting penetration of the system at higher temperatures using MPS_LER. Simulations were conducted on SS304 at 1400, 1450 and 1500°C. The simulation results show a rapid increase of the melting penetration rate.
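
    In the spirit of the described treatment (editor's sketch, not the MPS_LER particle code): one explicit 1-D diffusion step with a phase-diagram concentration threshold flagging cells as molten. Geometry, coefficients, and the threshold are placeholders; boundaries are periodic for brevity.

      import numpy as np

      def diffuse_and_flag(c, d_coef, dx, dt, c_liq):
          # Explicit 1-D diffusion step (periodic boundaries via np.roll);
          # cells whose solute concentration exceeds the liquidus criterion
          # c_liq from the binary phase diagram are flagged as molten.
          lap = (np.roll(c, 1) - 2.0 * c + np.roll(c, -1)) / dx ** 2
          c = c + d_coef * dt * lap
          return c, c > c_liq

      c = np.zeros(100)
      c[:10] = 1.0                           # U-rich melt contacting steel
      for _ in range(2000):
          c, molten = diffuse_and_flag(c, d_coef=1e-9, dx=1e-5, dt=0.02,
                                       c_liq=0.3)
      print("cells flagged molten:", int(molten.sum()), "of 100")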

  13. Reducing Line Edge Roughness in Si and SiN through plasma etch chemistry optimization for photonic waveguide applications

    NASA Astrophysics Data System (ADS)

    Marchack, Nathan; Khater, Marwan; Orcutt, Jason; Chang, Josephine; Holmes, Steven; Barwicz, Tymon; Kamlapurkar, Swetha; Green, William; Engelmann, Sebastian

    2017-03-01

    The LER and LWR of subtractively patterned Si and SiN waveguides were calculated after each step in the process. It was found for Si waveguides that adjusting the CF4:CHF3 ratio during the hard mask open step produced reductions in LER of 26% and 43% from the initial lithography for isolated waveguides patterned with partial and full etches, respectively. However, for final LER values of 3.0 and 2.5 nm on fully etched Si waveguides, the corresponding optical loss measurements were indistinguishable. For SiN waveguides, introduction of C4H9F into the conventional CF4/CHF3 chemistry reduced the mask height budget by a factor of 5 while reducing LER by 26% from the initial lithography.

  14. Line edge roughness (LER) mitigation studies specific to interference-like lithography

    NASA Astrophysics Data System (ADS)

    Baylav, Burak; Estroff, Andrew; Xie, Peng; Smith, Bruce W.

    2013-04-01

    Line edge roughness (LER) is a problem common to most lithography approaches and is seen as the main resolution limiter for advanced technology nodes. There are several contributors to LER, such as chemical/optical shot noise, the random nature of acid diffusion, the development process, and the concentrations of acid generator and base quencher. Since interference-like lithography (IL) is used to define one-directional gridded patterns, LER mitigation approaches specific to IL-like imaging can be explored. Two methods investigated in this work toward this goal are (i) translational image averaging along the line direction and (ii) pupil-plane filtering. Experiments on the former were performed on both interferometric and projection lithography systems. Projection lithography experiments showed a small reduction in low/mid-frequency LER for image-averaged cases at a pitch of 150 nm (193 nm illumination, 0.93 NA), with less change at smaller pitches. Aerial image smearing did not significantly increase LER since it was directional. Simulation showed less than 1% reduction in NILS (compared to a static, smooth-mask equivalent) with ideal alignment. In addition, a description of the effect of pupil-plane filtering on the transfer of mask roughness is given. When astigmatism-like aberrations were introduced in the pupil, the transfer of mask roughness decreased at best focus. It is important to exclude the main diffraction orders from the filtering to prevent contrast and NILS loss. These ideas can be valuable as projection lithography approaches conditions similar to IL (e.g., strong RET methods).
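
    As a rough illustration of translational image averaging, the following sketch smears a synthetic aerial image with a rough edge along the line direction and shows the per-row threshold edge becoming smoother; the image model, window sizes and threshold are assumptions for illustration, not the paper's experimental conditions:

    ```python
    # Synthetic aerial image with a rough edge, averaged along y.
    import numpy as np

    rng = np.random.default_rng(0)
    ny, nx = 512, 256
    edge = 100 + np.cumsum(rng.normal(0, 0.3, ny))   # correlated rough edge [px]
    edge -= edge.mean() - 100
    x = np.arange(nx)
    image = 1.0 / (1.0 + np.exp(-(x[None, :] - edge[:, None]) / 3.0))  # soft edge

    def edge_positions(img, thr=0.5):
        # per-row threshold crossing (rows are monotone in x here)
        return np.array([np.interp(thr, row, x) for row in img])

    def smear(img, window):
        # average each row with its neighbours along the line direction
        kernel = np.ones(window) / window
        return np.apply_along_axis(lambda col: np.convolve(col, kernel, "same"), 0, img)

    for w in (1, 9, 33):
        sigma = edge_positions(smear(image, w)).std(ddof=1)
        print(f"averaging window {w:2d} rows -> 1-sigma LER {sigma:.2f} px")
    ```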

  15. Improvement in electron-beam lithography throughput by exploiting relaxed patterning fidelity requirements with directed self-assembly

    NASA Astrophysics Data System (ADS)

    Yu, Hao Yun; Liu, Chun-Hung; Shen, Yu Tian; Lee, Hsuan-Ping; Tsai, Kuen Yu

    2014-03-01

    Line edge roughness (LER), which influences the electrical performance of circuit components, is a key challenge for electron-beam lithography (EBL) due to the continuous scaling of technology feature sizes. Controlling LER within an acceptable tolerance that satisfies International Technology Roadmap for Semiconductors requirements while achieving high throughput has become challenging. Although lower dosage and more sensitive resists can be used to improve throughput, they result in serious LER-related problems because of the increased relative fluctuation in the incident positions of electrons. Directed self-assembly (DSA) is a promising technique for relaxing LER-related pattern fidelity (PF) requirements because of its self-healing ability, which may benefit throughput. To quantify the potential throughput improvement in EBL from introducing DSA for post-healing, rigorous numerical methods are proposed that maximize throughput by adjusting the writing parameters of EBL systems subject to relaxed LER-related PF requirements. A fast continuous model for parameter sweeping and a hybrid model for more accurate patterning prediction are employed for the patterning simulation. The tradeoff between throughput and DSA self-healing ability is investigated. Preliminary results indicate that significant throughput improvements are achievable under certain process conditions.

  16. Quick Attach Docking Interface for Lunar Electric Rover

    NASA Technical Reports Server (NTRS)

    Schuler, Jason M.; Nick, Andrew J.; Immer, Christopher; Mueller, Robert P.

    2010-01-01

    The NASA Lunar Electric Rover (LER) has been developed at Johnson Space Center as a next-generation mobility platform. Based on a twelve-wheel omni-directional chassis with active suspension, the LER introduces a number of novel capabilities for lunar exploration in both manned and unmanned scenarios. Besides being the primary vehicle for astronauts on the lunar surface, the LER will perform tasks such as lunar regolith handling (including dozing, grading, and excavation), equipment transport, and science operations. To support these additional tasks, a team at the Kennedy Space Center has produced a universal attachment interface for the LER known as the Quick Attach. The Quick Attach is a compact system that has been retrofitted to the rear of the LER, giving it the ability to dock and undock on the fly with various implements. The Quick Attach utilizes a two-stage docking approach; the first is a mechanical mate which aligns and latches a passive set of hooks on an implement with an actuated cam surface on the LER. The mechanical stage is tolerant to misalignment between the implement and the LER during docking, and once the implement is captured a preload is applied to ensure a positive lock. The second stage is an umbilical connection, which consists of a dust-resistant enclosure housing a compliant mechanism that is optionally actuated to mate electrical and fluid connections for suitable implements. The Quick Attach system was designed with the largest foreseen input loads in mind, including excavation operations and large-mass utility attachments. The Quick Attach system was demonstrated at the Desert Research And Technology Studies (D-RATS) field test in Flagstaff, AZ, along with the lightweight dozer blade LANCE. The LANCE blade is the first implement to utilize the Quick Attach interface and demonstrated the tolerance, speed, and strength of the system in a lunar analog environment.

  17. Gate line edge roughness amplitude and frequency variation effects on intra die MOS device characteristics

    NASA Astrophysics Data System (ADS)

    Hamadeh, Emad; Gunther, Norman G.; Niemann, Darrell; Rahman, Mahmud

    2006-06-01

    Random fluctuations in fabrication process outcomes, such as gate line edge roughness (LER), give rise to corresponding fluctuations in scaled-down MOS device characteristics. A thermodynamic-variational model is presented to study the effects of LER on the threshold voltage and capacitance of sub-50 nm MOS devices. Conceptually, we treat the geometric definition of the MOS devices on a die as consisting of a collection of gates. In turn, each of these gates has an area, A, and a perimeter, P, defined by nominally straight lines subject to random process outcomes producing roughness. We treat roughness as deviations from straightness consisting of both a transverse amplitude and a longitudinal wavelength, each having a lognormal distribution. We obtain closed-form expressions for the variance of threshold voltage (Vth) and device capacitance (C) at the Onset of Strong Inversion (OSI) for a small device. Using our variational model, we characterized device electrical properties such as σVth and σC in terms of the statistical parameters of the roughness amplitude and spatial frequency, i.e., the inverse roughness wavelength. We then verified our model with numerical analysis of Vth roll-off for small devices and of σVth due to dopant fluctuation. Our model was also benchmarked against TCAD calculations of σVth as a function of LER. We then extended our analysis to predict variations in σVth and σC versus average LER spatial frequency and amplitude, and oxide thickness. Given the intuitive expectation that LER of very short wavelengths must also have small amplitude, we investigated the case in which the amplitude mean is inversely related to the frequency mean, and compared it with the situation in which the amplitude and frequency means are unrelated. Given also that the gate perimeter may have a different LER signature on each side, we extended our analysis to the cases in which the LER statistical difference between gate sides is moderate as well as significantly large.
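
    The lognormal amplitude/frequency picture can be illustrated numerically. The sketch below samples sinusoidal gate-edge roughness with lognormally distributed amplitude and spatial frequency and uses the width-averaged gate-length fluctuation as a crude stand-in for the Vth/capacitance variance; every distribution parameter is an assumption, and the correlated case simply ties amplitude to 1/frequency as discussed above:

    ```python
    # Hedged numerical sketch: lognormal roughness amplitude and spatial
    # frequency, with effective gate-length fluctuation as a proxy metric.
    import numpy as np

    rng = np.random.default_rng(1)
    n_dev, width = 20000, 50.0                  # devices, gate width [nm]
    y = np.linspace(0.0, width, 101)

    def sigma_eff_length(corr):
        f = rng.lognormal(mean=np.log(0.1), sigma=0.4, size=n_dev)      # [1/nm]
        if corr:
            a = 0.2 / f                          # amplitude inversely tied to frequency
        else:
            a = rng.lognormal(mean=np.log(2.0), sigma=0.4, size=n_dev)  # [nm]
        phase = rng.uniform(0, 2 * np.pi, n_dev)
        # width-averaged edge displacement per device
        dL = (a[:, None] * np.sin(2 * np.pi * f[:, None] * y + phase[:, None])).mean(1)
        return dL.std(ddof=1)

    print("sigma(L_eff), independent A and f :", sigma_eff_length(False))
    print("sigma(L_eff), A ~ 1/f             :", sigma_eff_length(True))
    ```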

  18. Lunar Surface Operations with Dual Rovers

    NASA Technical Reports Server (NTRS)

    Horz, Friedrich; Lofgren, Gary E.; Eppler, Dean E.; Ming, Douglas

    2010-01-01

    Lunar Electric Rovers (LER) are currently being developed that are substantially more capable than the Apollo-era Lunar Roving Vehicle (LRV). Unlike the LRV, the new LERs provide a pressurized cabin that serves as a short-sleeve environment for a crew of two, including sleeping accommodations and other provisions that allow for long-term stays on the lunar surface, possibly up to 60 days, without the need to replenish consumables from an outside source such as a lander or outpost. As a consequence, significantly larger regions may be explored in the future, and traverse distances may be measured in a few hundred kilometers (1, 2). However, crew safety remains an overriding concern, and methods other than "walk back", the major operational constraint of all Apollo traverses, must be implemented to assure, at any time, the safe return of the crew to the lander or outpost. Current Constellation plans therefore envision long-term traverses being conducted exclusively with 2 LERs, each carrying a crew of two: in case one rover fails, the other will rescue the stranded crew and return all 4 astronauts to base camp in a single LER. Recent Desert Research and Technology Studies (DRATS) analog field tests simulated a continuous 14-day traverse (3), covering some 135 km, and included a rescue operation that transferred the crew and diverse consumables from one LER to another; these successful tests add substantial realism to the development of long-term, dual-rover operations. The simultaneous utilization of 2 LERs is of course totally unlike Apollo and raises interesting issues regarding science productivity and mission operations, which are the thrust of this note.

  19. Masking of endotoxin in surfactant samples: Effects on Limulus-based detection systems.

    PubMed

    Reich, Johannes; Lang, Pierre; Grallert, Holger; Motschmann, Hubert

    2016-09-01

    Over the last few decades, Limulus Amebocyte Lysate (LAL) has been the most sensitive method for the detection of endotoxins (lipopolysaccharides) and is well accepted in a broad field of applications. Recently, Low Endotoxin Recovery (LER) has been noticed in biopharmaceutical drug products, whereby the detection of potential endotoxin contamination is not ensured. Notably, most of these drug products contain surfactants, which can have crucial effects on the detectability of endotoxin. In order to analyze the driving forces of LER, endotoxin detection in samples containing nonionic surfactants in various buffer systems was investigated. The results show that the process of LER is kinetically controlled and temperature-dependent. Furthermore, only the simultaneous presence of nonionic surfactants and components capable of forming metal complexes resulted in LER. In addition, capacity experiments show that even hazardous amounts of endotoxin can remain undetectable in such formulation compositions. In conclusion, the LER phenomenon is caused by endotoxin masking and not by test interference. In this process, the supramolecular structure of endotoxin is altered and exhibits only a limited susceptibility to binding to Factor C of Limulus-based detection systems. We propose a two-step mechanism of endotoxin masking by complex-forming agents and nonionic surfactants. Copyright © 2016 The Author(s). Published by Elsevier Ltd. All rights reserved.

  20. Analysis of natural allelic variation at seed dormancy loci of Arabidopsis thaliana.

    PubMed Central

    Alonso-Blanco, Carlos; Bentsink, Leónie; Hanhart, Corrie J; Blankestijn-de Vries, Hetty; Koornneef, Maarten

    2003-01-01

    Arabidopsis accessions differ largely in their seed dormancy behavior. To understand the genetic basis of this intraspecific variation we analyzed two accessions: the laboratory strain Landsberg erecta (Ler) with low dormancy and the strong-dormancy accession Cape Verde Islands (Cvi). We used a quantitative trait loci (QTL) mapping approach to identify loci affecting the after-ripening requirement measured as the number of days of seed dry storage required to reach 50% germination. Thus, seven QTL were identified and named delay of germination (DOG) 1-7. To confirm and characterize these loci, we developed 12 near-isogenic lines carrying single and double Cvi introgression fragments in a Ler genetic background. The analysis of these lines for germination in water confirmed four QTL (DOG1, DOG2, DOG3, and DOG6) as showing large additive effects in Ler background. In addition, it was found that DOG1 and DOG3 genetically interact, the strong dormancy determined by DOG1-Cvi alleles depending on DOG3-Ler alleles. These genotypes were further characterized for seed dormancy/germination behavior in five other test conditions, including seed coat removal, gibberellins, and an abscisic acid biosynthesis inhibitor. The role of the Ler/Cvi allelic variation in affecting dormancy is discussed in the context of current knowledge of Arabidopsis germination. PMID:12807791

  1. Analysis of natural allelic variation at seed dormancy loci of Arabidopsis thaliana.

    PubMed

    Alonso-Blanco, Carlos; Bentsink, Leónie; Hanhart, Corrie J; Blankestijn-de Vries, Hetty; Koornneef, Maarten

    2003-06-01

    Arabidopsis accessions differ largely in their seed dormancy behavior. To understand the genetic basis of this intraspecific variation we analyzed two accessions: the laboratory strain Landsberg erecta (Ler) with low dormancy and the strong-dormancy accession Cape Verde Islands (Cvi). We used a quantitative trait loci (QTL) mapping approach to identify loci affecting the after-ripening requirement measured as the number of days of seed dry storage required to reach 50% germination. Thus, seven QTL were identified and named delay of germination (DOG) 1-7. To confirm and characterize these loci, we developed 12 near-isogenic lines carrying single and double Cvi introgression fragments in a Ler genetic background. The analysis of these lines for germination in water confirmed four QTL (DOG1, DOG2, DOG3, and DOG6) as showing large additive effects in Ler background. In addition, it was found that DOG1 and DOG3 genetically interact, the strong dormancy determined by DOG1-Cvi alleles depending on DOG3-Ler alleles. These genotypes were further characterized for seed dormancy/germination behavior in five other test conditions, including seed coat removal, gibberellins, and an abscisic acid biosynthesis inhibitor. The role of the Ler/Cvi allelic variation in affecting dormancy is discussed in the context of current knowledge of Arabidopsis germination.

  2. Separating the optical contributions to line-edge roughness in EUV lithography using stochastic simulations

    NASA Astrophysics Data System (ADS)

    Chunder, Anindarupa; Latypov, Azat; Chen, Yulu; Biafore, John J.; Levinson, Harry J.; Bailey, Todd

    2017-03-01

    Minimization and control of line-edge roughness (LER) and contact-edge roughness (CER) is one of the current challenges limiting EUV line-space and contact-hole printability. One significant contributor to feature roughness and CD variability in EUV is photon shot noise (PSN); the others are the physical and chemical processes in photoresists, known as resist stochastic effects. Different approaches are available to mitigate each of these contributions. In order to facilitate this mitigation, it is important to assess the magnitude of each contribution separately from the others. In this paper, we present and test a computational approach based on the concept of an `ideal resist'. An ideal resist is assumed to be devoid of all resist stochastic effects. Hence, such an ideal resist can only be simulated as an `ideal resist model' (IRM), through explicit utilization of the Poisson statistics of PSN or direct Monte Carlo simulation of photon absorption in the resist. LER estimated using the IRM thus quantifies the exclusive contribution of PSN to LER. The simulation study done using the IRM indicates a large contribution (60%) from PSN to the total (final) LER for a sufficiently optimized, high-dose, state-of-the-art EUV chemically amplified resist (CAR) model.
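
    The ideal-resist idea reduces to a small Monte Carlo experiment: expose a noiseless aerial image with Poisson-distributed photon counts and measure the resulting edge variation. The sketch below does this for an assumed sinusoidal image; pitch, pixel size, dose levels and the 50% threshold are illustrative, not the paper's settings:

    ```python
    # PSN-only LER: the only stochastic ingredient is photon shot noise.
    import numpy as np

    rng = np.random.default_rng(2)
    pitch, pix = 22.0, 0.5                     # nm line/space pitch, pixel [nm]
    x = np.arange(0.0, pitch, pix)
    aerial = 0.5 * (1 + np.cos(2 * np.pi * x / pitch))   # normalized intensity

    def ler_for_dose(photons_per_pixel, n_rows=2000):
        counts = rng.poisson(aerial * photons_per_pixel, size=(n_rows, x.size))
        thr = 0.5 * photons_per_pixel
        # first pixel per row where the absorbed-photon count drops below threshold
        edges = np.argmax(counts < thr, axis=1) * pix
        return 3.0 * edges.std(ddof=1)          # 3-sigma LER [nm]

    for n in (10, 50, 200):
        print(f"{n:4d} photons/pixel at peak -> PSN-only 3-sigma LER "
              f"{ler_for_dose(n):.2f} nm")
    ```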

  3. Ecotypic variability in the metabolic response of seeds to diurnal hydration-dehydration cycles and its relationship to seed vigor.

    PubMed

    Bai, Bing; Sikron, Noga; Gendler, Tanya; Kazachkova, Yana; Barak, Simon; Grafi, Gideon; Khozin-Goldberg, Inna; Fait, Aaron

    2012-01-01

    Seeds in the seed bank experience diurnal cycles of imbibition followed by complete dehydration. These conditions pose a challenge to the regulation of germination. The effect of recurring hydration-dehydration (Hy-Dh) cycles was tested on seeds from four Arabidopsis thaliana accessions (Col-0, Cvi, C24 and Ler). Diurnal Hy-Dh cycles had a detrimental effect on the germination rate and on the final percentage of germination in the Col-0, Cvi and C24 ecotypes, but not in the Ler ecotype, which showed improved vigor following the treatments. Membrane permeability, measured by ion conductivity, generally increased following each Hy-Dh cycle and was correlated with changes in the redox status represented by the GSSG/GSH (oxidized/reduced glutathione) ratio. Among the ecotypes, Col-0 seeds displayed the highest membrane permeability, whilst Ler was characterized by the greatest increase in electrical conductivity following Hy-Dh cycles. Following Dh 2 and Dh 3, the respiratory activity of Ler seeds significantly increased, in contrast to the other ecotypes, indicative of a dramatic shift in metabolism. These differences were associated with accession-specific content and patterns of change of (i) the cell wall-related laminaribiose and mannose; (ii) fatty acid composition, specifically of the unsaturated oleic acid and α-linoleic acid; and (iii) asparagine, ornithine and the related polyamine putrescine. Furthermore, in the Ler ecotype the content of the tricarboxylic acid (TCA) cycle intermediates fumarate, succinate and malate increased in response to dehydration, in contrast to a decrease in the other three ecotypes. These findings provide a link between seed respiration, energy metabolism, fatty acid β-oxidation, nitrogen mobilization, membrane permeability and the improved germination of Ler seeds following Hy-Dh cycles.

  4. Accession-dependent action potentials in Arabidopsis.

    PubMed

    Favre, Patrick; Greppin, Hubert; Degli Agosti, Robert

    2011-05-01

    Plant excitability, as measured by the appearance and propagation of action potentials (APs) after biotic and abiotic stress treatments, is a far less pronounced and more variable phenomenon than in animals. To examine the genetic basis of plant excitability we used different Arabidopsis thaliana accessions. APs were induced by wounding (W) with a subsequent deposition (D) of 5 μL of 1 M KCl onto adult leaves. This treatment elicited transient voltage responses (APs) that were detected by 2 extracellular electrodes placed at a distance from the wounding location over an experimental time of 150 min. The first electrode (e1) was placed at the junction of the petiole and the leaf, and the second electrode (e2) was placed on the petiole near the center of the rosette. All accessions (Columbia (Col), Wassilewskija (Ws) and Landsberg erecta (Ler)) responded to the W & D treatment. After the W & D treatment was performed on 100 plants for each accession, the number of APs ranged from 0 to 37 (median 8, total 940), 0 to 16 (median 5, total 528) and 0 to 18 (median 2, total 296) in Col, Ws and Ler, respectively. Responding plants (>0 APs) showed significantly different behaviors depending on their accession of origin (Col 91%, Ws 83% and Ler 76%). Some AP characteristics, such as amplitude and speed of propagation from e1 to e2 (1.28 mm s-1), were the same for all accessions, whereas the average duration of APs was similar in Col and Ws, but different in Ler. Self-sustained oscillations were observed most frequently in Col, less often in Ws and least often in Ler, and the mean oscillation frequency was most rapid in Col, followed by Ws, and slowest in Ler. In general, Col was the most excitable accession, followed by Ws, with Ler the least excitable; this corresponded well with voltage-elicited action potentials. In conclusion, part of Arabidopsis excitability in AP responses is genetically pre-determined. Copyright © 2010 Elsevier GmbH. All rights reserved.

  5. Natural variation of H3K27me3 distribution between two Arabidopsis accessions and its association with flanking transposable elements

    PubMed Central

    2012-01-01

    Background Histone H3 lysine 27 tri-methylation and lysine 9 di-methylation are independent repressive chromatin modifications in Arabidopsis thaliana. H3K27me3 is established and maintained by Polycomb repressive complexes, whereas H3K9me2 is catalyzed by SUVH histone methyltransferases. Both modifications can spread to flanking regions after initialization and were shown to be mutually exclusive in Arabidopsis. Results We analyzed the extent of natural variation of H3K27me3 in the two accessions Landsberg erecta (Ler) and Columbia (Col) and their F1 hybrids. The majority of H3K27me3 target genes in Col were unchanged in Ler and F1 hybrids. A small number of Ler-specific targets were detected and confirmed. Consistent with a cis-regulatory mechanism for establishing H3K27me3, differential targets showed allele-specific H3K27me3 in hybrids. Five Ler-specific targets showed the active mark H3K4me3 in Col, and for this group differential H3K27me3 enrichment accorded with expression variation. On the other hand, the majority of Ler-specific targets were not expressed in Col, Ler or 17 other accessions. Instead of H3K27me3, the antagonistic mark H3K9me2 and other heterochromatic features were observed at these loci in Col. These loci were frequently flanked by transposable elements, which were often missing in the Ler genome assembly. Conclusions There is little variation in H3K27me3 occupancy within the species, although H3K27me3 targets were previously shown to be overrepresented among differentially expressed genes. The existing variation in H3K27me3 seems mostly explained by flanking polymorphic transposable elements. These could nucleate heterochromatin, which then spreads into neighboring H3K27me3 genes, thus converting them to H3K9me2 targets. PMID:23253144

  6. The Arabidopsis mutant, fy-1, has an ABA-insensitive germination phenotype.

    PubMed

    Jiang, Shiling; Kumar, Santosh; Eu, Young-Jae; Jami, Sravan Kumar; Stasolla, Claudio; Hill, Robert D

    2012-04-01

    Arabidopsis FY, a homologue of the yeast RNA 3' processing factor Pfs2p, regulates the autonomous floral transition pathway through its interaction with FCA, an RNA binding protein. It is demonstrated here that FY also influences seed dormancy. Freshly-harvested seed of the Arabidopsis fy-1 mutant germinated readily in the absence of stratification or after-ripening. Furthermore, the fy-1 mutant showed less ABA sensitivity compared with the wild type, Ler, under identical conditions. Freshly-harvested seed of fy-1 had significantly higher ABA levels than Ler, even though Ler was dormant and fy-1 germinated readily. The PPLPP domains of FY, which are required for flowering control, were not essential for the ABA-influenced repression of germination. FLC expression analysis in seeds of different genotypes suggested that the effect of FY on dormancy may not be elicited through FLC. No significant differences in CYP707A1, CYP707A2, NCED9, ABI3, and ABI4 were observed between freshly-harvested Ler and fy-1 imbibed for 48 h. GA3ox1 and GA3ox2 rapidly increased over the 48 h imbibition period for fy-1, with no significant increases in these transcripts for Ler. ABI5 levels were significantly lower in fy-1 over the 48 h imbibition period. The results suggest that FY is involved in the development of dormancy and ABA sensitivity in Arabidopsis seed.

  7. Determining Window Placement and Configuration for the Small Pressurized Rover (SPR)

    NASA Technical Reports Server (NTRS)

    Thompson, Shelby; Litaker, Harry; Howard, Robert

    2009-01-01

    This slide presentation reviews the process of the evaluation of window placement and configuration for the cockpit of the Lunar Electric Rover (LER). The purpose of the evaluation was to obtain human-in-the-loop data on window placement and configuration for the cockpit of the LER.

  8. Layout of the LER (Low Energy Ring) Arc

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hutton, A.

    We have recently been trying to accumulate all of the information necessary to decide on the layout of the regular curved arcs of the Low Energy Ring (LER) and there have been several ABC Notes published on different aspects of the problem. This note will describe the layout that has been derived from these considerations.

  9. Reduction of Line Edge Roughness of Polystyrene-block-Poly(methyl methacrylate) Copolymer Nanopatterns By Introducing Hydrogen Bonding at the Junction Point of Two Block Chains.

    PubMed

    Lee, Kyu Seong; Lee, Jaeyong; Kwak, Jongheon; Moon, Hong Chul; Kim, Jin Kon

    2017-09-20

    To apply well-defined block copolymer nanopatterns to next-generation lithography or high-density storage devices, small line edge roughness (LER) of the nanopatterns must be realized. Although polystyrene-block-poly(methyl methacrylate) copolymer (PS-b-PMMA) has been widely used to fabricate nanopatterns, because of the easy perpendicular orientation of the block copolymer nanodomains and the effective removal of the PMMA block by dry etching, the fabricated nanopatterns show poor LER due to the relatively small Flory-Huggins interaction parameter (χ) between PS and PMMA chains. Here, we synthesized PS-b-PMMA with urea (U) and N-(4-aminomethyl-benzyl)-4-hydroxymethyl-benzamide (BA) moieties at the junction of the PS and PMMA chains (PS-U-BA-PMMA) to improve the LER. The U-BA moieties serve as favorable interaction (hydrogen bonding) sites. The LER of PS line patterns obtained from PS-U-BA-PMMA was reduced by ∼25% compared with that obtained from neat PS-b-PMMA without BA and U moieties. This is attributed to the narrower interfacial width induced by hydrogen bonding between the two blocks, which was confirmed by small-angle X-ray scattering. This result implies that the introduction of hydrogen bonding into block copolymer interfaces offers an opportunity to fabricate well-defined nanopatterns with improved LER by block copolymer self-assembly, which could be a promising alternative for next-generation extreme ultraviolet lithography.

  10. The Arabidopsis mutant, fy-1, has an ABA-insensitive germination phenotype

    PubMed Central

    Jiang, Shiling; Kumar, Santosh; Eu, Young-Jae; Jami, Sravan Kumar; Stasolla, Claudio; Hill, Robert D.

    2012-01-01

    Arabidopsis FY, a homologue of the yeast RNA 3' processing factor Pfs2p, regulates the autonomous floral transition pathway through its interaction with FCA, an RNA binding protein. It is demonstrated here that FY also influences seed dormancy. Freshly-harvested seed of the Arabidopsis fy-1 mutant germinated readily in the absence of stratification or after-ripening. Furthermore, the fy-1 mutant showed less ABA sensitivity compared with the wild type, Ler, under identical conditions. Freshly-harvested seed of fy-1 had significantly higher ABA levels than Ler, even though Ler was dormant and fy-1 germinated readily. The PPLPP domains of FY, which are required for flowering control, were not essential for the ABA-influenced repression of germination. FLC expression analysis in seeds of different genotypes suggested that the effect of FY on dormancy may not be elicited through FLC. No significant differences in CYP707A1, CYP707A2, NCED9, ABI3, and ABI4 were observed between freshly-harvested Ler and fy-1 imbibed for 48 h. GA3ox1 and GA3ox2 rapidly increased over the 48 h imbibition period for fy-1, with no significant increases in these transcripts for Ler. ABI5 levels were significantly lower in fy-1 over the 48 h imbibition period. The results suggest that FY is involved in the development of dormancy and ABA sensitivity in Arabidopsis seed. PMID:22282534

  11. Toward a Content-Oriented Theory of the Learning and Teaching of Biological Evolution

    NASA Astrophysics Data System (ADS)

    Wallin, Anita

    The purpose of this study (see Fig. 9.1 for an overview) was to investigate how upper secondary school students develop an understanding of the theory of biological evolution. Starting from students' preconceptions, teaching sequences were developed and three different teaching experiments were carried out in a cyclical process. Students' knowledge was probed before, during and after the teaching sequences by means of written tests, interviews and small-group discussions. About 80% of the students held alternative conceptions of evolution before instruction, and in the follow-up test roughly 75% reached a scientific level. The students' reasoning in the various tests was analyzed carefully with respect to their preconceptions, the conceptual structure of the theory of evolution and the goals of instruction. This yielded insights into the demands on teaching and learning that challenge students and teachers when they begin to learn or teach evolutionary biology. An important result was that understanding the existing variation in a population is the key to understanding natural selection. The results are summarized in a content-oriented theory consisting of three different aspects: 1) content-specific aspects, unique to each scientific field; 2) aspects concerning the nature of science; and 3) general aspects. This theory can be tested and further developed in new experiments.

  12. Validation of the high-throughput marker technology DArT using the model plant Arabidopsis thaliana.

    PubMed

    Wittenberg, Alexander H J; van der Lee, Theo; Cayla, Cyril; Kilian, Andrzej; Visser, Richard G F; Schouten, Henk J

    2005-08-01

    Diversity Arrays Technology (DArT) is a microarray-based DNA marker technique for genome-wide discovery and genotyping of genetic variation. DArT allows simultaneous scoring of hundreds of restriction-site-based polymorphisms between genotypes and does not require DNA sequence information or site-specific oligonucleotides. This paper demonstrates the potential of DArT for genetic mapping by validating the quality and molecular basis of the markers, using the model plant Arabidopsis thaliana. Restriction fragments from a genomic representation of the ecotype Landsberg erecta (Ler) were amplified by PCR, individualized by cloning and spotted onto glass slides. The arrays were then hybridized with labeled genomic representations of the ecotypes Columbia (Col) and Ler and of individuals from an F2 population obtained from a Col x Ler cross. The scoring of markers with specialized software was highly reproducible, and 107 markers could unambiguously be ordered on a genetic linkage map. The marker order on the genetic linkage map coincided with the order on the DNA sequence map. Sequencing of the Ler markers and alignment with the available Col genome sequence confirmed that the polymorphism in DArT markers is largely a result of restriction-site polymorphisms.

  13. Blue light hazard performance comparison of phosphor-converted LED sources with red quantum dots and red phosphor

    NASA Astrophysics Data System (ADS)

    Zhang, Jingjing; Xie, Bin; Yu, Xingjian; Luo, Xiaobing; Zhang, Tao; Liu, Shishen; Yu, Zhihua; Liu, Li; Jin, Xing

    2017-07-01

    In this study, the blue light hazard performance of phosphor-converted light-emitting diodes (pc-LEDs) with red phosphor and with red quantum dots (QDs) was compared and analyzed by spectral optimization, which finds the minimum attainable blue light hazard efficiency of radiation (BLHER) at high values of color rendering index (CRI) and luminous efficacy of radiation (LER) as the correlated color temperature (CCT) varies from 1800 to 7800 K. It is found that the minimal BLHER value increases with the CCT value, and the minimal BLHER values of the two spectral models are nearly the same. Note that the QD model has an advantage in CCT coverage under the same constraints on CRI and LER. The relationships between the minimal BLHER, CRI, CCT and LER of pc-LEDs with the QD model were then analyzed. It is found that the minimal BLHER values are nearly the same when the CRI value changes from 50 to 90; therefore, the influence of CRI on the minimal BLHER is insignificant. The minimal BLHER increases with the increase in the LER value from 240 to 360 lm/W.
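
    The figures of merit themselves are simple spectrum-weighted integrals. The sketch below computes LER and a BLHER-like ratio for a toy pc-LED spectrum; the Gaussian stand-ins for the CIE photopic and blue-light-hazard weighting functions, and the spectrum itself, are crude assumptions for illustration only:

    ```python
    # Spectrum-weighted figures of merit with crude Gaussian weighting
    # curves standing in for the tabulated CIE functions.
    import numpy as np

    wl = np.arange(380.0, 781.0)                            # wavelength [nm]
    V = np.exp(-0.5 * ((wl - 555.0) / 42.0) ** 2)           # ~photopic curve (assumed)
    B = np.exp(-0.5 * ((wl - 437.0) / 20.0) ** 2)           # ~blue-light hazard curve (assumed)

    def ler_lm_per_w(spectrum):
        return 683.0 * np.trapz(V * spectrum, wl) / np.trapz(spectrum, wl)

    def blher(spectrum):
        return np.trapz(B * spectrum, wl) / np.trapz(spectrum, wl)

    # toy pc-LED spectrum: blue pump + broad phosphor band + narrow QD-like red
    blue = np.exp(-0.5 * ((wl - 450) / 10) ** 2)
    phos = np.exp(-0.5 * ((wl - 550) / 50) ** 2)
    red = np.exp(-0.5 * ((wl - 630) / 12) ** 2)
    s = blue + 2.0 * phos + 0.8 * red
    print(f"LER {ler_lm_per_w(s):.0f} lm/W, BLHER {blher(s):.3f}")
    ```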

  14. Assessment of risk for asthma initiation and cancer and heart disease deaths among patrons and servers due to secondhand smoke exposure in restaurants and bars

    PubMed Central

    Liu, Ruiling; Bohac, David L; Gundel, Lara A; Hewett, Martha J; Apte, Michael G; Hammond, S Katharine

    2014-01-01

    Background Despite efforts to reduce exposure to secondhand smoke (SHS), only 5% of the world's population enjoy smoke-free restaurants and bars. Methods The lifetime excess risk (LER) of cancer death, ischaemic heart disease (IHD) death and asthma initiation among non-smoking restaurant and bar servers and patrons in Minnesota and the US was estimated using weighted field measurements of SHS constituents in Minnesota, existing data on tobacco use and multiple dose-response models. Results A continuous approach estimated an LER of lung cancer death (LCD) of 18 × 10^-6 (95% CI 13 to 23 × 10^-6) for patrons visiting only designated non-smoking sections, 80 × 10^-6 (95% CI 66 to 95 × 10^-6) for patrons visiting only smoking venues/sections and 802 × 10^-6 (95% CI 658 to 936 × 10^-6) for servers in smoking-permitted venues. An attributable-risk (exposed/non-exposed) approach estimated a similar LER of LCD, an LER of IHD death of about 10^-2 for non-smokers with average SHS exposure from all sources, and an LER of asthma initiation of about 5% for servers with SHS exposure at work only. These risks correspond to 214 LCDs and 3001 IHD deaths among the general non-smoking population and 1420 new asthma cases among non-smoking servers in the US each year due to SHS exposure in restaurants and bars alone. Conclusions Health risks for patrons and servers from SHS exposure in restaurants and bars alone are well above the acceptable level. Restaurants and bars should be a priority in governments' efforts to create smoke-free environments and should not be exempt from smoking bans. PMID:23407112

  15. Low-energy reporting in women at risk for breast cancer recurrence. Women's Healthy Eating and Living Group.

    PubMed

    Caan, B J; Flatt, S W; Rock, C L; Ritenbaugh, C; Newman, V; Pierce, J P

    2000-10-01

    This study examined the extent of low-energy reporting and its relationship with demographic and lifestyle factors in women previously treated for breast cancer. It used data from a large multisite clinical trial testing the efficacy of a dietary intervention to reduce the risk of breast cancer recurrence (the Women's Healthy Eating and Living Study). Using the Schofield equation to estimate energy needs and four 24-h dietary recalls to estimate energy intakes, we identified women who reported lower-than-expected energy intakes using the criteria developed by G. R. Goldberg et al. (Eur. J. Clin. Nutr., 45: 569-581, 1991). We examined data from 1137 women diagnosed with stage I, stage II, or stage IIIA primary, operable breast cancer. Women were 18-70 years of age at diagnosis and were enrolled in the Women's Healthy Eating and Living Study between August 19, 1995, and April 1, 1998, within 4 years after diagnosis. The Goldberg criteria classified about one-quarter (25.6%) as low-energy reporters (LERs) and 10.8% as very low-energy reporters. Women who had a body mass index >30 were almost twice as likely (odds ratio, 1.95) to be LERs. Women with a history of weight gain or weight fluctuations were one and a half times as likely (odds ratio, 1.55) to be LERs as those who were weight-stable or had lost weight. Age, ethnicity, alcohol intake, supplement use, and exercise level were also related to low-energy reporting. The characteristics (such as body mass index, age, ethnicity, and weight history) associated with low-energy reporting in this group of cancer survivors are similar to those observed in other populations and might affect observed diet and breast cancer associations in epidemiological studies.
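
    The classification logic can be sketched as follows: estimate basal metabolic rate (BMR) from body weight with Schofield-type coefficients, then flag intakes whose ratio of reported energy intake to BMR falls below a Goldberg-style cutoff. The coefficients and the 1.35/1.1 cutoffs below are rounded, assumed values, not the study's exact criteria:

    ```python
    # Hedged sketch of low-energy-reporter classification.

    def schofield_bmr_kcal(weight_kg: float, age: float) -> float:
        """Approximate BMR for women (kcal/day) by age band; rounded,
        assumed Schofield-type coefficients."""
        if age < 30:
            return 14.8 * weight_kg + 487.0
        if age < 60:
            return 8.1 * weight_kg + 846.0
        return 9.1 * weight_kg + 659.0

    def classify_reporter(energy_intake_kcal: float, weight_kg: float, age: float,
                          ler_cutoff: float = 1.35, very_ler_cutoff: float = 1.1) -> str:
        # Goldberg-style test: reported intake implausibly low relative to BMR
        ratio = energy_intake_kcal / schofield_bmr_kcal(weight_kg, age)
        if ratio < very_ler_cutoff:
            return "very low-energy reporter"
        if ratio < ler_cutoff:
            return "low-energy reporter"
        return "plausible reporter"

    print(classify_reporter(energy_intake_kcal=1300, weight_kg=80, age=55))
    ```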

  16. Gigapan Voyage for Robotic Recon

    NASA Technical Reports Server (NTRS)

    Lee, Susan Y.; Moorse, Theodore Fitzgerald; Park, Eric J.

    2010-01-01

    Gigapan Voyage (GV) is a self-contained, remotely operable Gigapan capturing system that is currently being developed by the Intelligent Robotics Group (IRG) at NASA Ames Research Center. Gigapan Voyage was primarily designed to be integrated onto Johnson Space Center's Lunar Electric Rovers (LER). While on the LER, Gigapan Voyage was used by scientists and astronauts during the 2009 and 2010 Desert RATS field tests. The concept behind Gigapan Voyage is to merge all the sub-components of the commercial GigaPan system into an all-in-one system that can capture, stitch, and display Gigapans in an automated way via a simple web interface. The GV system enables NASA to quickly and easily add remote-controlled Gigapan capturing capability to rovers with minimal integration effort. Key Words: Geology, NASA, Black Point Lava Flow, Robot, K10, LER, Gigapan Voyage, Desert RATS, Intelligent Robotics Group

  17. A Compilation of Boiling Water Reactor Operational Experience for the United Kingdom's Office for Nuclear Regulation's Advanced Boiling Water Reactor Generic Design Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wheeler, Timothy A.; Liao, Huafei

    2014-12-01

    United States nuclear power plant Licensee Event Reports (LERs), submitted to the United States Nuclear Regulatory Commission (NRC) as required by 10 CFR 50.72 and 50.73, were evaluated for relevance to the United Kingdom's Health and Safety Executive – Office for Nuclear Regulation's (ONR) Generic Design Assessment of the Advanced Boiling Water Reactor (ABWR) design. An NRC compendium of LERs, compiled by Idaho National Laboratory over the period January 1, 2000 through March 31, 2014, was sorted by BWR safety system and divided into two categories: events leading to a SCRAM, and events which constituted a safety system failure. The LERs were then evaluated for the relevance of the operational experience to the ABWR design.

  18. Theoretical study on sensitivity enhancement in energy-deficit region of chemically amplified resists used for extreme ultraviolet lithography

    NASA Astrophysics Data System (ADS)

    Kozawa, Takahiro; Santillan, Julius Joseph; Itani, Toshiro

    2017-10-01

    The role of photons in lithography is to transfer the energy and information required for resist pattern formation. In the information-deficit region, a trade-off relationship is observed between line edge roughness (LER) and sensitivity. However, the sensitivity can be increased without increasing LER in the energy-deficit region. In this study, the sensitivity enhancement limit was investigated, assuming line-and-space patterns with a half-pitch of 11 nm. LER was calculated by a Monte Carlo method. It was unrealistic to increase the sensitivity twofold while keeping the line width roughness (LWR) within 10% of the critical dimension (CD), whereas a twofold sensitivity enhancement at 20% CD LWR was feasible. The requirements are, roughly, that the sensitization distance should be less than 2 nm and that the total sensitizer concentration should be higher than 0.3 nm^-3.
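
    A toy version of the stochastic picture above can be simulated directly: draw sensitizer activation events as an inhomogeneous Poisson process under an 11 nm half-pitch aerial image, blur by a sensitization distance, and threshold the latent image to locate the edge. Densities, blur values and the threshold are invented for illustration and are not the paper's model:

    ```python
    # Toy stochastic-resist Monte Carlo: Poisson sensitizer events,
    # Gaussian sensitization blur, threshold-based edge detection.
    import numpy as np

    rng = np.random.default_rng(7)
    hp, pix = 11.0, 0.25                         # half-pitch [nm], grid [nm]
    x = np.arange(0.0, 2 * hp, pix)
    aerial = 0.5 * (1 + np.cos(np.pi * x / hp))

    def three_sigma_ler(density_nm3, blur_nm, thickness=20.0, n_rows=1000):
        lam = aerial * density_nm3 * pix * pix * thickness   # mean events per cell
        counts = rng.poisson(lam, size=(n_rows, x.size)).astype(float)
        k = np.exp(-0.5 * (np.arange(-20, 21) * pix / blur_nm) ** 2)
        latent = np.apply_along_axis(
            lambda r: np.convolve(r, k / k.sum(), "same"), 1, counts)
        thr = 0.5 * density_nm3 * pix * pix * thickness
        edges = np.argmax(latent < thr, axis=1) * pix        # first sub-threshold cell
        return 3.0 * edges.std(ddof=1)

    for blur in (0.5, 1.0, 2.0):
        print(f"sensitization blur {blur} nm -> 3-sigma LER "
              f"{three_sigma_ler(0.3, blur):.2f} nm")
    ```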

  19. Design of an Integrated Division-Level Battle Simulation for Research, Development, and Training. Volume 2. Detailed Design Notes

    DTIC Science & Technology

    1979-08-01

    frag orders for tactical considerations. Frag orders issued by simulated modules will be "edited" by the same procedure as that used with populated ... record and distributed as required. Queries transmitted from any staff module will be reviewed and edited at event time for technical accuracy. If an ... of this kind will have to be carefully edited and interpreted by the controller(s) and/or computer before the change is instituted in the real world

  20. Allowable SEM noise for unbiased LER measurement

    NASA Astrophysics Data System (ADS)

    Papavieros, George; Constantoudis, Vassilios; Gogolides, Evangelos

    2018-03-01

    Recently, a novel method for the calculation of unbiased line edge roughness based on power spectral density (PSD) analysis has been proposed. In this paper, an alternative method is first discussed and investigated, utilizing the height-height correlation function (HHCF) of edges. The HHCF-based method enables the unbiased determination of the whole triplet of LER parameters, including, besides the rms value, the correlation length and the roughness exponent. The key to both methods is the sensitivity of the PSD and the HHCF to noise at high frequencies and short distances, respectively. Secondly, we elaborate a testbed of synthesized SEM images with controlled LER and noise to justify the effectiveness of the proposed unbiased methods. Our main objective is to find the boundaries, in terms of noise levels and roughness characteristics, within which the methods remain reliable, i.e., the maximum amount of noise allowed for which the output results agree with the controlled, known inputs. At the same time, we also set the extremes of the roughness parameters for which the methods hold their accuracy.
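
    The principle behind the unbiased estimate is easy to demonstrate: white SEM noise adds a flat plateau to the high-frequency end of the PSD, so estimating that plateau and subtracting its variance contribution recovers the noise-free rms. A minimal sketch with a synthetic correlated edge (all statistics are assumed values, not the paper's testbed):

    ```python
    # PSD noise-floor subtraction on a synthetic AR(1) edge + white noise.
    import numpy as np

    rng = np.random.default_rng(3)
    n, xi, true_rms, noise_rms = 4096, 20.0, 1.5, 1.0   # pixels; all assumed

    # correlated 'true' edge with correlation length xi
    a = np.exp(-1.0 / xi)
    e = np.zeros(n)
    g = rng.normal(0, 1, n)
    for i in range(1, n):
        e[i] = a * e[i - 1] + np.sqrt(1 - a * a) * g[i]
    e *= true_rms / e.std()
    noisy = e + rng.normal(0, noise_rms, n)              # add SEM noise

    # white noise contributes a flat PSD plateau of height sigma_noise^2 per bin
    X = np.fft.fft(noisy - noisy.mean())
    psd = np.abs(X) ** 2 / n
    noise_var = psd[n // 2 - n // 8: n // 2].mean()      # highest-frequency octave
    unbiased = np.sqrt(max(noisy.var() - noise_var, 0.0))

    print(f"true rms {true_rms:.2f}  biased {noisy.std():.2f}  unbiased {unbiased:.2f}")
    ```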

  1. Prevalence of Traditional and Reverse-Algorithm Syphilis Screening in Laboratory Practice: A Survey of Participants in the College of American Pathologists Syphilis Serology Proficiency Testing Program.

    PubMed

    Rhoads, Daniel D; Genzen, Jonathan R; Bashleben, Christine P; Faix, James D; Ansari, M Qasim

    2017-01-01

    Syphilis serology screening in laboratory practice is evolving. Traditionally, the syphilis screening algorithm begins with a nontreponemal immunoassay, which is manually performed by a laboratory technologist. In contrast, the reverse algorithm begins with a treponemal immunoassay, which can be automated. The Centers for Disease Control and Prevention has recognized both approaches, but little is known about the current state of laboratory practice, which could impact test utilization and interpretation. To assess the current state of laboratory practice for syphilis serologic screening, a voluntary questionnaire was sent in August 2015 to the 2360 laboratories that subscribe to the College of American Pathologists syphilis serology proficiency survey. Of the laboratories surveyed, 98% (2316 of 2360) returned the questionnaire, and about 83% (1911 of 2316) responded to at least some questions. Twenty-eight percent (378 of 1364) reported revision of their syphilis screening algorithm within the past 2 years, and 9% (170 of 1905) of laboratories anticipated changing their screening algorithm in the coming year. Sixty-three percent (1205 of 1911) reported using the traditional algorithm, 16% (304 of 1911) reported using the reverse algorithm, and 2.5% (47 of 1911) reported using both algorithms, whereas 9% (169 of 1911) reported not performing a reflex confirmation test. Of those performing the reverse algorithm, 74% (282 of 380) implemented a new testing platform when introducing the new algorithm. The majority of laboratories still perform the traditional algorithm, but a significant minority have implemented the reverse-screening algorithm. Although the nontreponemal immunologic response typically wanes after cure and becomes undetectable, treponemal immunoassays typically remain positive for life, and it is important for laboratorians and clinicians to consider these assay differences when implementing, using, and interpreting serologic syphilis screening algorithms.

  2. Measurements and sensitivities of LWR in poly spacers

    NASA Astrophysics Data System (ADS)

    Ayal, Guy; Shauly, Eitan; Levi, Shimon; Siany, Amit; Adan, Ofer; Shacham-Diamand, Yosi

    2010-03-01

    LER and LWR have long been considered a primary issue in process development and monitoring. The development of low-power process flavors emphasizes the effects of LER and LWR on different aspects of the device: gate-level performance, particularly leakage current, at the front end of line, and resistance and reliability in the back-end layers. Traditionally, as can be seen in many publications, the front-end-of-line focus is mainly on the Poly and Active-area layers; the contribution of Poly spacers to gate leakage, for example, is rarely discussed. Following our research on sources of gate leakage, we found the leakage current (Ioff) in some processes to be highly sensitive to changes in the width of the Poly spacers, even more so than to the actual Poly gate CDs. We therefore decided to measure the Poly spacer LWR, its correlation with the LWR of the Poly, and its sensitivity to changes in layout and OPC. In our publication last year, we defined the terms LLER (Local Line Edge Roughness) and LLWR (Local Line Width Roughness); the local roughness is measured as the 3-sigma value of the line edge/width in a 5-nm segment around the measurement point. We use these terms in this paper to evaluate the impact of Poly roughness on Poly spacer roughness. A dedicated test chip was designed for the experiments, with various transistor layout configurations at different densities to cover the whole range of process design rules. Applied Materials' innovative LER and LWR algorithms were used to measure and characterize the spacer roughness relative to the distance from the active edges and from other spacers. To accurately measure all structures in a reasonable time, the recipes were automatically generated from CAD. On silicon, after Poly spacer generation, the transistors no longer resemble the Poly-layer CAD layout: their morphology differs from traditional photo/etch structures, and dimensions vary significantly. In this paper we present metrology and characterization of Poly spacer LLWR and LLER compared to those of the Poly gate in various transistor shapes, showing that the relation between them depends on the transistor architecture (final layout, including OPC). We show how the spacer deposition may reduce, preserve or even enlarge the roughness measured on the Poly, depending on transistor layout but, surprisingly, not on proximity effects.
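
    For reference, the local metrics defined above reduce to a sliding-window 3-sigma. A minimal sketch on synthetic edges (the window length and edge statistics are illustrative, not the test-chip data):

    ```python
    # LLER/LLWR as 3-sigma of edge position / line width in a short
    # sliding window around each measurement point.
    import numpy as np

    rng = np.random.default_rng(4)
    y = np.arange(0, 200, 1.0)                     # positions along the line [nm]
    left = 0.8 * np.cumsum(rng.normal(0, 0.2, y.size))
    right = 30 + 0.8 * np.cumsum(rng.normal(0, 0.2, y.size))
    width = right - left

    def local_3sigma(signal, window=5):
        half = window // 2
        out = np.empty(signal.size - 2 * half)
        for i in range(half, signal.size - half):
            out[i - half] = 3.0 * signal[i - half: i + half + 1].std(ddof=1)
        return out

    ller = local_3sigma(left)        # local line-edge roughness per point
    llwr = local_3sigma(width)       # local line-width roughness per point
    print(f"mean LLER {ller.mean():.3f} nm, mean LLWR {llwr.mean():.3f} nm")
    ```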

  3. The traditional algorithm approach underestimates the prevalence of serodiagnosis of syphilis in HIV-infected individuals.

    PubMed

    Chen, Bin; Peng, Xiuming; Xie, Tiansheng; Jin, Changzhong; Liu, Fumin; Wu, Nanping

    2017-07-01

    Currently, there are three algorithms for screening of syphilis: the traditional algorithm, the reverse algorithm and the European Centre for Disease Prevention and Control (ECDC) algorithm. To date, no diagnostic algorithm is generally recognized, and when syphilis meets HIV the situation is even more complex. To evaluate their screening performance and their impact on the seroprevalence of syphilis in HIV-infected individuals, we conducted a cross-sectional study that included 865 serum samples from HIV-infected patients in a tertiary hospital. Every sample (one per patient) was tested with the toluidine red unheated serum test (TRUST), the T. pallidum particle agglutination assay (TPPA), and a Treponema pallidum enzyme immunoassay (TP-EIA) according to the manufacturers' instructions. The results of syphilis serological testing were interpreted following each algorithm in turn, allowing a direct comparison of the traditional and reverse screening algorithms in this unique population. The reverse algorithm yielded a remarkably higher seroprevalence of syphilis than the traditional algorithm (24.9% vs. 14.2%, p < 0.0001). Compared to the reverse algorithm, the traditional algorithm also had a missed serodiagnosis rate of 42.8%. The total percentages of agreement and corresponding kappa values of the traditional and ECDC algorithms compared with the reverse algorithm were 89.4%, 0.668 and 99.8%, 0.994, respectively; there was very good agreement between the reverse and ECDC algorithms. Our results support the reverse (or ECDC) algorithm for syphilis screening in HIV-infected populations. In addition, our study demonstrated that screening HIV-infected populations using different algorithms may result in statistically different seroprevalences of syphilis.
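
    The two decision flows compared above can be written down as simple functions over qualitative test results; this makes it obvious how the traditional flow misses past or treated infections whose nontreponemal reactivity has waned. The discordant-result handling shown for the reverse flow (EIA, then nontreponemal, then a second treponemal test) follows the commonly described sequence, but exact reflex rules vary by laboratory and are an assumption here:

    ```python
    # Sketch of the traditional vs. reverse syphilis screening flows.

    def traditional(nontreponemal_reactive, treponemal_reactive):
        """Nontreponemal test first (e.g., TRUST/RPR), treponemal confirm."""
        if not nontreponemal_reactive:
            return "negative screen"
        return ("serodiagnosis of syphilis" if treponemal_reactive
                else "likely biological false positive")

    def reverse(treponemal_reactive, nontreponemal_reactive,
                second_treponemal_reactive):
        """Treponemal immunoassay first (e.g., TP-EIA), then reflex testing."""
        if not treponemal_reactive:
            return "negative screen"
        if nontreponemal_reactive:
            return "serodiagnosis of syphilis"
        # discordant: adjudicate with a second treponemal test such as TPPA
        return ("past/treated or early syphilis" if second_treponemal_reactive
                else "likely false-positive EIA")

    # a treated past infection: nontreponemal waned, treponemal stays positive
    print("traditional:", traditional(False, True))   # missed by traditional flow
    print("reverse    :", reverse(True, False, True)) # flagged by reverse flow
    ```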

  4. Three-dimensional profile extraction from CD-SEM image and top/bottom CD measurement by line-edge roughness analysis

    NASA Astrophysics Data System (ADS)

    Yamaguchi, Atsuko; Ohashi, Takeyoshi; Kawasaki, Takahiro; Inoue, Osamu; Kawada, Hiroki

    2013-04-01

    A new method for calculating the critical dimensions (CDs) at the top and bottom of three-dimensional (3D) pattern profiles from a critical-dimension scanning electron microscope (CD-SEM) image, called the "T-sigma method", is proposed and evaluated. Without preparing a library or database in advance, T-sigma can estimate a feature of a pattern sidewall. Furthermore, it supplies the optimum edge definition (i.e., the threshold level for determining edge position from a CD-SEM signal) for detecting the top and bottom of the pattern. The method consists of three steps. First, two components of line-edge roughness (LER), the noise-induced bias (LER bias) and the unbiased component (bias-free LER), are calculated at a given threshold level. Second, these components are calculated for various threshold values, and their threshold dependence, the "T-sigma graph", is obtained. Finally, the optimum threshold values for top and bottom edge detection are given by analysis of the T-sigma graph. T-sigma was applied to CD-SEM images of three kinds of resist-pattern samples. In addition, reference metrology was performed with an atomic force microscope (AFM) and a scanning transmission electron microscope (STEM). The sensitivity of the CD measured by T-sigma to the reference CD was higher than or equal to that measured with the conventional edge definition, and regarding absolute measurement accuracy, T-sigma showed better results than the conventional definition. Furthermore, T-sigma graphs were calculated from CD-SEM images of two kinds of resist samples and compared with the corresponding STEM observations. Both the bias-free LER and the LER bias increased as the detected edge point moved from the bottom to the top of the pattern when the pattern had a straight sidewall and a rounded top, whereas they were almost constant when the pattern had a re-entrant profile; T-sigma will thus be able to reveal a re-entrant feature. From these results, it is found that the T-sigma method can provide rough cross-sectional pattern features and achieve quick, easy and accurate measurement of the top and bottom CDs.
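
    The threshold-scanning idea can be illustrated on synthetic data where the noise-free truth is known, so the measured LER splits directly into a bias-free part and a noise-induced bias at each threshold (the real method estimates this split statistically; the image model and noise level below are assumptions):

    ```python
    # Threshold scan on a synthetic SEM-like edge: measured LER splits
    # into bias-free LER and noise-induced bias per threshold.
    import numpy as np

    rng = np.random.default_rng(6)
    ny, x = 400, np.arange(60.0)
    true_edge = 20 + np.cumsum(rng.normal(0, 0.15, ny))
    true_edge -= true_edge.mean() - 20
    clean = 1.0 / (1.0 + np.exp(-(x[None, :] - true_edge[:, None]) / 2.0))
    noisy = clean + rng.normal(0, 0.05, clean.shape)

    def detect(img, thr):
        """First threshold crossing per row, with sub-pixel interpolation."""
        rows = np.arange(img.shape[0])
        idx = (img >= thr).argmax(axis=1)
        i0 = np.clip(idx - 1, 0, None)
        y0, y1 = img[rows, i0], img[rows, idx]
        frac = np.clip((thr - y0) / (y1 - y0 + 1e-12), 0.0, 1.0)
        return i0 + frac

    for thr in (0.2, 0.5, 0.8):
        measured = detect(noisy, thr).std(ddof=1)
        bias_free = detect(clean, thr).std(ddof=1)   # known here: we own the truth
        noise_bias = np.sqrt(max(measured**2 - bias_free**2, 0.0))
        print(f"threshold {thr:.1f}: measured {measured:.3f}, "
              f"bias-free {bias_free:.3f}, noise bias {noise_bias:.3f} px")
    ```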

  5. Overcoming etch challenges related to EUV based patterning (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Metz, Andrew W.; Cottle, Hongyun; Honda, Masanobu; Morikita, Shinya; Kumar, Kaushik A.; Biolsi, Peter

    2017-04-01

    Research and development activities related to Extreme Ultra Violet (EUV) defined patterning continue to grow for <40 nm pitch applications. The confluence of the high cost and extreme process-control challenges of Self-Aligned Quad Patterning (SAQP) with continued momentum toward EUV ecosystem readiness could provide cost advantages in addition to improved intra-level overlay performance relative to multiple-patterning approaches. However, the Line Edge Roughness (LER) and Line Width Roughness (LWR) of EUV-defined resist images are still far from meeting technology needs or ITRS spec performance. Furthermore, extreme resist height scaling to mitigate flop-over exacerbates the plasma-etch trade-offs of traditional approaches to PR smoothing, descum implementation and maintaining the 2D aspect ratios of short lines or elliptical contacts concurrent with ultra-high photoresist (PR) selectivity. In this paper we discuss sources of LER/LWR and the impact of material choice, integration, and innovative plasma process techniques, and describe how TEL™ Vigus™ CCP etchers can enhance PR selectivity, reduce LER/LWR, and maintain the 2D aspect ratio of incoming patterns. Beyond traditional process approaches, this paper shows the utility of: [1] DC Superposition in enhancing EUV resist hardening and selectivity, increasing resistance to stress-induced PR line wiggle caused by CFx passivation, and mitigating organic planarizer wiggle; [2] Quasi Atomic Layer Etch [Q-ALE] for ARC open, eliminating the trade-offs between selectivity, CD, and shrink-ratio control; and [3] ALD+Etch FUSION technology for feature-independent CD shrink and LER reduction. The applicability of these concepts back-transferred to 193i-based lithography is also confirmed.

  6. [Integral indices of peripheral blood leukogram in the estimation of non-specific immunological reactivity in patients with ischemic heart disease].

    PubMed

    Zhukhorov, L S; Voronaia, Iu L

    2002-12-01

    With the help of differential blood count analysis and the velocity of erythrocyte sedimentation (VES), the leukocyte index (LI), leukocyte intoxication index (LII), leukocyte shift index (LSI), leukocyte-VES ratio (LVESR), leukocytic-granulocytic index (LGI), general index (GI), neutrophil-lymphocyte ratio (NLR), neutrophil-monocyte ratio (NMR), lymphocyte-monocyte ratio (LMR) and lymphocyte-eosinophil ratio (LER) were calculated for 30 healthy persons (donors), 30 patients with chronic ischemic heart disease (IHD) and 34 patients with acute myocardial infarction (AMI). Unlike healthy people, patients with chronic IHD had higher LVESR, GI and LER, while patients with AMI had increased LII, LSI, NLR and LER and decreased LI, LGI, GI and LMR. In AMI compared with chronic IHD, the average LII, LSI and NLR were higher and the LI, LGI, LVESR, GI and LMR were lower. The obtained results show that integral indices of the blood leukogram expand the possibilities of obtaining information about the state of non-specific immunological reactivity in patients with various forms of IHD.
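
    The ratio indices themselves are straightforward to compute from a differential count; the sketch below shows the obvious ratio forms with invented input values (the paper's exact formulas for LI, LII, LSI, LGI and GI are not given in the abstract and are therefore omitted):

    ```python
    # Ratio-type leukogram indices from an invented differential count.
    counts = {"neutrophils": 4.2, "lymphocytes": 2.1, "monocytes": 0.5,
              "eosinophils": 0.2}                     # 10^9 cells/L, invented

    nlr = counts["neutrophils"] / counts["lymphocytes"]   # neutrophil-lymphocyte ratio
    nmr = counts["neutrophils"] / counts["monocytes"]     # neutrophil-monocyte ratio
    lmr = counts["lymphocytes"] / counts["monocytes"]     # lymphocyte-monocyte ratio
    ler = counts["lymphocytes"] / counts["eosinophils"]   # lymphocyte-eosinophil ratio
    print(f"NLR {nlr:.2f}  NMR {nmr:.2f}  LMR {lmr:.2f}  LER {ler:.2f}")
    ```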

  7. The Error in Total Error Reduction

    PubMed Central

    Witnauer, James E.; Urcelay, Gonzalo P.; Miller, Ralph R.

    2013-01-01

    Most models of human and animal learning assume that learning is proportional to the discrepancy between a delivered outcome and the outcome predicted by all cues present during that trial (i.e., total error across a stimulus compound). This total error reduction (TER) view has been implemented in connectionist and artificial neural network models to describe the conditions under which weights between units change. Electrophysiological work has revealed that the activity of dopamine neurons is correlated with the total error signal in models of reward learning. Similar neural mechanisms presumably support fear conditioning, human contingency learning, and other types of learning. Using a computational modelling approach, we compared several TER models of associative learning to an alternative model that rejects the TER assumption in favor of local error reduction (LER), which assumes that learning about each cue is proportional to the discrepancy between the delivered outcome and the outcome predicted by that specific cue on that trial. The LER model provided a better fit to the reviewed data than the TER models. Given the superiority of the LER model with the present data sets, acceptance of TER should be tempered. PMID:23891930
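
    The contrast between the two update rules is captured by a few lines of code: TER shares one error term across all cues present on a trial (Rescorla-Wagner style), while LER computes a separate error for each cue. The learning rates and the AB+ trial structure below are illustrative, not the paper's simulations:

    ```python
    # TER vs. LER weight updates on a two-cue compound (AB+) schedule.
    import numpy as np

    def train(rule, n_trials=50, alpha=0.2, lam=1.0):
        w = np.zeros(2)                     # associative strengths for cues A and B
        for _ in range(n_trials):
            x = np.array([1.0, 1.0])        # both cues present (compound AB)
            if rule == "TER":
                error = lam - np.dot(w, x)  # one shared error across the compound
                w += alpha * error * x
            else:                           # "LER"
                error = lam - w             # per-cue error against the outcome
                w += alpha * error * x
        return w

    print("TER weights after AB+ training:", train("TER"))   # weights sum to ~lambda
    print("LER weights after AB+ training:", train("LER"))   # each approaches lambda
    ```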

  8. Using tevatron magnets for HE-LHC or new ring in LHC tunnel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piekarz, Henryk; /Fermilab

    Two injector accelerator options for the HE-LHC, providing p+ - p+ collisions at 33 TeV centre-of-mass energy, are briefly outlined. One option is based on the Super-SPS (S-SPS) accelerator in the SPS tunnel, and the other on the LER (Low-Energy-Ring) accelerator in the LHC tunnel. The expected performance of the main arc magnets considered for the construction of the S-SPS and LER accelerators is used to tentatively derive selected properties of these accelerators as potential injectors for the HE-LHC.

  9. Performance characterization of a combined material identification and screening algorithm

    NASA Astrophysics Data System (ADS)

    Green, Robert L.; Hargreaves, Michael D.; Gardner, Craig M.

    2013-05-01

    Portable analytical devices based on a gamut of technologies (Infrared, Raman, X-Ray Fluorescence, Mass Spectrometry, etc.) are now widely available. These tools have seen increasing adoption for field-based assessment by diverse users including military, emergency response, and law enforcement. Frequently, end-users of portable devices are non-scientists who rely on embedded software and the associated algorithms to convert collected data into actionable information. Two classes of problems commonly encountered in field applications are identification and screening. Identification algorithms are designed to scour a library of known materials and determine whether the unknown measurement is consistent with a stored response (or combination of stored responses). Such algorithms can be used to identify a material from many thousands of possible candidates. Screening algorithms evaluate whether at least a subset of features in an unknown measurement correspond to one or more specific substances of interest and are typically configured to detect from a small list potential target analytes. Thus, screening algorithms are much less broadly applicable than identification algorithms; however, they typically provide higher detection rates which makes them attractive for specific applications such as chemical warfare agent or narcotics detection. This paper will present an overview and performance characterization of a combined identification/screening algorithm that has recently been developed. It will be shown that the combined algorithm provides enhanced detection capability more typical of screening algorithms while maintaining a broad identification capability. Additionally, we will highlight how this approach can enable users to incorporate situational awareness during a response.

  10. Inheritance of Trans Chromosomal Methylation patterns from Arabidopsis F1 hybrids

    PubMed Central

    Greaves, Ian K.; Groszmann, Michael; Wang, Aihua; Peacock, W. James; Dennis, Elizabeth S.

    2014-01-01

    Hybridization in plants leads to transinteractions between the parental genomes and epigenomes that can result in changes to both 24 nt siRNA and cytosine methylation (mC) levels in the hybrid. In Arabidopsis the principal processes altering the hybrid methylome are Trans Chromosomal Methylation (TCM) and Trans Chromosomal deMethylation (TCdM), in which the mC pattern of a genomic segment attains the same mC pattern as the corresponding segment on the other parental chromosome. We examined two loci that undergo TCM/TCdM in the Arabidopsis C24/Landsberg erecta (Ler) F1 hybrids, which show patterns of inheritance dependent on the properties of the particular donor and recipient chromosomal segments. At At1g64790 the TCM- and TCdM-derived mC patterns are maintained in the F2 generation but are transmitted in outcrosses or backcrosses only by the C24 genomic segment. At a region between and adjacent to At3g43340 and At3g43350, the originally unmethylated Ler genomic segment receives the C24 mC pattern in the F1, which is then maintained in backcross plants independent of the presence of the parental C24 segment. In backcrosses to an unmethylated Ler allele, the newly methylated F1 Ler segment may act as a TCM source in a process comparable to paramutation in maize. TCM-derived mC patterns are associated with reduced expression of both At3g43340 and At3g43350 in F1 and F2 plants, providing support for such events influencing the transcriptome. The inheritance of the F1 mC patterns and the segregation of other genetic and epigenetic determinants may contribute to the reduced hybrid vigor in the F2 and subsequent generations. PMID:24449910

  11. Inheritance of Trans Chromosomal Methylation patterns from Arabidopsis F1 hybrids.

    PubMed

    Greaves, Ian K; Groszmann, Michael; Wang, Aihua; Peacock, W James; Dennis, Elizabeth S

    2014-02-04

    Hybridization in plants leads to transinteractions between the parental genomes and epigenomes that can result in changes to both 24 nt siRNA and cytosine methylation (mC) levels in the hybrid. In Arabidopsis the principal processes altering the hybrid methylome are Trans Chromosomal Methylation (TCM) and Trans Chromosomal deMethylation (TCdM), in which the mC pattern of a genomic segment attains the same mC pattern as the corresponding segment on the other parental chromosome. We examined two loci that undergo TCM/TCdM in the Arabidopsis C24/Landsberg erecta (Ler) F1 hybrids, which show patterns of inheritance dependent on the properties of the particular donor and recipient chromosomal segments. At At1g64790 the TCM- and TCdM-derived mC patterns are maintained in the F2 generation but are transmitted in outcrosses or backcrosses only by the C24 genomic segment. At a region between and adjacent to At3g43340 and At3g43350, the originally unmethylated Ler genomic segment receives the C24 mC pattern in the F1, which is then maintained in backcross plants independent of the presence of the parental C24 segment. In backcrosses to an unmethylated Ler allele, the newly methylated F1 Ler segment may act as a TCM source in a process comparable to paramutation in maize. TCM-derived mC patterns are associated with reduced expression of both At3g43340 and At3g43350 in F1 and F2 plants, providing support for such events influencing the transcriptome. The inheritance of the F1 mC patterns and the segregation of other genetic and epigenetic determinants may contribute to the reduced hybrid vigor in the F2 and subsequent generations.

  12. A Hydraulic Model Is Compatible with Rapid Changes in Leaf Elongation under Fluctuating Evaporative Demand and Soil Water Status

    PubMed Central

    Caldeira, Cecilio F.; Bosio, Mickael; Parent, Boris; Jeanguenin, Linda; Chaumont, François; Tardieu, François

    2014-01-01

    Plants are constantly facing rapid changes in evaporative demand and soil water content, which affect their water status and growth. In apparent contradiction to a hydraulic hypothesis, leaf elongation rate (LER) declined in the morning and recovered upon soil rehydration considerably more quickly than transpiration rate and leaf water potential (typical half-times of 30 min versus 1–2 h). The morning decline of LER began at very low light and transpiration and closely followed the stomatal opening of leaves receiving direct light, which represent a small fraction of leaf area. A simulation model in maize (Zea mays) suggests that these findings are still compatible with a hydraulic hypothesis. The small water flux linked to stomatal aperture would be sufficient to decrease the water potentials of the xylem and growing tissues, thereby causing a rapid decline of simulated LER, while the simulated water potential of mature tissues declines more slowly due to a high hydraulic capacitance. The model also captured growth patterns in the evening or upon soil rehydration. Changes in plant hydraulic conductance partly counteracted those of transpiration. Root hydraulic conductivity increased continuously in the morning, consistent with the transcript abundance of Zea mays Plasma Membrane Intrinsic Protein aquaporins. Transgenic lines underproducing abscisic acid, with lower hydraulic conductivity and higher stomatal conductance, had an LER declining more rapidly than wild-type plants. Whole-genome transcriptome and phosphoproteome analyses suggested that the hydraulic processes proposed here might be associated with other rapidly occurring mechanisms. Overall, the mechanisms and model presented here may be an essential component of drought tolerance under naturally fluctuating evaporative demand and soil moisture. PMID:24420931
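    The hydraulic argument, a fast response of the growing zone versus a slow, capacitance-buffered response of mature tissue, can be caricatured with a two-compartment sketch. Everything below is a toy illustration with assumed parameter values, not the maize model of the paper.

    ```python
    import numpy as np

    # Toy two-compartment hydraulic sketch (assumed parameters, arbitrary units):
    # a step increase in transpiration E lowers the quasi-steady xylem water
    # potential quickly; a high-capacitance mature-tissue compartment lags
    # behind it, while LER is assumed to track the xylem potential directly.
    dt, t_end = 1.0, 240.0                      # minutes
    R_plant, R_mat, C_mat = 0.8, 2.0, 60.0      # assumed resistances / capacitance
    psi_soil = 0.0
    psi_mat = psi_soil                          # mature tissue starts equilibrated

    for t in np.arange(0.0, t_end, dt):
        E = 0.0 if t < 30 else 0.5              # stomata open at t = 30 min
        psi_xylem = psi_soil - E * R_plant      # quasi-steady xylem potential
        psi_mat += dt * (psi_xylem - psi_mat) / (R_mat * C_mat)  # slow RC response
        ler = max(0.0, 1.0 + 2.0 * psi_xylem)   # toy linear turgor-growth relation
        if t % 60 == 0:
            print(f"t={t:5.0f} min  psi_xylem={psi_xylem:6.2f}  "
                  f"psi_mature={psi_mat:6.2f}  LER={ler:4.2f}")
    # With these values the mature compartment's half-time is
    # R_mat*C_mat*ln(2) ~ 83 min, i.e. the 1-2 h scale quoted above, while
    # LER collapses within the same time step as stomatal opening.
    ```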

  13. Resist image quality control via acid diffusion constant and/or photodecomposable quencher concentration in the fabrication of 11 nm half-pitch line-and-space patterns using extreme-ultraviolet lithography

    NASA Astrophysics Data System (ADS)

    Kozawa, Takahiro; Santillan, Julius Joseph; Itani, Toshiro

    2018-05-01

    Extreme-ultraviolet (EUV) lithography will be applied to the high-volume production of semiconductor devices with 16 nm half-pitch resolution and is expected to be extended to that of devices with 11 nm half-pitch resolution. With the reduction in feature sizes, the control of acid diffusion becomes a significant concern. In this study, the dependence of resist image quality on the product T_PEB·D_acid and on the photodecomposable quencher concentration was investigated by the Monte Carlo method on the basis of the sensitization and reaction mechanisms of chemically amplified EUV resists. Here, T_PEB and D_acid are the postexposure baking (PEB) time and the acid diffusion constant, respectively. The resist image quality of 11 nm line-and-space patterns is discussed in terms of line edge roughness (LER) and stochastic defect generation. To minimize LER, it is necessary to design and control not only the photodecomposable quencher concentration but also T_PEB·D_acid. In this case, D_acid should be adjusted to 0.3–1.5 nm² s⁻¹ for a PEB time of 60 s, optimizing the balance among LER, stochastic pinching, and stochastic bridging. Even if it is difficult to decrease D_acid to the 0.3–1.5 nm² s⁻¹ range, the image quality can still be controlled via the photodecomposable quencher concentration alone, although LER, stochastic pinching, and stochastic bridging are slightly increased. In this case, accurate control of the photodecomposable quencher concentration and a reduction in the initial standard deviation of the number of protected units are required.
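    To see why the quoted D_acid range matters at 11 nm half-pitch, the sketch below evaluates a diffusion length for the two endpoints. The one-dimensional convention L = sqrt(2·D_acid·T_PEB) is an assumption for illustration, not the convention of the paper's Monte Carlo model.

    ```python
    from math import sqrt

    T_PEB = 60.0                    # postexposure baking time, s (value from the abstract)
    for D_acid in (0.3, 1.5):       # acid diffusion constant, nm^2/s (range from the abstract)
        L = sqrt(2.0 * D_acid * T_PEB)   # 1-D diffusion-length convention, assumed here
        print(f"D_acid = {D_acid:>4} nm^2/s  ->  diffusion length ~ {L:.1f} nm")
    # ~6 nm at the low end and ~13 nm at the high end, i.e. comparable to the
    # 11 nm half-pitch itself, which is why T_PEB*D_acid must be controlled
    # jointly with the quencher concentration.
    ```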

  14. Use of a glucose management service improves glycemic control following vascular surgery: an interrupted time-series study.

    PubMed

    Wallaert, Jessica B; Chaidarun, Sushela S; Basta, Danielle; King, Kathryn; Comi, Richard; Ogrinc, Greg; Nolan, Brian W; Goodney, Philip P

    2015-05-01

    The optimal method for obtaining good blood glucose control in noncritically ill patients undergoing peripheral vascular surgery remains a topic of debate for surgeons, endocrinologists, and others involved in the care of patients with peripheral arterial disease and diabetes. A prospective trial was performed to evaluate the impact of routine use of a glucose management service (GMS) on glycemic control within 24 hours of lower-extremity revascularization (LER). In an interrupted time-series design (May 1, 2011-April 30, 2012), surgeon-directed diabetic care (Baseline phase) was compared with routine GMS involvement (Intervention phase) following LER. GMS assumed responsibility for glucose management through discharge. The main outcome measure was glycemic control, assessed by (1) mean hospitalization glucose and (2) the percentage of recorded glucose values within target range. Statistical process control charts were used to assess the impact of the intervention. Clinically important differences in patient demographics were noted between groups; the 19 patients in the Intervention arm had worse peripheral vascular disease than the 19 patients in the Baseline arm (74% critical limb ischemia versus 58%; p = .63). Routine use of GMS significantly reduced mean hospitalization glucose (191 mg/dL Baseline versus 150 mg/dL Intervention, p < .001). Further, the proportion of glucose values in target range increased (48% Baseline versus 78% Intervention, p = .05). Following withdrawal of GMS involvement, measures of glycemic control did not significantly decline for the 19 postintervention patients. Routine involvement of GMS improved glycemic control in patients undergoing LER. Future work is needed to examine the impact of improved glycemic control on clinical outcomes following LER.

  15. School-Based Screening for Suicide Risk: Balancing Costs and Benefits

    PubMed Central

    Wilcox, Holly; Huo, Yanling; Turner, J. Blake; Fisher, Prudence; Shaffer, David

    2010-01-01

    Objectives. We examined the effects of a scoring algorithm change on the burden and sensitivity of a screen for adolescent suicide risk. Methods. The Columbia Suicide Screen was used to screen 641 high school students for high suicide risk (recent ideation or lifetime attempt and depression, or anxiety, or substance use), determined by subsequent blind assessment with the Diagnostic Interview Schedule for Children. We compared the accuracy of different screen algorithms in identifying high-risk cases. Results. A screen algorithm comprising recent ideation or lifetime attempt or depression, anxiety, or substance-use problems set at moderate-severity level classed 35% of students as positive and identified 96% of high-risk students. Increasing the algorithm's threshold reduced the proportion identified to 24% and identified 92% of high-risk cases. Asking only about recent suicidal ideation or lifetime suicide attempt identified 17% of the students and 89% of high-risk cases. The proportion of nonsuicidal diagnosis–bearing students found with the 3 algorithms was 62%, 34%, and 12%, respectively. Conclusions. The Columbia Suicide Screen threshold can be altered to reduce the screen-positive population, saving costs and time while identifying almost all students at high risk for suicide. PMID:20634467
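    The cost-benefit tradeoff described above can be read as a set of operating points. The sketch below simply re-tabulates the three reported settings (numbers taken from the abstract) to show how raising the threshold trades screen-positive burden against sensitivity.

    ```python
    # Operating points reported in the abstract for the Columbia Suicide Screen.
    operating_points = [
        # (algorithm setting, % screen-positive, % of high-risk identified)
        ("ideation/attempt OR moderate symptoms", 35, 96),
        ("raised symptom threshold",              24, 92),
        ("ideation/attempt only",                 17, 89),
    ]

    print(f"{'algorithm setting':<42}{'positive %':>11}{'sensitivity %':>15}")
    for name, pos, sens in operating_points:
        print(f"{name:<42}{pos:>11}{sens:>15}")
    # Tightening the algorithm cuts the follow-up burden roughly in half
    # overall (35% -> 17% screen-positive) at a cost of 7 percentage points
    # of sensitivity (96% -> 89%).
    ```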

  16. Evaluating the utility of syndromic surveillance algorithms for screening to detect potentially clonal hospital infection outbreaks

    PubMed Central

    Talbot, Thomas R; Schaffner, William; Bloch, Karen C; Daniels, Titus L; Miller, Randolph A

    2011-01-01

    Objective The authors evaluated algorithms commonly used in syndromic surveillance for use as screening tools to detect potentially clonal outbreaks for review by infection control practitioners. Design Study phase 1 applied four aberrancy detection algorithms (CUSUM, EWMA, space-time scan statistic, and WSARE) to retrospective microbiologic culture data, producing a list of past candidate outbreak clusters. In phase 2, four infectious disease physicians categorized the phase 1 algorithm-identified clusters to ascertain algorithm performance. In phase 3, project members combined the algorithms to create a unified screening system and conducted a retrospective pilot evaluation. Measurements The study calculated recall and precision for each algorithm, and created precision-recall curves for various methods of combining the algorithms into a unified screening tool. Results Individual algorithm recall and precision ranged from 0.21 to 0.31 and from 0.053 to 0.29, respectively. Few candidate outbreak clusters were identified by more than one algorithm. The best method of combining the algorithms yielded an area under the precision-recall curve of 0.553. The phase 3 combined system detected all infection control-confirmed outbreaks during the retrospective evaluation period. Limitations Lack of phase 2 reviewers' agreement indicates that subjective expert review was an imperfect gold standard. Less conservative filtering of culture results and alternate parameter selection for each algorithm might have improved algorithm performance. Conclusion Hospital outbreak detection presents different challenges than traditional syndromic surveillance. Nevertheless, algorithms developed for syndromic surveillance have potential to form the basis of a combined system that might perform clinically useful hospital outbreak screening. PMID:21606134
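    Since the evaluation hinges on recall and precision of cluster detection, a minimal sketch of those metrics, and of a simple "flag if at least k algorithms fire" combination rule, is given below. The voting rule is one plausible way to combine detectors, offered for illustration; the paper's actual combination method is not specified here.

    ```python
    from collections import Counter

    def precision_recall(flagged, true_outbreaks):
        """flagged, true_outbreaks: sets of candidate cluster identifiers."""
        tp = len(flagged & true_outbreaks)
        precision = tp / len(flagged) if flagged else 0.0
        recall = tp / len(true_outbreaks) if true_outbreaks else 0.0
        return precision, recall

    def combine_by_vote(detections_per_algorithm, k):
        """Flag a cluster if at least k of the individual algorithms flagged it."""
        votes = Counter(c for det in detections_per_algorithm for c in det)
        return {cluster for cluster, n in votes.items() if n >= k}

    # Hypothetical outputs from four detectors (CUSUM, EWMA, scan, WSARE).
    detections = [{"A", "B"}, {"B", "C"}, {"B"}, {"D"}]
    truth = {"B", "D"}
    for k in (1, 2):
        flagged = combine_by_vote(detections, k)
        p, r = precision_recall(flagged, truth)
        print(f"k={k}: flagged={sorted(flagged)}  precision={p:.2f}  recall={r:.2f}")
    # k=1 maximizes recall at the cost of precision; k=2 does the reverse.
    # Sweeping k (or detector thresholds) traces the precision-recall curve
    # summarized by the area-under-curve figure reported above.
    ```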

  17. The West Midlands breast cancer screening status algorithm - methodology and use as an audit tool.

    PubMed

    Lawrence, Gill; Kearins, Olive; O'Sullivan, Emma; Tappenden, Nancy; Wallis, Matthew; Walton, Jackie

    2005-01-01

    This study illustrates the ability of the West Midlands breast screening status algorithm to assign a screening status to women with malignant breast cancer, and its uses as a quality assurance and audit tool. Breast cancers diagnosed between the introduction of the National Health Service [NHS] Breast Screening Programme and 31 March 2001 were obtained from the West Midlands Cancer Intelligence Unit (WMCIU). Screen-detected tumours were identified via breast screening units, and the remaining cancers were assigned to one of eight screening status categories. Multiple primaries and recurrences were excluded. A screening status was assigned to 14,680 women (96% of the cohort examined); 110 cancers were not registered at the WMCIU, and the cohort included 120 screen-detected recurrences. The West Midlands breast screening status algorithm is a robust, simple tool which can be used to derive data to evaluate the efficacy and impact of the NHS Breast Screening Programme.

  18. The error in total error reduction.

    PubMed

    Witnauer, James E; Urcelay, Gonzalo P; Miller, Ralph R

    2014-02-01

    Most models of human and animal learning assume that learning is proportional to the discrepancy between a delivered outcome and the outcome predicted by all cues present during that trial (i.e., total error across a stimulus compound). This total error reduction (TER) view has been implemented in connectionist and artificial neural network models to describe the conditions under which weights between units change. Electrophysiological work has revealed that the activity of dopamine neurons is correlated with the total error signal in models of reward learning. Similar neural mechanisms presumably support fear conditioning, human contingency learning, and other types of learning. Using a computational modeling approach, we compared several TER models of associative learning to an alternative model that rejects the TER assumption in favor of local error reduction (LER), which assumes that learning about each cue is proportional to the discrepancy between the delivered outcome and the outcome predicted by that specific cue on that trial. The LER model provided a better fit to the reviewed data than the TER models. Given the superiority of the LER model with the present data sets, acceptance of TER should be tempered. Copyright © 2013 Elsevier Inc. All rights reserved.

  19. ERECTA, salicylic acid, abscisic acid, and jasmonic acid modulate quantitative disease resistance of Arabidopsis thaliana to Verticillium longisporum.

    PubMed

    Häffner, Eva; Karlovsky, Petr; Splivallo, Richard; Traczewska, Anna; Diederichsen, Elke

    2014-04-01

    Verticillium longisporum is a soil-borne vascular pathogen infecting cruciferous hosts such as oilseed rape. Quantitative disease resistance (QDR) is the major control means, but its molecular basis is poorly understood so far. Quantitative trait locus (QTL) mapping was performed using a new (Bur×Ler) recombinant inbred line (RIL) population of Arabidopsis thaliana. Phytohormone measurements and analyses in defined mutants and near-isogenic lines (NILs) were used to identify genes and signalling pathways that underlie different resistance QTL. QTL for resistance to V. longisporum-induced stunting, systemic colonization by the fungus and V. longisporum-induced chlorosis were identified. Stunting resistance QTL were contributed by both parents. The strongest stunting resistance QTL was shown to be identical to Erecta. A functional Erecta pathway, which was present in Bur, conferred partial resistance to V. longisporum-induced stunting. Bur showed severe stunting susceptibility in winter. Three stunting resistance QTL of Ler origin, two co-localising with wall-associated kinase-like (Wakl) genes, were detected in winter. Furthermore, Bur showed a much stronger induction of salicylic acid (SA) by V. longisporum than Ler. Systemic colonization was controlled independently of stunting. The vec1 QTL on chromosome 2 had the strongest effect on systemic colonization. The same chromosomal region controlled the levels of abscisic acid (ABA) and jasmonic acid (JA) in response to V. longisporum: the level of ABA was higher in colonization-susceptible Ler than in colonization-resistant Bur after V. longisporum infection. JA was down-regulated in Bur after infection, but not in Ler. These differences were also demonstrated in NILs varying only in the region containing vec1. All phytohormone responses were shown to be independent of Erecta. Signalling systems with a hitherto unknown role in the QDR of A. thaliana against V. longisporum were identified: Erecta mediated resistance against V. longisporum-induced stunting. Independent of Erecta, stunting was caused in a light-dependent manner with possible participation of SA and Wakl genes. ABA and JA showed a genotype-specific response that corresponded with systemic colonization by the fungus. Understanding the biological basis of phenotypic variation in A. thaliana with respect to V. longisporum resistance will provide new approaches for implementing durable resistance in cruciferous crops.

  20. Clinical evaluation of the vector algorithm for neonatal hearing screening using automated auditory brainstem response.

    PubMed

    Keohane, Bernie M; Mason, Steve M; Baguley, David M

    2004-02-01

    A novel auditory brainstem response (ABR) detection and scoring algorithm, entitled the Vector algorithm, is described. An independent clinical evaluation of the algorithm using 464 tests (120 non-stimulated and 344 stimulated tests) on 60 infants, with a mean age of approximately 6.5 weeks, estimated test sensitivity at greater than 0.99 and test specificity at 0.87 for one test. Specificity was estimated to be greater than 0.95 for a two-stage screen. Test times were of the order of 1.5 minutes per ear for detection of an ABR and 4.5 minutes per ear in the absence of a clear response. The Vector algorithm is commercially available for both automated screening and threshold estimation in hearing screening devices.
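    The jump from 0.87 single-test specificity to greater than 0.95 for the two-stage screen is what one expects if a referral requires a positive (no-response) result at both stages. The arithmetic below makes that explicit, under the simplifying assumption that the two stages err independently.

    ```python
    sens, spec = 0.99, 0.87            # single-test figures from the abstract

    # Two-stage screen: an infant is referred only if BOTH stages are positive.
    # Assuming independent errors between stages (a simplification):
    spec_two_stage = 1 - (1 - spec) ** 2   # a false referral needs two false positives
    sens_two_stage = sens ** 2             # detection needs two true positives

    print(f"two-stage specificity ~ {spec_two_stage:.3f}")  # ~0.983, consistent with >0.95
    print(f"two-stage sensitivity ~ {sens_two_stage:.3f}")  # ~0.980
    ```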

  1. Comparison of Traditional and Reverse Syphilis Screening Algorithms in Medical Health Checkups.

    PubMed

    Nah, Eun Hee; Cho, Seon; Kim, Suyoung; Cho, Han Ik; Chai, Jong Yil

    2017-11-01

    The syphilis diagnostic algorithms applied in different countries vary significantly depending on the local syphilis epidemiology and other considerations, including the expected workload, the need for automation in the laboratory, and budget factors. This study was performed to investigate the efficacy of traditional and reverse syphilis diagnostic algorithms during general health checkups. In total, 1,000 blood specimens were obtained from 908 men and 92 women during their regular health checkups. Traditional screening and reverse screening were applied to the same specimens using automated rapid plasma reagin (RPR) and Treponema pallidum latex agglutination (TPLA) tests, respectively. Specimens that were reactive in the reverse algorithm (TPLA) were subjected to a second treponemal test, the chemiluminescent microparticle immunoassay (CMIA). Of the 1,000 specimens tested, 68 (6.8%) were reactive by reverse screening (TPLA) compared with 11 (1.1%) by traditional screening (RPR). The traditional algorithm failed to detect 48 specimens [TPLA(+)/RPR(-)/CMIA(+)]. The median TPLA cutoff index (COI) was higher in CMIA-reactive cases than in CMIA-nonreactive cases (90.5 vs 12.5 U). The reverse screening algorithm could detect subjects with possible latent syphilis who were not detected by the traditional algorithm. Those individuals could be provided with opportunities for evaluating syphilis during their health checkups. The COI values of the initial TPLA test may be helpful in excluding false-positive TPLA test results in the reverse algorithm. © The Korean Society for Laboratory Medicine
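    The two algorithms differ only in which test class leads. The sketch below encodes both orderings as simple decision functions; the test names follow the abstract, but the result-handling logic is a plausible simplification, and real laboratory protocols add staging and reporting nuances omitted here.

    ```python
    def traditional_screen(rpr_reactive, treponemal_reactive):
        """Non-treponemal (RPR) first; a treponemal test only confirms reactives."""
        if not rpr_reactive:
            return "negative"            # latent/past syphilis can be missed at this step
        return "syphilis" if treponemal_reactive else "false-positive RPR"

    def reverse_screen(tpla_reactive, rpr_reactive, cmia_reactive):
        """Treponemal (TPLA) first; RPR stages disease, CMIA adjudicates discordants."""
        if not tpla_reactive:
            return "negative"
        if rpr_reactive:
            return "syphilis (active or recent)"
        return "possible latent/past syphilis" if cmia_reactive else "likely false-positive TPLA"

    # The 48 infections missed by the traditional algorithm were TPLA(+)/RPR(-)/CMIA(+):
    print(traditional_screen(rpr_reactive=False, treponemal_reactive=True))  # negative
    print(reverse_screen(True, False, True))      # possible latent/past syphilis
    ```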

  2. Immigrant screening for latent tuberculosis in Norway: a cost-effectiveness analysis.

    PubMed

    Haukaas, Fredrik Salvesen; Arnesen, Trude Margrete; Winje, Brita Askeland; Aas, Eline

    2017-05-01

    The incidence of tuberculosis (TB) disease has increased in Norway since the mid-1990s. Immigrants are screened, and some are treated, for latent TB infection (LTBI) to prevent TB disease (reactivation). In this study, we estimated the costs of both treating and screening for LTBI and TB disease, which has not been done previously in Norway. We developed a model to indicate the cost-effectiveness of four different screening algorithms for LTBI using avoided TB disease cases as the health outcome. Further, we calculated the expected value of perfect information (EVPI), and indicated areas of LTBI screening that could be changed to improve cost-effectiveness. The costs of treating LTBI and TB disease were estimated to be €1938 and €15,489 per case, respectively. The model evaluates four algorithms, and suggests three cost-effective algorithms depending on the cost-effectiveness threshold. Screening all immigrants with interferon-gamma release assays (IGRA) requires the highest threshold (€28,400), followed by the algorithms "IGRA on immigrants with risk factors" and "no LTBI screening." EVPI is approximately €5 per screened immigrant. The costs for a cohort of 20,000 immigrants followed through 10 years range from €12.2 million for the algorithm "screening and treatment for TB disease but no LTBI screening," to €14 million for "screening all immigrants for both TB disease and LTBI with IGRA." The results suggest that the cost of TB disease screening and treatment is the largest contributor to total costs, while LTBI screening and treatment costs are relatively small. Increasing the proportion of IGRA-positive immigrants who are treated decreases the costs per avoided case substantially.

  3. Confidence Intervals for System Reliability and Availability of Maintained Systems Using Monte Carlo Techniques

    DTIC Science & Technology

    1981-12-01

    [Garbled OCR front matter; recoverable details only: United States Air Force, Air University, Air Force Institute of Technology, Wright-Patterson Air Force Base, Ohio. Thesis submitted in partial fulfillment of the requirements for the degree of Master of Operations Research, December 1981; approved for public release.]

  4. Evaluation of the CDC proposed laboratory HIV testing algorithm among men who have sex with men (MSM) from five US metropolitan statistical areas using specimens collected in 2011.

    PubMed

    Masciotra, Silvina; Smith, Amanda J; Youngpairoj, Ae S; Sprinkle, Patrick; Miles, Isa; Sionean, Catlainn; Paz-Bailey, Gabriela; Johnson, Jeffrey A; Owen, S Michele

    2013-12-01

    Until recently, most testing algorithms in the United States (US) utilized Western blot (WB) as the supplemental test. CDC has proposed an algorithm for HIV diagnosis which includes an initial screen with a combination antigen/antibody 4th-generation immunoassay (IA), followed by an HIV-1/2 discriminatory IA of initially reactive-IA specimens. Discordant results in the proposed algorithm are resolved by nucleic acid-amplification testing (NAAT). We evaluated the results obtained with the CDC proposed laboratory-based algorithm using specimens from men who have sex with men (MSM) obtained in five metropolitan statistical areas (MSAs). Specimens from 992 MSM from five MSAs participating in the CDC's National HIV Behavioral Surveillance System in 2011 were tested at local facilities and at CDC. The five MSAs utilized algorithms of various screening assays and specimen types, with WB as the supplemental test. At CDC, serum/plasma specimens were screened with a 4th-generation IA, and the Multispot HIV-1/HIV-2 discriminatory assay was used as the supplemental test. NAAT was used to resolve discordant results and to identify acute HIV infections among all screened-non-reactive specimens missed by the proposed algorithm. Performance of the proposed algorithm was compared to site-specific WB-based algorithms. The proposed algorithm detected 254 infections. The WB-based algorithms detected 19 fewer infections: 4 were missed by oral fluid (OF) rapid testing and 15 by WB supplemental testing (12 OF and 3 blood). One acute infection was identified by NAAT among all screened-non-reactive specimens. The proposed algorithm identified more infections than the WB-based algorithms in a high-risk MSM population. OF testing was associated with most of the discordant results between algorithms. HIV testing with the proposed algorithm can increase diagnosis of infected individuals, including early infections. Published by Elsevier B.V.
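    A compact way to see the proposed algorithm's structure is as a three-step decision chain. The sketch below is a schematic of that chain as described in the abstract, with boolean inputs standing in for laboratory results; it is an illustration, not a clinical decision tool.

    ```python
    def cdc_proposed_algorithm(ag_ab_4g_reactive, discriminatory_reactive, naat_positive):
        """Schematic of the proposed laboratory algorithm described in the abstract."""
        if not ag_ab_4g_reactive:
            return "screen non-reactive (acute infection still possible; resolved only by NAAT surveys)"
        if discriminatory_reactive:
            return "HIV infection confirmed (HIV-1 or HIV-2 per discriminatory IA)"
        # 4th-gen reactive but discriminatory non-reactive: discordant -> resolve by NAAT
        return "acute HIV infection" if naat_positive else "false-reactive screen"

    print(cdc_proposed_algorithm(True, False, True))   # acute HIV infection
    ```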

  5. Performance Characteristics of the Reverse Syphilis Screening Algorithm in a Population With a Moderately High Prevalence of Syphilis.

    PubMed

    Rourk, Angela R; Nolte, Frederick S; Litwin, Christine M

    2016-11-01

    With the recent introduction of automated treponemal tests, a new reverse syphilis algorithm has been proposed and is now used by many clinical laboratories. We analyzed the impact of instituting the reverse syphilis screening algorithm in a laboratory that serves a geographic area with a moderately high prevalence of syphilis infection. Serum samples sent for syphilis testing were tested using a treponemal enzyme immunoassay (EIA) as the screening assay. EIA-reactive samples were tested by rapid plasma reagin (RPR) and titered to end point if reactive. RPR-nonreactive samples were analyzed by the Treponema pallidum particle agglutination test (TP-PA). Pertinent medical records were reviewed for false-reactive screens and samples with evidence of past syphilis infection. Among 10,060 patients tested, 502 (5%) were reactive on the initial EIA screen. The RPR was reactive in 150 (1.5%). TP-PA testing determined that 103 (1.0%) were falsely reactive on the initial EIA screen. The reverse screening algorithm, however, identified 242 (2.4%) with evidence of latent, secondary, or past syphilis, 21 of whom had no or unknown prior treatment with antibiotics. Despite a 1.0% false-reactive rate, the reverse syphilis algorithm detected 21 patients with possible latent syphilis that may have gone undetected by traditional syphilis screening. © American Society for Clinical Pathology, 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  6. Effect of chemoepitaxial guiding underlayer design on the pattern quality and shape of aligned lamellae for fabrication of line-space patterns

    NASA Astrophysics Data System (ADS)

    Nation, Benjamin D.; Peters, Andrew J.; Lawson, Richard A.; Ludovice, Peter J.; Henderson, Clifford L.

    2017-10-01

    Chemoepitaxial guidance of block copolymer directed self-assembly in thin films is explored using a coarse-grained molecular dynamics model. The underlayers studied are 2× density multiplying line-space patterns composed of repeating highly preferential pinning stripes of various widths separated by larger, more neutral, background regions of various compositions. Decreasing the pinning stripe width or making the background region more neutral is found to increase the line edge roughness (LER) of the lines, but these conditions are found to give the straightest sidewalls for the formed lines. Varying these underlayer properties is found to have minimal effect on linewidth roughness. A larger pinning stripe causes the pinned line (PL) to foot (expand near the substrate), and a preferential background region causes the unpinned line (UPL) to undercut (contract near the substrate). A simple model was developed to predict the optimal conditions to eliminate footing. Using this model, conditions are found that decrease footing of the PL, but these conditions increase undercutting in the UPL. Deformations in either the PL or UPL propagate to the other line. There exists a trade-off between LER and the footing/undercutting, that is, decreasing LER increases footing/undercutting and vice versa.

  7. Costs per Diagnosis of Acute HIV Infection in Community-based Screening Strategies: A Comparative Analysis of Four Screening Algorithms

    PubMed Central

    Hoenigl, Martin; Graff-Zivin, Joshua; Little, Susan J.

    2016-01-01

    Background. In nonhealthcare settings, widespread screening for acute human immunodeficiency virus (HIV) infection (AHI) is limited by cost and decision algorithms to better prioritize use of resources. Comparative cost analyses for available strategies are lacking. Methods. To determine cost-effectiveness of community-based testing strategies, we evaluated annual costs of 3 algorithms that detect AHI based on HIV nucleic acid amplification testing (EarlyTest algorithm) or on HIV p24 antigen (Ag) detection via Architect (Architect algorithm) or Determine (Determine algorithm) as well as 1 algorithm that relies on HIV antibody testing alone (Antibody algorithm). The cost model used data on men who have sex with men (MSM) undergoing community-based AHI screening in San Diego, California. Incremental cost-effectiveness ratios (ICERs) per diagnosis of AHI were calculated for programs with HIV prevalence rates between 0.1% and 2.9%. Results. Among MSM in San Diego, EarlyTest was cost-saving (ie, ICERs per AHI diagnosis less than $13,000) when compared with the 3 other algorithms. Cost analyses relative to regional HIV prevalence showed that EarlyTest was cost-effective (ie, ICERs less than $69,547) for similar populations of MSM with an HIV prevalence rate >0.4%; Architect was the second best alternative for HIV prevalence rates >0.6%. Conclusions. Identification of AHI by the dual EarlyTest screening algorithm is likely to be cost-effective not only among at-risk MSM in San Diego but also among similar populations of MSM with HIV prevalence rates >0.4%. PMID:26508512
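    As a reminder of the underlying metric, an ICER is just the cost difference divided by the effect difference between two strategies. The sketch below computes one for hypothetical program figures; the actual San Diego inputs are not given in the abstract.

    ```python
    def icer(cost_new, cost_old, effect_new, effect_old):
        """Incremental cost-effectiveness ratio: extra cost per extra unit of
        effect (here, per additional AHI diagnosis)."""
        d_effect = effect_new - effect_old
        if d_effect == 0:
            raise ValueError("strategies have equal effectiveness; ICER undefined")
        return (cost_new - cost_old) / d_effect

    # Hypothetical annual figures for two screening programs:
    print(icer(cost_new=250_000, cost_old=180_000,
               effect_new=12, effect_old=5))   # -> $10,000 per additional AHI diagnosis
    ```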

  8. Adaptive Gaussian mixture models for pre-screening in GPR data

    NASA Astrophysics Data System (ADS)

    Torrione, Peter; Morton, Kenneth, Jr.; Besaw, Lance E.

    2011-06-01

    Due to the large amount of data generated by vehicle-mounted ground penetrating radar (GPR) antenna arrays, advanced feature extraction and classification can only be performed on a small subset of data during real-time operation. As a result, most GPR-based landmine detection systems implement "pre-screening" algorithms to process all of the data generated by the antenna array and identify locations with anomalous signatures for more advanced processing. These pre-screening algorithms must be computationally efficient and obtain a high probability of detection, but can permit a false alarm rate which might be higher than the total system requirements. Many approaches to pre-screening have previously been proposed, including linear prediction coefficients, the LMS algorithm, and CFAR-based approaches. Similar pre-screening techniques have also been developed in the field of video processing to identify anomalous behavior or anomalous objects. One such algorithm, an online k-means approximation to an adaptive Gaussian mixture model (GMM), is particularly well-suited to pre-screening in GPR data due to its computational efficiency, non-linear nature, and the relevance of the logic underlying the algorithm to GPR processing. In this work we explore the application of an adaptive GMM-based approach for anomaly detection from the video processing literature to pre-screening in GPR data. Results with the ARA Nemesis landmine detection system demonstrate significant pre-screening performance improvements compared to alternative approaches, and indicate that the proposed algorithm is a complementary technique to existing methods.
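    For the online k-means approximation to an adaptive GMM (a Stauffer-Grimson-style background model, transplanted here from video processing), the core per-sample update fits in a few lines. The sketch below is a generic one-dimensional version with assumed parameter values, not ARA's production pre-screener.

    ```python
    import numpy as np

    class AdaptiveGMM1D:
        """Online k-means approximation to an adaptive Gaussian mixture model.
        A sample matching a component within a few sigma updates that component
        (background); an unmatched sample is flagged as anomalous and seeds a
        new low-weight component in place of the weakest one."""

        def __init__(self, k=3, lr=0.05, match_sigmas=2.5):
            self.mu = np.zeros(k)
            self.var = np.ones(k)
            self.w = np.full(k, 1.0 / k)
            self.lr, self.match_sigmas = lr, match_sigmas

        def update(self, x):
            d2 = (x - self.mu) ** 2
            matches = d2 < (self.match_sigmas ** 2) * self.var
            anomalous = not matches.any()
            if anomalous:                       # replace the weakest component
                i = np.argmin(self.w)
                self.mu[i], self.var[i], self.w[i] = x, 10.0, 0.01
            else:                               # winner-take-all k-means-style update
                i = np.argmax(matches / (self.var + 1e-9))
                self.mu[i] += self.lr * (x - self.mu[i])
                self.var[i] += self.lr * (d2[i] - self.var[i])
            self.w = (1 - self.lr) * self.w + self.lr * (np.arange(len(self.w)) == i)
            self.w /= self.w.sum()
            return anomalous

    model = AdaptiveGMM1D()
    stream = list(np.random.normal(0.0, 1.0, 500)) + [8.0]  # clutter, then a strong anomaly
    flags = [model.update(x) for x in stream]
    print("last sample flagged:", flags[-1])    # the 8.0 outlier is flagged
    ```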

  9. Genetic and Developmental Basis for Increased Leaf Thickness in the Arabidopsis Cvi Ecotype.

    PubMed

    Coneva, Viktoriya; Chitwood, Daniel H

    2018-01-01

    Leaf thickness is a quantitative trait that is associated with the ability of plants to occupy dry, high-irradiance environments. Despite its importance, leaf thickness has been difficult to measure reproducibly, which has impeded progress in understanding its genetic basis and the associated anatomical mechanisms that pattern it. Here, we used a custom-built dual confocal profilometer device to measure leaf thickness in the Arabidopsis Ler × Cvi recombinant inbred line population and found statistical support for four quantitative trait loci (QTL) associated with this trait. We used publicly available data for a suite of traits relating to flowering time and growth responses to light quality and show that three of the four leaf thickness QTL coincide with QTL for at least one of these traits. Using time-course photography, we quantified the relative growth rate and the pace of rosette leaf initiation in the Ler and Cvi ecotypes. We found that Cvi rosettes grow more slowly than Ler, both in terms of the rate of leaf initiation and the overall rate of biomass accumulation. Collectively, these data suggest that leaf thickness is tightly linked with physiological status and may present a tradeoff between the ability to withstand stress and rapid vegetative growth. To understand the anatomical basis of leaf thickness, we compared cross-sections of Cvi and Ler leaves and show that Cvi palisade mesophyll cells elongate anisotropically, contributing to leaf thickness. Flow cytometry of whole leaves shows that endopolyploidy accompanies thicker leaves in Cvi. Overall, our data suggest that, mechanistically, an altered schedule of cellular events affecting endopolyploidy and increasing palisade mesophyll cell length contributes to the increased leaf thickness of Cvi. Ultimately, knowledge of the genetic basis and developmental trajectory of leaf thickness will inform the mechanisms by which natural selection acts to produce variation in this adaptive trait.

  10. Genetic and Developmental Basis for Increased Leaf Thickness in the Arabidopsis Cvi Ecotype

    PubMed Central

    Coneva, Viktoriya; Chitwood, Daniel H.

    2018-01-01

    Leaf thickness is a quantitative trait that is associated with the ability of plants to occupy dry, high-irradiance environments. Despite its importance, leaf thickness has been difficult to measure reproducibly, which has impeded progress in understanding its genetic basis and the associated anatomical mechanisms that pattern it. Here, we used a custom-built dual confocal profilometer device to measure leaf thickness in the Arabidopsis Ler × Cvi recombinant inbred line population and found statistical support for four quantitative trait loci (QTL) associated with this trait. We used publicly available data for a suite of traits relating to flowering time and growth responses to light quality and show that three of the four leaf thickness QTL coincide with QTL for at least one of these traits. Using time-course photography, we quantified the relative growth rate and the pace of rosette leaf initiation in the Ler and Cvi ecotypes. We found that Cvi rosettes grow more slowly than Ler, both in terms of the rate of leaf initiation and the overall rate of biomass accumulation. Collectively, these data suggest that leaf thickness is tightly linked with physiological status and may present a tradeoff between the ability to withstand stress and rapid vegetative growth. To understand the anatomical basis of leaf thickness, we compared cross-sections of Cvi and Ler leaves and show that Cvi palisade mesophyll cells elongate anisotropically, contributing to leaf thickness. Flow cytometry of whole leaves shows that endopolyploidy accompanies thicker leaves in Cvi. Overall, our data suggest that, mechanistically, an altered schedule of cellular events affecting endopolyploidy and increasing palisade mesophyll cell length contributes to the increased leaf thickness of Cvi. Ultimately, knowledge of the genetic basis and developmental trajectory of leaf thickness will inform the mechanisms by which natural selection acts to produce variation in this adaptive trait. PMID:29593772

  11. Allelic differences in a vacuolar invertase affect Arabidopsis growth at early plant development.

    PubMed

    Leskow, Carla Coluccio; Kamenetzky, Laura; Dominguez, Pia Guadalupe; Díaz Zirpolo, José Antonio; Obata, Toshihiro; Costa, Hernán; Martí, Marcelo; Taboga, Oscar; Keurentjes, Joost; Sulpice, Ronan; Ishihara, Hirofumi; Stitt, Mark; Fernie, Alisdair Robert; Carrari, Fernando

    2016-07-01

    Improving carbon fixation in order to enhance crop yield is a major goal in plant sciences. By quantitative trait locus (QTL) mapping, it has been demonstrated that a vacuolar invertase (vac-Inv) plays a key role in determining radicle length in Arabidopsis. In this model, variation in vac-Inv activity was detected in a near-isogenic line (NIL) population derived from a cross between two divergent accessions, Landsberg erecta (Ler) and Cape Verde Island (CVI), with the CVI allele conferring both higher Inv activity and longer radicles. The aim of the current work was to understand the mechanism(s) underlying this QTL by analyzing structural and functional differences of vac-Inv from both accessions. Relative transcript abundance analyzed by quantitative real-time PCR (qRT-PCR) showed similar expression patterns in both accessions; however, DNA sequence analyses revealed several polymorphisms that lead to changes in the corresponding protein sequence. Moreover, activity assays revealed higher vac-Inv activity in genotypes carrying the CVI allele than in those carrying the Ler allele. Analyses of purified recombinant proteins showed a similar K_m for both alleles and a slightly higher V_max for that of Ler. Treatment of plant extracts with foaming to release possible interacting Inv inhibitory protein(s) led to a large increase in activity for the Ler allele, but no change for genotypes carrying the CVI allele. qRT-PCR analyses of two vac-Inv inhibitors in seedlings from parental and NIL genotypes revealed different expression patterns. Taken together, these results demonstrate that the vac-Inv QTL affects root biomass accumulation and also carbon partitioning through a differential regulation of vac-Inv inhibitors at the mRNA level. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  12. The dynamics of the outer edge of Saturn's A ring disturbed by Janus-Epimetheus

    NASA Astrophysics Data System (ADS)

    Renner, Stéfan; Santos Araujo, Nilton Carlos; Cooper, Nicholas; El Moutamid, Maryame; Murray, Carl; Sicardy, Bruno

    2016-10-01

    We developed an analytical model to study the dynamics of the outer edge of Saturn's A ring, which is influenced by 7:6 mean motion resonances with Janus and Epimetheus. Because of the horseshoe motion of the two co-orbital moons, the locations of the resonances shift inwards or outwards every four years, making the ring edge particles alternately trapped in a corotation eccentricity resonance (CER) or a Lindblad eccentricity resonance (LER). However, the oscillation periods of the resonances are longer than the four-year interval between the switches in the orbits of Janus and Epimetheus. Averaged equations of motion are used, and our model is numerically integrated to describe the effects of the periodic sweeping of the 7:6 CER and LER over the ring edge region. We show that four radial zones (ranges 136715-136723, 136738-136749, 136756-136768, 136783-136791 km) are chaotic on decadal timescales, within which particle semimajor axes have periodic changes due to partial libration motions around the CER fixed points. After a few decades, the maximum variation of semimajor axis is about eleven (resp. three) kilometers in the case of the CER with Janus (resp. Epimetheus). Similarly, particle eccentricities have partial oscillations forced by the LERs every four years, and are in good agreement with the observed eccentricities (Spitale and Porco 2009, El Moutamid et al. 2015). For initially circular orbits, the maximum eccentricity reached (~0.001) corresponds to the value obtained from the classical theory of resonance (proportional to the cube root of the satellite-to-planet mass ratio). We notice that the fitted semimajor axes for the object recently discovered at the ring edge (Murray et al. 2014) are just outside the chaotic zone of radial range 136756-136768 km. We compare our results to Cassini observations, and discuss how the periodic LER/CER perturbations by Janus/Epimetheus may help to aggregate ring edge particles into clumps, as seen in high-resolution images.
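    The quoted maximum eccentricity can be checked against the classical first-order resonance scaling. Taking approximate masses for Janus and Saturn (values assumed here for illustration, not given in the abstract), the cube-root law gives the observed order of magnitude:

    ```latex
    e_{\max} \sim \left(\frac{m_{\mathrm{Janus}}}{M_{\mathrm{Saturn}}}\right)^{1/3}
            \approx \left(\frac{1.9\times10^{18}\,\mathrm{kg}}{5.7\times10^{26}\,\mathrm{kg}}\right)^{1/3}
            \approx \left(3.3\times10^{-9}\right)^{1/3}
            \approx 1.5\times10^{-3}
    ```

    which is consistent with the maximum forced eccentricity of ~0.001 quoted above.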

  13. Fast-SG: an alignment-free algorithm for hybrid assembly.

    PubMed

    Di Genova, Alex; Ruz, Gonzalo A; Sagot, Marie-France; Maass, Alejandro

    2018-05-01

    Long-read sequencing technologies are the ultimate solution for genome repeats, allowing near reference-level reconstructions of large genomes. However, long-read de novo assembly pipelines are computationally intense and require a considerable amount of coverage, thereby hindering their broad application to the assembly of large genomes. Alternatively, hybrid assembly methods that combine short- and long-read sequencing technologies can reduce the time and cost required to produce de novo assemblies of large genomes. Here, we propose a new method, called Fast-SG, that uses a new ultrafast alignment-free algorithm specifically designed for constructing a scaffolding graph using lightweight data structures. Fast-SG can construct the graph from either short or long reads. This allows the reuse of efficient algorithms designed for short-read data and permits the definition of novel modular hybrid assembly pipelines. Using comprehensive standard datasets and benchmarks, we show how Fast-SG outperforms the state-of-the-art short-read aligners when building the scaffolding graph and can be used to extract linking information from either raw or error-corrected long reads. We also show how a hybrid assembly approach using Fast-SG with shallow long-read coverage (5X) and moderate computational resources can produce long-range and accurate reconstructions of the genomes of Arabidopsis thaliana (Ler-0) and human (NA12878). Fast-SG opens a door to achieving accurate hybrid long-range reconstructions of large genomes with low effort, high portability, and low cost.
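    The alignment-free idea behind a scaffolding graph can be sketched with a toy k-mer index: place each read of a pair on a contig by exact k-mer lookup, and let pairs landing on different contigs become graph edges. This is a generic illustration of the concept, not Fast-SG's actual data structures or parameters.

    ```python
    from collections import defaultdict

    K = 5  # toy value; real assemblies use much larger k

    def index_contigs(contigs):
        """Map each k-mer that occurs in exactly one place to its contig."""
        index, seen_twice = {}, set()
        for name, seq in contigs.items():
            for i in range(len(seq) - K + 1):
                kmer = seq[i:i + K]
                if kmer in index or kmer in seen_twice:
                    index.pop(kmer, None)      # drop repeated k-mers: keep unique only
                    seen_twice.add(kmer)
                else:
                    index[kmer] = name
        return index

    def place(read, index):
        """Assign a read to a contig only if all its indexed k-mers agree."""
        hits = {index[read[i:i + K]] for i in range(len(read) - K + 1)
                if read[i:i + K] in index}
        return hits.pop() if len(hits) == 1 else None

    contigs = {"ctg1": "ACGTACGGTTCA", "ctg2": "TTGCCAGATCGA"}
    index = index_contigs(contigs)
    edges = defaultdict(int)
    for r1, r2 in [("ACGTACGG", "CAGATCGA")]:  # one toy mate pair
        a, b = place(r1, index), place(r2, index)
        if a and b and a != b:
            edges[tuple(sorted((a, b)))] += 1  # linking evidence between contigs
    print(dict(edges))    # {('ctg1', 'ctg2'): 1}
    ```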

  14. Simulation optimization of PSA-threshold based prostate cancer screening policies

    PubMed Central

    Zhang, Jingyu; Denton, Brian T.; Shah, Nilay D.; Inman, Brant A.

    2013-01-01

    We describe a simulation optimization method to design PSA screening policies based on expected quality adjusted life years (QALYs). Our method integrates a simulation model in a genetic algorithm which uses a probabilistic method for selection of the best policy. We present computational results about the efficiency of our algorithm. The best policy generated by our algorithm is compared to previously recommended screening policies. Using the policies determined by our model, we present evidence that patients should be screened more aggressively but for a shorter length of time than previously published guidelines recommend. PMID:22302420
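    The architecture described, a simulation embedded inside a genetic algorithm with probabilistic selection of the incumbent best, can be sketched generically. Below, a noisy toy fitness function stands in for the QALY simulation; the policy encoding, operators, and parameter values are all assumptions for illustration.

    ```python
    import random

    def simulate_qalys(policy):
        """Stand-in for the screening simulation: a noisy estimate of expected
        QALYs. Toy objective with an assumed optimum near a threshold of 4.0."""
        return -(policy - 4.0) ** 2 + random.gauss(0.0, 0.1)  # peak + simulation noise

    def genetic_search(pop_size=20, generations=40, replications=5):
        population = [random.uniform(1.0, 10.0) for _ in range(pop_size)]
        for _ in range(generations):
            # Probabilistic selection: average several noisy replications per policy
            scored = [(sum(simulate_qalys(p) for _ in range(replications)) / replications, p)
                      for p in population]
            scored.sort(reverse=True)
            elites = [p for _, p in scored[: pop_size // 4]]
            # Crossover (blend of two elites) plus Gaussian mutation refills the population
            population = elites + [
                (random.choice(elites) + random.choice(elites)) / 2 + random.gauss(0, 0.2)
                for _ in range(pop_size - len(elites))
            ]
        # Final pick uses more replications to reduce the chance of a lucky noisy winner
        return max(population, key=lambda p: sum(simulate_qalys(p) for _ in range(20)) / 20)

    print(f"best threshold found: {genetic_search():.2f}")   # ~4.0 on this toy objective
    ```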

  15. The centroidal algorithm in molecular similarity and diversity calculations on confidential datasets.

    PubMed

    Trepalin, Sergey; Osadchiy, Nikolay

    2005-01-01

    Chemical structure provides an exhaustive description of a compound, but it is often proprietary and thus an impediment in the exchange of information. For example, structure disclosure is often needed for the selection of the most similar or dissimilar compounds. The authors propose a centroidal algorithm based on structural fragments (screens) that can be efficiently used for similarity and diversity selections without disclosing structures from the reference set. For increased security, the authors recommend that such a set contain at least some tens of structures. Analysis of reverse-engineering feasibility showed that the difficulty of the problem grows as the screen radius decreases. The algorithm is illustrated with concrete calculations on known steroidal, quinoline, and quinazoline drugs. We also investigate the problem of scaffold identification in a combinatorial library dataset. The results show that relatively small screens of radius equal to 2 bond lengths perform well in similarity sorting, while radius-4 screens yield better results in diversity sorting. The software implementation of the algorithm takes an SDF file with a reference set and generates screens of various radii, which are subsequently used for the similarity and diversity sorting of external SDFs. Since the reverse engineering of the reference set molecules from their screens has the same difficulty as the RSA asymmetric encryption algorithm, generated screens can be stored openly without further encryption. This approach ensures that an end user transfers only a set of structural fragments and no other data. Like other encryption algorithms, the centroidal algorithm cannot give a 100% guarantee of protecting a chemical structure from the dataset, but the probability of identifying an initial structure is very small, on the order of 10⁻⁴⁰ in typical cases.
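    The privacy argument rests on exchanging only fragment screens and comparing them set-wise. The sketch below illustrates that idea with a Tanimoto similarity over fragment sets; the fragment vocabulary and molecules are hypothetical, and the real algorithm derives its screens from substructures of a given bond radius.

    ```python
    def tanimoto(screens_a: set, screens_b: set) -> float:
        """Set-based Tanimoto similarity over structural fragment screens."""
        if not screens_a and not screens_b:
            return 1.0
        return len(screens_a & screens_b) / len(screens_a | screens_b)

    # Hypothetical radius-2 screens for a reference set and a query compound.
    reference_screens = [{"c:c:c", "C-N", "C=O"}, {"c:c:c", "C-O", "C-Cl"}]
    query = {"c:c:c", "C=O", "C-F"}

    # Similarity sorting against the reference set discloses no structures,
    # only the overlap of fragment vocabularies.
    scores = sorted((tanimoto(query, ref) for ref in reference_screens), reverse=True)
    print(scores)   # [0.5, 0.2] for these toy sets
    ```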

  16. The centroidal algorithm in molecular similarity and diversity calculations on confidential datasets

    NASA Astrophysics Data System (ADS)

    Trepalin, Sergey; Osadchiy, Nikolay

    2005-09-01

    Chemical structure provides an exhaustive description of a compound, but it is often proprietary and thus an impediment in the exchange of information. For example, structure disclosure is often needed for the selection of the most similar or dissimilar compounds. The authors propose a centroidal algorithm based on structural fragments (screens) that can be efficiently used for similarity and diversity selections without disclosing structures from the reference set. For increased security, the authors recommend that such a set contain at least some tens of structures. Analysis of reverse-engineering feasibility showed that the difficulty of the problem grows as the screen radius decreases. The algorithm is illustrated with concrete calculations on known steroidal, quinoline, and quinazoline drugs. We also investigate the problem of scaffold identification in a combinatorial library dataset. The results show that relatively small screens of radius equal to 2 bond lengths perform well in similarity sorting, while radius-4 screens yield better results in diversity sorting. The software implementation of the algorithm takes an SDF file with a reference set and generates screens of various radii, which are subsequently used for the similarity and diversity sorting of external SDFs. Since the reverse engineering of the reference set molecules from their screens has the same difficulty as the RSA asymmetric encryption algorithm, generated screens can be stored openly without further encryption. This approach ensures that an end user transfers only a set of structural fragments and no other data. Like other encryption algorithms, the centroidal algorithm cannot give a 100% guarantee of protecting a chemical structure from the dataset, but the probability of identifying an initial structure is very small, on the order of 10⁻⁴⁰ in typical cases.

  17. The comparative and cost-effectiveness of HPV-based cervical cancer screening algorithms in El Salvador.

    PubMed

    Campos, Nicole G; Maza, Mauricio; Alfaro, Karla; Gage, Julia C; Castle, Philip E; Felix, Juan C; Cremer, Miriam L; Kim, Jane J

    2015-08-15

    Cervical cancer is the leading cause of cancer death among women in El Salvador. Utilizing data from the Cervical Cancer Prevention in El Salvador (CAPE) demonstration project, we assessed the health and economic impact of HPV-based screening and two different algorithms for the management of women who test HPV-positive, relative to existing Pap-based screening. We calibrated a mathematical model of cervical cancer to epidemiologic data from El Salvador and compared three screening algorithms for women aged 30-65 years: (i) HPV screening every 5 years followed by referral to colposcopy for HPV-positive women (Colposcopy Management [CM]); (ii) HPV screening every 5 years followed by treatment with cryotherapy for eligible HPV-positive women (Screen and Treat [ST]); and (iii) Pap screening every 2 years followed by referral to colposcopy for Pap-positive women (Pap). Potential harms and complications associated with overtreatment were not assessed. Under base case assumptions of 65% screening coverage, HPV-based screening was more effective than Pap, reducing cancer risk by ∼ 60% (Pap: 50%). ST was the least costly strategy, and cost $2,040 per year of life saved. ST remained the most attractive strategy as visit compliance, costs, coverage, and test performance were varied. We conclude that a screen-and-treat algorithm within an HPV-based screening program is very cost-effective in El Salvador, with a cost-effectiveness ratio below per capita GDP. © 2015 UICC.

  18. Screening for Human Immunodeficiency Virus, Hepatitis B Virus, Hepatitis C Virus, and Treponema pallidum by Blood Testing Using a Bio-Flash Technology-Based Algorithm before Gastrointestinal Endoscopy

    PubMed Central

    Zhen, Chen; QuiuLi, Zhang; YuanQi, An; Casado, Verónica Vocero; Fan, Yuan

    2016-01-01

    Currently, conventional enzyme immunoassays which use manual gold immunoassays and colloidal tests (GICTs) are used as screening tools to detect Treponema pallidum (syphilis), hepatitis B virus (HBV), hepatitis C virus (HCV), human immunodeficiency virus type 1 (HIV-1), and HIV-2 in patients undergoing surgery. The present observational, cross-sectional study compared the sensitivity, specificity, and work flow characteristics of the conventional algorithm with manual GICTs with those of a newly proposed algorithm that uses the automated Bio-Flash technology as a screening tool in patients undergoing gastrointestinal (GI) endoscopy. A total of 956 patients were examined for the presence of serological markers of infection with HIV-1/2, HCV, HBV, and T. pallidum. The proposed algorithm with the Bio-Flash technology was superior for the detection of all markers (100.0% sensitivity and specificity for detection of anti-HIV and anti-HCV antibodies, HBV surface antigen [HBsAg], and T. pallidum) compared with the conventional algorithm based on the manual method (80.0% sensitivity and 98.6% specificity for the detection of anti-HIV, 75.0% sensitivity for the detection of anti-HCV, 94.7% sensitivity for the detection of HBsAg, and 100% specificity for the detection of anti-HCV and HBsAg) in these patients. The automated Bio-Flash technology-based screening algorithm also reduced the operation time by 85.0% (205 min) per day, saving up to 24 h/week. In conclusion, the use of the newly proposed screening algorithm based on the automated Bio-Flash technology can provide an advantage over the use of conventional algorithms based on manual methods for screening for HIV, HBV, HCV, and syphilis before GI endoscopy. PMID:27707942

  19. Screening for Human Immunodeficiency Virus, Hepatitis B Virus, Hepatitis C Virus, and Treponema pallidum by Blood Testing Using a Bio-Flash Technology-Based Algorithm before Gastrointestinal Endoscopy.

    PubMed

    Jun, Zhou; Zhen, Chen; QuiuLi, Zhang; YuanQi, An; Casado, Verónica Vocero; Fan, Yuan

    2016-12-01

    Currently, conventional enzyme immunoassays which use manual gold immunoassays and colloidal tests (GICTs) are used as screening tools to detect Treponema pallidum (syphilis), hepatitis B virus (HBV), hepatitis C virus (HCV), human immunodeficiency virus type 1 (HIV-1), and HIV-2 in patients undergoing surgery. The present observational, cross-sectional study compared the sensitivity, specificity, and work flow characteristics of the conventional algorithm with manual GICTs with those of a newly proposed algorithm that uses the automated Bio-Flash technology as a screening tool in patients undergoing gastrointestinal (GI) endoscopy. A total of 956 patients were examined for the presence of serological markers of infection with HIV-1/2, HCV, HBV, and T. pallidum. The proposed algorithm with the Bio-Flash technology was superior for the detection of all markers (100.0% sensitivity and specificity for detection of anti-HIV and anti-HCV antibodies, HBV surface antigen [HBsAg], and T. pallidum) compared with the conventional algorithm based on the manual method (80.0% sensitivity and 98.6% specificity for the detection of anti-HIV, 75.0% sensitivity for the detection of anti-HCV, 94.7% sensitivity for the detection of HBsAg, and 100% specificity for the detection of anti-HCV and HBsAg) in these patients. The automated Bio-Flash technology-based screening algorithm also reduced the operation time by 85.0% (205 min) per day, saving up to 24 h/week. In conclusion, the use of the newly proposed screening algorithm based on the automated Bio-Flash technology can provide an advantage over the use of conventional algorithms based on manual methods for screening for HIV, HBV, HCV, and syphilis before GI endoscopy. Copyright © 2016 Jun et al.

  20. High Pressure Coolant Injection system risk-based inspection guide for Hatch Nuclear Power Station

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DiBiasio, A.M.

    1993-05-01

    A review of the operating experience for the High Pressure Coolant Injection (HPCI) system at the Hatch Nuclear Power Station, Units 1 and 2, is described in this report. The information for this review was obtained from Hatch Licensee Event Reports (LERs) that were generated between 1980 and 1992. These LERs have been categorized into 23 failure modes that have been prioritized based on probabilistic risk assessment considerations. In addition, the results of the Hatch operating experience review have been compared with the results of a similar, industry-wide operating experience review. This comparison provides an indication of areas in the Hatch HPCI system that should be given increased attention in the prioritization of inspection resources.

  1. Costs per Diagnosis of Acute HIV Infection in Community-based Screening Strategies: A Comparative Analysis of Four Screening Algorithms.

    PubMed

    Hoenigl, Martin; Graff-Zivin, Joshua; Little, Susan J

    2016-02-15

    In nonhealthcare settings, widespread screening for acute human immunodeficiency virus (HIV) infection (AHI) is limited by cost and decision algorithms to better prioritize use of resources. Comparative cost analyses for available strategies are lacking. To determine the cost-effectiveness of community-based testing strategies, we evaluated annual costs of 3 algorithms that detect AHI based on HIV nucleic acid amplification testing (EarlyTest algorithm) or on HIV p24 antigen (Ag) detection via Architect (Architect algorithm) or Determine (Determine algorithm) as well as 1 algorithm that relies on HIV antibody testing alone (Antibody algorithm). The cost model used data on men who have sex with men (MSM) undergoing community-based AHI screening in San Diego, California. Incremental cost-effectiveness ratios (ICERs) per diagnosis of AHI were calculated for programs with HIV prevalence rates between 0.1% and 2.9%. Among MSM in San Diego, EarlyTest was cost-saving (ie, ICERs per AHI diagnosis less than $13,000) when compared with the 3 other algorithms. Cost analyses relative to regional HIV prevalence showed that EarlyTest was cost-effective (ie, ICERs less than $69,547) for similar populations of MSM with an HIV prevalence rate >0.4%; Architect was the second best alternative for HIV prevalence rates >0.6%. Identification of AHI by the dual EarlyTest screening algorithm is likely to be cost-effective not only among at-risk MSM in San Diego but also among similar populations of MSM with HIV prevalence rates >0.4%. © The Author 2015. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.
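
    As a hedged illustration of the cost comparison above, the sketch below computes an incremental cost-effectiveness ratio (ICER) per additional AHI diagnosis. All program costs and diagnosis counts are hypothetical placeholders, not values from the study.

```python
# Minimal sketch of an ICER-per-diagnosis calculation in the spirit of the
# comparison above. All costs and diagnosis counts are hypothetical.

def icer_per_diagnosis(cost_a, diag_a, cost_b, diag_b):
    """Extra dollars spent per extra AHI diagnosis when choosing A over B."""
    return (cost_a - cost_b) / (diag_a - diag_b)

# Hypothetical annual program costs and AHI diagnoses for two algorithms.
earlytest = {"cost": 280_000.0, "diagnoses": 30}
antibody = {"cost": 150_000.0, "diagnoses": 18}

icer = icer_per_diagnosis(earlytest["cost"], earlytest["diagnoses"],
                          antibody["cost"], antibody["diagnoses"])
print(f"${icer:,.0f} per additional AHI diagnosis")  # ~$10,833
```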

  2. Road screening and distribution route multi-objective robust optimization for hazardous materials based on neural network and genetic algorithm.

    PubMed

    Ma, Changxi; Hao, Wei; Pan, Fuquan; Xiang, Wang

    2018-01-01

    Route optimization of hazardous materials transportation is one of the basic steps in ensuring the safety of hazardous materials transportation. The optimization scheme may pose a security risk if road screening is not completed before the distribution route is optimized. For the road screening problem in hazardous materials transportation, a road screening algorithm is built on a genetic algorithm combined with a Levenberg-Marquardt neural network (GA-LM-NN), analyzing 15 attributes of each road network section. A multi-objective robust optimization model with adjustable robustness is constructed for the hazardous materials transportation problem of a single distribution center, minimizing transportation risk and time. A multi-objective genetic algorithm is designed to solve the problem according to the characteristics of the model. The algorithm uses an improved strategy for the selection operation, applies partial-matching crossover and single ortho swap methods for the crossover and mutation operations, and employs an exclusive method to construct Pareto optimal solutions. Studies show that the set of roads suitable for hazardous materials transportation can be found quickly with the proposed GA-LM-NN road screening algorithm, while distribution-route Pareto solutions with different levels of robustness can be found rapidly with the proposed multi-objective robust optimization model and algorithm.
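
    A hedged sketch of the Pareto-front step for the two objectives named above (risk and time, lower being better in both). The route data are invented, and the GA-LM-NN road screening and genetic operators are not reproduced here.

```python
# Hedged sketch of Pareto-front construction for two objectives (risk, time).
# Route data are invented; the GA-LM-NN screening stage is not reproduced.

def pareto_front(routes):
    """Keep routes that no other route beats in both risk and time."""
    front = []
    for r in routes:
        dominated = any(
            o["risk"] <= r["risk"] and o["time"] <= r["time"]
            and (o["risk"] < r["risk"] or o["time"] < r["time"])
            for o in routes
        )
        if not dominated:
            front.append(r)
    return front

routes = [
    {"name": "A", "risk": 0.12, "time": 95},
    {"name": "B", "risk": 0.20, "time": 70},
    {"name": "C", "risk": 0.15, "time": 90},
    {"name": "D", "risk": 0.25, "time": 80},  # dominated by B
]
print([r["name"] for r in pareto_front(routes)])  # ['A', 'B', 'C']
```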

  3. Atrial Fibrillation Screening in Nonmetropolitan Areas Using a Telehealth Surveillance System With an Embedded Cloud-Computing Algorithm: Prospective Pilot Study

    PubMed Central

    Chen, Ying-Hsien; Hung, Chi-Sheng; Huang, Ching-Chang; Hung, Yu-Chien

    2017-01-01

    Background Atrial fibrillation (AF) is a common form of arrhythmia that is associated with increased risk of stroke and mortality. Detecting AF before the first complication occurs is a recognized priority. No previous studies have examined the feasibility of undertaking AF screening using a telehealth surveillance system with an embedded cloud-computing algorithm; we address this issue in this study. Objective The objective of this study was to evaluate the feasibility of AF screening in nonmetropolitan areas using a telehealth surveillance system with an embedded cloud-computing algorithm. Methods We conducted a prospective AF screening study in a nonmetropolitan area using a single-lead electrocardiogram (ECG) recorder. All ECG measurements were reviewed on the telehealth surveillance system and interpreted by the cloud-computing algorithm and a cardiologist. The process of AF screening was evaluated with a satisfaction questionnaire. Results Between March 11, 2016 and August 31, 2016, 967 ECGs were recorded from 922 residents in nonmetropolitan areas. A total of 22 (2.4%, 22/922) residents with AF were identified by the physician’s ECG interpretation, and only 0.2% (2/967) of ECGs contained significant artifacts. The novel cloud-computing algorithm for AF detection had a sensitivity of 95.5% (95% CI 77.2%-99.9%) and specificity of 97.7% (95% CI 96.5%-98.5%). The overall satisfaction score for the process of AF screening was 92.1%. Conclusions AF screening in nonmetropolitan areas using a telehealth surveillance system with an embedded cloud-computing algorithm is feasible. PMID:28951384
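
    The reported sensitivity, specificity, and confidence intervals can be reproduced from the counts they imply (21 of 22 AF cases flagged; 879 of 900 non-AF residents correctly passed). These counts are inferred from the percentages, not stated outright in the abstract.

```python
# Reproducing the algorithm's test characteristics from inferred counts.
from statsmodels.stats.proportion import proportion_confint

tp, fn = 21, 1    # AF cases detected / missed by the cloud algorithm
tn, fp = 879, 21  # non-AF residents correctly passed / falsely flagged

sens = tp / (tp + fn)  # 0.955
spec = tn / (tn + fp)  # 0.977
# Clopper-Pearson ("beta") intervals match the reported CIs.
print(sens, proportion_confint(tp, tp + fn, method="beta"))  # ~(0.772, 0.999)
print(spec, proportion_confint(tn, tn + fp, method="beta"))  # ~(0.965, 0.986)
```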

  4. Cost-effectiveness of WHO-Recommended Algorithms for TB Case Finding at Ethiopian HIV Clinics.

    PubMed

    Adelman, Max W; McFarland, Deborah A; Tsegaye, Mulugeta; Aseffa, Abraham; Kempker, Russell R; Blumberg, Henry M

    2018-01-01

    The World Health Organization (WHO) recommends active tuberculosis (TB) case finding and a rapid molecular diagnostic test (Xpert MTB/RIF) to detect TB among people living with HIV (PLHIV) in high-burden settings. Information on the cost-effectiveness of these recommended strategies is crucial for their implementation. We conducted a model-based cost-effectiveness analysis comparing 2 algorithms for TB screening and diagnosis at Ethiopian HIV clinics: (1) WHO-recommended symptom screen combined with Xpert for PLHIV with a positive symptom screen and (2) current recommended practice algorithm (CRPA; based on symptom screening, smear microscopy, and clinical TB diagnosis). Our primary outcome was US$ per disability-adjusted life-year (DALY) averted. Secondary outcomes were additional true-positive diagnoses, and false-negative and false-positive diagnoses averted. Compared with CRPA, combining a WHO-recommended symptom screen with Xpert was highly cost-effective (incremental cost of $5 per DALY averted). Among a cohort of 15 000 PLHIV with a TB prevalence of 6% (900 TB cases), this algorithm detected 8 more true-positive cases than CRPA, and averted 2045 false-positive and 8 false-negative diagnoses compared with CRPA. The WHO-recommended algorithm was marginally costlier ($240 000) than CRPA ($239 000). In sensitivity analysis, the symptom screen/Xpert algorithm was dominated at low Xpert sensitivity (66%). In this model-based analysis, combining a WHO-recommended symptom screen with Xpert for TB diagnosis among PLHIV was highly cost-effective ($5 per DALY averted) and more sensitive than CRPA in a high-burden, resource-limited setting.
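
    A quick consistency check on the headline figure: the reported cost difference and the $5-per-DALY ICER together imply roughly 200 DALYs averted. That DALY total is inferred here, not reported in the abstract.

```python
# Back-of-the-envelope check of the $5/DALY figure.
cost_who, cost_crpa = 240_000, 239_000  # USD, from the abstract
icer = 5                                # USD per DALY averted, as reported
print((cost_who - cost_crpa) / icer)    # 200.0 DALYs averted (inferred)
```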

  5. The cost-effectiveness of 10 antenatal syphilis screening and treatment approaches in Peru, Tanzania, and Zambia.

    PubMed

    Terris-Prestholt, Fern; Vickerman, Peter; Torres-Rueda, Sergio; Santesso, Nancy; Sweeney, Sedona; Mallma, Patricia; Shelley, Katharine D; Garcia, Patricia J; Bronzan, Rachel; Gill, Michelle M; Broutet, Nathalie; Wi, Teodora; Watts, Charlotte; Mabey, David; Peeling, Rosanna W; Newman, Lori

    2015-06-01

    Rapid plasma reagin (RPR) is frequently used to test women for maternal syphilis. Rapid syphilis immunochromatographic strip tests detecting only Treponema pallidum antibodies (single RSTs) or both treponemal and non-treponemal antibodies (dual RSTs) are now available. This study assessed the cost-effectiveness of algorithms using these tests to screen pregnant women. Observed costs of maternal syphilis screening and treatment using clinic-based RPR and single RSTs in 20 clinics across Peru, Tanzania, and Zambia were used to model the cost-effectiveness of algorithms using combinations of RPR, single, and dual RSTs, and of no treatment and mass treatment. Sensitivity analyses determined drivers of key results. Although this analysis found screening using RPR to be relatively cheap, most (>70%) true cases went untreated. Algorithms using single RSTs were the most cost-effective in all observed settings, followed by dual RSTs, which became the most cost-effective if dual RST costs were halved. Single test algorithms dominated most sequential testing algorithms, although sequential algorithms reduced overtreatment. Mass treatment was relatively cheap and effective in the absence of screening supplies, though it treated many uninfected women. This analysis highlights the advantages of introducing RSTs in three diverse settings. The results should be applicable to other similar settings. Copyright © 2015 International Federation of Gynecology and Obstetrics. All rights reserved.

  6. The cost-effectiveness of 10 antenatal syphilis screening and treatment approaches in Peru, Tanzania, and Zambia

    PubMed Central

    Terris-Prestholt, Fern; Vickerman, Peter; Torres-Rueda, Sergio; Santesso, Nancy; Sweeney, Sedona; Mallma, Patricia; Shelley, Katharine D.; Garcia, Patricia J.; Bronzan, Rachel; Gill, Michelle M.; Broutet, Nathalie; Wi, Teodora; Watts, Charlotte; Mabey, David; Peeling, Rosanna W.; Newman, Lori

    2015-01-01

    Objective Rapid plasma reagin (RPR) is frequently used to test women for maternal syphilis. Rapid syphilis immunochromatographic strip tests detecting only Treponema pallidum antibodies (single RSTs) or both treponemal and non-treponemal antibodies (dual RSTs) are now available. This study assessed the cost-effectiveness of algorithms using these tests to screen pregnant women. Methods Observed costs of maternal syphilis screening and treatment using clinic-based RPR and single RSTs in 20 clinics across Peru, Tanzania, and Zambia were used to model the cost-effectiveness of algorithms using combinations of RPR, single, and dual RSTs, and of no treatment and mass treatment. Sensitivity analyses determined drivers of key results. Results Although this analysis found screening using RPR to be relatively cheap, most (>70%) true cases went untreated. Algorithms using single RSTs were the most cost-effective in all observed settings, followed by dual RSTs, which became the most cost-effective if dual RST costs were halved. Single test algorithms dominated most sequential testing algorithms, although sequential algorithms reduced overtreatment. Mass treatment was relatively cheap and effective in the absence of screening supplies, though it treated many uninfected women. Conclusion This analysis highlights the advantages of introducing RSTs in three diverse settings. The results should be applicable to other similar settings. PMID:25963907

  7. Experimental study of contact edge roughness on sub-100 nm various circular shapes

    NASA Astrophysics Data System (ADS)

    Lee, Tae Y.; Ihm, Dongchul; Kang, Hyo C.; Lee, Jum B.; Lee, Byoung H.; Chin, Soo B.; Cho, Do H.; Song, Chang L.

    2005-05-01

    The measurement of edge roughness has become a hot issue in the semiconductor industry. Contact roughness in particular is becoming more critical as design rules shrink. Major vendors offer a variety of features to measure edge roughness in their CD-SEMs. For line-and-space patterns, features such as Line Edge Roughness (LER) and Line Width Roughness (LWR) are available in current CD-SEMs. However, the features currently available in commercial CD-SEMs cannot provide a proper solution for monitoring contact roughness. In a previous paper we introduced a new parameter R, a measurement algorithm, and a definition of contact edge roughness to quantify CER and CSR. The parameter R can provide an alternative solution for monitoring contact or island pattern roughness. In this paper, we investigated the optimum number of CD measurements (1-D) and the fitting method for CER or CSR. The study was based on a circular contact shape. Some new ideas to quantify CER or CSR are also suggested, with preliminary experimental results.
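
    A hedged sketch of one way to quantify roughness for a nominally circular contact: fit a circle to the extracted edge points, then take the spread of radial residuals. The paper's parameter R is defined in the authors' earlier work; the standard deviation of residuals used here is an illustrative stand-in, not their exact definition.

```python
# Circle fit plus radial-residual roughness for a synthetic rough contact.
import numpy as np

def fit_circle(x, y):
    """Algebraic (Kasa) least-squares circle fit; returns (xc, yc, r)."""
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b = x**2 + y**2
    (xc, yc, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    return xc, yc, np.sqrt(c + xc**2 + yc**2)

rng = np.random.default_rng(0)
theta = np.linspace(0, 2 * np.pi, 360, endpoint=False)
r_true = 45.0                                    # nm, hypothetical contact
radii = r_true + rng.normal(0, 1.5, theta.size)  # rough edge
x, y = radii * np.cos(theta), radii * np.sin(theta)

xc, yc, r = fit_circle(x, y)
residuals = np.hypot(x - xc, y - yc) - r
print(f"fitted radius {r:.1f} nm, "
      f"3-sigma edge roughness {3 * residuals.std():.2f} nm")
```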

  8. Receptor-targeted liposome-peptide-siRNA nanoparticles represent an efficient delivery system for MRTF silencing in conjunctival fibrosis

    NASA Astrophysics Data System (ADS)

    Yu-Wai-Man, Cynthia; Tagalakis, Aristides D.; Manunta, Maria D.; Hart, Stephen L.; Khaw, Peng T.

    2016-02-01

    There is increasing evidence that the Myocardin-related transcription factor/Serum response factor (MRTF/SRF) pathway plays a key role in fibroblast activation and that knocking down MRTF can lead to reduced scarring and fibrosis. Here, we have developed a receptor-targeted liposome-peptide-siRNA nanoparticle as a non-viral delivery system for MRTF-B siRNA in conjunctival fibrosis. Using 50 nM siRNA, the MRTF-B gene was efficiently silenced by 76% and 72% with LYR and LER nanoparticles, respectively. The silencing efficiency was low when non-targeting peptides or siRNA alone or liposome-siRNA alone were used. LYR and LER nanoparticles also showed higher silencing efficiency than PEGylated LYR-P and LER-P nanoparticles. The nanoparticles were not cytotoxic using different liposomes, targeting peptides, and 50 nM siRNA. Three-dimensional fibroblast-populated collagen matrices were also used as a functional assay to measure contraction in vitro, and showed that MRTF-B LYR nanoparticles completely blocked matrix contraction after a single transfection treatment. In conclusion, this is the first study to develop and show that receptor-targeted liposome-peptide-siRNA nanoparticles represent an efficient and safe non-viral siRNA delivery system that could be used to prevent fibrosis after glaucoma filtration surgery and other contractile scarring conditions in the eye.

  9. Mu-driven transposition of recombinant mini-Mu unit DNA in the Corynebacterium glutamicum chromosome.

    PubMed

    Gorshkova, Natalya V; Lobanova, Juliya S; Tokmakova, Irina L; Smirnov, Sergey V; Akhverdyan, Valerii Z; Krylov, Alexander A; Mashko, Sergey V

    2018-03-01

    A dual-component Mu-transposition system was modified for the integration/amplification of genes in Corynebacterium. The system consists of two types of plasmids: (i) a non-replicative integrative plasmid that contains the transposing mini-Mu(LR) unit bracketed by the L/R Mu ends or the mini-Mu(LER) unit, which additionally contains the enhancer element, E, and (ii) an integration helper plasmid that expresses the transposition factor genes for MuA and MuB. Efficient transposition in the C. glutamicum chromosome (≈ 2 × 10^-4 per cell) occurred mainly through the replicative pathway via cointegrate formation followed by possible resolution. Optimizing the E location in the mini-Mu unit significantly increased the efficiency of Mu-driven intramolecular transposition-amplification in C. glutamicum as well as in gram-negative bacteria. The new C. glutamicum genome modification strategy that was developed allows the sequential, independent integration/amplification/fixation of target genes at high copy numbers. After integration/amplification of the first mini-Mu(LER) unit in the C. glutamicum chromosome, the E-element, which is bracketed by lox-like sites, is excised in a Cre-mediated fashion, thereby fixing the truncated mini-Mu(LR) unit in its position for the subsequent integration/amplification of new mini-Mu(LER) units. This strategy was demonstrated using the genes for the citrine and green fluorescent proteins, yECitrine and yEGFP, respectively.

  10. Rice leaf growth and water potential are resilient to evaporative demand and soil water deficit once the effects of root system are neutralized.

    PubMed

    Parent, Boris; Suard, Benoît; Serraj, Rachid; Tardieu, François

    2010-08-01

    Rice is known to be sensitive to soil water deficit and evaporative demand, with the greatest sensitivity in lowland-adapted genotypes. We have analysed the responses of plant water relations and of leaf elongation rate (LER) to soil water status and evaporative demand in seven rice genotypes belonging to different species and subspecies, either upland- or lowland-adapted. In the considered range of soil water potential (0 to -0.6 MPa), stomatal conductance was controlled in such a way that the daytime leaf water potential was similar in well-watered, droughted or flooded conditions (isohydric behaviour). A low sensitivity of LER to evaporative demand was observed in the same three conditions, with small differences between genotypes and lower sensitivity than in maize. The sensitivity of LER to soil water deficit was similar to that of maize. A tendency towards lower sensitivities was observed in upland than in lowland genotypes, but with smaller differences than expected. We conclude that leaf water status and leaf elongation of rice are not particularly sensitive to water deficit. The main origin of drought sensitivity in rice may be its poor root system, whose effect was alleviated in the study presented here by growing plants in pots whose soil was entirely colonized by the roots of all genotypes.

  11. Comparison of cytology, HPV DNA testing and HPV 16/18 genotyping alone or combined targeting to the more balanced methodology for cervical cancer screening.

    PubMed

    Chatzistamatiou, Kimon; Moysiadis, Theodoros; Moschaki, Viktoria; Panteleris, Nikolaos; Agorastos, Theodoros

    2016-07-01

    The objective of the present study was to identify the most effective cervical cancer screening algorithm incorporating different combinations of cytology, HPV testing and genotyping. Women 25-55 years old recruited for the "HERMES" (HEllenic Real life Multicentric cErvical Screening) study were screened in terms of cytology and high-risk (hr) HPV testing with HPV 16/18 genotyping. Women positive for cytology and/or hrHPV were referred for colposcopy, biopsy and treatment. Ten screening algorithms based on different combinations of cytology, HPV testing and HPV 16/18 genotyping were investigated in terms of diagnostic accuracy. Three clusters of algorithms were formed according to the balance between effectiveness and harm caused by screening. The cluster showing the best balance included two algorithms based on co-testing and two based on HPV primary screening with HPV 16/18 genotyping. Among these, hrHPV testing with HPV 16/18 genotyping and reflex cytology (atypical squamous cells of undetermined significance - ASCUS threshold) presented the optimal combination of sensitivity (82.9%) and relative specificity (0.99 vs. cytology alone), with a false-positive rate 1.26 times that of cytology alone. HPV testing with HPV 16/18 genotyping, referring HPV 16/18-positive women directly to colposcopy, and hrHPV (non-16/18)-positive women to reflex cytology (ASCUS threshold) as a triage method to colposcopy, reflects the best equilibrium between screening effectiveness and harm. Algorithms based on cytology as the initial screening method, on co-testing or HPV primary screening without genotyping, and on HPV primary screening with genotyping but without cytology triage, are not supported according to the present analysis. Copyright © 2016 Elsevier Inc. All rights reserved.
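
    An illustrative sketch of the triage logic the abstract favors: hrHPV primary screening with 16/18 genotyping, with reflex cytology (ASCUS threshold) for the other hrHPV types. The function and its labels are hypothetical simplifications, not the study's protocol code.

```python
# Simplified triage logic for hrHPV primary screening with genotyping.

def triage(hr_hpv_positive, hpv16_18, cytology_ascus_or_worse=None):
    if not hr_hpv_positive:
        return "routine screening"
    if hpv16_18:
        return "refer to colposcopy"        # HPV 16/18: direct referral
    if cytology_ascus_or_worse:              # other hrHPV: reflex cytology
        return "refer to colposcopy"
    return "repeat testing / routine follow-up"

print(triage(True, True))          # refer to colposcopy
print(triage(True, False, False))  # repeat testing / routine follow-up
```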

  12. Atrial Fibrillation Screening in Nonmetropolitan Areas Using a Telehealth Surveillance System With an Embedded Cloud-Computing Algorithm: Prospective Pilot Study.

    PubMed

    Chen, Ying-Hsien; Hung, Chi-Sheng; Huang, Ching-Chang; Hung, Yu-Chien; Hwang, Juey-Jen; Ho, Yi-Lwun

    2017-09-26

    Atrial fibrillation (AF) is a common form of arrhythmia that is associated with increased risk of stroke and mortality. Detecting AF before the first complication occurs is a recognized priority. No previous studies have examined the feasibility of undertaking AF screening using a telehealth surveillance system with an embedded cloud-computing algorithm; we address this issue in this study. The objective of this study was to evaluate the feasibility of AF screening in nonmetropolitan areas using a telehealth surveillance system with an embedded cloud-computing algorithm. We conducted a prospective AF screening study in a nonmetropolitan area using a single-lead electrocardiogram (ECG) recorder. All ECG measurements were reviewed on the telehealth surveillance system and interpreted by the cloud-computing algorithm and a cardiologist. The process of AF screening was evaluated with a satisfaction questionnaire. Between March 11, 2016 and August 31, 2016, 967 ECGs were recorded from 922 residents in nonmetropolitan areas. A total of 22 (2.4%, 22/922) residents with AF were identified by the physician's ECG interpretation, and only 0.2% (2/967) of ECGs contained significant artifacts. The novel cloud-computing algorithm for AF detection had a sensitivity of 95.5% (95% CI 77.2%-99.9%) and specificity of 97.7% (95% CI 96.5%-98.5%). The overall satisfaction score for the process of AF screening was 92.1%. AF screening in nonmetropolitan areas using a telehealth surveillance system with an embedded cloud-computing algorithm is feasible. ©Ying-Hsien Chen, Chi-Sheng Hung, Ching-Chang Huang, Yu-Chien Hung, Juey-Jen Hwang, Yi-Lwun Ho. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 26.09.2017.

  13. The effects of gray scale image processing on digital mammography interpretation performance.

    PubMed

    Cole, Elodia B; Pisano, Etta D; Zeng, Donglin; Muller, Keith; Aylward, Stephen R; Park, Sungwook; Kuzmiak, Cherie; Koomen, Marcia; Pavic, Dag; Walsh, Ruth; Baker, Jay; Gimenez, Edgardo I; Freimanis, Rita

    2005-05-01

    To determine the effects of three image-processing algorithms on the diagnostic accuracy of digital mammography in comparison with conventional screen-film mammography. A total of 201 cases consisting of nonprocessed soft-copy versions of digital mammograms acquired from GE, Fischer, and Trex digital mammography systems (1997-1999) and conventional screen-film mammograms of the same patients were interpreted by nine radiologists. The raw digital data were processed with each of three different image-processing algorithms, creating three presentations: the manufacturer's default (applied and laser printed to film by each of the manufacturers), and MUSICA and PLAHE, which were presented in soft-copy display. There were three radiologists per presentation. The area under the receiver operating characteristic curve for GE digital mass cases was worse than screen-film for all digital presentations. The area under the receiver operating characteristic curve for Trex digital mass cases was better, but only with images processed with the manufacturer's default algorithm. Sensitivity for GE digital mass cases was worse than screen-film for all digital presentations. Specificity for Fischer digital calcification cases was worse than screen-film for images processed with the default and PLAHE algorithms. Specificity for Trex digital calcification cases was worse than screen-film for images processed with MUSICA. Specific image-processing algorithms may be necessary for optimal presentation for interpretation based on machine and lesion type.

  14. A permutation-based non-parametric analysis of CRISPR screen data.

    PubMed

    Jia, Gaoxiang; Wang, Xinlei; Xiao, Guanghua

    2017-07-19

    Clustered regularly-interspaced short palindromic repeats (CRISPR) screens are usually implemented in cultured cells to identify genes with critical functions. Although several methods have been developed or adapted to analyze CRISPR screening data, no single specific algorithm has gained popularity. Thus, rigorous procedures are needed to overcome the shortcomings of existing algorithms. We developed a Permutation-Based Non-Parametric Analysis (PBNPA) algorithm, which computes p-values at the gene level by permuting sgRNA labels, and thus it avoids restrictive distributional assumptions. Although PBNPA is designed to analyze CRISPR data, it can also be applied to analyze genetic screens implemented with siRNAs or shRNAs and drug screens. We compared the performance of PBNPA with competing methods on simulated data as well as on real data. PBNPA outperformed recent methods designed for CRISPR screen analysis, as well as methods used for analyzing other functional genomics screens, in terms of Receiver Operating Characteristics (ROC) curves and False Discovery Rate (FDR) control for simulated data under various settings. Remarkably, the PBNPA algorithm showed better consistency and FDR control on published real data as well. PBNPA yields more consistent and reliable results than its competitors, especially when the data quality is low. R package of PBNPA is available at: https://cran.r-project.org/web/packages/PBNPA/ .
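
    A minimal sketch of the core idea, gene-level permutation p-values from sgRNA-label permutation. The scoring statistic (mean sgRNA log fold change) is a simplification; the published PBNPA procedure differs in detail and is available in the R package cited above.

```python
# Gene-level permutation p-value in the PBNPA spirit (simplified statistic).
import numpy as np

def gene_pvalue(fold_changes, gene_idx, n_perm=10_000, seed=0):
    """fold_changes: per-sgRNA log fold changes (all sgRNAs in the screen);
    gene_idx: indices of the sgRNAs targeting the gene of interest."""
    rng = np.random.default_rng(seed)
    observed = fold_changes[gene_idx].mean()
    k = len(gene_idx)
    null = np.array([
        rng.choice(fold_changes, size=k, replace=False).mean()
        for _ in range(n_perm)
    ])
    # Two-sided p-value with the +1 correction for permutation tests.
    return (np.sum(np.abs(null) >= abs(observed)) + 1) / (n_perm + 1)

lfc = np.random.default_rng(1).normal(0, 1, 1000)
lfc[:4] -= 2.5  # four sgRNAs for a hypothetical essential gene
print(gene_pvalue(lfc, np.arange(4)))  # small p-value for the spiked gene
```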

  15. Variable screening via quantile partial correlation

    PubMed Central

    Ma, Shujie; Tsai, Chih-Ling

    2016-01-01

    In quantile linear regression with ultra-high dimensional data, we propose an algorithm for screening all candidate variables and subsequently selecting relevant predictors. Specifically, we first employ quantile partial correlation for screening, and then we apply the extended Bayesian information criterion (EBIC) for best subset selection. Our proposed method can successfully select predictors when the variables are highly correlated, and it can also identify variables that make a contribution to the conditional quantiles but are marginally uncorrelated or weakly correlated with the response. Theoretical results show that the proposed algorithm can yield the sure screening set. By controlling the false selection rate, model selection consistency can be achieved theoretically. In practice, we proposed using EBIC for best subset selection so that the resulting model is screening consistent. Simulation studies demonstrate that the proposed algorithm performs well, and an empirical example is presented. PMID:28943683

  16. The performance of the SEPT9 gene methylation assay and a comparison with other CRC screening tests: A meta-analysis.

    PubMed

    Song, Lele; Jia, Jia; Peng, Xiumei; Xiao, Wenhua; Li, Yuemin

    2017-06-08

    The SEPT9 gene methylation assay is the first FDA-approved blood assay for colorectal cancer (CRC) screening. The fecal immunochemical test (FIT), FIT-DNA test and CEA assay are also in vitro diagnostic (IVD) tests used in CRC screening. This meta-analysis aims to review the SEPT9 assay's performance and compare it with other IVD CRC screening tests. By searching the Ovid MEDLINE, EMBASE, CBMdisc and CJFD databases, 25 out of 180 studies were identified that report the SEPT9 assay's performance. 2613 CRC cases and 6030 controls were included, and sensitivity and specificity were used to evaluate performance under various algorithms. The 1/3 algorithm exhibited the best sensitivity, while the 2/3 and 1/1 algorithms exhibited the best balance between sensitivity and specificity. The performance of the blood SEPT9 assay is superior to that of the serum protein markers and the FIT test in symptomatic populations, while it appears less potent than the FIT and FIT-DNA tests in asymptomatic populations. In conclusion, the 1/3 algorithm is recommended for CRC screening, and the 2/3 or 1/1 algorithms are suitable for early detection for diagnostic purposes. The SEPT9 assay exhibited better performance in symptomatic than in asymptomatic populations.
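
    A sketch of the k-of-n call logic behind the 1/3, 2/3, and 1/1 algorithms discussed above: a sample is positive if at least k of n PCR replicates detect methylated SEPT9. The replicate results below are hypothetical.

```python
# k-of-n positivity rule underlying the SEPT9 calling algorithms.

def septin9_call(replicates, k):
    """replicates: booleans (replicate PCR positive); k: positives required."""
    return sum(replicates) >= k

pcr = [True, False, False]
print(septin9_call(pcr, 1))  # 1/3 algorithm: positive (best sensitivity)
print(septin9_call(pcr, 2))  # 2/3 algorithm: negative (better specificity)
```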

  17. Pilot study analyzing automated ECG screening of hypertrophic cardiomyopathy.

    PubMed

    Campbell, Matthew J; Zhou, Xuefu; Han, Chia; Abrishami, Hedayat; Webster, Gregory; Miyake, Christina Y; Sower, Christopher T; Anderson, Jeffrey B; Knilans, Timothy K; Czosek, Richard J

    2017-06-01

    Hypertrophic cardiomyopathy (HCM) is one of the leading causes of sudden cardiac death in athletes. However, preparticipation ECG screening has often been criticized for failing to meet cost-effectiveness thresholds, in part because of high false-positive rates and the cost of ECG screening itself. The purpose of this study was to assess the testing characteristics of an automated ECG algorithm designed to screen for HCM in a multi-institutional pediatric cohort. ECGs from patients with HCM aged 12 to 20 years from 3 pediatric institutions were screened for ECG criteria for HCM using a previously described automated computer algorithm developed specifically for HCM ECG screening. The results were compared to a known healthy pediatric cohort. The studies then were read by trained electrophysiologists using standard ECG criteria and compared to the results of automated screening. One hundred twenty-eight ECGs from unique patients with phenotypic HCM were obtained and compared with 256 studies from healthy control patients matched in 2:1 fashion. When presented with the ECGs, the non-voltage-based algorithm resulted in 81.2% sensitivity and 90.7% specificity. A trained electrophysiologist read the same data according to the Seattle Criteria, with 71% sensitivity and 95.7% specificity. The sensitivity of screening as well as the components of the ECG screening itself varied by institution. This pilot study demonstrates a potential for automated ECG screening algorithms to detect HCM with testing characteristics similar to those of a trained electrophysiologist. In addition, there appear to be differences in ECG characteristics between patient populations, which may account for the difficulties in universal screening. Copyright © 2017 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.

  18. Orbiting Carbon Observatory-2 (OCO-2) cloud screening algorithms: validation against collocated MODIS and CALIOP data

    NASA Astrophysics Data System (ADS)

    Taylor, Thomas E.; O'Dell, Christopher W.; Frankenberg, Christian; Partain, Philip T.; Cronk, Heather Q.; Savtchenko, Andrey; Nelson, Robert R.; Rosenthal, Emily J.; Chang, Albert Y.; Fisher, Brenden; Osterman, Gregory B.; Pollock, Randy H.; Crisp, David; Eldering, Annmarie; Gunson, Michael R.

    2016-03-01

    The objective of the National Aeronautics and Space Administration's (NASA) Orbiting Carbon Observatory-2 (OCO-2) mission is to retrieve the column-averaged carbon dioxide (CO2) dry air mole fraction (XCO2) from satellite measurements of reflected sunlight in the near-infrared. These estimates can be biased by clouds and aerosols, i.e., contamination, within the instrument's field of view. Screening of the most contaminated soundings minimizes unnecessary calls to the computationally expensive Level 2 (L2) XCO2 retrieval algorithm. Hence, robust cloud screening methods have been an important focus of the OCO-2 algorithm development team. Two distinct, computationally inexpensive cloud screening algorithms have been developed for this application. The A-Band Preprocessor (ABP) retrieves the surface pressure using measurements in the 0.76 µm O2 A band, neglecting scattering by clouds and aerosols, which introduce photon path-length differences that can cause large deviations between the expected and retrieved surface pressure. The Iterative Maximum A Posteriori (IMAP) Differential Optical Absorption Spectroscopy (DOAS) Preprocessor (IDP) retrieves independent estimates of the CO2 and H2O column abundances using observations taken at 1.61 µm (weak CO2 band) and 2.06 µm (strong CO2 band), while neglecting atmospheric scattering. The CO2 and H2O column abundances retrieved in these two spectral regions differ significantly in the presence of cloud and scattering aerosols. The combination of these two algorithms, which are sensitive to different features in the spectra, provides the basis for cloud screening of the OCO-2 data set. To validate the OCO-2 cloud screening approach, collocated measurements from NASA's Moderate Resolution Imaging Spectrometer (MODIS), aboard the Aqua platform, were compared to results from the two OCO-2 cloud screening algorithms. With tuning of algorithmic threshold parameters that allows for processing of ≃ 20-25 % of all OCO-2 soundings, agreement between the OCO-2 and MODIS cloud screening methods is found to be ≃ 85 % over four 16-day orbit repeat cycles in both the winter (December) and spring (April-May) for OCO-2 nadir-land, glint-land and glint-water observations. No major, systematic, spatial or temporal dependencies were found, although slight differences in the seasonal data sets do exist and validation is more problematic with increasing solar zenith angle and when surfaces are covered in snow and ice and have complex topography. To further analyze the performance of the cloud screening algorithms, an initial comparison of OCO-2 observations was made to collocated measurements from the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) aboard the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO). These comparisons highlight the strength of the OCO-2 cloud screening algorithms in identifying high, thin clouds but suggest some difficulty in identifying some clouds near the surface, even when the optical thicknesses are greater than 1.
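
    A hedged sketch of the two preprocessor ideas described above: an ABP-style check flags a sounding when the clear-sky retrieved surface pressure deviates from the expected value, and an IDP-style check flags a sounding when CO2 columns retrieved in the weak and strong bands disagree. The thresholds below are illustrative assumptions, not the mission's tuned values.

```python
# Illustrative versions of the ABP- and IDP-style cloud screening tests.

def abp_flag(p_retrieved_hpa, p_expected_hpa, dp_max=25.0):
    """Flag when the clear-sky surface-pressure retrieval deviates too far."""
    return abs(p_retrieved_hpa - p_expected_hpa) > dp_max

def idp_flag(xco2_weak_ppm, xco2_strong_ppm, ratio_tol=0.04):
    """Flag when weak- and strong-band CO2 columns disagree."""
    return abs(xco2_weak_ppm / xco2_strong_ppm - 1.0) > ratio_tol

def sounding_is_cloudy(p_ret, p_exp, xco2_w, xco2_s):
    # Either test firing screens the sounding out before the L2 retrieval.
    return abp_flag(p_ret, p_exp) or idp_flag(xco2_w, xco2_s)

print(sounding_is_cloudy(955.0, 1013.0, 402.0, 401.0))   # True: pressure deficit
print(sounding_is_cloudy(1011.0, 1013.0, 400.5, 401.0))  # False: passes both
```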

  19. Inhomogeneity of PAGs in resist film studied by molecular-dynamics simulations for EUV lithography

    NASA Astrophysics Data System (ADS)

    Toriumi, Minoru; Itani, Toshiro

    2014-03-01

    EUV resist materials are requested simultaneously to improve the resolution, line-edge roughness (LER), and sensitivity (RLS). In a resist film inhomogeneous structures in nanometer region may have large effects on directly the resolution and LER and indirectly on sensitivity. Inhomogeneity of PAGs in a hybrid resist for EUV lithography was investigated using molecular dynamics simulations. The hybrid resist film showed the inhomogeneous positions and motions of PAG cations and anions. Free volumes in resist matrix influence the motions of PAGs. Molecular structure such as bulky phenyl groups of a PAG cation localize the positions and reduce the motion of a cation. Chemical properties such as ionic interactions and lone-pair interaction also play an important role to determine the inhomogeneity of PAGs. Fluorine interaction enables active motions of PAG anions.

  20. Self-aligned blocking integration demonstration for critical sub-30nm pitch Mx level patterning with EUV self-aligned double patterning

    NASA Astrophysics Data System (ADS)

    Raley, Angélique; Lee, Joe; Smith, Jeffrey T.; Sun, Xinghua; Farrell, Richard A.; Shearer, Jeffrey; Xu, Yongan; Ko, Akiteru; Metz, Andrew W.; Biolsi, Peter; Devilliers, Anton; Arnold, John; Felix, Nelson

    2018-04-01

    We report a sub-30nm pitch self-aligned double patterning (SADP) integration scheme with EUV lithography coupled with self-aligned block technology (SAB) targeting the back end of line (BEOL) metal line patterning applications for logic nodes beyond 5nm. The integration demonstration is a validation of the scalability of a previously reported flow, which used 193nm immersion SADP targeting a 40nm pitch with the same material sets (Si3N4 mandrel, SiO2 spacer, Spin on carbon, spin on glass). The multi-color integration approach is successfully demonstrated and provides a valuable method to address overlay concerns and more generally edge placement error (EPE) as a whole for advanced process nodes. Unbiased LER/LWR analysis comparison between EUV SADP and 193nm immersion SADP shows that both integrations follow the same trend throughout the process steps. While EUV SADP shows increased LER after mandrel pull, metal hardmask open and dielectric etch compared to 193nm immersion SADP, the final process performance is matched in terms of LWR (1.08nm 3 sigma unbiased) and is only 6% higher than 193nm immersion SADP for average unbiased LER. Using EUV SADP enables almost doubling the line density while keeping most of the remaining processes and films unchanged, and provides a compelling alternative to other multipatterning integrations, which present their own sets of challenges.
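
    The unbiased LER/LWR values cited above rely on removing the SEM noise contribution. A common form of that correction, sketched below with hypothetical numbers, subtracts the noise floor in quadrature from the measured variance; this is the standard approach, not necessarily the exact procedure used in the paper.

```python
# Quadrature noise subtraction for "unbiased" roughness (hypothetical data).
import math

def unbiased_3sigma(measured_3sigma_nm, noise_3sigma_nm):
    var = (measured_3sigma_nm / 3) ** 2 - (noise_3sigma_nm / 3) ** 2
    return 3 * math.sqrt(max(var, 0.0))

print(unbiased_3sigma(1.35, 0.80))  # ~1.09 nm unbiased roughness, 3 sigma
```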

  1. Ascertainment and verification of end-stage renal disease and end-stage liver disease in the north american AIDS cohort collaboration on research and design.

    PubMed

    Kitahata, Mari M; Drozd, Daniel R; Crane, Heidi M; Van Rompaey, Stephen E; Althoff, Keri N; Gange, Stephen J; Klein, Marina B; Lucas, Gregory M; Abraham, Alison G; Lo Re, Vincent; McReynolds, Justin; Lober, William B; Mendes, Adell; Modur, Sharada P; Jing, Yuezhou; Morton, Elizabeth J; Griffith, Margaret A; Freeman, Aimee M; Moore, Richard D

    2015-01-01

    The burden of HIV disease has shifted from traditional AIDS-defining illnesses to serious non-AIDS-defining comorbid conditions. Research aimed at improving HIV-related comorbid disease outcomes requires well-defined, verified clinical endpoints. We developed methods to ascertain and verify end-stage renal disease (ESRD) and end-stage liver disease (ESLD) and validated screening algorithms within the largest HIV cohort collaboration in North America (NA-ACCORD). Individuals who screened positive among all participants in twelve cohorts enrolled between January 1996 and December 2009 underwent medical record review to verify incident ESRD or ESLD using standardized protocols. We randomly sampled 6% of contributing cohorts to determine the sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) of ESLD and ESRD screening algorithms in a validation subcohort. Among 43,433 patients screened for ESRD, 822 screened positive of which 620 met clinical criteria for ESRD. The algorithm had 100% sensitivity, 99% specificity, 82% PPV, and 100% NPV for ESRD. Among 41,463 patients screened for ESLD, 2,024 screened positive of which 645 met diagnostic criteria for ESLD. The algorithm had 100% sensitivity, 95% specificity, 27% PPV, and 100% NPV for ESLD. Our methods proved robust for ascertainment of ESRD and ESLD in persons infected with HIV.

  2. Cloud Screening and Quality Control Algorithm for Star Photometer Data: Assessment with Lidar Measurements and with All-sky Images

    NASA Technical Reports Server (NTRS)

    Ramirez, Daniel Perez; Lyamani, H.; Olmo, F. J.; Whiteman, D. N.; Navas-Guzman, F.; Alados-Arboledas, L.

    2012-01-01

    This paper presents the development and setup of a cloud screening and data quality control algorithm for a star photometer based on a CCD camera as detector. These algorithms are necessary for passive remote sensing techniques to retrieve the columnar aerosol optical depth, δAe(λ), and precipitable water vapor content, W, at nighttime. The cloud screening procedure consists of calculating moving averages of δAe(λ) and W under different time windows, combined with a procedure for detecting outliers. Additionally, to avoid undesirable δAe(λ) and W fluctuations caused by atmospheric turbulence, the data are averaged over 30 min. The algorithm is applied to the star photometer deployed in the city of Granada (37.16 N, 3.60 W, 680 m a.s.l.; southeastern Spain) for the measurements acquired between March 2007 and September 2009. The algorithm is evaluated with correlative measurements registered by a lidar system and also with all-sky images obtained at the sunset and sunrise of the previous and following days. Promising results are obtained in detecting cloud-affected data. Additionally, the cloud screening algorithm has been evaluated under different aerosol conditions, including Saharan dust intrusion, biomass burning and pollution events.
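
    An illustrative sketch of the screening idea: compare each nighttime AOD value with a local moving median and reject outliers before the 30 min averaging. The window size and the 3×MAD rejection rule below are assumptions, not the paper's exact thresholds.

```python
# Moving-median outlier screen for nighttime AOD (illustrative thresholds).
import numpy as np

def cloud_screen(aod, window=5, k=3.0):
    aod = np.asarray(aod, dtype=float)
    keep = np.ones(aod.size, dtype=bool)
    for i in range(aod.size):
        lo, hi = max(0, i - window), min(aod.size, i + window + 1)
        local = np.delete(aod[lo:hi], i - lo)        # neighbors, excluding i
        med = np.median(local)
        mad = np.median(np.abs(local - med)) or 1e-6
        keep[i] = abs(aod[i] - med) <= k * 1.4826 * mad
    return keep

aod = [0.12, 0.13, 0.12, 0.55, 0.13, 0.12, 0.14]  # 0.55: cloud contamination
print(cloud_screen(aod))  # the spike at index 3 is flagged False
```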

  3. [Prenatal risk calculation: comparison between Fast Screen pre I plus software and ViewPoint software. Evaluation of the risk calculation algorithms].

    PubMed

    Morin, Jean-François; Botton, Eléonore; Jacquemard, François; Richard-Gireme, Anouk

    2013-01-01

    The Fetal medicine foundation (FMF) has developed a new algorithm called Prenatal Risk Calculation (PRC) to evaluate Down syndrome screening based on free hCGβ, PAPP-A and nuchal translucency. The peculiarity of this algorithm is that it uses the degree of extremeness (DoE) instead of the multiple of the median (MoM). Biologists measuring maternal serum markers on Kryptor™ machines (Thermo Fisher Scientific) use the Fast Screen pre I plus software for the prenatal risk calculation. This software integrates the PRC algorithm. Our study evaluates the data of 2,092 patient files, of which 19 show a foetal abnormality. These files were first evaluated with the ViewPoint software, which is based on MoM. The link between DoE and MoM has been analyzed and the different calculated risks compared. The study shows that the Fast Screen pre I plus software gives the same risk results as the ViewPoint software but yields significantly fewer false-positive results.
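
    For readers unfamiliar with the baseline convention, the sketch below illustrates the MoM normalization that the PRC algorithm's degree of extremeness replaces: a marker value is expressed as a multiple of the gestational-age-specific median. The median value used is invented for illustration.

```python
# MoM normalization: marker value relative to the gestational-age median.

def mom(value, median_for_gestational_age):
    return value / median_for_gestational_age

print(mom(45.0, 30.0))  # marker of 45 vs a median of 30 -> 1.5 MoM
```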

  4. The inclusion of N-terminal pro-brain natriuretic peptide in a sensitive screening strategy for systemic sclerosis-related pulmonary arterial hypertension: a cohort study

    PubMed Central

    2013-01-01

    Introduction Pulmonary arterial hypertension (PAH) is a major cause of mortality in systemic sclerosis (SSc). Screening guidelines for PAH recommend multiple investigations, including annual echocardiography, which together have low specificity and may not be cost-effective. We sought to evaluate the predictive accuracy of serum N-terminal pro-brain natriuretic peptide (NT-proBNP) in combination with pulmonary function tests (PFT) (‘proposed’ algorithm) in a screening algorithm for SSc-PAH. Methods We evaluated our proposed algorithm (PFT with NT-proBNP) on 49 consecutive SSc patients with suspected pulmonary hypertension undergoing right heart catherisation (RHC). The predictive accuracy of the proposed algorithm was compared with existing screening recommendations, and is presented as sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV). Results Overall, 27 patients were found to have pulmonary hypertension (PH) at RHC, while 22 had no PH. The sensitivity, specificity, PPV and NPV of the proposed algorithm for PAH was 94.1%, 54.5%, 61.5% and 92.3%, respectively; current European Society of Cardiology (ESC)/European Respiratory Society (ERS) guidelines achieved a sensitivity, specificity, PPV and NPV of 94.1%, 31.8%, 51.6% and 87.5%, respectively. In an alternate case scenario analysis, estimating a PAH prevalence of 10%, the proposed algorithm achieved a sensitivity, specificity, PPV and NPV for PAH of 94.1%, 54.5%, 18.7% and 98.8%, respectively. Conclusions The combination of NT-proBNP with PFT is a sensitive, yet simple and non-invasive, screening strategy for SSc-PAH. Patients with a positive screening result can be referred for echocardiography, and further confirmatory testing for PAH. In this way, it may be possible to shift the burden of routine screening away from echocardiography. The findings of this study should be confirmed in larger studies. PMID:24246100
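
    The "alternate case scenario" numbers can be reproduced from sensitivity, specificity, and prevalence with Bayes' rule; at 10% prevalence the computation below matches the abstract's 18.7% PPV and 98.8% NPV.

```python
# PPV/NPV from sensitivity, specificity, and prevalence (Bayes' rule).

def ppv_npv(sens, spec, prev):
    ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
    npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
    return ppv, npv

print(ppv_npv(0.941, 0.545, 0.10))  # (~0.187, ~0.988)
```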

  5. 75 FR 23749 - Combined Notice of Filings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-04

    ... amended negotiated rate agreement between MRT and LER. Filed Date: 04/26/2010. Accession Number: 20100426... between MRT and CES. Filed Date: 04/26/2010. Accession Number: 20100426-0212. Comment Date: 5 p.m. Eastern...

  6. Failure and Redemption of Multifilter Rotating Shadowband Radiometer (MFRSR)/Normal Incidence Multifilter Radiometer (NIMFR) Cloud Screening: Contrasting Algorithm Performance at Atmospheric Radiation Measurement (ARM) North Slope of Alaska (NSA) and Southern Great Plains (SGP) Sites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kassianov, Evgueni I.; Flynn, Connor J.; Koontz, Annette S.

    2013-09-11

    Well-known cloud-screening algorithms, which are designed to remove cloud-contaminated aerosol optical depths (AOD) from AOD measurements, have shown great performance at many middle-to-low latitude sites around the world. However, they may occasionally fail under challenging observational conditions, such as when the sun is low (near the horizon) or when optically thin clouds with small spatial inhomogeneity occur. Such conditions have been observed quite frequently at the high-latitude Atmospheric Radiation Measurement (ARM) North Slope of Alaska (NSA) sites. A slightly modified cloud-screening version of the standard algorithm is proposed here with a focus on the ARM-supported Multifilter Rotating Shadowband Radiometer (MFRSR) and Normal Incidence Multifilter Radiometer (NIMFR) data. The modified version uses approximately the same techniques as the standard algorithm, but it additionally examines the magnitude of the slant-path line of sight transmittance and eliminates points when the observed magnitude is below a specified threshold. Substantial improvement of the multi-year (1999-2012) aerosol product (AOD and its Angstrom exponent) is shown for the NSA sites when the modified version is applied. Moreover, this version reproduces the AOD product at the ARM Southern Great Plains (SGP) site, which was originally generated by the standard cloud-screening algorithms. The proposed minor modification is easy to implement and its application to existing and future cloud-screening algorithms can be particularly beneficial for challenging observational conditions.
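
    A sketch of the proposed additional test: reject points whose slant-path (line-of-sight) transmittance falls below a threshold, which matters most at the low sun elevations described above. The 0.25 cutoff and the plane-parallel airmass formula are illustrative assumptions.

```python
# Slant-path transmittance test for low-sun screening (illustrative values).
import math

def passes_transmittance_test(tau_total, sza_deg, t_min=0.25):
    airmass = 1.0 / math.cos(math.radians(sza_deg))  # plane-parallel approx.
    t_slant = math.exp(-tau_total * airmass)
    return t_slant >= t_min

print(passes_transmittance_test(0.30, 45.0))  # True
print(passes_transmittance_test(0.30, 85.0))  # False near the horizon
```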

  7. Breast cancer screening in the era of density notification legislation: summary of 2014 Massachusetts experience and suggestion of an evidence-based management algorithm by multi-disciplinary expert panel.

    PubMed

    Freer, Phoebe E; Slanetz, Priscilla J; Haas, Jennifer S; Tung, Nadine M; Hughes, Kevin S; Armstrong, Katrina; Semine, A Alan; Troyan, Susan L; Birdwell, Robyn L

    2015-09-01

    Stemming from breast density notification legislation in Massachusetts effective 2015, we sought to develop a collaborative evidence-based approach to density notification that could be used by practitioners across the state. Our goal was to develop an evidence-based consensus management algorithm to help patients and health care providers follow best practices to implement a coordinated, evidence-based, cost-effective, sustainable practice and to standardize care in recommendations for supplemental screening. We formed the Massachusetts Breast Risk Education and Assessment Task Force (MA-BREAST), a multi-institutional, multi-disciplinary panel of expert radiologists, surgeons, primary care physicians, and oncologists, to develop a collaborative approach to density notification legislation. Using evidence-based data from the Institute for Clinical and Economic Review, the Cochrane review, National Comprehensive Cancer Network guidelines, American Cancer Society recommendations, and American College of Radiology appropriateness criteria, the group collaboratively developed an evidence-based best-practices algorithm. The expert consensus algorithm uses breast density as one element in the risk stratification to determine the need for supplemental screening. Women with dense breasts and otherwise low risk (<15% lifetime risk) do not routinely require supplemental screening per the expert consensus. Women at high risk (>20% lifetime) should consider supplemental screening MRI in addition to routine mammography regardless of breast density. We report the development of the multi-disciplinary collaborative approach to density notification. We propose a risk stratification algorithm to assess personal level of risk to determine the need for supplemental screening for an individual woman.
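
    A sketch of the risk-stratified logic summarized above; the risk thresholds come from the abstract, while the intermediate-risk branch is a simplification of the panel's full algorithm.

```python
# Simplified risk-stratified supplemental-screening decision.

def supplemental_screening(lifetime_risk_pct, dense_breasts):
    if lifetime_risk_pct > 20:
        return "routine mammography + consider screening MRI (any density)"
    if lifetime_risk_pct < 15 and dense_breasts:
        return "routine mammography; no routine supplemental screening"
    return "individualized discussion (intermediate risk)"

print(supplemental_screening(25, False))
print(supplemental_screening(10, True))
```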

  8. Individualised risk assessment for diabetic retinopathy and optimisation of screening intervals: a scientific approach to reducing healthcare costs.

    PubMed

    Lund, S H; Aspelund, T; Kirby, P; Russell, G; Einarsson, S; Palsson, O; Stefánsson, E

    2016-05-01

    To validate a mathematical algorithm that calculates risk of diabetic retinopathy progression in a diabetic population with UK staging (R0-3; M1) of diabetic retinopathy. To establish the utility of the algorithm to reduce screening frequency in this cohort, while maintaining safety standards. The cohort comprised 9690 diabetic individuals in England, followed for 2 years. The algorithm calculated individual risk for development of preproliferative retinopathy (R2), active proliferative retinopathy (R3A) and diabetic maculopathy (M1) based on clinical data. Screening intervals were determined such that the increase in risk of developing certain stages of retinopathy between screenings was the same for all patients and identical to the mean risk in fixed annual screening. Receiver operating characteristic curves were drawn and the area under the curve calculated to estimate the prediction capability. The algorithm predicts the occurrence of the given diabetic retinopathy stages with an area under the curve of 80% for patients with type II diabetes (CI 0.78 to 0.81). Of the cohort, 64% is at less than 5% risk of progression to R2, R3A or M1 within 2 years. By applying a 2 year ceiling to the screening interval, patients with type II diabetes are screened on average every 20 months, which is a 40% reduction in frequency compared with annual screening. The algorithm reliably identifies patients at high risk of developing advanced stages of diabetic retinopathy, including preproliferative R2, active proliferative R3A and maculopathy M1. The majority of patients have less than 5% risk of progression between stages within a year, and a small high-risk group is identified. Screening visit frequency, and presumably costs, in a diabetic retinopathy screening system can be reduced by 40% by using a 2 year ceiling. Individualised risk assessment with a 2 year ceiling on screening intervals may be a pragmatic next step in diabetic retinopathy screening in the UK, in that safety is maximised and cost reduced by about 40%. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
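
    A hedged sketch of the interval-setting idea: stretch or shrink each patient's screening interval so that the risk accumulated between visits equals a common target, capped at 24 months. The exponential (constant-hazard) risk model below is an illustrative assumption, not the paper's model.

```python
# Risk-equalized screening intervals under a constant-hazard assumption.
import math

def interval_months(annual_risk, target_risk, cap_months=24):
    """Solve 1 - exp(-h t) = target_risk, with hazard h from the annual risk."""
    h = -math.log(1 - annual_risk)            # per-year hazard
    t_years = -math.log(1 - target_risk) / h
    return min(12 * t_years, cap_months)

print(round(interval_months(0.02, 0.05), 1))  # low-risk patient: capped at 24
print(round(interval_months(0.15, 0.05), 1))  # high-risk patient: ~3.8 months
```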

  9. Orbiting Carbon Observatory-2 (OCO-2) cloud screening algorithms; validation against collocated MODIS and CALIOP data

    NASA Astrophysics Data System (ADS)

    Taylor, T. E.; O'Dell, C. W.; Frankenberg, C.; Partain, P.; Cronk, H. Q.; Savtchenko, A.; Nelson, R. R.; Rosenthal, E. J.; Chang, A. Y.; Fisher, B.; Osterman, G.; Pollock, R. H.; Crisp, D.; Eldering, A.; Gunson, M. R.

    2015-12-01

    The objective of the National Aeronautics and Space Administration's (NASA) Orbiting Carbon Observatory-2 (OCO-2) mission is to retrieve the column-averaged carbon dioxide (CO2) dry air mole fraction (XCO2) from satellite measurements of reflected sunlight in the near-infrared. These estimates can be biased by clouds and aerosols within the instrument's field of view (FOV). Screening of the most contaminated soundings minimizes unnecessary calls to the computationally expensive Level 2 (L2) XCO2 retrieval algorithm. Hence, robust cloud screening methods have been an important focus of the OCO-2 algorithm development team. Two distinct, computationally inexpensive cloud screening algorithms have been developed for this application. The A-Band Preprocessor (ABP) retrieves the surface pressure using measurements in the 0.76 μm O2 A-band, neglecting scattering by clouds and aerosols, which introduce photon path-length (PPL) differences that can cause large deviations between the expected and retrieved surface pressure. The Iterative Maximum A-Posteriori (IMAP) Differential Optical Absorption Spectroscopy (DOAS) Preprocessor (IDP) retrieves independent estimates of the CO2 and H2O column abundances using observations taken at 1.61 μm (weak CO2 band) and 2.06 μm (strong CO2 band), while neglecting atmospheric scattering. The CO2 and H2O column abundances retrieved in these two spectral regions differ significantly in the presence of cloud and scattering aerosols. The combination of these two algorithms, which key off of different features in the spectra, provides the basis for cloud screening of the OCO-2 data set. To validate the OCO-2 cloud screening approach, collocated measurements from NASA's Moderate Resolution Imaging Spectrometer (MODIS), aboard the Aqua platform, were compared to results from the two OCO-2 cloud screening algorithms. With tuning to allow throughputs of ≃ 30 %, agreement between the OCO-2 and MODIS cloud screening methods is found to be ≃ 85 % over four 16-day orbit repeat cycles in both the winter (December) and spring (April-May) for OCO-2 nadir-land, glint-land and glint-water observations. No major, systematic, spatial or temporal dependencies were found, although slight differences in the seasonal data sets do exist and validation is more problematic with increasing solar zenith angle and when surfaces are covered in snow and ice and have complex topography. To further analyze the performance of the cloud screening algorithms, an initial comparison of OCO-2 observations was made to collocated measurements from the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) aboard the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO). These comparisons highlight the strength of the OCO-2 cloud screening algorithms in identifying high, thin clouds but suggest some difficulty in identifying some clouds near the surface, even when the optical thicknesses are greater than 1.

  10. Computer-aided diagnosis workstation and network system for chest diagnosis based on multislice CT images

    NASA Astrophysics Data System (ADS)

    Satoh, Hitoshi; Niki, Noboru; Eguchi, Kenji; Moriyama, Noriyuki; Ohmatsu, Hironobu; Masuda, Hideo; Machida, Suguru

    2008-03-01

    Mass screening based on multi-helical CT images requires a considerable number of images to be read. It is this time-consuming step that makes the use of helical CT for mass screening impractical at present. To overcome this problem, we have provided diagnostic assistance methods to medical screening specialists by developing a lung cancer screening algorithm that automatically detects suspected lung cancers in helical CT images, a coronary artery calcification screening algorithm that automatically detects suspected coronary artery calcification, and a vertebral body analysis algorithm for quantitative evaluation of osteoporosis likelihood, all using a helical CT scanner for lung cancer mass screening. Functions for observing suspicious shadows in detail are provided in a computer-aided diagnosis workstation that incorporates these screening algorithms. We have also developed a telemedicine network using a Web medical image conference system with improved security of image transmission, a biometric fingerprint authentication system, and a biometric face authentication system. Biometric face authentication used at the telemedicine site makes "encryption of file" and "success in login" effective. As a result, patients' private information is protected. Based on these diagnostic assistance methods, we have developed a new computer-aided workstation and a new telemedicine network that can display suspected lesions three-dimensionally in a short time. The results of this study indicate that our filmless radiological information system, using the computer-aided diagnosis workstation and our telemedicine network system, can increase diagnostic speed and accuracy and improve the security of medical information.

  11. Comparing yield and relative costs of WHO TB screening algorithms in selected risk groups among people aged 65 years and over in China, 2013

    PubMed Central

    Cheng, Jun; Zhao, Fei; Xia, Yinyin; Zhang, Hui; Wilkinson, Ewan; Das, Mrinalini; Li, Jie; Chen, Wei; Hu, Dongmei; Jeyashree, Kathiresan; Wang, Lixia

    2017-01-01

    Objective To calculate the yield and cost per diagnosed tuberculosis (TB) case for three World Health Organization screening algorithms and one using the Chinese National TB Program (NTP) TB suspect definitions, using data from a TB prevalence survey of people aged 65 years and over in China, 2013. Methods This was an analytic study using data from the above survey. Risk groups were defined and the prevalence of new TB cases in each group calculated. Costs of each screening component were used to give indicative costs per case detected. Yield, number needed to screen (NNS) and cost per case were used to assess the algorithms. Findings The prevalence survey identified 172 new TB cases in 34,250 participants. Prevalence varied greatly between groups, from 131/100,000 to 4651/100,000. Two groups were chosen to compare the algorithms. The medium-risk group (rural residents who were men, previous TB cases, close contacts, had a BMI <18.5, or used tobacco) had an appreciably higher cost per case under the three algorithms (USD 221, 298 and 963) than the high-risk group of all previous TB cases and all close contacts (USD 72, 108 and 309), but detected two to four times more TB cases in the population. Using a chest X-ray (CXR) as the initial screening tool in the medium-risk group cost the most (USD 963) and detected 67% of all the new cases. Using the NTP definition of TB suspects made little difference. Conclusions To “End TB”, many more TB cases have to be identified. Screening only the highest-risk groups identified under 14% of the undetected cases; medium-risk groups will therefore also need to be screened. Using a CXR for initial screening results in a much higher yield, at what should be an acceptable cost. PMID:28594824
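
    The yield, NNS and cost-per-case arithmetic used to compare the algorithms can be sketched as follows; the helper function and the numbers fed to it are illustrative, and it assumes for simplicity that screening detects every prevalent case in the group.

        def screening_yield(population, prevalence_per_100k, cost_per_person_usd):
            """Yield, number needed to screen (NNS) and cost per TB case
            detected for one risk group, assuming (for illustration) that
            screening finds every prevalent case in the group."""
            cases = population * prevalence_per_100k / 100_000
            nns = population / cases              # people screened per case found
            cost_per_case = cost_per_person_usd * nns
            return cases, nns, cost_per_case

        # Hypothetical group sizes spanning the survey's prevalence range
        for label, prev in [("low-risk", 131), ("highest-risk", 4651)]:
            cases, nns, cpc = screening_yield(10_000, prev, 1.5)
            print(f"{label}: {cases:.0f} cases, NNS={nns:.0f}, USD {cpc:.0f}/case")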

  12. Effects of image compression and degradation on an automatic diabetic retinopathy screening algorithm

    NASA Astrophysics Data System (ADS)

    Agurto, C.; Barriga, S.; Murray, V.; Pattichis, M.; Soliz, P.

    2010-03-01

    Diabetic retinopathy (DR) is one of the leading causes of blindness among adult Americans. Automatic methods for detection of the disease have been developed in recent years, most of them addressing the segmentation of bright and red lesions. In this paper we present an automatic DR screening system that does not approach the problem through segmentation; instead, it distinguishes non-diseased retinal images from those with pathology based on textural features obtained using multiscale Amplitude Modulation-Frequency Modulation (AM-FM) decompositions. The decomposition is represented as features that are the inputs to a classifier. The algorithm achieves 0.88 area under the ROC curve (AROC) for a set of 280 images from the MESSIDOR database. The algorithm is then used to analyze the effects of image compression and degradation, which will be present in most actual clinical or screening environments. Results show that the algorithm is insensitive to illumination variations, but high rates of compression and large blurring effects degrade its performance.

  13. A method of measuring and correcting tilt of anti-vibration wind turbines based on a screening algorithm

    NASA Astrophysics Data System (ADS)

    Xiao, Zhongxiu

    2018-04-01

    A method of measuring and correcting the tilt of anti-vibration wind turbines based on a screening algorithm is proposed in this paper. First, we design a device whose core is the ADXL203 acceleration sensor; the inclination is measured by installing the device on the tower and the nacelle of the wind turbine. Next, a state-space model for the signal and noise is established and a Kalman filter is applied to filter the measurements effectively, with MATLAB used for simulation. Considering the impact of tower and nacelle vibration on the collected data, the raw data and the filtered data are classified and stored by the screening algorithm, and the filtered data are filtered again to make the output more accurate. Finally, installation errors are eliminated algorithmically to achieve the tilt correction. A device based on this method has the advantages of high precision, low cost and vibration resistance, and has a wide range of applications and promotion value.
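
    A minimal sketch of the scalar Kalman filtering step described above, assuming a random-walk state model for the slowly varying tilt angle; the noise variances are illustrative, not the paper's tuned values.

        import numpy as np

        def kalman_1d(z, q=1e-4, r=0.05**2):
            """Scalar Kalman filter for a slowly varying inclination angle
            observed through vibration noise (random-walk state model).
            q and r are illustrative process/measurement noise variances."""
            x, p = z[0], 1.0          # state estimate and its variance
            out = np.empty_like(z)
            for k, zk in enumerate(z):
                p = p + q             # predict: state is a random walk
                kgain = p / (p + r)   # update with measurement zk
                x = x + kgain * (zk - x)
                p = (1.0 - kgain) * p
                out[k] = x
            return out

        # Simulated 2-degree tilt buried in vibration noise
        rng = np.random.default_rng(0)
        z = 2.0 + 0.2 * rng.standard_normal(500)
        print(kalman_1d(z)[-1])   # converges near 2.0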

  14. A Screen Space GPGPU Surface LIC Algorithm for Distributed Memory Data Parallel Sort Last Rendering Infrastructures

    NASA Astrophysics Data System (ADS)

    Loring, B.; Karimabadi, H.; Rortershteyn, V.

    2015-10-01

    The surface line integral convolution (LIC) visualization technique produces dense visualizations of vector fields on arbitrary surfaces. We present a screen space surface LIC algorithm for use in distributed memory data parallel sort last rendering infrastructures. The motivations for our work are to support analysis of datasets that are too large to fit in the main memory of a single computer and to maintain compatibility with prevalent parallel scientific visualization tools such as ParaView and VisIt. By working in screen space using OpenGL, we can leverage the computational power of GPUs when they are available and run without them when they are not. We address efficiency and performance issues that arise from the transformation of data from physical to screen space by selecting an alternate screen space domain decomposition. We analyze the algorithm's scaling behavior with and without GPUs on two high performance computing systems using data from turbulent plasma simulations.
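
    For readers unfamiliar with the underlying technique, the toy CPU sketch below performs a basic line integral convolution on a regular grid; it omits the screen-space transformation, domain decomposition and GPU path that are the paper's actual contributions.

        import numpy as np

        def lic(vx, vy, noise, length=15):
            """Minimal line integral convolution: average a white-noise
            texture along short streamlines of the (vx, vy) field."""
            h, w = noise.shape
            out = np.zeros_like(noise)
            for i in range(h):
                for j in range(w):
                    acc, n = 0.0, 0
                    for sign in (+1.0, -1.0):          # trace both directions
                        x, y = float(j), float(i)
                        for _ in range(length):
                            ii, jj = int(round(y)) % h, int(round(x)) % w
                            acc += noise[ii, jj]; n += 1
                            norm = np.hypot(vx[ii, jj], vy[ii, jj]) or 1.0
                            x += sign * vx[ii, jj] / norm
                            y += sign * vy[ii, jj] / norm
                    out[i, j] = acc / n
            return out

        # Circular vector field rendered over random noise
        yy, xx = np.mgrid[0:64, 0:64].astype(float)
        img = lic(-(yy - 32), xx - 32, np.random.default_rng(1).random((64, 64)))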

  15. A Screen Space GPGPU Surface LIC Algorithm for Distributed Memory Data Parallel Sort Last Rendering Infrastructures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loring, Burlen; Karimabadi, Homa; Rortershteyn, Vadim

    2014-07-01

    The surface line integral convolution (LIC) visualization technique produces dense visualizations of vector fields on arbitrary surfaces. We present a screen space surface LIC algorithm for use in distributed memory data parallel sort last rendering infrastructures. The motivations for our work are to support analysis of datasets that are too large to fit in the main memory of a single computer and to maintain compatibility with prevalent parallel scientific visualization tools such as ParaView and VisIt. By working in screen space using OpenGL, we can leverage the computational power of GPUs when they are available and run without them when they are not. We address efficiency and performance issues that arise from the transformation of data from physical to screen space by selecting an alternate screen space domain decomposition. We analyze the algorithm's scaling behavior with and without GPUs on two high performance computing systems using data from turbulent plasma simulations.

  16. Radiologists' preferences for digital mammographic display. The International Digital Mammography Development Group.

    PubMed

    Pisano, E D; Cole, E B; Major, S; Zong, S; Hemminger, B M; Muller, K E; Johnston, R E; Walsh, R; Conant, E; Fajardo, L L; Feig, S A; Nishikawa, R M; Yaffe, M J; Williams, M B; Aylward, S R

    2000-09-01

    To determine the preferences of radiologists among eight different image processing algorithms applied to digital mammograms obtained for screening and diagnostic imaging tasks. Twenty-eight images representing histologically proved masses or calcifications were obtained by using three clinically available digital mammographic units. Images were processed and printed on film by using manual intensity windowing, histogram-based intensity windowing, mixture model intensity windowing, peripheral equalization, multiscale image contrast amplification (MUSICA), contrast-limited adaptive histogram equalization, Trex processing, and unsharp masking. Twelve radiologists compared the processed digital images with screen-film mammograms obtained in the same patient for breast cancer screening and breast lesion diagnosis. For the screening task, screen-film mammograms were preferred to all digital presentations, although the acceptability of images processed with the Trex and MUSICA algorithms was not significantly different. All printed digital images were preferred to screen-film radiographs in the diagnosis of masses; mammograms processed with unsharp masking were significantly preferred. For the diagnosis of calcifications, no processed digital mammogram was preferred to screen-film mammograms. When digital mammograms were preferred to screen-film mammograms, radiologists selected different digital processing algorithms for each of the three mammographic reading tasks and for different lesion types. Soft-copy display will eventually allow radiologists to select among these options more easily.

  17. Centroids evaluation of the images obtained with the conical null-screen corneal topographer

    NASA Astrophysics Data System (ADS)

    Osorio-Infante, Arturo I.; Armengol-Cruz, Victor de Emanuel; Campos-García, Manuel; Cossio-Guerrero, Cesar; Marquez-Flores, Jorge; Díaz-Uribe, José Rufino

    2016-09-01

    In this work, we propose algorithms to recover the centroids of the image produced by a conical null-screen-based corneal topographer. With these algorithms, we obtain the regions of interest (ROIs) of the original image and, using an image-processing algorithm, calculate the geometric centroid of each ROI. To improve the algorithm's performance, we use different settings of the null-screen targets, changing their size and number. We also improved the illumination system to avoid inhomogeneous zones in the corneal images. Finally, we report some corneal topographic measurements obtained with the best setting we found.
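
    A generic version of the centroid-extraction step can be written with standard image-processing tools; the thresholding scheme and synthetic image below are placeholders, not the paper's null-screen segmentation settings.

        import numpy as np
        from scipy import ndimage

        def spot_centroids(img, threshold):
            """Locate bright spots and return the intensity-weighted
            centroid of each region of interest (ROI)."""
            mask = img > threshold                  # isolate candidate ROIs
            labels, n = ndimage.label(mask)         # one label per spot
            return ndimage.center_of_mass(img, labels, range(1, n + 1))

        # Synthetic image with two spots
        img = np.zeros((100, 100))
        img[20:24, 30:34] = 1.0
        img[70:75, 60:66] = 2.0
        print(spot_centroids(img, 0.5))   # approx [(21.5, 31.5), (72.0, 62.5)]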

  18. Comparison of Two Phenotypic Algorithms To Detect Carbapenemase-Producing Enterobacteriaceae

    PubMed Central

    Dortet, Laurent; Bernabeu, Sandrine; Gonzalez, Camille

    2017-01-01

    ABSTRACT A novel algorithm designed for the screening of carbapenemase-producing Enterobacteriaceae (CPE), based on faropenem and temocillin disks, was compared to that of the Committee of the Antibiogram of the French Society of Microbiology (CA-SFM), which is based on ticarcillin-clavulanate, imipenem, and temocillin disks. The two algorithms presented comparable negative predictive values (98.6% versus 97.5%) for CPE screening among carbapenem-nonsusceptible Enterobacteriaceae. However, since 46.2% (n = 49) of the CPE were correctly identified as OXA-48-like producers by the faropenem/temocillin-based algorithm, it significantly decreased the number of complementary tests needed (42.2% versus 62.6% with the CA-SFM algorithm). PMID:28607010

  19. Pattern optimizing verification of self-align quadruple patterning

    NASA Astrophysics Data System (ADS)

    Yamato, Masatoshi; Yamada, Kazuki; Oyama, Kenichi; Hara, Arisa; Natori, Sakurako; Yamauchi, Shouhei; Koike, Kyohei; Yaegashi, Hidetami

    2017-03-01

    Lithographic scaling continues to advance by extending the life of 193nm immersion technology, and spacer-type multi-patterning is undeniably the driving force behind this trend. Multi-patterning techniques such as self-aligned double patterning (SADP) and self-aligned quadruple patterning (SAQP) have come to be used in memory devices, and they have also been adopted in logic devices to create constituent patterns in the formation of 1D layout designs. Multi-patterning has consequently become an indispensable technology in the fabrication of all advanced devices. In general, items that must be managed when using multi-patterning include critical dimension uniformity (CDU), line edge roughness (LER), and line width roughness (LWR). Recently, moreover, there has been increasing focus on judging and managing pattern resolution performance from a more detailed perspective and on making a right/wrong judgment from the perspective of edge placement error (EPE). To begin with, pattern resolution performance in spacer-type multi-patterning is affected by the process accuracy of the core (mandrel) pattern. Improving the controllability of CD and LER of the mandrel is most important, and to reduce LER, an appropriate smoothing technique should be carefully selected. In addition, the atomic layer deposition (ALD) technique is generally used to meet the need for high accuracy in forming the spacer film. Advances in scaling are accompanied by stricter requirements in the controllability of fine processing. In this paper, we first describe our efforts in improving controllability by selecting the most appropriate materials for the mandrel pattern and spacer film. Then, based on the materials selected, we present experimental results on a technique for improving etching selectivity.

  20. Chromosome-level assembly of Arabidopsis thaliana Ler reveals the extent of translocation and inversion polymorphisms.

    PubMed

    Zapata, Luis; Ding, Jia; Willing, Eva-Maria; Hartwig, Benjamin; Bezdan, Daniela; Jiao, Wen-Biao; Patel, Vipul; Velikkakam James, Geo; Koornneef, Maarten; Ossowski, Stephan; Schneeberger, Korbinian

    2016-07-12

    Resequencing or reference-based assemblies reveal large parts of the small-scale sequence variation. However, they typically fail to separate such local variation into colinear and rearranged variation, because they usually do not recover the complement of large-scale rearrangements, including transpositions and inversions. Besides the availability of hundreds of genomes of diverse Arabidopsis thaliana accessions, there is so far only one full-length assembled genome: the reference sequence. We have assembled 117 Mb of the A. thaliana Landsberg erecta (Ler) genome into five chromosome-equivalent sequences using a combination of short Illumina reads, long PacBio reads, and linkage information. Whole-genome comparison against the reference sequence revealed 564 transpositions and 47 inversions comprising ∼3.6 Mb, in addition to 4.1 Mb of nonreference sequence, mostly originating from duplications. Although rearranged regions are not different in local divergence from colinear regions, they are drastically depleted for meiotic recombination in heterozygotes. Using a 1.2-Mb inversion as an example, we show that such rearrangement-mediated reduction of meiotic recombination can lead to genetically isolated haplotypes in the worldwide population of A. thaliana. Moreover, we found 105 single-copy genes that were present only in the reference sequence or the Ler assembly, and 334 single-copy orthologs that showed an additional copy in only one of the genomes. To our knowledge, this work gives the first insights into the degree and type of variation that will be revealed once complete assemblies replace resequencing and other reference-dependent methods.

  1. Using Neural Networks to Improve the Performance of Radiative Transfer Modeling Used for Geometry Dependent LER Calculations

    NASA Astrophysics Data System (ADS)

    Fasnacht, Z.; Qin, W.; Haffner, D. P.; Loyola, D. G.; Joiner, J.; Krotkov, N. A.; Vasilkov, A. P.; Spurr, R. J. D.

    2017-12-01

    In order to estimate surface reflectance used in trace gas retrieval algorithms, radiative transfer models (RTM) such as the Vector Linearized Discrete Ordinate Radiative Transfer Model (VLIDORT) can be used to simulate the top of the atmosphere (TOA) radiances with advanced models of surface properties. With large volumes of satellite data, these model simulations can become computationally expensive. Look up table interpolation can improve the computational cost of the calculations, but the non-linear nature of the radiances requires a dense node structure if interpolation errors are to be minimized. In order to reduce our computational effort and improve the performance of look-up tables, neural networks can be trained to predict these radiances. We investigate the impact of using look-up table interpolation versus a neural network trained using the smart sampling technique, and show that neural networks can speed up calculations and reduce errors while using significantly less memory and RTM calls. In future work we will implement a neural network in operational processing to meet growing demands for reflectance modeling in support of high spatial resolution satellite missions.
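
    A minimal sketch of the surrogate idea, assuming a synthetic radiance-like target rather than actual VLIDORT output: a small multilayer perceptron is fit to simulated top-of-the-atmosphere values over sun-sensor geometries and then queried in place of a look-up table.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        # Toy surrogate: learn a smooth nonlinear "radiance" as a function of
        # geometry (solar zenith, view zenith, relative azimuth). The target
        # function below is synthetic, not a real radiative transfer model.
        rng = np.random.default_rng(0)
        X = rng.uniform([0, 0, 0], [80, 70, 180], size=(20000, 3))  # sza, vza, raa
        mu0 = np.cos(np.radians(X[:, 0]))
        y = mu0 * np.exp(-0.2 / np.maximum(mu0, 0.1)) * (1 + 0.1 * np.cos(np.radians(X[:, 2])))

        net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=500, random_state=0)
        net.fit(X[:15000], y[:15000])
        err = np.abs(net.predict(X[15000:]) - y[15000:])
        print(f"max abs error on held-out geometries: {err.max():.4f}")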

  2. Computer-aided diagnosis workstation and network system for chest diagnosis based on multislice CT images

    NASA Astrophysics Data System (ADS)

    Satoh, Hitoshi; Niki, Noboru; Mori, Kiyoshi; Eguchi, Kenji; Kaneko, Masahiro; Kakinuma, Ryutarou; Moriyama, Noriyuki; Ohmatsu, Hironobu; Masuda, Hideo; Machida, Suguru

    2007-03-01

    Multislice CT scanners have remarkably increased the speed at which chest CT images can be acquired for mass screening. Mass screening based on multislice CT images requires a considerable number of images to be read. It is this time-consuming step that makes the use of helical CT for mass screening impractical at present. To overcome this problem, we have provided diagnostic assistance methods to medical screening specialists by developing a lung cancer screening algorithm that automatically detects suspected lung cancers in helical CT images and a coronary artery calcification screening algorithm that automatically detects suspected coronary artery calcification. Moreover, we have provided diagnostic assistance by building the lung cancer screening algorithm into a mobile helical CT scanner for lung cancer mass screening in regions without a hospital. We have also developed an electronic medical recording system and a prototype internet system for community health across two or more regions, using a Virtual Private Network router, a biometric fingerprint authentication system and a biometric face authentication system to protect medical information. Based on these diagnostic assistance methods, we have now developed a new computer-aided workstation and database that can display suspected lesions three-dimensionally in a short time. This paper describes basic studies that have been conducted to evaluate this new system.

  3. Validation of air traffic controller workload models

    DOT National Transportation Integrated Search

    1979-09-01

    During the past several years, computer models have been developed for off-site estimation of controllers' workload. The inputs to these models are audio and digital data normally recorded at an Air Route Traffic Control Center (ARTCC). This ...

  4. Disparities in the treatment and outcomes of vascular disease in Hispanic patients

    PubMed Central

    Morrissey, Nicholas J.; Giacovelli, Jeannine; Egorova, Natalia; Gelijns, Annetine; Moskowitz, Alan; McKinsey, James; Kent, Kenneth Craig; Greco, Giampaolo

    2008-01-01

    Background The Hispanic population represents the fastest growing minority in the United States. As the population grows and ages, the vascular surgery community will be providing increasing amounts of care to this diverse group. To appropriately administer preventive and therapeutic care, it is important to understand the incidence, risk factors, and natural history of vascular disease in Hispanic patients. Methods We analyzed hospital discharge databases from New York and Florida to determine the rate of lower extremity revascularization (LER), carotid revascularization (CR), and abdominal aortic aneurysm (AAA) repair in Hispanics relative to the general population. The rates of common comorbidities, the indications for the procedures, and outcomes during the same hospitalization as the index procedure were determined. Multivariate logistic regression analysis was used to determine the differences between Hispanics and white non-Hispanics with respect to rate of procedure, symptoms at presentation, and outcome after procedure. Demographic variables and length of stay were also analyzed. Results The rates of LER, CR, and AAA repair were significantly lower in Hispanic patients than in white non-Hispanics. Despite this lower rate of intervention, Hispanics were significantly more likely than white non-Hispanics to present with limb-threatening lower extremity ischemia (odds ratio [OR], 2.09; 95% confidence interval [CI], 1.91-2.29), symptomatic carotid artery disease (OR, 1.57; 95% CI, 1.40-1.75), and ruptured AAA (OR, 1.26; 95% CI, 1.04-1.52). These differences were maintained after controlling for the presence of diabetes mellitus and other comorbidities. Hispanic patients had higher rates of amputation during the same hospitalization after LER (6.2% vs 3.4%, P < .0001) and higher mortality after elective AAA repair (5% vs 3.4%, P = .0032). Length of stay after LER, CR, and AAA repair was longer for Hispanic patients than white non-Hispanics. Conclusion Significant disparities in the rate of utilization of three common vascular surgical procedures exist between Hispanic patients and the general population. In addition, Hispanics appear to present with more advanced disease and have worse outcomes in some cases. Reasons for these disparities must be determined to improve these results in the fastest growing segment of our society. PMID:17980283

  5. Point-Counterpoint: Cervical Cancer Screening Should Be Done by Primary Human Papillomavirus Testing with Genotyping and Reflex Cytology for Women over the Age of 25 Years

    PubMed Central

    Zhao, Chengquan

    2015-01-01

    Screening for cervical cancer with cytology testing has been very effective in reducing cervical cancer in the United States. For decades, the approach was an annual Pap test. In 2000, the Hybrid Capture 2 human papillomavirus (HPV) test was approved by the U.S. Food and Drug Administration (FDA) for screening women who have atypical squamous cells of undetermined significance (ASCUS) detected by Pap test to determine the need for colposcopy. In 2003, the FDA approved expanding the use of the test to include screening performed in conjunction with a Pap test for women over the age of 30 years, referred to as “cotesting.” Cotesting allows women to extend the testing interval to 3 years if both tests have negative results. In April of 2014, the FDA approved the use of an HPV test (the cobas HPV test) for primary cervical cancer screening for women over the age of 25 years, without the need for a concomitant Pap test. The approval recommended either colposcopy or a Pap test for patients with specific high-risk HPV types detected by the HPV test. This was based on the results of the ATHENA trial, which included more than 40,000 women. Reaction to this decision has been mixed. Supporters point to the fact that the primary-screening algorithm found more disease (cervical intraepithelial neoplasia 3 or worse [CIN3+]) and also found it earlier than did cytology or cotesting. Moreover, the positive predictive value and positive-likelihood ratio of the primary-screening algorithm were higher than those of cytology. Opponents of the decision prefer cotesting, as this approach detects more disease than the HPV test alone. In addition, the performance of this new algorithm has not been assessed in routine clinical use. Professional organizations will need to develop guidelines that incorporate this testing algorithm. In this Point-Counterpoint, Dr. Stoler explains why he favors the primary-screening algorithm, while Drs. Austin and Zhao explain why they prefer the cotesting approach to screening for cervical cancer. PMID:25948606

  6. An intelligent identification algorithm for the monoclonal picking instrument

    NASA Astrophysics Data System (ADS)

    Yan, Hua; Zhang, Rongfu; Yuan, Xujun; Wang, Qun

    2017-11-01

    Traditional colony selection is mainly performed manually, which is inefficient and subjective. It is therefore important to develop an automatic monoclonal-picking instrument, and the critical stage of automatic monoclonal picking and intelligent optimal selection is the identification algorithm. An auto-screening algorithm based on a Support Vector Machine (SVM) is proposed in this paper; it uses supervised learning combined with colony morphological characteristics to classify colonies accurately. From the basic morphological features of a colony, the system computes a series of morphological parameters step by step. Through the establishment of a maximal-margin classifier, and based on an analysis of the growth trend of the colony, the selection of monoclonal colonies is carried out. The experimental results showed that the auto-screening algorithm could screen out regular colonies from the rest, meeting the requirements on the various parameters.
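
    A minimal sketch of a maximal-margin colony classifier, assuming three simple morphological features (area, circularity, mean intensity); the feature set and the synthetic training data are placeholders, not the paper's.

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline

        # Synthetic "regular" and "irregular" colonies in feature space
        rng = np.random.default_rng(2)
        regular = np.column_stack([rng.normal(120, 15, 200),    # area (px)
                                   rng.normal(0.9, 0.05, 200),  # circularity
                                   rng.normal(0.6, 0.1, 200)])  # intensity
        irregular = np.column_stack([rng.normal(200, 60, 200),
                                     rng.normal(0.6, 0.15, 200),
                                     rng.normal(0.5, 0.2, 200)])
        X = np.vstack([regular, irregular])
        y = np.r_[np.ones(200), np.zeros(200)]

        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
        clf.fit(X, y)
        print(clf.predict([[115, 0.92, 0.65]]))   # -> [1.], a pickable colony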

  7. Improving Electronic Sensor Reliability by Robust Outlier Screening

    PubMed Central

    Moreno-Lizaranzu, Manuel J.; Cuesta, Federico

    2013-01-01

    Electronic sensors are widely used in different application areas, and in some of them, such as automotive or medical equipment, they must perform with an extremely low defect rate. Increasing reliability is paramount. Outlier detection algorithms are a key component in screening latent defects and decreasing the number of customer quality incidents (CQIs). This paper focuses on new spatial algorithms (Good Die in a Bad Cluster with Statistical Bins (GDBC SB) and Bad Bin in a Bad Cluster (BBBC)) and an advanced outlier screening method, called Robust Dynamic Part Averaging Testing (RDPAT), as well as two practical improvements, which significantly enhance existing algorithms. Those methods have been used in production in Freescale® Semiconductor probe factories around the world for several years. Moreover, a study was conducted with production data of 289,080 dice with 26 CQIs to determine and compare the efficiency and effectiveness of all these algorithms in identifying CQIs. PMID:24113682

  8. Improving electronic sensor reliability by robust outlier screening.

    PubMed

    Moreno-Lizaranzu, Manuel J; Cuesta, Federico

    2013-10-09

    Electronic sensors are widely used in different application areas, and in some of them, such as automotive or medical equipment, they must perform with an extremely low defect rate. Increasing reliability is paramount. Outlier detection algorithms are a key component in screening latent defects and decreasing the number of customer quality incidents (CQIs). This paper focuses on new spatial algorithms (Good Die in a Bad Cluster with Statistical Bins (GDBC SB) and Bad Bin in a Bad Cluster (BBBC)) and an advanced outlier screening method, called Robust Dynamic Part Averaging Testing (RDPAT), as well as two practical improvements, which significantly enhance existing algorithms. Those methods have been used in production in Freescale® Semiconductor probe factories around the world for several years. Moreover, a study was conducted with production data of 289,080 dice with 26 CQIs to determine and compare the efficiency and effectiveness of all these algorithms in identifying CQIs.
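
    As a hedged illustration of the robust-limit idea behind such screens, the sketch below flags dice whose parametric value strays from a median/MAD band; this is a generic robust outlier screen, not the exact RDPAT formulation.

        import numpy as np

        def robust_outlier_screen(measurements, k=6.0):
            """Flag dice whose parametric test value deviates from the lot's
            robust center by more than k robust sigmas. Median/MAD keeps the
            limits from being dragged by the outliers themselves."""
            x = np.asarray(measurements, dtype=float)
            med = np.median(x)
            mad = np.median(np.abs(x - med))
            sigma = 1.4826 * mad          # MAD -> sigma for Gaussian data
            return np.abs(x - med) > k * sigma

        vals = np.r_[np.random.default_rng(3).normal(1.00, 0.02, 500), 1.35]
        print(np.flatnonzero(robust_outlier_screen(vals)))   # -> [500]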

  9. Systemic inflammatory response syndrome-based severe sepsis screening algorithms in emergency department patients with suspected sepsis.

    PubMed

    Shetty, Amith L; Brown, Tristam; Booth, Tarra; Van, Kim Linh; Dor-Shiffer, Daphna E; Vaghasiya, Milan R; Eccleston, Cassanne E; Iredell, Jonathan

    2016-06-01

    Systemic inflammatory response syndrome (SIRS)-based severe sepsis screening algorithms have been utilised in the stratification and initiation of early broad spectrum antibiotics for patients presenting to EDs with suspected sepsis. We aimed to investigate the performance of several of these algorithms in a cohort of suspected sepsis patients. We conducted a retrospective analysis of an ED-based prospective sepsis registry at a tertiary Sydney hospital, Australia. Definitions for sepsis were based on the 2012 Surviving Sepsis Campaign guidelines. Numerical values for SIRS criteria and ED investigation results were recorded on the registry at the trigger of the sepsis pathway. The performance of specific SIRS-based screening algorithms from sites at USA, Canada, UK, Australia and Ireland health institutions was investigated. Severe sepsis screening algorithm performance was measured on 747 patients presenting with suspected sepsis (401 with severe sepsis, prevalence 53.7%). Sensitivity and specificity of the algorithms to flag severe sepsis ranged from 20.2% (95% CI 16.4-24.5%) to 82.3% (95% CI 78.2-85.9%) and 57.8% (95% CI 52.4-63.1%) to 94.8% (95% CI 91.9-96.9%), respectively. Variations in SIRS values between uncomplicated and severe sepsis cohorts were only minor, except for a higher mean lactate (>1.6 mmol/L, P < 0.01). We found that the Ireland and JFK Medical Center sepsis algorithms performed modestly in stratifying suspected sepsis patients into high-risk groups. Algorithms with lactate thresholds of >2 mmol/L rather than >4 mmol/L performed better. ED sepsis registry-based characterisation of patients may help further refine sepsis definitions of the future. © 2016 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.
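
    A generic SIRS-plus-lactate screen of the kind evaluated above can be expressed as a small rule; the criteria follow the usual SIRS definitions and the default lactate threshold reflects the finding that >2 mmol/L performed better, but this is not any single site's algorithm.

        def sirs_severe_sepsis_flag(temp_c, hr, rr, wbc, lactate_mmol_l,
                                    lactate_threshold=2.0):
            """Flag high risk when >=2 SIRS criteria are met and lactate
            exceeds the threshold. Illustrative, not a validated tool."""
            sirs = sum([
                temp_c > 38.0 or temp_c < 36.0,
                hr > 90,                       # beats/min
                rr > 20,                       # breaths/min
                wbc > 12.0 or wbc < 4.0,       # x10^9 cells/L
            ])
            return sirs >= 2 and lactate_mmol_l > lactate_threshold

        print(sirs_severe_sepsis_flag(38.6, 110, 24, 15.0, 2.8))  # True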

  10. Resonant-enhanced full-color emission of quantum-dot-based micro LED display technology.

    PubMed

    Han, Hau-Vei; Lin, Huang-Yu; Lin, Chien-Chung; Chong, Wing-Cheung; Li, Jie-Ru; Chen, Kuo-Ju; Yu, Peichen; Chen, Teng-Ming; Chen, Huang-Ming; Lau, Kei-May; Kuo, Hao-Chung

    2015-12-14

    Colloidal quantum dots that emit red, green, and blue light are incorporated with a micro-LED array to demonstrate a feasible choice for future display technology. The pitch of the micro-LED array is 40 μm, which is sufficient for high-resolution screen applications. The method used to spray the quantum dots into such a tight space, called Aerosol Jet technology, uses an atomizer and gas-flow control to obtain uniform, controlled narrow spots. Ultraviolet LEDs in the array excite the red, green and blue quantum dots on the top surface. To increase the utilization of the UV photons, a layer of distributed Bragg reflector was laid down on the device to reflect most of the leaked UV photons back into the quantum dot layers. With this mechanism, the luminous flux is enhanced to 194% (blue), 173% (green) and 183% (red) of that of samples without the reflector. The luminous efficacy of radiation (LER) was measured under various currents, and a value of 165 lm/Watt was recorded.

  11. Computer-aided diagnosis workstation and telemedicine network system for chest diagnosis based on multislice CT images

    NASA Astrophysics Data System (ADS)

    Satoh, Hitoshi; Niki, Noboru; Eguchi, Kenji; Ohmatsu, Hironobu; Kakinuma, Ryutaru; Moriyama, Noriyuki

    2009-02-01

    Mass screening based on multi-helical CT images requires a considerable number of images to be read. It is this time-consuming step that makes the use of helical CT for mass screening impractical at present. Moreover, there is a shortage of doctors in Japan who can read these images. To overcome these problems, we have provided diagnostic assistance methods to medical screening specialists by developing a lung cancer screening algorithm that automatically detects suspected lung cancers in helical CT images, a coronary artery calcification screening algorithm that automatically detects suspected coronary artery calcification, and a vertebral body analysis algorithm for quantitative evaluation of osteoporosis likelihood, all using a helical CT scanner for lung cancer mass screening. Functions to observe suspicious shadows in detail are provided in a computer-aided diagnosis workstation incorporating these screening algorithms. We have also developed a telemedicine network using a Web medical image conference system with improved security of image transmission, a biometric fingerprint authentication system and a biometric face authentication system. Biometric face authentication used at the telemedicine site makes "Encryption of file" and "Success in login" effective. As a result, patients' private information is protected. The screen of the Web medical image conference system can be shared by two or more conference terminals at the same time, and opinions can be exchanged using a camera and microphone connected to the workstation. Based on these diagnostic assistance methods, we have developed a new computer-aided workstation and a new telemedicine network that can display suspected lesions three-dimensionally in a short time. The results of this study indicate that our filmless radiological information system using the computer-aided diagnosis workstation and our telemedicine network system can increase diagnostic speed and accuracy and improve the security of medical information.

  12. Discordant human T-lymphotropic virus screening with Western blot confirmation: evaluation of the dual-test algorithm for US blood donations.

    PubMed

    Stramer, Susan L; Townsend, Rebecca L; Foster, Gregory A; Johnson, Ramona; Weixlmann, Barbara; Dodd, Roger Y

    2018-03-01

    Human T-lymphotropic virus (HTLV) blood donation screening has used a dual-testing algorithm beginning with either a chemiluminescent immunoassay or enzyme-linked immunosorbent screening assay (ELISA). Before the availability of a licensed HTLV supplemental assay, repeat-reactive (RR) samples on a first assay (Assay 1) were retested with a second screening assay (Assay 2). Donors with RR results by Assay 2 were deferred from blood donation and further tested using an unlicensed supplemental test to confirm reactivity while nonreactive (NR) donors remained eligible for donation until RR on a subsequent donation. This "dual-test" algorithm was replaced in May 2016 with the requirement that all RRs by Assay 1 be further tested by a licensed HTLV supplemental test (Western blot [WB]). In this study, we have requalified the dual-test algorithm using the available licensed HTLV WB. We tested 100 randomly selected HTLV RRs on screening Assay 1 (Abbott PRISM chemiluminescent immunoassay) but NR on screening Assay 2 (Avioq ELISA) by a Food and Drug Administration-licensed WB (MP Biomedicals) to ensure that no confirmed positives were among those that were RR by Assay 1 but NR by Assay 2. Of the 100 samples evaluated, 79 of 100 were WB seronegative, 21 of 100 indeterminate, and 0 of 100 seropositive. Of the 79 of 100 seronegative specimens, 73 of 79 did not express any bands on WB. We demonstrated that none of the 100 samples RR on Assay 1 but NR on Assay 2 were confirmed positive. This algorithm prevents such donors from requiring further testing and from being deferred. © 2018 AABB.
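
    The donor disposition logic described above can be sketched as a small decision function; the return labels are illustrative, not regulatory language.

        def dual_test_disposition(assay1_rr, assay2_rr=None, wb_result=None):
            """Donor disposition under the dual-test algorithm described
            above; labels are illustrative only."""
            if not assay1_rr:
                return "eligible"                    # nonreactive on Assay 1
            if assay2_rr is None:
                return "retest with Assay 2"
            if not assay2_rr:
                # RR/NR discordant: donor remains eligible under the algorithm
                return "eligible, monitor on subsequent donations"
            # RR on both screens: defer and resolve with the supplemental WB
            return {"positive": "defer, confirmed positive",
                    "indeterminate": "defer, indeterminate",
                    "negative": "defer, unconfirmed"}.get(wb_result,
                                                          "defer, WB pending")

        print(dual_test_disposition(True, False))    # remains eligible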

  13. Breast Cancer Screening in the Era of Density Notification Legislation: Summary of 2014 Massachusetts Experience and Suggestion of an Evidence-Based Management Algorithm by Multi-disciplinary Expert Panel

    PubMed Central

    Freer, Phoebe E.; Slanetz, Priscilla J.; Haas, Jennifer S.; Tung, Nadine M.; Hughes, Kevin S.; Armstrong, Katrina; Semine, A. Alan; Troyan, Susan L.; Birdwell, Robyn L.

    2015-01-01

    Purpose Stemming from breast density notification legislation in Massachusetts effective 2015, we sought to develop a collaborative evidence-based approach to density notification that could be used by practitioners across the state. Our goal was to develop an evidence-based consensus management algorithm to help patients and health care providers follow best practices to implement a coordinated, evidence-based, cost-effective, sustainable practice and to standardize care in recommendations for supplemental screening. Methods We formed the Massachusetts Breast Risk Education and Assessment Task Force (MA-BREAST), a multi-institutional, multi-disciplinary panel of expert radiologists, surgeons, primary care physicians, and oncologists, to develop a collaborative approach to density notification legislation. Using evidence-based data from the Institute for Clinical and Economic Review (ICER), the Cochrane review, National Comprehensive Cancer Network (NCCN) guidelines, American Cancer Society (ACS) recommendations, and American College of Radiology (ACR) appropriateness criteria, the group collaboratively developed an evidence-based best-practices algorithm. Results The expert consensus algorithm uses breast density as one element in the risk stratification to determine the need for supplemental screening. Women with dense breasts who are otherwise at low risk (<15% lifetime risk) do not routinely require supplemental screening per the expert consensus. Women at high risk (>20% lifetime risk) should consider supplemental screening MRI in addition to routine mammography, regardless of breast density. Conclusion We report the development of a multi-disciplinary collaborative approach to density notification. We propose a risk stratification algorithm to assess an individual woman's level of risk to determine the need for supplemental screening. PMID:26290416
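
    A simplified rendering of the consensus logic, assuming only the two risk thresholds quoted in the abstract; the published algorithm contains considerably more nuance.

        def supplemental_screening(lifetime_risk_pct, dense_breasts):
            """Toy rendering of the consensus logic: lifetime risk drives the
            recommendation, with density one element of stratification rather
            than an automatic trigger. Thresholds follow the abstract only."""
            if lifetime_risk_pct > 20:
                return "routine mammography plus consider screening MRI"
            if lifetime_risk_pct < 15:
                # holds even for dense breasts, per the expert consensus
                return "routine mammography only"
            return ("individualized decision (intermediate risk"
                    + (", dense breasts)" if dense_breasts else ")"))

        print(supplemental_screening(10, dense_breasts=True))   # routine only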

  14. Multi-objective vs. single-objective calibration of a hydrologic model using single- and multi-objective screening

    NASA Astrophysics Data System (ADS)

    Mai, Juliane; Cuntz, Matthias; Shafii, Mahyar; Zink, Matthias; Schäfer, David; Thober, Stephan; Samaniego, Luis; Tolson, Bryan

    2016-04-01

    Hydrologic models are traditionally calibrated against observed streamflow. Recent studies have shown, however, that only a few global model parameters are constrained by this kind of integral signal. They can be identified using prior screening techniques. Since different objectives might constrain different parameters, it is advisable to use multiple sources of information to calibrate these models. One common approach is to combine the multiple objectives (MO) into one single-objective (SO) function, allowing the use of an SO optimization algorithm. Another strategy is to consider the different objectives separately and apply an MO Pareto optimization algorithm. In this study, two major research questions are addressed: 1) How do multi-objective calibrations compare with corresponding single-objective calibrations? 2) How much do calibration results deteriorate when the number of calibrated parameters is reduced by a prior screening technique? The hydrologic model employed in this study is a distributed hydrologic model (mHM) with 52 model parameters, i.e. transfer coefficients. The model uses grid cells as the primary hydrologic unit and accounts for processes such as snow accumulation and melting, soil moisture dynamics, infiltration, surface runoff, evapotranspiration, subsurface storage and discharge generation. The model is applied in three distinct catchments over Europe. The SO calibrations are performed using the Dynamically Dimensioned Search (DDS) algorithm with a fixed budget, while the MO calibrations are achieved using the Pareto Dynamically Dimensioned Search (PA-DDS) algorithm with the same budget. The two objectives used here are the Nash-Sutcliffe Efficiency (NSE) of the simulated streamflow and the NSE of its logarithmic transformation. It is shown that the SO DDS results are located close to the edges of the PA-DDS Pareto fronts. The MO calibrations are hence preferable, since they supply multiple equivalent solutions from which the user can choose according to specific needs. Sequential single-objective parameter screening was employed prior to the calibrations, reducing the number of parameters by at least 50% in the different catchments and for the different single objectives. The single-objective calibrations led to faster convergence of the objectives and are hence beneficial when using DDS on single objectives. The above-mentioned parameter screening technique is generalized for multi-objectives and applied before calibration using the PA-DDS algorithm. Two different alternatives of this MO screening are tested. The comparison of the calibration results using all parameters and using only screened parameters shows, for both alternatives, that the PA-DDS algorithm does not profit in terms of trade-off size and function evaluations required to achieve converged Pareto fronts. This is because the PA-DDS algorithm automatically reduces the search space as the calibration run progresses; this automatic reduction may differ for other search algorithms. It is therefore hypothesized that prior screening can, but need not, be beneficial for parameter estimation, depending on the chosen optimization algorithm.
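
    The two objective functions traded off in the calibration are straightforward to state in code; the epsilon guard against zero flows in the log transform is an implementation detail assumed here.

        import numpy as np

        def nse(sim, obs):
            """Nash-Sutcliffe efficiency of simulated vs. observed streamflow."""
            sim, obs = np.asarray(sim, float), np.asarray(obs, float)
            return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

        def log_nse(sim, obs, eps=1e-6):
            """NSE of log-transformed flows, emphasizing low-flow periods."""
            return nse(np.log(np.asarray(sim) + eps), np.log(np.asarray(obs) + eps))

        obs = np.array([1.0, 2.5, 8.0, 3.0, 1.2])
        sim = np.array([1.1, 2.2, 7.5, 3.4, 1.0])
        print(nse(sim, obs), log_nse(sim, obs))   # the two objectives to trade off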

  15. Development and Evaluation of Algorithms for Breath Alcohol Screening.

    PubMed

    Ljungblad, Jonas; Hök, Bertil; Ekström, Mikael

    2016-04-01

    Breath alcohol screening is important for traffic safety, access control and other areas of health promotion. A family of sensor devices useful for these purposes is being developed and evaluated. This paper focuses on algorithms for the determination of breath alcohol concentration in diluted breath samples using carbon dioxide to compensate for the dilution. The examined algorithms make use of signal averaging, weighting and personalization to reduce estimation errors. Evaluation has been performed using data from a previously conducted human study. It is concluded that these features in combination significantly reduce the random error compared to the signal averaging algorithm taken alone.
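
    A hedged sketch of the dilution-compensation principle: scale the measured alcohol signal by the ratio of a nominal alveolar CO2 level to the measured CO2. The alveolar constant is an assumed physiological value, and this single-ratio form omits the paper's averaging, weighting and personalization steps.

        def breath_alcohol_estimate(alc_signal, co2_signal, alveolar_co2=4.8):
            """Dilution-compensated breath alcohol estimate. alveolar_co2 is
            an assumed nominal end-tidal CO2 level in percent; the measured
            co2_signal indicates how diluted the breath sample is."""
            return alc_signal * (alveolar_co2 / co2_signal)

        # A 10x-diluted sample: both signals drop, the estimate is restored
        print(breath_alcohol_estimate(alc_signal=0.025, co2_signal=0.48))  # ~0.25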

  16. Management algorithms for cervical cancer screening and precancer treatment for resource-limited settings.

    PubMed

    Basu, Partha; Meheus, Filip; Chami, Youssef; Hariprasad, Roopa; Zhao, Fanghui; Sankaranarayanan, Rengaswamy

    2017-07-01

    Management algorithms for screen-positive women in cervical cancer prevention programs have undergone substantial changes in recent years. The WHO strongly recommends human papillomavirus (HPV) testing for primary screening, if affordable, or if not, then visual inspection with acetic acid (VIA), and promotes treatment directly following screening through the screen-and-treat approach (one or two clinic visits). While VIA-positive women can be offered immediate ablative treatment based on certain eligibility criteria, HPV-positive women need to undergo subsequent VIA to determine their eligibility. Simpler ablative methods of treatment such as cryotherapy and thermal coagulation have been demonstrated to be effective and to have excellent safety profiles, and these have become integral parts of new management algorithms. The challenges faced by low-resource countries are many and include, from the management perspective, identifying an affordable point-of-care HPV detection test, minimizing over-treatment, and installing an effective information system to ensure high compliance to treatment and follow-up. © 2017 The Authors. International Journal of Gynecology & Obstetrics published by John Wiley & Sons Ltd on behalf of International Federation of Gynecology and Obstetrics.

  17. Developing a Screening Algorithm for Type II Diabetes Mellitus in the Resource-Limited Setting of Rural Tanzania.

    PubMed

    West, Caroline; Ploth, David; Fonner, Virginia; Mbwambo, Jessie; Fredrick, Francis; Sweat, Michael

    2016-04-01

    Noncommunicable diseases are on pace to outnumber infectious diseases as the leading cause of death in sub-Saharan Africa, yet many questions remain unanswered regarding effective methods of screening for type II diabetes mellitus (DM) in this resource-limited setting. We aimed to design a screening algorithm for type II DM that optimizes the sensitivity and specificity of identifying individuals with undiagnosed DM, as well as affordability to health systems and individuals. Baseline demographic and clinical data, including hemoglobin A1c (HbA1c), were collected from 713 participants using probability sampling of the general population. We used these data, along with model parameters obtained from the literature, to mathematically model 8 proposed DM screening algorithms, while optimizing the sensitivity and specificity using Monte Carlo and Latin Hypercube simulation. An algorithm that combines risk assessment and measurement of fasting blood glucose was found to be superior for the most resource-limited settings (sensitivity 68%, specificity 99%, and a cost of $2.94 per patient with DM identified). Incorporating HbA1c testing improves the sensitivity to 75.62%, but raises the cost per DM case identified to $6.04. The preferred algorithms are heavily biased toward diagnosing those with more severe cases of DM. Using basic risk assessment tools and fasting blood sugar testing in lieu of HbA1c testing in resource-limited settings could allow for significantly more feasible DM screening programs with reasonable sensitivity and specificity. Copyright © 2016 Southern Society for Clinical Investigation. Published by Elsevier Inc. All rights reserved.

  18. A utility/cost analysis of breast cancer risk prediction algorithms

    NASA Astrophysics Data System (ADS)

    Abbey, Craig K.; Wu, Yirong; Burnside, Elizabeth S.; Wunderlich, Adam; Samuelson, Frank W.; Boone, John M.

    2016-03-01

    Breast cancer risk prediction algorithms are used to identify subpopulations that are at increased risk for developing breast cancer. They can be based on many different sources of data, such as demographics, relatives with cancer, gene expression, and various phenotypic features such as breast density. Women who are identified as high risk may undergo a more extensive (and expensive) screening process that includes MRI or ultrasound imaging in addition to the standard full-field digital mammography (FFDM) exam. Given that there are many ways that risk prediction may be accomplished, it is of interest to evaluate them in terms of expected cost, which includes the costs of diagnostic outcomes. In this work we perform an expected-cost analysis of risk prediction algorithms that is based on a published model that includes the costs associated with diagnostic outcomes (true-positive, false-positive, etc.). We assume the existence of a standard screening method and an enhanced screening method with higher scan cost, higher sensitivity, and lower specificity. We then assess the expected cost of using a risk prediction algorithm to determine who gets the enhanced screening method, under the strong assumption that risk and diagnostic performance are independent. We find that if risk prediction leads to a high enough positive predictive value, it will be cost-effective regardless of the size of the subpopulation. Furthermore, in terms of the hit rate and false-alarm rate of the risk prediction algorithm, iso-cost contours are lines with slope determined by properties of the available diagnostic systems for screening.
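
    Under the paper's independence assumption, the expected per-woman cost can be written directly; all operating points and costs below are user-supplied placeholders, not values from the published model.

        def expected_cost(hit_rate, far, prev, scan_std, scan_enh,
                          c_fn, c_fp, sens_std, spec_std, sens_enh, spec_enh):
            """Expected per-woman cost when a risk-prediction algorithm routes
            high-risk women to an enhanced (costlier, more sensitive, less
            specific) screening exam. Assumes risk and diagnostic performance
            are independent, as in the paper."""
            p_enh = prev * hit_rate + (1 - prev) * far         # fraction routed
            scan = p_enh * scan_enh + (1 - p_enh) * scan_std
            miss = c_fn * prev * (hit_rate * (1 - sens_enh)
                                  + (1 - hit_rate) * (1 - sens_std))
            alarm = c_fp * (1 - prev) * (far * (1 - spec_enh)
                                         + (1 - far) * (1 - spec_std))
            return scan + miss + alarm

        print(expected_cost(0.7, 0.1, 0.005, 100, 600, 50000, 500,
                            0.75, 0.9, 0.9, 0.8))   # placeholder numbers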

  19. Predicting Sepsis Risk Using the "Sniffer" Algorithm in the Electronic Medical Record.

    PubMed

    Olenick, Evelyn M; Zimbro, Kathie S; DʼLima, Gabrielle M; Ver Schneider, Patricia; Jones, Danielle

    The Sepsis "Sniffer" Algorithm (SSA) has merit as a digital sepsis alert but should be considered an adjunct to versus an alternative for the Nurse Screening Tool (NST), given lower specificity and positive predictive value. The SSA reduced the risk of incorrectly categorizing patients at low risk for sepsis, detected sepsis high risk in half the time, and reduced redundant NST screens by 70% and manual screening hours by 64% to 72%. Preserving nurse hours expended on manual sepsis alerts may translate into time directed toward other patient priorities.

  20. A New Pulse Pileup Rejection Method Based on Position Shift Identification

    NASA Astrophysics Data System (ADS)

    Gu, Z.; Prout, D. L.; Taschereau, R.; Bai, B.; Chatziioannou, A. F.

    2016-02-01

    Pulse pileup events degrade the signal-to-noise ratio (SNR) of nuclear medicine data. When such events occur in multiplexed detectors, they cause spatial misposition, energy spectrum distortion and degraded timing resolution, which leads to image artifacts. Pulse pileup is pronounced in PETbox4, a benchtop PET scanner dedicated to high sensitivity and high resolution imaging of mice. In that system, the combination of high absolute sensitivity, long scintillator decay time (BGO) and highly multiplexed electronics leads to a significant fraction of pulse pileup, which is reached at lower total activity than in comparable instruments. In this manuscript, a new pulse pileup rejection method named position shift rejection (PSR) is introduced. The performance of PSR is compared with a conventional leading edge rejection (LER) method and with no pileup rejection implemented (NoPR). A comprehensive digital pulse library was developed for objective evaluation and optimization of the PSR and LER, in which pulse waveforms were recorded directly from real measurements, exactly representing the signals to be processed. Physical measurements including singles event acquisition, peak system sensitivity and a NEMA NU-4 image quality phantom were also performed in the PETbox4 system to validate and compare the different pulse pileup rejection methods. The evaluation of both physical measurements and model pulse trains demonstrated that the new PSR performs more accurate pileup event identification and avoids erroneous rejection of valid events. For the PETbox4 system, this improvement leads to a significant recovery of sensitivity at low count rates, amounting to about 1/4th of the expected true coincidence events, compared to the LER method. Furthermore, with the implementation of PSR, optimal image quality can be achieved near the peak noise equivalent count rate (NECR).
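
    A speculative sketch of the position-shift idea: compute the Anger-logic position from an early and a late integration window of the multiplexed channel signals and reject the event if the position moved. The window definitions and threshold are illustrative, not PETbox4 parameters.

        import numpy as np

        def position_shift_reject(channels_early, channels_late, max_shift=0.05):
            """Reject an event whose Anger-logic position differs between the
            early and late integration windows, a signature of pileup in
            multiplexed readout. Purely illustrative."""
            def pos(q):                      # q = (A, B, C, D) corner signals
                a, b, c, d = q
                s = a + b + c + d
                return ((a + b) - (c + d)) / s, ((a + c) - (b + d)) / s
            x0, y0 = pos(channels_early)
            x1, y1 = pos(channels_late)
            return np.hypot(x1 - x0, y1 - y0) > max_shift   # True -> reject

        print(position_shift_reject((10, 8, 3, 2), (9, 8, 4, 2)))  # True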

  1. Coupling between corotation and Lindblad resonances in the presence of secular precession rates

    NASA Astrophysics Data System (ADS)

    El Moutamid, Maryame; Sicardy, Bruno; Renner, Stéfan

    2014-03-01

    We investigate the dynamics of two satellites with masses μ and μ′ orbiting a massive central planet in a common plane, near a first-order mean motion resonance m+1:m (m integer). We consider only the resonant terms of first order in eccentricity in the disturbing potential of the satellites, plus the secular terms causing the orbital apsidal precessions. We obtain a two-degrees-of-freedom system associated with the two critical resonant angles φ = (m+1)λ′ − mλ − ϖ and φ′ = (m+1)λ′ − mλ − ϖ′, where λ and ϖ are the mean longitude and longitude of periapsis of μ, respectively, and where the primed quantities apply to μ′. We consider the special case where μ → 0 (restricted problem). The symmetry between the two angles φ and φ′ is then broken, leading to two different kinds of resonances, classically referred to as the corotation eccentric resonance (CER, associated with φ′) and the Lindblad eccentric resonance (LER, associated with φ). We write the four reduced equations of motion near the CER and LER, which form what we call the CoraLin model. This model depends upon only two dimensionless parameters that control the dynamics of the system: the distance D between the CER and LER, and a forcing parameter that includes both the mass and the orbital eccentricity of the disturbing satellite. Three regimes are found: for D = 0 the system is integrable; for D of order unity, it exhibits prominent chaotic regions; while for D large compared to 2, the behavior of the system is regular and can be qualitatively described using simple adiabatic invariant arguments. We apply this model to three recently discovered small Saturnian satellites dynamically linked to Mimas through first-order mean motion resonances: Aegaeon, Methone and Anthe. Poincaré surfaces of section reveal the dynamical structure of each orbit and its proximity to chaotic regions. This work may be useful for exploring various scenarios of resonant capture for those satellites.

  2. Unbiased roughness measurements: the key to better etch performance

    NASA Astrophysics Data System (ADS)

    Liang, Andrew; Mack, Chris; Sirard, Stephen; Liang, Chen-wei; Yang, Liu; Jiang, Justin; Shamma, Nader; Wise, Rich; Yu, Jengyi; Hymes, Diane

    2018-03-01

    Edge placement error (EPE) has become an increasingly critical metric to enable Moore's Law scaling. Stochastic variations, as characterized for lines by line width roughness (LWR) and line edge roughness (LER), are dominant factors in EPE and known to increase with the introduction of EUV lithography. However, despite recommendations from ITRS, NIST, and SEMI standards, the industry has not agreed upon a methodology to quantify these properties. Thus, differing methodologies applied to the same image often result in different roughness measurements and conclusions. To standardize LWR and LER measurements, Fractilia has developed an unbiased measurement that uses a raw unfiltered line scan to subtract out image noise and distortions. By using Fractilia's inverse linescan model (FILM) to guide development, we will highlight the key influences of roughness metrology on plasma-based resist smoothing processes. Test wafers were deposited to represent a 5 nm node EUV logic stack. The patterning stack consists of a core Si target layer with spin-on carbon (SOC) as the hardmask and spin-on glass (SOG) as the cap. Next, these wafers were exposed through an ASML NXE 3350B EUV scanner with an advanced chemically amplified resist (CAR). Afterwards, these wafers were etched through a variety of plasma-based resist smoothing techniques using a Lam Kiyo conductor etch system. Dense line and space patterns on the etched samples were imaged with advanced Hitachi CDSEMs, and the LER and LWR were measured with both Fractilia and an industry-standard roughness measurement package. By employing Fractilia to guide plasma-based etch development, we demonstrate that Fractilia produces accurate roughness measurements on resist, in contrast to the industry-standard measurement package. These results highlight the importance of subtracting out SEM image noise to obtain quicker developmental cycle times and lower target layer roughness.
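
    The core of noise subtraction can be illustrated with a variance-based sketch: measured roughness is the quadrature sum of true roughness and SEM noise, so subtracting an estimated noise variance recovers an unbiased value. Estimating the noise term itself (e.g., from the flat high-frequency floor of the power spectral density) is the harder step and is assumed done elsewhere; this is not Fractilia's actual FILM algorithm.

        import numpy as np

        def unbiased_lwr(linewidths, noise_sigma):
            """Noise-subtracted (unbiased) 3-sigma LWR. SEM metrology adds
            roughly white noise to each measured width, biasing the raw
            value upward; subtract the noise variance to correct."""
            biased_var = np.var(linewidths, ddof=1)
            true_var = max(biased_var - noise_sigma ** 2, 0.0)
            return 3.0 * np.sqrt(true_var)

        w = np.random.default_rng(4).normal(16.0, 1.2, 2048)         # nm, true sigma = 1.2
        w_meas = w + np.random.default_rng(5).normal(0, 0.8, 2048)   # add SEM noise
        print(unbiased_lwr(w_meas, noise_sigma=0.8))   # ~3.6 nm vs biased ~4.3 nm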

  3. Tear fluid proteomics multimarkers for diabetic retinopathy screening

    PubMed Central

    2013-01-01

    Background The aim of the project was to develop a novel method for diabetic retinopathy screening based on the examination of tear fluid biomarker changes. In order to evaluate the usability of protein biomarkers for pre-screening purposes, several different approaches were used, including machine learning algorithms. Methods All persons involved in the study had diabetes. Diabetic retinopathy (DR) was diagnosed by capturing 7-field fundus images, evaluated by two independent ophthalmologists. 165 eyes from 119 patients were examined; 55 were diagnosed as healthy and 110 showed signs of DR. Tear samples were taken from all eyes and state-of-the-art nano-HPLC coupled ESI-MS/MS mass spectrometry protein identification was performed on all samples. The applicability of protein biomarkers was evaluated by six different optimally parameterized machine learning algorithms: Support Vector Machine, Recursive Partitioning, Random Forest, Naive Bayes, Logistic Regression, and K-Nearest Neighbor. Results Of the six investigated machine learning algorithms, Recursive Partitioning proved to be the most accurate. The performance of the system realizing this algorithm reached 74% sensitivity and 48% specificity. Conclusions Protein biomarkers selected and classified with machine learning algorithms alone are at present not recommended for screening purposes because of low specificity and sensitivity values. This tool can potentially be used to improve the results of image processing methods as a complementary tool in automatic or semiautomatic systems. PMID:23919537
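
    The evaluation loop for one of the six classifiers can be sketched as below, using a decision tree as a stand-in for Recursive Partitioning; the feature matrix is random placeholder data, not the study's protein abundances.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(6)
        X = rng.normal(size=(165, 20))        # 165 eyes x 20 protein abundances
        y = rng.integers(0, 2, 165)           # 1 = DR, 0 = healthy (placeholder)

        pred = cross_val_predict(DecisionTreeClassifier(max_depth=3, random_state=0),
                                 X, y, cv=5)
        tp = np.sum((pred == 1) & (y == 1)); fn = np.sum((pred == 0) & (y == 1))
        tn = np.sum((pred == 0) & (y == 0)); fp = np.sum((pred == 1) & (y == 0))
        print(f"sensitivity={tp/(tp+fn):.2f}, specificity={tn/(tn+fp):.2f}")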

  4. Costs and consequences of automated algorithms versus manual grading for the detection of referable diabetic retinopathy.

    PubMed

    Scotland, G S; McNamee, P; Fleming, A D; Goatman, K A; Philip, S; Prescott, G J; Sharp, P F; Williams, G J; Wykes, W; Leese, G P; Olson, J A

    2010-06-01

    To assess the cost-effectiveness of an improved automated grading algorithm for diabetic retinopathy against a previously described algorithm, and in comparison with manual grading. Efficacy of the alternative algorithms was assessed using a reference graded set of images from three screening centres in Scotland (1253 cases with observable/referable retinopathy and 6333 individuals with mild or no retinopathy). Screening outcomes and grading and diagnosis costs were modelled for a cohort of 180 000 people, with prevalence of referable retinopathy at 4%. Algorithm (b), which combines image quality assessment with detection algorithms for microaneurysms (MA), blot haemorrhages and exudates, was compared with a simpler algorithm (a) (using image quality assessment and MA/dot haemorrhage (DH) detection), and the current practice of manual grading. Compared with algorithm (a), algorithm (b) would identify an additional 113 cases of referable retinopathy for an incremental cost of £68 per additional case. Compared with manual grading, automated grading would be expected to identify between 54 and 123 fewer referable cases, for a grading cost saving between £3834 and £1727 per case missed. Extrapolation modelling over a 20-year time horizon suggests manual grading would cost between £25,676 and £267,115 per additional quality-adjusted life year gained. Algorithm (b) is more cost-effective than the algorithm based on quality assessment and MA/DH detection. With respect to the value of introducing automated detection systems into screening programmes, automated grading operates within the recommended national standards in Scotland and is likely to be considered a cost-effective alternative to manual disease/no disease grading.

  5. SCREEN: A simple layperson administered screening algorithm in low resource international settings significantly reduces waiting time for critically ill children in primary healthcare clinics.

    PubMed

    Hansoti, Bhakti; Jenson, Alexander; Kironji, Antony G; Katz, Joanne; Levin, Scott; Rothman, Richard; Kelen, Gabor D; Wallis, Lee A

    2017-01-01

    In low resource settings, an inadequate number of trained healthcare workers and high volumes of children presenting to Primary Healthcare Centers (PHC) result in prolonged waiting times and significant delays in identifying and evaluating critically ill children. The Sick Children Require Emergency Evaluation Now (SCREEN) program, a simple six-question screening algorithm administered by lay healthcare workers, was developed in 2014 to rapidly identify critically ill children and to expedite their care at the point of entry into a clinic. We sought to determine the impact of SCREEN on waiting times for critically ill children after real-world implementation in Cape Town, South Africa. This is a prospective, observational implementation-effectiveness hybrid study that sought to determine: (1) the impact of SCREEN implementation on waiting times as the primary outcome measure, and (2) the effectiveness of the SCREEN tool in accurately identifying critically ill children when utilised by the QM, and QM adherence to the SCREEN algorithm, as secondary outcome measures. The study was conducted in two phases: Phase I, the control (pre-SCREEN implementation; three months in 2014), and Phase II (post-SCREEN implementation; two distinct three-month periods in 2016). In Phase I, 1600 (92.38%) of 1732 children presenting to 4 clinics had sufficient data for analysis and comprised the control sample. In Phase II, all 3383 of the children presenting to the 26 clinics during the sampling time frame had sufficient data for analysis. The proportion of critically ill children who saw a professional nurse within 10 minutes increased tenfold from 6.4% to 64% (Phase I to Phase II), and the median time to seeing a professional nurse was reduced from 100.3 minutes to 4.9 minutes (p < .001 for both comparisons). Overall, layperson screening compared to Integrated Management of Childhood Illnesses (IMCI) designation by a nurse had a sensitivity of 94.2% and a specificity of 88.1%, despite large variance in adherence to the SCREEN algorithm across clinics. The SCREEN program, when implemented in a real-world setting, can significantly reduce waiting times for critically ill children in PHCs; however, further work is required to improve the implementation of this innovative program.
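    The reported 94.2% sensitivity and 88.1% specificity come from the usual 2x2 comparison of layperson SCREEN triage against the nurse's IMCI designation. A minimal sketch with hypothetical counts (the abstract does not give the raw cell counts):

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Made-up counts chosen only to illustrate the calculation.
sens, spec = sens_spec(tp=130, fn=8, tn=2800, fp=380)
print(f"sensitivity={sens:.1%}, specificity={spec:.1%}")
```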

  6. Automated Age-related Macular Degeneration screening system using fundus images.

    PubMed

    Kunumpol, P; Umpaipant, W; Kanchanaranya, N; Charoenpong, T; Vongkittirux, S; Kupakanjana, T; Tantibundhit, C

    2017-07-01

    This work proposed an automated screening system for Age-related Macular Degeneration (AMD) that also distinguishes between the wet and dry types of AMD using fundus images, to assist ophthalmologists in eye disease screening and management. The algorithm employs contrast-limited adaptive histogram equalization (CLAHE) for image enhancement. Subsequently, the discrete wavelet transform (DWT) and locality sensitive discriminant analysis (LSDA) were used to extract features for a neural network model to classify the results. The results showed that the proposed algorithm was able to distinguish between normal eyes, dry AMD, and wet AMD with 98.63% sensitivity, 99.15% specificity, and 98.94% accuracy, suggesting promising potential as a medical support system for faster eye disease screening at lower costs.
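    The first two stages of this pipeline (CLAHE enhancement, then DWT feature extraction) can be sketched with standard libraries. The LSDA projection and the neural-network classifier are omitted, and the file name and parameter values below are assumptions, not the authors' settings:

```python
# Sketch of CLAHE + 2-D DWT feature extraction on a fundus image.
import cv2
import numpy as np
import pywt

img = cv2.imread("fundus.png", cv2.IMREAD_GRAYSCALE)   # hypothetical file name
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
enhanced = clahe.apply(img)

# Two-level DWT; summarize each detail sub-band by mean/std as a crude
# feature vector for a downstream classifier.
coeffs = pywt.wavedec2(enhanced.astype(float), "db4", level=2)
features = []
for level in coeffs[1:]:                                # (cH, cV, cD) per level
    for band in level:
        features += [float(np.abs(band).mean()), float(band.std())]
print(features)
```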

  7. Effects of gas flow rate on the etch characteristics of a low-k SiCOH film with an amorphous carbon mask in dual-frequency CF4/C4F8/Ar capacitively-coupled plasmas

    NASA Astrophysics Data System (ADS)

    Kwon, Bong-Soo; Lee, Hea-Lim; Lee, Nae-Eung; Kim, Chang-Young; Choi, Chi Kyu

    2013-01-01

    Highly selective nanoscale etching of a low-dielectric-constant (low-k) organosilicate (SiCOH) layer using a mask pattern of a chemical-vapor-deposited (CVD) amorphous carbon layer (ACL) was carried out in CF4/C4F8/Ar dual-frequency superimposed capacitively-coupled plasmas. The etching characteristics of the SiCOH layers, such as the etch rate, etch selectivity, critical dimension (CD), and line edge roughness (LER) during the plasma etching, were investigated by varying the C4F8 flow rate. The C4F8 gas flow rate was found primarily to control the degree of polymerization and to cause variations in the selectivity, CD, and LER of the patterned SiCOH layer. Process windows for ultra-high etch selectivity of the SiCOH layer to the CVD ACL are formed due to the disproportionate degrees of polymerization on the SiCOH and ACL surfaces.

  8. Posture effects on spontaneous limb movements, alternated stepping, and the leg extension response in neonatal rats

    PubMed Central

    Mendez-Gallardo, Valerie; Roberto, Megan E.; Kauer, Sierra D.; Brumley, Michele R.

    2015-01-01

    The development of postural control is considered an important factor for the expression of coordinated behavior such as locomotion. In the natural setting of the nest, newborn rat pups adapt their posture to perform behaviors of ecological relevance such as those related to suckling. The current study explores the role of posture in the expression of three behaviors in the newborn rat: spontaneous limb activity, locomotor-like stepping behavior, and the leg extension response (LER). One-day-old rat pups were tested in one of two postures – prone or supine – on each of these behavioral measures. Results showed that pups expressed more spontaneous activity while supine, more stepping while prone, and no differences in LER expression between the two postures. Together these findings show that posture affects the expression of newborn behavior patterns in different ways, and suggest that posture may act as a facilitator or a limiting factor in the expression of different behaviors during early development. PMID:26655784

  9. ORIGIN OF THE CHAOTIC MOTION OF THE SATURNIAN SATELLITE ATLAS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Renner, S.; Vienne, A.; Cooper, N. J.

    2016-05-01

    We revisit the dynamics of Atlas. Using Cassini ISS astrometric observations spanning 2004 February to 2013 August, Cooper et al. found evidence that Atlas is currently perturbed by both a 54:53 corotation eccentricity resonance (CER) and a 54:53 Lindblad eccentricity resonance (LER) with Prometheus. They demonstrated that the orbit of Atlas is chaotic, with a Lyapunov time of order 10 years, as a direct consequence of the coupled resonant interaction (CER/LER) with Prometheus. Here we investigate the interactions between the two resonances using the CoraLin analytical model, showing that the chaotic zone fills almost all the corotation sites occupied by the satellite's orbit. Four 70:67 apse-type mean motion resonances with Pandora are also overlapping, but these resonances have a much weaker effect. Frequency analysis allows us to highlight the coupling between the 54:53 resonances, and confirms that a simplified system including the perturbations due to Prometheus and Saturn's oblateness only captures the essential features of the dynamics.

  10. A new self-shielding method based on a detailed cross-section representation in the resolved energy domain

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saygin, H.; Hebert, A.

    The calculation of a dilution cross section σ̄e is the most important step in the self-shielding formalism based on the equivalence principle. If a dilution cross section that accurately characterizes the physical situation can be calculated, it can then be used for calculating the effective resonance integrals and obtaining accurate self-shielded cross sections. A new technique for the calculation of equivalent cross sections based on the formalism of Riemann integration in the resolved energy domain is proposed. This new method is compared to the generalized Stamm'ler method, which is also based on an equivalence principle, for a two-region cylindrical cell and for a small pressurized water reactor assembly in two dimensions. The accuracy of each computing approach is obtained using reference results from a fine-group slowing-down code named CESCOL. It is shown that the proposed method leads to slightly better performance than the generalized Stamm'ler approach.

  11. Thermal Vacuum Test of Ice as a Phase Change Material Integrated with a Radiator

    NASA Technical Reports Server (NTRS)

    Lee, Steve; Le, Hung; Leimkuehler, Thomas O.; Stephan, Ryan A.

    2009-01-01

    Water may be used as radiation shielding for Solar Particle Events (SPE) to protect crewmembers in the Lunar Electric Rover (LER). Because the water is already present for radiation protection, it could also provide a mass efficient solution to the vehicle's thermal control system. This water can be frozen by heat rejection from a radiator and used as a Phase Change Material (PCM) for thermal storage. Use of this water as a PCM can eliminate the need for a pumped fluid loop thermal control system as well as reduce the required size of the radiator. This paper describes the testing and analysis performed for the Rover Engineering Development Unit (REDU), a scaled-down version of a water PCM heat sink for the LER. The REDU was tested in a thermal-vacuum chamber at environmental temperatures similar to those of a horizontal radiator panel on the lunar surface. Testing included complete freeze and melt cycles along with scaled transient heat load profiles simulating a 24-hour day for the rover.
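    For sizing intuition only (the figures below are illustrative, not from the REDU test): the latent heat absorbed by a melting water PCM is

$$ Q = m\,L_f, \qquad L_f \approx 334\ \mathrm{kJ/kg}, $$

    so, for example, 50 kg of shielding water buffers roughly $50 \times 334 \approx 16.7\ \mathrm{MJ}$ of heat per freeze-melt cycle, which is the thermal storage that allows the radiator to be downsized.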

  12. Shot noise limit of chemically amplified resists with photodecomposable quenchers used for extreme ultraviolet lithography

    NASA Astrophysics Data System (ADS)

    Kozawa, Takahiro; Santillan, Julius Joseph; Itani, Toshiro

    2017-06-01

    In lithography using high-energy photons such as extreme ultraviolet (EUV) radiation, the shot noise of photons is a critical issue. Shot noise is a cause of line edge/width roughness (LER/LWR) and stochastic defect generation, and it limits resist performance. In this study, the effects of photodecomposable quenchers were investigated from the viewpoint of the shot noise limit. The latent images of line-and-space patterns with 11 nm half-pitch were calculated using a Monte Carlo method. In the simulation, the effect of secondary electron blur was eliminated to clarify the shot noise limits regarding stochastic phenomena such as LER. The shot noise limit for chemically amplified resists with acid generators and photodecomposable quenchers was approximately the same as that for chemically amplified resists with acid generators and conventional quenchers when the total sensitizer concentration was the same. The effect of photodecomposable quenchers on the shot noise limit was essentially the same as that of acid generators.
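    The core of the shot-noise argument is that photon arrivals per resist volume are Poisson-distributed. A minimal Monte Carlo sketch follows; the dose, wavelength, and pixel size are illustrative assumptions, not the paper's simulation parameters:

```python
# At 13.5 nm, each photon carries ~1.5e-17 J, so even a 20 mJ/cm^2 dose
# delivers only ~14 photons per nm^2 on average; Poisson sampling then
# gives the pixel-to-pixel dose fluctuation that drives LER.
import numpy as np

h, c, lam = 6.626e-34, 3.0e8, 13.5e-9
photon_energy = h * c / lam                  # J per EUV photon
dose = 20e-3 * 1e4                           # 20 mJ/cm^2 -> J/m^2
mean_per_nm2 = dose * (1e-9) ** 2 / photon_energy

rng = np.random.default_rng(1)
counts = rng.poisson(mean_per_nm2, size=(64, 64))    # 64 x 64 nm patch
rel_noise = counts.std() / counts.mean()
print(f"mean={counts.mean():.1f} photons/nm^2, relative noise={rel_noise:.2f}")
```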

  13. Origin of the Chaotic Motion of the Saturnian Satellite Atlas

    NASA Astrophysics Data System (ADS)

    Renner, S.; Cooper, N. J.; El Moutamid, M.; Sicardy, B.; Vienne, A.; Murray, C. D.; Saillenfest, M.

    2016-05-01

    We revisit the dynamics of Atlas. Using Cassini ISS astrometric observations spanning 2004 February to 2013 August, Cooper et al. found evidence that Atlas is currently perturbed by both a 54:53 corotation eccentricity resonance (CER) and a 54:53 Lindblad eccentricity resonance (LER) with Prometheus. They demonstrated that the orbit of Atlas is chaotic, with a Lyapunov time of order 10 years, as a direct consequence of the coupled resonant interaction (CER/LER) with Prometheus. Here we investigate the interactions between the two resonances using the CoraLin analytical model, showing that the chaotic zone fills almost all the corotation sites occupied by the satellite's orbit. Four 70:67 apse-type mean motion resonances with Pandora are also overlapping, but these resonances have a much weaker effect. Frequency analysis allows us to highlight the coupling between the 54:53 resonances, and confirms that a simplified system including the perturbations due to Prometheus and Saturn's oblateness only captures the essential features of the dynamics.

  14. Effect of a culture-based screening algorithm on tuberculosis incidence in immigrants and refugees bound for the United States: a population-based cross-sectional study.

    PubMed

    Liu, Yecai; Posey, Drew L; Cetron, Martin S; Painter, John A

    2015-03-17

    Before 2007, immigrants and refugees bound for the United States were screened for tuberculosis (TB) by a smear-based algorithm that could not diagnose smear-negative/culture-positive TB. In 2007, the Centers for Disease Control and Prevention implemented a culture-based algorithm. To evaluate the effect of the culture-based algorithm on preventing the importation of TB to the United States by immigrants and refugees from foreign countries. Population-based, cross-sectional study. Panel physician sites for overseas medical examination. Immigrants and refugees with TB. Comparison of the increase of smear-negative/culture-positive TB cases diagnosed overseas among immigrants and refugees by the culture-based algorithm with the decline of reported cases among foreign-born persons within 1 year after arrival in the United States from 2007 to 2012. Of the 3 212 421 arrivals of immigrants and refugees from 2007 to 2012, a total of 1 650 961 (51.4%) were screened by the smear-based algorithm and 1 561 460 (48.6%) were screened by the culture-based algorithm. Among the 4032 TB cases diagnosed by the culture-based algorithm, 2195 (54.4%) were smear-negative/culture-positive. Before implementation (2002 to 2006), the annual number of reported cases among foreign-born persons within 1 year after arrival was relatively constant (range, 1424 to 1626 cases; mean, 1504 cases) but decreased from 1511 to 940 cases during implementation (2007 to 2012). During the same period, the annual number of smear-negative/culture-positive TB cases diagnosed overseas among immigrants and refugees bound for the United States by the culture-based algorithm increased from 4 to 629. This analysis did not control for the decline in new arrivals of nonimmigrant visitors to the United States and the decrease of incidence of TB in their countries of origin. Implementation of the culture-based algorithm may have substantially reduced the incidence of TB among newly arrived, foreign-born persons in the United States. Primary funding source: none.

  15. Screening Algorithm to Guide Decisions on Whether to Conduct a Health Impact Assessment

    EPA Pesticide Factsheets

    Provides a visual aid in the form of a decision algorithm that helps guide discussions about whether to proceed with an HIA. The algorithm can help structure, standardize, and document the decision process.

  16. Effect of TE Mode Power on the PEP II LER BPM System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ng, Cho-K

    2011-08-26

    The beam chamber of the PEP-II B-Factory Low Energy Ring (LER) arc sections is connected to an antechamber for the absorption of synchrotron radiation on discrete photon stops. The presence of the antechamber substantially reduces the cutoff frequency of the vacuum chamber and, in particular, allows the propagation of higher-order-mode (HOM) TE power generated by beamline components at the BPM signal processing frequency. Calculations of the transmission properties of the TE mode in different sections of the vacuum chamber show that the power is trapped between widely separated bellows in the arc sections. Because of the narrow signal bandwidth and weak coupling of the TE mode to the BPM buttons, the noise contributed by the HOM TE power will not produce a noticeable effect on the BPM position signal voltage. The LER arc vacuum chamber employs an antechamber with a discrete photon stop for absorption of synchrotron radiation and with pumps for maintaining pressure below 10 nTorr [1]. The horizontal dimensions of the antechambers at the pumping chamber section and the magnet chamber section are larger than or comparable to that of the beam chamber. Because of the increase in the horizontal dimension, the cutoff frequency of the TE10-like mode (in rectangular coordinates) of the vacuum chamber is considerably reduced and, in particular, is less than the BPM signal processing frequency of 952 MHz. TE power propagating in the vacuum chamber will penetrate through the BPM buttons and will affect the pickup signal if its magnitude is not properly controlled. It is the purpose of this note to clarify various issues pertaining to this problem. TE power is generated when the beam passes a non-cylindrically symmetric beamline component such as the RF cavity, the injection region, the IR crotch, and the IP region. The beampipes connected to these components have TE cutoff frequencies greater than 952 MHz (for example, the TE cutoff frequency of the RF cavity beampipe is 1.8 GHz), and hence no TE power at this frequency propagates from these components. TE power can also be generated by the scattering of TM power through these beamline components. Since the cutoff frequency of the TM mode is in general higher than that of the TE mode, this mechanism is not pertinent to the problem related to the BPM signal. Consequently, the TE power that needs to be considered is mainly generated by components of the LER arc vacuum chamber, where the TE cutoff frequency is less than the BPM processing frequency.
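    The cutoff argument can be made concrete with the standard rectangular-waveguide formula for the TE10 mode,

$$ f_{c,\mathrm{TE}_{10}} = \frac{c}{2a}, $$

    where $a$ is the wide transverse dimension. With illustrative numbers (not PEP-II chamber drawings): $a \approx 9$ cm gives $f_c \approx 1.7$ GHz, safely above 952 MHz, while an antechamber that roughly doubles the horizontal dimension to $a \approx 18$ cm lowers the cutoff to about 0.83 GHz, allowing 952 MHz TE power to propagate.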

  17. Implementing a Multiple Criteria Model Base in Co-Op with a Graphical User Interface Generator

    DTIC Science & Technology

    1993-09-23

    [Table-of-contents excerpt: chapter on PROMETHEE. A. The Algorithms: 1. Basic algorithm of PROMETHEE I and PROMETHEE II (a. use of the algorithm in PROMETHEE I; b. use of the algorithm in PROMETHEE II); 2. Algorithm of PROMETHEE V. B. Screen designs of PROMETHEE: 1. PROMETHEE I and PROMETHEE II.]

  18. Flexible methods for segmentation evaluation: results from CT-based luggage screening.

    PubMed

    Karimi, Seemeen; Jiang, Xiaoqian; Cosman, Pamela; Martz, Harry

    2014-01-01

    Imaging systems used in aviation security include segmentation algorithms in an automatic threat recognition pipeline. The segmentation algorithms evolve in response to emerging threats and changing performance requirements. Analysis of segmentation algorithms' behavior, including the nature of errors and feature recovery, facilitates their development. However, evaluation methods from the literature provide limited characterization of the segmentation algorithms. Our aim was to develop segmentation evaluation methods that measure systematic errors such as oversegmentation and undersegmentation, outliers, and overall errors. The methods must measure feature recovery and allow us to prioritize segments. We developed two complementary evaluation methods using statistical techniques and information theory. We also created a semi-automatic method to define ground truth from 3D images. We applied our methods to evaluate five segmentation algorithms developed for CT luggage screening. We validated our methods with synthetic problems and an observer evaluation. Both methods selected the same best segmentation algorithm. Human evaluation confirmed the findings. The measurement of systematic errors and prioritization helped in understanding the behavior of each segmentation algorithm. Our evaluation methods allow us to measure and explain the accuracy of segmentation algorithms.
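    One way to realize over/undersegmentation counts of the kind described above is simple overlap bookkeeping between label volumes. A rough sketch with synthetic labels and an arbitrary overlap threshold; this is not the paper's statistical or information-theoretic formulation:

```python
# Count ground-truth segments split across several predicted segments
# (oversegmentation) and predicted segments spanning several ground-truth
# segments (undersegmentation). Label volumes are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(4)
gt = rng.integers(0, 5, size=(32, 32, 32))      # ground-truth labels (0 = bg)
pred = rng.integers(0, 6, size=(32, 32, 32))    # predicted labels   (0 = bg)

def split_counts(a, b, min_overlap=50):
    """For each nonzero label in a, count labels in b overlapping it."""
    out = {}
    for lab in np.unique(a[a > 0]):
        overlaps = b[(a == lab) & (b > 0)]
        labs, counts = np.unique(overlaps, return_counts=True)
        out[int(lab)] = int((counts >= min_overlap).sum())
    return out

over = sum(v > 1 for v in split_counts(gt, pred).values())   # split GT segments
under = sum(v > 1 for v in split_counts(pred, gt).values())  # merging predictions
print(f"oversegmented GT segments: {over}, undersegmenting predictions: {under}")
```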

  19. Field-expedient screening and injury risk algorithm categories as predictors of noncontact lower extremity injury.

    PubMed

    Lehr, M E; Plisky, P J; Butler, R J; Fink, M L; Kiesel, K B; Underwood, F B

    2013-08-01

    In athletics, efficient screening tools are sought to curb the rising number of noncontact injuries and associated health care costs. The authors hypothesized that an injury prediction algorithm that incorporates movement screening performance, demographic information, and injury history can accurately categorize risk of noncontact lower extremity (LE) injury. One hundred eighty-three collegiate athletes were screened during the preseason. The test scores and demographic information were entered into an injury prediction algorithm that weighted the evidence-based risk factors. Athletes were then prospectively followed for noncontact LE injury. Subsequent analysis collapsed the groupings into two risk categories: Low (normal and slight) and High (moderate and substantial). Using these groups and noncontact LE injuries, relative risk (RR), sensitivity, specificity, and likelihood ratios were calculated. Forty-two subjects sustained a noncontact LE injury over the course of the study. Athletes identified as High Risk (n = 63) were at a greater risk of noncontact LE injury (27/63) during the season [RR 3.4; 95% confidence interval, 2.0 to 6.0]. These results suggest that an injury prediction algorithm composed of performance on efficient, low-cost, field-ready tests can help identify individuals at elevated risk of noncontact LE injury. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
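    The risk figures here can be reproduced from the stated counts (42 total injuries, 27 of them among the 63 High Risk athletes, out of 183 screened); the Low Risk cell counts below are derived from those totals rather than quoted from the paper:

```python
import math

a, n1 = 27, 63              # injured / total, High Risk group
b, n0 = 42 - 27, 183 - 63   # injured / total, Low Risk group (derived)
rr = (a / n1) / (b / n0)
se_log_rr = math.sqrt(1/a - 1/n1 + 1/b - 1/n0)     # standard log-RR method
lo, hi = (math.exp(math.log(rr) + s * 1.96 * se_log_rr) for s in (-1, 1))
print(f"RR = {rr:.1f} (95% CI {lo:.1f} to {hi:.1f})")  # RR = 3.4 (2.0 to 6.0)
```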

  20. GPURFSCREEN: a GPU based virtual screening tool using random forest classifier.

    PubMed

    Jayaraj, P B; Ajay, Mathias K; Nufail, M; Gopakumar, G; Jaleel, U C A

    2016-01-01

    In-silico methods are an integral part of the modern drug discovery paradigm. Virtual screening, an in-silico method, is used to refine data models and reduce the chemical space on which wet-lab experiments need to be performed. Virtual screening of a ligand data model requires large-scale computation, making it a highly time-consuming task. This process can be sped up by implementing parallelized algorithms on a Graphical Processing Unit (GPU). Random Forest is a robust classification algorithm that can be employed in virtual screening. A ligand-based virtual screening tool (GPURFSCREEN) that uses random forests on GPU systems is proposed and evaluated in this paper. This tool produces optimized results at a lower execution time for large bioassay data sets. The quality of the results produced by our tool on the GPU is the same as that in a regular serial environment. Considering the magnitude of data to be screened, the parallelized virtual screening has a significantly lower running time at high throughput. The proposed parallel tool outperforms its serial counterpart by successfully screening billions of molecules in the training and prediction phases.
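    A CPU stand-in for this workflow (train a random forest on fingerprints of assayed ligands, then rank a screening library by predicted activity) is sketched below; the data are synthetic, and the GPU parallelization that is GPURFSCREEN's actual contribution is not shown:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(7)
X_train = rng.integers(0, 2, size=(2000, 1024), dtype=np.uint8)  # fingerprints
y_train = rng.integers(0, 2, size=2000)                          # active flags
library = rng.integers(0, 2, size=(10000, 1024), dtype=np.uint8)

clf = RandomForestClassifier(n_estimators=100, n_jobs=-1).fit(X_train, y_train)
scores = clf.predict_proba(library)[:, 1]       # predicted activity probability
top = np.argsort(scores)[::-1][:100]            # 100 best-scoring molecules
print(top[:10])
```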

  1. Development of Type 2 Diabetes Mellitus Phenotyping Framework Using Expert Knowledge and Machine Learning Approach.

    PubMed

    Kagawa, Rina; Kawazoe, Yoshimasa; Ida, Yusuke; Shinohara, Emiko; Tanaka, Katsuya; Imai, Takeshi; Ohe, Kazuhiko

    2017-07-01

    Phenotyping is an automated technique that can be used to distinguish patients based on electronic health records. Because phenotyping can improve the quality of medical care and advance type 2 diabetes mellitus (T2DM) research, the demand for T2DM phenotyping has been increasing. Some existing phenotyping algorithms are not sufficiently accurate for screening or for identifying clinical research subjects. We propose a practical phenotyping framework using both expert knowledge and a machine learning approach to develop 2 phenotyping algorithms: one for screening, the other for identifying research subjects. We employ expert knowledge as rules to exclude obvious control patients and machine learning to increase accuracy for complicated patients. We developed phenotyping algorithms on the basis of our framework and performed binary classification to determine whether a patient has T2DM. To facilitate development of practical phenotyping algorithms, this study introduces new evaluation metrics: area under the precision-sensitivity curve (AUPS) with a high sensitivity and AUPS with a high positive predictive value. The proposed phenotyping algorithms based on our framework show higher performance than baseline algorithms. Our proposed framework can be used to develop 2 types of phenotyping algorithms depending on the tuning approach: one for screening, the other for identifying research subjects. We develop a novel phenotyping framework that can be easily implemented on the basis of proper evaluation metrics, which are in accordance with users' objectives. The phenotyping algorithms based on our framework are useful for extraction of T2DM patients in retrospective studies.
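    One plausible reading of the AUPS-with-high-sensitivity metric is the area under the precision-recall curve restricted to a high-recall region. The sketch below uses recall >= 0.9 as an assumed threshold and synthetic scores; it is an interpretation, not the authors' exact definition:

```python
import numpy as np
from sklearn.metrics import precision_recall_curve

rng = np.random.default_rng(3)
y_true = rng.integers(0, 2, size=1000)
y_score = y_true * 0.3 + rng.normal(0.5, 0.25, size=1000)   # synthetic scores

precision, recall, _ = precision_recall_curve(y_true, y_score)
mask = recall >= 0.9
p, r = precision[mask][::-1], recall[mask][::-1]            # recall ascending
# trapezoidal integration of precision over the retained recall range
aups_high_sens = float(np.sum(np.diff(r) * (p[1:] + p[:-1]) / 2))
print(f"AUPS (recall >= 0.9): {aups_high_sens:.3f}")
```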

  2. Results of delayed triage by HPV testing and cytology in the Norwegian Cervical Cancer Screening Programme.

    PubMed

    Haldorsen, Tor; Skare, Gry Baadstrand; Ursin, Giske; Bjørge, Tone

    2015-02-01

    High-risk human papillomavirus (hrHPV) testing was added to the cytology triage of women with equivocal screening smears in the Norwegian programme for cervical cancer screening in 2005. In this population-based observational before-and-after study, we assessed the effect of changing the screening algorithm. In the periods before and after the change, 75 852 and 66 616 women, respectively, were eligible for triage, i.e. they had smear results of unsatisfactory, atypical squamous cells of undetermined significance (ASC-US), or low-grade squamous intraepithelial lesion (LSIL) at routine screening. The triage was delayed, as supplementary testing started six months after the initial screening. The groups were compared with respect to results of triage and the later three-year cumulative incidence of cervical intraepithelial neoplasia grade 2 or worse (CIN2+). Before and after the change in the screening algorithm, 5.2% (3964/75 852) and 8.1% (5417/66 616) of women, respectively, were referred to colposcopy. Among women referred to colposcopy, the cumulative incidence of CIN2+ (positive predictive value of referral) increased from 42.0% [95% confidence interval (CI): 40.3 - 43.7%] in the period with cytology only to 48.0% (95% CI 46.6 - 49.4%) after the start of HPV testing. For women recalled to ordinary screening, the three-year cumulative incidence decreased from 2.7% (95% CI 2.5 - 2.9%) to 1.0% (95% CI 0.9 - 1.2%) during the same period. Among women with LSIL at routine screening and HPV testing in triage, 52.5% (1976/3766) were HPV positive. The new algorithm with HPV testing implemented in 2005 resulted in an increased rate of referral to colposcopy, but also in better risk stratification with respect to precancerous disease.

  3. When drug discovery meets web search: Learning to Rank for ligand-based virtual screening.

    PubMed

    Zhang, Wei; Ji, Lijuan; Chen, Yanan; Tang, Kailin; Wang, Haiping; Zhu, Ruixin; Jia, Wei; Cao, Zhiwei; Liu, Qi

    2015-01-01

    The rapid increase in the emergence of novel chemical substances presents a substantial demand for more sophisticated computational methodologies for drug discovery. In this study, the idea of Learning to Rank from web search was applied to drug virtual screening, which has the following unique capabilities: (1) applicability to identifying compounds on novel targets when there is not enough training data available for these targets, and (2) integration of heterogeneous data when compound affinities are measured on different platforms. A standard pipeline was designed to carry out Learning to Rank in virtual screening. Six Learning to Rank algorithms were investigated based on two public datasets collected from the Binding Database and the newly published Community Structure-Activity Resource benchmark dataset. The results demonstrate that Learning to Rank is an efficient computational strategy for drug virtual screening, particularly due to its novel use in cross-target virtual screening and heterogeneous data integration. To the best of our knowledge, we have introduced here the first application of Learning to Rank in virtual screening. The experiment workflow and algorithm assessment designed in this study will provide a standard protocol for other similar studies. All the datasets as well as the implementations of the Learning to Rank algorithms are available at http://www.tongji.edu.cn/~qiliu/lor_vs.html. Graphical Abstract: The analogy between web search and ligand-based drug discovery.
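    The pairwise flavor of Learning to Rank (one of several algorithm families; the abstract does not say which six were used) reduces ranking to classifying feature differences. A minimal RankSVM-style sketch on synthetic compound data:

```python
# "Compound i binds better than compound j" becomes a binary label on the
# descriptor difference X[i] - X[j]; the learned weight vector then scores
# and ranks unseen compounds. Data are synthetic placeholders.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(5)
X = rng.normal(size=(200, 16))                    # compound descriptors
affinity = 2 * X[:, 0] + rng.normal(size=200)     # synthetic affinities

pairs, labels = [], []
for i, j in rng.integers(0, 200, size=(2000, 2)):
    if i == j:
        continue
    pairs.append(X[i] - X[j])
    labels.append(int(affinity[i] > affinity[j]))

model = LinearSVC(max_iter=10000).fit(np.array(pairs), np.array(labels))
scores = X @ model.coef_.ravel()                  # ranking scores per compound
print("top-ranked compounds:", np.argsort(scores)[::-1][:5])
```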

  4. Has universal screening with Xpert® MTB/RIF increased the proportion of multidrug-resistant tuberculosis cases diagnosed in a routine operational setting?

    PubMed Central

    Dunbar, Rory; Caldwell, Judy; Lombard, Carl; Beyers, Nulda

    2017-01-01

    Setting Primary health services in Cape Town, South Africa, where the introduction of Xpert® MTB/RIF (Xpert) enabled simultaneous screening for tuberculosis (TB) and drug susceptibility in all presumptive cases. Study aim To compare the proportion of TB cases with drug susceptibility tests (DST) undertaken and multidrug-resistant tuberculosis (MDR-TB) diagnosed pre-treatment and during the course of first-line treatment in the previous smear/culture-based and the newly introduced Xpert-based algorithms. Methods TB cases identified in a previous stepped-wedge study of TB yield in five sub-districts over seven one-month time-points prior to, during, and after the introduction of the Xpert-based algorithm were analysed. We used a combination of patient identifiers to identify all drug susceptibility tests undertaken from electronic laboratory records. Differences in the proportions of DST undertaken and MDR-TB cases diagnosed between algorithms were estimated using a binomial regression model. Results Pre-treatment, the probability of having a DST undertaken (RR = 1.82; p < 0.001) and of being diagnosed with MDR-TB (RR = 1.42; p < 0.001) was higher in the Xpert-based algorithm than in the smear/culture-based algorithm. For cases evaluated during the course of first-line TB treatment, there was no significant difference in the proportion with DST undertaken (RR = 1.02; p = 0.848) or MDR-TB diagnosed (RR = 1.12; p = 0.678) between algorithms. Conclusion Universal screening for drug susceptibility in all presumptive TB cases in the Xpert-based algorithm resulted in a higher overall proportion of MDR-TB cases being diagnosed and is an important strategy in reducing transmission. The previous strategy of only screening new TB cases when first-line treatment failed did not compensate for cases missed pre-treatment. PMID:28199375

  5. Shape-Based Virtual Screening with Volumetric Aligned Molecular Shapes

    PubMed Central

    Koes, David Ryan; Camacho, Carlos J.

    2014-01-01

    Shape-based virtual screening is an established and effective method for identifying small molecules that are similar in shape and function to a reference ligand. We describe a new method of shape-based virtual screening, volumetric aligned molecular shapes (VAMS). VAMS uses efficient data structures to encode and search molecular shapes. We demonstrate that VAMS is an effective method for shape-based virtual screening and that it can be successfully used as a pre-filter to accelerate more computationally demanding search algorithms. Unique to VAMS is a novel minimum/maximum shape constraint query for precisely specifying the desired molecular shape. Shape constraint searches in VAMS are particularly efficient, and millions of shapes can be searched in a fraction of a second. We compare the performance of VAMS with two other shape-based virtual screening algorithms on a benchmark of 102 protein targets consisting of more than 32 million molecular shapes and find that VAMS provides a competitive trade-off between run-time performance and virtual screening performance. PMID:25049193
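    The min/max shape-constraint query has a natural realization on boolean voxel grids: a candidate passes iff it covers every voxel of the minimum shape and stays inside the maximum shape. A sketch with random stand-in grids (VAMS's actual data structures are more compact than dense grids):

```python
import numpy as np

rng = np.random.default_rng(11)
max_shape = rng.random((32, 32, 32)) < 0.6                 # allowed volume
min_shape = max_shape & (rng.random((32, 32, 32)) < 0.3)   # required core

def passes(candidate, min_shape, max_shape):
    covers_min = not np.any(min_shape & ~candidate)   # no required voxel missed
    inside_max = not np.any(candidate & ~max_shape)   # no voxel outside envelope
    return covers_min and inside_max

candidate = min_shape | (max_shape & (rng.random((32, 32, 32)) < 0.5))
print(passes(candidate, min_shape, max_shape))        # True by construction
```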

  6. Care of the suicidal pediatric patient in the ED: a case study.

    PubMed

    Schmid, Alexis M; Truog, Amy W; Damian, Frances J

    2011-09-01

    The suicide rate among children and adolescents has increased worldwide over the past few decades, and many who attempt suicide are first seen at EDs. At Children's Hospital Boston (CHB), an algorithm, the Risk of Suicidality Clinical Practice Algorithm, has been developed to ensure evidence-based care supported by best practice guidelines. The authors of this article provide an overview of pediatric suicide and suicide attempts; describe the screening, assessment, and interventions used at CHB; and discuss the nursing implications. An illustrative case study is also provided. Keywords: algorithm, Asperger's syndrome, attempted suicide, bullying, emergency, emergency department, patient safety, pediatrics, pediatric suicide, suicide, suicide screening, triage.

  7. Effect of Antiretroviral Therapy on the Diagnostic Accuracy of Symptom Screening for Intensified Tuberculosis Case Finding in a South African HIV Clinic

    PubMed Central

    Rangaka, Molebogeng X.; Wilkinson, Robert J.; Glynn, Judith R.; Boulle, Andrew; van Cutsem, Gilles; Goliath, Rene; Mathee, Shaheed; Maartens, Gary

    2012-01-01

    Background. Current symptom screening algorithms for intensified tuberculosis case finding or prior to isoniazid preventive therapy (IPT) in patients infected with human immunodeficiency virus (HIV) were derived from antiretroviral-naive cohorts. There is a need to validate screening algorithms in patients on antiretroviral therapy (ART). Methods. We performed cross-sectional evaluation of the diagnostic accuracy of symptom screening, including the World Health Organization (WHO) algorithm, to rule out tuberculosis in HIV-infected individuals pre-ART and on ART undergoing screening prior to IPT. Results. A total of 1429 participants, 54% on ART, had symptom screening and a sputum culture result available. Culture-positive tuberculosis was diagnosed in 126 patients (8.8%, 95% confidence interval [CI], 7.4%–10.4%). The WHO symptom screen in the on-ART compared with the pre-ART group had a lower sensitivity (23.8% vs 47.6%), but higher specificity (94.4% vs 79.8%). The effect of ART was independent of CD4+ count in multivariable analyses. The posttest probability of tuberculosis following a negative WHO screen was 8.9% (95% CI, 7.4%–10.8%) and 4.4% (95% CI, 3.7%–5.2%) for the pre-ART and on-ART groups, respectively. Addition of body mass index to the WHO screen significantly improved discriminatory ability in both ART groups, which was further improved by adding CD4 count and ART duration. Conclusions. The WHO symptom screen has poor sensitivity, especially among patients on ART, in a clinic where regular tuberculosis screening is practiced. Consequently, a significant proportion of individuals with tuberculosis would inadvertently be placed on isoniazid monotherapy despite high negative predictive values. Until more sensitive methods of ruling out tuberculosis are established, it would be prudent to do a sputum culture prior to IPT where this is feasible. PMID:22955441
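    For orientation, the posttest probability after a negative screen follows from the pretest prevalence $p$, sensitivity $\mathrm{Se}$, and specificity $\mathrm{Sp}$:

$$ P(\mathrm{TB} \mid \text{screen negative}) = \frac{p\,(1-\mathrm{Se})}{p\,(1-\mathrm{Se}) + (1-p)\,\mathrm{Sp}}. $$

    The reported 8.9% and 4.4% figures correspond to the pre-ART and on-ART groups, each with its own prevalence and operating characteristics, not the pooled 8.8% prevalence quoted above.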

  8. Discovery of Novel HIV-1 Integrase Inhibitors Using QSAR-Based Virtual Screening of the NCI Open Database.

    PubMed

    Ko, Gene M; Garg, Rajni; Bailey, Barbara A; Kumar, Sunil

    2016-01-01

    Quantitative structure-activity relationship (QSAR) models can be used as a predictive tool for virtual screening of chemical libraries to identify novel drug candidates. The aims of this paper were to report the results of a study performed for descriptor selection, QSAR model development, and virtual screening for identifying novel HIV-1 integrase inhibitor drug candidates. First, three evolutionary algorithms were compared for descriptor selection: differential evolution-binary particle swarm optimization (DE-BPSO), binary particle swarm optimization, and genetic algorithms. Next, three QSAR models were developed from an ensemble of multiple linear regression, partial least squares, and extremely randomized trees models. A comparison of the performances of the three evolutionary algorithms showed that DE-BPSO has a significant improvement over the other two algorithms. QSAR models developed in this study were used in consensus as a predictive tool for virtual screening of the NCI Open Database containing 265,242 compounds to identify potential novel HIV-1 integrase inhibitors. Six compounds were predicted to be highly active (pIC50 > 6) by each of the three models. The use of a hybrid evolutionary algorithm (DE-BPSO) for descriptor selection and QSAR model development in drug design is a novel approach. Consensus modeling may provide better predictivity by taking into account a broader range of chemical properties within the data set conducive for inhibition that may be missed by an individual model. The six compounds identified provide novel drug candidate leads in the design of next-generation HIV-1 integrase inhibitors targeting drug-resistant mutant viruses.

  9. The PHQ-PD as a Screening Tool for Panic Disorder in the Primary Care Setting in Spain

    PubMed Central

    Wood, Cristina Mae; Ruíz-Rodríguez, Paloma; Tomás-Tomás, Patricia; Gracia-Gracia, Irene; Dongil-Collado, Esperanza; Iruarrizaga, M. Iciar

    2016-01-01

    Introduction Panic disorder is a common anxiety disorder and is highly prevalent in Spanish primary care centres. The use of validated tools can improve the detection of panic disorder in primary care populations, thus enabling referral for specialized treatment. The aim of this study is to determine the accuracy of the Patient Health Questionnaire-Panic Disorder (PHQ-PD) as a screening and diagnostic tool for panic disorder in Spanish primary care centres. Method We compared the psychometric properties of the PHQ-PD to the reference standard, the Structured Clinical Interview for DSM-IV Axis I Disorders (SCID-I) interview. General practitioners referred 178 patients who completed the entire PHQ test, including the PHQ-PD, to undergo the SCID-I. The sensitivity, specificity, positive and negative predictive values and positive and negative likelihood ratios of the PHQ-PD were assessed. Results The operating characteristics of the PHQ-PD are moderate. The best cut-off score was 5 (sensitivity .77, specificity .72). Modifications to the questionnaire's algorithms improved test characteristics (sensitivity .77, specificity .72) compared to the original algorithm. The screening question alone yielded the highest sensitivity score (.83). Conclusion Although the modified algorithm of the PHQ-PD only yielded moderate results as a diagnostic test for panic disorder, it was better than the original. Using only the first question of the PHQ-PD showed the best psychometric properties (sensitivity). Based on these findings, we suggest the use of the screening questions for screening purposes and the modified algorithm for diagnostic purposes. PMID:27525977

  10. An adaptive algorithm for the detection of microcalcifications in simulated low-dose mammography.

    PubMed

    Treiber, O; Wanninger, F; Führ, H; Panzer, W; Regulla, D; Winkler, G

    2003-02-21

    This paper uses the task of microcalcification detection as a benchmark problem to assess the potential for dose reduction in x-ray mammography. We present the results of a newly developed algorithm for detection of microcalcifications as a case study for a typical commercial film-screen system (Kodak Min-R 2000/2190). The first part of the paper deals with the simulation of dose reduction for film-screen mammography based on a physical model of the imaging process. Use of a more sensitive film-screen system is expected to result in additional smoothing of the image. We introduce two different models of that behaviour, called moderate and strong smoothing. We then present an adaptive, model-based microcalcification detection algorithm. Comparing detection results with ground-truth images obtained under the supervision of an expert radiologist allows us to establish the soundness of the detection algorithm. We measure the performance on the dose-reduced images in order to assess the loss of information due to dose reduction. It turns out that the smoothing behaviour has a strong influence on detection rates. For moderate smoothing, a dose reduction by 25% has no serious influence on the detection results, whereas a dose reduction by 50% already entails a marked deterioration of the performance. Strong smoothing generally leads to an unacceptable loss of image quality. The test results emphasize the impact of the more sensitive film-screen system and its characteristics on the problem of assessing the potential for dose reduction in film-screen mammography. The general approach presented in the paper can be adapted to fully digital mammography.

  11. An adaptive algorithm for the detection of microcalcifications in simulated low-dose mammography

    NASA Astrophysics Data System (ADS)

    Treiber, O.; Wanninger, F.; Führ, H.; Panzer, W.; Regulla, D.; Winkler, G.

    2003-02-01

    This paper uses the task of microcalcification detection as a benchmark problem to assess the potential for dose reduction in x-ray mammography. We present the results of a newly developed algorithm for detection of microcalcifications as a case study for a typical commercial film-screen system (Kodak Min-R 2000/2190). The first part of the paper deals with the simulation of dose reduction for film-screen mammography based on a physical model of the imaging process. Use of a more sensitive film-screen system is expected to result in additional smoothing of the image. We introduce two different models of that behaviour, called moderate and strong smoothing. We then present an adaptive, model-based microcalcification detection algorithm. Comparing detection results with ground-truth images obtained under the supervision of an expert radiologist allows us to establish the soundness of the detection algorithm. We measure the performance on the dose-reduced images in order to assess the loss of information due to dose reduction. It turns out that the smoothing behaviour has a strong influence on detection rates. For moderate smoothing, a dose reduction by 25% has no serious influence on the detection results, whereas a dose reduction by 50% already entails a marked deterioration of the performance. Strong smoothing generally leads to an unacceptable loss of image quality. The test results emphasize the impact of the more sensitive film-screen system and its characteristics on the problem of assessing the potential for dose reduction in film-screen mammography. The general approach presented in the paper can be adapted to fully digital mammography.

  12. Cockpit Display of Traffic Information and the Measurement of Pilot Workload: An Annotated Bibliography

    DTIC Science & Technology

    1982-02-01

    ...fragile data bases. With correction of this deficiency and further research on time-sharing behavior or function interlacing, these methods should prove to ... The parameters used were (1) amplitude - as a measure of amplitude variations, the difference between two successive heart rate values; (2) frequency - as a ...

  13. U.S. EPA, Pesticide Product Label, , 06/30/1976

    EPA Pesticide Factsheets

    2011-04-14


  14. Interfacial fluctuations of block copolymers: a coarse-grain molecular dynamics simulation study.

    PubMed

    Srinivas, Goundla; Swope, William C; Pitera, Jed W

    2007-12-13

    The lamellar and cylindrical phases of block copolymers have a number of technological applications, particularly when they occur in supported thin films. One such application is block copolymer lithography, the use of these materials to subdivide or enhance submicrometer patterns defined by optical or electron beam methods. A key parameter of all lithographic methods is the line edge roughness (LER), because the electronic or optical activities of interest are sensitive to small pattern variations. While mean-field models provide a partial picture of the LER and interfacial width expected for the block interface in a diblock copolymer, these models lack chemical detail. To complement mean-field approaches, we have carried out coarse-grain molecular dynamics simulations on model poly(ethyleneoxide)-poly(ethylethylene) (PEO-PEE) lamellae, exploring the influence of chain length and hypothetical chemical modifications on the observed line edge roughness. As expected, our simulations show that increasing chi (the Flory-Huggins parameter) is the most direct route to decreased roughness, although the addition of strong specific interactions at the block interface can also produce smoother patterns.
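    For orientation, a commonly quoted strong-segregation mean-field estimate (Helfand-type; not a result of this paper) for the width $w$ of the block interface, in terms of the statistical segment length $b$, is

$$ w \approx \frac{2b}{\sqrt{6\chi}}, $$

    which makes explicit why increasing $\chi$ sharpens the interface and, with it, reduces the line edge roughness.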

  15. Thermal Vacuum Test of Ice as a Phase Change Material Integrated with a Radiator

    NASA Technical Reports Server (NTRS)

    Lee, Steve A.; Leimkuehler, Thomas O.; Stephan, Ryan; Le, Hung V.

    2010-01-01

    Water may be used as radiation shielding for Solar Particle Events (SPE) to protect crewmembers in the Lunar Electric Rover (LER). Because the water is already present for radiation protection, it could also provide a mass efficient solution to the vehicle's thermal control system. This water can be frozen by heat rejection from a radiator and used as a Phase Change Material (PCM) for thermal storage. Use of this water as a PCM can eliminate the need for a pumped fluid loop thermal control system as well as reduce the required size of the radiator. This paper describes the testing and analysis performed for the Rover Engineering Development Unit (REDU), a scaled-down version of a water PCM heat sink for the LER. The REDU was tested in a thermal-vacuum chamber at environmental temperatures similar to those of a horizontal radiator panel on the lunar surface. Testing included complete freeze and melt cycles along with scaled transient heat load profiles simulating a 24-hour day for the rover.

  16. Screening for cystic fibrosis in New York State: considerations for algorithm improvements.

    PubMed

    Kay, Denise M; Maloney, Breanne; Hamel, Rhonda; Pearce, Melissa; DeMartino, Lenore; McMahon, Rebecca; McGrath, Emily; Krein, Lea; Vogel, Beth; Saavedra-Matiz, Carlos A; Caggana, Michele; Tavakoli, Norma P

    2016-02-01

    Newborn screening for cystic fibrosis (CF), a chronic progressive disease affecting mucus viscosity, has been beneficial in both improving life expectancy and the quality of life for individuals with CF. In New York State from 2007 to 2012, screening for CF involved measuring immunoreactive trypsinogen (IRT) levels in dried blood spots from newborns using the IMMUCHEM(™) Blood Spot Trypsin-MW ELISA kit. Any specimen in the top 5% IRT level underwent DNA analysis using the InPlex(®) CF Molecular Test. Of the 1.48 million newborns screened during the 6-year time period, 7631 babies were referred for follow-up. CF was confirmed in 251 cases, and 94 cases were diagnosed with CF transmembrane conductance regulator-related metabolic syndrome or possible CF. Nine reports of false negatives were made to the program. Variation in daily average IRT was observed depending on the season (4-6 ng/ml) and kit lot (<3 ng/ml), supporting the use of a floating cutoff. The screening method had a sensitivity of 96.5%, specificity of 99.6%, positive predictive value of 4.5%, and negative predictive value of 99.5%. Considerations for CF screening algorithms should include IRT variations resulting from age at specimen collection, sex, race/ethnicity, season, and manufacturer kit lots. Measuring IRT level in dried blood spots is the first-tier screen for CF. Current algorithms for CF screening lead to substantial false-positive referral rates. IRT values were affected by age of infant when the specimen was collected, race/ethnicity and sex of infant, and changes in seasons and manufacturer kit lots. The prevalence of CF in NYS is 1 in 4200, with the highest prevalence in White infants (1 in 2600) and the lowest in Black infants (1 in 15,400).
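    A floating cutoff of the kind motivated above can be as simple as a per-batch percentile. A sketch with synthetic IRT values; the program's actual top-5% rule may be computed over a different window and with different distributional assumptions:

```python
# Flag the top 5% of each day's specimens rather than applying a fixed
# ng/ml threshold, which absorbs seasonal and kit-lot drift in mean IRT.
import numpy as np

rng = np.random.default_rng(2024)
daily_irt = rng.lognormal(mean=3.0, sigma=0.4, size=500)   # ng/ml, one day
cutoff = np.percentile(daily_irt, 95)                      # floating cutoff
flagged = daily_irt >= cutoff
print(f"cutoff={cutoff:.1f} ng/ml, flagged={flagged.sum()} of {daily_irt.size}")
```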

  17. Sensitivity and specificity of automated analysis of single-field non-mydriatic fundus photographs by Bosch DR Algorithm-Comparison with mydriatic fundus photography (ETDRS) for screening in undiagnosed diabetic retinopathy.

    PubMed

    Bawankar, Pritam; Shanbhag, Nita; K, S Smitha; Dhawan, Bodhraj; Palsule, Aratee; Kumar, Devesh; Chandel, Shailja; Sood, Suneet

    2017-01-01

    Diabetic retinopathy (DR) is a leading cause of blindness among working-age adults. Early diagnosis through effective screening programs is likely to improve vision outcomes. The ETDRS seven-standard-field 35-mm stereoscopic color retinal imaging of the dilated eye is elaborate, requires mydriasis, and is unsuitable for screening. We evaluated an image analysis application for the automated diagnosis of DR from non-mydriatic single-field images. Patients suffering from diabetes for at least 5 years were included if they were 18 years or older. Patients already diagnosed with DR were excluded. Physiologic mydriasis was achieved by placing the subjects in a dark room. Images were captured using a Bosch Mobile Eye Care fundus camera. The images were analyzed by the Retinal Imaging Bosch DR Algorithm for the diagnosis of DR. All subjects also subsequently underwent pharmacological mydriasis and ETDRS imaging. Non-mydriatic and mydriatic images were read by ophthalmologists. The ETDRS readings were used as the gold standard for calculating the sensitivity and specificity of the software. 564 consecutive subjects (1128 eyes) were recruited from six centers in India. Each subject was evaluated at a single outpatient visit. Forty-four of 1128 images (3.9%) could not be read by the algorithm and were categorized as inconclusive. In four subjects, neither eye provided an acceptable image; these four subjects were excluded from the analysis. This left 560 subjects for analysis (1084 eyes). The algorithm correctly diagnosed 531 of 560 cases. The sensitivity, specificity, and positive and negative predictive values were 91%, 97%, 94%, and 95%, respectively. The Bosch DR Algorithm shows favorable sensitivity and specificity in diagnosing DR from non-mydriatic images and can greatly simplify screening for DR. This also has major implications for telemedicine in the use of screening for retinopathy in patients with diabetes mellitus.

  18. The Effect of Mounting Vortex Generators on the DTU 10MW Reference Wind Turbine Blade

    NASA Astrophysics Data System (ADS)

    Skrzypiński, Witold; Gaunaa, Mac; Bak, Christian

    2014-06-01

    The aim of the current work is to analyze possible advantages of mounting Vortex Generators (VG's) on a wind turbine blade. Specifically, the project aims at investigating at which radial sections of the DTU 10 MW Reference Wind Turbine blade it is most beneficial to mount the VG's in order to increase the Annual Energy Production (AEP) under realistic conditions. The present analysis was carried out in several steps: (1) The clean two-dimensional airfoil characteristics were first modified to emulate the effect of all possible combinations of VG's (1% high at suction side x/c=0.2-0.25) and two Leading Edge Roughness (LER) values along the whole blade span. (2) The combinations from Step 1, including the clean case, were subsequently modified to take into account three-dimensional effects. (3) BEM computations were carried out to determine the aerodynamic rotor performance using each of the datasets from Step 2 along the whole blade span for all wind speeds in the turbine control scheme. (4) Employing the assumption of radial independence between sections of the blades, and using the results of the BEM computations described in Step 3, it is possible to determine for each radial position independently whether it is beneficial to install VG's in the smooth and LER cases, respectively. The results indicated that surface roughness that corresponds to degradation of the power curve may to some extent be mitigated by installation of VG's. The present results also indicated that the optimal VG configuration in terms of maximizing AEP depends on the degree of severity of the LER. This is because, depending on the condition of the blade surface, installing VG's on an incorrect blade span or too far out on the blade may cause a loss in AEP. The results also indicated that the worse the condition of the blade surface, the more gain may be obtained from the installation of VG's.
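    Steps 3-4 amount to weighting each candidate power curve by the site wind distribution and comparing annual energy. A toy sketch follows, with Rayleigh winds, placeholder power curves, and an assumed +1% below-rated gain from VG's; none of these values are DTU 10 MW data:

```python
import numpy as np

v = np.linspace(4.0, 25.0, 200)                  # wind speeds, m/s

def rayleigh_pdf(v, v_mean=8.0):
    s = v_mean * np.sqrt(2.0 / np.pi)
    return (v / s**2) * np.exp(-(v**2) / (2 * s**2))

area = np.pi * 89.0**2                           # rotor area, assumed ~178 m dia
p_clean = np.clip(0.5 * 1.225 * area * 0.45 * v**3, None, 10e6)  # W, toy curve
p_vg = np.clip(1.01 * p_clean, None, 10e6)       # assumed +1% below rated power

def aep_mwh(power_w):
    integrand = power_w * rayleigh_pdf(v)
    mean_power = np.sum(np.diff(v) * (integrand[1:] + integrand[:-1]) / 2)
    return 8760.0 * mean_power / 1e6             # hours/year -> MWh

print(f"AEP clean: {aep_mwh(p_clean):.0f} MWh, with VGs: {aep_mwh(p_vg):.0f} MWh")
```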

  19. Changes in cloud and aerosol cover (1980-2006) from reflectivity time series using SeaWiFS, N7-TOMS, EP-TOMS, SBUV-2, and OMI radiance data

    NASA Astrophysics Data System (ADS)

    Herman, J. R.; Labow, G.; Hsu, N. C.; Larko, D.

    2009-01-01

    The amount of solar radiation reflected back to space or reaching the Earth's surface is primarily governed by the amount of cloud cover and, to a much lesser extent, by Rayleigh scattering, aerosols, and various absorbing gases (e.g., O3, NO2, H2O). A useful measure of the effect of cloud plus aerosol cover is given by the amount that the 331 nm Lambert Equivalent Reflectivity (LER) of a scene exceeds the surface reflectivity for snow/ice-free scenes after Rayleigh scattering has been removed. Twenty-eight years of reflectivity data are available by overlapping data from several satellites: N7 (Nimbus 7, TOMS; 331 nm) from 1979 to 1992, SBUV-2 series (Solar Backscatter Ultraviolet, NOAA; 331 nm) 1985 to 2007, EP (Earth-Probe, TOMS; 331 nm) 1997 to 2006, SW (SeaWiFS; 412 nm) 1998 to 2006, and OMI (Ozone Measuring Instrument; 331 nm) 2004-2007. Only N7 and SW have a sufficiently long data record, Sun-synchronous orbits, and are adequately calibrated for long-term reflectivity trend estimation. Reflectivity data derived from these instruments and the SBUV-2 series are compared during the overlapping years. Key issues in determining long-term reflectivity changes that have occurred during the N7 and SW operating periods are discussed. The largest reflectivity changes in the 412 nm SW LER and 331 nm EP LER are found to occur near the equator and are associated with a large El Nino-Southern Oscillation event. Most other changes that have occurred are regional, such as the apparent cloud decrease over northern Europe since 1998. The fractional occurrence (fraction of days) of high reflectivity values over Hudson Bay, Canada (snow/ice and clouds) appears to have decreased when comparing reflectivity data from 1980 to 1992 to 1997-2006, suggesting shorter duration of ice in Hudson Bay since 1980.

  20. The LER/LWR metrology challenge for advance process control through 3D-AFM and CD-SEM

    NASA Astrophysics Data System (ADS)

    Faurie, P.; Foucher, J.; Foucher, A.-L.

    2009-12-01

    The continuous shrinkage of microelectronic device dimensions has reached such a level, with typical gate lengths in advanced R&D below 20 nm combined with the introduction of new architectures (FinFET, double gate, ...) and new materials (porous interconnect materials, 193 nm immersion resists, metal gate materials, high-k materials, ...), that new process parameters have to be well understood and well monitored to guarantee sufficient production yield in the near future. Among these parameters are the critical dimensions (CD) together with the sidewall angle (SWA) values, the line edge roughness (LER), and the line width roughness (LWR). Thus, a new metrology challenge has appeared recently, which consists in measuring the fabricated patterns on wafers "accurately" in addition to measuring them in a repeatable way. Therefore, a great effort has to be made on existing techniques such as CD-SEM, scatterometry, and 3D-AFM in order to develop them along these two criteria: repeatability and accuracy. In this paper, we will compare the 3D-AFM and CD-SEM techniques as a means to measure LER and LWR on silicon and 193 nm resist, and we will point out the impact of the CD-SEM on the material during measurement. Indeed, depending on the material type, the interaction between the electron beam and the material, or between the AFM tip and the material, can vary considerably and consequently can generate measurement bias. The first results tend to show that, depending on the CD-SEM conditions (magnification, number of acquisition frames), the final outputs can vary over a large range, and therefore that accuracy in such measurements is really not obvious to obtain. On the basis of results obtained on various materials presenting standard sidewall roughness, we will show the limits of each technique and will propose different ways to improve them in order to fulfil advanced roadmap requirements for the development of the next IC generation.
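    For reference, LER and LWR are conventionally reported as $3\sigma$ quantities: with $x_i$ the measured positions of one edge, $\hat{x}_i$ the fitted straight edge, and $\sigma_w$ the standard deviation of the local line width $w_i$,

$$ \mathrm{LER} = 3\sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(x_i-\hat{x}_i\right)^2}, \qquad \mathrm{LWR} = 3\,\sigma_w . $$

    This is why edge-detection bias in the CD-SEM, or tip-shape convolution in the 3D-AFM, feeds directly into the reported roughness numbers.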

  1. Strain and sex differences in puberty onset and the effects of THC administration on weight gain and brain volumes.

    PubMed

    Keeley, R J; Trow, J; McDonald, R J

    2015-10-01

    The use of recreational marijuana is widespread and frequently begins and persists through adolescence. Some research has shown negative consequences of adolescent marijuana use, but this is not seen across studies, and certain factors, like genetic background and sex, may influence the results. It is critical to identify which characteristics predispose an individual to the negative consequences of chronic adolescent marijuana exposure on brain health and behavior. To this end, using males and females of two strains of rats, Long-Evans hooded (LER) and Wistar (WR) rats, we explored whether these anatomically and behaviorally dimorphic strains demonstrated differences in puberty onset and strain-specific effects of adolescent exposure to Δ9-tetrahydrocannabinol (THC), the main psychoactive component of marijuana. Daily 5 mg/kg treatment began on the day of puberty onset and continued for 14 days. Of particular interest were metrics of growth and volumetric estimates of brain areas involved in cognition that contain high densities of cannabinoid receptors, including the hippocampus and its subregions, the amygdala, and the frontal cortex. Brain volumetrics were analyzed immediately following the treatment period. LER and WR females started puberty at different ages, but no strain differences were observed in brain volumes. THC decreased weight gain throughout the treatment period for all groups. Only the hippocampus and some of its subregions were affected by THC, and increases in volume with THC administration were observed exclusively in females, regardless of strain. Long-term treatment with THC did not affect all individuals equally; females displayed evidence of increased sensitivity to the effects of THC, and by extension, marijuana. Identifying differences in the adolescent physiology of WR and LER rats could help determine the cause of strain and sex differences in the brain and behavior of adults and help to refine the use of animal models in marijuana research. Copyright © 2015 IBRO. Published by Elsevier Ltd. All rights reserved.

  2. Changes in Cloud and Aerosol Cover (1980-2006) from Reflectivity Time Series Using SeaWiFS, N7-TOMS, EP-TOMS, SBUV-2, and OMI Radiance Data

    NASA Technical Reports Server (NTRS)

    Herman, J. R.; Labow, G.; Hsu, N. C.; Larko, D.

    2009-01-01

    The amount of solar radiation reflected back to space or reaching the Earth's surface is primarily governed by the amount of cloud cover and, to a much lesser extent, by Rayleigh scattering, aerosols, and various absorbing gases (e.g., O3, NO2, H2O). A useful measure of the effect of cloud plus aerosol cover is given by the amount that the 331 nm Lambert Equivalent Reflectivity (LER) of a scene exceeds the surface reflectivity for snow/ice-free scenes after Rayleigh scattering has been removed. Twenty-eight years of reflectivity data are available by overlapping data from several satellites: N7 (Nimbus 7, TOMS; 331 nm) from 1979 to 1992, SBUV-2 series (Solar Backscatter Ultraviolet, NOAA; 331 nm) 1985 to 2007, EP (Earth-Probe, TOMS; 331 nm) 1997 to 2006, SW (SeaWiFS; 412 nm) 1998 to 2006, and OMI (Ozone Monitoring Instrument; 331 nm) 2004-2007. Only N7 and SW have sufficiently long data records and Sun-synchronous orbits, and are adequately calibrated for long-term reflectivity trend estimation. Reflectivity data derived from these instruments and the SBUV-2 series are compared during the overlapping years. Key issues in determining long-term reflectivity changes that have occurred during the N7 and SW operating periods are discussed. The largest reflectivity changes in the 412 nm SW LER and 331 nm EP LER are found to occur near the equator and are associated with a large El Niño-Southern Oscillation event. Most other changes that have occurred are regional, such as the apparent cloud decrease over northern Europe since 1998. The fractional occurrence (fraction of days) of high reflectivity values over Hudson Bay, Canada (snow/ice and clouds) appears to have decreased when comparing reflectivity data from 1980-1992 with data from 1997-2006, suggesting a shorter duration of ice in Hudson Bay since 1980.

  3. Use of machine learning to improve autism screening and diagnostic instruments: effectiveness, efficiency, and multi-instrument fusion

    PubMed Central

    Bone, Daniel; Bishop, Somer; Black, Matthew P.; Goodwin, Matthew S.; Lord, Catherine; Narayanan, Shrikanth S.

    2016-01-01

    Background Machine learning (ML) provides novel opportunities for human behavior research and clinical translation, yet its application can have noted pitfalls (Bone et al., 2015). In this work, we fastidiously utilize ML to derive autism spectrum disorder (ASD) instrument algorithms in an attempt to improve upon widely-used ASD screening and diagnostic tools. Methods The data consisted of Autism Diagnostic Interview-Revised (ADI-R) and Social Responsiveness Scale (SRS) scores for 1,264 verbal individuals with ASD and 462 verbal individuals with non-ASD developmental or psychiatric disorders (DD), split at age 10. Algorithms were created via a robust ML classifier, support vector machine (SVM), while targeting best-estimate clinical diagnosis of ASD vs. non-ASD. Parameter settings were tuned in multiple levels of cross-validation. Results The created algorithms were more effective (higher performing) than current algorithms, were tunable (sensitivity and specificity can be differentially weighted), and were more efficient (achieving near-peak performance with five or fewer codes). Results from ML-based fusion of ADI-R and SRS are reported. We present a screener algorithm for below (above) age 10 that reached 89.2% (86.7%) sensitivity and 59.0% (53.4%) specificity with only five behavioral codes. Conclusions ML is useful for creating robust, customizable instrument algorithms. In a unique dataset comprised of controls with other difficulties, our findings highlight limitations of current caregiver-report instruments and indicate possible avenues for improving ASD screening and diagnostic tools. PMID:27090613

  4. Use of machine learning to improve autism screening and diagnostic instruments: effectiveness, efficiency, and multi-instrument fusion.

    PubMed

    Bone, Daniel; Bishop, Somer L; Black, Matthew P; Goodwin, Matthew S; Lord, Catherine; Narayanan, Shrikanth S

    2016-08-01

    Machine learning (ML) provides novel opportunities for human behavior research and clinical translation, yet its application can have noted pitfalls (Bone et al., 2015). In this work, we fastidiously utilize ML to derive autism spectrum disorder (ASD) instrument algorithms in an attempt to improve upon widely used ASD screening and diagnostic tools. The data consisted of Autism Diagnostic Interview-Revised (ADI-R) and Social Responsiveness Scale (SRS) scores for 1,264 verbal individuals with ASD and 462 verbal individuals with non-ASD developmental or psychiatric disorders, split at age 10. Algorithms were created via a robust ML classifier, support vector machine, while targeting best-estimate clinical diagnosis of ASD versus non-ASD. Parameter settings were tuned in multiple levels of cross-validation. The created algorithms were more effective (higher performing) than the current algorithms, were tunable (sensitivity and specificity can be differentially weighted), and were more efficient (achieving near-peak performance with five or fewer codes). Results from ML-based fusion of ADI-R and SRS are reported. We present a screener algorithm for below (above) age 10 that reached 89.2% (86.7%) sensitivity and 59.0% (53.4%) specificity with only five behavioral codes. ML is useful for creating robust, customizable instrument algorithms. In a unique dataset comprised of controls with other difficulties, our findings highlight the limitations of current caregiver-report instruments and indicate possible avenues for improving ASD screening and diagnostic tools. © 2016 Association for Child and Adolescent Mental Health.

  5. An exploration of crowdsourcing citation screening for systematic reviews

    PubMed Central

    Mortensen, Michael L.; Adam, Gaelen P.; Trikalinos, Thomas A.; Kraska, Tim

    2017-01-01

    Systematic reviews are increasingly used to inform health care decisions, but are expensive to produce. We explore the use of crowdsourcing (distributing tasks to untrained workers via the web) to reduce the cost of screening citations. We used Amazon Mechanical Turk as our platform and 4 previously conducted systematic reviews as examples. For each citation, workers answered 4 or 5 questions that were equivalent to the eligibility criteria. We aggregated responses from multiple workers into an overall decision to include or exclude the citation using 1 of 9 algorithms and compared the performance of these algorithms to the corresponding decisions of trained experts. The most inclusive algorithm (designating a citation as relevant if any worker did) identified 95% to 99% of the citations that were ultimately included in the reviews while excluding 68% to 82% of irrelevant citations. Other algorithms increased the fraction of irrelevant articles excluded at some cost to the inclusion of relevant studies. Crowdworkers completed screening in 4 to 17 days, costing $460 to $2220, a cost reduction of up to 88% compared to trained experts. Crowdsourcing may represent a useful approach to reducing the cost of identifying literature for systematic reviews. PMID:28677322
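
    The aggregation step described above is simple enough to state directly; a minimal sketch assuming Boolean worker responses, where the "any worker" rule is the most inclusive of the nine algorithms and a majority rule stands in for the stricter variants:

        def include_citation(votes, rule="any"):
            # votes: one boolean per crowdworker (True = citation looks relevant)
            if rule == "any":       # most inclusive rule from the abstract
                return any(votes)
            if rule == "majority":  # a stricter aggregation, for comparison
                return sum(votes) > len(votes) / 2
            raise ValueError(f"unknown rule: {rule}")

        print(include_citation([False, True, False]))              # True
        print(include_citation([False, True, False], "majority"))  # False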

  6. Real-time free-viewpoint DIBR for large-size 3D LED

    NASA Astrophysics Data System (ADS)

    Wang, NengWen; Sang, Xinzhu; Guo, Nan; Wang, Kuiru

    2017-10-01

    Three-dimensional (3D) display technologies have made great progress in recent years, and lenticular-array-based 3D display is a relatively mature technology that is the most likely to be commercialized. In naked-eye 3D display, the screen size is one of the most important factors affecting the viewing experience. In order to construct a large-size naked-eye 3D display system, an LED display is used. However, pixel misalignment is an inherent defect of the LED screen, which degrades the rendering quality. To address this issue, an efficient image synthesis algorithm is proposed. The Texture-Plus-Depth (T+D) format is chosen for the display content, and a modified Depth Image Based Rendering (DIBR) method is proposed to synthesize new views. In order to achieve real-time performance, the whole algorithm is implemented on a GPU. With state-of-the-art hardware and the efficient algorithm, a naked-eye 3D display system with an LED screen size of 6 m × 1.8 m is achieved. Experiments show that the algorithm can process 43-view 3D video with 4K × 2K resolution in real time on the GPU, and a vivid 3D experience is perceived.

  7. An Automatic Image Processing System for Glaucoma Screening

    PubMed Central

    Alodhayb, Sami; Lakshminarayanan, Vasudevan

    2017-01-01

    Horizontal and vertical cup-to-disc ratios are the most crucial parameters used clinically to detect glaucoma or monitor its progress and are manually evaluated from retinal fundus images of the optic nerve head. Due to the scarcity of glaucoma experts and the growing glaucoma population, automatically calculated horizontal and vertical cup-to-disc ratios (HCDR and VCDR, respectively) can be useful for glaucoma screening. We report on two algorithms to calculate the HCDR and VCDR. In the algorithms, level set and inpainting techniques were developed for segmenting the disc, while thresholding using a Type-II fuzzy approach was developed for segmenting the cup. The results from the algorithms were verified against the manual markings of images from a dataset of glaucomatous images (retinal fundus images for glaucoma analysis (RIGA dataset)) by six ophthalmologists. The algorithm's accuracy for HCDR and VCDR combined was 74.2%. Only the accuracy of manual markings by one ophthalmologist was higher than the algorithm's accuracy. The algorithm's best agreement was with the markings by ophthalmologist number 1, in 230 (41.8%) of the total tested images. PMID:28947898
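
    Once cup and disc masks are available from the segmentation steps described above, the HCDR and VCDR reduce to ratios of bounding extents; a minimal sketch under that assumption (not the published implementation):

        import numpy as np

        def cup_to_disc_ratios(cup_mask, disc_mask):
            # Boolean 2-D masks; returns (HCDR, VCDR) as extent ratios.
            def extents(mask):
                rows = np.flatnonzero(mask.any(axis=1))
                cols = np.flatnonzero(mask.any(axis=0))
                return cols[-1] - cols[0] + 1, rows[-1] - rows[0] + 1
            cup_w, cup_h = extents(np.asarray(cup_mask, bool))
            disc_w, disc_h = extents(np.asarray(disc_mask, bool))
            return cup_w / disc_w, cup_h / disc_h

        cup = np.zeros((100, 100), bool); cup[40:60, 42:58] = True
        disc = np.zeros((100, 100), bool); disc[25:75, 30:70] = True
        print(cup_to_disc_ratios(cup, disc))  # (0.4, 0.4)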

  8. Mean curvature and texture constrained composite weighted random walk algorithm for optic disc segmentation towards glaucoma screening.

    PubMed

    Panda, Rashmi; Puhan, N B; Panda, Ganapati

    2018-02-01

    Accurate optic disc (OD) segmentation is an important step in obtaining cup-to-disc ratio-based glaucoma screening using fundus imaging. It is a challenging task because of the subtle OD boundary, blood vessel occlusion and intensity inhomogeneity. In this Letter, the authors propose an improved version of the random walk algorithm for OD segmentation to tackle such challenges. The algorithm incorporates the mean curvature and Gabor texture energy features to define the new composite weight function to compute the edge weights. Unlike the deformable model-based OD segmentation techniques, the proposed algorithm remains unaffected by curve initialisation and local energy minima problem. The effectiveness of the proposed method is verified with DRIVE, DIARETDB1, DRISHTI-GS and MESSIDOR database images using the performance measures such as mean absolute distance, overlapping ratio, dice coefficient, sensitivity, specificity and precision. The obtained OD segmentation results and quantitative performance measures show robustness and superiority of the proposed algorithm in handling the complex challenges in OD segmentation.
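
    Random-walker segmentation typically converts per-pixel feature differences into edge weights through a Gaussian kernel; the sketch below shows one plausible form of a composite weight over intensity, mean curvature, and Gabor texture energy. The beta values and feature scaling are assumptions for illustration, not the Letter's exact formulation.

        import numpy as np

        def composite_edge_weight(f_i, f_j, betas=(90.0, 10.0, 10.0)):
            # f_i, f_j: feature vectors for neighboring pixels, e.g.
            # (intensity, mean curvature, Gabor texture energy).
            diff_sq = (np.asarray(f_i, float) - np.asarray(f_j, float)) ** 2
            return float(np.exp(-np.dot(betas, diff_sq)))

        w = composite_edge_weight([0.60, 0.10, 0.30], [0.55, 0.12, 0.28])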

  9. Haemoglobinopathy diagnosis: algorithms, lessons and pitfalls.

    PubMed

    Bain, Barbara J

    2011-09-01

    Diagnosis of haemoglobinopathies, including thalassaemias, can result from either a clinical suspicion of a disorder of globin chain synthesis or from follow-up of an abnormality detected during screening. Screening may be carried out as part of a well defined screening programme or be an ad hoc or opportunistic test. Screening may be preoperative, neonatal, antenatal, preconceptual, premarriage or targeted at specific groups perceived to be at risk. Screening in the setting of haemoglobinopathies may be directed at optimising management of a disorder by early diagnosis, permitting informed reproductive choice or preventing a serious disorder by offering termination of pregnancy. Diagnostic methods and algorithms will differ according to the setting. As the primary test, high performance liquid chromatography is increasingly used and haemoglobin electrophoresis less so with isoelectric focussing being largely confined to screening programmes and referral centres, particularly in newborns. Capillary electrophoresis is being increasingly used. All these methods permit only a presumptive diagnosis with definitive diagnosis requiring either DNA analysis or protein analysis, for example by tandem mass spectrometry. Copyright © 2011 Elsevier Ltd. All rights reserved.

  10. Flexible methods for segmentation evaluation: Results from CT-based luggage screening

    PubMed Central

    Karimi, Seemeen; Jiang, Xiaoqian; Cosman, Pamela; Martz, Harry

    2017-01-01

    BACKGROUND Imaging systems used in aviation security include segmentation algorithms in an automatic threat recognition pipeline. The segmentation algorithms evolve in response to emerging threats and changing performance requirements. Analysis of segmentation algorithms’ behavior, including the nature of errors and feature recovery, facilitates their development. However, evaluation methods from the literature provide limited characterization of the segmentation algorithms. OBJECTIVE To develop segmentation evaluation methods that measure systematic errors such as oversegmentation and undersegmentation, outliers, and overall errors. The methods must measure feature recovery and allow us to prioritize segments. METHODS We developed two complementary evaluation methods using statistical techniques and information theory. We also created a semi-automatic method to define ground truth from 3D images. We applied our methods to evaluate five segmentation algorithms developed for CT luggage screening. We validated our methods with synthetic problems and an observer evaluation. RESULTS Both methods selected the same best segmentation algorithm. Human evaluation confirmed the findings. The measurement of systematic errors and prioritization helped in understanding the behavior of each segmentation algorithm. CONCLUSIONS Our evaluation methods allow us to measure and explain the accuracy of segmentation algorithms. PMID:24699346

  11. Accuracy of Referring Provider and Endoscopist Impressions of Colonoscopy Indication.

    PubMed

    Naveed, Mariam; Clary, Meredith; Ahn, Chul; Kubiliun, Nisa; Agrawal, Deepak; Cryer, Byron; Murphy, Caitlin; Singal, Amit G

    2017-07-01

    Background: Referring provider and endoscopist impressions of colonoscopy indication are used for clinical care, reimbursement, and quality reporting decisions; however, the accuracy of these impressions is unknown. This study assessed the sensitivity, specificity, positive and negative predictive value, and overall accuracy of methods to classify colonoscopy indication, including referring provider impression, endoscopist impression, and administrative algorithm compared with gold standard chart review. Methods: We randomly sampled 400 patients undergoing a colonoscopy at a Veterans Affairs health system between January 2010 and December 2010. Referring provider and endoscopist impressions of colonoscopy indication were compared with gold-standard chart review. Indications were classified into 4 mutually exclusive categories: diagnostic, surveillance, high-risk screening, or average-risk screening. Results: Of 400 colonoscopies, 26% were performed for average-risk screening, 7% for high-risk screening, 26% for surveillance, and 41% for diagnostic indications. Accuracy of referring provider and endoscopist impressions of colonoscopy indication were 87% and 84%, respectively, which were significantly higher than that of the administrative algorithm (45%; P <.001 for both). There was substantial agreement between endoscopist and referring provider impressions (κ=0.76). All 3 methods showed high sensitivity (>90%) for determining screening (vs nonscreening) indication, but specificity of the administrative algorithm was lower (40.3%) compared with referring provider (93.7%) and endoscopist (84.0%) impressions. Accuracy of endoscopist, but not referring provider, impression was lower in patients with a family history of colon cancer than in those without (65% vs 84%; P =.001). Conclusions: Referring provider and endoscopist impressions of colonoscopy indication are both accurate and may be useful data to incorporate into algorithms classifying colonoscopy indication. Copyright © 2017 by the National Comprehensive Cancer Network.

  12. How to identify students for school-based depression intervention: can school record review be substituted for universal depression screening?

    PubMed

    Kuo, Elena S; Vander Stoep, Ann; Herting, Jerald R; Grupp, Katherine; McCauley, Elizabeth

    2013-02-01

    Early identification and intervention are critical for reducing the adverse effects of depression on academic and occupational performance. Cost-effective approaches are needed for identifying adolescents at high depression risk. This study evaluated the utility of school record review versus universal school-based depression screening for determining eligibility for an indicated depression intervention program implemented in the middle school setting. Algorithms derived from grades, attendance, suspensions, and basic demographic information were evaluated with regard to their ability to predict students' depression screening scores. The school information-based algorithms proved poor proxies for individual students' depression screening results. However, school records showed promise for identifying low, medium, and high-yield subgroups on the basis of which efficient screening targeting decisions could be made. Study results will help to guide school nurses who coordinate indicated depression intervention programs in school settings as they evaluate options of approaches for determining which students are eligible for participation. © 2012 Wiley Periodicals, Inc.

  13. Syndromic Algorithms for Detection of Gambiense Human African Trypanosomiasis in South Sudan

    PubMed Central

    Palmer, Jennifer J.; Surur, Elizeous I.; Goch, Garang W.; Mayen, Mangar A.; Lindner, Andreas K.; Pittet, Anne; Kasparian, Serena; Checchi, Francesco; Whitty, Christopher J. M.

    2013-01-01

    Background Active screening by mobile teams is considered the best method for detecting human African trypanosomiasis (HAT) caused by Trypanosoma brucei gambiense but the current funding context in many post-conflict countries limits this approach. As an alternative, non-specialist health care workers (HCWs) in peripheral health facilities could be trained to identify potential cases who need testing based on their symptoms. We explored the predictive value of syndromic referral algorithms to identify symptomatic cases of HAT among a treatment-seeking population in Nimule, South Sudan. Methodology/Principal Findings Symptom data from 462 patients (27 cases) presenting for a HAT test via passive screening over a 7 month period were collected to construct and evaluate over 14,000 four item syndromic algorithms considered simple enough to be used by peripheral HCWs. For comparison, algorithms developed in other settings were also tested on our data, and a panel of expert HAT clinicians were asked to make referral decisions based on the symptom dataset. The best performing algorithms consisted of three core symptoms (sleep problems, neurological problems and weight loss), with or without a history of oedema, cervical adenopathy or proximity to livestock. They had a sensitivity of 88.9–92.6%, a negative predictive value of up to 98.8% and a positive predictive value in this context of 8.4–8.7%. In terms of sensitivity, these out-performed more complex algorithms identified in other studies, as well as the expert panel. The best-performing algorithm is predicted to identify about 9/10 treatment-seeking HAT cases, though only 1/10 patients referred would test positive. Conclusions/Significance In the absence of regular active screening, improving referrals of HAT patients through other means is essential. Systematic use of syndromic algorithms by peripheral HCWs has the potential to increase case detection and would increase their participation in HAT programmes. The algorithms proposed here, though promising, should be validated elsewhere. PMID:23350005
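
    A toy version of the best-performing four-item referral rule described above; the abstract does not fully specify how many items must be present to trigger referral, so this sketch assumes referral when any item is present.

        CORE_ITEMS = {"sleep_problems", "neurological_problems", "weight_loss"}

        def refer_for_hat_test(symptoms, fourth_item="oedema_history"):
            # symptoms: set of findings recorded by a peripheral health worker
            return bool(symptoms & (CORE_ITEMS | {fourth_item}))

        print(refer_for_hat_test({"weight_loss", "fever"}))  # True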

  14. Groundwater Study of the Rocky Mountain Arsenal and Some Surrounding Area, 1974 - 1975

    DTIC Science & Technology

    1975-01-01

    Table 3. From the sampling, Lake F was found to contain a l~er concentration of OCPD than that found in the groundwaters. In addition, very high copper...be the influent area to Lake F. (3) Reclamation of the groundwater for DIMP is recommended. (4) Reclamation of OCPD from the groundwater appears

  15. Effect of Juvenile Pretraining on Adolescent Structural Hippocampal Attributes as a Substrate for Enhanced Spatial Performance

    ERIC Educational Resources Information Center

    Keeley, Robin J.; Wartman, Brianne C.; Hausler, Alexander N.; Holahan, Matthew R.

    2010-01-01

    Research has demonstrated that Long-Evans rats (LER) display superior mnemonic function over Wistar rats (WR). These differences are correlated with endogenous and input-dependent properties of the hippocampus. The present work sought to determine if juvenile pretraining might enhance hippocampal structural markers and if this would be associated…

  16. The impact of different algorithms for ideal body weight on screening for hydroxychloroquine retinopathy in women.

    PubMed

    Browning, David J; Lee, Chong; Rotberg, David

    2014-01-01

    To determine how algorithms for ideal body weight (IBW) affect hydroxychloroquine dosing in women. This was a retrospective study of 520 patients screened for hydroxychloroquine retinopathy. Charts were reviewed for sex, height, weight, and daily dose. The outcome measures were the range of IBW across algorithms; rates of potentially toxic dosing; height thresholds below which 400 mg/d dosing is potentially toxic; and rates for which actual body weight (ABW) was less than IBW. Women made up 474 (91%) of the patients. For a given height, the IBW varied by 30-34 pounds (13.6-15.5 kg) across algorithms. The threshold heights below which toxic dosing occurred varied from 62-70 inches (157.5-177.8 cm). Different algorithms placed 16%-98% of women in the toxic dosing range. The proportion for whom dosing should have been based on ABW rather than IBW ranged from 5%-31% across algorithms. Although hydroxychloroquine dosing should be based on the lesser of ABW and IBW, there is no consensus on the definition of IBW. The Michaelides algorithm is associated with the most frequent need to adjust dosing; the Metropolitan Life Insurance, large-frame, mean-value table with the least frequent need. No evidence indicates that one algorithm is superior to the others.
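
    To make the dosing logic concrete, the sketch below uses the Devine formula as one of the several competing IBW algorithms the study compares, together with an illustrative 6.5 mg/kg/day ceiling; both choices are assumptions for illustration, not the paper's recommendation.

        def ibw_devine_female_kg(height_inches):
            # Devine formula: one of several competing IBW algorithms.
            return 45.5 + 2.3 * max(height_inches - 60.0, 0.0)

        def max_daily_dose_mg(height_inches, abw_kg, mg_per_kg=6.5):
            # Dose on the lesser of actual and ideal body weight, per the abstract.
            dosing_weight = min(abw_kg, ibw_devine_female_kg(height_inches))
            return mg_per_kg * dosing_weight

        # A 63-inch woman weighing 70 kg: IBW of 52.4 kg governs, so a 400 mg/d
        # prescription would exceed the illustrative 340.6 mg/d ceiling.
        print(round(max_daily_dose_mg(63, 70), 1))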

  17. Evaluation of the Boson Chemiluminescence Immunoassay as a First-Line Screening Test in the ECDC Algorithm for Syphilis Serodiagnosis in a Population with a High Prevalence of Syphilis

    PubMed Central

    Qiu, Xin-Hui; Zhang, Ya-Feng; Chen, Yu-Yan; Zhang, Qiao; Chen, Fu-Yi; Liu, Long; Fan, Jin-Yi; Gao, Kun; Zhu, Xiao-Zhen; Zheng, Wei-Hong; Zhang, Hui-Lin; Lin, Li-Rong; Liu, Li-Li; Tong, Man-Li; Zhang, Chang-Gong

    2015-01-01

    We developed a new Boson chemiluminescence immunoassay (CIA) and evaluated its application with cross-sectional analyses. Our results indicated that the Boson CIA demonstrated strong discriminatory power in diagnosing syphilis and that it can be used as a first-line screening test for syphilis serodiagnosis using the European Centre for Disease Prevention and Control algorithm or as a confirmatory test when combined with a patient's clinical history. PMID:25631792

  18. Boiling Heat-Transfer Processes and Their Application in the Cooling of High Heat Flux Devices

    DTIC Science & Technology

    1993-06-01

    1991, pp. 395-397. 385. Galloway, J. E. and Mudawar, I. "Critical Heat Flux Enhancement by Means of Liquid Subcooling and Centrifugal Force Induced...Flow Boiling Heat Transfer for a Spirally Fluted Tube." Heat Transfer Engineering, Vol. 13, No. 1, 1992, pp. 42-52. 390. Willingham, T. C. and Mudawar

  19. Educating for Landpower

    DTIC Science & Technology

    2009-03-23

    UNIT NUMBER 7. PERFORMING ORGANIZATION NAME(S) AND ADDRESS(ES) Captain Albert Lord, Department of Military Strategy, Planning, and Operations 8...RESEARCH PROJECT EDUCATING FOR LANDPOWER by Lieutenant Colonel Michael S. Lewis, United States Army; Captain Albert Lord, Project Adviser. This SRP is...Napoléon Ier; Publiée par ordre de l'empereur Napoléon III, 1858-1869, [Hereafter Correspondance.], III, 2392 "Au Président de l'Institut National [Camus

  20. Bid Responsiveness and the Acceptable Nonconforming Bid.

    DTIC Science & Technology

    1979-09-30

    the fundamental principles of responsiveness. The Comptroller, however, has taken this same "reasonable..." ...overcome an otherwise material deviation in the bid, thereby rendering it acceptable. The Comptroller General's approach to the issue...

  1. Computer Assisted Learning for Biomedical Engineering Education: Tools

    DTIC Science & Technology

    2001-10-25

    COMPUTER ASSISTED LEARNING FOR BIOMEDICAL ENGINEERING EDUCATION: TOOLS Ayhan İSTANBULLU(1), İnan GÜLER(2), (1) Department of Electronic...of Technical Education, Gazi University, 06500 Ankara, Türkiye Abstract- An interactive multimedia learning environment is being proposed...Assisted Learning (CAL) are given and some tools used in this area are explained. Together with the developments in the area of distance education

  2. Vitamin D Levels and Related Genetic Polymorphisms, Sun Exposure, Skin Color, and Risk of Aggressive Prostate Cancer

    DTIC Science & Technology

    2012-07-01

    for our specific aims. 15. SUBJECT TERMS vitamin D deficiency, prostate cancer, health disparities, African American 16. SECURITY CLASSIFICATION OF...

  3. Increased Productivity of a Cover Crop Mixture Is Not Associated with Enhanced Agroecosystem Services

    PubMed Central

    Smith, Richard G.; Atwood, Lesley W.; Warren, Nicholas D.

    2014-01-01

    Cover crops provide a variety of important agroecological services within cropping systems. Typically these crops are grown as monocultures or simple graminoid-legume bicultures; however, ecological theory and empirical evidence suggest that agroecosystem services could be enhanced by growing cover crops in species-rich mixtures. We examined cover crop productivity, weed suppression, stability, and carryover effects to a subsequent cash crop in an experiment involving a five-species annual cover crop mixture and the component species grown as monocultures in SE New Hampshire, USA in 2011 and 2012. The mean land equivalent ratio (LER) for the mixture exceeded 1.0 in both years, indicating that the mixture over-yielded relative to the monocultures. Despite the apparent over-yielding in the mixture, we observed no enhancement in weed suppression, biomass stability, or productivity of a subsequent oat (Avena sativa L.) cash crop when compared to the best monoculture component crop. These data are some of the first to include application of the LER to an analysis of a cover crop mixture and contribute to the growing literature on the agroecological effects of cover crop diversity in cropping systems. PMID:24847902
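
    The land equivalent ratio applied above has a simple closed form: the sum over species of mixture yield divided by monoculture yield, with values above 1.0 indicating over-yielding. A small sketch with hypothetical yields:

        def land_equivalent_ratio(mixture_yield, monoculture_yield):
            # Both dicts keyed by species; yields per unit area.
            return sum(mixture_yield[sp] / monoculture_yield[sp]
                       for sp in mixture_yield)

        ler = land_equivalent_ratio({"oat": 1.6, "pea": 1.1},
                                    {"oat": 3.0, "pea": 2.0})
        print(round(ler, 2))  # 1.08 -> the mixture over-yields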

  4. Development of miniaturized light endoscope-holder robot for laparoscopic surgery.

    PubMed

    Long, Jean-Alexandre; Cinquin, Philippe; Troccaz, Jocelyne; Voros, Sandrine; Berkelman, Peter; Descotes, Jean-Luc; Letoublon, Christian; Rambeaud, Jean-Jacques

    2007-08-01

    We have conducted experiments with an innovatively designed robot endoscope holder for laparoscopic surgery that is small and low cost. A compact light endoscope robot (LER) that is placed on the patient's skin and can be used with the patient in the lateral or dorsal supine position was tested on cadavers and laboratory pigs in order to allow successive modifications. The current control system is based on voice recognition. The range of vision is 360 degrees with an angle of 160 degrees. Twenty-three procedures were performed. The tests made it possible to advance the prototype on a variety of aspects, including reliability, steadiness, ergonomics, and dimensions. The ease of installation of the robot, which takes only 5 minutes, and the easy handling made it possible for 21 of the 23 procedures to be performed without an assistant. The LER is a camera holder guided by the surgeon's voice that can eliminate the need for an assistant during laparoscopic surgery. The ease of installation and manufacture should make it an effective and inexpensive system for use on patients in the lateral and dorsal supine positions. Randomized clinical trials will soon validate a new version of this robot prior to marketing.

  5. Development and growth of several strains of Arabidopsis seedlings in microgravity

    NASA Technical Reports Server (NTRS)

    Kiss, J. Z.; Brinckmann, E.; Brillouet, C.

    2000-01-01

    Growth and development of dark-grown Arabidopsis thaliana seedlings were studied in microgravity during space shuttle mission STS-84. The major purpose of this project was to determine if there were developmental differences among the four ecotypes studied--Wassilewskija (Ws), Columbia (Col), Landsberg erecta (Ler), and C24--and to evaluate whether particular ecotypes are better suited for spaceflight experimentation compared with others. A secondary goal was to study the growth of three starch-deficient strains of Arabidopsis by extending the observations made in a previously published report. For all strains, seed germination was not affected by microgravity, but seedlings were smaller in the spaceflight samples compared with the ground controls. The starch-deficient strains continued to exhibit vigorous growth until the termination of the experiment at 121 h after imbibition of seeds. However, ethylene effects, i.e., reduced growth and exaggerated hypocotyl hooks, were observed in all strains studied. Nevertheless, the Ler and C24 ecotypes seem to be more suitable for spaceflight research, compared with the other two ecotypes, based on measurements of their relative and absolute growth. This type of information should aid in the design of plant experiments for the International Space Station.

  6. Study of nanoimprint lithography (NIL) for HVM of memory devices

    NASA Astrophysics Data System (ADS)

    Kono, Takuya; Hatano, Masayuki; Tokue, Hiroshi; Kobayashi, Kei; Suzuki, Masato; Fukuhara, Kazuya; Asano, Masafumi; Nakasugi, Tetsuro; Choi, Eun Hyuk; Jung, Wooyung

    2017-03-01

    A low-cost alternative lithographic technology is desired to meet the decreasing feature sizes of semiconductor devices. Nano-imprint lithography (NIL) is one of the candidate alternative lithographic technologies.[1][2][3] NIL has such advantages as good resolution, critical dimension (CD) uniformity and low line edge roughness (LER). On the other hand, the critical issues of NIL are defectivity, overlay, and throughput. In order to introduce NIL into HVM, it is necessary to overcome these three challenges simultaneously.[4]-[12] In our previous study, we reported a dramatic improvement in NIL process defectivity on a pilot-line tool, the FPA-1100 NZ2, and described that the NIL process for 2x nm half pitch is approaching the HVM target.[12] In this study, we report a recent evaluation of NIL process performance to judge the applicability of NIL to memory device fabrication. In detail, the CD uniformity and LER are found to be less than 2 nm. The overlay accuracy of the test device is less than 7 nm. A defectivity level below 1 defect/cm2 has been achieved at a throughput of 15 wafers per hour.

  7. Introduction of pre-etch deposition techniques in EUV patterning

    NASA Astrophysics Data System (ADS)

    Xiang, Xun; Beique, Genevieve; Sun, Lei; Labonte, Andre; Labelle, Catherine; Nagabhirava, Bhaskar; Friddle, Phil; Schmitz, Stefan; Goss, Michael; Metzler, Dominik; Arnold, John

    2018-04-01

    The thin nature of EUV (Extreme Ultraviolet) resist has posed significant challenges for etch processes. In particular, EUV patterning combined with conventional etch approaches suffers from loss of pattern fidelity in the form of line breaks. A typical conventional etch approach prevents the etch process from having sufficient resist margin to control the trench CD (Critical Dimension), minimize the LWR (Line Width Roughness), LER (Line Edge Roughness) and reduce the T2T (Tip-to-Tip). Pre-etch deposition increases the resist budget by adding additional material to the resist layer, thus enabling the etch process to explore a wider set of process parameters to achieve better pattern fidelity. Preliminary tests with pre-etch deposition resulted in blocked isolated trenches. In order to mitigate these effects, a cyclic deposition and etch technique is proposed. With optimization of deposition and etch cycle time as well as total number of cycles, it is possible to open the underlying layers with a beneficial over etch and simultaneously keep the isolated trenches open. This study compares the impact of no pre-etch deposition, one time deposition and cyclic deposition/etch techniques on 4 aspects: resist budget, isolated trench open, LWR/LER and T2T.

  8. Chaotic dynamics outside Saturn’s main rings: The case of Atlas

    NASA Astrophysics Data System (ADS)

    Renner, Stéfan; Cooper, Nicholas J.; El Moutamid, Maryame; Evans, Mike W.; Murray, Carl D.; Sicardy, Bruno

    2014-11-01

    We revisit in detail the dynamics of Atlas. From a fit to new Cassini ISS astrometric observations spanning February 2004 to August 2013, we estimate GM_Atlas = 0.384 +/- 0.001 x 10^(-3) km^3 s^(-2), a value 13% smaller than the previously published estimate but with an order of magnitude reduction in the uncertainty. Our numerically derived orbit shows that Atlas is currently librating in both a 54:53 corotation eccentricity resonance (CER) and a 54:53 Lindblad eccentricity resonance (LER) with Prometheus. We demonstrate that the orbit of Atlas is chaotic, with a Lyapunov time of order 10 years, as a direct consequence of the coupled resonant interaction (CER/LER) with Prometheus. The interaction between the two resonances is investigated using the CoraLin analytical model (El Moutamid et al., 2014), showing that the chaotic zone fills almost all of the corotation site occupied by the satellite's orbit. Four 70:67 apse-type mean motion resonances with Pandora are also overlapping, but these resonances have a much weaker effect on Atlas. We estimate the capture probabilities of Atlas into resonances with Prometheus as the orbits expand through tidal effects, and discuss the implications for the orbital evolution.

  9. Hierarchical group testing for multiple infections.

    PubMed

    Hou, Peijie; Tebbs, Joshua M; Bilder, Christopher R; McMahan, Christopher S

    2017-06-01

    Group testing, where individuals are tested initially in pools, is widely used to screen a large number of individuals for rare diseases. Triggered by the recent development of assays that detect multiple infections at once, screening programs now involve testing individuals in pools for multiple infections simultaneously. Tebbs, McMahan, and Bilder (2013, Biometrics) recently evaluated the performance of a two-stage hierarchical algorithm used to screen for chlamydia and gonorrhea as part of the Infertility Prevention Project in the United States. In this article, we generalize this work to accommodate a larger number of stages. To derive the operating characteristics of higher-stage hierarchical algorithms with more than one infection, we view the pool decoding process as a time-inhomogeneous, finite-state Markov chain. Taking this conceptualization enables us to derive closed-form expressions for the expected number of tests and classification accuracy rates in terms of transition probability matrices. When applied to chlamydia and gonorrhea testing data from four states (Region X of the United States Department of Health and Human Services), higher-stage hierarchical algorithms provide, on average, an estimated 11% reduction in the number of tests when compared to two-stage algorithms. For applications with rarer infections, we show theoretically that this percentage reduction can be much larger. © 2016, The International Biometric Society.
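
    For intuition on why pooling reduces test counts, the sketch below gives the classical two-stage (Dorfman) expected number of tests per person for a single infection with a perfect assay; the paper's Markov-chain machinery generalizes this to more stages and multiple infections.

        def dorfman_tests_per_person(prevalence, pool_size):
            # One pooled test, plus individual retests when the pool is positive.
            p_pool_positive = 1.0 - (1.0 - prevalence) ** pool_size
            return 1.0 / pool_size + p_pool_positive

        for n in (5, 10, 20):
            print(n, round(dorfman_tests_per_person(0.01, n), 3))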

  10. Hierarchical group testing for multiple infections

    PubMed Central

    Hou, Peijie; Tebbs, Joshua M.; Bilder, Christopher R.; McMahan, Christopher S.

    2016-01-01

    Summary Group testing, where individuals are tested initially in pools, is widely used to screen a large number of individuals for rare diseases. Triggered by the recent development of assays that detect multiple infections at once, screening programs now involve testing individuals in pools for multiple infections simultaneously. Tebbs, McMahan, and Bilder (2013, Biometrics) recently evaluated the performance of a two-stage hierarchical algorithm used to screen for chlamydia and gonorrhea as part of the Infertility Prevention Project in the United States. In this article, we generalize this work to accommodate a larger number of stages. To derive the operating characteristics of higher-stage hierarchical algorithms with more than one infection, we view the pool decoding process as a time-inhomogeneous, finite-state Markov chain. Taking this conceptualization enables us to derive closed-form expressions for the expected number of tests and classification accuracy rates in terms of transition probability matrices. When applied to chlamydia and gonorrhea testing data from four states (Region X of the United States Department of Health and Human Services), higher-stage hierarchical algorithms provide, on average, an estimated 11 percent reduction in the number of tests when compared to two-stage algorithms. For applications with rarer infections, we show theoretically that this percentage reduction can be much larger. PMID:27657666

  11. Feasibility and cost-effectiveness of stroke prevention through community screening for atrial fibrillation using iPhone ECG in pharmacies. The SEARCH-AF study.

    PubMed

    Lowres, Nicole; Neubeck, Lis; Salkeld, Glenn; Krass, Ines; McLachlan, Andrew J; Redfern, Julie; Bennett, Alexandra A; Briffa, Tom; Bauman, Adrian; Martinez, Carlos; Wallenhorst, Christopher; Lau, Jerrett K; Brieger, David B; Sy, Raymond W; Freedman, S Ben

    2014-06-01

    Atrial fibrillation (AF) causes a third of all strokes, but often goes undetected before stroke. Identification of unknown AF in the community and subsequent anti-thrombotic treatment could reduce stroke burden. We investigated community screening for unknown AF using an iPhone electrocardiogram (iECG) in pharmacies, and determined the cost-effectiveness of this strategy. Pharmacists performed pulse palpation and iECG recordings, with cardiologist iECG over-reading. General practitioner review/12-lead ECG was facilitated for suspected new AF. An automated AF algorithm was retrospectively applied to collected iECGs. Cost-effectiveness analysis incorporated costs of iECG screening, and treatment/outcome data from a United Kingdom cohort of 5,555 patients with incidentally detected asymptomatic AF. A total of 1,000 pharmacy customers aged ≥65 years (mean 76 ± 7 years; 44% male) were screened. Newly identified AF was found in 1.5% (95% CI, 0.8-2.5%); mean age 79 ± 6 years; all had CHA2DS2-VASc score ≥2. AF prevalence was 6.7% (67/1,000). The automated iECG algorithm showed 98.5% (CI, 92-100%) sensitivity for AF detection and 91.4% (CI, 89-93%) specificity. The incremental cost-effectiveness ratio of extending iECG screening into the community, based on 55% warfarin prescription adherence, would be $AUD5,988 (€3,142; $USD4,066) per Quality Adjusted Life Year gained and $AUD30,481 (€15,993; $USD20,695) for preventing one stroke. Sensitivity analysis indicated cost-effectiveness improved with increased treatment adherence. Screening with iECG in pharmacies with an automated algorithm is both feasible and cost-effective. The high and largely preventable stroke/thromboembolism risk of those with newly identified AF highlights the likely benefits of community AF screening. Guideline recommendation of community iECG AF screening should be considered.
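
    The incremental cost-effectiveness ratio quoted above is the extra cost of the screening strategy divided by the extra health effect it yields; a minimal sketch with hypothetical cohort totals (the study's actual inputs are far more detailed):

        def icer(cost_screen, cost_usual, effect_screen, effect_usual):
            # Extra dollars per extra unit of effect (e.g., per QALY gained).
            return (cost_screen - cost_usual) / (effect_screen - effect_usual)

        # Hypothetical totals, for illustration only.
        print(round(icer(1_200_000, 900_000, 1_050.0, 1_000.0)))  # 6000 per QALY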

  12. Halftoning Algorithms and Systems.

    DTIC Science & Technology

    1996-08-01

    TERMS 15. NUMBER OF PAGES Halftoning algorithms; error diffusion; color printing; topographic maps 16. PRICE CODE 17. SECURITY CLASSIFICATION 18...graylevels for each screen level. In the case of error diffusion algorithms, the calibration procedure using the new centering concept manifests itself as a...Novel Centering Concept for Overlapping Correction Paper/Transparency (Patent Applied 5/94) * Applications To Error Diffusion * To Dithering (IS&T

  13. Steroidogenic Factor 1, Pit-1, and Adrenocorticotropic Hormone: A Rational Starting Place for the Immunohistochemical Characterization of Pituitary Adenoma.

    PubMed

    McDonald, William C; Banerji, Nilanjana; McDonald, Kelsey N; Ho, Bridget; Macias, Virgilia; Kajdacsy-Balla, Andre

    2017-01-01

    Pituitary adenoma classification is complex, and diagnostic strategies vary greatly from laboratory to laboratory; no optimal diagnostic algorithm has been defined. The objective was to develop a panel of immunohistochemical (IHC) stains that provides the optimal combination of cost, accuracy, and ease of use. We examined 136 pituitary adenomas with stains of steroidogenic factor 1 (SF-1), Pit-1, anterior pituitary hormones, cytokeratin CAM5.2, and the α subunit of human chorionic gonadotropin. Immunohistochemical staining was scored using the Allred system. Adenomas were assigned to a gold standard class based on IHC results and available clinical and serologic information. Correlation and cluster analyses were used to develop an algorithm for parsimoniously classifying adenomas. The algorithm entailed a 1- or 2-step process: (1) a screening step consisting of IHC stains for SF-1, Pit-1, and adrenocorticotropic hormone; and (2) when the screening IHC pattern and clinical history were not clearly gonadotrophic (SF-1 positive only), corticotrophic (adrenocorticotropic hormone positive only), or IHC null cell (negative screening IHC), a subsequent panel of IHC for prolactin, growth hormone, thyroid-stimulating hormone, and cytokeratin CAM5.2. Comparison between diagnoses generated by our algorithm and the gold standard diagnoses showed excellent agreement. When compared with a commonly used panel of 6 IHC stains for anterior pituitary hormones plus IHC for a low-molecular-weight cytokeratin in certain tumors, our algorithm uses approximately one-third fewer IHC stains and detects gonadotroph adenomas with greater sensitivity.
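
    The two-step logic above lends itself to a small decision function; the sketch below encodes step 1 and signals when the step-2 panel is needed (hypothetical function names, illustration only):

        def classify_screening_ihc(sf1_pos, pit1_pos, acth_pos):
            # Step 1: SF-1, Pit-1, ACTH. Three patterns are self-sufficient;
            # anything else proceeds to the PRL/GH/TSH/CAM5.2 panel.
            positives = [n for n, p in (("SF-1", sf1_pos), ("Pit-1", pit1_pos),
                                        ("ACTH", acth_pos)) if p]
            if positives == ["SF-1"]:
                return "gonadotroph adenoma"
            if positives == ["ACTH"]:
                return "corticotroph adenoma"
            if not positives:
                return "IHC null cell adenoma"
            return "second-step panel required"

        print(classify_screening_ihc(False, True, False))  # second-step panel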

  14. Evaluation of machine learning algorithms for improved risk assessment for Down's syndrome.

    PubMed

    Koivu, Aki; Korpimäki, Teemu; Kivelä, Petri; Pahikkala, Tapio; Sairanen, Mikko

    2018-05-04

    Prenatal screening generates a great amount of data that is used for predicting risk of various disorders. Prenatal risk assessment is based on multiple clinical variables and overall performance is defined by how well the risk algorithm is optimized for the population in question. This article evaluates machine learning algorithms to improve the performance of first trimester screening for Down syndrome. Machine learning algorithms pose an adaptive alternative to develop better risk assessment models using the existing clinical variables. Two real-world data sets were used to experiment with multiple classification algorithms. Implemented models were tested with a third, real-world, data set and performance was compared to a predicate method, a commercial risk assessment software. The best-performing deep neural network model gave an area under the curve of 0.96 and a detection rate of 78% at a 1% false positive rate on the test data. The support vector machine model gave an area under the curve of 0.95 and a detection rate of 61% at a 1% false positive rate on the same test data. When compared with the predicate method, the best support vector machine model was slightly inferior, but an optimized deep neural network model was able to give higher detection rates with the same false positive rate, or a similar detection rate with a markedly lower false positive rate. This finding could further improve the first trimester screening for Down syndrome, by using existing clinical variables and a large training data set derived from a specific population. Copyright © 2018 Elsevier Ltd. All rights reserved.
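
    The headline metric above, detection rate at a fixed 1% false positive rate, can be computed by thresholding at the appropriate quantile of unaffected-pregnancy scores; a sketch assuming higher scores mean higher risk:

        import numpy as np

        def detection_rate_at_fpr(scores_affected, scores_unaffected, fpr=0.01):
            # Cutoff set so that only `fpr` of unaffected cases screen positive.
            cutoff = np.quantile(scores_unaffected, 1.0 - fpr)
            return float(np.mean(np.asarray(scores_affected) > cutoff))

        dr = detection_rate_at_fpr([2.1, 3.5, 0.9, 4.2], [0.5, 1.0, 0.8, 1.2],
                                   fpr=0.25)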

  15. ScreenBEAM: a novel meta-analysis algorithm for functional genomics screens via Bayesian hierarchical modeling | Office of Cancer Genomics

    Cancer.gov

    Functional genomics (FG) screens, using RNAi or CRISPR technology, have become a standard tool for systematic, genome-wide loss-of-function studies for therapeutic target discovery. As in many large-scale assays, however, off-target effects, variable reagent potency and experimental noise must be accounted for to appropriately control for false positives.

  16. The Usefulness of the DBC-ASA as a Screening Instrument for Autism in Children with Intellectual Disabilities: A Pilot Study

    ERIC Educational Resources Information Center

    Deb, Shoumitro; Dhaliwal, Akal-Joat; Roy, Meera

    2009-01-01

    Aims: To explore the validity of Developmental Behaviour Checklist-Autism Screening Algorithm (DBC-ASA) as a screening instrument for autism among children with intellectual disabilities. Method: Data were collected from the case notes of 109 children with intellectual disabilities attending a specialist clinic in the UK. Results: The mean score…

  17. Diagnostic accuracy of administrative data algorithms in the diagnosis of osteoarthritis: a systematic review.

    PubMed

    Shrestha, Swastina; Dave, Amish J; Losina, Elena; Katz, Jeffrey N

    2016-07-07

    Administrative health care data are frequently used to study disease burden and treatment outcomes in many conditions including osteoarthritis (OA). OA is a chronic condition with significant disease burden affecting over 27 million adults in the US. There are few studies examining the performance of administrative data algorithms to diagnose OA. The purpose of this study is to perform a systematic review of administrative data algorithms for OA diagnosis and to evaluate the diagnostic characteristics of algorithms based on restrictiveness and reference standards. Two reviewers independently screened English-language articles published in Medline, Embase, PubMed, and Cochrane databases that used administrative data to identify OA cases. Each algorithm was classified as restrictive or less restrictive based on the number and type of administrative codes required to satisfy the case definition. We recorded sensitivity and specificity of algorithms and calculated positive likelihood ratio (LR+) and positive predictive value (PPV) based on an assumed OA prevalence of 0.1, 0.25, and 0.50. The search identified 7 studies that used 13 algorithms. Of these 13 algorithms, 5 were classified as restrictive and 8 as less restrictive. Restrictive algorithms had lower median sensitivity and higher median specificity compared to less restrictive algorithms when the reference standards were self-report and American College of Rheumatology (ACR) criteria. The algorithms compared to a reference standard of physician diagnosis had higher sensitivity and specificity than those compared to self-reported diagnosis or ACR criteria. Restrictive algorithms are more specific for OA diagnosis and can be used to identify cases when false positives have higher costs, e.g., interventional studies. Less restrictive algorithms are more sensitive and suited for studies that attempt to identify all cases, e.g., screening programs.
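
    The review's PPV calculations follow directly from sensitivity, specificity, and an assumed prevalence; the sketch below reproduces that arithmetic with illustrative sensitivity and specificity values (not figures from the review).

        def positive_likelihood_ratio(sens, spec):
            return sens / (1.0 - spec)

        def positive_predictive_value(sens, spec, prevalence):
            true_pos = sens * prevalence
            false_pos = (1.0 - spec) * (1.0 - prevalence)
            return true_pos / (true_pos + false_pos)

        for prev in (0.10, 0.25, 0.50):  # the assumed OA prevalences
            print(prev, round(positive_predictive_value(0.90, 0.95, prev), 3))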

  18. [Diagnostic work-up of pulmonary nodules: Management of pulmonary nodules detected with low-dose CT screening].

    PubMed

    Wormanns, D

    2016-09-01

    Pulmonary nodules are the most frequent pathological finding in low-dose computed tomography (CT) scanning for early detection of lung cancer. Early stages of lung cancer are often manifested as pulmonary nodules; however, the very commonly occurring small nodules are predominantly benign. These benign nodules are responsible for the high percentage of false positive test results in screening studies. Appropriate diagnostic algorithms are necessary to reduce false positive screening results and to improve the specificity of lung cancer screening. Such algorithms are based on some of the basic principles comprehensively described in this article. Firstly, the diameter of nodules allows a differentiation between large (>8 mm) probably malignant and small (<8 mm) probably benign nodules. Secondly, some morphological features of pulmonary nodules in CT can prove their benign nature. Thirdly, growth of small nodules is the best non-invasive predictor of malignancy and is utilized as a trigger for further diagnostic work-up. Non-invasive testing using positron emission tomography (PET) and contrast enhancement as well as invasive diagnostic tests (e.g. various procedures for cytological and histological diagnostics) are briefly described in this article. Different nodule morphology using CT (e.g. solid and semisolid nodules) is associated with different biological behavior and different algorithms for follow-up are required. Currently, no obligatory algorithm is available in German-speaking countries for the management of pulmonary nodules, which reflects the current state of knowledge. The main features of some international and American recommendations are briefly presented in this article from which conclusions for the daily clinical use are derived.
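
    The size, morphology, and growth principles above can be read as a simple triage rule; the sketch below is one plausible encoding for illustration only, not a guideline algorithm (the article itself stresses that no obligatory algorithm exists).

        def nodule_triage(diameter_mm, clearly_benign, grew_on_followup=None):
            # Thresholds taken from the abstract: >8 mm marks "probably malignant".
            if clearly_benign:                  # benign CT morphology proven
                return "no further work-up"
            if diameter_mm > 8:
                return "work-up (e.g., PET, contrast enhancement, biopsy)"
            if grew_on_followup:                # growth is the malignancy trigger
                return "work-up despite small size"
            return "CT follow-up to assess growth"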

  19. Using In Silico Fragmentation to Improve Routine Residue Screening in Complex Matrices.

    PubMed

    Kaufmann, Anton; Butcher, Patrick; Maden, Kathryn; Walker, Stephan; Widmer, Mirjam

    2017-12-01

    Targeted residue screening requires the use of reference substances in order to identify potential residues. This becomes a difficult issue when using multi-residue methods capable of analyzing several hundreds of analytes. Therefore, the capability of in silico fragmentation based on a structure database ("suspect screening") instead of physical reference substances for routine targeted residue screening was investigated. The detection of fragment ions that can be predicted or explained by in silico software was utilized to reduce the number of false positives. These "proof of principle" experiments were done with a tool that is integrated into a commercial MS vendor instrument operating software (UNIFI) as well as with a platform-independent MS tool (Mass Frontier). A total of 97 analytes belonging to different chemical families were separated by reversed phase liquid chromatography and detected in a data-independent acquisition (DIA) mode using ion mobility hyphenated with quadrupole time of flight mass spectrometry. The instrument was operated in the MSE mode with alternating low and high energy traces. The fragments observed from product ion spectra were investigated using a "chopping" bond disconnection algorithm and a rule-based algorithm. The bond disconnection algorithm clearly explained more analyte product ions and a greater percentage of the spectral abundance than the rule-based software (92 out of the 97 compounds produced ≥1 explainable fragment ions). On the other hand, tests with a complex blank matrix (bovine liver extract) indicated that the chopping algorithm reports significantly more false positive fragments than the rule-based software.
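
    To illustrate the "chopping" bond-disconnection idea, the sketch below enumerates fragments of a molecular graph by cutting up to k bonds and collecting connected components; this is a toy stand-in, not the vendor software's algorithm.

        from itertools import combinations

        def chop_fragments(n_atoms, bonds, max_cuts=1):
            # bonds: list of (i, j) atom-index pairs; returns fragments as
            # frozensets of atom indices seen after up to max_cuts cuts.
            def components(kept):
                adj = {i: set() for i in range(n_atoms)}
                for i, j in kept:
                    adj[i].add(j)
                    adj[j].add(i)
                seen, comps = set(), []
                for start in range(n_atoms):
                    if start in seen:
                        continue
                    stack, comp = [start], set()
                    while stack:
                        a = stack.pop()
                        if a not in comp:
                            comp.add(a)
                            stack.extend(adj[a])
                    seen |= comp
                    comps.append(frozenset(comp))
                return comps
            frags = set()
            for k in range(1, max_cuts + 1):
                for cut in combinations(bonds, k):
                    kept = [b for b in bonds if b not in cut]
                    frags.update(components(kept))
            return frags

        # Three-atom chain 0-1-2: each single cut yields two fragments.
        print(sorted(sorted(f) for f in chop_fragments(3, [(0, 1), (1, 2)])))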

  20. Using In Silico Fragmentation to Improve Routine Residue Screening in Complex Matrices

    NASA Astrophysics Data System (ADS)

    Kaufmann, Anton; Butcher, Patrick; Maden, Kathryn; Walker, Stephan; Widmer, Mirjam

    2017-12-01

    Targeted residue screening requires the use of reference substances in order to identify potential residues. This becomes a difficult issue when using multi-residue methods capable of analyzing several hundreds of analytes. Therefore, the capability of in silico fragmentation based on a structure database ("suspect screening") instead of physical reference substances for routine targeted residue screening was investigated. The detection of fragment ions that can be predicted or explained by in silico software was utilized to reduce the number of false positives. These "proof of principle" experiments were done with a tool that is integrated into a commercial MS vendor instrument operating software (UNIFI) as well as with a platform-independent MS tool (Mass Frontier). A total of 97 analytes belonging to different chemical families were separated by reversed phase liquid chromatography and detected in a data-independent acquisition (DIA) mode using ion mobility hyphenated with quadrupole time of flight mass spectrometry. The instrument was operated in the MSE mode with alternating low and high energy traces. The fragments observed from product ion spectra were investigated using a "chopping" bond disconnection algorithm and a rule-based algorithm. The bond disconnection algorithm clearly explained more analyte product ions and a greater percentage of the spectral abundance than the rule-based software (92 out of the 97 compounds produced ≥1 explainable fragment ions). On the other hand, tests with a complex blank matrix (bovine liver extract) indicated that the chopping algorithm reports significantly more false positive fragments than the rule-based software.

  1. High-Content, High-Throughput Screening for the Identification of Cytotoxic Compounds Based on Cell Morphology and Cell Proliferation Markers

    PubMed Central

    Martin, Heather L.; Adams, Matthew; Higgins, Julie; Bond, Jacquelyn; Morrison, Ewan E.; Bell, Sandra M.; Warriner, Stuart; Nelson, Adam; Tomlinson, Darren C.

    2014-01-01

    Toxicity is a major cause of failure in drug discovery and development, and whilst robust toxicological testing occurs, efficiency could be improved if compounds with cytotoxic characteristics were identified during primary compound screening. The use of high-content imaging in primary screening is becoming more widespread, and by utilising phenotypic approaches it should be possible to incorporate cytotoxicity counter-screens into primary screens. Here we present a novel phenotypic assay that can be used as a counter-screen to identify compounds with adverse cellular effects. This assay has been developed using U2OS cells, the PerkinElmer Operetta high-content/high-throughput imaging system and Columbus image analysis software. In Columbus, algorithms were devised to identify changes in nuclear morphology, cell shape and proliferation using DAPI, TOTO-3 and phosphohistone H3 staining, respectively. The algorithms were developed and tested on cells treated with doxorubicin, taxol and nocodazole. The assay was then used to screen a novel chemical library of over 300 compounds, rich in natural product-like molecules, 13.6% of which were identified as having adverse cellular effects. This assay provides a relatively cheap and rapid approach for identifying compounds with adverse cellular effects during screening assays, potentially reducing compound rejection due to toxicity in subsequent in vitro and in vivo assays. PMID:24505478
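
    Columbus is proprietary, so the sketch below only illustrates the general pattern of such an analysis — segment DAPI-stained nuclei, summarize their morphology, and flag wells that deviate from vehicle controls — using scikit-image; the features and cutoffs are assumptions, not the study's settings.

    ```python
    # Illustrative nuclear-morphology readout (not the Columbus pipeline):
    # segment DAPI-stained nuclei and summarize per-image morphology.
    import numpy as np
    from skimage.filters import threshold_otsu
    from skimage.measure import label, regionprops

    def nuclear_features(dapi_image):
        mask = dapi_image > threshold_otsu(dapi_image)   # global threshold
        props = regionprops(label(mask))                 # one entry per nucleus
        areas = np.array([p.area for p in props])
        ecc = np.array([p.eccentricity for p in props])
        return {"n_nuclei": len(props),
                "median_area": float(np.median(areas)) if len(areas) else 0.0,
                "median_eccentricity": float(np.median(ecc)) if len(ecc) else 0.0}

    def flag_adverse(features, control, area_drop=0.5, count_drop=0.5):
        """Flag a compound well whose nuclei shrink or disappear relative
        to vehicle controls (cutoffs are illustrative)."""
        return (features["n_nuclei"] < count_drop * control["n_nuclei"] or
                features["median_area"] < area_drop * control["median_area"])
    ```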

  2. External Validity of Electronic Sniffers for Automated Recognition of Acute Respiratory Distress Syndrome.

    PubMed

    McKown, Andrew C; Brown, Ryan M; Ware, Lorraine B; Wanderer, Jonathan P

    2017-01-01

    Automated electronic sniffers may be useful for early detection of acute respiratory distress syndrome (ARDS) for institution of treatment or clinical trial screening. In a prospective cohort of 2929 critically ill patients, we retrospectively applied published sniffer algorithms for automated detection of acute lung injury to assess their utility in diagnosis of ARDS in the first 4 ICU days. Radiographic full-text reports were searched for "edema" OR ("bilateral" AND "infiltrate") and a more detailed algorithm for descriptions consistent with ARDS. Patients were flagged as possible ARDS if a radiograph met search criteria and had a PaO2/FiO2 or SpO2/FiO2 of 300 or 315, respectively. Test characteristics of the electronic sniffers and clinical suspicion of ARDS were compared to a gold standard of 2-physician adjudicated ARDS. Thirty percent of 2841 patients included in the analysis had gold standard diagnosis of ARDS. The simpler algorithm had sensitivity for ARDS of 78.9%, specificity of 52%, positive predictive value (PPV) of 41%, and negative predictive value (NPV) of 85.3% over the 4-day study period. The more detailed algorithm had sensitivity of 88.2%, specificity of 55.4%, PPV of 45.6%, and NPV of 91.7%. Both algorithms were more sensitive but less specific than clinician suspicion, which had sensitivity of 40.7%, specificity of 94.8%, PPV of 78.2%, and NPV of 77.7%. Published electronic sniffer algorithms for ARDS may be useful automated screening tools for ARDS and improve on clinical recognition, but they are limited to screening rather than diagnosis because their specificity is poor.
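
    The simpler published rule reduces to a few lines. A sketch, assuming the report text and the day's worst oxygenation ratios are already extracted; the field names are hypothetical, and reading the 300/315 thresholds as ≤ is an assumption.

    ```python
    # Sketch of the simpler "sniffer" rule: flag possible ARDS when the
    # radiograph text matches and oxygenation is impaired. Field names and
    # report normalization are illustrative, not from the paper.
    from typing import Optional

    def radiograph_matches(report_text: str) -> bool:
        t = report_text.lower()
        return "edema" in t or ("bilateral" in t and "infiltrate" in t)

    def possible_ards(report_text: str,
                      pao2_fio2: Optional[float],
                      spo2_fio2: Optional[float]) -> bool:
        hypoxemic = ((pao2_fio2 is not None and pao2_fio2 <= 300) or
                     (spo2_fio2 is not None and spo2_fio2 <= 315))
        return radiograph_matches(report_text) and hypoxemic
    ```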

  3. Automated Cervical Screening and Triage, Based on HPV Testing and Computer-Interpreted Cytology.

    PubMed

    Yu, Kai; Hyun, Noorie; Fetterman, Barbara; Lorey, Thomas; Raine-Bennett, Tina R; Zhang, Han; Stamps, Robin E; Poitras, Nancy E; Wheeler, William; Befano, Brian; Gage, Julia C; Castle, Philip E; Wentzensen, Nicolas; Schiffman, Mark

    2018-04-11

    State-of-the-art cervical cancer prevention includes human papillomavirus (HPV) vaccination among adolescents and screening/treatment of cervical precancer (CIN3/AIS and, less strictly, CIN2) among adults. HPV testing provides sensitive detection of precancer but, to reduce overtreatment, secondary "triage" is needed to predict women at highest risk. Those with the highest-risk HPV types or abnormal cytology are commonly referred to colposcopy; however, expert cytology services are critically lacking in many regions. To permit completely automatable cervical screening/triage, we designed and validated a novel triage method, a cytologic risk score algorithm based on computer-scanned liquid-based slide features (FocalPoint, BD, Burlington, NC). We compared it with abnormal cytology in predicting precancer among 1839 women testing HPV positive (HC2, Qiagen, Germantown, MD) in 2010 at Kaiser Permanente Northern California (KPNC). Precancer outcomes were ascertained by record linkage. As additional validation, we compared the algorithm prospectively with cytology results among 243 807 women screened at KPNC (2016-2017). All statistical tests were two-sided. Among HPV-positive women, the algorithm matched the triage performance of abnormal cytology. Combined with HPV16/18/45 typing (Onclarity, BD, Sparks, MD), the automatable strategy referred 91.7% of HPV-positive CIN3/AIS cases to immediate colposcopy while deferring 38.4% of all HPV-positive women to one-year retesting (compared with 89.1% and 37.4%, respectively, for typing and cytology triage). In the 2016-2017 validation, the predicted risk scores strongly correlated with cytology (P < .001). High-quality cervical screening and triage performance is achievable using this completely automated approach. Automated technology could permit extension of high-quality cervical screening/triage coverage to currently underserved regions.
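
    The triage logic described above can be expressed as a small decision function; the risk-score cutoff below is a placeholder, since the study's operating point is not given in the abstract.

    ```python
    # Sketch of the automatable triage strategy: HPV-positive women with
    # HPV16/18/45 or a high computer-scanned cytologic risk score go to
    # immediate colposcopy; the rest defer to one-year retesting.
    RISK_CUTOFF = 0.5  # hypothetical placeholder, not the study's value

    def triage(hpv_positive: bool, hpv16_18_45: bool, risk_score: float) -> str:
        if not hpv_positive:
            return "routine screening"
        if hpv16_18_45 or risk_score >= RISK_CUTOFF:
            return "immediate colposcopy"
        return "retest in one year"
    ```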

  4. Evaluation of the boson chemiluminescence immunoassay as a first-line screening test in the ECDC algorithm for syphilis serodiagnosis in a population with a high prevalence of syphilis.

    PubMed

    Qiu, Xin-Hui; Zhang, Ya-Feng; Chen, Yu-Yan; Zhang, Qiao; Chen, Fu-Yi; Liu, Long; Fan, Jin-Yi; Gao, Kun; Zhu, Xiao-Zhen; Zheng, Wei-Hong; Zhang, Hui-Lin; Lin, Li-Rong; Liu, Li-Li; Tong, Man-Li; Zhang, Chang-Gong; Niu, Jian-Jun; Yang, Tian-Ci

    2015-04-01

    We developed a new Boson chemiluminescence immunoassay (CIA) and evaluated its application with cross-sectional analyses. Our results indicated that the Boson CIA demonstrated strong discriminatory power in diagnosing syphilis and that it can be used as a first-line screening test for syphilis serodiagnosis using the European Centre for Disease Prevention and Control algorithm or as a confirmatory test when combined with a patient's clinical history. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  5. ecode - Electron Transport Algorithm Testing v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franke, Brian C.; Olson, Aaron J.; Bruss, Donald Eugene

    2016-10-05

    ecode is a Monte Carlo code used for testing algorithms related to electron transport. The code can read basic physics parameters, such as energy-dependent stopping powers and screening parameters. The code permits simple planar geometries of slabs or cubes. Parallelization consists of domain replication, with work distributed at the start of the calculation and statistical results gathered at the end of the calculation. Some basic routines (such as input parsing, random number generation, and statistics processing) are shared with the Integrated Tiger Series codes. A variety of algorithms for uncertainty propagation are incorporated based on the stochastic collocation and stochastic Galerkin methods. These permit uncertainty only in the total and angular scattering cross sections. The code contains algorithms for simulating stochastic mixtures of two materials. The physics is approximate, ranging from mono-energetic and isotropic scattering to screened Rutherford angular scattering and Rutherford energy-loss scattering (simple electron transport models). No production of secondary particles is implemented, and no photon physics is implemented.
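
    For the screened Rutherford model mentioned above, the scattering cosine can be drawn by exact inverse-CDF sampling. A sketch using the standard textbook inversion, not necessarily ecode's actual routine:

    ```python
    # Inverse-CDF sampling of the screened Rutherford angular distribution
    #   p(mu) ∝ 1 / (1 - mu + 2*eta)^2,   mu = cos(theta) in [-1, 1],
    # where eta is the screening parameter the code reads as input.
    import random

    def sample_screened_rutherford_mu(eta: float) -> float:
        u = random.random()
        # Solving F(mu) = u for the CDF normalized on [-1, 1] gives:
        return 1.0 - 2.0 * eta * (1.0 - u) / (eta + u)

    # Small eta concentrates scattering in the forward direction (mu near 1):
    print(sample_screened_rutherford_mu(0.01))
    ```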

  6. Building a medical image processing algorithm verification database

    NASA Astrophysics Data System (ADS)

    Brown, C. Wayne

    2000-06-01

    The design of a database containing head Computed Tomography (CT) studies is presented, along with a justification for the database's composition. The database will be used to validate software algorithms that screen normal head CT studies from studies that contain pathology. The database is designed to have the following major properties: (1) a size sufficient for statistical viability, (2) inclusion of both normal (no pathology) and abnormal scans, (3) inclusion of scans due to equipment malfunction, technologist error, and uncooperative patients, (4) inclusion of data sets from multiple scanner manufacturers, (5) inclusion of data sets from different gender and age groups, and (6) three independent diagnoses of each data set. Designed correctly, the database will provide a partial basis for FDA (United States Food and Drug Administration) approval of image processing algorithms for clinical use. Our goal for the database is the proof of viability of screening head CTs for normal anatomy using computer algorithms. To put this work into context, a classification scheme for 'computer aided diagnosis' systems is proposed.
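
    The six design properties translate naturally into a record layout; a sketch with hypothetical field names, not the paper's schema:

    ```python
    # Illustrative record layout mirroring the six design properties above.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class HeadCTStudy:
        study_id: str
        scanner_manufacturer: str          # property 4: multiple vendors
        patient_age: int                   # property 5: age groups
        patient_sex: str                   # property 5: gender groups
        acquisition_problem: str = "none"  # property 3: malfunction/error/motion
        diagnoses: List[str] = field(default_factory=list)  # property 6

        def is_normal(self) -> bool:
            # Property 6: require all three independent reads to agree.
            return (len(self.diagnoses) == 3 and
                    all(d == "normal" for d in self.diagnoses))
    ```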

  7. Clinical study of quantitative diagnosis of early cervical cancer based on the classification of acetowhitening kinetics

    NASA Astrophysics Data System (ADS)

    Wu, Tao; Cheung, Tak-Hong; Yim, So-Fan; Qu, Jianan Y.

    2010-03-01

    A quantitative colposcopic imaging system for the diagnosis of early cervical cancer is evaluated in a clinical study. This imaging technology based on 3-D active stereo vision and motion tracking extracts diagnostic information from the kinetics of acetowhitening process measured from the cervix of human subjects in vivo. Acetowhitening kinetics measured from 137 cervical sites of 57 subjects are analyzed and classified using multivariate statistical algorithms. Cross-validation methods are used to evaluate the performance of the diagnostic algorithms. The results show that an algorithm for screening precancer produced 95% sensitivity (SE) and 96% specificity (SP) for discriminating normal and human papillomavirus (HPV)-infected tissues from cervical intraepithelial neoplasia (CIN) lesions. For a diagnostic algorithm, 91% SE and 90% SP are achieved for discriminating normal tissue, HPV infected tissue, and low-grade CIN lesions from high-grade CIN lesions. The results demonstrate that the quantitative colposcopic imaging system could provide objective screening and diagnostic information for early detection of cervical cancer.
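
    The paper's multivariate algorithms are not public, but the classification and cross-validation step might look like the following with standard tooling; the features and labels here are random placeholders.

    ```python
    # Sketch of a multivariate classification/cross-validation step of the
    # kind described above. X holds per-site acetowhitening kinetics
    # features; y holds tissue grades (placeholder data throughout).
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(137, 4))       # placeholder kinetics features
    y = rng.integers(0, 2, size=137)    # placeholder: CIN vs normal/HPV

    clf = LinearDiscriminantAnalysis()
    scores = cross_val_score(clf, X, y, cv=5)   # cross-validated accuracy
    print(scores.mean())
    ```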

  8. JOURNAL CLUB: Plagiarism in Manuscripts Submitted to the AJR: Development of an Optimal Screening Algorithm and Management Pathways.

    PubMed

    Taylor, Donna B

    2017-04-01

    The objective of this study was to investigate the incidence of plagiarism in a sample of manuscripts submitted to the AJR using CrossCheck, develop an algorithm to identify significant plagiarism, and formulate management pathways. A sample of 110 of 1610 (6.8%) manuscripts submitted to AJR in 2014 in the categories of Original Research or Review were analyzed using CrossCheck and manual assessment. The overall similarity index (OSI), highest similarity score from a single source, whether duplication was from single or multiple origins, journal section, and presence or absence of referencing the source were recorded. The criteria outlined by the International Committee of Medical Journal Editors were the reference standard for identifying manuscripts containing plagiarism. Statistical analysis was used to develop a screening algorithm to maximize sensitivity and specificity for the detection of plagiarism. Criteria for defining the severity of plagiarism and management pathways based on the severity of the plagiarism were determined. Twelve manuscripts (10.9%) contained plagiarism. Nine had an OSI excluding quotations and references of less than 20%. In seven, the highest similarity score from a single source was less than 10%. The highest similarity score from a single source was the work of the same author or authors in nine. Common sections for duplication were the Materials and Methods, Discussion, and abstract. Referencing the original source was lacking in 11. Plagiarism was undetected at submission in five of these 12 articles; two had been accepted for publication. The most effective screening algorithm was to average the OSI including quotations and references and the highest similarity score from a single source and to submit manuscripts with an average value of more than 12% for further review. The current methods for detecting plagiarism are suboptimal. A new screening algorithm is proposed.
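
    The best-performing screening rule is simple enough to state as a function, with the 12% threshold as reported in the abstract:

    ```python
    # The proposed screening rule: average the overall similarity index
    # (quotations and references included) with the highest single-source
    # score; manuscripts averaging above 12% go to further review.
    def needs_further_review(osi_incl_quotes: float,
                             highest_single_source: float) -> bool:
        return (osi_incl_quotes + highest_single_source) / 2.0 > 12.0
    ```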

  9. Translations on Eastern Europe, Scientific Affairs, Number 567

    DTIC Science & Technology

    1977-12-16

    becomes effective on the day of its promulgation. [signed by] Ferenc Marta general secretary 2542 CSO: 2502 11 HUNGARY ACADEMY ESTABLISHES...doctor of medical sciences, Dezso Schüler, doctor of medical sciences, Gabor Szabo, academician, Jozsef Szegi, doctor of agricultural sciences, Pal...were: Marta Dery, doctor of technical sciences, L. György Nagy and Sandor Rohrsetzer, doctors of chemical sciences. II. The Committee of

  10. Installation Restoration Program. Phase I. Records Search for Air Force Reserve and Air National Guard Facilities at General Billy Mitchell Field, Milwaukee, Wisconsin.

    DTIC Science & Technology

    1984-11-01

    quantities of vegetative growth throughout this area. Infiltration of runoff and drainage waters would occur in this area. 3.3.2 Surface Water Quality...Wisconsin 608-266-7809 Kevin Kessler Groundwater Coordinator Wisconsin Department of Natural Resources Madison, Wisconsin 608-267-9350 Frank Schultz

  11. Occupationally Based Disaster Medicine

    DTIC Science & Technology

    2011-01-01

    and that better results may result from the assignment of a previously trained and drilled disaster treatment team that can take over in all...they will supervise those teams and are responsible, along with occupational health personnel, for the safety and health of the team

  12. The Distributed Air Wing

    DTIC Science & Technology

    2014-06-01

    Cruise Missile LCS Littoral Combat Ship LEO Low Earth Orbit LER Loss-Exchange-Ratio LHA Landing Helicopter Assault LIDAR Laser Imaging Detection and...Ranging LOC Lines of Communication LP Linear Programming LRASM Long Range Anti-Ship Missile LT Long Ton MANA Map-Aware Non-uniform Automata ME...enemy's spy satellites. Based on open source information, China currently has 25 satellites operating in Low Earth Orbit (LEO), each operates at an

  13. The serological screening of deceased tissue donors within the English Blood Service for infectious agents--a review of current outcomes and a more effective strategy for the future.

    PubMed

    Kitchen, A D; Gillan, H L

    2010-04-01

    The overall effectiveness of the NHSBT screening programme for infectious agents in deceased tissue donors is examined and evaluated in terms of current outcomes and how to improve upon these outcomes. The screening results and any subsequent confirmatory results from a total of 1659 samples from NHSBT deceased donors referred to NTMRL for screening for infectious agents were included in the analysis. Overall 1566/1659 (94.4%) of the samples were screen negative. A total of 93 were repeat reactive on screening for one or more of the mandatory markers screened for, of which only 12 (13%) were subsequently confirmed to be positive on confirmatory testing. The majority of the repeat reactive samples demonstrated non-specific reactivity with the screening assays in use. Overall, the NHSBT screening programme for infectious agents in deceased tissue donors is very effective, with a relatively low overall loss of donors because of non-specific reactivity. However, unnecessary loss of tissue products is not acceptable, and although this programme compares favourably with the outcomes of other such programmes, the confirmatory results obtained demonstrate both the need and the potential for improving the outcomes. This is particularly important as one donor may donate more than one product. Improvement can be achieved very easily with a change to the screening algorithm followed, using the confirmatory data obtained to support and validate this change. CONTENTS SUMMARY: Critical analysis of the NHSBT screening programme for infectious agents in deceased tissue donors and a strategy involving the design and use of a different screening algorithm to improve these outcomes.

  14. CHAM: a fast algorithm of modelling non-linear matter power spectrum in the sCreened HAlo Model

    NASA Astrophysics Data System (ADS)

    Hu, Bin; Liu, Xue-Wen; Cai, Rong-Gen

    2018-05-01

    We present a fast numerical screened halo model algorithm (CHAM, which stands for the sCreened HAlo Model) for modelling the non-linear matter power spectrum in alternative models to Λ cold dark matter. This method has three clear advantages. First, it is not restricted to a specific dark energy/modified gravity model; in principle, any screened scalar-tensor theory can be treated. Second, it makes minimal assumptions in the calculation, so the physical picture is easy to understand. Third, it is fully predictive and does not rely on calibration against N-body simulations. As an example, we show the case of the Hu-Sawicki f(R) gravity. In this case, the typical CPU time with the current parallel PYTHON script (eight threads) is roughly 10 min. The resulting spectra agree with N-body data to within a few per cent up to k ∼ 1 h Mpc⁻¹.

  15. Sensitivity and specificity of automated analysis of single-field non-mydriatic fundus photographs by Bosch DR Algorithm—Comparison with mydriatic fundus photography (ETDRS) for screening in undiagnosed diabetic retinopathy

    PubMed Central

    Bawankar, Pritam; Shanbhag, Nita; Smitha, K. S.; Dhawan, Bodhraj; Palsule, Aratee; Kumar, Devesh; Chandel, Shailja

    2017-01-01

    Diabetic retinopathy (DR) is a leading cause of blindness among working-age adults. Early diagnosis through effective screening programs is likely to improve vision outcomes. The ETDRS seven-standard-field 35-mm stereoscopic color retinal imaging (ETDRS) of the dilated eye is elaborate and requires mydriasis, and is unsuitable for screening. We evaluated an image analysis application for the automated diagnosis of DR from non-mydriatic single-field images. Patients suffering from diabetes for at least 5 years were included if they were 18 years or older. Patients already diagnosed with DR were excluded. Physiologic mydriasis was achieved by placing the subjects in a dark room. Images were captured using a Bosch Mobile Eye Care fundus camera. The images were analyzed by the Retinal Imaging Bosch DR Algorithm for the diagnosis of DR. All subjects also subsequently underwent pharmacological mydriasis and ETDRS imaging. Non-mydriatic and mydriatic images were read by ophthalmologists. The ETDRS readings were used as the gold standard for calculating the sensitivity and specificity for the software. 564 consecutive subjects (1128 eyes) were recruited from six centers in India. Each subject was evaluated at a single outpatient visit. Forty-four of 1128 images (3.9%) could not be read by the algorithm, and were categorized as inconclusive. In four subjects, neither eye provided an acceptable image: these four subjects were excluded from the analysis. This left 560 subjects for analysis (1084 eyes). The algorithm correctly diagnosed 531 of 560 cases. The sensitivity, specificity, and positive and negative predictive values were 91%, 97%, 94%, and 95% respectively. The Bosch DR Algorithm shows favorable sensitivity and specificity in diagnosing DR from non-mydriatic images, and can greatly simplify screening for DR. This also has major implications for telemedicine in the use of screening for retinopathy in patients with diabetes mellitus. PMID:29281690
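
    The reported sensitivity, specificity, PPV, and NPV all derive from one 2×2 confusion table; a generic helper of the kind used to compute them (not from the paper):

    ```python
    # Generic 2x2 screening metrics (sensitivity, specificity, PPV, NPV)
    # from raw confusion counts, as reported in studies like the above.
    def screening_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
        return {
            "sensitivity": tp / (tp + fn),   # true-positive rate
            "specificity": tn / (tn + fp),   # true-negative rate
            "ppv": tp / (tp + fp),           # positive predictive value
            "npv": tn / (tn + fn),           # negative predictive value
        }
    ```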

  16. Cost-effectiveness of HIV and syphilis antenatal screening: a modeling study

    PubMed Central

    Bristow, Claire C.; Larson, Elysia; Anderson, Laura J.; Klausner, Jeffrey D.

    2016-01-01

    Objectives The World Health Organization called for the elimination of maternal-to-child transmission (MTCT) of HIV and syphilis, a harmonized approach for the improvement of health outcomes for mothers and children. Testing early in pregnancy, treating seropositive pregnant women, and preventing syphilis re-infection can prevent MTCT of HIV and syphilis. We assessed the health and economic outcomes of a dual testing strategy in a simulated cohort of 100,000 antenatal care patients in Malawi. Methods We compared four screening algorithms: (1) HIV rapid test only, (2) dual HIV and syphilis rapid tests, (3) single rapid tests for HIV and syphilis, and (4) HIV rapid and syphilis laboratory tests. We calculated the expected number of adverse pregnancy outcomes, the expected costs, and the expected newborn disability adjusted life years (DALYs) for each screening algorithm. The estimated costs and DALYs for each screening algorithm were assessed from a societal perspective using Markov progression models. Additionally, we conducted a Monte Carlo multi-way sensitivity analysis, allowing for ranges of inputs. Results Our cohort decision model predicted the lowest number of adverse pregnancy outcomes in the dual HIV and syphilis rapid test strategy. Additionally, from the societal perspective, the costs of prevention and care using a dual HIV and syphilis rapid testing strategy was both the least costly ($226.92 per pregnancy) and resulted in the fewest DALYs (116,639) per 100,000 pregnancies. In the Monte Carlo simulation the dual HIV and syphilis algorithm was always cost saving and almost always reduced disability adjusted life years (DALYs) compared to HIV testing alone. Conclusion The results of the cost-effectiveness analysis showed that a dual HIV and syphilis test was cost saving compared to all other screening strategies. Adding dual rapid testing to the existing prevention of mother-to-child HIV transmission programs in Malawi and similar countries is likely to be advantageous. PMID:26920867

  17. Improving the Sensitivity and Positive Predictive Value in a Cystic Fibrosis Newborn Screening Program Using a Repeat Immunoreactive Trypsinogen and Genetic Analysis.

    PubMed

    Sontag, Marci K; Lee, Rachel; Wright, Daniel; Freedenberg, Debra; Sagel, Scott D

    2016-08-01

    To evaluate the performance of a new cystic fibrosis (CF) newborn screening algorithm, comprised of immunoreactive trypsinogen (IRT) in first (24-48 hours of life) and second (7-14 days of life) dried blood spot plus DNA on second dried blood spot, over existing algorithms. A retrospective review of the IRT/IRT/DNA algorithm implemented in Colorado, Wyoming, and Texas. A total of 1 520 079 newborns were screened; 32 557 (2.1%) had an abnormal first IRT and 8794 (0.54%) an abnormal second. Furthermore, 14 653 mutation analyses were performed; 1391 newborns were referred for diagnostic testing; 274 newborns were diagnosed; and 201/274 (73%) of newborns had 2 mutations on the newborn screening CFTR panel. Sensitivity was 96.2%, compared with sensitivity of 76.1% observed with IRT/IRT (105 ng/mL cut-offs, P < .0001). The ratio of newborns with CF to heterozygote carriers was 1:2.5, and newborns with CF to newborns with CFTR-related metabolic syndrome was 10.8:1. The overall positive predictive value was 20%. The median age of diagnosis was 28, 30, and 39.5 days in the 3 states. IRT/IRT/DNA is more sensitive than IRT/IRT because of lower cut-offs (~97th percentile, or 60 ng/mL); higher cut-offs in IRT/IRT programs (>99th percentile, 105 ng/mL) would not achieve sufficient sensitivity. Carrier identification and identification of newborns with CFTR-related metabolic syndrome is less common in IRT/IRT/DNA compared with IRT/DNA. The time to diagnosis is nominally longer, but diagnosis can be achieved in the neonatal period and opportunities to further improve timeliness have been enacted. The IRT/IRT/DNA algorithm should be considered by programs with 2 routine screens. Copyright © 2016 Elsevier Inc. All rights reserved.
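
    The screening flow can be sketched as a decision function; the 60 ng/mL cut-off is the value quoted above, while the referral rule on mutation count and the return labels are illustrative.

    ```python
    # Sketch of the IRT/IRT/DNA flow: an elevated IRT on the first dried
    # blood spot triggers a second IRT; persistent elevation triggers DNA
    # analysis on the second spot.
    from typing import Optional

    def cf_newborn_screen(irt1: float,
                          irt2: Optional[float] = None,
                          cftr_mutations: Optional[int] = None,
                          cutoff: float = 60.0) -> str:
        if irt1 < cutoff:
            return "screen negative"
        if irt2 is None:
            return "second IRT required (7-14 days of life)"
        if irt2 < cutoff:
            return "screen negative"        # second IRT resolves the flag
        if cftr_mutations is None:
            return "DNA analysis required"
        return ("refer for diagnostic testing" if cftr_mutations >= 1
                else "screen negative")
    ```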

  18. Evaluation of Cloud and Aerosol Screening of Early Orbiting Carbon Observatory-2 (OCO-2) Observations with Collocated MODIS Cloud Mask

    NASA Astrophysics Data System (ADS)

    Nelson, R. R.; Taylor, T.; O'Dell, C.; Cronk, H. Q.; Partain, P.; Frankenberg, C.; Eldering, A.; Crisp, D.; Gunson, M. R.; Chang, A.; Fisher, B.; Osterman, G. B.; Pollock, H. R.; Savtchenko, A.; Rosenthal, E. J.

    2015-12-01

    Effective cloud and aerosol screening is critically important to the Orbiting Carbon Observatory-2 (OCO-2), which can accurately determine column averaged dry air mole fraction of carbon dioxide (XCO2) only when scenes are sufficiently clear of scattering material. It is crucial to avoid sampling biases, in order to maintain a globally unbiased XCO2 record for inversion modeling to determine sources and sinks of carbon dioxide. This work presents analysis from the current operational B7 data set, which is identifying as clear approximately 20% of the order one million daily soundings. Of those soundings that are passed to the L2 retrieval algorithm, we find that almost 80% are yielding XCO2 estimates that converge. Two primary preprocessor algorithms are used to cloud screen the OCO-2 soundings. The A-Band Preprocessor (ABP) uses measurements in the Oxygen-A band near 0.76 µm to determine scenes with large photon path length modifications due to scattering by aerosol and clouds. The Iterative Maximum A-Posteriori (IMAP) Differential Optical Absorption Spectroscopy (DOAS) algorithm (IDP) computes ratios of retrieved CO2 (and H2O) in the 1.6 µm (weak CO2) and 2.0 µm (strong CO2) spectral bands to determine scenes with spectral differences, indicating contamination by scattering materials. We demonstrate that applying these two algorithms in tandem provides robust cloud screening of the OCO-2 data set. We compare the OCO-2 cloud screening results to collocated Moderate Resolution Imaging Spectroradiometer (MODIS) cloud mask data and show that agreement between the two sensors is approximately 85-90%. A detailed statistical analysis is performed on a winter and spring 16-day repeat cycle for the nadir-land, glint-land and glint-water viewing geometries. No strong seasonal, spatial or footprint dependencies are found, although the agreement tends to be worse at high solar zenith angles and for snow and ice covered surfaces.
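
    Applying the two preprocessors "in tandem" and scoring agreement against MODIS reduces to simple boolean logic; a sketch over illustrative per-sounding flags, not the real product files:

    ```python
    # Sketch of tandem cloud screening: a sounding passes as clear only if
    # both the A-Band Preprocessor (ABP) and IMAP-DOAS (IDP) screens agree.
    from typing import Sequence, List

    def tandem_clear(abp_clear: Sequence[bool],
                     idp_clear: Sequence[bool]) -> List[bool]:
        return [a and i for a, i in zip(abp_clear, idp_clear)]

    def agreement(oco2_clear: Sequence[bool],
                  modis_clear: Sequence[bool]) -> float:
        """Fraction of soundings where OCO-2 and the collocated MODIS
        cloud mask give the same clear/cloudy verdict."""
        matches = sum(o == m for o, m in zip(oco2_clear, modis_clear))
        return matches / len(oco2_clear)
    ```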

  19. Leaf Dynamics of Panicum maximum under Future Climatic Changes

    PubMed Central

    Britto de Assis Prado, Carlos Henrique; Haik Guedes de Camargo-Bortolin, Lívia; Castro, Érique; Martinez, Carlos Alberto

    2016-01-01

    Panicum maximum Jacq. ‘Mombaça’ (C4) was grown in field conditions with sufficient water and nutrients to examine the effects of warming and elevated CO2 concentrations during the winter. Plants were exposed to either the ambient temperature and regular atmospheric CO2 (Control); elevated CO2 (600 ppm, eC); canopy warming (+2°C above regular canopy temperature, eT); or elevated CO2 and canopy warming (eC+eT). The temperatures and CO2 in the field were controlled by temperature free-air controlled enhancement (T-FACE) and mini free-air CO2 enrichment (miniFACE) facilities. The most green, expanding, and expanded leaves and the highest leaf appearance rate (LAR, leaves day⁻¹) and leaf elongation rate (LER, cm day⁻¹) were observed under eT. Leaf area and leaf biomass were higher in the eT and eC+eT treatments. The higher LER and LAR without significant differences in the number of senescent leaves could explain why tillers had higher foliage area and leaf biomass in the eT treatment. The eC treatment had the lowest LER and the fewest expanded and green leaves, similar to Control. The inhibitory effect of eC on foliage development in winter was indicated by the fewer green, expanded, and expanding leaves under eC+eT than eT. The stimulatory and inhibitory effects of the eT and eC treatments, respectively, on foliage raised and lowered, respectively, the foliar nitrogen concentration. The inhibition of foliage by eC was confirmed by the eC treatment having the lowest leaf/stem biomass ratio and by the change in leaf biomass-area relationships from linear or exponential growth to rectangular hyperbolic growth under eC. Besides, eC+eT had a synergistic effect, speeding up leaf maturation. Therefore, with sufficient water and nutrients in winter, the inhibitory effect of elevated CO2 on foliage could be partially offset by elevated temperatures and relatively high P. maximum foliage production could be achieved under future climatic change. PMID:26894932

  20. An Algorithm for Controlled Integration of Sound and Text.

    ERIC Educational Resources Information Center

    Wohlert, Harry S.; McCormick, Martin

    1985-01-01

    A serious drawback in introducing sound into computer programs for teaching foreign language speech has been the lack of an algorithm to turn off the cassette recorder immediately to keep screen text and audio in synchronization. This article describes a program which solves that problem. (SED)

  1. Improved genome-scale multi-target virtual screening via a novel collaborative filtering approach to cold-start problem

    PubMed Central

    Lim, Hansaim; Gray, Paul; Xie, Lei; Poleksic, Aleksandar

    2016-01-01

    Conventional one-drug-one-gene approach has been of limited success in modern drug discovery. Polypharmacology, which focuses on searching for multi-targeted drugs to perturb disease-causing networks instead of designing selective ligands to target individual proteins, has emerged as a new drug discovery paradigm. Although many methods for single-target virtual screening have been developed to improve the efficiency of drug discovery, few of these algorithms are designed for polypharmacology. Here, we present a novel theoretical framework and a corresponding algorithm for genome-scale multi-target virtual screening based on the one-class collaborative filtering technique. Our method overcomes the sparseness of the protein-chemical interaction data by means of interaction matrix weighting and dual regularization from both chemicals and proteins. While the statistical foundation behind our method is general enough to encompass genome-wide drug off-target prediction, the program is specifically tailored to find protein targets for new chemicals with little to no available interaction data. We extensively evaluate our method using a number of the most widely accepted gene-specific and cross-gene family benchmarks and demonstrate that our method outperforms other state-of-the-art algorithms for predicting the interaction of new chemicals with multiple proteins. Thus, the proposed algorithm may provide a powerful tool for multi-target drug design. PMID:27958331
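
    A toy version of the weighted, dual-regularized one-class factorization idea is sketched below; the paper's actual weighting scheme and update rules are not reproduced here.

    ```python
    # Toy sketch of one-class collaborative filtering by weighted matrix
    # factorization with L2 regularization on both chemical and protein
    # factors (illustrative only). R: binary known interactions
    # (chemicals x proteins); W: per-entry observation weights.
    import numpy as np

    def wmf(R, W, rank=8, lam=0.1, lr=0.01, iters=500, seed=0):
        rng = np.random.default_rng(seed)
        n_chem, n_prot = R.shape
        U = 0.1 * rng.normal(size=(n_chem, rank))   # chemical factors
        V = 0.1 * rng.normal(size=(n_prot, rank))   # protein factors
        for _ in range(iters):
            E = W * (U @ V.T - R)                   # weighted residual
            U -= lr * (E @ V + lam * U)             # gradient steps
            V -= lr * (E.T @ U + lam * V)
        return U @ V.T                              # predicted scores
    ```

    For the cold-start case the abstract targets, new chemicals with few observed interactions receive low weights, so their factors are driven mainly by the regularizers and the shared structure learned from well-characterized chemicals.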

  2. Improved genome-scale multi-target virtual screening via a novel collaborative filtering approach to cold-start problem.

    PubMed

    Lim, Hansaim; Gray, Paul; Xie, Lei; Poleksic, Aleksandar

    2016-12-13

    Conventional one-drug-one-gene approach has been of limited success in modern drug discovery. Polypharmacology, which focuses on searching for multi-targeted drugs to perturb disease-causing networks instead of designing selective ligands to target individual proteins, has emerged as a new drug discovery paradigm. Although many methods for single-target virtual screening have been developed to improve the efficiency of drug discovery, few of these algorithms are designed for polypharmacology. Here, we present a novel theoretical framework and a corresponding algorithm for genome-scale multi-target virtual screening based on the one-class collaborative filtering technique. Our method overcomes the sparseness of the protein-chemical interaction data by means of interaction matrix weighting and dual regularization from both chemicals and proteins. While the statistical foundation behind our method is general enough to encompass genome-wide drug off-target prediction, the program is specifically tailored to find protein targets for new chemicals with little to no available interaction data. We extensively evaluate our method using a number of the most widely accepted gene-specific and cross-gene family benchmarks and demonstrate that our method outperforms other state-of-the-art algorithms for predicting the interaction of new chemicals with multiple proteins. Thus, the proposed algorithm may provide a powerful tool for multi-target drug design.

  3. The optimal sequence and selection of screening test items to predict fall risk in older disabled women: the Women's Health and Aging Study.

    PubMed

    Lamb, Sarah E; McCabe, Chris; Becker, Clemens; Fried, Linda P; Guralnik, Jack M

    2008-10-01

    Falls are a major cause of disability, dependence, and death in older people. Brief screening algorithms may be helpful in identifying risk and leading to more detailed assessment. Our aim was to determine the most effective sequence of falls screening test items from a wide selection of recommended items including self-report and performance tests, and to compare performance with other published guidelines. Data were from a prospective, age-stratified, cohort study. Participants were 1002 community-dwelling women aged 65 years or older, experiencing at least some mild disability. Assessments of fall risk factors were conducted in participants' homes. Fall outcomes were collected at 6 monthly intervals. Algorithms were built for prediction of any fall over a 12-month period using tree classification with cross-set validation. Algorithms using performance tests provided the best prediction of fall events, and achieved moderate to strong performance when compared to commonly accepted benchmarks. The items selected by the best performing algorithm were the number of falls in the last year and, in selected subpopulations, frequency of difficulty balancing while walking, a 4 m walking speed test, body mass index, and a test of knee extensor strength. The algorithm performed better than that from the American Geriatric Society/British Geriatric Society/American Academy of Orthopaedic Surgeons and other guidance, although these findings should be treated with caution. Suggestions are made on the type, number, and sequence of tests that could be used to maximize estimation of the probability of falling in older disabled women.
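
    A tree-classification step of the kind described can be sketched with standard tooling; the five features mirror the items selected by the best algorithm, but the data below are random placeholders.

    ```python
    # Sketch of building a falls-risk classification tree from the item
    # types named above (placeholder data, not the study's).
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = np.column_stack([
        rng.integers(0, 5, 1002),      # falls in the last year
        rng.integers(0, 2, 1002),      # difficulty balancing while walking
        rng.normal(1.0, 0.3, 1002),    # 4 m walking speed (m/s)
        rng.normal(27, 5, 1002),       # body mass index
        rng.normal(15, 4, 1002),       # knee extensor strength (kg)
    ])
    y = rng.integers(0, 2, 1002)       # fell during follow-up (placeholder)

    tree = DecisionTreeClassifier(max_depth=3)
    print(cross_val_score(tree, X, y, cv=5).mean())   # cross-validation
    ```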

  4. Robust Bayesian Algorithm for Targeted Compound Screening in Forensic Toxicology.

    PubMed

    Woldegebriel, Michael; Gonsalves, John; van Asten, Arian; Vivó-Truyols, Gabriel

    2016-02-16

    As part of forensic toxicological investigation of cases involving unexpected death of an individual, targeted or untargeted xenobiotic screening of post-mortem samples is normally conducted. To this end, liquid chromatography (LC) coupled to high-resolution mass spectrometry (MS) is typically employed. For data analysis, almost all commonly applied algorithms are threshold-based (frequentist). These algorithms examine the value of a certain measurement (e.g., peak height) to decide whether a certain xenobiotic of interest (XOI) is present/absent, yielding a binary output. Frequentist methods pose a problem when several sources of information [e.g., shape of the chromatographic peak, isotopic distribution, estimated mass-to-charge ratio (m/z), adduct, etc.] need to be combined, requiring the approach to make arbitrary decisions at substep levels of data analysis. We hereby introduce a novel Bayesian probabilistic algorithm for toxicological screening. The method tackles the problem with a different strategy. It is not aimed at reaching a final conclusion regarding the presence of the XOI, but it estimates its probability. The algorithm effectively and efficiently combines all possible pieces of evidence from the chromatogram and calculates the posterior probability of the presence/absence of XOI features. This way, the model can accommodate more information by updating the probability if extra evidence is acquired. The final probabilistic result assists the end user to make a final decision with respect to the presence/absence of the xenobiotic. The Bayesian method was validated and found to perform better (in terms of false positives and false negatives) than the vendor-supplied software package.
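
    The core of the Bayesian strategy is to accumulate evidence multiplicatively rather than threshold each piece; a minimal log-odds sketch, with placeholder likelihood ratios standing in for the paper's evidence models:

    ```python
    # Sketch of Bayesian evidence combination: each evidence source (peak
    # shape, isotope pattern, m/z error, adduct, ...) contributes a
    # likelihood ratio; the posterior probability of the xenobiotic's
    # presence is reported instead of a binary call.
    import math

    def posterior_probability(prior: float, likelihood_ratios: list) -> float:
        log_odds = math.log(prior / (1.0 - prior))
        for lr in likelihood_ratios:
            log_odds += math.log(lr)       # each evidence source updates
        odds = math.exp(log_odds)
        return odds / (1.0 + odds)

    # e.g. a 1% prior updated by three favourable evidence sources:
    print(posterior_probability(0.01, [8.0, 3.5, 2.0]))   # ≈ 0.36
    ```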

  5. Computer-aided screening system for cervical precancerous cells based on field emission scanning electron microscopy and energy dispersive x-ray images and spectra

    NASA Astrophysics Data System (ADS)

    Jusman, Yessi; Ng, Siew-Cheok; Hasikin, Khairunnisa; Kurnia, Rahmadi; Osman, Noor Azuan Bin Abu; Teoh, Kean Hooi

    2016-10-01

    The capability of field emission scanning electron microscopy and energy dispersive x-ray spectroscopy (FE-SEM/EDX) to scan material structures at the microlevel and characterize the material with its elemental properties has inspired this research, which has developed an FE-SEM/EDX-based cervical cancer screening system. The developed computer-aided screening system consisted of two parts: automatic feature extraction and classification. For the feature extraction part, an algorithm for extracting the discriminant features of FE-SEM/EDX data from images and spectra of cervical cells was introduced. The system automatically extracted two types of features based on FE-SEM/EDX images and FE-SEM/EDX spectra. Textural features were extracted from the FE-SEM/EDX image using a gray level co-occurrence matrix technique, while the FE-SEM/EDX spectra features were calculated based on peak heights and corrected area under the peaks using an algorithm. A discriminant analysis technique was employed to predict the cervical precancerous stage into three classes: normal, low-grade intraepithelial squamous lesion (LSIL), and high-grade intraepithelial squamous lesion (HSIL). The capability of the developed screening system was tested using 700 FE-SEM/EDX spectra (300 normal, 200 LSIL, and 200 HSIL cases). The accuracy, sensitivity, and specificity performances were 98.2%, 99.0%, and 98.0%, respectively.
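
    The GLCM texture step can be sketched with scikit-image (function names per skimage ≥ 0.19); the distances, angles, and property list are illustrative choices, not the paper's settings:

    ```python
    # Sketch of gray-level co-occurrence matrix (GLCM) texture features
    # from an 8-bit grayscale FE-SEM image region.
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    def glcm_features(gray_u8: np.ndarray) -> dict:
        glcm = graycomatrix(gray_u8, distances=[1], angles=[0, np.pi / 2],
                            levels=256, symmetric=True, normed=True)
        return {prop: float(graycoprops(glcm, prop).mean())
                for prop in ("contrast", "correlation",
                             "energy", "homogeneity")}
    ```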

  6. Information fusion for diabetic retinopathy CAD in digital color fundus photographs.

    PubMed

    Niemeijer, Meindert; Abramoff, Michael D; van Ginneken, Bram

    2009-05-01

    The purpose of computer-aided detection or diagnosis (CAD) technology has so far been to serve as a second reader. If, however, all relevant lesions in an image can be detected by CAD algorithms, use of CAD for automatic reading or prescreening may become feasible. This work addresses the question how to fuse information from multiple CAD algorithms, operating on multiple images that comprise an exam, to determine a likelihood that the exam is normal and would not require further inspection by human operators. We focus on retinal image screening for diabetic retinopathy, a common complication of diabetes. Current CAD systems are not designed to automatically evaluate complete exams consisting of multiple images for which several detection algorithm output sets are available. Information fusion will potentially play a crucial role in enabling the application of CAD technology to the automatic screening problem. Several different fusion methods are proposed and their effect on the performance of a complete comprehensive automatic diabetic retinopathy screening system is evaluated. Experiments show that the choice of fusion method can have a large impact on system performance. The complete system was evaluated on a set of 15,000 exams (60,000 images). The best performing fusion method obtained an area under the receiver operator characteristic curve of 0.881. This indicates that automated prescreening could be applied in diabetic retinopathy screening programs.
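
    Simple fusion rules of the kind evaluated can be sketched directly; these stand-ins (max, mean, noisy-OR) are generic choices, not the paper's exact methods:

    ```python
    # Sketch of exam-level fusion: combine per-image, per-detector lesion
    # likelihoods into one probability that the exam is abnormal.
    import numpy as np

    def fuse_max(scores: np.ndarray) -> float:
        return float(scores.max())          # most suspicious finding wins

    def fuse_mean(scores: np.ndarray) -> float:
        return float(scores.mean())         # average suspicion over the exam

    def fuse_noisy_or(scores: np.ndarray) -> float:
        return float(1.0 - np.prod(1.0 - scores))   # any-lesion probability

    exam = np.array([0.02, 0.10, 0.40, 0.05])   # several detector outputs
    print(fuse_max(exam), fuse_mean(exam), fuse_noisy_or(exam))
    ```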

  7. An Image-Based Algorithm for Precise and Accurate High Throughput Assessment of Drug Activity against the Human Parasite Trypanosoma cruzi

    PubMed Central

    Moraes, Carolina Borsoi; Yang, Gyongseon; Kang, Myungjoo; Freitas-Junior, Lucio H.; Hansen, Michael A. E.

    2014-01-01

    We present a customized high content (image-based) and high throughput screening algorithm for the quantification of Trypanosoma cruzi infection in host cells. Based solely on DNA staining and single-channel images, the algorithm precisely segments and identifies the nuclei and cytoplasm of mammalian host cells as well as the intracellular parasites infecting the cells. The algorithm outputs statistical parameters including the total number of cells, number of infected cells and the total number of parasites per image, the average number of parasites per infected cell, and the infection ratio (defined as the number of infected cells divided by the total number of cells). Accurate and precise estimation of these parameters allows quantification of both compound activity against parasites and compound cytotoxicity, thus eliminating the need for an additional toxicity assay and thereby reducing screening costs significantly. We validate the performance of the algorithm using two known drugs against T. cruzi: benznidazole and nifurtimox. We also checked the performance of the cell detection against manual inspection of the images. Finally, from the titration of the two compounds, we confirm that the algorithm provides the expected half maximal effective concentration (EC50) of the anti-T. cruzi activity. PMID:24503652
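
    The per-image statistics the algorithm outputs follow directly from the segmentation counts; a sketch with illustrative counts:

    ```python
    # Per-image infection statistics as defined in the abstract
    # (counts below are illustrative, not from the paper).
    def infection_stats(n_cells: int, n_infected: int,
                        n_parasites: int) -> dict:
        return {
            "infection_ratio": n_infected / n_cells,
            "parasites_per_infected_cell": (n_parasites / n_infected
                                            if n_infected else 0.0),
        }

    print(infection_stats(n_cells=250, n_infected=80, n_parasites=560))
    ```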

  8. The Initial Evaluation of Patients After Positive Newborn Screening: Recommended Algorithms Leading to a Confirmed Diagnosis of Pompe Disease.

    PubMed

    Burton, Barbara K; Kronn, David F; Hwu, Wuh-Liang; Kishnani, Priya S

    2017-07-01

    Newborn screening (NBS) for Pompe disease is done through analysis of acid α-glucosidase (GAA) activity in dried blood spots. When GAA levels are below established cutoff values, then second-tier testing is required to confirm or refute a diagnosis of Pompe disease. This article in the "Newborn Screening, Diagnosis, and Treatment for Pompe Disease" guidance supplement provides recommendations for confirmatory testing after a positive NBS result indicative of Pompe disease is obtained. Two algorithms were developed by the Pompe Disease Newborn Screening Working Group, a group of international experts on both NBS and Pompe disease, based on whether DNA sequencing is performed as part of the screening method. Using the recommendations in either algorithm will lead to 1 of 3 diagnoses: classic infantile-onset Pompe disease, late-onset Pompe disease, or no disease/not affected/carrier. Mutation analysis of the GAA gene is essential for confirming the biochemical diagnosis of Pompe disease. For NBS laboratories that do not have DNA sequencing capabilities, the responsibility of obtaining sequencing of the GAA gene will fall on the referral center. The recommendations for confirmatory testing and the initial evaluation are intended for a broad global audience. However, the Working Group recognizes that clinical practices, standards of care, and resource capabilities vary not only regionally, but also by testing centers. Individual patient needs and health status as well as local/regional insurance reimbursement programs and regulations also must be considered. Copyright © 2017 by the American Academy of Pediatrics.

  9. Empirical evaluation demonstrated importance of validating biomarkers for early detection of cancer in screening settings to limit the number of false-positive findings.

    PubMed

    Chen, Hongda; Knebel, Phillip; Brenner, Hermann

    2016-07-01

    Search for biomarkers for early detection of cancer is a very active area of research, but most studies are done in clinical rather than screening settings. We aimed to empirically evaluate the role of study setting for early detection marker identification and validation. A panel of 92 candidate cancer protein markers was measured in 35 clinically identified colorectal cancer patients and 35 colorectal cancer patients identified at screening colonoscopy. For each case group, we selected 38 controls without colorectal neoplasms at screening colonoscopy. Single-, two- and three-marker combinations discriminating cases and controls were identified in each setting and subsequently validated in the alternative setting. In all scenarios, a higher number of predictive biomarkers were initially detected in the clinical setting, but a substantially lower proportion of identified biomarkers could subsequently be confirmed in the screening setting. Confirmation rates were 50.0%, 84.5%, and 74.2% for one-, two-, and three-marker algorithms identified in the screening setting and were 42.9%, 18.6%, and 25.7% for algorithms identified in the clinical setting. Validation of early detection markers of cancer in a true screening setting is important to limit the number of false-positive findings. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  10. Arithmetic Skills in Using Algorithms

    DTIC Science & Technology

    1990-06-01

    that this bulb is really defective? (table continues) Table 1 (continued) Dyslexia: Dyslexia is a disorder characterized by an impaired ability to...read. Two percent (2%) of all first graders have dyslexia. A screening test for dyslexia has recently been devised that can be used with first graders...whether the child has dyslexia. The screening test is not completely accurate. For children who really have dyslexia, the screening test is positive

  11. Computer-aided diagnosis workstation and database system for chest diagnosis based on multi-helical CT images

    NASA Astrophysics Data System (ADS)

    Satoh, Hitoshi; Niki, Noboru; Mori, Kiyoshi; Eguchi, Kenji; Kaneko, Masahiro; Kakinuma, Ryutarou; Moriyama, Noriyuki; Ohmatsu, Hironobu; Masuda, Hideo; Machida, Suguru; Sasagawa, Michizou

    2006-03-01

    The multi-helical CT scanner has remarkably increased the speed at which chest CT images are acquired for mass screening. Mass screening based on multi-helical CT images requires a considerable number of images to be read. It is this time-consuming step that makes the use of helical CT for mass screening impractical at present. To overcome this problem, we have provided diagnostic assistance methods to medical screening specialists by developing a lung cancer screening algorithm that automatically detects suspected lung cancers in helical CT images and a coronary artery calcification screening algorithm that automatically detects suspected coronary artery calcification. We have also developed an electronic medical recording system and a prototype internet system for community health in two or more regions, using a virtual private network router together with biometric fingerprint and face authentication systems to secure medical information. Based on these diagnostic assistance methods, we have now developed a new computer-aided workstation and database that can display suspected lesions three-dimensionally in a short time. This paper describes basic studies that have been conducted to evaluate this new system. The results of this study indicate that our computer-aided diagnosis workstation and network system can increase diagnostic speed, diagnostic accuracy and the safety of medical information.

  12. Dana-Farber Cancer Institute (DFCI): Computational Correction of Copy-number Effect in CRISPR-Cas9 Essentiality Screens of Cancer Cells | Office of Cancer Genomics

    Cancer.gov

    Genome-wide CRISPR-Cas9 screens were performed in 341 cell lines. The results were processed with the CERES algorithm to produce copy-number and guide-efficacy corrected gene-knockout effect estimates.

  13. Dana-Farber Cancer Institute (DFCI): Computational Correction of Copy-number Effect in CRISPR-Cas9 Essentiality Screens of Cancer Cells | Office of Cancer Genomics

    Cancer.gov

    Genome-wide CRISPR-Cas9 screens were performed in 341 cell lines. The results were processed with the CERES algorithm to produce copy-number and guide-efficacy corrected gene knockout effect estimates.

  14. Feature Screening in Ultrahigh Dimensional Cox's Model.

    PubMed

    Yang, Guangren; Yu, Ye; Li, Runze; Buu, Anne

    Survival data with ultrahigh dimensional covariates such as genetic markers have been collected in medical studies and other fields. In this work, we propose a feature screening procedure for the Cox model with ultrahigh dimensional covariates. The proposed procedure is distinguished from the existing sure independence screening (SIS) procedures (Fan, Feng and Wu, 2010, Zhao and Li, 2012) in that the proposed procedure is based on joint likelihood of potential active predictors, and therefore is not a marginal screening procedure. The proposed procedure can effectively identify active predictors that are jointly dependent but marginally independent of the response without performing an iterative procedure. We develop a computationally effective algorithm to carry out the proposed procedure and establish the ascent property of the proposed algorithm. We further prove that the proposed procedure possesses the sure screening property. That is, with the probability tending to one, the selected variable set includes the actual active predictors. We conduct Monte Carlo simulation to evaluate the finite sample performance of the proposed procedure and further compare the proposed procedure and existing SIS procedures. The proposed methodology is also demonstrated through an empirical analysis of a real data example.

  15. Receipt of Cancer Screening Is a Predictor of Life Expectancy.

    PubMed

    Goodwin, James S; Sheffield, Kristin; Li, Shuang; Tan, Alai

    2016-11-01

    Obtaining cancer screening on patients with limited life expectancy has been proposed as a measure for low quality care for primary care physicians (PCPs). However, administrative data may underestimate life expectancy in patients who undergo screening. To determine the association between receipt of screening mammography or PSA and overall survival. Retrospective cohort study from 1/1/1999 to 12/31/2012. Receipt of screening was assessed for 2001-2002 and survival from 1/1/2003 to 12/31/2012. Life expectancy was estimated as of 1/1/03 using a validated algorithm, and was compared to actual survival for men and women, stratified by receipt of cancer screening. A 5 % sample of Medicare beneficiaries aged 69-90 years as of 1/1/2003 (n = 906,723). Receipt of screening mammography in 2001-2002 for women, or a screening PSA test in 2002 for men. Survival from 1/1/2003 through 12/31/2012. Subjects were stratified by life expectancy based on age and comorbidity. Within each stratum, the subjects with prior cancer screening had actual median survivals higher than those who were not screened, with differences ranging from 1.7 to 2.1 years for women and 0.9 to 1.1 years for men. In a Cox model, non-receipt of screening in women had an impact on survival (HR = 1.52; 95 % CI = 1.51, 1.54) similar in magnitude to a diagnosis of complicated diabetes or heart failure, and was comparable to uncomplicated diabetes or liver disease in men (HR = 1.23; 1.22, 1.25). Receipt of cancer screening is a powerful marker of health status that is not captured by comorbidity measures in administrative data. Because life expectancy algorithms using administrative data underestimate the life expectancy of patients who undergo screening, they can overestimate the problem of cancer screening in patients with limited life expectancy.

  16. Combat Aircraft Maneuverability.

    DTIC Science & Technology

    1981-12-01

    q: pitch angular rate (when attempting to control the lateral attitude); r: yaw angular rate. 1. Introduction: Numerous symposia...in the Central European environment are not related to manoeuvrability but to the availability of military subsystems. INTRODUCTION 1. The mission of...crowded air-to-air arena of more importance. DESIRED IMPROVEMENTS 12. Introduction. In the previous paragraphs I reached the conclusion that in an

  17. Benthic and Sedimentologic Studies on the Charleston Harbor Ocean Disposal Area, Charleston Harbor Deepening Project.

    DTIC Science & Technology

    1979-08-01

    Nolella stipata 1 Phylum Sipuncula Sipunculid (undet.) 14 Phylum Echiura Echiurid (undet.) 1 Phylum Annelida Hydroides dianthus 27 Sabellaria...(undet.) Lepidonotus sublevis Sabellaria vulgaris Hydroides dianthus 9 Phylum Mollusca Polinices duplicatus Modiolus modiolus squamosus...tenuis Microporella ciliata Schizoporella cornuta Phylum Annelida Hydroides dianthus Sabellaria vulgaris 41 Phylum Mollusca Crepidula plana Phylum

  18. Mode Reduction and Upscaling of Reactive Transport Under Incomplete Mixing

    NASA Astrophysics Data System (ADS)

    Lester, D. R.; Bandopadhyay, A.; Dentz, M.; Le Borgne, T.

    2016-12-01

    Upscaling of chemical reactions in partially-mixed fluid environments is a challenging problem due to the detailed interactions between inherently nonlinear reaction kinetics and complex spatio-temporal concentration distributions under incomplete mixing. We address this challenge by developing an order-reduction method for the advection-diffusion-reaction equation (ADRE) that projects the reaction kinetics onto a small number N of leading eigenmodes of the advection-diffusion operator (the so-called "strange eigenmodes" of the flow) as an N-by-N nonlinear system, whilst mixing dynamics only are projected onto the remaining modes. For simple kinetics and moderate Péclet and Damköhler numbers, this approach yields analytic solutions for the concentration mean, the evolving spatio-temporal distribution, and the PDF in terms of the well-mixed reaction kinetics and mixing dynamics. For more complex kinetics or large Péclet or Damköhler numbers, only a small number of modes are required to accurately quantify the mixing and reaction dynamics in terms of the concentration field and PDF, facilitating greatly simplified approximation and analysis of reactive transport. Approximate solutions of this low-order nonlinear system provide quantitative predictions of the evolving concentration PDF. We demonstrate application of this method to a simple random flow and various mass-action reaction kinetics.
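
    The projection step can be written compactly. A sketch of the reduced system in generic notation (not necessarily the authors' exact formulation):

    ```latex
    % Expand the concentration in the leading advection-diffusion eigenmodes:
    \[
      c(\mathbf{x},t) \;\approx\; \sum_{k=1}^{N} a_k(t)\,\phi_k(\mathbf{x}),
      \qquad \mathcal{L}\,\phi_k = \lambda_k \phi_k ,
    \]
    % Galerkin projection of the ADRE \(\partial_t c = \mathcal{L}c + R(c)\)
    % onto these modes yields the N-by-N nonlinear system:
    \[
      \frac{da_k}{dt} \;=\; \lambda_k a_k
      \;+\; \Big\langle \phi_k,\; R\Big(\sum_{j=1}^{N} a_j\,\phi_j\Big) \Big\rangle ,
      \qquad k = 1,\dots,N .
    \]
    ```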

  19. Synoptic ozone, cloud reflectivity, and erythemal irradiance from sunrise to sunset for the whole earth as viewed by the DSCOVR spacecraft from the earth-sun Lagrange 1 orbit

    NASA Astrophysics Data System (ADS)

    Herman, Jay; Huang, Liang; McPeters, Richard; Ziemke, Jerry; Cede, Alexander; Blank, Karin

    2018-01-01

    EPIC (Earth Polychromatic Imaging Camera) on board the DSCOVR (Deep Space Climate Observatory) spacecraft is the first earth science instrument located near the earth-sun gravitational plus centrifugal force balance point, Lagrange 1. EPIC measures earth-reflected radiances in 10 wavelength channels ranging from 317.5 to 779.5 nm. Of these channels, four are in the UV range 317.5, 325, 340, and 388 nm, which are used to retrieve O3, 388 nm scene reflectivity (LER: Lambert equivalent reflectivity), SO2, and aerosol properties. These new synoptic quantities are retrieved for the entire sunlit globe from sunrise to sunset multiple times per day as the earth rotates in EPIC's field of view. Retrieved ozone amounts agree with ground-based measurements and satellite data to within 3%. The ozone amounts and LER are combined to derive the erythemal irradiance for the earth's entire sunlit surface at a nadir resolution of 18 × 18 km² using a computationally efficient approximation to a radiative transfer calculation of irradiance. The results show very high summertime values of the UV index (UVI) in the Andes and Himalayas (greater than 18), and high values of UVI near the Equator at equinox.

  20. Human Habitation in a Lunar Electric Rover During a 14-Day Field Trial

    NASA Technical Reports Server (NTRS)

    Litaker, Harry, Jr.; Thompson, Shelby; Howard, Robert, Jr.

    2010-01-01

    Various military and commercial entities, as well as the National Aeronautics and Space Administration (NASA), have conducted space cabin confinement studies. However, after an extensive literature search, only one study was found using a simulated lunar rover (LUNEX II), under laboratory conditions, with a crew of two for an eighteen day lunar mission. Forty-three years later, NASA human factors engineers conducted a similar study using the Lunar Electric Rover (LER) in a dynamic real-world lunar simulation at the Black Point Lava Flow in Arizona. The objective of the study was to obtain human-in-the-loop performance data on the vehicle's interior volume with respect to human-system interfaces, crew accommodations, and habitation over a 14-day mission. Though part of a larger study including 212 overall operational elements, this paper will discuss only the performance of fifty different daily habitational elements carried out by two male subjects within the confines of the vehicle. Objective timing data and subjective questionnaire data were collected. Results indicate that, much like in the LUNEX II study, a crew of two was able to maintain satisfactory task performance throughout the 14-day field trial within the LER's relatively small interior volume.

  1. Exploration of suitable dry etch technologies for directed self-assembly

    NASA Astrophysics Data System (ADS)

    Yamashita, Fumiko; Nishimura, Eiichi; Yatsuda, Koichi; Mochiki, Hiromasa; Bannister, Julie

    2012-03-01

    Directed self-assembly (DSA) has shown the potential to replace traditional resist patterns and provide a lower-cost alternative for sub-20-nm patterns. One possible roadblock for DSA implementation is the ability to etch the polymers to produce quality masks for subsequent etch processes. We have studied the effects of RF frequency and etch chemistry for dry-developing DSA patterns. The study showed that a capacitively coupled plasma (CCP) reactor running at very high frequency (VHF) gave superior pattern development after the block copolymer (BCP) etch, with minimal BCP height loss and minimal line edge roughness (LER)/line width roughness (LWR). The advantage of CCP over inductively coupled plasma (ICP) is its lower dissociation, which keeps the BCP etch rate low enough for process control. Additionally, the advantage of VHF is its low electron energy with a tight ion energy distribution, which enables removal of the polymethyl methacrylate (PMMA) with good selectivity to polystyrene (PS) and minimal LER/LWR. Etch chemistries were evaluated on the VHF CCP to determine their ability to treat the BCPs to increase etch resistance and feature resolution. The right combination of RF source frequencies and etch chemistry can help overcome the challenges of using DSA patterns to create good etch results.

  2. A pan-Canadian practice guideline and algorithm: screening, assessment, and supportive care of adults with cancer-related fatigue

    PubMed Central

    Howell, D.; Keller–Olaman, S.; Oliver, T.K.; Hack, T.F.; Broadfield, L.; Biggs, K.; Chung, J.; Gravelle, D.; Green, E.; Hamel, M.; Harth, T.; Johnston, P.; McLeod, D.; Swinton, N.; Syme, A.; Olson, K.

    2013-01-01

    Purpose The purpose of the present systematic review was to develop a practice guideline to inform health care providers about screening, assessment, and effective management of cancer-related fatigue (CRF) in adults. Methods The internationally endorsed ADAPTE methodology was used to develop a practice guideline for pan-Canadian use. A systematic search of the literature identified a broad range of evidence: clinical practice guidelines, systematic reviews, and other guidance documents on the screening, assessment, and management of CRF. The search included MEDLINE, EMBASE, CINAHL, the Cochrane Library, and other guideline and data sources to December 2009. Results Two clinical practice guidelines were identified for adaptation. Seven guidance documents and four systematic reviews also provided supplementary evidence to inform guideline recommendations. Health professionals across Canada provided expert feedback on the adapted recommendations in the practice guideline and algorithm through a participatory external review process. Conclusions Practice guidelines can facilitate the adoption of evidence-based assessment and interventions for adult cancer patients experiencing fatigue. Development of an algorithm to guide decision-making in practice may also foster the uptake of a guideline into routine care. PMID:23737693

  3. Determination of Cole-Cole parameters using only the real part of electrical impedivity measurements.

    PubMed

    Miranda, David A; Rivera, S A López

    2008-05-01

    An algorithm is presented to determine the Cole-Cole parameters of electrical impedivity using only measurements of its real part. The algorithm is based on two multi-fold direct inversion methods, for the Cole-Cole and Debye equations respectively, and a genetic algorithm that minimizes the mean square error between experimental and calculated data. The algorithm was developed to obtain the Cole-Cole parameters from experimental data used to screen for cervical intra-epithelial neoplasia. The proposed algorithm was compared with several numerical integrations of the Kramers-Kronig relation and performed best, with high immunity to noise.
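
    As a rough illustration of the optimization step only (the direct-inversion stages are omitted, and SciPy's differential evolution stands in for the paper's genetic algorithm), one can fit the real part of a Cole-Cole model by minimizing the mean square error:

      import numpy as np
      from scipy.optimize import differential_evolution

      def cole_cole_real(omega, r_inf, r0, tau, alpha):
          # real part of Z = R_inf + (R0 - R_inf) / (1 + (j*omega*tau)**alpha)
          z = r_inf + (r0 - r_inf) / (1.0 + (1j*omega*tau)**alpha)
          return z.real

      omega = np.logspace(2, 7, 60)                  # angular frequencies
      true = (10.0, 100.0, 1e-4, 0.8)                # synthetic "measurement"
      rng = np.random.default_rng(0)
      data = cole_cole_real(omega, *true) + rng.normal(0.0, 0.1, omega.size)

      def mse(p):
          return np.mean((cole_cole_real(omega, *p) - data)**2)

      bounds = [(0, 50), (50, 200), (1e-6, 1e-2), (0.3, 1.0)]
      fit = differential_evolution(mse, bounds, seed=0)
      print("recovered parameters:", fit.x)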

  4. Image quality classification for DR screening using deep learning.

    PubMed

    FengLi Yu; Jing Sun; Annan Li; Jun Cheng; Cheng Wan; Jiang Liu

    2017-07-01

    The quality of input images significantly affects the outcome of automated diabetic retinopathy (DR) screening systems. Unlike previous methods that consider only simple low-level features, such as hand-crafted geometric and structural features, in this paper we propose a novel method for retinal image quality classification (IQC) whose computational algorithms imitate the workings of the human visual system. The proposed algorithm combines unsupervised features from a saliency map with supervised features from convolutional neural networks (CNN), which are fed to an SVM that automatically classifies retinal fundus images as high or poor quality. We demonstrate the superior accuracy of the proposed algorithm over other methods on a large retinal fundus image dataset. Although retinal images are used in this study, the methodology is applicable to image quality assessment and enhancement for other types of medical images.
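
    A minimal sketch of this late-fusion design, with toy stand-ins for the saliency and CNN feature extractors (the real pipeline's features and SVM settings are not specified here):

      import numpy as np
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      imgs = rng.random((40, 64, 64))                # toy "fundus" images
      labels = rng.integers(0, 2, 40)                # 1 = good quality, 0 = poor

      def saliency_features(img):
          # stand-in for unsupervised saliency-map statistics
          return np.histogram(img, bins=8, range=(0.0, 1.0))[0] / img.size

      def cnn_features(img):
          # stand-in for a CNN embedding: mean-pooled 16x16 blocks
          return img.reshape(4, 16, 4, 16).mean(axis=(1, 3)).ravel()

      X = np.stack([np.concatenate([saliency_features(i), cnn_features(i)])
                    for i in imgs])                  # concatenated feature vector
      clf = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X, labels)
      print(clf.predict(X[:5]))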

  5. Cloud screening Coastal Zone Color Scanner images using channel 5

    NASA Technical Reports Server (NTRS)

    Eckstein, B. A.; Simpson, J. J.

    1991-01-01

    Clouds are removed from Coastal Zone Color Scanner (CZCS) data using channel 5. Instrumentation problems require pre-processing of channel 5 before an intelligent cloud-screening algorithm can be used. For example, at intervals of about 16 lines the sensor records anomalously low radiances; the calibration equation yields negative radiances when the sensor records zero counts; and pixels corrupted by electronic overshoot must also be excluded. The remaining pixels may then be used with the procedure of Simpson and Humphrey to determine the CZCS cloud mask. These results, together with in situ observations of phytoplankton pigment concentration, show that pre-processing and proper cloud screening of CZCS data are necessary for accurate satellite-derived pigment concentrations. This is especially true in the coastal margins, where pigment content is high and image distortion associated with electronic overshoot is also present. The pre-processing algorithm is critical to obtaining accurate global estimates of pigment from spacecraft data.
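
    A minimal sketch of such a pre-processing mask, with assumed thresholds (the paper's actual criteria are more careful):

      import numpy as np

      def channel5_mask(rad):
          mask = rad > 0.0                                  # drop negative calibrated radiances
          line_mean = np.nanmean(np.where(mask, rad, np.nan), axis=1)
          dark = line_mean < 0.5*np.nanmedian(line_mean)    # anomalously dark scan lines
          mask[dark, :] = False
          jump = np.abs(np.diff(rad, axis=1, prepend=rad[:, :1]))
          mask &= jump < 5.0*rad.std()                      # crude electronic-overshoot screen
          return mask

      rng = np.random.default_rng(1)
      rad = rng.gamma(2.0, 1.0, (64, 64))                   # toy radiance scene
      rad[::16, :] *= 0.05                                  # simulate the low-radiance lines
      print("fraction of usable pixels:", channel5_mask(rad).mean())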

  6. Rationally reduced libraries for combinatorial pathway optimization minimizing experimental effort.

    PubMed

    Jeschek, Markus; Gerngross, Daniel; Panke, Sven

    2016-03-31

    Rational flux design in metabolic engineering remains difficult because important pathway information is frequently unavailable. Therefore, empirical methods are applied that randomly vary absolute and relative pathway enzyme levels and subsequently screen for variants with improved performance. However, screening is often limited on the analytical side, generating a strong incentive to construct small but smart libraries. Here we introduce RedLibs (Reduced Libraries), an algorithm that allows for the rational design of smart combinatorial libraries for pathway optimization, thereby minimizing the use of experimental resources. We demonstrate the utility of RedLibs for the design of ribosome-binding-site libraries by in silico and in vivo screening with fluorescent proteins, and we perform a simple two-step optimization of product selectivity in the branched multistep pathway for violacein biosynthesis, indicating the general applicability of the algorithm and the proposed heuristics. We expect that RedLibs will substantially simplify the refactoring of synthetic metabolic pathways.

  7. Regional Evaluation of the Severity-Based Stroke Triage Algorithm for Emergency Medical Services Using Discrete Event Simulation.

    PubMed

    Bogle, Brittany M; Asimos, Andrew W; Rosamond, Wayne D

    2017-10-01

    The Severity-Based Stroke Triage Algorithm for Emergency Medical Services endorses routing patients with suspected large vessel occlusion (LVO) acute ischemic strokes directly to endovascular stroke centers (ESCs). We sought to evaluate different specifications of this algorithm within a region. We developed a discrete event simulation environment to model patients with suspected stroke transported according to algorithm specifications, which varied by stroke severity screen and by the permissible additional transport time for routing patients to ESCs. We simulated King County, Washington, and Mecklenburg County, North Carolina, distributing patients geographically into census tracts. Transport time to the nearest hospital and ESC was estimated using traffic-based travel times. We assessed undertriage, overtriage, transport time, and the number-needed-to-route, defined as the number of patients enduring additional transport to route one LVO patient to an ESC. Undertriage was higher and overtriage was lower in King County than in Mecklenburg County for each specification. Overtriage variation was primarily driven by the screen (e.g., 13%-55% in Mecklenburg County and 10%-40% in King County). Transport-time specifications beyond 20 minutes increased overtriage and decreased undertriage in King County but not Mecklenburg County. A low- versus high-specificity screen routed 3.7 times more patients to ESCs. Emergency medical services spent nearly twice the time routing patients to ESCs in King County compared with Mecklenburg County. Our results demonstrate how discrete event simulation can facilitate informed decision-making to optimize emergency medical services stroke severity-based triage algorithms. This is the first step toward developing a mature simulation to predict patient outcomes. © 2017 American Heart Association, Inc.
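
    The triage metrics are simple to compute once routing decisions have been simulated; the sketch below uses entirely made-up prevalence, screen performance, and transport-time numbers to show one plausible set of definitions, not the study's simulation:

      import numpy as np

      rng = np.random.default_rng(0)
      n = 100_000
      is_lvo = rng.random(n) < 0.10                      # assumed LVO prevalence
      sens, spec = 0.80, 0.70                            # hypothetical screen performance
      screen_pos = np.where(is_lvo, rng.random(n) < sens,
                            rng.random(n) < 1.0 - spec)
      extra_min = rng.uniform(0.0, 40.0, n)              # added minutes to reach the ESC
      routed = screen_pos & (extra_min <= 20.0)          # 20-min permissible extra transport

      undertriage = (is_lvo & ~routed).sum() / is_lvo.sum()   # LVO patients missing the ESC
      overtriage = (routed & ~is_lvo).sum() / routed.sum()    # non-LVO patients routed
      nnr = routed.sum() / (routed & is_lvo).sum()            # patients routed per LVO routed
      print(undertriage, overtriage, nnr)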

  8. Evaluation of the genotypic prediction of HIV-1 coreceptor use versus a phenotypic assay and correlation with the virological response to maraviroc: the ANRS GenoTropism study.

    PubMed

    Recordon-Pinson, Patricia; Soulié, Cathia; Flandre, Philippe; Descamps, Diane; Lazrek, Mouna; Charpentier, Charlotte; Montes, Brigitte; Trabaud, Mary-Anne; Cottalorda, Jacqueline; Schneider, Véronique; Morand-Joubert, Laurence; Tamalet, Catherine; Desbois, Delphine; Macé, Muriel; Ferré, Virginie; Vabret, Astrid; Ruffault, Annick; Pallier, Coralie; Raymond, Stéphanie; Izopet, Jacques; Reynes, Jacques; Marcelin, Anne-Geneviève; Masquelier, Bernard

    2010-08-01

    Genotypic algorithms for prediction of HIV-1 coreceptor usage need to be evaluated in a clinical setting. We aimed to study (i) the correlation of genotypic prediction of coreceptor use with a phenotypic assay and (ii) the relationship between genotypic prediction of coreceptor use at baseline and the virological response (VR) to therapy including maraviroc (MVC). Antiretroviral-experienced patients were included in the MVC Expanded Access Program if they had an R5 screening result with Trofile (Monogram Biosciences). V3 loop sequences were determined at screening, and coreceptor use was predicted using 13 genotypic algorithms or combinations of algorithms. Genotypic predictions were compared to Trofile; dual or mixed (D/M) variants were considered X4 variants. Both genotypic and phenotypic results were obtained for 189 patients at screening, with 54 isolates scored as X4 or D/M and 135 scored as R5 with Trofile. The highest sensitivity (59.3%) for detection of X4 was obtained with the Geno2pheno algorithm with a false-positive rate set at 10% (Geno2pheno10). In the 112 patients receiving MVC, a plasma viral RNA load of <50 copies/ml was obtained in 68% of cases at month 6. In multivariate analysis, prediction of the X4 genotype at baseline with the Geno2pheno10 algorithm, including baseline viral load and CD4 nadir, was independently associated with a worse VR at months 1 and 3. The baseline weighted genotypic sensitivity score was associated with VR at month 6. There were strong arguments in favor of using genotypic coreceptor use assays to determine which patients would respond to a CCR5 antagonist.

  9. Optic disc segmentation for glaucoma screening system using fundus images.

    PubMed

    Almazroa, Ahmed; Sun, Weiwei; Alodhayb, Sami; Raahemifar, Kaamran; Lakshminarayanan, Vasudevan

    2017-01-01

    Segmenting the optic disc (OD) is an important and essential step in creating a frame of reference for diagnosing optic nerve head pathologies such as glaucoma; a reliable OD segmentation technique is therefore necessary for automatic screening of optic nerve head abnormalities. The main contribution of this paper is a novel OD segmentation algorithm based on applying a level set method to a localized OD image. To prevent the blood vessels from interfering with the level set process, an inpainting technique was applied first. A further contribution was to incorporate the variation in opinion among ophthalmologists in detecting the disc boundaries and diagnosing glaucoma: most previous studies were trained and tested against a single opinion, which can be assumed to be biased toward that ophthalmologist. In addition, accuracy was calculated based on the number of images that coincided with the ophthalmologists' agreed-upon images, not only on the overlapping images as in previous studies. The ultimate goal of this project is to develop an automated image processing system for glaucoma screening. The disc algorithm is evaluated using a new retinal fundus image dataset called RIGA (retinal images for glaucoma analysis). For low-quality images, a double level set was applied, in which the first level set served to localize the OD. Five hundred and fifty images were used to test the algorithm's accuracy as well as the agreement among the manual markings of six ophthalmologists. The accuracy of the algorithm in marking the optic disc area and centroid was 83.9%, and the best agreement between the algorithm and the manual markings was observed in 379 images.
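
    The inpaint-then-level-set idea can be sketched in a few lines with scikit-image; a biharmonic inpainter and a morphological Chan-Vese level set stand in for the paper's specific methods, and the image is synthetic:

      import numpy as np
      from skimage.restoration import inpaint_biharmonic
      from skimage.segmentation import morphological_chan_vese

      img = np.full((96, 96), 0.2)
      yy, xx = np.mgrid[:96, :96]
      img[(yy - 48)**2 + (xx - 48)**2 < 20**2] = 1.0     # bright optic-disc blob
      vessels = np.zeros_like(img, dtype=bool)
      vessels[:, 46:50] = True                           # dark "vessel" crossing the disc
      img[vessels] = 0.0

      filled = inpaint_biharmonic(img, vessels)          # paint out the vessels first
      disc = morphological_chan_vese(filled, 80)         # level set on the inpainted image
      print(disc.sum(), "pixels inside the contour")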

  10. An Automated Algorithm to Screen Massive Training Samples for a Global Impervious Surface Classification

    NASA Technical Reports Server (NTRS)

    Tan, Bin; Brown de Colstoun, Eric; Wolfe, Robert E.; Tilton, James C.; Huang, Chengquan; Smith, Sarah E.

    2012-01-01

    An algorithm is developed to automatically screen outliers from the massive training samples used in the Global Land Survey Imperviousness Mapping Project (GLS-IMP). GLS-IMP will produce a global, 30 m spatial resolution impervious-cover data set for the years 2000 and 2010 based on the Landsat Global Land Survey (GLS) data set. This unprecedented high-resolution impervious-cover data set is significant not only for urbanization studies but also for global carbon, hydrology, and energy-balance research. A supervised classification method, the regression tree, is applied in this project, and a set of accurate training samples is the key to any supervised classification. We develop global training samples from fine-resolution (about 1 m) satellite data (QuickBird and WorldView-2) and then aggregate the fine-resolution impervious-cover maps to 30 m resolution. To improve classification accuracy, the training samples should be screened before being used to train the regression tree, but manually screening 30 m training samples collected globally is impossible: in Europe alone there are 174 training sites, ranging in size from 4.5 km by 4.5 km to 8.1 km by 3.6 km, with over six million training samples. We therefore developed an automated, statistics-based algorithm that screens the training samples at two levels: site and scene. At the site level, the training samples are divided into 10 groups according to the percentage of impervious surface within a sample pixel, with each 10% interval forming one group; for each group, both univariate and multivariate outliers are detected and removed. The screening then escalates to the scene level, where a similar procedure with a looser threshold is applied to allow for variance due to site differences. We do not screen across scenes, because scenes may vary with phenology, solar-view geometry, and atmospheric conditions rather than actual land-cover differences. Finally, we compare classification results from screened and unscreened training samples to assess the improvement achieved by cleaning the training samples.
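
    A minimal sketch of the site-level pass, with assumed cutoffs (a z-score of 3 for the univariate test and a chi-squared quantile on the Mahalanobis distance for the multivariate test); the scene-level pass would repeat this with looser thresholds:

      import numpy as np
      from scipy.stats import chi2

      def screen_training_samples(X, frac, z=3.0, p=0.999):
          keep = np.ones(len(X), dtype=bool)
          for b in range(10):                              # one group per 10% impervious bin
              idx = np.where((frac >= b/10) & (frac < (b + 1)/10))[0]
              if idx.size < X.shape[1] + 2:
                  continue                                 # too few samples to screen
              sub = X[idx]
              zs = np.abs((sub - sub.mean(0)) / (sub.std(0) + 1e-12))
              d = sub - sub.mean(0)                        # Mahalanobis distances
              md2 = np.einsum('ij,jk,ik->i', d,
                              np.linalg.pinv(np.cov(sub, rowvar=False)), d)
              keep[idx] = (zs < z).all(1) & (md2 < chi2.ppf(p, df=X.shape[1]))
          return keep

      rng = np.random.default_rng(0)
      X = rng.normal(size=(1000, 6))                       # toy spectral features
      frac = rng.random(1000)                              # impervious fraction per sample
      print("fraction kept:", screen_training_samples(X, frac).mean())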

  11. Simulating large atmospheric phase screens using a woofer-tweeter algorithm.

    PubMed

    Buscher, David F

    2016-10-03

    We describe an algorithm for simulating atmospheric wavefront perturbations over ranges of spatial and temporal scales spanning more than 4 orders of magnitude. An open-source implementation of the algorithm, written in Python, can simulate the evolution of the perturbations more than an order of magnitude faster than real time. Testing of the implementation using metrics appropriate to adaptive optics systems and long-baseline interferometers shows accuracies at the few-percent level or better.
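
    The single-scale FFT screen on which multi-scale (woofer-tweeter) schemes build is easy to sketch. Note this is the textbook method, not the paper's algorithm: an FFT screen alone under-samples the largest spatial scales, which is precisely what a low-order "woofer" component corrects.

      import numpy as np

      def fft_phase_screen(n=256, dx=0.02, r0=0.1, seed=0):
          rng = np.random.default_rng(seed)
          fx = np.fft.fftfreq(n, d=dx)                     # spatial frequencies, 1/m
          f = np.hypot(*np.meshgrid(fx, fx, indexing='ij'))
          f[0, 0] = np.inf                                 # suppress the undefined piston mode
          psd = 0.023 * r0**(-5.0/3.0) * f**(-11.0/3.0)    # Kolmogorov phase PSD
          df = 1.0/(n*dx)
          c = (rng.standard_normal((n, n)) + 1j*rng.standard_normal((n, n)))
          # normalization approximate; low-order modes are under-sampled by design
          return np.fft.fft2(c*np.sqrt(psd)*df).real       # phase screen in radians

      screen = fft_phase_screen()
      print("rms phase:", screen.std(), "rad")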

  12. Evaluation of the performance of 3D virtual screening protocols: RMSD comparisons, enrichment assessments, and decoy selection--what can we learn from earlier mistakes?

    PubMed

    Kirchmair, Johannes; Markt, Patrick; Distinto, Simona; Wolber, Gerhard; Langer, Thierry

    2008-01-01

    Within the last few years, a considerable number of evaluative studies have been published that investigate the performance of 3D virtual screening approaches. Assessments of protein-ligand docking in particular attract remarkable interest in the scientific community. However, comparing virtual screening approaches is a non-trivial task. Several publications, especially in the field of molecular docking, suffer from shortcomings that are likely to affect the significance of their results considerably. These quality issues often arise from poor study design, biasing, the use of improper or inexpressive enrichment descriptors, and errors in interpreting the data output. In this review we analyze recent literature evaluating 3D virtual screening methods, with a focus on molecular docking. We highlight problematic issues and provide guidelines on how to improve the quality of computational studies. Since 3D virtual screening protocols are in general assessed by their ability to discriminate between active and inactive compounds, we summarize the impact of the composition and preparation of test sets on the outcome of evaluations. Moreover, we investigate the significance of both classic enrichment parameters and advanced descriptors for the performance of 3D virtual screening methods. Furthermore, we review the significance and suitability of RMSD as a measure of the accuracy of protein-ligand docking algorithms and of conformational-space subsampling algorithms.
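
    Since classic enrichment descriptors recur throughout such evaluations, the following minimal sketch (not tied to any particular study reviewed here) computes the standard enrichment factor EF@x%: the rate of actives recovered in the top-ranked x% of a screened library relative to random selection.

      import numpy as np

      def enrichment_factor(scores, is_active, top_frac=0.01):
          order = np.argsort(scores)[::-1]                 # rank by decreasing score
          n_top = max(1, int(round(top_frac*len(scores))))
          hits = is_active[order][:n_top].sum()            # actives in the top x%
          return (hits/n_top) / (is_active.sum()/len(scores))

      rng = np.random.default_rng(0)
      scores = rng.random(5000)                            # toy docking scores
      is_active = rng.random(5000) < 0.02                  # toy activity labels
      print("EF@1%:", enrichment_factor(scores, is_active, 0.01))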

  13. Development of Repair Materials for Avulsive Combat-Type Maxillofacial Injuries.

    DTIC Science & Technology

    1981-11-01

    ...citric acid, isocitric acid, alpha-ketoglutaric acid, suc-... A sample of 0.2702 grams of the resulting composition was shaped into the form of a... polylactic acid and inorganic materials such as calcium carbonate or calcium sulfate. The chemistry of the system proposed for development includes... acid and high-molecular-weight poly(propylene fumarate) were synthesized at Dynatech to be used as organic fillers. Inorganic fillers such as calcium...

  14. Automated two-point dixon screening for the evaluation of hepatic steatosis and siderosis: comparison with R2-relaxometry and chemical shift-based sequences.

    PubMed

    Henninger, B; Zoller, H; Rauch, S; Schocke, M; Kannengiesser, S; Zhong, X; Reiter, G; Jaschke, W; Kremser, C

    2015-05-01

    To evaluate the automated two-point Dixon screening sequence for the detection and estimated quantification of hepatic iron and fat, compared with standard sequences as a reference, one hundred and two patients with suspected diffuse liver disease were included in this prospective study. The following MRI protocol was used: 3D T1-weighted opposed- and in-phase gradient echo with two-point Dixon reconstruction and a dual-ratio signal discrimination algorithm (the "screening" sequence); a fat-saturated multi-gradient-echo sequence with 12 echoes; and gradient-echo T1 FLASH opposed- and in-phase. Bland-Altman plots were generated and correlation coefficients were calculated to compare the sequences. The screening sequence diagnosed fat in 33 patients, iron in 35, and a combination of both in 4. Correlation between R2* values from the screening sequence and standard relaxometry was excellent (r = 0.988); a slightly lower correlation (r = 0.978) was found between the fat fractions of the screening and standard sequences. Bland-Altman analysis revealed systematically lower R2* values from the screening sequence and higher fat-fraction values from the standard sequence, with rather high variability in agreement. The screening sequence is a promising method for fast diagnosis of the predominant liver disease, and it can estimate the amount of hepatic fat and iron comparably to standard methods. • MRI plays a major role in the clarification of diffuse liver disease. • The screening sequence was introduced for the assessment of diffuse liver disease. • It is a fast and automated algorithm for the evaluation of hepatic iron and fat. • It is capable of estimating the amount of hepatic fat and iron.
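
    The core of any two-point Dixon method is the textbook in-phase/opposed-phase recombination; the sequence's dual-ratio discrimination and confounder corrections (e.g., R2*, field map) are not reproduced in this sketch:

      import numpy as np

      def two_point_dixon(in_phase, opposed):
          water = 0.5*(in_phase + opposed)       # water and fat are in phase in IP images
          fat = 0.5*(in_phase - opposed)         # and out of phase in OP images
          denom = np.abs(water) + np.abs(fat)
          fat_fraction = np.abs(fat)/np.maximum(denom, 1e-9)
          return water, fat, fat_fraction

      rng = np.random.default_rng(0)
      ip = rng.uniform(0.5, 1.0, (64, 64))       # toy in-phase magnitudes
      op = rng.uniform(0.0, 0.5, (64, 64))       # toy opposed-phase magnitudes
      _, _, ff = two_point_dixon(ip, op)
      print("mean fat fraction:", ff.mean())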

  15. Cost and Efficacy Assessment of an Alternative Medication Compliance Urine Drug Testing Strategy.

    PubMed

    Doyle, Kelly; Strathmann, Frederick G

    2017-02-01

    This study investigates how frequently quantitative results provide additional clinical benefit compared with qualitative results alone. A comparison between alternative urine drug screens and conventional screens was also included, covering cost-to-payer differences and accuracy in assessing prescription compliance and polypharmacy/substance abuse. In a reference-laboratory evaluation of urine specimens from across the United States, 213 urine specimens with provided prescription medication information (302 prescriptions) were analyzed by two testing algorithms: 1) a conventional immunoassay screen with subsequent reflexive testing of positive results by quantitative mass spectrometry; and 2) a combined immunoassay/qualitative mass-spectrometry screen that substantially reduced the need for subsequent testing. The qualitative screen was superior to immunoassay with reflex to mass spectrometry in confirming compliance per prescription (226/302 vs 205/302) and in identifying non-prescription abuse (97 vs 71). Pharmaceutical impurities and inconsistent drug metabolite patterns were detected in only 3.8% of specimens, suggesting that quantitative results have limited benefit. The percentage cost difference between the conventional testing algorithm and the alternative screen was projected to be 55%, and a 2-year evaluation of test-order volume shows an exponential trend toward alternative-screen orders over conventional immunoassay screens that require subsequent confirmation testing. Alternative, qualitative urine drug screens provide a less expensive, faster, and more comprehensive evaluation of patient medication compliance and drug abuse. The vast majority of results were interpretable from qualitative results alone, indicating a reduced need to reflex automatically to quantitation or to provide quantitation for the majority of patients. This strategy highlights a successful approach for both the laboratory and the physician to align clinical needs while being mindful of costs.

  16. Blinded Validation of Breath Biomarkers of Lung Cancer, a Potential Ancillary to Chest CT Screening

    PubMed Central

    Phillips, Michael; Bauer, Thomas L.; Cataneo, Renee N.; Lebauer, Cassie; Mundada, Mayur; Pass, Harvey I.; Ramakrishna, Naren; Rom, William N.; Vallières, Eric

    2015-01-01

    Background Breath volatile organic compounds (VOCs) have been reported as biomarkers of lung cancer, but it is not known if biomarkers identified in one group can identify disease in a separate independent cohort. Also, it is not known if combining breath biomarkers with chest CT has the potential to improve the sensitivity and specificity of lung cancer screening. Methods Model-building phase (unblinded): Breath VOCs were analyzed with gas chromatography mass spectrometry in 82 asymptomatic smokers having screening chest CT, 84 symptomatic high-risk subjects with a tissue diagnosis, 100 without a tissue diagnosis, and 35 healthy subjects. Multiple Monte Carlo simulations identified breath VOC mass ions with greater than random diagnostic accuracy for lung cancer, and these were combined in a multivariate predictive algorithm. Model-testing phase (blinded validation): We analyzed breath VOCs in an independent cohort of similar subjects (n = 70, 51, 75 and 19 respectively). The algorithm predicted discriminant function (DF) values in blinded replicate breath VOC samples analyzed independently at two laboratories (A and B). Outcome modeling: We modeled the expected effects of combining breath biomarkers with chest CT on the sensitivity and specificity of lung cancer screening. Results Unblinded model-building phase. The algorithm identified lung cancer with sensitivity 74.0%, specificity 70.7% and C-statistic 0.78. Blinded model-testing phase: The algorithm identified lung cancer at Laboratory A with sensitivity 68.0%, specificity 68.4%, C-statistic 0.71; and at Laboratory B with sensitivity 70.1%, specificity 68.0%, C-statistic 0.70, with linear correlation between replicates (r = 0.88). In a projected outcome model, breath biomarkers increased the sensitivity, specificity, and positive and negative predictive values of chest CT for lung cancer when the tests were combined in series or parallel. Conclusions Breath VOC mass ion biomarkers identified lung cancer in a separate independent cohort, in a blinded replicated study. Combining breath biomarkers with chest CT could potentially improve the sensitivity and specificity of lung cancer screening. Trial Registration ClinicalTrials.gov NCT00639067 PMID:26698306
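
    The series/parallel combination modeled in the outcome phase follows the standard independence formulas; the breath values below are taken from the blinded validation above, while the chest-CT numbers are hypothetical placeholders for illustration only:

      def series(se1, sp1, se2, sp2):
          # positive only when both tests are positive (independence assumed)
          return se1*se2, sp1 + sp2 - sp1*sp2

      def parallel(se1, sp1, se2, sp2):
          # positive when either test is positive (independence assumed)
          return se1 + se2 - se1*se2, sp1*sp2

      breath = (0.70, 0.68)      # (sensitivity, specificity) from the blinded phase
      ct = (0.90, 0.75)          # hypothetical chest-CT values, illustration only
      print("series (sens, spec):", series(*breath, *ct))
      print("parallel (sens, spec):", parallel(*breath, *ct))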

  17. Imaging tilted transversely isotropic media with a generalised screen propagator

    NASA Astrophysics Data System (ADS)

    Shin, Sung-Il; Byun, Joongmoo; Seol, Soon Jee

    2015-01-01

    One-way wave-equation migration is computationally efficient compared with reverse-time migration, and it provides a better subsurface image than ray-based migration algorithms when imaging complex structures. Among the many one-way wave-based migration algorithms, we adopted the generalised screen propagator (GSP) to build our migration algorithm. When the wavefield propagates through large lateral velocity variations or steeply dipping structures, GSP increases the wide-angle accuracy of the wavefield by including higher-order terms obtained from a Taylor-series expansion of the vertical slowness in the perturbation terms. To apply the migration algorithm to more realistic geological structures, we considered tilted transversely isotropic (TTI) media. The new GSP, which contains the tilting angle of the symmetry axis of the anisotropic media, was derived by modifying the GSP designed for vertical transversely isotropic (VTI) media. To verify the developed TTI-GSP, we analysed the accuracy of wave propagation, particularly with respect to the new perturbation parameters and the tilting angle; the results clearly showed that the tilting-angle perturbation term in TTI media has a considerable effect on proper propagation. In addition, through numerical tests, we demonstrated that the developed TTI-GSP migration algorithm can successfully image a steeply dipping salt flank with high velocity variation around anisotropic layers.

  18. Cell-veto Monte Carlo algorithm for long-range systems.

    PubMed

    Kapfer, Sebastian C; Krauth, Werner

    2016-09-01

    We present a rigorous, efficient event-chain Monte Carlo algorithm for long-range interacting particle systems. Using a cell-veto scheme within the factorized Metropolis algorithm, we compute each single-particle move with a fixed number of operations. For slowly decaying potentials such as Coulomb interactions, screening line charges allow us to take periodic boundary conditions into account. We discuss the performance of the cell-veto Monte Carlo algorithm for general inverse-power-law potentials and illustrate how it provides a new outlook on one of the prominent bottlenecks in large-scale atomistic Monte Carlo simulations.

  19. CRISPR-FOCUS: A web server for designing focused CRISPR screening experiments.

    PubMed

    Cao, Qingyi; Ma, Jian; Chen, Chen-Hao; Xu, Han; Chen, Zhi; Li, Wei; Liu, X Shirley

    2017-01-01

    The recently developed CRISPR screen technology, based on the CRISPR/Cas9 genome editing system, enables genome-wide interrogation of gene functions in an efficient and cost-effective manner. Although many computational algorithms and web servers have been developed to design single-guide RNAs (sgRNAs) with high specificity and efficiency, algorithms specifically designed for conducting CRISPR screens are still lacking. Here we present CRISPR-FOCUS, a web-based platform to search and prioritize sgRNAs for CRISPR screen experiments. With official gene symbols or RefSeq IDs as the only mandatory input, CRISPR-FOCUS filters and prioritizes sgRNAs based on multiple criteria, including efficiency, specificity, sequence conservation, isoform structure, as well as genomic variations including Single Nucleotide Polymorphisms and cancer somatic mutations. CRISPR-FOCUS also provides pre-defined positive and negative control sgRNAs, as well as other necessary sequences in the construct (e.g., U6 promoters to drive sgRNA transcription and RNA scaffolds of the CRISPR/Cas9). These features allow users to synthesize oligonucleotides directly based on the output of CRISPR-FOCUS. Overall, CRISPR-FOCUS provides a rational and high-throughput approach for sgRNA library design that enables users to efficiently conduct a focused screen experiment targeting up to thousands of genes. (CRISPR-FOCUS is freely available at http://cistrome.org/crispr-focus/).

  20. Online Mapping and Perception Algorithms for Multi-robot Teams Operating in Urban Environments

    DTIC Science & Technology

    2015-01-01

    ...each method on a 2.53 GHz Intel i5 laptop. All our algorithms are hand-optimized, implemented in Java and single-threaded. To determine which algorithm... approach would be to label all the pixels in the image with an x, y, z point. However, the angular resolution of the camera is finer than that of the... edge criterion. That is, each edge is either present or absent. In [42], edge existence is further screened by a fixed threshold for angular...

  1. Screening for Autism Spectrum Disorders in Children With Down Syndrome

    PubMed Central

    DiGuiseppi, Carolyn; Hepburn, Susan; Davis, Jonathan M.; Fidler, Deborah J.; Hartway, Sara; Lee, Nancy Raitano; Miller, Lisa; Ruttenber, Margaret; Robinson, Cordelia

    2015-01-01

    Objective We assessed the prevalence of autism spectrum disorders (ASD) and screening test characteristics in children with Down syndrome. Method Eligible children born in a defined geographic area between January 1, 1996, and December 31, 2003, were recruited through a population-based birth defects registry and community outreach, then screened with the Modified Checklist for Autism in Toddlers or the Social Communication Questionnaire, as appropriate. Screen-positive children and a random sample of screen-negative children underwent developmental evaluation. Results We screened 123 children (27.8% of the birth cohort). Mean age was 73.4 months (range, 31–142). Compared to screen-negative children, screen-positive children had similar sociodemographic characteristics but a lower mean developmental quotient (mean difference: 11.0; 95% confidence interval: 4.8–17.3). Weighted prevalences of autistic disorder and total ASD were 6.4% (95% confidence interval [CI]: 2.6%–11.6%) and 18.2% (95% CI: 9.7%–26.8%), respectively. The estimated minimum ASD prevalence, accounting for unscreened children, is 5.1% (95% CI: 3.3%–7.4%). ASD prevalence increased with greater cognitive impairment. Screening test sensitivity was 87.5% (95% CI: 66.6%–97.7%); specificity was 49.9% (95% CI: 37.0%–61.4%). Conclusion The prevalence of ASD among children with Down syndrome aged 2 to 11 years is substantially higher than in the general population. The Modified Checklist for Autism in Toddlers and the Social Communication Questionnaire were highly sensitive in children with Down syndrome but could result in many false positive tests if universal screening were implemented using current algorithms. Research needs include development of specific ASD screening algorithms and improved diagnostic discrimination in children with Down syndrome. Timely identification of these co-occurring diagnoses is essential so appropriate interventions can be provided. PMID:20375732

  2. A nurse-facilitated depression screening program in an Army primary care clinic: an evidence-based project.

    PubMed

    Yackel, Edward E; McKennan, Madelyn S; Fox-Deise, Adrianna

    2010-01-01

    Depression, sometimes with suicidal manifestations, is a medical condition commonly seen in primary care clinics. Routine screening for depression and suicidal ideation is recommended for all adult patients in the primary care setting because it offers depressed patients a greater chance of recovery and response to treatment, yet such screening is often overlooked or omitted. The purpose of this study was to develop, implement, and test the efficacy of a systematic depression screening process to increase the identification of depression in family members of active duty soldiers older than 18 years at a military family practice clinic located on an Army infantry post in the Pacific. The Iowa Model of Evidence-Based Practice to Promote Quality Care was used to develop a practice guideline incorporating a decision algorithm for nurses to screen for depression. A pilot project to institute this change in practice was conducted, and outcomes were measured. Before implementation, approximately 100 patients were diagnosed with depression in each of the 3 months preceding the practice change. Approximately 130 patients per month were assigned International Classification of Diseases, Ninth Revision (ICD-9) code 311.0 in the 3 months after the practice change, and 140 patients per month were screened and assigned the correct ICD-9 code 311.0 at 1 year. The improved screening and coding for depression and suicidality added approximately 3 minutes to the patient screening process. Educating staff in the screening and coding process, coupled with monitoring and feedback, improved compliance with the identification and documentation of patients with depression. Nurses were more likely than primary care providers to agree strongly that screening for depression enhances quality of care. Data gathered during this project support the integration of military and civilian nurse-facilitated screening for depression in the military primary care setting. The decision algorithm should be adapted and tested in other primary care environments.

  3. DR HAGIS-a fundus image database for the automatic extraction of retinal surface vessels from diabetic patients.

    PubMed

    Holm, Sven; Russell, Greg; Nourrit, Vincent; McLoughlin, Niall

    2017-01-01

    A database of retinal fundus images, the DR HAGIS database, is presented. This database consists of 39 high-resolution color fundus images obtained from a diabetic retinopathy screening program in the UK. The NHS screening program uses service providers that employ different fundus and digital cameras, which results in a range of image sizes and resolutions. Furthermore, patients enrolled in such programs often display other comorbidities in addition to diabetes. Therefore, in an effort to replicate the normal range of images examined by grading experts during screening, the DR HAGIS database consists of images of varying sizes and resolutions and four comorbidity subgroups, collectively defined as the Diabetic Retinopathy, Hypertension, Age-related macular degeneration, and Glaucoma Image Set (DR HAGIS). For each image, the vasculature has been manually segmented to provide a realistic set of images on which to test automatic vessel-extraction algorithms. Modified versions of two previously published vessel-extraction algorithms were applied to this database to provide baseline measurements. A method based purely on the intensity of image pixels resulted in a mean segmentation accuracy of 95.83%, whereas an algorithm based on Gabor filters generated an accuracy of 95.71%.

  4. Application of Fragment Ion Information as Further Evidence in Probabilistic Compound Screening Using Bayesian Statistics and Machine Learning: A Leap Toward Automation.

    PubMed

    Woldegebriel, Michael; Zomer, Paul; Mol, Hans G J; Vivó-Truyols, Gabriel

    2016-08-02

    In this work, we introduce an automated, efficient, and elegant model to combine all pieces of evidence (e.g., expected retention times, peak shapes, isotope distributions, fragment-to-parent ratio) obtained from liquid chromatography-tandem mass spectrometry (LC-MS/MS) data for screening purposes. Combining all these pieces of evidence requires a careful assessment of the uncertainties in the analytical system as well as of all possible outcomes. To date, the majority of existing algorithms are highly dependent on user input parameters, and the screening process is tackled as a deterministic problem. In this work we present a Bayesian framework for combining all these pieces of evidence. Contrary to conventional algorithms, the information is treated probabilistically, and a final probability assessment of the presence or absence of a compound feature is computed. Additionally, all necessary parameters except the chromatographic band broadening are learned from the data in a training and learning phase of the algorithm, avoiding the introduction of a large number of user-defined parameters. The proposed method was validated with a large data set and showed improved sensitivity and specificity in comparison to a threshold-based commercial software package.
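
    The backbone of any such probabilistic combination is Bayes' rule over independent pieces of evidence; the likelihood ratios below are invented for illustration, and the paper's actual likelihood models are richer:

      import numpy as np

      def posterior_presence(prior, likelihood_ratios):
          # each LR = P(evidence | compound present) / P(evidence | compound absent)
          log_odds = np.log(prior/(1.0 - prior)) + np.sum(np.log(likelihood_ratios))
          return 1.0/(1.0 + np.exp(-log_odds))

      # e.g. retention-time match, isotope pattern, fragment/parent ratio (made-up LRs)
      print(posterior_presence(0.05, [8.0, 3.5, 2.0]))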

  5. Toward comprehensive detection of sight threatening retinal disease using a multiscale AM-FM methodology

    NASA Astrophysics Data System (ADS)

    Agurto, C.; Barriga, S.; Murray, V.; Murillo, S.; Zamora, G.; Bauman, W.; Pattichis, M.; Soliz, P.

    2011-03-01

    In the United States and most of the western world, the leading causes of vision impairment and blindness are age-related macular degeneration (AMD), diabetic retinopathy (DR), and glaucoma. In the last decade, research in automatic detection of retinal lesions associated with eye diseases has produced several automatic systems for detection and screening of AMD, DR, and glaucoma. However, advanced, sight-threatening stages of DR and AMD can present with lesions not commonly addressed by current approaches to automatic screening. In this paper we present an automatic eye screening system based on multiscale Amplitude-Modulation Frequency-Modulation (AM-FM) decompositions that addresses not only the early stages but also advanced stages of retinal and optic nerve disease. Ten different experiments were performed in which abnormal features such as neovascularization, drusen, exudates, pigmentation abnormalities, geographic atrophy (GA), and glaucoma were classified. The algorithm achieved detection accuracies ranging from 0.77 to 0.98 (area under the ROC curve) on a set of 810 images. When set to a specificity of 0.60, the sensitivity of the algorithm for detecting abnormal features ranged between 0.88 and 1.00. Our system demonstrates that, given an appropriate training set, it is possible to use a single algorithm to detect a broad range of eye diseases.

  6. Virtual screening by a new Clustering-based Weighted Similarity Extreme Learning Machine approach

    PubMed Central

    Kudisthalert, Wasu

    2018-01-01

    Machine learning techniques are becoming popular in virtual screening tasks. One powerful machine learning algorithm is the Extreme Learning Machine (ELM), which has been applied to many applications and has recently been applied to virtual screening. We propose the Weighted Similarity ELM (WS-ELM), based on a single-hidden-layer feed-forward neural network in conjunction with 16 different similarity coefficients as activation functions in the hidden layer. It is known that the performance of a conventional ELM is not robust, owing to random weight selection in the hidden layer. We therefore propose a Clustering-based WS-ELM (CWS-ELM) that deterministically assigns weights by utilising clustering algorithms, i.e., k-means clustering and support vector clustering. The experiments were conducted on one of the most challenging datasets, the Maximum Unbiased Validation dataset, which contains 17 activity classes carefully selected from PubChem. The proposed algorithms were compared with other machine learning techniques such as support vector machines, random forests, and similarity searching. The results show that CWS-ELM in conjunction with support vector clustering yields the best performance when used with the Sokal/Sneath(1) coefficient. Furthermore, the ECFP_6 fingerprint gives the best results in our framework compared with the other fingerprint types, namely ECFP_4, FCFP_4, and FCFP_6. PMID:29652912
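
    A minimal sketch of the idea, with k-means centroids as deterministic hidden-layer "weights", the continuous Tanimoto coefficient standing in for the paper's 16 similarity activations, and least-squares output weights as in a standard ELM (the fingerprints are toy data):

      import numpy as np
      from sklearn.cluster import KMeans

      def tanimoto(a, B):
          # continuous Tanimoto similarity between one fingerprint and prototype rows
          num = (a*B).sum(1)
          return num/(a.sum() + B.sum(1) - num + 1e-9)

      rng = np.random.default_rng(0)
      X = (rng.random((200, 64)) < 0.2).astype(float)    # toy binary fingerprints
      y = rng.integers(0, 2, 200).astype(float)          # toy activity labels

      protos = KMeans(n_clusters=20, n_init=10, random_state=0).fit(X).cluster_centers_
      H = np.array([tanimoto(x, protos) for x in X])     # similarity "activations"
      beta = np.linalg.pinv(H) @ y                       # ELM output weights, least squares
      print("training accuracy:", (((H @ beta) > 0.5) == (y > 0.5)).mean())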

  7. Satellite Remote Sensing of Snow/Ice Albedo over the Himalayas

    NASA Technical Reports Server (NTRS)

    Hsu, N. Christina; Gautam, Ritesh

    2012-01-01

    The Himalayan glaciers and snowpacks play an important role in the hydrological cycle over Asia; seasonal snowmelt from them is one of the key elements in the livelihood of the densely populated downstream regions of South Asia. During the pre-monsoon season (April-May-June), South Asia not only experiences the reversal of the regional meridional tropospheric temperature gradient (i.e., the onset of the summer monsoon) but is also subjected to dry westerly airmasses that transport mineral dust from Southwest Asian desert and arid regions into the Indo-Gangetic Plains in northern India. Mixed with heavy anthropogenic pollution, mineral dust constitutes the bulk of the regional aerosol loading and forms an extensive, vertically extended brown haze lapping against the southern slopes of the Himalayas. Episodic dust plumes are advected over the Himalayas and are discernible in satellite imagery as dust-capped snow surfaces. Motivated by the potential implications of accelerated snowmelt, we examine the changes in radiative energetics induced by aerosol transport over the Himalayan snow cover using spaceborne observations. Our objective is to investigate the potential impacts of aerosol solar absorption on the top-of-atmosphere (TOA) spectral reflectivity and broadband albedo, and hence on accelerated snowmelt, particularly in the western Himalayas. Lambertian Equivalent Reflectivity (LER) in the visible and near-infrared wavelengths, derived from Moderate Resolution Imaging Spectroradiometer radiances, is used to generate statistics for determining the perturbation caused by dust layers over snow in more than ten years of continuous observations. Case studies indicate significant reductions in LER, ranging from 5 to 8% across the 412-860 nm spectrum. Broadband flux observations from the Clouds and the Earth's Radiant Energy System are also used to investigate changes in shortwave TOA flux over dust-laden and dust-free snow-covered regions. Additionally, spatio-temporal and intra-seasonal variations of LER, along with snow cover information, are used to characterize the seasonal melt pattern and thus to distinguish the aerosol-induced snowmelt signal. Results from this observational work are expected to provide a better understanding of the radiative impact of aerosols over snow, especially their role in Himalayan hydro-glaciological variability.

  8. A blood-based screening tool for Alzheimer's disease that spans serum and plasma: findings from TARC and ADNI.

    PubMed

    O'Bryant, Sid E; Xiao, Guanghua; Barber, Robert; Huebinger, Ryan; Wilhelmsen, Kirk; Edwards, Melissa; Graff-Radford, Neill; Doody, Rachelle; Diaz-Arrastia, Ramon

    2011-01-01

    There is no rapid and cost-effective tool that can be implemented as a front-line screening tool for Alzheimer's disease (AD) at the population level. Our aim was to generate and cross-validate a blood-based screener for AD that yields acceptable accuracy across both serum and plasma. Analyses of serum biomarker proteins were conducted on 197 AD participants and 199 control participants from the Texas Alzheimer's Research Consortium (TARC), with further analysis of plasma proteins from 112 AD and 52 control participants from the Alzheimer's Disease Neuroimaging Initiative (ADNI). The full algorithm was derived from a biomarker risk score, clinical laboratory data (glucose, triglycerides, total cholesterol, homocysteine), and demographic data (age, gender, education, APOE*E4 status); the outcome was Alzheimer's disease. Eleven proteins met our criteria and were utilized for the biomarker risk score. The random forest (RF) biomarker risk score from the TARC serum samples (training set) yielded adequate accuracy in the ADNI plasma sample (validation set) (AUC = 0.70, sensitivity (SN) = 0.54 and specificity (SP) = 0.78), which was below that obtained from ADNI cerebrospinal fluid (CSF) analyses (t-tau/Aβ ratio AUC = 0.92). However, the full algorithm yielded excellent accuracy (AUC = 0.88, SN = 0.75, and SP = 0.91). In the ADNI cohort, the likelihood ratio of having AD given a positive test finding (LR+) was 7.03 (SE = 1.17; 95% CI = 4.49-14.47), the likelihood ratio of not having AD based on the algorithm (LR-) was 3.55 (SE = 1.15; 2.22-5.71), and the odds ratio of AD (OR) was 28.70 (1.55; 95% CI = 11.86-69.47). It is thus possible to create a blood-based screening algorithm that works across both serum and plasma and provides screening accuracy comparable to that obtained from CSF analyses.

  9. Dangerous Thresholds. Managing Escalation in the 21st Century

    DTIC Science & Technology

    2008-01-01

    Escalation in the 21st Century, by Forrest E. Morgan, Karl P. Mueller, Evan S. Medeiros, Kevin L. Pollpeter, and Roger Cliff. Dangerous Thresholds: The RAND... impacts of U.S. policy in the current security environment: War and Escalation in South Asia, by John E. Peters, James Dickens, Derek Eaton, C... Striking First: Preemptive and Preventive Attack in U.S. National Security Policy, by Karl P. Mueller, Jasen J. Castillo, Forrest E. Morgan, Negeen...

  10. A Comparison Between the PLM and the MC68020 as Prolog Processors

    DTIC Science & Technology

    1988-01-01

    ...Continuation Pointer (CP)... Memory X6_offset(MP): Argument Register 6 (A6); Memory X7_offset(MP): Argument Register 7 (A7); Memory X6_offset(MP): Temporary Register 6... get_variable_Y: input: permanent variable Yi and argument register Xj; function: move the content of Xj into Yi. get_variable_Y: move.l Xj, Yi...

  11. Turkish - American Relations Post 9/11

    DTIC Science & Technology

    2007-12-01

    "Döneminde Türk-Amerikan İlişkileri," Atatürk Araştırma Merkezi Dergisi 38 (1997): 1. ... Yavuz Güler, "The Relationship between Turkey and USA in the..." New Haven, Yale University, December 2000, pp. 315-17. ... Ibid., p. 237. ... Mustafa Kayar, Türk Amerikan İlişkilerinde Irak Sorunu (Istanbul: IQ Kültür Sanat Yayıncılık, 2003), 106. ... General Harbord visited Mustafa Kemal Atatürk in Sivas in 1919 to fulfill the requirements for conducting a...

  12. Technology Assessment of the DACS/MERADCOM Prestaged Ammunition Loading System (PALS) Concept Study.

    DTIC Science & Technology

    1980-08-01

    ...to meet Association of American Railroads (AAR) and Coast Guard (CG) regulations for the shipment of ammunition. The system must be compatible with... (Arthur D. Little, Inc.) ...unloaded under field conditions? (4) Can the PALS meet AAR and CG regulations for the safe shipment of ammunition in commercial cargo containers? (5...

  13. Transcriptomic and Physiological Variations of Three Arabidopsis Ecotypes in Response to Salt Stress

    PubMed Central

    Wang, Yanping; Yang, Li; Zheng, Zhimin; Grumet, Rebecca; Loescher, Wayne; Zhu, Jian-Kang; Yang, Pingfang; Hu, Yuanlei; Chan, Zhulong

    2013-01-01

    Salt stress is one of the major abiotic stresses in agriculture worldwide. Analysis of natural genetic variation in Arabidopsis is an effective approach to characterize candidate salt responsive genes. Differences in salt tolerance of three Arabidopsis ecotypes were compared in this study based on their responses to salt treatments at two developmental stages: seed germination and later growth. The Sha ecotype had higher germination rates, longer roots and less accumulation of superoxide radical and hydrogen peroxide than the Ler and Col ecotypes after short term salt treatment. With long term salt treatment, Sha exhibited higher survival rates and lower electrolyte leakage. Transcriptome analysis revealed that many genes involved in cell wall, photosynthesis, and redox were mainly down-regulated by salinity effects, while transposable element genes, microRNA and biotic stress related genes were significantly changed in comparisons of Sha vs. Ler and Sha vs. Col. Several pathways involved in tricarboxylic acid cycle, hormone metabolism and development, and the Gene Ontology terms involved in response to stress and defense response were enriched after salt treatment, and between Sha and other two ecotypes. Collectively, these results suggest that the Sha ecotype is preconditioned to withstand abiotic stress. Further studies about detailed gene function are needed. These comparative transcriptomic and analytical results also provide insight into the complexity of salt stress tolerance mechanisms. PMID:23894403

  14. Line-width roughness of advanced semiconductor features by using FIB and planar-TEM as reference metrology

    NASA Astrophysics Data System (ADS)

    Takamasu, Kiyoshi; Takahashi, Satoru; Kawada, Hiroki; Ikota, Masami

    2018-03-01

    LER (line edge roughness) and LWR (line width roughness) of semiconductor devices are important measures of device performance. Conventionally, LER and LWR are evaluated from CD-SEM (critical dimension scanning electron microscope) images. However, CD-SEM measurement suffers from large high-frequency random noise, and its resolution is not sufficiently high. Several techniques have been proposed to handle the random noise of CD-SEM measurement; these methods require model and processing parameters to be set, and the correctness of those parameters must be verified against a reference metrology. We have previously proposed a novel reference metrology using an FIB (focused ion beam) process and a planar-TEM (transmission electron microscope) method. In this study, we applied the proposed method to three new samples: an SAQP (self-aligned quadruple patterning) FinFET device, a conventional EUV (extreme ultraviolet lithography) resist, and a new-material EUV resist. LWR and the PSD (power spectral density) of LWR are calculated from the edge positions in the planar-TEM images. We confirmed that LWR and the PSD of LWR can be measured with high accuracy and that differences between samples can be evaluated with the proposed method. Furthermore, comparisons with the PSD of the same sample measured by CD-SEM verify the validity of PSD and LWR measurement by CD-SEM.
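
    Computing LWR and its PSD from a sampled width profile is straightforward; the sketch below uses a plain periodogram and synthetic widths (real workflows add windowing, averaging over many lines, and noise-floor subtraction):

      import numpy as np

      def lwr_psd(widths, dy):
          # one-sided PSD of width fluctuations sampled every dy (nm along the line)
          w = widths - widths.mean()
          n = w.size
          psd = (np.abs(np.fft.rfft(w))**2)*dy/n           # periodogram scaling, nm^3
          freq = np.fft.rfftfreq(n, d=dy)                  # spatial frequency, 1/nm
          return freq, psd

      rng = np.random.default_rng(0)
      widths = 20.0 + rng.normal(0.0, 0.8, 512)            # toy edge-to-edge widths, nm
      freq, psd = lwr_psd(widths, dy=2.0)
      print("3-sigma LWR:", 3.0*widths.std(), "nm")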

  15. Performance evaluation of nonchemically amplified negative tone photoresists for e-beam and EUV lithography

    NASA Astrophysics Data System (ADS)

    Singh, Vikram; Satyanarayana, Vardhineedi Sri Venkata; Batina, Nikola; Reyes, Israel Morales; Sharma, Satinder K.; Kessler, Felipe; Scheffer, Francine R.; Weibel, Daniel E.; Ghosh, Subrata; Gonsalves, Kenneth E.

    2014-10-01

    Although extreme ultraviolet (EUV) lithography is considered one of the most promising next-generation lithography techniques for patterning sub-20 nm features, the development of suitable EUV resists remains one of the main challenges confronting the semiconductor industry. The goal is to achieve sub-20 nm line patterns with low line edge roughness (LER) of <1.8 nm and a sensitivity of 5 to 20 mJ/cm2. The present work demonstrates the lithographic performance of two nonchemically amplified (n-CAR) negative photoresists, MAPDST homopolymer and MAPDST-MMA copolymer, prepared from suitable monomers containing the radiation-sensitive sulfonium functionality. Investigations into the effect of several process parameters are reported, including spinning conditions to obtain film thicknesses <50 nm, baking regimes, exposure conditions, and the resulting surface topographies. The effect of these protocols on sensitivity, contrast, and resolution has been assessed for the optimization of 20 nm features and the corresponding LER/line width roughness (LWR). These n-CARs have also been found to possess high etch resistance: the etch durability of the MAPDST homopolymer and MAPDST-MMA copolymer (under SF6 plasma chemistry) with respect to the silicon substrate is 7.2:1 and 8.3:1, respectively. This methodical investigation will provide guidance in designing new resist materials with improved efficiency for EUVL through polymer microstructure engineering.

  16. Performance-based alternative assessments as a means of eliminating gender achievement differences on science tests

    NASA Astrophysics Data System (ADS)

    Brown, Norman Merrill

    1998-09-01

    Historically, researchers have reported an achievement difference between females and males on standardized science tests. These differences have been reported to be based upon science knowledge, abstract reasoning skills, mathematical abilities, and cultural and social phenomena. This research was designed to determine how mastery of specific science content from public school curricula might be evaluated with performance-based assessment models without producing gender achievement differences. The assessment instruments used were Harcourt Brace Educational Measurement's GOALS: A Performance-Based Measure of Achievement and the performance-based portion of the Stanford Achievement Test, Ninth Edition (SAT9). The identified independent variables were test, gender, ethnicity, and grade level. A 2 x 2 x 6 x 12 (test x gender x ethnicity x grade) factorial experimental design was used to organize the data. A stratified random sample (N = 2400) was selected from a national pool of norming data: N = 1200 from the GOALS group and N = 1200 from the SAT9 group. The ANOVA analysis yielded mixed results. The factors of test, gender, ethnicity by grade, gender by grade, and gender by grade by ethnicity failed to produce significant results (alpha = 0.05). The factors yielding significant results were ethnicity, grade, and ethnicity by grade. Therefore, no significant differences were found between female and male achievement on these performance-based assessments.

  17. The Abort Kicker System for the PEP-II Storage Rings at SLAC.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delamare, Jeffrey E

    2003-06-20

    The PEP-II project has two storage rings. The HER (High Energy Ring) carries up to 1.48 A of electron beam at 9 GeV, and the LER (Low Energy Ring) carries up to 2.14 A of positron beam at 3.1 GeV. To protect the HER and LER beam lines in the event of a ring component failure, each ring has an abort kicker system which directs the beam into a dump when a failure is detected. Because of the high beam currents, the kick is tapered from 100% to 80% over 7.33 µs (the beam transit time around the ring). This taper distributes the energy evenly across the window separating the ring from the beam dump so that the window is not damaged. The abort kicker trigger is synchronized with the ion-clearing gap of the beam, allowing the kicker field to rise from 0 to 80% while there is no beam in the kicker magnet. The kicker system was originally designed for a rise time of 370 ns [1], but because the ion-clearing gap was cut in half, so was the rise-time requirement for the kicker. This report discusses the design of the system interlocks, diagnostics, and modulator, with the modifications necessary to accommodate an ion-clearing gap of 185 ns.

  18. Distant sequences determine 5′ end formation of cox3 transcripts in Arabidopsis thaliana ecotype C24

    PubMed Central

    Forner, Joachim; Weber, Bärbel; Wiethölter, Caterina; Meyer, Rhonda C.; Binder, Stefan

    2005-01-01

    The genomic environments and the transcripts of the mitochondrial cox3 gene are investigated in three Arabidopsis thaliana ecotypes. While the proximate 5′ sequences up to nucleotide position −584, the coding regions and the 3′ flanking regions are identical in Columbia (Col), C24 and Landsberg erecta (Ler), genomic variation is detected in regions further upstream. In the mitochondrial DNA of Col, a 1790 bp fragment flanked by a nonanucleotide direct repeat is present beyond position −584 with respect to the ATG. While in Ler only part of this insertion is conserved, this sequence is completely absent in C24, except for a single copy of the nonanucleotide direct repeat. Northern hybridization reveals identical major transcripts in the three ecotypes, but identifies an additional abundant 60 nt larger mRNA species in C24. The extremities of the most abundant mRNA species are identical in the three ecotypes. In C24, an extra major 5′ end is abundant. This terminus and the other major 5′ ends are located in identical sequence regions. Inspection of Atcox3 transcripts in C24/Col hybrids revealed maternal inheritance of the mRNA species with the extra 5′ terminus. Thus, a mitochondrially encoded factor determines the generation of an extra 5′ mRNA end. PMID:16107557

  19. 4th generation HIV screening in Massachusetts: a partnership between laboratory and program.

    PubMed

    Goodhue, Tammy; Kazianis, Arthur; Werner, Barbara G; Stiles, Tracy; Callis, Barry P; Dawn Fukuda, H; Cranston, Kevin

    2013-12-01

    The Massachusetts Department of Public Health's (MDPH) Office of HIV/AIDS (OHA) and Hinton State Laboratory Institute (HSLI) have offered HIV screening since 1985. Point-of-care screening and serum collection for laboratory-based testing are conducted at clinic and non-clinic-based sites across Massachusetts as part of an integrated communicable disease screening intervention. MDPH aimed to transition to a 4th generation assay-based HIV screening algorithm for testing all serum specimens collected at OHA-funded programs and submitted to the HSLI, in order to detect acute HIV infections, detect and differentiate HIV-1 and HIV-2 infections, eliminate indeterminate results, reduce cost and turnaround time, and link newly diagnosed HIV+ individuals to care. The HSLI and OHA created a joint project management team to plan and lead the transition. The laboratory transitioned successfully to a 4th generation screening assay as part of a revised diagnostic algorithm. In the 12 months since implementation, a total of 7984 serum specimens were tested, with 258 (3.2%) positive for HIV-1 and one positive for HIV-2. Eight were reported as acute HIV-1 infections. These individuals were linked to medical care and partner services in a timely manner. Turnaround time was reduced, and the laboratory realized an overall cost savings of approximately 15%. The identification of eight acute HIV infections in the first year underscores the importance of using the most sensitive screening tests available. A multi-disciplinary program and laboratory team was critical to the success of the transition, and the lessons learned may be useful for other jurisdictions. Published by Elsevier B.V.

  20. Optimizing urine drug testing for monitoring medication compliance in pain management.

    PubMed

    Melanson, Stacy E F; Ptolemy, Adam S; Wasan, Ajay D

    2013-12-01

    It can be challenging to successfully monitor medication compliance in pain management. Clinicians and laboratorians need to collaborate to optimize patient care and maximize operational efficiency. The test menu, assay cutoffs, and testing algorithms utilized in urine drug testing panels should be periodically reviewed and tailored to the patient population to effectively assess compliance and avoid unnecessary testing and cost to the patient. Pain management and pathology collaborated on an important quality improvement initiative to optimize urine drug testing for monitoring medication compliance in pain management. We retrospectively reviewed 18 months of data from our pain management center, gathering data on test volumes, positivity rates, and the frequency of false positive results. We also reviewed the clinical utility of our testing algorithms, assay cutoffs, and adulterant panel, and calculated the cost of each component. The positivity rates for ethanol and 3,4-methylenedioxymethamphetamine were <1%, so we eliminated this testing from our panel. We also lowered the screening cutoff for cocaine to meet the clinical needs of the pain management center. In addition, we changed our testing algorithm for 6-acetylmorphine, benzodiazepines, and methadone. For example, due to the high rate of false negative results using our immunoassay-based benzodiazepine screen, we removed the screening portion of the algorithm and now perform benzodiazepine confirmation up front in all specimens by liquid chromatography-tandem mass spectrometry. Conducting an interdisciplinary quality improvement project allowed us to optimize our testing panel for monitoring medication compliance in pain management and to reduce cost. Wiley Periodicals, Inc.

  1. High content analysis of phagocytic activity and cell morphology with PuntoMorph.

    PubMed

    Al-Ali, Hassan; Gao, Han; Dalby-Hansen, Camilla; Peters, Vanessa Ann; Shi, Yan; Brambilla, Roberta

    2017-11-01

    Phagocytosis is essential for maintenance of normal homeostasis and healthy tissue. As such, it is a therapeutic target for a wide range of clinical applications. The development of phenotypic screens targeting phagocytosis has lagged behind, however, due to the difficulties associated with image-based quantification of phagocytic activity. We present a robust algorithm and cell-based assay system for high content analysis of phagocytic activity. The method utilizes fluorescently labeled beads as a phagocytic substrate with defined physical properties. The algorithm employs statistical modeling to determine the mean fluorescence of individual beads within each image, and uses the information to conduct an accurate count of phagocytosed beads. In addition, the algorithm conducts detailed and sophisticated analysis of cellular morphology, making it a standalone tool for high content screening. We tested our assay system using microglial cultures. Our results recapitulated previous findings on the effects of microglial stimulation on cell morphology and phagocytic activity. Moreover, our cell-level analysis revealed that the two phenotypes associated with microglial activation, specifically cell body hypertrophy and increased phagocytic activity, are not highly correlated. This novel finding suggests the two phenotypes may be under the control of distinct signaling pathways. We demonstrate that our assay system outperforms preexisting methods for quantifying phagocytic activity in multiple dimensions including speed, accuracy, and resolution. We provide a framework to facilitate the development of high content assays suitable for drug screening. For convenience, we implemented our algorithm in a standalone software package, PuntoMorph. Copyright © 2017 Elsevier B.V. All rights reserved.
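    As a rough illustration of the bead-counting step described above (not the authors' implementation), the sketch below estimates a single-bead intensity from isolated fluorescent blobs and divides a cell's integrated fluorescence by it; the blob threshold and the use of the median in place of a fitted mode are simplifying assumptions.

```python
import numpy as np
from scipy import ndimage

def count_beads(image, cell_mask, background=0.0):
    """Estimate the number of phagocytosed beads inside a cell mask."""
    signal = np.clip(image - background, 0, None)
    # Label isolated fluorescent blobs to sample candidate single beads.
    blobs, n = ndimage.label(signal > signal.mean() + 2 * signal.std())
    sums = ndimage.sum(signal, blobs, index=range(1, n + 1))
    if len(sums) == 0:
        return 0
    single_bead = np.median(sums)  # robust stand-in for a fitted mode
    # Bead count = integrated cell fluorescence / single-bead intensity.
    return int(round(signal[cell_mask].sum() / single_bead))
```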

  2. First trimester PAPP-A in the detection of non-Down syndrome aneuploidy.

    PubMed

    Ochshorn, Y; Kupferminc, M J; Wolman, I; Orr-Urtreger, A; Jaffa, A J; Yaron, Y

    2001-07-01

    Combined first trimester screening using pregnancy associated plasma protein-A (PAPP-A), free beta-human chorionic gonadotrophin, and nuchal translucency (NT) is currently accepted as probably the best combination for the detection of Down syndrome (DS). Current first trimester algorithms provide computed risks only for DS. However, low PAPP-A is also associated with other chromosome anomalies such as trisomy 13, trisomy 18, and sex chromosome aneuploidy. Thus, using currently available algorithms, some chromosome anomalies may not be detected. The purpose of the present study was to establish a low-end cut-off value for PAPP-A that would increase the detection rate for non-DS chromosome anomalies. The study included 1408 patients who underwent combined first trimester screening. To determine a low-end cut-off value for PAPP-A, a receiver operating characteristic (ROC) curve analysis was performed. In the entire study group there were 18 cases of chromosome anomalies (trisomy 21, 13, 18, sex chromosome anomalies), 14 of which were among screen-positive patients, a detection rate of 77.7% for all chromosome anomalies (95% CI: 55.7-99.7%). ROC curve analysis detected a statistically significant cut-off for PAPP-A at 0.25 MoM. If the definition of screen-positive were to also include patients with PAPP-A <0.25 MoM, the detection rate would increase to 88.8% for all chromosome anomalies (95% CI: 71.6-106%). This low cut-off value may be used until specific algorithms are implemented for non-Down syndrome aneuploidy. Copyright 2001 John Wiley & Sons, Ltd.
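    The cut-off search described above can be reproduced in outline with a standard ROC routine; the sketch below uses hypothetical MoM values and Youden's J statistic to pick a threshold, whereas the study fixed its cut-off at 0.25 MoM from its own data.

```python
import numpy as np
from sklearn.metrics import roc_curve

# Hypothetical PAPP-A levels (multiples of the median) and outcomes:
# 1 = chromosome anomaly, 0 = unaffected pregnancy.
papp_a_mom = np.array([0.15, 0.22, 0.30, 0.85, 1.00, 1.20, 0.95, 0.18])
affected = np.array([1, 1, 1, 0, 0, 0, 0, 1])

# Low PAPP-A indicates risk, so score anomalies by the negated MoM value.
fpr, tpr, thresholds = roc_curve(affected, -papp_a_mom)

# Youden's J picks the threshold maximising sensitivity + specificity - 1.
cutoff = -thresholds[np.argmax(tpr - fpr)]
print(f"screen-positive if PAPP-A < {cutoff:.2f} MoM")
```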

  3. Automatic Lung-RADS™ classification with a natural language processing system.

    PubMed

    Beyer, Sebastian E; McKee, Brady J; Regis, Shawn M; McKee, Andrea B; Flacke, Sebastian; El Saadawi, Gilan; Wald, Christoph

    2017-09-01

    Our aim was to train a natural language processing (NLP) algorithm to capture imaging characteristics of lung nodules reported in a structured CT report and suggest the applicable Lung-RADS™ (LR) category. Our study included structured, clinical reports of consecutive CT lung screening (CTLS) exams performed from 08/2014 to 08/2015 at an ACR-accredited Lung Cancer Screening Center. All patients screened were at high risk for lung cancer according to the NCCN Guidelines®. All exams were interpreted with LR by one of three radiologists credentialed to read CTLS exams, using a standard reporting template. Training and test sets consisted of consecutive exams. Lung screening exams were divided into two groups: three training sets (500, 120, and 383 reports each) and one final evaluation set (498 reports). NLP algorithm results were compared with the gold standard of the LR category assigned by the radiologist. The sensitivity/specificity of the NLP algorithm in correctly assigning LR categories for suspicious nodules (LR 4) and positive nodules (LR 3/4) were 74.1%/98.6% and 75.0%/98.8%, respectively. The majority of mismatches occurred in cases where pulmonary findings not currently addressed by LR were present. Misclassifications also resulted from the failure to identify exams as follow-up and the failure to completely characterize part-solid nodules. In a sub-group analysis of structured reports with standardized language, the sensitivity and specificity to detect LR 4 nodules were 87.0% and 99.5%, respectively. An NLP system can accurately suggest the appropriate LR category from CTLS exam findings when standardized reporting is used.
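    For orientation, the rule layer that such an NLP system feeds might resemble the heavily simplified sketch below, which maps a solid nodule's diameter to a baseline Lung-RADS category. This is not the study's code: the real assessment also covers ground-glass and part-solid nodules, growth on follow-up, and exam-level modifiers.

```python
def suggest_lung_rads(nodule_type: str, diameter_mm: float) -> str:
    """Very simplified baseline Lung-RADS suggestion for a single nodule."""
    if nodule_type == "solid":
        if diameter_mm < 6:
            return "2"   # benign appearance or behavior
        if diameter_mm < 8:
            return "3"   # probably benign
        if diameter_mm < 15:
            return "4A"  # suspicious
        return "4B"      # very suspicious
    return "0"           # incomplete: cannot be categorized by this sketch
```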

  4. Automatic Lung-RADS™ classification with a natural language processing system

    PubMed Central

    Beyer, Sebastian E.; Regis, Shawn M.; McKee, Andrea B.; Flacke, Sebastian; El Saadawi, Gilan; Wald, Christoph

    2017-01-01

    Background Our aim was to train a natural language processing (NLP) algorithm to capture imaging characteristics of lung nodules reported in a structured CT report and suggest the applicable Lung-RADS™ (LR) category. Methods Our study included structured, clinical reports of consecutive CT lung screening (CTLS) exams performed from 08/2014 to 08/2015 at an ACR-accredited Lung Cancer Screening Center. All patients screened were at high risk for lung cancer according to the NCCN Guidelines®. All exams were interpreted with LR by one of three radiologists credentialed to read CTLS exams, using a standard reporting template. Training and test sets consisted of consecutive exams. Lung screening exams were divided into two groups: three training sets (500, 120, and 383 reports each) and one final evaluation set (498 reports). NLP algorithm results were compared with the gold standard of the LR category assigned by the radiologist. Results The sensitivity/specificity of the NLP algorithm in correctly assigning LR categories for suspicious nodules (LR 4) and positive nodules (LR 3/4) were 74.1%/98.6% and 75.0%/98.8%, respectively. The majority of mismatches occurred in cases where pulmonary findings not currently addressed by LR were present. Misclassifications also resulted from the failure to identify exams as follow-up and the failure to completely characterize part-solid nodules. In a sub-group analysis of structured reports with standardized language, the sensitivity and specificity to detect LR 4 nodules were 87.0% and 99.5%, respectively. Conclusions An NLP system can accurately suggest the appropriate LR category from CTLS exam findings when standardized reporting is used. PMID:29221286

  5. Operationalizing hippocampal volume as an enrichment biomarker for amnestic MCI trials: effect of algorithm, test-retest variability and cut-point on trial cost, duration and sample size

    PubMed Central

    Yu, P.; Sun, J.; Wolz, R.; Stephenson, D.; Brewer, J.; Fox, N.C.; Cole, P.E.; Jack, C.R.; Hill, D.L.G.; Schwarz, A.J.

    2014-01-01

    Objective To evaluate the effect of computational algorithm, measurement variability and cut-point on hippocampal volume (HCV)-based patient selection for clinical trials in mild cognitive impairment (MCI). Methods We used normal control and amnestic MCI subjects from ADNI-1 as normative reference and screening cohorts. We evaluated the enrichment performance of four widely-used hippocampal segmentation algorithms (FreeSurfer, HMAPS, LEAP and NeuroQuant) in terms of two-year changes in MMSE, ADAS-Cog and CDR-SB. We modeled the effect of algorithm, test-retest variability and cut-point on sample size, screen fail rates and trial cost and duration. Results HCV-based patient selection yielded not only reduced sample sizes (by ~40–60%) but also lower trial costs (by ~30–40%) across a wide range of cut-points. Overall, the dependence on the cut-point value was similar for the three clinical instruments considered. Conclusion These results provide a guide to the choice of HCV cut-point for aMCI clinical trials, allowing an informed trade-off between statistical and practical considerations. PMID:24211008

  6. Design and algorithm research of high precision airborne infrared touch screen

    NASA Astrophysics Data System (ADS)

    Zhang, Xiao-Bing; Wang, Shuang-Jie; Fu, Yan; Chen, Zhao-Quan

    2016-10-01

    Infrared touch screens suffer from low precision, touch jitter, and a sharp drop in precision when emitting or receiving tubes fail. A high-precision positioning algorithm based on an extended axis is proposed to solve these problems. First, the unimpeded state of the beam between an emitting and a receiving tube is recorded as 0, and the impeded state as 1. Then an oblique scanning method is used, in which the light from each emitting tube is received by five receiving tubes, and the impeded-state information for all emitting and receiving tubes is collected as a matrix. Finally, the position of the touch object is calculated as an arithmetic average. The extended-axis positioning algorithm maintains high precision even when individual infrared tubes fail, with only a slight loss of accuracy. Experimental results show that over 90% of the display area, the touch error is less than 0.25D, where D is the distance between adjacent emitting tubes. The algorithm based on the extended axis thus offers high precision, tolerates the failure of individual infrared tubes with little impact, and is easy to use.
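    The averaging step of the extended-axis idea can be sketched as follows (the geometry and data layout are assumptions, not the paper's code): every interrupted emitter-receiver beam contributes its midpoint, and the touch position is the arithmetic mean of those midpoints.

```python
import numpy as np

def touch_position(blocked, emitter_xy, receiver_xy):
    """Estimate a touch point from a blocked-beam matrix.

    blocked[i, j] is 1 when the beam from emitter i to receiver j was
    interrupted; emitter_xy and receiver_xy hold the tube coordinates.
    """
    idx = np.argwhere(blocked == 1)
    if len(idx) == 0:
        return None  # no beam interrupted, no touch
    midpoints = (emitter_xy[idx[:, 0]] + receiver_xy[idx[:, 1]]) / 2.0
    return midpoints.mean(axis=0)  # arithmetic average of beam midpoints
```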

  7. Retinal image quality assessment based on image clarity and content

    NASA Astrophysics Data System (ADS)

    Abdel-Hamid, Lamiaa; El-Rafei, Ahmed; El-Ramly, Salwa; Michelson, Georg; Hornegger, Joachim

    2016-09-01

    Retinal image quality assessment (RIQA) is an essential step in automated screening systems to avoid misdiagnosis caused by processing poor quality retinal images. A no-reference transform-based RIQA algorithm is introduced that assesses images based on five clarity and content quality issues: sharpness, illumination, homogeneity, field definition, and content. Transform-based RIQA algorithms have the advantage of considering retinal structures while being computationally inexpensive. Wavelet-based features are proposed to evaluate the sharpness and overall illumination of the images. A retinal saturation channel is designed and used along with wavelet-based features for homogeneity assessment. The presented sharpness and illumination features are utilized to assure adequate field definition, whereas color information is used to exclude nonretinal images. Several publicly available datasets of varying quality grades are utilized to evaluate the feature sets resulting in area under the receiver operating characteristic curve above 0.99 for each of the individual feature sets. The overall quality is assessed by a classifier that uses the collective features as an input vector. The classification results show superior performance of the algorithm in comparison to other methods from literature. Moreover, the algorithm addresses efficiently and comprehensively various quality issues and is suitable for automatic screening systems.
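    A minimal wavelet-based sharpness feature in the spirit of the above can be computed with PyWavelets (the paper's feature set is richer): sharp images put a larger share of their energy into the detail subbands.

```python
import numpy as np
import pywt

def wavelet_sharpness(gray_image: np.ndarray) -> float:
    """Crude sharpness score: energy fraction in the detail subbands
    of a single-level 2-D Haar wavelet decomposition."""
    cA, (cH, cV, cD) = pywt.dwt2(gray_image.astype(float), "haar")
    detail = sum(float((c ** 2).sum()) for c in (cH, cV, cD))
    total = detail + float((cA ** 2).sum())
    return detail / total if total > 0 else 0.0
```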

  8. Halftoning processing on a JPEG-compressed image

    NASA Astrophysics Data System (ADS)

    Sibade, Cedric; Barizien, Stephane; Akil, Mohamed; Perroton, Laurent

    2003-12-01

    Digital image processing algorithms are usually designed for the raw format, that is, for an uncompressed representation of the image. Therefore, prior to transforming or processing a compressed format, decompression is applied; the result of the processing is then re-compressed for further transfer or storage. This change of data representation is resource-consuming in terms of computation, time, and memory usage. In the wide-format printing industry, this becomes an important issue: e.g., a 1 m2 input color image scanned at 600 dpi exceeds 1.6 GB in its raw representation. However, some image processing algorithms can be performed in the compressed domain, by applying an equivalent operation on the compressed format. This paper presents an innovative application of the halftoning-by-screening operation to JPEG-compressed images. The compressed-domain transform is performed by computing the threshold operation of the screening algorithm in the DCT domain. The algorithm is illustrated by examples for different halftone masks. A pre-sharpening operation applied to a low-quality JPEG-compressed image is also described; it de-noises the image and enhances its contours.
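    For reference, classical halftoning by screening is a pixelwise threshold against a periodically tiled mask, as in the sketch below; the paper's contribution is performing the equivalent comparison on DCT coefficients without decompressing, which this spatial-domain sketch does not reproduce.

```python
import numpy as np

def screen_halftone(gray: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Halftone a grayscale image by thresholding against a tiled mask."""
    h, w = gray.shape
    mh, mw = mask.shape
    # Tile the threshold mask over the whole image, then compare pixelwise.
    tiled = np.tile(mask, (h // mh + 1, w // mw + 1))[:h, :w]
    return (gray > tiled).astype(np.uint8)  # 1 = paper white, 0 = ink
```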

  9. Edge-directed inference for microaneurysms detection in digital fundus images

    NASA Astrophysics Data System (ADS)

    Huang, Ke; Yan, Michelle; Aviyente, Selin

    2007-03-01

    Microaneurysm (MA) detection is a critical step in diabetic retinopathy screening, since MAs are the earliest visible warning of potential future problems. A variety of algorithms have been proposed for MA detection in mass screening. The core technology of most existing methods is a directional mathematical morphological operation called the "top-hat" filter, which requires multiple filtering operations at each pixel. Background structure, uneven illumination, and noise often cause confusion between MAs and some non-MA structures and limit the applicability of the filter. In this paper, a novel detection framework based on edge-directed inference is proposed for MA detection. Candidate MA regions are first delineated from the edge map of a fundus image. Features measuring shape, brightness, and contrast are extracted for each candidate MA region to better distinguish true MAs from false detections. Algorithmic analysis and empirical evaluation reveal that the proposed edge-directed inference outperforms the "top-hat" based algorithm in both detection accuracy and computational speed.
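    The baseline the paper compares against can be approximated with an ordinary (non-directional) morphological top-hat, as in this hedged sketch: microaneurysms are small dark dots, so a black top-hat on the green channel highlights them, and a threshold yields candidates.

```python
import numpy as np
from skimage.morphology import black_tophat, disk

def ma_candidates(green_channel: np.ndarray, radius: int = 5, k: float = 3.0):
    """Candidate microaneurysm map via a morphological black top-hat.

    The structuring element is slightly larger than a typical MA, so the
    filter responds to small dark blobs; k controls threshold strictness.
    """
    response = black_tophat(green_channel, disk(radius))
    return response > response.mean() + k * response.std()
```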

  10. Optimal structure and parameter learning of Ising models

    DOE PAGES

    Lokhov, Andrey; Vuffray, Marc Denis; Misra, Sidhant; ...

    2018-03-16

    Reconstruction of the structure and parameters of an Ising model from binary samples is a problem of practical importance in a variety of disciplines, ranging from statistical physics and computational biology to image processing and machine learning. The focus of the research community has shifted toward developing universal reconstruction algorithms that are both computationally efficient and require the minimal amount of expensive data. Here, we introduce a new method, interaction screening, which accurately estimates model parameters using local optimization problems. The algorithm provably achieves perfect graph structure recovery with an information-theoretically optimal number of samples, notably in the low-temperature regime, which is known to be the hardest for learning. The efficacy of interaction screening is assessed through extensive numerical tests on synthetic Ising models of various topologies with different types of interactions, as well as on real data produced by a D-Wave quantum computer. Finally, this study shows that the interaction screening method is an exact, tractable, and optimal technique that universally solves the inverse Ising problem.
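    Following the published form of the interaction screening objective (one convex local problem per spin; the full method adds an l1 penalty for structure learning, omitted here), a minimal numpy/scipy sketch is:

```python
import numpy as np
from scipy.optimize import minimize

def interaction_screening_row(samples: np.ndarray, i: int) -> np.ndarray:
    """Estimate the couplings of spin i from +/-1 samples of shape (M, N)
    by minimizing S_i(theta) = mean_m exp(-s_i * (theta . s_others)),
    the interaction screening objective (field term omitted)."""
    s_i = samples[:, i]
    s_rest = np.delete(samples, i, axis=1)  # the other N-1 spins

    def objective(theta):
        return np.exp(-s_i * (s_rest @ theta)).mean()

    res = minimize(objective, np.zeros(samples.shape[1] - 1), method="BFGS")
    return res.x  # estimated couplings theta_ij for j != i
```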

  11. Optimal structure and parameter learning of Ising models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lokhov, Andrey; Vuffray, Marc Denis; Misra, Sidhant

    Reconstruction of the structure and parameters of an Ising model from binary samples is a problem of practical importance in a variety of disciplines, ranging from statistical physics and computational biology to image processing and machine learning. The focus of the research community has shifted toward developing universal reconstruction algorithms that are both computationally efficient and require the minimal amount of expensive data. Here, we introduce a new method, interaction screening, which accurately estimates model parameters using local optimization problems. The algorithm provably achieves perfect graph structure recovery with an information-theoretically optimal number of samples, notably in the low-temperature regime, which is known to be the hardest for learning. The efficacy of interaction screening is assessed through extensive numerical tests on synthetic Ising models of various topologies with different types of interactions, as well as on real data produced by a D-Wave quantum computer. Finally, this study shows that the interaction screening method is an exact, tractable, and optimal technique that universally solves the inverse Ising problem.

  12. Performance of rapid tests and algorithms for HIV screening in Abidjan, Ivory Coast.

    PubMed

    Loukou, Y G; Cabran, M A; Yessé, Zinzendorf Nanga; Adouko, B M O; Lathro, S J; Agbessi-Kouassi, K B T

    2014-01-01

    Seven rapid diagnostic tests (RDTs) for HIV were evaluated by a panel group who collected serum samples from patients in Abidjan (HIV-1 = 203, HIV-2 = 25, HIV-dual = 25, HIV = 305). Kit performances were assessed against the reference technique (enzyme-linked immunosorbent assay). The following RDTs showed a sensitivity of 100% and a specificity higher than 99%: Determine, Oraquick, SD Bioline, BCP, and Stat-Pak. These kits were used to establish infection screening strategies. Combinations of 2 or 3 of these tests in serial or parallel algorithms showed that the serial combination of 2 tests (Oraquick and Bioline) and of 3 tests (Determine, BCP, and Stat-Pak) gave the best performances (sensitivity, specificity, positive predictive value, and negative predictive value of 100%). However, the 2-test combination proved more costly than the 3-test combination. The combination of the Determine, BCP, and Stat-Pak tests, with the third serving as a tiebreaker, could be an alternative for HIV/AIDS serological screening in Abidjan.
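    Under the usual conditional-independence approximation (assumed here), serial and parallel combinations of two tests trade sensitivity against specificity in opposite directions, which the helpers below make explicit; the study's 100% figures are empirical, not derived from these formulas.

```python
def combine_serial(se1, sp1, se2, sp2):
    """Serial: positive only if BOTH tests are positive (independence assumed).
    Sensitivity falls, specificity rises."""
    return se1 * se2, 1 - (1 - sp1) * (1 - sp2)

def combine_parallel(se1, sp1, se2, sp2):
    """Parallel: positive if EITHER test is positive (independence assumed).
    Sensitivity rises, specificity falls."""
    return 1 - (1 - se1) * (1 - se2), sp1 * sp2
```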

  13. Benchmarking methods and data sets for ligand enrichment assessment in virtual screening.

    PubMed

    Xia, Jie; Tilahun, Ermias Lemma; Reid, Terry-Elinor; Zhang, Liangren; Wang, Xiang Simon

    2015-01-01

    Retrospective small-scale virtual screening (VS) based on benchmarking data sets has been widely used to estimate the ligand enrichments of VS approaches in prospective (i.e. real-world) efforts. However, the intrinsic differences between benchmarking sets and real screening chemical libraries can bias the assessment. Herein, we summarize the history of benchmarking methods and data sets and highlight three main types of bias found in benchmarking sets, i.e. "analogue bias", "artificial enrichment" and "false negatives". In addition, we introduce our recent algorithm for building maximum-unbiased benchmarking sets applicable to both ligand-based and structure-based VS approaches, and its application to three important human histone deacetylase (HDAC) isoforms, i.e. HDAC1, HDAC6 and HDAC8. Leave-one-out cross-validation (LOO CV) demonstrates that the benchmarking sets built by our algorithm are maximum-unbiased as measured by property matching, ROC curves and AUCs. Copyright © 2014 Elsevier Inc. All rights reserved.

  14. Benchmarking Methods and Data Sets for Ligand Enrichment Assessment in Virtual Screening

    PubMed Central

    Xia, Jie; Tilahun, Ermias Lemma; Reid, Terry-Elinor; Zhang, Liangren; Wang, Xiang Simon

    2014-01-01

    Retrospective small-scale virtual screening (VS) based on benchmarking data sets has been widely used to estimate the ligand enrichments of VS approaches in prospective (i.e. real-world) efforts. However, the intrinsic differences between benchmarking sets and real screening chemical libraries can bias the assessment. Herein, we summarize the history of benchmarking methods and data sets and highlight three main types of bias found in benchmarking sets, i.e. “analogue bias”, “artificial enrichment” and “false negatives”. In addition, we introduce our recent algorithm for building maximum-unbiased benchmarking sets applicable to both ligand-based and structure-based VS approaches, and its application to three important human histone deacetylase (HDAC) isoforms, i.e. HDAC1, HDAC6 and HDAC8. Leave-One-Out Cross-Validation (LOO CV) demonstrates that the benchmarking sets built by our algorithm are maximum-unbiased in terms of property matching, ROC curves and AUCs. PMID:25481478

  15. A search for H/ACA snoRNAs in yeast using MFE secondary structure prediction.

    PubMed

    Edvardsson, Sverker; Gardner, Paul P; Poole, Anthony M; Hendy, Michael D; Penny, David; Moulton, Vincent

    2003-05-01

    Noncoding RNA genes produce functional RNA molecules rather than coding for proteins. One such family is the H/ACA snoRNAs. Unlike the related C/D snoRNAs these have resisted automated detection to date. We develop an algorithm to screen the yeast genome for novel H/ACA snoRNAs. To achieve this, we introduce some new methods for facilitating the search for noncoding RNAs in genomic sequences which are based on properties of predicted minimum free-energy (MFE) secondary structures. The algorithm has been implemented and can be generalized to enable screening of other eukaryote genomes. We find that use of primary sequence alone is insufficient for identifying novel H/ACA snoRNAs. Only the use of secondary structure filters reduces the number of candidates to a manageable size. From genomic context, we identify three strong H/ACA snoRNA candidates. These together with a further 47 candidates obtained by our analysis are being experimentally screened.
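    One common MFE-based filter of the kind described above (a sketch assuming the ViennaRNA Python bindings provide RNA.fold) is a z-score of the candidate's folding energy against shuffled versions of the same sequence; genuinely structured candidates fold lower than their shuffles.

```python
import random
import RNA  # ViennaRNA Python bindings (assumed available)

def mfe_zscore(seq: str, n_shuffles: int = 50) -> float:
    """Z-score of a sequence's minimum free energy versus
    mononucleotide-shuffled controls (more negative = more structured)."""
    _, mfe = RNA.fold(seq)
    energies = []
    for _ in range(n_shuffles):
        s = list(seq)
        random.shuffle(s)
        energies.append(RNA.fold("".join(s))[1])
    mu = sum(energies) / n_shuffles
    sd = (sum((e - mu) ** 2 for e in energies) / n_shuffles) ** 0.5
    return (mfe - mu) / sd if sd > 0 else 0.0
```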

  16. Vision Algorithms Catch Defects in Screen Displays

    NASA Technical Reports Server (NTRS)

    2014-01-01

    Andrew Watson, a senior scientist at Ames Research Center, developed a tool called the Spatial Standard Observer (SSO), which models human vision for use in robotic applications. Redmond, Washington-based Radiant Zemax LLC licensed the technology from NASA and combined it with its imaging colorimeter system, creating a powerful tool that high-volume manufacturers of flat-panel displays use to catch defects in screens.

  17. A ranking method for the concurrent learning of compounds with various activity profiles.

    PubMed

    Dörr, Alexander; Rosenbaum, Lars; Zell, Andreas

    2015-01-01

    In this study, we present an SVM-based ranking algorithm for the concurrent learning of compounds with different activity profiles and their varying prioritization. To this end, a specific labeling of each compound was devised in order to infer virtual screening models against multiple targets. We compared the method with several state-of-the-art SVM classification techniques that are capable of inferring multi-target screening models, on three chemical data sets (cytochrome P450s, dehydrogenases, and a trypsin-like protease data set) containing three different biological targets each. The experiments show that ranking-based algorithms give increased performance for single- and multi-target virtual screening. Moreover, compounds that do not completely fulfill the desired activity profile are still ranked higher than decoys or compounds with an entirely undesired profile, compared to other multi-target SVM methods. SVM-based ranking methods constitute a valuable approach for virtual screening in multi-target drug design. They are most helpful when dealing with compounds with various activity profiles and when many ligands with an already perfectly matching activity profile are not to be expected.

  18. Reproducibility of risk figures in 2nd-trimester maternal serum screening for down syndrome: comparison of 2 laboratories.

    PubMed

    Benn, Peter A; Makowski, Gregory S; Egan, James F X; Wright, Dave

    2006-11-01

    Analytical error affects 2nd-trimester maternal serum screening for Down syndrome risk estimation. We analyzed the between-laboratory reproducibility of risk estimates from 2 laboratories. Laboratory 1 used Bayer ACS180 immunoassays for alpha-fetoprotein (AFP) and human chorionic gonadotropin (hCG), Diagnostic Systems Laboratories (DSL) RIA for unconjugated estriol (uE3), and DSL enzyme immunoassay for inhibin-A (INH-A). Laboratory 2 used Beckman immunoassays for AFP, hCG, and uE3, and DSL enzyme immunoassay for INH-A. Analyte medians were separately established for each laboratory. We used the same computational algorithm for all risk calculations, and we used Monte Carlo methods for computer modeling. For 462 samples tested, risk figures from the 2 laboratories differed >2-fold for 44.7%, >5-fold for 7.1%, and >10-fold for 1.7%. Between-laboratory differences in analytes were greatest for uE3 and INH-A. The screen-positive rates were 9.3% for laboratory 1 and 11.5% for laboratory 2, with a significant difference in the patients identified as screen-positive vs screen-negative (McNemar test, P<0.001). Computer modeling confirmed the large between-laboratory risk differences. Differences in performance of assays and laboratory procedures can have a large effect on patient-specific risks. Screening laboratories should minimize test imprecision and ensure that each assay performs in a manner similar to that assumed in the risk computational algorithm.

  19. Medical follow-up of workers exposed to lung carcinogens: French evidence-based and pragmatic recommendations.

    PubMed

    Delva, Fleur; Margery, Jacques; Laurent, François; Petitprez, Karine; Pairon, Jean-Claude

    2017-02-14

    The aim of this work was to establish recommendations for the medical follow-up of workers currently or previously exposed to lung carcinogens. A critical synthesis of the literature was conducted. Occupational lung carcinogenic substances were listed and classified according to their level of lung cancer risk, and a targeted screening protocol was defined. A clinical trial, the National Lung Screening Trial (NLST), showed the efficacy of chest computed tomography (CT) screening for populations of smokers aged 55-74 years with over 30 pack-years of exposure who had stopped smoking less than 15 years earlier. To propose screening in accordance with NLST criteria while accounting for occupational risk factors, screening among smokers and former smokers needs to consider the types of occupational exposure for which the risk level is at least equivalent to that of the subjects included in the NLST. The working group proposes an algorithm that estimates the relative risk of each occupational lung carcinogen, taking into account exposure to tobacco, based on available data from the literature. Given the lack of data on bronchopulmonary cancer (BPC) screening in occupationally exposed workers, the working group proposed implementing a screening experiment for BPC in subjects occupationally exposed, or having been occupationally exposed, to lung carcinogens who are confirmed as having high risk factors for BPC. A specific algorithm is proposed to determine the level of BPC risk, taking into account the different occupational lung carcinogens and tobacco smoking at the individual level.

  20. Algorithm of reducing the false positives in IDS based on correlation Analysis

    NASA Astrophysics Data System (ADS)

    Liu, Jianyi; Li, Sida; Zhang, Ru

    2018-03-01

    This paper proposes an algorithm for reducing false positives in IDS based on correlation analysis. First, the algorithm analyzes the characteristics that distinguish false positives from real alarms and performs a preliminary screening of false positives; it then applies attribute-similarity clustering to further reduce the number of alarms; finally, exploiting the characteristics of multi-step attacks, it correlates the remaining alarms by causal relationship. The paper also proposes a reverse-causation algorithm, building on earlier attack-association methods, that turns alarm information into a complete attack path. Experiments show that the algorithm reduces the number of alarms, improves the efficiency of alarm processing, and contributes to attack-purpose identification and alarm accuracy.
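    The attribute-similarity clustering step might look like the following sketch (the field names and time window are assumptions, not the paper's code): alarms sharing source, destination, and signature within a time window collapse into one cluster.

```python
from collections import defaultdict

def cluster_alarms(alarms, window=60.0):
    """Group alarms by (src, dst, sig); alarms within `window` seconds of
    the previous one in a group join its current cluster.

    Each alarm is a dict with keys: src, dst, sig, ts (epoch seconds).
    Returns a list of clusters (lists of alarms)."""
    groups = defaultdict(list)
    for a in sorted(alarms, key=lambda a: a["ts"]):
        clusters = groups[(a["src"], a["dst"], a["sig"])]
        if clusters and a["ts"] - clusters[-1][-1]["ts"] <= window:
            clusters[-1].append(a)  # continue the current cluster
        else:
            clusters.append([a])    # start a new cluster
    return [c for cs in groups.values() for c in cs]
```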

  1. Improved artificial bee colony algorithm based gravity matching navigation method.

    PubMed

    Gao, Wei; Zhao, Bo; Zhou, Guang Tao; Wang, Qiu Ying; Yu, Chun Yang

    2014-07-18

    The gravity matching navigation algorithm is one of the key technologies for gravity-aided inertial navigation systems. With the development of intelligent algorithms, the powerful search ability of the Artificial Bee Colony (ABC) algorithm makes it applicable to gravity matching navigation. However, the search mechanisms of existing basic ABC algorithms cannot meet the need for high accuracy in gravity-aided navigation. First, modifications are proposed to improve the performance of the basic ABC algorithm. Second, a new search mechanism is presented, based on an improved ABC algorithm that uses external speed information. Finally, the modified Hausdorff distance is introduced to screen the possible matching results. Both simulations and ocean experiments verify the feasibility of the method, and the results show that its matching rate is high enough to obtain a precise matching position.
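    The modified Hausdorff distance used to screen matches is a standard construction (the directed-average form is assumed here): for each point set, average the nearest-neighbour distance to the other set, then take the larger of the two directed values.

```python
import numpy as np

def modified_hausdorff(A: np.ndarray, B: np.ndarray) -> float:
    """Modified Hausdorff distance between point sets A (m, d) and B (n, d);
    the averaging makes it less outlier-sensitive than the classical form."""
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)  # pairwise
    return max(d.min(axis=1).mean(),  # directed A -> B
               d.min(axis=0).mean())  # directed B -> A
```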

  2. Improved Artificial Bee Colony Algorithm Based Gravity Matching Navigation Method

    PubMed Central

    Gao, Wei; Zhao, Bo; Zhou, Guang Tao; Wang, Qiu Ying; Yu, Chun Yang

    2014-01-01

    The gravity matching navigation algorithm is one of the key technologies for gravity-aided inertial navigation systems. With the development of intelligent algorithms, the powerful search ability of the Artificial Bee Colony (ABC) algorithm makes it applicable to gravity matching navigation. However, the search mechanisms of existing basic ABC algorithms cannot meet the need for high accuracy in gravity-aided navigation. First, modifications are proposed to improve the performance of the basic ABC algorithm. Second, a new search mechanism is presented, based on an improved ABC algorithm that uses external speed information. Finally, the modified Hausdorff distance is introduced to screen the possible matching results. Both simulations and ocean experiments verify the feasibility of the method, and the results show that its matching rate is high enough to obtain a precise matching position. PMID:25046019

  3. Validation of Version 3.0 of the Breast Cancer Genetics Referral Screening Tool (B-RST™).

    PubMed

    Bellcross, Cecelia; Hermstad, April; Tallo, Christine; Stanislaw, Christine

    2018-05-08

    Despite increased awareness of hereditary breast and ovarian cancer among clinicians and the public, many BRCA1/2 mutation carriers remain unaware of their risk status. The Breast Cancer Genetics Referral Screening Tool (B-RST™) was created and validated to easily identify individuals at increased risk for hereditary breast and ovarian cancer for referral to cancer genetics services. The purpose of this study was to revise B-RST™ to maximize sensitivity against BRCA1/2 mutation status. We analyzed pedigrees of 277 individuals who had undergone BRCA1/2 testing to determine modifications to the B-RST™ 2.0 algorithm that would maximize sensitivity for mutations, while maintaining simplicity. We used McNemar's chi-square test to compare validation measures between the revised version (3.0) and the 2.0 version. Algorithmic changes made to B-RST™ 2.0 increased the sensitivity against BRCA1/2 mutation analysis from 71.1 to 94.0% (P < 0.0001). While specificity decreased, all screen-positive individuals were appropriate for cancer genetics referral, the primary purpose of the tool. Despite calls for BRCA1/2 population screening, there remains a critical need to identify those most at risk who should receive cancer genetics services. B-RST™ version 3.0 demonstrates high sensitivity for BRCA1/2 mutations, yet remains a simple and quick screening tool for at-risk individuals.

  4. Target Tracking Based Scene Analysis

    DTIC Science & Technology

    1984-08-01

    Excerpt from the report's reference list: S.T. Barnard and M.A. Fischler, "Computational Stereo", Computing Surveys 14, 1982, pp. 553-572; G.R. Legters Jr. and T.Y. Young, "A Mathematical ...", in Pictorial Data Analysis (Proc. NATO Advanced Study Institute, Bonas, France, August 1-12, 1982), Springer, Berlin, 1983.

  5. Investigation of Intermediary Metabolism and Energy Exchange Following Human Trauma.

    DTIC Science & Technology

    1980-09-01

    Excerpts from the report (Kinney, John M., DA 49-193-MD-2552): "... there is a further rise in extracellular water. There is no observable effect of diet during the first 4 postoperative days. ... Relating diet to skeletal muscle work performance has emphasized the beneficial effect of ... mg/kcal REE (high N). Each diet was given for 1 week, with the initial diet randomly assigned. Prior to TPN, all patients received 5% dextrose solution."

  6. Active polarimetric imaging: military and dual-use applications

    NASA Astrophysics Data System (ADS)

    Goudail, François; Boffety, Matthieu; Leviandier, Luc; Vannier, Nicolas

    2017-12-01

    Active polarimetric imaging reveals contrasts that are invisible to the human eye and to conventional cameras. It can extend the decamouflaging capabilities of active imaging systems and improve the detection of dangerous objects on runways. It also has numerous dual-use applications in fields such as biomedical imaging and industrial machine vision.

  7. Flameless Atomic Absorption Spectroscopy: Effects of Nitrates and Sulfates.

    DTIC Science & Technology

    1980-05-01

    Excerpt from the report: "Like other analytical techniques, flameless atomic absorption is subject to matrix or interference effects. Upon heating, nitrate and sulfate salts decompose to ..." [References: R.H. Eklund and J.E. Smith, Anal. Chem. 51, 1205 (1979); R.H. Eklund and J.A. Holcombe, Anal. Chim. Acta 109, 97 (1979).]

  8. China: A Country Study

    DTIC Science & Technology

    1987-07-01

    Excerpts from the study: "... into a nearby town to work in a small factory, open a noodle stand, or set up a machine repair business. Farmers, however, still could not legally ..."; "... steamed bread and noodles. Per capita consumption has risen, and the demand for wheat flour has increased as incomes have risen. Wheat has been by far ..."; "... the 1975 state constitution was characterized by instant, arbitrary arrest. Impromptu trials were conducted either by a police officer on the spot, or by a ..."

  9. Non Linear Dynamics and Chaos (La Dynamique Non-Lineaie et le Chaos)

    DTIC Science & Technology

    1993-06-01


  10. An Analysis of the INGRES Database Management System Applications Program Development Tools and Programming Environment

    DTIC Science & Technology

    1986-12-01

    Excerpts from the thesis: "Position cursor over the name of a report, then use the appropriate menu item to perform an operation on that report. Name / Owner / RBF? / Last changed ..."; catalogue entries include "FLASH: A Language-Independent, Portable File Access System" (Allchin, J.E., Keller, A.H., Wiederhold, G.) and "A Model for Automatic File and Program Design in Business Application Systems".

  11. MARSnet: Mission-aware Autonomous Radar Sensor Network for Future Combat Systems

    DTIC Science & Technology

    2009-03-23

    Excerpts from the report: "In addition, the sidelobe structure is changed because of the Doppler shift ..."; references include "... Synchronous CDMA Systems", IEICE Trans. Fundamentals, E83-A(11), 2000; P.Z. Fan, "New Direction in Spreading Sequence Design ..."; and N. Suehiro, N. Kuroyanagi, and P.Z. Fan, "Two types of polyphase sequence sets for approximately synchronized CDMA systems", IEICE Trans. Fundamentals, E86-A.

  12. Understanding and Influencing Public Support for Insurgency and Terrorism

    DTIC Science & Technology

    2012-01-01

    Excerpts from the notes and bibliography: "... and Speckhard, 2010; and Hamid, 2009. Hamid's discussion is based on his own experiences growing up in Egypt, through his years in medical school ..."; "... also highlights attractions such as duty, honor, and glory, but not earthly rewards (and not social services, such as medical care or education) ..."; "Gençler başarmanın sözünü verdi" ("Youth Promised Success"), June 2008; Gündoğan, Azat Zana, "The Kurdish Political Mobilization in the 1960s: The Case ..."

  13. Remotely Piloted Vehicle (RPV) Two versus Three Level Maintenance Support Concept Study.

    DTIC Science & Technology

    1988-01-15


  14. Gadolinium Scandium Gallium Garnet (GSGG) as a Solid-State Laser Host

    DTIC Science & Technology

    1987-07-01

    Excerpt from the report: "... certain other garnet materials for replacement. It also addresses the solid-state laser host material gadolinium scandium gallium garnet (GSGG) and its ... by neodymium-doped yttrium aluminum garnet (Nd:YAG) or other materials for most applications. In the years after the invention of the ruby laser, in ..."

  15. Optoelectronic image processing for cervical cancer screening

    NASA Astrophysics Data System (ADS)

    Narayanswamy, Ramkumar; Sharpe, John P.; Johnson, Kristina M.

    1994-05-01

    Automation of the Pap smear cervical screening method is highly desirable, as it relieves tedium for the human operators, reduces cost, and should increase accuracy and provide repeatability. We present here the design of a high-throughput optoelectronic system which forms the first stage of a two-stage system to automate Pap smear screening. We use a mathematical morphology technique called the hit-or-miss transform to identify the suspicious areas on a Pap smear slide. This algorithm is implemented using a VanderLugt architecture and a time-sequential ANDing smart pixel array.
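    The digital counterpart of the optically implemented hit-or-miss transform is available in SciPy; the sketch below uses hypothetical 'hit' and 'miss' templates (the real system encodes templates for suspicious cell morphology in the optical correlator).

```python
import numpy as np
from scipy import ndimage

# Hypothetical templates: `hit` must fit inside the object, `miss` inside
# the surrounding background (the two must not overlap).
hit = np.array([[0, 1, 0],
                [1, 1, 1],
                [0, 1, 0]])
miss = np.array([[1, 0, 1],
                 [0, 0, 0],
                 [1, 0, 1]])

def flag_matches(binary_image: np.ndarray) -> np.ndarray:
    """Mark pixels where foreground matches `hit` and background matches `miss`."""
    return ndimage.binary_hit_or_miss(binary_image, structure1=hit,
                                      structure2=miss)
```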

  16. Implementation Research to Inform the Use of Xpert MTB/RIF in Primary Health Care Facilities in High TB and HIV Settings in Resource Constrained Settings.

    PubMed

    Muyoyeta, Monde; Moyo, Maureen; Kasese, Nkatya; Ndhlovu, Mapopa; Milimo, Deborah; Mwanza, Winfridah; Kapata, Nathan; Schaap, Albertus; Godfrey Faussett, Peter; Ayles, Helen

    2015-01-01

    The current cost of Xpert MTB/RIF (Xpert) consumables is such that algorithms are needed to select which patients to prioritise for Xpert testing. We evaluated two algorithms for prioritisation of Xpert in primary health care settings in a high TB and HIV burden setting. Consecutive presumptive TB patients with a cough of any duration were offered either an Xpert or a fluorescence microscopy (FM) test depending on their CXR score or HIV status. In one facility, sputa from patients with an abnormal CXR were tested with Xpert and those with a normal CXR were tested with FM (the "CXR algorithm"). CXRs were scored automatically using a Computer Aided Diagnosis (CAD) program. In the other facility, patients who were HIV positive were tested using Xpert and those who were HIV negative were tested with FM (the "HIV algorithm"). Of 9482 individuals pre-screened with CXR, Xpert detected TB in 2090/6568 (31.8%) with an abnormal CXR, and FM was AFB positive in 8/2455 (0.3%) with a normal CXR. Of 4444 pre-screened with HIV, Xpert detected TB in 508/2265 (22.4%) HIV-positive individuals and FM was AFB positive in 212/1920 (11.0%) HIV-negative individuals. The notification rate of new bacteriologically confirmed TB increased from 366 to 620/100,000/yr at the CXR algorithm site and from 145 to 261/100,000/yr at the HIV algorithm site. The median time to starting TB treatment was 1 day (IQR 1-3) at the CXR site versus 3 days (IQR 2-5) at the HIV algorithm site (p<0.0001). Use of Xpert at primary care level in a resource-limited setting, in conjunction with pre-screening tests, reduced the number of Xpert tests performed. The routine use of Xpert resulted in additional cases of confirmed TB patients starting treatment; however, there was no increase in the absolute number of patients starting TB treatment. Same-day diagnosis and treatment commencement were achieved for both bacteriologically confirmed and empirically diagnosed patients where Xpert was used in conjunction with CXR.
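    In outline, the two triage rules reduce to a few lines (a sketch with assumed inputs, not the study's software):

```python
def choose_tb_test(cxr_abnormal=None, hiv_positive=None):
    """Facility-level triage evaluated in the study, in outline:
    CXR algorithm: abnormal CAD-scored X-ray -> Xpert, else microscopy;
    HIV algorithm: HIV-positive -> Xpert, else microscopy."""
    if cxr_abnormal is not None:
        return "Xpert MTB/RIF" if cxr_abnormal else "fluorescence microscopy"
    if hiv_positive is not None:
        return "Xpert MTB/RIF" if hiv_positive else "fluorescence microscopy"
    raise ValueError("need a CXR score or an HIV status to triage")
```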

  17. Detection of critical congenital heart defects: Review of contributions from prenatal and newborn screening

    PubMed Central

    Olney, Richard S.; Ailes, Elizabeth C.; Sontag, Marci K.

    2015-01-01

    In 2011, statewide newborn screening programs for critical congenital heart defects began in the United States, and subsequently screening has been implemented widely. In this review, we focus on data reports and collection efforts related to both prenatal diagnosis and newborn screening. Defect-specific, maternal, and geographic factors are associated with variations in prenatal detection, so newborn screening provides a population-wide safety net for early diagnosis. A new web-based repository is collecting information on newborn screening program policies, quality indicators related to screening programs, and specific case-level data on infants with these defects. Birth defects surveillance programs also collect data about critical congenital heart defects, particularly related to diagnostic timing, mortality, and services. Individuals from state programs, federal agencies, and national organizations will be interested in these data to further refine algorithms for screening in normal newborn nurseries, neonatal intensive care settings, and other special populations; and ultimately to evaluate the impact of screening on outcomes. PMID:25979782

  18. Detection of critical congenital heart defects: Review of contributions from prenatal and newborn screening.

    PubMed

    Olney, Richard S; Ailes, Elizabeth C; Sontag, Marci K

    2015-04-01

    In 2011, statewide newborn screening programs for critical congenital heart defects began in the United States, and subsequently screening has been implemented widely. In this review, we focus on data reports and collection efforts related to both prenatal diagnosis and newborn screening. Defect-specific, maternal, and geographic factors are associated with variations in prenatal detection, so newborn screening provides a population-wide safety net for early diagnosis. A new web-based repository is collecting information on newborn screening program policies, quality indicators related to screening programs, and specific case-level data on infants with these defects. Birth defects surveillance programs also collect data about critical congenital heart defects, particularly related to diagnostic timing, mortality, and services. Individuals from state programs, federal agencies, and national organizations will be interested in these data to further refine algorithms for screening in normal newborn nurseries, neonatal intensive care settings, and other special populations; and ultimately to evaluate the impact of screening on outcomes. Published by Elsevier Inc.

  19. Neurodevelopmental outcomes in children with congenital heart disease: evaluation and management: a scientific statement from the American Heart Association.

    PubMed

    Marino, Bradley S; Lipkin, Paul H; Newburger, Jane W; Peacock, Georgina; Gerdes, Marsha; Gaynor, J William; Mussatto, Kathleen A; Uzark, Karen; Goldberg, Caren S; Johnson, Walter H; Li, Jennifer; Smith, Sabrina E; Bellinger, David C; Mahle, William T

    2012-08-28

    The goal of this statement was to review the available literature on surveillance, screening, evaluation, and management strategies and put forward a scientific statement that would comprehensively review the literature and create recommendations to optimize neurodevelopmental outcome in the pediatric congenital heart disease (CHD) population. A writing group appointed by the American Heart Association and American Academy of Pediatrics reviewed the available literature addressing developmental disorder and disability and developmental delay in the CHD population, with specific attention given to surveillance, screening, evaluation, and management strategies. MEDLINE and Google Scholar database searches from 1966 to 2011 were performed for English-language articles cross-referencing CHD with pertinent search terms. The reference lists of identified articles were also searched. The American College of Cardiology/American Heart Association classification of recommendations and levels of evidence for practice guidelines were used. A management algorithm was devised that stratified children with CHD on the basis of established risk factors. For those deemed to be at high risk for developmental disorder or disabilities or for developmental delay, formal, periodic developmental and medical evaluations are recommended. A CHD algorithm for surveillance, screening, evaluation, reevaluation, and management of developmental disorder or disability has been constructed to serve as a supplement to the 2006 American Academy of Pediatrics statement on developmental surveillance and screening. The proposed algorithm is designed to be carried out within the context of the medical home. This scientific statement is meant for medical providers within the medical home who care for patients with CHD. Children with CHD are at increased risk of developmental disorder or disabilities or developmental delay. Periodic developmental surveillance, screening, evaluation, and reevaluation throughout childhood may enhance identification of significant deficits, allowing for appropriate therapies and education to enhance later academic, behavioral, psychosocial, and adaptive functioning.

  20. Urethral lymphogranuloma venereum infections in men with anorectal lymphogranuloma venereum and their partners: the missing link in the current epidemic?

    PubMed

    de Vrieze, Nynke Hesselina Neeltje; van Rooijen, Martijn; Speksnijder, Arjen Gerard Cornelis Lambertus; de Vries, Henry John C

    2013-08-01

    Urethral lymphogranuloma venereum (LGV) is not routinely screened for. We found that among 341 men who have sex with men with anorectal LGV, 7 (2.1%) had concurrent urethral LGV. Among 59 partners, 4 (6.8%) had urethral LGV infections. Urethral LGV is common, probably key in transmission, and missed by current routine LGV screening algorithms.

  1. A parallel algorithm for the initial screening of space debris collisions prediction using the SGP4/SDP4 models and GPU acceleration

    NASA Astrophysics Data System (ADS)

    Lin, Mingpei; Xu, Ming; Fu, Xiaoyu

    2017-05-01

    Currently, a tremendous amount of space debris in Earth orbit imperils operational spacecraft, so it is essential to undertake risk assessments of collisions and predict dangerous encounters in space. However, collision prediction for an enormous number of debris objects gives rise to large-scale computations. In this paper, a parallel algorithm is established on the Compute Unified Device Architecture (CUDA) platform of NVIDIA Corporation for collision prediction. Following the parallel structure of NVIDIA graphics processors, a block decomposition strategy is adopted in the algorithm: space debris is divided into batches, and the computation and data transfer operations of adjacent batches overlap. As a consequence, the latency to access shared memory during the entire computing process is significantly reduced, and a higher computing speed is reached. In principle, collision prediction for any number of debris objects over any time span can be executed. To verify the algorithm, a simulation example including 1382 pieces of debris, whose operational time scales vary from 1 min to 3 days, is conducted on an NVIDIA Tesla C2075. The simulation results demonstrate that, with the same computational accuracy as a CPU, the computing speed of the parallel algorithm on a GPU is 30 times that on a CPU. Based on this algorithm, collision prediction for over 150 Chinese spacecraft over a time span of 3 days can be completed in less than 3 h on a single computer, which meets the timeliness requirement of the initial screening task. Furthermore, the algorithm can be adapted for multiple tasks, including particle filtration, constellation design, and Monte Carlo simulation of orbital computation.

  2. Segmentation of the whole breast from low-dose chest CT images

    NASA Astrophysics Data System (ADS)

    Liu, Shuang; Salvatore, Mary; Yankelevitz, David F.; Henschke, Claudia I.; Reeves, Anthony P.

    2015-03-01

    Segmentation of the whole breast serves as the first step towards automated breast lesion detection. It is also necessary for automatically assessing breast density, which is considered an important risk factor for breast cancer. In this paper we present a fully automated algorithm to segment the whole breast in low-dose chest CT images (LDCT), which has been recommended as an annual lung cancer screening test. Automated whole-breast segmentation, with potential breast density readings and lesion detection in LDCT, will provide useful information for women who have received LDCT screening, especially those who have not undergone mammographic screening, by giving them additional risk indicators for breast cancer with no additional radiation exposure. The two main challenges addressed are the significant variation in the shape and location of the breast in LDCT and the separation of pectoral muscles from glandular tissues. The presented algorithm achieves robust whole-breast segmentation using an anatomy-directed rule-based method. The evaluation is performed on 20 LDCT scans by comparing the segmentation with ground truth manually annotated by a radiologist on one axial slice and two sagittal slices for each scan. The resulting average Dice coefficient is 0.880 with a standard deviation of 0.058, demonstrating that the automated segmentation algorithm achieves results consistent with manual annotations by a radiologist.
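    The Dice coefficient reported above is the standard overlap measure for binary masks; a minimal version is:

```python
import numpy as np

def dice(seg: np.ndarray, truth: np.ndarray) -> float:
    """Dice similarity: 2|A & B| / (|A| + |B|); 1.0 is perfect overlap."""
    seg, truth = seg.astype(bool), truth.astype(bool)
    denom = seg.sum() + truth.sum()
    return 2.0 * np.logical_and(seg, truth).sum() / denom if denom else 1.0
```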

  3. Sensitivity and specificity of the subcutaneous implantable cardioverter defibrillator pre-implant screening tool.

    PubMed

    Zeb, Mehmood; Curzen, Nick; Allavatam, Venugopal; Wilson, David; Yue, Arthur; Roberts, Paul; Morgan, John

    2015-09-15

    The sensitivity and specificity of the subcutaneous implantable cardioverter defibrillator (S-ICD) pre-implant screening tool required clinical evaluation. Bipolar vectors were derived from electrodes positioned at locations similar to those used for S-ICD sensing and pre-implant screening, with recordings collected through 80-electrode PRIME®-ECGs in six different postures from 40 subjects (10 healthy controls and 30 patients with complex congenital heart disease (CCHD): 10 with tetralogy of Fallot (TOF), 10 with single-ventricle physiology (SVP), and 10 with transposition of the great arteries (TGA)). The resulting vectors were analysed using the S-ICD pre-implant screening tool (Boston Scientific) and processed through the S-ICD sensing algorithm (Boston Scientific). The data were then evaluated using 2 × 2 contingency tables. Fisher exact and McNemar tests were used to compare the different categories of CCHD, with p < 0.05 vs. controls considered statistically significant. Of the subjects, 57% were male, with a mean age of 36.3 years. The sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) of the S-ICD screening tool were 95%, 79%, 59%, and 98%, respectively, for controls, and 84%, 79%, 76%, and 86%, respectively, in patients with CCHD (p = 0.0001). The screening tool was comparatively more sensitive in normal controls but less specific in both CCHD patients and controls, a possible explanation for the reported high incidence of inappropriate S-ICD shocks. We therefore propose pre-implant screening with a device that uses the S-ICD sensing algorithm itself, to minimise false exclusion and false selection and hence minimise potentially inappropriate shocks. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
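    The four metrics above all derive from a 2 × 2 contingency table. A minimal sketch, with illustrative counts chosen to roughly reproduce the control-group values (they are not the study's data):

    ```python
    # Sensitivity, specificity, PPV, and NPV from a 2x2 contingency table.
    def screen_metrics(tp, fp, fn, tn):
        return {
            "sensitivity": tp / (tp + fn),   # true positives among diseased
            "specificity": tn / (tn + fp),   # true negatives among healthy
            "PPV": tp / (tp + fp),           # precision of a positive call
            "NPV": tn / (tn + fn),           # reliability of a negative call
        }

    # Hypothetical counts: ~95% sens, ~79% spec, ~59% PPV, ~98% NPV.
    print(screen_metrics(tp=19, fp=13, fn=1, tn=47))
    ```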

  4. Efficient Screening of Climate Model Sensitivity to a Large Number of Perturbed Input Parameters [plus supporting information]

    DOE PAGES

    Covey, Curt; Lucas, Donald D.; Tannahill, John; ...

    2013-07-01

    Modern climate models contain numerous input parameters, each with a range of possible values. Since the volume of parameter space increases exponentially with the number of parameters N, it is generally impossible to directly evaluate a model throughout this space even if just 2-3 values are chosen for each parameter. Sensitivity screening algorithms, however, can identify input parameters having relatively little effect on a variety of output fields, either individually or in nonlinear combination. This can aid both model development and the uncertainty quantification (UQ) process. Here we report results from a parameter sensitivity screening algorithm hitherto untested in climate modeling, the Morris one-at-a-time (MOAT) method. This algorithm drastically reduces the computational cost of estimating sensitivities in a high dimensional parameter space because the sample size grows linearly rather than exponentially with N. It nevertheless samples over much of the N-dimensional volume and allows assessment of parameter interactions, unlike traditional elementary one-at-a-time (EOAT) parameter variation. We applied both EOAT and MOAT to the Community Atmosphere Model (CAM), assessing CAM's behavior as a function of 27 uncertain input parameters related to the boundary layer, clouds, and other subgrid scale processes. For radiation balance at the top of the atmosphere, EOAT and MOAT rank most input parameters similarly, but MOAT identifies a sensitivity that EOAT underplays for two convection parameters that operate nonlinearly in the model. MOAT's ranking of input parameters is robust to modest algorithmic variations, and it is qualitatively consistent with model development experience. Supporting information is provided at the end of the full text of the article.
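    To make the linear-in-N cost concrete: each Morris trajectory perturbs one parameter at a time from a moving base point, so a trajectory costs N+1 model runs. The sketch below is a simplified illustration on a toy function standing in for a climate model run; the trajectory construction and the mu* statistic follow the standard Morris method, not the study's exact implementation.

    ```python
    # Morris one-at-a-time (MOAT) elementary effects for a toy model.
    import numpy as np

    def model(x):                       # stand-in for one model run
        return x[0] ** 2 + 2 * x[1] + 0.1 * x[1] * x[2]

    def moat(model, n_params, n_traj=20, delta=0.1, seed=0):
        rng = np.random.default_rng(seed)
        effects = np.zeros((n_traj, n_params))
        for t in range(n_traj):
            x = rng.uniform(0, 1 - delta, n_params)   # trajectory start
            base = model(x)
            for i in rng.permutation(n_params):       # one parameter at a time
                x[i] += delta
                y = model(x)
                effects[t, i] = (y - base) / delta    # elementary effect
                base = y
        return np.abs(effects).mean(axis=0)           # mu*: mean |effect|

    print(moat(model, n_params=3))   # larger mu* = more influential parameter
    ```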

  5. Postnatal gestational age estimation using newborn screening blood spots: a proposed validation protocol

    PubMed Central

    Murphy, Malia S Q; Hawken, Steven; Atkinson, Katherine M; Milburn, Jennifer; Pervin, Jesmin; Gravett, Courtney; Stringer, Jeffrey S A; Rahman, Anisur; Lackritz, Eve; Chakraborty, Pranesh; Wilson, Kumanan

    2017-01-01

    Background: Knowledge of gestational age (GA) is critical for guiding neonatal care and quantifying regional burdens of preterm birth. In settings where access to ultrasound dating is limited, postnatal estimates are frequently used despite the issues of accuracy associated with postnatal approaches. Newborn metabolic profiles are known to vary by severity of preterm birth. Recent work by our group and others has highlighted the accuracy of postnatal GA estimation algorithms derived from routinely collected newborn screening profiles. This protocol outlines the validation of a GA model originally developed in a North American cohort among international newborn cohorts. Methods: Our primary objective is to use blood spot samples collected from infants born in Zambia and Bangladesh to evaluate our algorithm's capacity to correctly classify GA within 1, 2, 3 and 4 weeks. Secondary objectives are to 1) determine the algorithm's accuracy in small-for-gestational-age and large-for-gestational-age infants, 2) determine its ability to correctly discriminate GA of newborns across dichotomous thresholds of preterm birth (≤34 weeks, <37 weeks GA) and 3) compare the relative performance of algorithms derived from newborn screening panels including all available analytes and those restricted to analyte subsets. The study population will consist of infants born to mothers already enrolled in one of two preterm birth cohorts in Lusaka, Zambia, and Matlab, Bangladesh. Dried blood spot samples will be collected and sent for analysis in Ontario, Canada, for model validation. Discussion: This study will determine the validity of a GA estimation algorithm across ethnically diverse infant populations and assess population specific variations in newborn metabolic profiles. PMID:29104765
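    The primary validation metric described above reduces to the fraction of infants whose estimated GA falls within ±k weeks of the reference GA. A minimal sketch with hypothetical values:

    ```python
    # Fraction of estimates within +/- k weeks of reference GA (toy data).
    import numpy as np

    def within_k_weeks(ga_true, ga_est, k):
        return np.mean(np.abs(np.asarray(ga_true) - np.asarray(ga_est)) <= k)

    ga_true = [39.0, 32.5, 36.0, 28.0, 40.0]   # illustrative reference GAs (weeks)
    ga_est  = [38.2, 34.0, 35.5, 29.5, 39.1]   # illustrative model estimates
    for k in (1, 2, 3, 4):
        print(f"within {k} wk: {within_k_weeks(ga_true, ga_est, k):.2f}")
    ```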

  6. Effects of Iterative Reconstruction Algorithms on Computer-assisted Detection (CAD) Software for Lung Nodules in Ultra-low-dose CT for Lung Cancer Screening.

    PubMed

    Nomura, Yukihiro; Higaki, Toru; Fujita, Masayo; Miki, Soichiro; Awaya, Yoshikazu; Nakanishi, Toshio; Yoshikawa, Takeharu; Hayashi, Naoto; Awai, Kazuo

    2017-02-01

    This study aimed to evaluate the effects of iterative reconstruction (IR) algorithms on computer-assisted detection (CAD) software for lung nodules in ultra-low-dose computed tomography (ULD-CT) for lung cancer screening. We selected 85 subjects who underwent both a low-dose CT (LD-CT) scan and an additional ULD-CT scan in our lung cancer screening program for high-risk populations. The LD-CT scans were reconstructed with filtered back projection (FBP; LD-FBP). The ULD-CT scans were reconstructed with FBP (ULD-FBP), adaptive iterative dose reduction 3D (AIDR 3D; ULD-AIDR 3D), and forward projected model-based IR solution (FIRST; ULD-FIRST). CAD software for lung nodules was applied to each image dataset, and the performance of the CAD software was compared among the different IR algorithms. The mean volume CT dose indexes were 3.02 mGy (LD-CT) and 0.30 mGy (ULD-CT). For overall nodules, the sensitivities of CAD software at 3.0 false positives per case were 78.7% (LD-FBP), 9.3% (ULD-FBP), 69.4% (ULD-AIDR 3D), and 77.8% (ULD-FIRST). Statistical analysis showed that the sensitivities of ULD-AIDR 3D and ULD-FIRST were significantly higher than that of ULD-FBP (P < .001). The performance of CAD software in ULD-CT was improved by using IR algorithms. In particular, the performance of CAD in ULD-FIRST was almost equivalent to that in LD-FBP. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.

  7. An evidence-based treatment algorithm for colorectal polyp cancers: results from the Scottish Screen-detected Polyp Cancer Study (SSPoCS).

    PubMed

    Richards, C H; Ventham, N T; Mansouri, D; Wilson, M; Ramsay, G; Mackay, C D; Parnaby, C N; Smith, D; On, J; Speake, D; McFarlane, G; Neo, Y N; Aitken, E; Forrest, C; Knight, K; McKay, A; Nair, H; Mulholland, C; Robertson, J H; Carey, F A; Steele, Rjc

    2018-02-01

    Colorectal polyp cancers present clinicians with a treatment dilemma. Decisions regarding whether to offer segmental resection or endoscopic surveillance are often taken without reference to good quality evidence. The aim of this study was to develop a treatment algorithm for patients with screen-detected polyp cancers. This national cohort study included all patients with a polyp cancer identified through the Scottish Bowel Screening Programme between 2000 and 2012. Multivariate regression analysis was used to assess the impact of clinical, endoscopic and pathological variables on the rate of adverse events (residual tumour in patients undergoing segmental resection or cancer-related death or disease recurrence in any patient). These data were used to develop a clinically relevant treatment algorithm. 485 patients with polyp cancers were included. 186/485 (38%) underwent segmental resection and residual tumour was identified in 41/186 (22%). The only factor associated with an increased risk of residual tumour in the bowel wall was incomplete excision of the original polyp (OR 5.61, p=0.001), while only lymphovascular invasion was associated with an increased risk of lymph node metastases (OR 5.95, p=0.002). When patients undergoing segmental resection or endoscopic surveillance were considered together, the risk of adverse events was significantly higher in patients with incomplete excision (OR 10.23, p<0.001) or lymphovascular invasion (OR 2.65, p=0.023). A policy of surveillance is adequate for the majority of patients with screen-detected colorectal polyp cancers. Consideration of segmental resection should be reserved for those with incomplete excision or evidence of lymphovascular invasion. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  8. A suite of MATLAB-based computational tools for automated analysis of COPAS Biosort data

    PubMed Central

    Morton, Elizabeth; Lamitina, Todd

    2010-01-01

    Complex Object Parametric Analyzer and Sorter (COPAS) devices are large-object, fluorescence-capable flow cytometers used for high-throughput analysis of live model organisms, including Drosophila melanogaster, Caenorhabditis elegans, and zebrafish. The COPAS is especially useful in C. elegans high-throughput genome-wide RNA interference (RNAi) screens that utilize fluorescent reporters. However, analysis of data from such screens is relatively labor-intensive and time-consuming. Currently, there are no computational tools available to facilitate high-throughput analysis of COPAS data. We used MATLAB to develop algorithms (COPAquant, COPAmulti, and COPAcompare) to analyze different types of COPAS data. COPAquant reads single-sample files, filters and extracts values and value ratios for each file, and then returns a summary of the data. COPAmulti reads 96-well autosampling files generated with the ReFLX adapter, performs sample filtering, graphs features across both wells and plates, performs some common statistical measures for hit identification, and outputs results in graphical formats. COPAcompare performs a correlation analysis between replicate 96-well plates. For many parameters, thresholds may be defined through a simple graphical user interface (GUI), allowing our algorithms to meet a variety of screening applications. In a screen for regulators of stress-inducible GFP expression, COPAquant dramatically accelerated data analysis and allowed us to rapidly move from raw data to hit identification. Because the COPAS file structure is standardized and our MATLAB code is freely available, our algorithms should be extremely useful for analysis of COPAS data from multiple platforms and organisms. The MATLAB code is freely available at our web site (www.med.upenn.edu/lamitinalab/downloads.shtml). PMID:20569218

  9. Automated high-performance cIMT measurement techniques using patented AtheroEdge™: a screening and home monitoring system.

    PubMed

    Molinari, Filippo; Meiburger, Kristen M; Suri, Jasjit

    2011-01-01

    The evaluation of the carotid artery wall is fundamental for the assessment of cardiovascular risk. This paper presents the general architecture of an automatic strategy that segments the lumen-intima and media-adventitia borders, classified under a class of patented AtheroEdge™ systems (Global Biomedical Technologies, Inc., CA, USA). Guidelines for producing accurate and repeatable measurements of the intima-media thickness are provided, and the problem of choosing among the different distance metrics one can adopt is addressed. We compared the results of a completely automatic algorithm that we developed with those of a semi-automatic algorithm, and show the final segmentation results for both techniques. The overall rationale is to provide user-independent, high-performance techniques suitable for screening and remote monitoring.

  10. Screen Space Ambient Occlusion Based Multiple Importance Sampling for Real-Time Rendering

    NASA Astrophysics Data System (ADS)

    Zerari, Abd El Mouméne; Babahenini, Mohamed Chaouki

    2018-03-01

    We propose a new approximation technique for accelerating the Global Illumination algorithm for real-time rendering. The proposed approach is based on the Screen-Space Ambient Occlusion (SSAO) method, which approximates the global illumination for large, fully dynamic scenes at interactive frame rates. Current algorithms that are based on the SSAO method suffer from difficulties due to the large number of samples that are required. In this paper, we propose an improvement to the SSAO technique by integrating it with a Multiple Importance Sampling technique that combines a stratified sampling method with an importance sampling method, with the objective of reducing the number of samples. Experimental evaluation demonstrates that our technique can produce high-quality images in real time and is significantly faster than traditional techniques.
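    The multiple importance sampling step mentioned above is conventionally implemented with the balance heuristic: a sample drawn from strategy i is weighted by that strategy's density relative to the combined density of all strategies. The sketch below shows the standard balance heuristic with two illustrative 1-D densities (not the paper's samplers):

    ```python
    # Balance heuristic weight for multiple importance sampling (MIS).
    def balance_weight(i, x, pdfs, counts):
        """pdfs: callables p_j(x); counts: samples drawn per strategy."""
        num = counts[i] * pdfs[i](x)
        den = sum(n * p(x) for n, p in zip(counts, pdfs))
        return num / den

    uniform = lambda x: 1.0                 # stratified/uniform pdf on [0, 1]
    linear  = lambda x: 2.0 * (1.0 - x)     # importance pdf favoring x near 0
    # Weight of a sample at x = 0.3 drawn from the uniform strategy:
    print(balance_weight(0, 0.3, [uniform, linear], [8, 8]))
    ```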

  11. Quantum Monte Carlo Simulation of Frustrated Kondo Lattice Models

    NASA Astrophysics Data System (ADS)

    Sato, Toshihiro; Assaad, Fakher F.; Grover, Tarun

    2018-03-01

    The absence of the negative sign problem in quantum Monte Carlo simulations of spin and fermion systems has different origins. World-line based algorithms for spins require positivity of matrix elements whereas auxiliary field approaches for fermions depend on symmetries such as particle-hole symmetry. For negative-sign-free spin and fermionic systems, we show that one can formulate a negative-sign-free auxiliary field quantum Monte Carlo algorithm that allows Kondo coupling of fermions with the spins. Using this general approach, we study a half-filled Kondo lattice model on the honeycomb lattice with geometric frustration. In addition to the conventional Kondo insulator and antiferromagnetically ordered phases, we find a partial Kondo screened state where spins are selectively screened so as to alleviate frustration, and the lattice rotation symmetry is broken nematically.

  12. Automated recognition of microcalcification clusters in mammograms

    NASA Astrophysics Data System (ADS)

    Bankman, Isaac N.; Christens-Barry, William A.; Kim, Dong W.; Weinberg, Irving N.; Gatewood, Olga B.; Brody, William R.

    1993-07-01

    The widespread and increasing use of mammographic screening for early breast cancer detection is placing a significant strain on clinical radiologists. Large numbers of radiographic films have to be visually interpreted in fine detail to determine the subtle hallmarks of cancer that may be present. We developed an algorithm for detecting microcalcification clusters, the most common and useful signs of early, potentially curable breast cancer. We describe this algorithm, which utilizes contour map representations of digitized mammographic films, and discuss its benefits in overcoming difficulties often encountered in algorithmic approaches to radiographic image processing. We present experimental analyses of mammographic films employing this contour-based algorithm and discuss practical issues relevant to its use in an automated film interpretation instrument.

  13. Using a web-based nutrition algorithm in hemodialysis patients.

    PubMed

    Steiber, Alison L; León, Janeen B; Hand, Rosa K; Murphy, William J; Fouque, Denis; Parrott, J Scott; Kalantar-Zadeh, Kamyar; Cuppari, Lilian

    2015-01-01

    The purpose of this study was to test a newly developed nutrition algorithm for (1) clinical utility and (2) ability to capture patient outcomes. This was a prospective observational study using a practice-based research network structure, involving renal dietitians and hemodialysis (HD) patients, and took place in outpatient HD units in five different countries. One hundred chronic HD patients were included. To select subjects, dietitians screened and consented patients in their facilities until 4 patients "at nutrition risk" based on the algorithm screening tool were identified. Inclusion criteria were age older than 19 years, not on hospice or equivalent, able to read the informed consent and ask questions, and receiving HD. The main outcome measures were the ability of the algorithm screening tool to identify patients at nutrition risk, to guide clinicians through logical renal-modified nutrition care process chains including follow-up on relevant parameters, and to capture change in outcomes over 3 months. Statistics were performed using SPSS version 20.0, with significance set at P < .05. One hundred patients on HD, enrolled by 29 dietitians, were included in this analysis. The average number of out-of-range screening parameters per patient was 3.7 (standard deviation 1.5, range 1-7), and the most prevalent risk factors were elevated parathyroid hormone (PTH; 62.8%) and low serum cholesterol (56.5%). At the initial screening step, 8 of the 14 factors led to chains with nonrandom selection patterns (by χ(2) test with P < .05). In the subsequent diagnosis step, patients diagnosed within the insufficient-protein group (n = 38) increased protein intake by 0.11 g/kg/day (P = .022). In patients with a diagnosis in the high-PTH group, PTH decreased by a mean of 176.85 pg/mL (n = 19, P = .011), and in those with a diagnosis in the high-phosphorus group, serum phosphorus decreased by a mean of 0.91 mg/dL (n = 33, P = .006). Finally, the relative likelihood of each assessment being completed after the related diagnosis was made at the previous visit, compared with patients for whom that diagnosis was not made, was assessed; for example, protein intake was more likely to be assessed after a diagnosis in the insufficient-protein group (odds ratio = 4.08, P < .05). This study demonstrates the clinical utility of a web-based HD-specific nutrition algorithm, including the ability to track changes in outcomes over time. Future research can use this tool to investigate the comparative impact of nutrition interventions. Copyright © 2015 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.

  14. Capacity is the Wrong Paradigm

    DTIC Science & Technology

    2002-01-01

    (In short, steganography values detection over robustness, whereas watermarking values robustness over detection.) Hiding techniques for JPEG images … world length of the code. D: If the algorithm is known, this method is trivially detectable if we are sending images (with no encryption). If we are … implications of the work of Chaitin and Kolmogorov on algorithmic complexity [5]. We have also concentrated on screen images in this paper and have not …

  15. Halftoning and Image Processing Algorithms

    DTIC Science & Technology

    1999-02-01

    … screening techniques with the quality advantages of error diffusion in the halftoning of color maps, and on color image enhancement for halftone … image quality. Our goals in this research were to advance the understanding in image science for our new halftone algorithm and to contribute to … image retrieval and noise theory for such imagery. In the field of color halftone printing, research was conducted on deriving a theoretical model of our …

  16. [Quality of documentation of intraoperative and postoperative complications : improvement of documentation for a nationwide quality assurance program and comparison with routine data].

    PubMed

    Jakob, J; Marenda, D; Sold, M; Schlüter, M; Post, S; Kienle, P

    2014-08-01

    Complications after cholecystectomy are continuously documented in a nationwide database in Germany. Recent studies demonstrated a lack of reliability of these data. The aim of the study was to evaluate the impact of a control algorithm on documentation quality and the use of routine diagnosis coding as an additional validation instrument. Completeness and correctness of the documentation of complications after cholecystectomy was compared over a time interval of 12 months before and after implementation of an algorithm for faster and more accurate documentation. Furthermore, the coding of all diagnoses was screened to identify intraoperative and postoperative complications. The sensitivity of the documentation for complications improved from 46 % to 70 % (p = 0.05, specificity 98 % in both time intervals). A prolonged time interval of more than 6 weeks between patient discharge and documentation was associated with inferior data quality (incorrect documentation in 1.5 % versus 15 %, p < 0.05). The rate of case documentation within the 6 weeks after hospital discharge was clearly improved after implementation of the control algorithm. Sensitivity and specificity of screening for complications by evaluating routine diagnoses coding were 70 % and 85 %, respectively. The quality of documentation was improved by implementation of a simple memory algorithm.

  17. gWEGA: GPU-accelerated WEGA for molecular superposition and shape comparison.

    PubMed

    Yan, Xin; Li, Jiabo; Gu, Qiong; Xu, Jun

    2014-06-05

    Virtual screening of a large chemical library for drug lead identification requires searching/superimposing a large number of three-dimensional (3D) chemical structures. This article reports a graphic processing unit (GPU)-accelerated weighted Gaussian algorithm (gWEGA) that expedites shape or shape-feature similarity score-based virtual screening. With 86 GPU nodes (each node has one GPU card), gWEGA can screen 110 million conformations derived from an entire ZINC drug-like database with diverse antidiabetic agents as query structures within 2 s (i.e., screening more than 55 million conformations per second). The rapid screening speed was accomplished through the massive parallelization on multiple GPU nodes and rapid prescreening of 3D structures (based on their shape descriptors and pharmacophore feature compositions). Copyright © 2014 Wiley Periodicals, Inc.

  18. Inference on cancer screening exam accuracy using population-level administrative data.

    PubMed

    Jiang, H; Brown, P E; Walter, S D

    2016-01-15

    This paper develops a model for cancer screening and cancer incidence data, accommodating the partially unobserved disease status, clustered data structures, general covariate effects, and dependence between exams. The true unobserved cancer and detection status of screening participants are treated as latent variables, and a Markov Chain Monte Carlo algorithm is used to estimate the Bayesian posterior distributions of the diagnostic error rates and disease prevalence. We show how the Bayesian approach can be used to draw inferences about screening exam properties and disease prevalence while allowing for the possibility of conditional dependence between two exams. The techniques are applied to the estimation of the diagnostic accuracy of mammography and clinical breast examination using data from the Ontario Breast Screening Program in Canada. Copyright © 2015 John Wiley & Sons, Ltd.

  19. Applying Image Matching to Video Analysis

    DTIC Science & Technology

    2010-09-01

    The image groups, classified by the background scene, are the flag, the kitchen, the telephone, the bookshelf, the title screen, and the maps. Image counts by scene: Kitchen 136, Telephone 3, Bookshelf 81, Title Screen 10, Map 1 24, Map 2 16. … command line. This implementation of a Bloom filter uses two arbitrary … with the Bookshelf images. This scene is a much closer shot than the Kitchen scene, so the host occupies much of the background. Algorithms for face …

  20. Using Weighted Entropy to Rank Chemicals in Quantitative High Throughput Screening Experiments

    PubMed Central

    Shockley, Keith R.

    2014-01-01

    Quantitative high throughput screening (qHTS) experiments can simultaneously produce concentration-response profiles for thousands of chemicals. In a typical qHTS study, a large chemical library is subjected to a primary screen in order to identify candidate hits for secondary screening, validation studies or prediction modeling. Different algorithms, usually based on the Hill equation logistic model, have been used to classify compounds as active or inactive (or inconclusive). However, observed concentration-response activity relationships may not adequately fit a sigmoidal curve. Furthermore, it is unclear how to prioritize chemicals for follow-up studies given the large uncertainties that often accompany parameter estimates from nonlinear models. Weighted Shannon entropy can address these concerns by ranking compounds according to profile-specific statistics derived from estimates of the probability mass distribution of response at the tested concentration levels. This strategy can be used to rank all tested chemicals in the absence of a pre-specified model structure or the approach can complement existing activity call algorithms by ranking the returned candidate hits. The weighted entropy approach was evaluated here using data simulated from the Hill equation model. The procedure was then applied to a chemical genomics profiling data set interrogating compounds for androgen receptor agonist activity. PMID:24056003
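    The core of the ranking idea is to normalize each profile's responses into a probability mass over the tested concentrations and score it with a (weighted) Shannon entropy: flat, inactive-looking profiles score high, while structured concentration-dependent profiles score low. A simplified sketch (uniform weights by default; the study's specific weighting scheme is not reproduced here):

    ```python
    # Weighted Shannon entropy of a concentration-response profile (toy data).
    import numpy as np

    def weighted_entropy(responses, weights=None):
        r = np.abs(np.asarray(responses, float))
        p = r / r.sum()                              # probability mass over doses
        w = np.ones_like(p) if weights is None else np.asarray(weights, float)
        p = np.clip(p, 1e-12, None)                  # avoid log(0)
        return -(w * p * np.log2(p)).sum()

    flat   = [1, 1, 1, 1, 1, 1]                      # inactive-looking profile
    active = [0, 0, 1, 5, 20, 40]                    # concentration-dependent hit
    print(weighted_entropy(flat))    # ~2.58 bits (uninformative)
    print(weighted_entropy(active))  # lower entropy: candidate for follow-up
    ```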

  1. Effects of reduced nitrogen inputs on crop yield and nitrogen use efficiency in a long-term maize-soybean relay strip intercropping system.

    PubMed

    Chen, Ping; Du, Qing; Liu, Xiaoming; Zhou, Li; Hussain, Sajad; Lei, Lu; Song, Chun; Wang, Xiaochun; Liu, Weiguo; Yang, Feng; Shu, Kai; Liu, Jiang; Du, Junbo; Yang, Wenyu; Yong, Taiwen

    2017-01-01

    The blind pursuit of high yields via increased fertilizer inputs increases the environmental costs. Relay intercropping has advantages for yield, but a strategy for N management is urgently required to decrease N inputs without yield loss in maize-soybean relay intercropping systems (IMS). Experiments were conducted with three levels of N and three planting patterns, and dry matter accumulation, nitrogen uptake, nitrogen use efficiency (NUE), competition ratio (CR), system productivity index (SPI), land equivalent ratio (LER), and crop root distribution were investigated. Our results showed that the CR of soybean was greater than 1, and that the change in root distribution in space and time resulted in an interspecific facilitation in IMS. The maximum yield of maize under monoculture maize (MM) occurred with conventional nitrogen (CN), whereas under IMS, the maximum yield occurred with reduced nitrogen (RN). The yield of monoculture soybean (MS) and of soybean in IMS both reached a maximum under RN. The LER of IMS varied from 1.85 to 2.36, and the SPI peaked under RN. Additionally, the NUE of IMS increased by 103.7% under RN compared with that under CN. In conclusion, the separation of the root ecological niche contributed to a positive interspecific facilitation, which increased the land productivity. Thus, maize-soybean relay intercropping with reduced N input provides a very useful approach to increase land productivity and avert environmental pollution.
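    For readers outside agronomy, the land equivalent ratio reported above follows the standard definition: the sum, over the two crops, of intercrop yield relative to the corresponding monoculture yield; LER > 1 indicates a land-use advantage for intercropping. A minimal sketch with illustrative yields (not the study's data):

    ```python
    # Standard land equivalent ratio (LER) for a two-crop intercrop system.
    def ler(y_maize_ic, y_maize_mono, y_soy_ic, y_soy_mono):
        return y_maize_ic / y_maize_mono + y_soy_ic / y_soy_mono

    # Illustrative yields (t/ha): LER = 0.94 + 0.90 = 1.84, meaning the
    # monocultures would need ~84% more land to match the intercrop output.
    print(ler(y_maize_ic=7.5, y_maize_mono=8.0, y_soy_ic=1.8, y_soy_mono=2.0))
    ```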

  2. Nuclear power plant Generic Aging Lessons Learned (GALL). Main report and appendix A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaza, K.E.; Diercks, D.R.; Holland, J.W.

    The purpose of this generic aging lessons learned (GALL) review is to provide a systematic review of plant aging information in order to assess materials and component aging issues related to continued operation and license renewal of operating reactors. Literature on mechanical, structural, and thermal-hydraulic components and systems reviewed consisted of 97 Nuclear Plant Aging Research (NPAR) reports, 23 NRC Generic Letters, 154 Information Notices, 29 Licensee Event Reports (LERs), 4 Bulletins, and 9 Nuclear Management and Resources Council Industry Reports (NUMARC IRs); literature on electrical components and systems reviewed consisted of 66 NPAR reports, 8 NRC Generic Letters, 111 Information Notices, 53 LERs, 1 Bulletin, and 1 NUMARC IR. More than 550 documents were reviewed. The results of these reviews were systematized using a standardized GALL tabular format and standardized definitions of aging-related degradation mechanisms and effects. The tables are included in volumes 1 and 2 of this report. A computerized data base has also been developed for all review tables and can be used to expedite the search for desired information on structures, components, and relevant aging effects. A survey of the GALL tables reveals that all ongoing significant component aging issues are currently being addressed by the regulatory process. However, the aging of what are termed passive components has been highlighted for continued scrutiny. This document is Volume 1, consisting of the executive summary, summary and observations, and an appendix listing the GALL literature review tables.

  3. Differentially expressed genes during the imbibition of dormant and after-ripened seeds - a reverse genetics approach.

    PubMed

    Yazdanpanah, Farzaneh; Hanson, Johannes; Hilhorst, Henk W M; Bentsink, Leónie

    2017-09-11

    Seed dormancy, defined as the incapability of a viable seed to germinate under favourable conditions, is an important trait in nature and agriculture. Despite extensive research on dormancy and germination, many questions about the molecular mechanisms controlling these traits remain unanswered, likely because of their genetic complexity and the large environmental effects characteristic of quantitative traits. Progress in revealing the mechanisms that control seed dormancy and germination depends on identifying the genes controlling those traits. We used transcriptome analysis combined with a reverse genetics approach to identify genes that are prominent in dormancy maintenance and germination in imbibed seeds of Arabidopsis thaliana. Comparative transcriptomic analysis was performed on freshly harvested (dormant) and after-ripened (AR; non-dormant) 24-h imbibed seeds of four different DELAY OF GERMINATION near-isogenic lines (DOGNILs) and the Landsberg erecta (Ler) wild type, which vary in their levels of primary dormancy. T-DNA knockout lines of the identified genes were phenotypically investigated for their effects on dormancy and after-ripening. We identified conserved sets of 46 and 25 genes which displayed higher expression in seeds of all dormant and all after-ripened DOGNILs and Ler, respectively. Knockout mutants in these genes showed dormancy- and germination-related phenotypes. Most of the identified genes had not previously been implicated in seed dormancy or germination. This research will be useful for further deciphering the molecular mechanisms by which these ecologically and commercially important traits are regulated.

  4. Detection of pseudosinusoidal epileptic seizure segments in the neonatal EEG by cascading a rule-based algorithm with a neural network.

    PubMed

    Karayiannis, Nicolaos B; Mukherjee, Amit; Glover, John R; Ktonas, Periklis Y; Frost, James D; Hrachovy, Richard A; Mizrahi, Eli M

    2006-04-01

    This paper presents an approach to detect epileptic seizure segments in the neonatal electroencephalogram (EEG) by characterizing the spectral features of the EEG waveform using a rule-based algorithm cascaded with a neural network. A rule-based algorithm screens out short segments of pseudosinusoidal EEG patterns as epileptic based on features in the power spectrum. The output of the rule-based algorithm is used to train and compare the performance of conventional feedforward neural networks and quantum neural networks. The results indicate that the trained neural networks, cascaded with the rule-based algorithm, improved the performance of the rule-based algorithm acting by itself. The evaluation of the proposed cascaded scheme for the detection of pseudosinusoidal seizure segments reveals its potential as a building block of the automated seizure detection system under development.
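    The cascade structure is the key design point: a cheap rule-based screen proposes candidate segments, and a trained network makes the final call only on those candidates. The sketch below illustrates that structure with a scikit-learn MLP standing in for the paper's feedforward and quantum networks; the two spectral features, rule thresholds, and labels are all hypothetical.

    ```python
    # Cascade: rule-based spectral screen -> neural-network classifier.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    def rule_screen(peak_freq_hz, peak_power_ratio):
        """Pass segments whose spectrum is dominated by a low-frequency peak."""
        return 0.5 <= peak_freq_hz <= 6.0 and peak_power_ratio >= 0.6

    rng = np.random.default_rng(0)
    X = rng.uniform([0.5, 0.6], [6.0, 1.0], size=(200, 2))   # screened candidates
    y = (X[:, 1] > 0.8).astype(int)                          # toy seizure labels
    clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                        random_state=0).fit(X, y)

    segment = (3.0, 0.9)              # (peak frequency, peak power ratio)
    if rule_screen(*segment):         # only rule-passed segments reach the NN
        print("seizure probability:", clf.predict_proba([segment])[0, 1])
    ```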

  5. MLViS: A Web Tool for Machine Learning-Based Virtual Screening in Early-Phase of Drug Discovery and Development

    PubMed Central

    Korkmaz, Selcuk; Zararsiz, Gokmen; Goksuluk, Dincer

    2015-01-01

    Virtual screening is an important step in the early phase of the drug discovery process. Since there are thousands of compounds, this step should be both fast and effective in order to distinguish drug-like from nondrug-like molecules. Statistical machine learning methods are widely used in drug discovery studies for classification purposes. Here, we aim to develop a new tool that can classify molecules as drug-like or nondrug-like based on various machine learning methods, including discriminant, tree-based, kernel-based, ensemble and other algorithms. To construct this tool, the performances of twenty-three different machine learning algorithms were first compared by ten different measures; then, the ten best performing algorithms were selected based on principal component and hierarchical cluster analysis results. Besides classification, this application also has the ability to create heat maps and dendrograms for visual inspection of the molecules through hierarchical cluster analysis. Moreover, users can connect to the PubChem database to download molecular information and to create two-dimensional structures of compounds. This application is freely available through www.biosoft.hacettepe.edu.tr/MLViS/. PMID:25928885

  6. A simple biota removal algorithm for 35 GHz cloud radar measurements

    NASA Astrophysics Data System (ADS)

    Kalapureddy, Madhu Chandra R.; Sukanya, Patra; Das, Subrata K.; Deshpande, Sachin M.; Pandithurai, Govindan; Pazamany, Andrew L.; Ambuj K., Jha; Chakravarty, Kaustav; Kalekar, Prasad; Krishna Devisetty, Hari; Annam, Sreenivas

    2018-03-01

    Cloud radar reflectivity profiles are an important measurement for the investigation of cloud vertical structure (CVS). However, extracting the intended meteorological cloud content from the measurement often demands an effective technique or algorithm that can reduce error and observational uncertainties in the recorded data. In this work, a technique is proposed to identify and separate cloud and non-hydrometeor echoes using profiles of the radar Doppler spectral moments. Point- and volume-target-based theoretical radar sensitivity curves are used to remove the receiver noise floor, and the identified radar echoes are scrutinized according to the signal decorrelation period. The hypothesis is that cloud echoes are temporally more coherent and homogeneous, with a longer correlation period than biota; this can be checked statistically using the ~4 s sliding mean and standard deviation of the reflectivity profiles. This step critically screens out clouds by filtering out the biota. The final step is the retrieval of cloud height. The proposed algorithm identifies cloud height solely through systematic characterization of Z variability, using knowledge of the local atmospheric vertical structure in addition to theoretical, statistical and echo-tracing tools. Thus, high-resolution cloud radar reflectivity profile measurements are characterized with theoretical echo sensitivity curves and observed echo statistics for true cloud height tracking (TEST). TEST showed superior performance in screening out clouds and filtering out isolated insects. TEST constrained with polarimetric measurements was found more promising under high-density biota, whereas TEST combined with the linear depolarization ratio and spectral width filtered biota effectively within the highly turbulent shallow cumulus clouds of the convective boundary layer (CBL). The TEST technique is simple to implement yet powerful in performance, owing to its flexibility in constraining, identifying and filtering out the biota and screening out the true cloud content, especially CBL clouds. The TEST algorithm is therefore well suited to screening the low-level clouds that are strongly linked to the rain-making mechanism of the Indian summer monsoon region's CVS.
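    The coherence test at the heart of the method can be illustrated with a sliding standard deviation over a short window: steady echoes pass as cloud, erratic ones are flagged as biota. In the sketch below the window length, threshold, and signals are all illustrative, not the paper's tuned values:

    ```python
    # Sliding-window coherence test at one range gate (toy reflectivity data).
    import numpy as np

    def coherent_mask(z, window=4, std_max=3.0):
        """z: 1-D reflectivity time series (dBZ); True where cloud-like."""
        pad = window // 2
        zp = np.pad(z, pad, mode="edge")
        std = np.array([zp[i:i + window].std() for i in range(len(z))])
        return std <= std_max        # steady echo -> cloud; erratic -> biota

    t = np.arange(40)
    cloud = 10 + 0.5 * np.sin(t / 5.0)                        # slowly varying
    biota = 10 + 8.0 * np.random.default_rng(1).standard_normal(40)
    print(coherent_mask(cloud).mean(), coherent_mask(biota).mean())
    ```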

  7. Automated System for Early Breast Cancer Detection in Mammograms

    NASA Technical Reports Server (NTRS)

    Bankman, Isaac N.; Kim, Dong W.; Christens-Barry, William A.; Weinberg, Irving N.; Gatewood, Olga B.; Brody, William R.

    1993-01-01

    The increasing demand on mammographic screening for early breast cancer detection, and the subtlety of early breast cancer signs on mammograms, suggest an automated image processing system that can serve as a diagnostic aid in radiology clinics. We present a fully automated algorithm for detecting clusters of microcalcifications that are the most common signs of early, potentially curable breast cancer. By using the contour map of the mammogram, the algorithm circumvents some of the difficulties encountered with standard image processing methods. The clinical implementation of an automated instrument based on this algorithm is also discussed.

  8. Characterization of a conical null-screen corneal topographer

    NASA Astrophysics Data System (ADS)

    Osorio-Infante, Arturo I.; Campos-García, Manuel; Cossio-Guerrero, Cesar

    2017-06-01

    In this work, we characterize a conical null-screen corneal topographer. For this, we design a custom null-screen for testing a reference spherical surface with a radius of curvature of 7.8 mm. We also test a 1/2-inch (12.7 mm) diameter stainless steel sphere and an aspherical surface with a radius of curvature of 7.77 mm. We designed several target distributions with the same target size to evaluate the shape of the reference surfaces. The shape of each surface was recovered by fitting the experimental data to a custom shape using the least-squares method with an iterative algorithm. The target distributions were modified to improve the accuracy of the measurements. We selected a distribution and evaluated the accuracy of the algorithms for measuring spherical surfaces with radii of curvature from 6 mm to 8.2 mm by simulating the reflected pattern. We also simulated the reflected pattern while changing the position of the surface along the optical axis and then measured the resulting radius of curvature.

  9. Application of matrix-assisted laser desorption ionization time-of-flight mass spectrometry in the screening of vanA-positive Enterococcus faecium.

    PubMed

    Wang, Li-jun; Lu, Xin-xin; Wu, Wei; Sui, Wen-jun; Zhang, Gui

    2014-01-01

    In order to evaluate a rapid matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) assay for screening vancomycin-resistant Enterococcus faecium, a total of 150 E. faecium clinical strains were studied, including 60 vancomycin-resistant E. faecium (VREF) isolates and 90 vancomycin-susceptible (VSEF) strains. Vancomycin resistance genes were detected by sequencing. E. faecium were identified by MALDI-TOF MS. A genetic algorithm model was generated with ClinProTools software using spectra of 30 VREF isolates and 30 VSEF isolates. Using this model, 90 test isolates were discriminated between VREF and VSEF. The results showed that all sixty VREF isolates carried the vanA gene. The performance of VREF detection by the genetic algorithm model of MALDI-TOF MS compared to the sequencing method was: sensitivity = 80%, specificity = 90%, false positive rate = 10%, false negative rate = 10%, positive predictive value = 80%, negative predictive value = 90%. MALDI-TOF MS can be used as a screening test for discrimination between vanA-positive E. faecium and vanA-negative E. faecium.

  10. Tools for building a comprehensive modeling system for virtual screening under real biological conditions: The Computational Titration algorithm.

    PubMed

    Kellogg, Glen E; Fornabaio, Micaela; Chen, Deliang L; Abraham, Donald J; Spyrakis, Francesca; Cozzini, Pietro; Mozzarelli, Andrea

    2006-05-01

    Computational tools utilizing a unique empirical modeling system based on the hydrophobic effect and the measurement of logP(o/w) (the partition coefficient for solvent transfer between 1-octanol and water) are described. The associated force field, Hydropathic INTeractions (HINT), contains much rich information about non-covalent interactions in the biological environment because of its basis in an experiment that measures interactions in solution. HINT is shown to be the core of an evolving virtual screening system that is capable of taking into account a number of factors often ignored such as entropy, effects of solvent molecules at the active site, and the ionization states of acidic and basic residues and ligand functional groups. The outline of a comprehensive modeling system for virtual screening that incorporates these features is described. In addition, a detailed description of the Computational Titration algorithm is provided. As an example, three complexes of dihydrofolate reductase (DHFR) are analyzed with our system and these results are compared with the experimental free energies of binding.

  11. Use of electronic data and existing screening tools to identify clinically significant obstructive sleep apnea.

    PubMed

    Severson, Carl A; Pendharkar, Sachin R; Ronksley, Paul E; Tsai, Willis H

    2015-01-01

    To assess the ability of electronic health data and existing screening tools to identify clinically significant obstructive sleep apnea (OSA), as defined by symptomatic or severe OSA. The present retrospective cohort study of 1041 patients referred for sleep diagnostic testing was undertaken at a tertiary sleep centre in Calgary, Alberta. A diagnosis of clinically significant OSA or an alternative sleep diagnosis was assigned to each patient through blinded independent chart review by two sleep physicians. Predictive variables were identified from online questionnaire data, and diagnostic algorithms were developed. The performance of electronically derived algorithms for identifying patients with clinically significant OSA was determined. Diagnostic performance of these algorithms was compared with versions of the STOP-Bang questionnaire and adjusted neck circumference score (ANC) derived from electronic data. Electronic questionnaire data were highly sensitive (>95%) at identifying clinically significant OSA, but not specific. Sleep diagnostic testing-determined respiratory disturbance index was very specific (specificity ≥95%) for clinically relevant disease, but not sensitive (<35%). Derived algorithms had similar accuracy to the STOP-Bang or ANC, but required fewer questions and calculations. These data suggest that a two-step process using a small number of clinical variables (maximizing sensitivity) and objective diagnostic testing (maximizing specificity) is required to identify clinically significant OSA. When used in an online setting, simple algorithms can identify clinically relevant OSA with similar performance to existing decision rules such as the STOP-Bang or ANC.

  12. Inferring Regulatory Networks by Combining Perturbation Screens and Steady State Gene Expression Profiles

    PubMed Central

    Michailidis, George

    2014-01-01

    Reconstructing transcriptional regulatory networks is an important task in functional genomics. Data obtained from experiments that perturb genes by knockouts or RNA interference contain useful information for addressing this reconstruction problem. However, such data can be limited in size and/or are expensive to acquire. On the other hand, observational data of the organism in steady state (e.g., wild-type) are more readily available, but their informational content is inadequate for the task at hand. We develop a computational approach to appropriately utilize both data sources for estimating a regulatory network. The proposed approach is based on a three-step algorithm to estimate the underlying directed but cyclic network, that uses as input both perturbation screens and steady state gene expression data. In the first step, the algorithm determines causal orderings of the genes that are consistent with the perturbation data, by combining an exhaustive search method with a fast heuristic that in turn couples a Monte Carlo technique with a fast search algorithm. In the second step, for each obtained causal ordering, a regulatory network is estimated using a penalized likelihood based method, while in the third step a consensus network is constructed from the highest scored ones. Extensive computational experiments show that the algorithm performs well in reconstructing the underlying network and clearly outperforms competing approaches that rely only on a single data source. Further, it is established that the algorithm produces a consistent estimate of the regulatory network. PMID:24586224

  13. Developing Family Healthware, a family history screening tool to prevent common chronic diseases.

    PubMed

    Yoon, Paula W; Scheuner, Maren T; Jorgensen, Cynthia; Khoury, Muin J

    2009-01-01

    Family health history reflects the effects of genetic, environmental, and behavioral factors and is an important risk factor for a variety of disorders including coronary heart disease, cancer, and diabetes. In 2004, the Centers for Disease Control and Prevention developed Family Healthware, a new interactive, Web-based tool that assesses familial risk for 6 diseases (coronary heart disease, stroke, diabetes, and colorectal, breast, and ovarian cancer) and provides a "prevention plan" with personalized recommendations for lifestyle changes and screening. The tool collects data on health behaviors, screening tests, and disease history of a person's first- and second-degree relatives. Algorithms in the software analyze the family history data and assess familial risk based on the number of relatives affected, their age at disease onset, their sex, how closely related the relatives are to each other and to the user, and the combinations of diseases in the family. A second set of algorithms uses the data on familial risk level, health behaviors, and screening to generate personalized prevention messages. Qualitative and quantitative formative research on lay understanding of family history and genetics helped shape the tool's content, labels, and messages. Lab-based usability testing helped refine messages and tool navigation. The tool is being evaluated by 3 academic centers by using a network of primary care practices to determine whether personalized prevention messages tailored to familial risk will motivate people at risk to change their lifestyles or screening behaviors.

  14. Performance of Implementing Guideline Driven Cervical Cancer Screening Measures in an Inner City Hospital System

    PubMed Central

    Wieland, Daryl L.; Reimers, Laura L.; Wu, Eijean; Nathan, Lisa M.; Gruenberg, Tammy; Abadi, Maria; Einstein, Mark H.

    2013-01-01

    Objective In 2006, the American Society for Colposcopy and Cervical Pathology (ASCCP) updated evidence based guidelines recommending screening intervals for women with abnormal cervical cytology. In our low-income inner city population, we sought to improve performance by uniformly applying the guidelines to all patients. We report the prospective performance of a comprehensive tracking, evidence-based algorithmically driven call-back and appointment scheduling system for cervical cancer screening in a resource-limited inner city population. Materials and Methods Outreach efforts were formalized with algorithm-based protocols for triage to colposcopy, with universal adherence to evidence-based guidelines. During implementation from August 2006 through July 2008, we prospectively tracked performance using the electronic medical record with administrative and pathology reports to determine performance variables such as the total number of Pap tests, colposcopy visits, and the distribution of abnormal cytology and histology results, including all CIN 2,3 diagnoses. Results 86,257 gynecologic visits and 41,527 Pap tests were performed system-wide during this period of widespread and uniform implementation of standard cervical cancer screening guidelines. The number of Pap tests performed per month varied little. The incidence of CIN 1 significantly decreased from 117/171 (68.4%) the first tracked month to 52/95 (54.7%) the last tracked month (p=0.04). The monthly incidence rate of CIN 2,3, including incident cervical cancers did not change. The total number of colposcopy visits declined, resulting in a 50% decrease in costs related to colposcopy services and approximately a 12% decrease in costs related to excisional biopsies. Conclusions Adherence to cervical cancer screening guidelines reduced the number of unnecessary colposcopies without increasing numbers of potentially missed CIN 2,3 lesions, including cervical cancer. Uniform implementation of administrative-based performance initiatives for cervical cancer screening minimizes differences in provider practices and maximizes performance of screening while containing cervical cancer screening costs. PMID:21959573

  15. Evaluation of a workplace hemochromatosis screening program.

    PubMed

    Stave, G M; Mignogna, J J; Powell, G S; Hunt, C M

    1999-05-01

    Hemochromatosis is a common inherited disorder of iron metabolism with significant health consequences for the employed population. Although screening for hemochromatosis has been recommended, workplace screening programs remain uncommon. In the first year of a newly initiated corporate screening program, 1968 employees were tested. The screening algorithm included measurement of serum iron and transferrin and subsequent ferritin levels in those employees with elevated iron/transferrin ratios. Thirteen percent of men and 21% of women had elevated iron/transferrin ratios. Of these, 14 men and 2 women had elevated ferritin levels. Of these 16, three had liver biopsies and all three have hemochromatosis. The cost of the screening program was $27,850. The cost per diagnosis was $9283 and the cost per year of life saved was $928. These costs compare very favorably with other common workplace screening programs. Several barriers to obtaining definitive diagnoses on all patients with a positive screening result were identified; strategies to overcome these barriers would further enhance the cost effectiveness of the program. We conclude that workplace hemochromatosis screening is highly cost effective and should be incorporated into health promotion/disease prevention programs.

  16. Optimization of internet content filtering-Combined with KNN and OCAT algorithms

    NASA Astrophysics Data System (ADS)

    Guo, Tianze; Wu, Lingjing; Liu, Jiaming

    2018-04-01

    Faced with rampant illegal content on the Internet, traditional filtering approaches based on keyword recognition and manual screening are performing increasingly poorly. Based on this, this paper uses the OCAT algorithm nested with a KNN classification algorithm to construct a corpus training library that can dynamically learn and update, which improves the filter corpus for the constantly updated illegal content of the network, including text and pictures, and thus better filters and traces illegal content and its sources. Future research will focus on simplifying and updating the recognition and comparison algorithms and optimizing the learning ability of the corpus in order to improve filtering efficiency and save time and resources.
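    A minimal sketch of the KNN piece of such a scheme (the OCAT nesting is omitted): bag-of-words cosine similarity with majority voting over a small hypothetical labeled corpus.

    ```python
    # KNN text classification by cosine similarity over bag-of-words vectors.
    import numpy as np

    def bow(text, vocab):
        return np.array([text.lower().split().count(w) for w in vocab], float)

    def knn_label(query, corpus, labels, vocab, k=3):
        q = bow(query, vocab)
        sims = [q @ bow(d, vocab) /
                (np.linalg.norm(q) * np.linalg.norm(bow(d, vocab)) + 1e-12)
                for d in corpus]
        votes = [labels[i] for i in np.argsort(sims)[-k:]]   # k nearest docs
        return max(set(votes), key=votes.count)              # majority vote

    vocab = ["win", "prize", "free", "meeting", "report", "schedule"]
    corpus = ["win free prize now", "free prize win",
              "meeting schedule report", "report for the meeting"]
    labels = ["illegal", "illegal", "legit", "legit"]
    print(knn_label("claim your free prize", corpus, labels, vocab))
    ```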

  17. Morphological feature detection for cervical cancer screening

    NASA Astrophysics Data System (ADS)

    Narayanswamy, Ramkumar; Sharpe, John P.; Duke, Heather J.; Stewart, Rosemary J.; Johnson, Kristina M.

    1995-03-01

    An optoelectronic system has been designed to pre-screen Pap-smear slides and detect suspicious cells using the hit/miss transform. A computer simulation of the algorithm, tested on 184 Pap-smear images, flagged 95% of the suspicious regions as suspect while tagging just 5% of the normal regions as suspect. An optoelectronic implementation of the hit/miss transform using a 4f Vander Lugt correlator architecture is proposed and demonstrated with experimental results.
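    The morphological hit-or-miss transform matches a foreground pattern and a background pattern simultaneously. The sketch below demonstrates it with SciPy on a toy image; the structuring element (an isolated-pixel detector) is illustrative, not the paper's.

    ```python
    # Binary hit-or-miss transform detecting isolated foreground pixels.
    import numpy as np
    from scipy.ndimage import binary_hit_or_miss

    img = np.zeros((7, 7), bool)
    img[3, 3] = True                       # isolated pixel (candidate speck)
    img[0:3, 5] = True                     # a connected line, not isolated

    isolated = np.array([[0, 0, 0],
                         [0, 1, 0],
                         [0, 0, 0]], bool)
    # structure2 defaults to the complement of structure1: the 8 neighbors
    # must all be background, so only isolated foreground pixels survive.
    print(np.argwhere(binary_hit_or_miss(img, structure1=isolated)))  # [[3 3]]
    ```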

  18. Passive microwave remote sensing of rainfall with SSM/I: Algorithm development and implementation

    NASA Technical Reports Server (NTRS)

    Ferriday, James G.; Avery, Susan K.

    1994-01-01

    A physically based algorithm sensitive to emission and scattering is used to estimate rainfall using the Special Sensor Microwave/Imager (SSM/I). The algorithm is derived from radiative transfer calculations through an atmospheric cloud model specifying vertical distributions of ice and liquid hydrometeors as a function of rain rate. The algorithm is structured in two parts: SSM/I brightness temperatures are screened to detect rainfall and are then used in rain-rate calculation. The screening process distinguishes between nonraining background conditions and emission and scattering associated with hydrometeors. Thermometric temperature and polarization thresholds determined from the radiative transfer calculations are used to detect rain, whereas the rain-rate calculation is based on a linear function fit to a linear combination of channels. Separate calculations for ocean and land account for different background conditions. The rain-rate calculation is constructed to respond to both emission and scattering, reduce extraneous atmospheric and surface effects, and to correct for beam filling. The resulting SSM/I rain-rate estimates are compared to three precipitation radars as well as to a dynamically simulated rainfall event. Global estimates from the SSM/I algorithm are also compared to continental and shipboard measurements over a 4-month period. The algorithm is found to accurately describe both localized instantaneous rainfall events and global monthly patterns over both land and ocean. Over land the 4-month mean difference between SSM/I and the Global Precipitation Climatology Center continental rain gauge database is less than 10%. Over the ocean, the mean difference between SSM/I and the Legates and Willmott global shipboard rain gauge climatology is less than 20%.
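    The two-part structure (screen, then linear rain-rate fit) can be sketched schematically as below. All thresholds and coefficients are placeholders for illustration only; the algorithm's tuned values come from the radiative transfer calculations described above.

    ```python
    # Schematic two-step retrieval: rain screening, then a linear combination
    # of channel brightness temperatures (placeholder numbers throughout).
    def rain_rate(tb19v, tb22v, tb85v, ocean=True):
        # Step 1: screening for rain signatures (warm emission over ocean,
        # cold 85-GHz scattering depression over land) -- placeholder rules.
        raining = (tb19v > 185.0) if ocean else (tb85v < 255.0)
        if not raining:
            return 0.0
        # Step 2: linear fit to a linear combination of channels (placeholder).
        rr = 0.05 * tb19v + 0.02 * tb22v - 0.06 * tb85v + 3.0
        return max(rr, 0.0)   # mm/h, clipped at zero

    print(rain_rate(tb19v=210.0, tb22v=250.0, tb85v=230.0))
    ```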

  19. Report of the Defense Science Board Summer Study on Joint Service Acquisition Programs

    DTIC Science & Technology

    1984-02-01

    … ENGINEERING, Washington, D.C. 20301. OFFICE OF THE SECRETARY OF DEFENSE, Washington, D.C. 20301, 17 April 1984. DEFENSE SCIENCE BOARD MEMORANDUM FOR … mended. Action: JCS, Services, and OSD prepare a … for Secretary of Defense approval. B. NEW 5000 SERIES DIRECTIVE … requirements, program, … those candidates which possess the prerequisites for successful Joint Service acquisitions. The decision to …

  20. Geometrie verstehen: statisch - kinematisch

    NASA Astrophysics Data System (ADS)

    Kroll, Ekkehard

    Conceptually, the particular stands opposed to the general. In this sense, general considerations on the understanding of mathematics need to be supplemented by investigations into the understanding of the individual mathematical disciplines, geometry in particular. Many pupils have problems here. These stem mainly from the fact that a finished geometric construction, in its static presentation on paper, no longer reveals the individual construction steps; to retrace them, the steps must therefore additionally be recorded in a construction description.

  1. Probleme aus der Physik

    NASA Astrophysics Data System (ADS)

    Vogel, Helmut

    The popular workbook "Probleme aus der Physik" now accompanies the 17th edition of Gerthsen Vogel's "Physik" (ISBN 3-540-56638-4), offering more than 1150 solved problems from physics and its applications in technology, astrophysics, and the geo- and biosciences: a wealth of material for practice and continued learning, for exam preparation, and for self-study. A chapter on nonlinear dynamics is new to this edition. Problems of all levels of difficulty make "Probleme aus der Physik" indispensable for students majoring or minoring in physics; students in advanced physics courses will find it an excellent supplement.

  2. Structure-Based Virtual Screening of Commercially Available Compound Libraries.

    PubMed

    Kireev, Dmitri

    2016-01-01

    Virtual screening (VS) is an efficient hit-finding tool. Its distinctive strength is that it allows one to screen compound libraries that are not available in the lab. Moreover, structure-based (SB) VS also enables an understanding of how the hit compounds bind the protein target, thus laying the groundwork for rational hit-to-lead progression. SBVS requires very limited experimental effort and is particularly well suited for academic labs and small biotech companies that, unlike pharmaceutical companies, do not have physical access to quality small-molecule libraries. Here, we describe SBVS of commercial compound libraries for Mer kinase inhibitors. The screening protocol relies on the docking algorithm Glide complemented by a post-docking filter based on structural protein-ligand interaction fingerprints (SPLIF).
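
    The post-docking filtering step lends itself to a compact illustration: interaction fingerprints are compared against a reference pose and low-similarity poses are discarded. The fingerprints below are hand-written bit sets and the cutoff is arbitrary; generating real SPLIF fingerprints requires cheminformatics tooling not shown here (Python):

      def tanimoto(fp_a: set, fp_b: set) -> float:
          """Tanimoto similarity between two fingerprints as bit-index sets."""
          if not fp_a and not fp_b:
              return 1.0
          return len(fp_a & fp_b) / len(fp_a | fp_b)

      reference_fp = {3, 17, 42, 56, 91}            # hypothetical reference pose
      docked = {"cmpd_1": {3, 17, 42, 77}, "cmpd_2": {5, 8, 101}}

      passing = [name for name, fp in docked.items()
                 if tanimoto(fp, reference_fp) >= 0.4]   # placeholder cutoff
      print(passing)                                     # -> ['cmpd_1']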

  3. Clinical Models and Algorithms for the Prediction of Retinopathy of Prematurity: A Report by the American Academy of Ophthalmology.

    PubMed

    Hutchinson, Amy K; Melia, Michele; Yang, Michael B; VanderVeen, Deborah K; Wilson, Lorri B; Lambert, Scott R

    2016-04-01

    To assess the accuracy with which available retinopathy of prematurity (ROP) predictive models detect clinically significant ROP, and to what extent and at what risk these models allow for the reduction of screening examinations for ROP. A literature search of the PubMed and Cochrane Library databases was last conducted on May 1, 2015, and yielded 305 citations. After screening the abstracts of all 305 citations and reviewing the full text of 30 potentially eligible articles, the panel members determined that 22 met the inclusion criteria. One article included 2 studies, for a total of 23 studies reviewed. The panel extracted information about study design, study population, the screening algorithm tested, interventions, outcomes, and study quality. The methodologist divided the studies into 2 categories, model development and model validation, and assigned a level of evidence rating to each study. One study was rated level I evidence, 3 studies were rated level II evidence, and 19 studies were rated level III evidence. In some cohorts, some models would have allowed reductions in the number of infants screened for ROP without failing to identify infants requiring treatment. However, the small sample size and limited generalizability of the ROP predictive models included in this review preclude their widespread use to make all-or-none decisions about whether to screen individual infants for ROP. As an alternative, some studies proposed approaches to apply the models to reduce the number of examinations performed in low-risk infants. Additional research is needed to optimize ROP predictive model development, validation, and application before such models can be used widely to reduce the burdensome number of ROP screening examinations. Copyright © 2016 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.

  4. Impact of nucleic acid testing relative to antigen/antibody combination immunoassay on the detection of acute HIV infection.

    PubMed

    De Souza, Mark S; Phanuphak, Nittaya; Pinyakorn, Suteeraporn; Trichavaroj, Rapee; Pattanachaiwit, Supanit; Chomchey, Nitiya; Fletcher, James L; Kroon, Eugene D; Michael, Nelson L; Phanuphak, Praphan; Kim, Jerome H; Ananworanich, Jintanat

    2015-04-24

    To assess the addition of HIV nucleic acid testing (NAT) to fourth-generation (4thG) HIV antigen/antibody combination immunoassay in improving detection of acute HIV infection (AHI). Participants attending a major voluntary counseling and testing site in Thailand were screened for AHI using 4thG HIV antigen/antibody immunoassay and a sequential less-sensitive HIV antibody immunoassay. Samples nonreactive by 4thG antigen/antibody immunoassay were further screened using pooled NAT to identify additional AHI. HIV infection status was verified following enrollment into an AHI study with follow-up visits and additional diagnostic tests. Among 74 334 clients screened for HIV infection, HIV prevalence was 10.9% and the overall incidence of AHI (N = 112) was 2.2 per 100 person-years. The inclusion of pooled NAT in the testing algorithm increased the number of acutely infected patients detected from 81 to 112 (a 38% increase) relative to 4thG HIV antigen/antibody immunoassay alone. Follow-up testing within 5 days of screening marginally improved the 4thG immunoassay detection rate (26%). The median CD4 T-cell count at the enrollment visit was 353 cells/μl and HIV plasma viral load was 598 289 copies/ml. The incorporation of pooled NAT into the HIV testing algorithm in high-risk populations may be beneficial in the long term. The addition of pooled NAT testing to identify AHI increased screening costs by 22%, from $8.33 to $10.16 per screened patient. Risk factors of the testing population should be considered prior to NAT implementation given the additional testing complexity and costs.

  5. The price of performance: a cost and performance analysis of the implementation of cell-free fetal DNA testing for Down syndrome in Ontario, Canada.

    PubMed

    Okun, N; Teitelbaum, M; Huang, T; Dewa, C S; Hoch, J S

    2014-04-01

    To examine the cost and performance implications of introducing cell-free fetal DNA (cffDNA) testing within modeled scenarios in a publicly funded Canadian provincial Down syndrome (DS) prenatal screening program. Two clinical algorithms were created: the first to represent the current screening program and the second to represent one that incorporates cffDNA testing. From these algorithms, eight distinct scenarios were modeled to examine: (1) the current program (no cffDNA), (2) the current program with first trimester screening (FTS) as the nuchal translucency-based primary screen (no cffDNA), (3) a program substituting current screening with primary cffDNA, (4) contingent cffDNA with current FTS performance, (5) contingent cffDNA at a fixed price to result in overall cost neutrality, (6) contingent cffDNA with an improved detection rate (DR) of FTS, (7) contingent cffDNA with higher uptake of FTS, and (8) contingent cffDNA with optimized FTS (higher uptake and improved DR). This modeling study demonstrates that introducing contingent cffDNA testing improves performance by increasing the number of cases of DS detected prenatally, and reducing the number of amniocenteses performed and concomitant iatrogenic pregnancy loss of pregnancies not affected by DS. Costs are modestly increased, although the cost per case of DS detected is decreased with contingent cffDNA testing. Contingent models of cffDNA testing can improve overall screening performance while maintaining the provision of an 11- to 13-week scan. Costs are modestly increased, but cost per prenatally detected case of DS is decreased. © 2013 John Wiley & Sons, Ltd.

  6. Power of automated algorithms for combining time-line follow-back and urine drug screening test results in stimulant-abuse clinical trials.

    PubMed

    Oden, Neal L; VanVeldhuisen, Paul C; Wakim, Paul G; Trivedi, Madhukar H; Somoza, Eugene; Lewis, Daniel

    2011-09-01

    In clinical trials of treatment for stimulant abuse, researchers commonly record both Time-Line Follow-Back (TLFB) self-reports and urine drug screen (UDS) results. To compare the power of self-report, qualitative (use vs. no use) UDS assessment, and various algorithms that generate self-report-UDS composite measures to detect treatment differences via t-test in simulated clinical trial data. We performed Monte Carlo simulations patterned in part on real data to model self-report reliability, UDS errors, dropout, informatively missing UDS reports, incomplete adherence to a urine donation schedule, temporal correlation of drug use, number of days in the study period, number of patients per arm, and distribution of drug-use probabilities. Investigated algorithms include maximum likelihood and Bayesian estimates, self-report alone, UDS alone, and several simple modifications of self-report (referred to here as ELCON algorithms) which eliminate perceived contradictions between self-report and UDS. Among the algorithms investigated, the simple ELCON algorithms gave rise to the most powerful t-tests for detecting mean group differences in stimulant drug use. Further investigation is needed to determine whether simple, naïve procedures such as the ELCON algorithms are optimal for comparing clinical study treatment arms. But researchers who currently require an automated algorithm for combining TLFB and UDS to test group differences in stimulant use, in scenarios similar to those simulated here, should consider one of the ELCON algorithms. This analysis continues a line of inquiry which could determine how best to measure outpatient stimulant use in clinical trials (NIDA. NIDA Monograph-57: Self-Report Methods of Estimating Drug Abuse: Meeting Current Challenges to Validity. NTIS PB 88248083. Bethesda, MD: National Institutes of Health, 1985; NIDA. NIDA Research Monograph 73: Urine Testing for Drugs of Abuse. NTIS PB 89151971. Bethesda, MD: National Institutes of Health, 1987; NIDA. NIDA Research Monograph 167: The Validity of Self-Reported Drug Use: Improving the Accuracy of Survey Estimates. NTIS PB 97175889. GPO 017-024-01607-1. Bethesda, MD: National Institutes of Health, 1997).
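
    A minimal sketch of an ELCON-style rule, under the simple reading that a positive urine screen overrides a same-day self-reported denial of use; the paper's actual ELCON variants differ in detail (Python):

      def elcon_combine(tlfb, uds):
          """tlfb: list of daily 0/1 self-reported use flags.
          uds: dict mapping day index -> True (positive) / False (negative)."""
          combined = list(tlfb)
          for day, positive in uds.items():
              if positive and combined[day] == 0:
                  combined[day] = 1   # UDS contradicts denial: count as use
          return combined

      tlfb = [0, 0, 1, 0, 0, 0, 0]
      uds = {1: True, 5: False}          # positive screen on day 1 only
      print(elcon_combine(tlfb, uds))    # -> [0, 1, 1, 0, 0, 0, 0]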

  7. Comparison of Multispot EIA with Western blot for confirmatory serodiagnosis of HIV.

    PubMed

    Torian, Lucia V; Forgione, Lisa A; Punsalang, Amado E; Pirillo, Robert E; Oleszko, William R

    2011-12-01

    Recent improvements in the sensitivity of immunoassays (IA) used for HIV screening, coupled with increasing recognition of the importance of rapid point-of-care testing, have led to proposals to adjust the algorithm for serodiagnosis of HIV so that screening and confirmation can be performed using a dual or triple IA sequence that does not require Western blotting for confirmation. One IA that has been proposed as a second or confirmatory test is the Bio-Rad Multispot(®) Rapid HIV-1/HIV-2 Test. This test would have the added advantage of differentiating between HIV-1 and HIV-2 antibodies. To compare the sensitivity and type-specificity of an algorithm combining a 3rd generation enzyme immunoassay (EIA) followed by a confirmatory Multispot with the conventional algorithm that combines a 3rd generation EIA (Bio-Rad GS HIV-1/HIV-2 Plus O EIA) followed by confirmatory Western blot (Bio-Rad GS HIV-1 WB). 8760 serum specimens submitted for HIV testing to the New York City Public Health Laboratory between May 22, 2007, and April 30, 2010, tested repeatedly positive on 3rd generation HIV-1-2+O EIA screening and received parallel confirmatory testing by WB and Multispot (MS). 8678/8760 (99.1%) specimens tested WB-positive; 82 (0.9%) tested WB-negative or indeterminate (IND). 8690/8760 specimens (99.2%) tested MS-positive, including 14 (17.1% of the 82 WB-negative or IND specimens) that WB had classified as negative or IND. Among the HIV-1 WB-positive specimens, MS classified 26 (0.29%) as HIV-2. Among the HIV-1 WB-negative and IND specimens, MS detected 12 HIV-2. In total, MS detected an additional 14 HIV-1 infections among WB-negative or IND specimens, differentiated 26 HIV-1 WB positives as HIV-2, and detected 12 additional HIV-2 infections among WB-negative/IND specimens. A dual 3rd generation EIA algorithm incorporating MS had equivalent HIV-1 sensitivity to the 3rd generation EIA-WB algorithm and had the added advantage of detecting 12 HIV-2 specimens that were not HIV-1 WB cross-reactors. In this series an algorithm using EIA followed by MS would have resulted in the expedited referral of 38 specimens for HIV-2 testing and 40 specimens for nucleic acid confirmation. Further testing using a combined gold standard of nucleic acid detection and WB is needed to calculate specificity and validate the substitution of MS for WB in the diagnostic algorithm used by a large public health laboratory. Copyright © 2011. Published by Elsevier B.V.

  8. Management of Asymptomatic Renal Stones in Astronauts

    NASA Technical Reports Server (NTRS)

    Reyes, David; Locke, James

    2016-01-01

    Introduction: Management guidelines were created to screen for and manage asymptomatic renal stones in U.S. astronauts. The risks for renal stone formation in astronauts due to bone loss and hypercalciuria are unknown. Astronauts have about the same stone risk as commercial aviation pilots, roughly half that of the general population. However, proper management of this condition is still crucial to mitigate health and mission risks in the spaceflight environment. Methods: An extensive review of the literature and current aeromedical standards for the monitoring and management of renal stones was conducted. The NASA Flight Medicine Clinic's electronic medical record and the Longitudinal Survey of Astronaut Health were also reviewed. From this work, a screening and management algorithm was created that takes into consideration the unique operational environment of spaceflight. Results: Renal stone screening and management guidelines for astronauts were created based on accepted standards of care, with consideration of the environment of spaceflight. In the proposed algorithm, all astronauts receive a yearly screening ultrasound for renal calcifications, or mineralized renal material (MRM). Any area of MRM 3 millimeters or larger is considered a positive finding. Three millimeters approaches the detection limit of standard ultrasound, and several studies have shown that any stone 3 millimeters or smaller has an approximately 95 percent chance of spontaneous passage. For mission-assigned astronauts, any positive ultrasound study is followed by a low-dose renal computed tomography (CT) scan, and flexible ureteroscopy if the CT is positive. Other specific guidelines were also created. Discussion: The term "MRM" is used to account for small areas of calcification that may be outside the renal collecting system, and allows objectivity without otherwise constraining the diagnostic and treatment process for potentially very small calcifications of uncertain significance. However, a small asymptomatic MRM or stone within the renal collecting system may become symptomatic, and so affect launch and flight schedules, cause incapacitation during flight, and ultimately require medical evacuation. For exploration-class missions, evacuation is unlikely. The new screening and management algorithm allows better management of mission risks and will define the true incidence of renal stones in U.S. astronauts. This information will be used to refine future screening, countermeasures, and treatment methods, and will also inform the capabilities needed on exploration-class missions.
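
    The decision flow described above reduces to a few branches; the sketch below is purely illustrative of the published thresholds (3 mm MRM; ultrasound, then CT, then ureteroscopy for mission-assigned crew) and is not clinical software (Python):

      def next_step(mrm_size_mm, mission_assigned=True, ct_positive=None):
          """Return the next action in the screening algorithm."""
          if mrm_size_mm is None or mrm_size_mm < 3.0:
              return "continue yearly screening ultrasound"
          if not mission_assigned:
              return "clinical follow-up per standard of care"
          if ct_positive is None:
              return "low-dose renal CT"
          return ("flexible ureteroscopy" if ct_positive
                  else "continue yearly screening ultrasound")

      print(next_step(4.2))                    # positive ultrasound -> CT
      print(next_step(4.2, ct_positive=True))  # positive CT -> ureteroscopy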

  9. TU-G-204-09: The Effects of Reduced- Dose Lung Cancer Screening CT On Lung Nodule Detection Using a CAD Algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Young, S; Lo, P; Kim, G

    2015-06-15

    Purpose: While lung cancer screening CT is being performed at low doses, the purpose of this study was to investigate the effects of further reducing dose on the performance of a CAD nodule-detection algorithm. Methods: We selected 50 cases from our local database of National Lung Screening Trial (NLST) patients for which we had both the image series and the raw CT data from the original scans. All scans were acquired with fixed mAs (25 for standard-sized patients, 40 for large patients) on a 64-slice scanner (Sensation 64, Siemens Healthcare). All images were reconstructed with 1-mm slice thickness, B50 kernel. Ten of the cases had at least one nodule reported on the NLST reader forms. Based on a previously published technique, we added noise to the raw data to simulate reduced-dose versions of each case at 50% and 25% of the original NLST dose (i.e., approximately 1.0 and 0.5 mGy CTDIvol). For each case at each dose level, the CAD detection algorithm was run and nodules greater than 4 mm in diameter were reported. These CAD results were compared to “truth”, defined as the approximate nodule centroids from the NLST reports. Subject-level mean sensitivities and false-positive rates were calculated for each dose level. Results: The mean sensitivities of the CAD algorithm were 35% at the original dose, 20% at 50% dose, and 42.5% at 25% dose. The false-positive rates, in decreasing-dose order, were 3.7, 2.9, and 10 per case. In certain cases, particularly in larger patients, there were severe photon-starvation artifacts, especially in the apical region due to the high-attenuating shoulders. Conclusion: The detection task was challenging for the CAD algorithm at all dose levels, including the original NLST dose. However, the false-positive rate at 25% dose approximately tripled, suggesting a loss of CAD robustness somewhere between 0.5 and 1.0 mGy. NCI grant U01 CA181156 (Quantitative Imaging Network); Tobacco Related Disease Research Project grant 22RT-0131.
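
    The dose-reduction simulation itself operated on raw projection data; as a rough image-domain stand-in, the sketch below adds Gaussian noise under the common assumption that quantum noise variance scales inversely with the dose fraction. The numbers are arbitrary (Python with NumPy):

      import numpy as np

      def simulate_reduced_dose(image, noise_std_full, dose_fraction, seed=0):
          """Add zero-mean noise so total variance matches the reduced dose:
          var_target = var_full / dose_fraction, so add (var_target - var_full)."""
          rng = np.random.default_rng(seed)
          extra_var = noise_std_full**2 * (1.0 / dose_fraction - 1.0)
          return image + rng.normal(0.0, np.sqrt(extra_var), image.shape)

      full_dose = np.full((64, 64), 40.0)               # toy uniform image
      half_dose = simulate_reduced_dose(full_dose, noise_std_full=20.0,
                                        dose_fraction=0.5)
      print(round(half_dose.std(), 1))                  # ~20, the added noise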

  10. Introducing Bayesian thinking to high-throughput screening for false-negative rate estimation.

    PubMed

    Wei, Xin; Gao, Lin; Zhang, Xiaolei; Qian, Hong; Rowan, Karen; Mark, David; Peng, Zhengwei; Huang, Kuo-Sen

    2013-10-01

    High-throughput screening (HTS) has been widely used to identify active compounds (hits) that bind to biological targets. Because of cost concerns, the comprehensive screening of millions of compounds is typically conducted without replication. Real hits that fail to exhibit measurable activity in the primary screen due to random experimental errors will be lost as false-negatives. Conceivably, the projected false-negative rate is a parameter that reflects screening quality. Furthermore, it can be used to guide the selection of optimal numbers of compounds for hit confirmation. Therefore, a method that predicts false-negative rates from the primary screening data is extremely valuable. In this article, we describe the implementation of a pilot screen on a representative fraction (1%) of the screening library in order to obtain information about assay variability as well as a preliminary hit activity distribution profile. Using this training data set, we then developed an algorithm based on Bayesian logic and Monte Carlo simulation to estimate the number of true active compounds and potential missed hits from the full library screen. We have applied this strategy to five screening projects. The results demonstrate that this method produces useful predictions on the numbers of false negatives.
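
    The core idea, projecting how many true actives fall below the hit threshold given pilot-derived noise and activity distributions, can be illustrated with a simple Monte Carlo loop; the distribution parameters and threshold below are invented, and the paper's Bayesian machinery is not reproduced (Python with NumPy):

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000
      true_activity = rng.lognormal(mean=3.3, sigma=0.5, size=n)  # % inhibition
      measured = true_activity + rng.normal(0.0, 8.0, size=n)     # assay noise
      threshold = 30.0                                            # hit cutoff

      actives = true_activity >= threshold
      fn_rate = np.mean(measured[actives] < threshold)
      print(f"projected false-negative rate among true actives: {fn_rate:.1%}")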

  11. Early Detection of Ovarian Cancer using the Risk of Ovarian Cancer Algorithm with Frequent CA125 Testing in Women at Increased Familial Risk - Combined Results from Two Screening Trials.

    PubMed

    Skates, Steven J; Greene, Mark H; Buys, Saundra S; Mai, Phuong L; Brown, Powel; Piedmonte, Marion; Rodriguez, Gustavo; Schorge, John O; Sherman, Mark; Daly, Mary B; Rutherford, Thomas; Brewster, Wendy R; O'Malley, David M; Partridge, Edward; Boggess, John; Drescher, Charles W; Isaacs, Claudine; Berchuck, Andrew; Domchek, Susan; Davidson, Susan A; Edwards, Robert; Elg, Steven A; Wakeley, Katie; Phillips, Kelly-Anne; Armstrong, Deborah; Horowitz, Ira; Fabian, Carol J; Walker, Joan; Sluss, Patrick M; Welch, William; Minasian, Lori; Horick, Nora K; Kasten, Carol H; Nayfield, Susan; Alberts, David; Finkelstein, Dianne M; Lu, Karen H

    2017-07-15

    Purpose: Women at familial/genetic ovarian cancer risk often undergo screening despite unproven efficacy. Research suggests each woman has her own CA125 baseline; significant increases above this level may identify cancers earlier than the standard 6- to 12-monthly CA125 > 35 U/mL. Experimental Design: Data from prospective Cancer Genetics Network and Gynecologic Oncology Group trials, which screened 3,692 women (13,080 woman-screening years) with a strong breast/ovarian cancer family history or BRCA1/2 mutations, were combined to assess a novel screening strategy. Specifically, serum CA125 every 3 months, evaluated using a risk of ovarian cancer algorithm (ROCA), detected significant increases above each subject's baseline, which triggered transvaginal ultrasound. Specificity and positive predictive value (PPV) were compared with levels derived from general population screening (specificity 90%, PPV 10%), and stage at detection was compared with historical high-risk controls. Results: Specificity for ultrasound referral was 92% versus 90% (P = 0.0001), and PPV was 4.6% versus 10% (P > 0.10). Eighteen of 19 malignant ovarian neoplasms [prevalent = 4, incident = 6, risk-reducing salpingo-oophorectomy (RRSO) = 9] were detected via screening or RRSO. Among incident cases (which best reflect long-term screening performance), three of six invasive cancers were early-stage (I/II; 50% vs. 10% historical BRCA1 controls; P = 0.016). Six of nine RRSO-related cases were stage I. ROCA flagged three of six (50%) incident cases before CA125 exceeded 35 U/mL. Eight of nine patients with stage 0/I/II ovarian cancer were alive at last follow-up (median 6 years). Conclusions: For screened women at familial/genetic ovarian cancer risk, ROCA every 3 months had better early-stage sensitivity at high specificity, and low yet possibly acceptable PPV, compared with CA125 > 35 U/mL every 6 or 12 months, warranting further larger-cohort evaluation. Clin Cancer Res; 23(14); 3628-37. ©2017 American Association for Cancer Research.
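
    ROCA itself is a Bayesian change-point calculation; as a loose illustration of its central idea, flagging a significant rise above a subject-specific baseline rather than a fixed 35 U/mL cutoff, this sketch refers a subject for ultrasound when a new CA125 value exceeds her running baseline by a multiple of its spread. All numbers are illustrative (Python):

      import statistics

      def flag_for_ultrasound(history, new_value, k=3.0):
          """Trigger when new_value rises k 'spreads' above the baseline."""
          baseline = statistics.mean(history)
          spread = statistics.stdev(history) if len(history) > 1 else 1.0
          return new_value > baseline + k * max(spread, 1.0)

      ca125 = [9.0, 11.0, 10.0, 9.5]           # U/mL, all well below 35 U/mL
      print(flag_for_ultrasound(ca125, 12.0))  # False: ordinary fluctuation
      print(flag_for_ultrasound(ca125, 24.0))  # True: flagged while still "normal"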

  12. Orbit Clustering Based on Transfer Cost

    NASA Technical Reports Server (NTRS)

    Gustafson, Eric D.; Arrieta-Camacho, Juan J.; Petropoulos, Anastassios E.

    2013-01-01

    We propose using cluster analysis to perform quick screening for combinatorial global optimization problems. The key missing component currently preventing the use of cluster analysis in this context is the lack of a usable metric function that defines the cost to transfer between two orbits. We study several proposed metrics and clustering algorithms, including k-means and the expectation-maximization algorithm. We also show that proven heuristic methods such as the Q-law can be modified to work with cluster analysis.
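
    Because an orbit-transfer cost is a general pairwise function rather than a Euclidean distance, a medoid-based clustering is a natural fit; the sketch below uses an invented delta-v-like proxy cost and a bare-bones k-medoids loop, not the paper's metrics or the Q-law (Python with NumPy):

      import numpy as np

      def transfer_cost(o1, o2):
          """Toy proxy cost between orbits (a_km, eccentricity, inclination_rad)."""
          da = abs(o1[0] - o2[0]) / max(o1[0], o2[0])
          return da + abs(o1[1] - o2[1]) + abs(o1[2] - o2[2])

      def k_medoids(orbits, k, n_iter=20, seed=0):
          rng = np.random.default_rng(seed)
          D = np.array([[transfer_cost(a, b) for b in orbits] for a in orbits])
          medoids = list(rng.choice(len(orbits), size=k, replace=False))
          for _ in range(n_iter):
              labels = np.argmin(D[:, medoids], axis=1)
              for c in range(k):
                  members = np.flatnonzero(labels == c)
                  if members.size:            # keep old medoid if cluster empties
                      within = D[np.ix_(members, members)].sum(axis=1)
                      medoids[c] = members[np.argmin(within)]
          return labels, medoids

      orbits = [(7000, 0.01, 0.10), (7050, 0.02, 0.12),
                (26000, 0.70, 0.30), (26500, 0.72, 0.31)]
      labels, _ = k_medoids(orbits, k=2)
      print(labels)     # the two LEO-like and two GTO-like orbits separate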

  13. Lung boundary detection in pediatric chest x-rays

    NASA Astrophysics Data System (ADS)

    Candemir, Sema; Antani, Sameer; Jaeger, Stefan; Browning, Renee; Thoma, George R.

    2015-03-01

    Tuberculosis (TB) is a major public health problem worldwide, and highly prevalent in developing countries. According to the World Health Organization (WHO), over 95% of TB deaths occur in low- and middle-income countries that often have under-resourced health care systems. In an effort to aid population screening in such resource-challenged settings, the U.S. National Library of Medicine has developed a chest X-ray (CXR) screening system that provides a pre-decision on pulmonary abnormalities. When the system is presented with a digital CXR image from the Picture Archive and Communication Systems (PACS) or another imaging source, it automatically identifies the lung regions in the image, extracts image features, and classifies the image as normal or abnormal using trained machine-learning algorithms. The system has been trained on adult CXR images, and this article presents enhancements toward including pediatric CXR images. Our adult lung boundary detection algorithm is model-based. We note the lung shape differences across pediatric developmental stages and adulthood, and propose building new lung models suitable for the pediatric developmental stages. In this study, we quantify changes in lung shape from infancy to adulthood toward enhancing our lung segmentation algorithm. Our initial findings suggest pediatric age groupings of 0-23 months, 2-10 years, and 11-18 years, and we present justification for these groupings. We report on the quality of the boundary detection algorithm with the pediatric lung models.

  14. Many amino acid substitution variants identified in DNA repair genes during human population screenings are predicted to impact protein function

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xi, T; Jones, I M; Mohrenweiser, H W

    2003-11-03

    Over 520 different amino acid substitution variants have been previously identified in the systematic screening of 91 human DNA repair genes for sequence variation. Two algorithms were employed to predict the impact of these amino acid substitutions on protein activity. Sorting Intolerant From Tolerant (SIFT) classified 226 of 508 variants (44%) as "Intolerant". Polymorphism Phenotyping (PolyPhen) classed 165 of 489 amino acid substitutions (34%) as "Probably or Possibly Damaging". Another 9-15% of the variants were classed as "Potentially Intolerant or Damaging". The results from the two algorithms are highly associated, with concordance in predicted impact observed for approximately 62% of the variants. Twenty-one to thirty-one percent of the variant proteins are predicted to exhibit reduced activity by both algorithms. These variants occur at slightly lower individual allele frequency than do the variants classified as "Tolerant" or "Benign". Both algorithms correctly predicted the impact of 26 functionally characterized amino acid substitutions in the APE1 protein on biochemical activity, with one exception. It is concluded that a substantial fraction of the missense variants observed in the general human population are functionally relevant. These variants are expected to be the molecular genetic and biochemical basis for the associations of reduced DNA repair capacity phenotypes with elevated cancer risk.

  15. Type 2 Diabetes Screening Test by Means of a Pulse Oximeter.

    PubMed

    Moreno, Enrique Monte; Lujan, Maria Jose Anyo; Rusinol, Montse Torrres; Fernandez, Paqui Juarez; Manrique, Pilar Nunez; Trivino, Cristina Aragon; Miquel, Magda Pedrosa; Rodriguez, Marife Alvarez; Burguillos, M Jose Gonzalez

    2017-02-01

    In this paper, we propose a method for screening for the presence of type 2 diabetes by means of the signal obtained from a pulse oximeter. The screening system consists of two parts: a front end that analyzes the pulse-oximeter signal and extracts a set of features based on physiological considerations, and a machine-learning module that determines the class of the input sample, i.e., whether the subject has diabetes or not. The machine-learning algorithms were random forests, gradient boosting, and linear discriminant analysis as a benchmark. The system was tested on a database of [Formula: see text] subjects (two samples per subject) collected from five community health centers. The mean receiver operating characteristic area found was [Formula: see text]% (median value [Formula: see text]% and range [Formula: see text]%), with a specificity = [Formula: see text]% for a threshold that gave a sensitivity = [Formula: see text]%. We present a screening method for detecting diabetes that has a performance comparable to the glycated haemoglobin (haemoglobin A1c, HbA1c) test, does not require blood extraction, and yields results in less than 5 min.
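
    The classification stage admits a short sketch: waveform-derived features feed a random forest, with cross-validated ROC area as the figure of merit. The features and labels below are synthetic stand-ins, not the study's data (Python with scikit-learn):

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(1)
      X = rng.normal(size=(400, 6))      # stand-ins for pulse-wave features
      y = (X[:, 0] + 0.5 * X[:, 3]       # synthetic "diabetes" label
           + rng.normal(scale=0.8, size=400) > 0).astype(int)

      clf = RandomForestClassifier(n_estimators=200, random_state=0)
      auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()
      print(f"cross-validated ROC area: {auc:.2f}")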

  16. Validation of the Saskatoon Falls Prevention Consortium's Falls Screening and Referral Algorithm

    PubMed Central

    Lawson, Sara Nicole; Zaluski, Neal; Petrie, Amanda; Arnold, Cathy; Basran, Jenny

    2013-01-01

    Purpose: To investigate the concurrent validity of the Saskatoon Falls Prevention Consortium's Falls Screening and Referral Algorithm (FSRA). Method: A total of 29 older adults (mean age 77.7 [SD 4.0] y) residing in an independent-living seniors' complex who met inclusion criteria completed a demographic questionnaire and the components of the FSRA and Berg Balance Scale (BBS). The FSRA consists of the Elderly Fall Screening Test (EFST) and the Multi-factor Falls Questionnaire (MFQ); it is designed to categorize individuals into low, moderate, or high fall-risk categories to determine appropriate management pathways. A predictive model for probability of fall risk, based on previous research, was used to determine concurrent validity of the FSRA. Results: The FSRA placed 79% of participants into the low-risk category, whereas the predictive model found the probability of fall risk to range from 0.04 to 0.74, with a mean of 0.35 (SD 0.25). No statistically significant correlation was found between the FSRA and the predictive model for probability of fall risk (Spearman's ρ=0.35, p=0.06). Conclusion: The FSRA lacks concurrent validity relative to a previously established model of fall risk and appears to over-categorize individuals into the low-risk group. Further research on the FSRA as an adequate tool to screen community-dwelling older adults for fall risk is recommended. PMID:24381379

  17. Android Device-Based Cervical Cancer Screening for Resource-Poor Settings.

    PubMed

    Kudva, Vidya; Prasad, Keerthana; Guruvare, Shyamala

    2018-05-18

    Visual inspection with acetic acid (VIA) is an effective, affordable, and simple test for cervical cancer screening in resource-poor settings. But differentiating cancerous from normal lesions requires considerable expertise, which is scarce in developing countries. Many studies have attempted to automate cervical cancer detection from cervix images acquired during the VIA process. These studies used images acquired through colposcopy or cervicography. However, colposcopy is expensive and hence is not feasible as a screening tool in resource-poor settings. Cervicography uses a digital camera to acquire cervix images, which are subsequently sent to experts for evaluation. Hence, cervicography does not provide a real-time decision during the VIA examination as to whether the cervix is normal or not. If the cervix is found to be abnormal, the patient may be referred to a hospital for further evaluation using Pap smear and/or biopsy. An Android device with an inbuilt app to acquire images and provide instant results would be an obvious choice in resource-poor settings. In this paper, we propose an algorithm for analysis of cervix images acquired using an Android device, which can be used in a decision support system to provide an instant decision during cervical cancer screening. This algorithm offers an accuracy of 97.94%, a sensitivity of 99.05%, and a specificity of 97.16%.

  18. [Evaluation of the benefit of different complementary exams in the search for a TB diagnosis algorithm for HIV patients put on ART in Niamey, Niger].

    PubMed

    Ouedraogo, E; Lurton, G; Mohamadou, S; Dillé, I; Diallo, I; Mamadou, S; Adehossi, E; Hanki, Y; Tchousso, O; Arzika, M; Gazeré, O; Amadou, F; Illo, N; Abdourahmane, Y; Idé, M; Alhousseini, Z; Lamontagne, F; Deze, C; D'Ortenzio, E; Diallo, S

    2016-12-01

    In Niger, tuberculosis (TB) screening among people living with human immunodeficiency virus (HIV) (PLHIV) is nonsystematic, and the use of additional tests is very often limited. The objective of this research is to evaluate the performance and the cost-effectiveness of various paraclinical testing strategies for TB among adult patients with HIV, using tests routinely available for patients cared for in Niamey. This is a multicentric prospective intervention study performed in Niamey between 2010 and 2013. TB was systematically screened for in newly diagnosed PLHIV, before ART treatment, by consistently performing: sputum examination by Ziehl-Neelsen microscopy (MZN) and fluorescence microscopy (MIF), chest radiography (CR), and abdominal ultrasound. The performance of these different tests was calculated using sputum culture as a gold standard. The various examinations were then combined in different algorithms. The cost-effectiveness of the different algorithms was assessed by calculating the cost of preventing the death from TB of one patient put on ART. Between November 2010 and November 2012, 509 PLHIV were included. TB was diagnosed in 78 patients (15.3%), including 35 pulmonary, 24 lymph node, and 19 multifocal forms. The sensitivity of the evaluated algorithms varied between 0.35 and 0.85. The specificity ranged from 0.85 to 0.97. The most cost-effective algorithm was the one involving MIF and CR. We recommend implementing systematic, free direct sputum examination by MIF and a CR for the detection of TB among newly diagnosed PLHIV in Niger.

  19. Estimation of non-solid lung nodule volume with low-dose CT protocols: effect of reconstruction algorithm and measurement method

    NASA Astrophysics Data System (ADS)

    Gavrielides, Marios A.; DeFilippo, Gino; Berman, Benjamin P.; Li, Qin; Petrick, Nicholas; Schultz, Kurt; Siegelman, Jenifer

    2017-03-01

    Computed tomography is the primary modality of choice for assessing the stability of nonsolid pulmonary nodules (sometimes referred to as ground-glass opacities) over three or more years, with change in size being the primary factor to monitor. Since volume extracted from CT is being examined as a quantitative biomarker of lung nodule size, it is important to examine factors affecting the performance of volumetric CT for this task. More specifically, the effect of reconstruction algorithm and measurement method in the context of low-dose CT protocols has been an under-examined area of research. In this phantom study we assessed volumetric CT with two different measurement methods (model-based and segmentation-based) for nodules with both nonsolid (-800 HU and -630 HU) and solid (-10 HU) radiodensities, sizes of 5 mm and 10 mm, and two different shapes (spherical and spiculated). Imaging protocols included CTDIvol typical of screening (1.7 mGy) and sub-screening (0.6 mGy) scans and different types of reconstruction algorithms across three scanners. Results showed that radiodensity was the factor contributing most to overall error based on ANOVA. The choice of reconstruction algorithm or measurement method did not substantially affect the accuracy of measurements; however, measurement method affected repeatability, with repeatability coefficients ranging from around 3-5% for the model-based estimator to around 20-30% across reconstruction algorithms for the segmentation-based method. The findings of the study can be valuable toward developing standardized protocols and performance claims for nonsolid nodules.

  20. Novel flowcytometry-based approach of malignant cell detection in body fluids using an automated hematology analyzer.

    PubMed

    Ai, Tomohiko; Tabe, Yoko; Takemura, Hiroyuki; Kimura, Konobu; Takahashi, Toshihiro; Yang, Haeun; Tsuchiya, Koji; Konishi, Aya; Uchihashi, Kinya; Horii, Takashi; Ohsaka, Akimichi

    2018-01-01

    Morphological microscopic examinations of nucleated cells in body fluid (BF) samples are performed to screen for malignancy. However, the morphological differentiation is time-consuming and labor-intensive. This study aimed to develop a new flow cytometry-based gating analysis mode, the "XN-BF gating algorithm", to detect malignant cells using an automated hematology analyzer, the Sysmex XN-1000. The XN-BF mode was equipped with the WDF white blood cell (WBC) differential channel. We added two algorithms to the WDF channel: Rule 1 detects larger and clumped cell signals compared to the leukocytes, targeting clustered malignant cells; Rule 2 detects middle-sized mononuclear cells containing fewer granules than neutrophils, with fluorescence signal similar to monocytes, targeting hematological malignant cells and solid tumor cells. BF samples meeting at least one rule were flagged as malignant. To evaluate this novel gating algorithm, 92 various BF samples were collected. Manual microscopic differentiation with the May-Grünwald Giemsa stain and WBC counting with a hemocytometer were also performed. The performance of these three methods was evaluated by comparison with the cytological diagnosis. The XN-BF gating algorithm achieved a sensitivity of 63.0% and specificity of 87.8%, with a positive predictive value of 68.0% and a negative predictive value of 85.1%, in detecting malignant-cell-positive samples. Manual microscopic WBC differentiation and WBC counting demonstrated sensitivities of 70.4% and 66.7%, and specificities of 96.9% and 92.3%, respectively. The XN-BF gating algorithm can be a feasible tool in hematology laboratories for prompt screening of malignant cells in various BF samples.
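
    The two gating rules translate directly into threshold logic; the cutoffs below are invented stand-ins for the analyzer's calibrated scatter and fluorescence scales (Python):

      def xn_bf_flag(cell):
          """cell: toy (size, granularity, fluorescence) triplet, leukocyte ~ 1.0."""
          size, granularity, fluorescence = cell
          rule1 = size > 2.0                       # larger/clumped than leukocytes
          rule2 = (0.8 < size <= 2.0 and granularity < 0.5
                   and 0.9 < fluorescence < 1.5)   # mononuclear, monocyte-like signal
          return rule1 or rule2

      def sample_is_malignant(cells):
          return any(xn_bf_flag(c) for c in cells)

      sample = [(1.0, 0.9, 0.6), (2.4, 0.4, 1.1)]  # one normal-ish, one suspicious
      print(sample_is_malignant(sample))           # -> True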

  1. Lynch Syndrome: Female Genital Tract Cancer Diagnosis and Screening.

    PubMed

    Mills, Anne M; Longacre, Teri A

    2016-06-01

    Lynch syndrome is responsible for approximately 5% of endometrial cancers and 1% of ovarian cancers. The molecular basis for Lynch syndrome is a heritable functional deficiency in the DNA mismatch repair system, typically due to a germline mutation. This review discusses the rationales and relative merits of current Lynch syndrome screening tests for endometrial and ovarian cancers and provides pathologists with an informed algorithmic approach to Lynch syndrome testing in gynecologic cancers. Pitfalls in test interpretation and strategies to resolve discordant test results are presented. The potential role for next-generation sequencing panels in future screening efforts is discussed. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. A Monte-Carlo method which is not based on Markov chain algorithm, used to study electrostatic screening of ion potential

    NASA Astrophysics Data System (ADS)

    Šantić, Branko; Gracin, Davor

    2017-12-01

    A new simple Monte Carlo method is introduced for the study of electrostatic screening of an ion's potential by surrounding ions. The proposed method is not based on the commonly used Markov chain method for sample generation. Each sample is pristine, with no correlation to other samples. As the main novelty, pairs of ions are gradually added to a sample provided that the energy of each ion is within the bounds determined by the temperature and the size of the ions. The proposed method provides reliable results, as demonstrated by the screening of an ion in plasma and in water.
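
    A one-dimensional toy of the sample-construction idea: ions are added in oppositely charged pairs, and a placement is kept only if the new ion's interaction energy stays within a temperature-like bound, so each completed sample is generated independently with no Markov chain. Units, the softening distance, and the bound are arbitrary (Python with NumPy):

      import numpy as np

      def build_sample(n_pairs, box=10.0, e_max=2.0, seed=None):
          rng = np.random.default_rng(seed)
          pos, charge = [0.0], [1.0]          # central ion being screened
          accepted = 0
          while accepted < 2 * n_pairs:
              q = -1.0 if accepted % 2 == 0 else 1.0    # add ions in +/- pairs
              x = rng.uniform(-box, box)
              energy = sum(qi * q / max(abs(x - xi), 0.1)
                           for xi, qi in zip(pos, charge))
              if abs(energy) < e_max:                   # keep only in-bounds ions
                  pos.append(x); charge.append(q); accepted += 1
          return np.array(pos), np.array(charge)

      # Independent, pristine samples: no correlation carried between them.
      samples = [build_sample(5, seed=s) for s in range(100)]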

  3. Mammography Patient Information at Hospital Websites: Most Neither Comprehensible Nor Guideline Supported.

    PubMed

    Sadigh, Gelareh; Singh, Kush; Gilbert, Kirven; Khan, Ramsha; Duszak, Abigail M; Duszak, Richard

    2016-11-01

    Ongoing controversy regarding screening mammography guidelines has created confusion for many patients. Given recommendations that patient educational material be prepared at or below the 7th grade reading level of average Americans, the purpose of this study was to assess the readability of online mammography information offered by hospitals nationwide. During 2015, online mammography patient educational materials were identified for all Medicare-recognized hospitals nationwide for which screening mammography metrics were publicly available. Patient educational materials were assessed using six validated readability score algorithms. All references to official screening guidelines were captured. Of 4105 hospitals nationwide, 3252 had websites and confirmable screening mammography services. Of those, 1753 (54%) offered mammography information material online. Only 919 (28%) referenced any professional society guidelines. After excluding information not formatted in HTML and shorter than 100 words (to improve algorithm reliability), 1524 hospital mammography webpages were assessed for grade level scores. Nationally, the mean of each readability score for all hospitals varied between the 10th and 14th grade levels, all higher than the recommended 7th grade level (p < 0.001). At the individual hospital level, only 14 hospitals (0.4%) had mean scores at or below the 7th grade level. Of U.S. hospitals that offer screening mammography and have websites, only 54% provide online mammography educational material. Of those, only 0.4% present information at a reading level comprehensible to average Americans, and only 28% offer specific information to help patients reconcile conflicting guidelines. Health systems offering mammography should strive to better meet women's health information and literacy needs.
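
    The grade-level scoring itself is simple to reproduce; one of the six validated formulas of the kind used here is the Flesch-Kincaid grade level, shown below with a rough vowel-group syllable heuristic (Python):

      import re

      def syllables(word):
          return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

      def fk_grade(text):
          """Flesch-Kincaid grade level:
          0.39*(words/sentences) + 11.8*(syllables/words) - 15.59"""
          sentences = max(1, len(re.findall(r"[.!?]+", text)))
          words = re.findall(r"[A-Za-z']+", text)
          syl = sum(syllables(w) for w in words)
          return 0.39 * len(words) / sentences + 11.8 * syl / len(words) - 15.59

      sample = ("Screening mammography uses low-dose x-rays to examine the "
                "breast. Talk with your doctor about when to begin screening.")
      print(round(fk_grade(sample), 1))    # roughly an 8th grade level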

  4. Radiation dose reduction for CT lung cancer screening using ASIR and MBIR: a phantom study.

    PubMed

    Mathieu, Kelsey B; Ai, Hua; Fox, Patricia S; Godoy, Myrna Cobos Barco; Munden, Reginald F; de Groot, Patricia M; Pan, Tinsu

    2014-03-06

    The purpose of this study was to reduce the radiation dosage associated with computed tomography (CT) lung cancer screening while maintaining overall diagnostic image quality and definition of ground-glass opacities (GGOs). A lung screening phantom and a multipurpose chest phantom were used to quantitatively assess the performance of two iterative image reconstruction algorithms (adaptive statistical iterative reconstruction (ASIR) and model-based iterative reconstruction (MBIR)) used in conjunction with reduced tube currents relative to a standard clinical lung cancer screening protocol (51 effective mAs (3.9 mGy) and filtered back-projection (FBP) reconstruction). To further assess the algorithms' performances, qualitative image analysis was conducted (in the form of a reader study) using the multipurpose chest phantom, which was implanted with GGOs of two densities. Our quantitative image analysis indicated that tube current, and thus radiation dose, could be reduced by 40% or 80% from ASIR or MBIR, respectively, compared with conventional FBP, while maintaining similar image noise magnitude and contrast-to-noise ratio. The qualitative portion of our study, which assessed reader preference, yielded similar results, indicating that dose could be reduced by 60% (to 20 effective mAs (1.6 mGy)) with either ASIR or MBIR, while maintaining GGO definition. Additionally, the readers' preferences (as indicated by their ratings) regarding overall image quality were equal or better (for a given dose) when using ASIR or MBIR, compared with FBP. In conclusion, combining ASIR or MBIR with reduced tube current may allow for lower doses while maintaining overall diagnostic image quality, as well as GGO definition, during CT lung cancer screening.

  5. Using deep recurrent neural network for direct beam solar irradiance cloud screening

    NASA Astrophysics Data System (ADS)

    Chen, Maosi; Davis, John M.; Liu, Chaoshun; Sun, Zhibin; Zempila, Melina Maria; Gao, Wei

    2017-09-01

    Cloud screening is an essential procedure for in-situ calibration and atmospheric property retrieval with the (UV-)MultiFilter Rotating Shadowband Radiometer [(UV-)MFRSR]. A previous study explored a cloud screening algorithm for direct-beam (UV-)MFRSR voltage measurements based on a stability assumption over a long time period (typically a half day or a whole day). Designing such an algorithm requires in-depth understanding of radiative transfer and delicate data manipulation. Recent rapid developments in deep neural networks and computing hardware have opened a window for modeling complicated end-to-end systems with a standardized strategy. In this study, a multi-layer dynamic bidirectional recurrent neural network is built to determine the cloudiness at each time point, trained on a 17-year dataset and tested on another 1-year dataset. The dataset consists of the daily 3-minute cosine-corrected voltages, airmasses, and the corresponding cloud/clear-sky labels at two stations of the USDA UV-B Monitoring and Research Program. The results show that the optimized neural network model (3 layers, 250 hidden units, and 80 epochs of training) has an overall test accuracy of 97.87% (97.56% for the Oklahoma site and 98.16% for the Hawaii site). Generally, the neural network model grasps the key concept of the original model, using data from the entire day rather than short nearby measurement windows to perform cloud screening. A scrutiny of the logits layer suggests that the neural network model automatically learns to calculate a quantity similar to total optical depth and finds an appropriate threshold for cloud screening.
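
    A compact stand-in for the architecture described above: a 3-layer bidirectional LSTM that emits per-time-step cloud/clear logits for a day's sequence of voltage and airmass values. The framework choice (PyTorch) and the feature dimensions are assumptions for illustration (Python with PyTorch):

      import torch
      import torch.nn as nn

      class CloudScreenRNN(nn.Module):
          def __init__(self, n_features=2, hidden=250, layers=3):
              super().__init__()
              self.rnn = nn.LSTM(n_features, hidden, num_layers=layers,
                                 bidirectional=True, batch_first=True)
              self.head = nn.Linear(2 * hidden, 2)   # per-step cloud/clear logits

          def forward(self, x):                      # x: (batch, time, features)
              out, _ = self.rnn(x)
              return self.head(out)

      days = torch.randn(4, 240, 2)       # 4 days of (voltage, airmass) steps
      logits = CloudScreenRNN()(days)
      print(logits.shape)                 # torch.Size([4, 240, 2])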

  6. Novel Virtual Screening Approach for the Discovery of Human Tyrosinase Inhibitors

    PubMed Central

    Ai, Ni; Welsh, William J.; Santhanam, Uma; Hu, Hong; Lyga, John

    2014-01-01

    Tyrosinase is the key enzyme involved in the human pigmentation process, as well as the undesired browning of fruits and vegetables. Compounds inhibiting tyrosinase catalytic activity are an important class of cosmetic and dermatological agents which show high potential as depigmentation agents used for skin lightening. The multi-step protocol employed for the identification of novel tyrosinase inhibitors incorporated the Shape Signatures computational algorithm for rapid screening of chemical libraries. This algorithm converts the size and shape of a molecule, as well its surface charge distribution and other bio-relevant properties, into compact histograms (signatures) that lend themselves to rapid comparison between molecules. Shape Signatures excels at scaffold hopping across different chemical families, which enables identification of new actives whose molecular structure is distinct from other known actives. Using this approach, we identified a novel class of depigmentation agents that demonstrated promise for skin lightening product development. PMID:25426625

  7. Novel virtual screening approach for the discovery of human tyrosinase inhibitors.

    PubMed

    Ai, Ni; Welsh, William J; Santhanam, Uma; Hu, Hong; Lyga, John

    2014-01-01

    Tyrosinase is the key enzyme involved in the human pigmentation process, as well as the undesired browning of fruits and vegetables. Compounds inhibiting tyrosinase catalytic activity are an important class of cosmetic and dermatological agents which show high potential as depigmentation agents used for skin lightening. The multi-step protocol employed for the identification of novel tyrosinase inhibitors incorporated the Shape Signatures computational algorithm for rapid screening of chemical libraries. This algorithm converts the size and shape of a molecule, as well its surface charge distribution and other bio-relevant properties, into compact histograms (signatures) that lend themselves to rapid comparison between molecules. Shape Signatures excels at scaffold hopping across different chemical families, which enables identification of new actives whose molecular structure is distinct from other known actives. Using this approach, we identified a novel class of depigmentation agents that demonstrated promise for skin lightening product development.
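
    The signature-comparison step reduces to fast histogram distances; the sketch below ranks a random stand-in library against a query by L1 distance between normalized signatures, purely to show the mechanics (Python with NumPy):

      import numpy as np

      def signature_distance(sig_a, sig_b):
          """L1 distance between normalized signature histograms."""
          a = sig_a / sig_a.sum()
          b = sig_b / sig_b.sum()
          return np.abs(a - b).sum()

      rng = np.random.default_rng(7)
      query = rng.random(32)                           # query molecule signature
      library = {f"mol_{i}": rng.random(32) for i in range(1000)}
      ranked = sorted(library, key=lambda m: signature_distance(query, library[m]))
      print(ranked[:5])                                # closest candidates first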

  8. Estimation of breast percent density in raw and processed full field digital mammography images via adaptive fuzzy c-means clustering and support vector machine segmentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keller, Brad M.; Nathan, Diane L.; Wang, Yan

    Purpose: The amount of fibroglandular tissue content in the breast as estimated mammographically, commonly referred to as breast percent density (PD%), is one of the most significant risk factors for developing breast cancer. Approaches to quantify breast density commonly focus on either semiautomated methods or visual assessment, both of which are highly subjective. Furthermore, most studies published to date investigating computer-aided assessment of breast PD% have been performed using digitized screen-film mammograms, while digital mammography is increasingly replacing screen-film mammography in breast cancer screening protocols. Digital mammography imaging generates two types of images for analysis, raw (i.e., 'FOR PROCESSING') and vendor postprocessed (i.e., 'FOR PRESENTATION'), of which postprocessed images are commonly used in clinical practice. Development of an algorithm which effectively estimates breast PD% in both raw and postprocessed digital mammography images would be beneficial in terms of direct clinical application and retrospective analysis. Methods: This work proposes a new algorithm for fully automated quantification of breast PD% based on adaptive multiclass fuzzy c-means (FCM) clustering and support vector machine (SVM) classification, optimized for the imaging characteristics of both raw and processed digital mammography images as well as for individual patient and image characteristics. Our algorithm first delineates the breast region within the mammogram via an automated thresholding scheme to identify background air followed by a straight line Hough transform to extract the pectoral muscle region. The algorithm then applies adaptive FCM clustering based on an optimal number of clusters derived from image properties of the specific mammogram to subdivide the breast into regions of similar gray-level intensity. Finally, a SVM classifier is trained to identify which clusters within the breast tissue are likely fibroglandular, which are then aggregated into a final dense tissue segmentation that is used to compute breast PD%. Our method is validated on a group of 81 women for whom bilateral, mediolateral oblique, raw and processed screening digital mammograms were available, and agreement is assessed with both continuous and categorical density estimates made by a trained breast-imaging radiologist. Results: Strong association between algorithm-estimated and radiologist-provided breast PD% was detected for both raw (r = 0.82, p < 0.001) and processed (r = 0.85, p < 0.001) digital mammograms on a per-breast basis. Stronger agreement was found when overall breast density was assessed on a per-woman basis for both raw (r = 0.85, p < 0.001) and processed (0.89, p < 0.001) mammograms. Strong agreement between categorical density estimates was also seen (weighted Cohen's κ ≥ 0.79). Repeated measures analysis of variance demonstrated no statistically significant differences between the PD% estimates (p > 0.1) due to either presentation of the image (raw vs processed) or method of PD% assessment (radiologist vs algorithm). Conclusions: The proposed fully automated algorithm was successful in estimating breast percent density from both raw and processed digital mammographic images. Accurate assessment of a woman's breast density is critical in order for the estimate to be incorporated into risk assessment models. These results show promise for the clinical application of the algorithm in quantifying breast density in a repeatable manner, both at time of imaging as well as in retrospective studies.

  9. Estimation of breast percent density in raw and processed full field digital mammography images via adaptive fuzzy c-means clustering and support vector machine segmentation

    PubMed Central

    Keller, Brad M.; Nathan, Diane L.; Wang, Yan; Zheng, Yuanjie; Gee, James C.; Conant, Emily F.; Kontos, Despina

    2012-01-01

    Purpose: The amount of fibroglandular tissue content in the breast as estimated mammographically, commonly referred to as breast percent density (PD%), is one of the most significant risk factors for developing breast cancer. Approaches to quantify breast density commonly focus on either semiautomated methods or visual assessment, both of which are highly subjective. Furthermore, most studies published to date investigating computer-aided assessment of breast PD% have been performed using digitized screen-film mammograms, while digital mammography is increasingly replacing screen-film mammography in breast cancer screening protocols. Digital mammography imaging generates two types of images for analysis, raw (i.e., “FOR PROCESSING”) and vendor postprocessed (i.e., “FOR PRESENTATION”), of which postprocessed images are commonly used in clinical practice. Development of an algorithm which effectively estimates breast PD% in both raw and postprocessed digital mammography images would be beneficial in terms of direct clinical application and retrospective analysis. Methods: This work proposes a new algorithm for fully automated quantification of breast PD% based on adaptive multiclass fuzzy c-means (FCM) clustering and support vector machine (SVM) classification, optimized for the imaging characteristics of both raw and processed digital mammography images as well as for individual patient and image characteristics. Our algorithm first delineates the breast region within the mammogram via an automated thresholding scheme to identify background air followed by a straight line Hough transform to extract the pectoral muscle region. The algorithm then applies adaptive FCM clustering based on an optimal number of clusters derived from image properties of the specific mammogram to subdivide the breast into regions of similar gray-level intensity. Finally, a SVM classifier is trained to identify which clusters within the breast tissue are likely fibroglandular, which are then aggregated into a final dense tissue segmentation that is used to compute breast PD%. Our method is validated on a group of 81 women for whom bilateral, mediolateral oblique, raw and processed screening digital mammograms were available, and agreement is assessed with both continuous and categorical density estimates made by a trained breast-imaging radiologist. Results: Strong association between algorithm-estimated and radiologist-provided breast PD% was detected for both raw (r = 0.82, p < 0.001) and processed (r = 0.85, p < 0.001) digital mammograms on a per-breast basis. Stronger agreement was found when overall breast density was assessed on a per-woman basis for both raw (r = 0.85, p < 0.001) and processed (0.89, p < 0.001) mammograms. Strong agreement between categorical density estimates was also seen (weighted Cohen's κ ≥ 0.79). Repeated measures analysis of variance demonstrated no statistically significant differences between the PD% estimates (p > 0.1) due to either presentation of the image (raw vs processed) or method of PD% assessment (radiologist vs algorithm). Conclusions: The proposed fully automated algorithm was successful in estimating breast percent density from both raw and processed digital mammographic images. Accurate assessment of a woman's breast density is critical in order for the estimate to be incorporated into risk assessment models. 
These results show promise for the clinical application of the algorithm in quantifying breast density in a repeatable manner, both at time of imaging as well as in retrospective studies. PMID:22894417

  10. Estimation of breast percent density in raw and processed full field digital mammography images via adaptive fuzzy c-means clustering and support vector machine segmentation.

    PubMed

    Keller, Brad M; Nathan, Diane L; Wang, Yan; Zheng, Yuanjie; Gee, James C; Conant, Emily F; Kontos, Despina

    2012-08-01

    The amount of fibroglandular tissue content in the breast as estimated mammographically, commonly referred to as breast percent density (PD%), is one of the most significant risk factors for developing breast cancer. Approaches to quantify breast density commonly focus on either semiautomated methods or visual assessment, both of which are highly subjective. Furthermore, most studies published to date investigating computer-aided assessment of breast PD% have been performed using digitized screen-film mammograms, while digital mammography is increasingly replacing screen-film mammography in breast cancer screening protocols. Digital mammography imaging generates two types of images for analysis, raw (i.e., "FOR PROCESSING") and vendor postprocessed (i.e., "FOR PRESENTATION"), of which postprocessed images are commonly used in clinical practice. Development of an algorithm which effectively estimates breast PD% in both raw and postprocessed digital mammography images would be beneficial in terms of direct clinical application and retrospective analysis. This work proposes a new algorithm for fully automated quantification of breast PD% based on adaptive multiclass fuzzy c-means (FCM) clustering and support vector machine (SVM) classification, optimized for the imaging characteristics of both raw and processed digital mammography images as well as for individual patient and image characteristics. Our algorithm first delineates the breast region within the mammogram via an automated thresholding scheme to identify background air followed by a straight line Hough transform to extract the pectoral muscle region. The algorithm then applies adaptive FCM clustering based on an optimal number of clusters derived from image properties of the specific mammogram to subdivide the breast into regions of similar gray-level intensity. Finally, a SVM classifier is trained to identify which clusters within the breast tissue are likely fibroglandular, which are then aggregated into a final dense tissue segmentation that is used to compute breast PD%. Our method is validated on a group of 81 women for whom bilateral, mediolateral oblique, raw and processed screening digital mammograms were available, and agreement is assessed with both continuous and categorical density estimates made by a trained breast-imaging radiologist. Strong association between algorithm-estimated and radiologist-provided breast PD% was detected for both raw (r = 0.82, p < 0.001) and processed (r = 0.85, p < 0.001) digital mammograms on a per-breast basis. Stronger agreement was found when overall breast density was assessed on a per-woman basis for both raw (r = 0.85, p < 0.001) and processed (0.89, p < 0.001) mammograms. Strong agreement between categorical density estimates was also seen (weighted Cohen's κ ≥ 0.79). Repeated measures analysis of variance demonstrated no statistically significant differences between the PD% estimates (p > 0.1) due to either presentation of the image (raw vs processed) or method of PD% assessment (radiologist vs algorithm). The proposed fully automated algorithm was successful in estimating breast percent density from both raw and processed digital mammographic images. Accurate assessment of a woman's breast density is critical in order for the estimate to be incorporated into risk assessment models. These results show promise for the clinical application of the algorithm in quantifying breast density in a repeatable manner, both at time of imaging as well as in retrospective studies.
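
    The clustering stage can be sketched with a small fuzzy c-means loop over pixel gray levels; the toy data and the hard assignment at the end stand in for the paper's adaptive cluster-number selection and trained SVM labeling (Python with NumPy):

      import numpy as np

      def fuzzy_cmeans_1d(x, k=4, m=2.0, n_iter=50, seed=0):
          rng = np.random.default_rng(seed)
          centers = rng.choice(x, size=k, replace=False).astype(float)
          for _ in range(n_iter):
              d = np.abs(x[:, None] - centers[None, :]) + 1e-9   # (n, k) distances
              u = 1.0 / d ** (2.0 / (m - 1.0))                   # memberships
              u /= u.sum(axis=1, keepdims=True)
              um = u ** m
              centers = (um * x[:, None]).sum(axis=0) / um.sum(axis=0)
          return centers, u

      rng = np.random.default_rng(1)
      pixels = np.concatenate([rng.normal(mu, 5.0, 500)
                               for mu in (40, 90, 140, 200)])    # toy gray levels
      centers, u = fuzzy_cmeans_1d(pixels, k=4)
      labels = u.argmax(axis=1)            # an SVM would label clusters instead
      print(np.sort(centers).round(0))     # near [40, 90, 140, 200]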

  11. Orbiting Carbon Observatory-2 (OCO-2) Cloud Screening; Validation Against Collocated MODIS and Initial Comparison to CALIOP Data

    NASA Technical Reports Server (NTRS)

    Taylor, Thomas E.; O'Dell, Christopher W.; Frankenberg, Christian; Partain, Philip; Cronk, Heather W.; Savtchenko, Andrey; Nelson, Robert R.; Rosenthal, Emily J.; Chang, Albert; Crisp, David; hide

    2015-01-01

    The retrieval of the column-averaged carbon dioxide (CO2) dry air mole fraction (XCO2) from satellite measurements of reflected sunlight in the near-infrared can be biased by contamination from clouds and aerosols within the instrument's field of view (FOV). Therefore, accurate aerosol and cloud screening of soundings is required prior to their use in the computationally expensive XCO2 retrieval algorithm. Robust cloud screening methods have been an important focus of the retrieval algorithm team for the National Aeronautics and Space Administration (NASA) Orbiting Carbon Observatory-2 (OCO-2), which was successfully launched into orbit on July 2, 2014. Two distinct spectrally based algorithms have been developed for the purpose of cloud clearing OCO-2 soundings. The A-Band Preprocessor (ABP) performs a retrieval of surface pressure using measurements in the 0.76 micron O2 A-band to distinguish changes in the expected photon path length. The Iterative Maximum A-Posteriori (IMAP) Differential Optical Absorption Spectroscopy (DOAS) (IDP) algorithm is a non-scattering routine that operates on the O2 A-band as well as two CO2 absorption bands at 1.6 μm (weak CO2 band) and 2.0 μm (strong CO2 band) to provide band-dependent estimates of CO2 and H2O. Spectral ratios of retrieved CO2 and H2O identify measurements contaminated with cloud and scattering aerosols. Information from the two preprocessors is fed into a sounding selection tool that strategically down-selects from the order of one million daily soundings collected by OCO-2 to a manageable number (of order 10 to 20%) to be processed by the OCO-2 L2 XCO2 retrieval algorithm. Regional biases or errors in the selection of clear-sky soundings will introduce errors in the final retrieved XCO2 values, ultimately yielding errors in the flux inversion models used to determine global sources and sinks of CO2. In this work, collocated measurements from NASA's Moderate Resolution Imaging Spectroradiometer (MODIS), aboard the Aqua platform, and the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP), aboard the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) satellite, are used as a reference to assess the accuracy, strengths, and weaknesses of the OCO-2 screening algorithms. The combination of the ABP and IDP algorithms is shown to provide very robust and complementary cloud filtering as compared to the results from MODIS and CALIOP. With idealized algorithm tuning to allow throughputs of 20-25%, correct classification of scenes, i.e., accuracies, are found to be approximately 80-90% over several orbit repeat cycles in both winter and spring for the three main viewing configurations of OCO-2: nadir-land, glint-land, and glint-water. Investigation revealed no major spatial or temporal dependencies, although slight differences in the seasonal data sets do exist and classification tends to be more problematic with increasing solar zenith angle and when surfaces are covered in snow and ice. An in-depth analysis of both a simulated data set and real OCO-2 measurements against CALIOP highlights the strength of the ABP in identifying high, thin clouds, while it often misses clouds near the surface even when the optical thickness is greater than 1. Fortunately, combining the ABP with the IDP partially mitigates the problem of thick low clouds passing the preprocessors.
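
    A schematic of how two such preprocessor outputs might be combined into a single clear-sky flag; the field names and threshold values below are illustrative assumptions, not the operational OCO-2 criteria:

    ```python
    def is_clear(abp_dp_hpa, idp_co2_ratio, idp_h2o_ratio,
                 dp_max=25.0, ratio_lo=0.95, ratio_hi=1.05):
        """Pass a sounding only if both preprocessors agree it is cloud-free.

        abp_dp_hpa   -- |retrieved - prior| surface pressure difference (hPa);
                        a large value implies a cloud-modified photon path.
        idp_*_ratio  -- band-to-band ratios of retrieved CO2 / H2O; values far
                        from 1 imply scattering contamination.
        All thresholds here are placeholders for illustration only.
        """
        abp_ok = abp_dp_hpa <= dp_max
        idp_ok = (ratio_lo <= idp_co2_ratio <= ratio_hi and
                  ratio_lo <= idp_h2o_ratio <= ratio_hi)
        return abp_ok and idp_ok

    print(is_clear(8.0, 1.01, 0.99))    # True: both screens pass
    print(is_clear(60.0, 1.01, 0.99))   # False: ABP flags a path-length change
    ```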

  12. Radionuclide identification algorithm for organic scintillator-based radiation portal monitor

    NASA Astrophysics Data System (ADS)

    Paff, Marc Gerrit; Di Fulvio, Angela; Clarke, Shaun D.; Pozzi, Sara A.

    2017-03-01

    We have developed an algorithm for on-the-fly radionuclide identification for radiation portal monitors using organic scintillation detectors. The algorithm was demonstrated on experimental data acquired with our pedestrian portal monitor on moving special nuclear material and industrial sources at a purpose-built radiation portal monitor testing facility. The experimental data also included common medical isotopes. The algorithm takes the power spectral density of the cumulative distribution function of the measured pulse height distributions and matches it to reference spectra using a spectral angle mapper. F-score analysis showed that the new algorithm exhibited significant performance improvements over previously implemented radionuclide identification algorithms for organic scintillators. Reliable on-the-fly radionuclide identification would help portal monitor operators more effectively screen out the hundreds of thousands of nuisance alarms they encounter annually due to recent nuclear-medicine patients and cargo containing naturally occurring radioactive material. Portal monitor operators could instead focus on the rare but potentially high-impact incidents of nuclear and radiological material smuggling that portal monitors are intended to detect.
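
    The spectral angle mapper itself is a simple operation: compute the angle between the measured feature vector and each reference spectrum, and report the reference with the smallest angle. A minimal sketch, with made-up three-element vectors standing in for the PSD features described above:

    ```python
    import numpy as np

    def spectral_angle(a, b):
        """Angle (radians) between feature vectors a and b."""
        cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        return np.arccos(np.clip(cos, -1.0, 1.0))

    def identify(measured, references):
        """Return the reference label with the smallest spectral angle."""
        return min(references, key=lambda k: spectral_angle(measured, references[k]))

    refs = {"Cs-137": np.array([0.9, 0.4, 0.1]),   # illustrative feature vectors
            "Co-60":  np.array([0.5, 0.7, 0.5])}
    print(identify(np.array([0.85, 0.45, 0.15]), refs))   # -> Cs-137
    ```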

  13. Adolescent THC exposure does not sensitize conditioned place preferences to subthreshold d-amphetamine in male and female rats.

    PubMed

    Keeley, Robin J; Bye, Cameron; Trow, Jan; McDonald, Robert J

    2018-01-01

    The acute effects of marijuana consumption on brain physiology and behaviour are well documented, but the long-term effects of its chronic use are less well known. Chronic marijuana use during adolescence is of particular interest, given that the majority of individuals first use marijuana during this developmental stage, and adolescent marijuana use is thought to increase the susceptibility to abusing other drugs later in life. It is possible that marijuana use during critical periods in adolescence could lead to increased sensitivity to other drugs of abuse later on. To test this, we chronically administered Δ9-tetrahydrocannabinol (THC) to male and female Long-Evans (LER) and Wistar (WR) rats directly after puberty onset. Rats matured to postnatal day 90 before being exposed to a conditioned place preference (CPP) task. A subthreshold dose of d-amphetamine, found not to induce place preference in drug-naïve rats, was used as the unconditioned stimulus. The effect of d-amphetamine on neural activity was inferred by quantifying c-fos expression in the nucleus accumbens and dorsal hippocampus following CPP training. Chronic exposure to THC post-puberty had no potentiating effect on the ability of a subthreshold dose of d-amphetamine to induce CPP, and no differences in c-fos expression were observed. These results show that chronic exposure to THC during puberty did not increase sensitivity to d-amphetamine in adult LER and WR rats, supporting the concept that THC may not sensitize the response to all drugs of abuse.

  14. Fingerprinting the type of line edge roughness

    NASA Astrophysics Data System (ADS)

    Fernández Herrero, A.; Pflüger, M.; Scholze, F.; Soltwisch, V.

    2017-06-01

    Lamellar gratings are widely used diffractive optical elements and are prototypes of structural elements in integrated electronic circuits. EUV scatterometry is very sensitive to structure details and imperfections, which makes it suitable for the characterization of nanostructured surfaces. Compared to X-ray methods, EUV scattering allows for steeper angles of incidence, which is highly preferable for the investigation of small measurement fields on semiconductor wafers. For the control of the lithographic manufacturing process, a rapid in-line characterization of nanostructures is indispensable. Numerous studies on the determination of regular geometry parameters of lamellar gratings from optical and extreme ultraviolet (EUV) scattering have also investigated the impact of roughness on the respective results. The challenge is to appropriately model the influence of structure roughness on the diffraction intensities used for the reconstruction of the surface profile. The impact of roughness has previously been studied analytically, but only for gratings with periodic pseudoroughness, because of practical restrictions on the size of the computational domain. Our investigation aims at a better understanding of the scattering caused by line roughness. We designed a set of nine lamellar Si gratings to be studied by EUV scatterometry: one reference grating with no artificial roughness added, four gratings with a periodic roughness distribution (two with prevailing line edge roughness (LER) and two with line width roughness (LWR)), and four gratings with a stochastic roughness distribution (two with LER and two with LWR). We show that the type of line roughness has a strong impact on the angular distribution of the diffuse scatter. Our experimental results are not well described by the present modelling approach based on small, periodically repeated domains.

  15. Validity of 24-h recalls in (pre-)school aged children: comparison of proxy-reported energy intakes with measured energy expenditure.

    PubMed

    Börnhorst, C; Bel-Serrat, S; Pigeot, I; Huybrechts, I; Ottavaere, C; Sioen, I; De Henauw, S; Mouratidou, T; Mesana, M I; Westerterp, K; Bammann, K; Lissner, L; Eiben, G; Pala, V; Rayson, M; Krogh, V; Moreno, L A

    2014-02-01

    Little is known about the validity of repeated 24-h dietary recalls (24-HDR) as a measure of total energy intake (EI) in young children. This study aimed to evaluate the validity of proxy-reported EI by comparison with total energy expenditure (TEE) measured by the doubly labeled water (DLW) technique. The agreement between EI and TEE was investigated in 36 (47.2% boys) children aged 4-10 years from Belgium and Spain using subgroup analyses and Bland-Altman plots. Low-energy-reporters (LER), adequate-energy-reporters (AER) and high-energy-reporters (HER) were defined from the ratio of EI over TEE by application of age- and sex-specific cut-off values. There was good agreement between mean EI (1500 kcal/day) and mean TEE (1523 kcal/day) at the group level, though in single children, i.e. at the individual level, large differences were observed. Almost perfect agreement between EI and TEE was observed in thin/normal weight children (EI: 1511 kcal/day; TEE: 1513 kcal/day). Even in overweight/obese children, the mean difference between EI and TEE was only -86 kcal/day. Among the participants, 28 (78%) were classified as AER, five (14%) as HER and three (8%) as LER. Two proxy-reported 24-HDRs were found to be a valid instrument to assess EI at the group level but not at the individual level. Copyright © 2013 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.
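
    A sketch of the reporter classification described above, with fixed placeholder cut-offs on the EI/TEE ratio (the study applied age- and sex-specific cut-off values, which are not reproduced here):

    ```python
    def classify_reporter(ei_kcal, tee_kcal, lo=0.79, hi=1.21):
        """Classify a child as low-, adequate- or high-energy-reporter.

        lo and hi are illustrative cut-offs on the EI/TEE ratio; the study
        used age- and sex-specific values rather than fixed ones.
        """
        ratio = ei_kcal / tee_kcal
        if ratio < lo:
            return "LER"
        if ratio > hi:
            return "HER"
        return "AER"

    print(classify_reporter(1500, 1523))   # -> AER
    print(classify_reporter(900, 1523))    # -> LER
    ```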

  16. [Design and validation of the scales for the assessment of the psychological impact of past life events: the role of ruminative thought and personal growth].

    PubMed

    Fernández-Fernández, Virginia; Márquez-González, María; Losada-Baltar, Andrés; García, Pablo E; Romero-Moreno, Rosa

    2013-01-01

    Older people's emotional distress is often related to rumination processes focused on vital events that occurred earlier in their lives. The specific coping strategies displayed to face those events may help explain older adults' current well-being: they may perceive that they have obtained personal growth from those events and/or they may show a tendency to have intrusive thoughts about them. This paper describes the development and analysis of the psychometric properties of the Scales for the Assessment of the Psychological Impact of Past Life Events (SAPIPLE): the past life events-occurrence scale (LE-O), the ruminative thought scale (LE-R) and the personal growth scale (LE-PG). Participants were 393 community-dwelling older adults (mean age = 71.5 years; SD = 6.9). In addition to the SAPIPLE scales, depressive symptomatology, anxiety, psychological well-being, life satisfaction, physical function and vitality were assessed. Analysis of inter-rater agreement suggests the presence of two factors in the LE-O: positive and negative vital events. Confirmatory factor analysis (CFA) supported this two-dimensional structure for both the LE-R and the LE-PG. Good internal consistency indexes were obtained for each scale and subscale, as well as good criterion and concurrent validity indexes. Both ruminative thoughts about past life events and personal growth following those events are related to older adults' current well-being. The SAPIPLE presents good psychometric properties that justify its use with older people. Copyright © 2012 SEGG. Published by Elsevier Espana. All rights reserved.

  17. EUV microexposures at the ALS using the 0.3-NA MET projectionoptics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naulleau, Patrick; Goldberg, Kenneth A.; Anderson, Erik

    2005-09-01

    The recent development of high numerical aperture (NA) EUV optics such as the 0.3-NA Micro Exposure Tool (MET) optic has given rise to a new class of ultra-high-resolution microexposure stations. One such printing station has been developed and implemented at Lawrence Berkeley National Laboratory's Advanced Light Source. This flexible printing station utilizes a programmable coherence illuminator providing real-time pupil-fill control for advanced EUV resist and mask development. The Berkeley exposure system's programmable illuminator enables several unique capabilities. Using dipole illumination out to σ = 1, the Berkeley tool supports equal-line-space printing down to 12 nm, well beyond the capabilities of similar tools. Using small-sigma illumination combined with the central obscuration of the MET optic enables the system to print feature sizes that are twice as small as those coded on the mask. In this configuration, the effective 10x demagnification for equal lines and spaces reduces the mask fabrication burden for ultra-high-resolution printing. The illuminator facilitates coherence studies such as the impact of coherence on line-edge roughness (LER) and flare. Finally, the illuminator enables novel print-based aberration monitoring techniques as described elsewhere in these proceedings. Here we describe the capabilities of the new MET printing station and present system characterization results. Moreover, we present the latest printing results obtained in experimental resists. Limited by the availability of high-resolution photoresists, equal line-space printing down to 25 nm has been demonstrated, as well as isolated line printing down to 29 nm with an LER approaching 3 nm.

  18. Total-body photography in skin cancer screening: the clinical utility of standardized imaging.

    PubMed

    Rosenberg, Alexandra; Meyerle, Jon H

    2017-05-01

    Early detection of skin cancer is essential to reducing morbidity and mortality from both melanoma and nonmelanoma skin cancers. Total-body skin examinations (TBSEs) may improve early detection of malignant melanomas (MMs) but are controversial due to the poor quality of data available to establish a mortality benefit from skin cancer screening. Total-body photography (TBP) promises to provide a way forward by lowering the costs of dermatologic screening while simultaneously leveraging technology to increase patient access to dermatologic care. Standardized TBP also offers the ability for dermatologists to work synergistically with modern computer technology involving algorithms capable of analyzing high-quality images to flag concerning lesions that may require closer evaluation. On a population level, inexpensive TBP has the potential to increase access to skin cancer screening and it has several specific applications in a military population. The utility of standardized TBP is reviewed in the context of skin cancer screening and teledermatology.

  19. Macromolecular target prediction by self-organizing feature maps.

    PubMed

    Schneider, Gisbert; Schneider, Petra

    2017-03-01

    Rational drug discovery would greatly benefit from a more nuanced appreciation of the activity of pharmacologically active compounds against a diverse panel of macromolecular targets. Already, computational target-prediction models assist medicinal chemists in library screening, de novo molecular design, optimization of active chemical agents, drug re-purposing, in the spotting of potential undesired off-target activities, and in the 'de-orphaning' of phenotypic screening hits. The self-organizing map (SOM) algorithm has been employed successfully for these and other purposes. Areas covered: The authors recapitulate contemporary artificial neural network methods for macromolecular target prediction, and present the basic SOM algorithm at a conceptual level. Specifically, they highlight consensus target-scoring by the employment of multiple SOMs, and discuss the opportunities and limitations of this technique. Expert opinion: Self-organizing feature maps represent a straightforward approach to ligand clustering and classification. Some of the appeal lies in their conceptual simplicity and broad applicability domain. Despite known algorithmic shortcomings, this computational target prediction concept has been proven to work in prospective settings with high success rates. It represents a prototypic technique for future advances in the in silico identification of the modes of action and macromolecular targets of bioactive molecules.
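
    At a conceptual level, SOM training is a competitive-learning loop: find the best-matching map unit for an input and pull that unit, and its grid neighbours, toward the input. A minimal sketch on random data (grid size, learning rate and neighbourhood width are arbitrary choices, not values from the review):

    ```python
    import numpy as np

    def train_som(data, grid=(5, 5), n_iter=500, lr=0.5, sigma=1.5, seed=0):
        """Train a 2-D self-organizing map on data (n_samples x n_features)."""
        rng = np.random.default_rng(seed)
        w = rng.random((grid[0], grid[1], data.shape[1]))    # codebook vectors
        gy, gx = np.mgrid[0:grid[0], 0:grid[1]]
        for t in range(n_iter):
            x = data[rng.integers(len(data))]
            d = ((w - x) ** 2).sum(axis=2)
            by, bx = np.unravel_index(d.argmin(), d.shape)   # best-matching unit
            frac = 1.0 - t / n_iter                          # linear decay schedule
            h = np.exp(-((gy - by) ** 2 + (gx - bx) ** 2) / (2 * (sigma * frac) ** 2))
            w += (lr * frac) * h[:, :, None] * (x - w)       # neighbourhood update
        return w

    som = train_som(np.random.default_rng(1).random((200, 8)))
    print(som.shape)   # (5, 5, 8): a 5x5 map of 8-dimensional prototypes
    ```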

  20. Proposal for An Algorithm for Screening for Undernutrition in Hospitalized Children.

    PubMed

    Huysentruyt, Koen; De Schepper, Jean; Bontems, Patrick; Alliet, Philippe; Peeters, Ellen; Roelants, Mathieu; Van Biervliet, Stephanie; Hauser, Bruno; Vandenplas, Yvan

    2016-11-01

    The prevalence of disease-related undernutrition in hospitalized children has not decreased significantly in the last decades in Europe. A recent large multicentric European study reported a percentage of underweight children ranging across countries from 4.0% to 9.3%. Nutritional screening has been put forward as a strategy to detect and prevent undernutrition in hospitalized children: it allows timely implementation of adequate nutritional support and prevents further nutritional deterioration. In this article, a hands-on practical guideline for the implementation of a nutritional care program in hospitalized children is provided. The focus of the article is the difference between nutritional status (anthropometry with or without additional technical investigations) at admission and nutritional risk (the risk of needing a nutritional intervention or of nutritional deterioration during the hospital stay). Based on the quality control circle principle of Deming, a nutritional care algorithm with detailed instructions specific to the pediatric population was developed, and its implementation in daily practice is proposed. Further research is required to prove the applicability and merit of this algorithm. It can, however, serve as a basis for European or even wider guidelines.

  1. Potential of cancer screening with serum surface-enhanced Raman spectroscopy and a support vector machine

    NASA Astrophysics Data System (ADS)

    Li, S. X.; Zhang, Y. J.; Zeng, Q. Y.; Li, L. F.; Guo, Z. Y.; Liu, Z. M.; Xiong, H. L.; Liu, S. H.

    2014-06-01

    Cancer is one of the most common diseases threatening human health. The ability to screen individuals for malignant tumours with only a blood sample would be greatly advantageous for early diagnosis and intervention. This study explores the possibility of discriminating between cancer patients and normal subjects using serum surface-enhanced Raman spectroscopy (SERS) and a support vector machine (SVM) applied to a peripheral blood sample. A total of 130 blood samples were obtained from patients with liver cancer, colonic cancer, esophageal cancer, nasopharyngeal cancer and gastric cancer, together with 113 blood samples from normal volunteers. Several diagnostic models were built from the serum SERS spectra using SVM and principal component analysis (PCA) techniques. The results show that a diagnostic accuracy of 85.5% is obtained with a PCA algorithm, while a diagnostic accuracy of 95.8% is achieved using a radial basis function (RBF) PCA-SVM method. The results prove that the RBF-kernel PCA-SVM technique is superior to PCA and conventional SVM (C-SVM) algorithms in classifying serum SERS spectra. The study demonstrates that serum SERS, in combination with SVM techniques, has great potential for screening patients with any solid malignant tumour through a peripheral blood sample.
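
    A sketch of an RBF-kernel PCA-SVM pipeline of the kind described, using scikit-learn with synthetic spectra standing in for the SERS data (the component count and SVM parameters are illustrative):

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0.0, 1.0, (130, 600)),    # stand-ins for cancer spectra
                   rng.normal(0.3, 1.0, (113, 600))])   # stand-ins for normal spectra
    y = np.array([1] * 130 + [0] * 113)

    model = make_pipeline(StandardScaler(),
                          PCA(n_components=20),          # illustrative component count
                          SVC(kernel="rbf", C=10, gamma="scale"))
    print(cross_val_score(model, X, y, cv=5).mean())     # cross-validated accuracy
    ```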

  2. Emerging from the database shadows: characterizing undocumented immigrants in a large cohort of HIV-infected persons.

    PubMed

    Ross, Jonathan; Hanna, David B; Felsen, Uriel R; Cunningham, Chinazo O; Patel, Viraj V

    2017-12-01

    Little is known about how HIV affects undocumented immigrants despite social and structural factors that may place them at risk of poor HIV outcomes. Our understanding of the clinical epidemiology of HIV-infected undocumented immigrants is limited by the challenges of determining undocumented immigration status in large data sets. We developed an algorithm to predict undocumented status using social security number (SSN) and insurance data. We retrospectively applied this algorithm to a cohort of HIV-infected adults receiving care at a large urban healthcare system who attended at least one HIV-related outpatient visit from 1997 to 2013, classifying patients as "screened undocumented" or "documented". We then reviewed the medical records of screened undocumented patients, classifying those whose records contained evidence of undocumented status as "undocumented per medical chart" (charted undocumented). Bivariate measures of association were used to identify demographic and clinical characteristics associated with undocumented immigrant status. Of 7593 patients, 205 (2.7%) were classified as undocumented by the algorithm. Compared to documented patients, undocumented patients were younger at entry to care (mean 38.5 years vs. 40.6 years, p < 0.05), less likely to be female (33.2% vs. 43.1%, p < 0.01), less likely to report injection drug use as their primary HIV risk factor (3.4% vs. 18.0%, p < 0.001), and had lower median CD4 count at entry to care (288 vs. 339 cells/mm³, p < 0.01). After medical record review, we re-classified 104 patients (50.7%) as charted undocumented. Demographic and clinical characteristics of charted undocumented did not differ substantially from screened undocumented. Our algorithm allowed us to identify and clinically characterize undocumented immigrants within an HIV-infected population, though it overestimated the prevalence of patients who were undocumented.

  3. Implementing Adolescent Screening, Brief Intervention, and Referral to Treatment (SBIRT) Education in a Pediatric Residency Curriculum.

    PubMed

    Schram, Patricia; Harris, Sion K; Van Hook, Shari; Forman, Sara; Mezzacappa, Enrico; Pavlyuk, Roman; Levy, Sharon

    2015-01-01

    Screening, brief intervention, and referral to treatment (SBIRT) is recommended as part of routine health care for adolescents as well as adults. In an effort to promote universal SBIRT, the Substance Abuse and Mental Health Services Administration awarded funding to residency programs to develop and implement SBIRT education and training. Our project focused on creating scientifically based, developmentally appropriate strategies and teaching materials for the adolescent age range. This paper describes curriculum development and implementation and presents evaluation data. Pediatric and child psychiatry residents were trained. The training consisted of 4 activities: (1) case-based teaching modules, (2) role-play of motivational interviewing and brief interventions, (3) mock interviews with trained adolescents, and (4) supervised "hands-on" screening and brief interventions. Main outcome measures included trainee satisfaction, SBIRT knowledge, perceived self-efficacy, and self- and observer-reported use of the SBIRT algorithm. Among 150 total participants completing the SBIRT training modules, nearly all (92.3%) were satisfied/very satisfied with the training modules. Knowledge accuracy immediately post-training was high, but declined significantly by the end of the first residency year, with little change across subsequent years of residency. Confidence ratings also declined over time. Use of the SBIRT algorithm during the Adolescent Medicine rotation was high according to trainee self-report and faculty observer report. We found evidence of training satisfaction, increased confidence in talking to adolescents about substance use, and widespread use of recommended practices immediately following training. Use of a highly structured algorithm to guide practice, together with simple, highly structured brief interventions, was a successful training approach, as residents self-reported accurate use of the SBIRT algorithm immediately after training. Knowledge and self-confidence declined over time. It is possible that "booster" sessions and ongoing opportunities to review materials could help residents retain knowledge and skills.

  4. Measurement of fecal elastase improves performance of newborn screening for cystic fibrosis.

    PubMed

    Barben, Juerg; Rueegg, Corina S; Jurca, Maja; Spalinger, Johannes; Kuehni, Claudia E

    2016-05-01

    The aim of newborn screening (NBS) for CF is to detect children with 'classic' CF, where early treatment is possible and improves prognosis. Children with an inconclusive CF diagnosis (CFSPID) should not be detected, as there is no evidence for improvement through early treatment. No algorithm in current NBS guidelines explains what to do when the sweat test (ST) fails. This study compares the performance of three different algorithms for further diagnostic evaluation when the first ST is unsuccessful, with regard to the numbers of children detected with CF and CFSPID, and the time until a definite diagnosis. In Switzerland, CF-NBS was introduced in January 2011 using an IRT-DNA-IRT algorithm followed by a ST. In children in whom ST was not possible (no or insufficient sweat), 3 different protocols were applied between 2011 and 2014: in 2011, ST was repeated until it was successful (protocol A); in 2012, we proceeded directly to diagnostic DNA testing (protocol B); and in 2013-2014, fecal elastase (FE) was measured in the stool in order to detect pancreatic insufficiency needing immediate treatment (protocol C). The ratio CF:CFSPID was 7:1 (27/4) with protocol A, 2:1 (22/10) with protocol B, and 14:1 (54/4) with protocol C. The mean time to definite diagnosis was significantly shorter with protocol C (33 days) compared to protocol A or B (42 and 40 days; p = 0.014 compared to A, and p = 0.036 compared to B). The algorithm for the diagnostic part of newborn screening used in the CF centers is important and affects the performance of a CF-NBS program with regard to the ratio CF:CFSPID and the time until definite diagnosis. Our results suggest including FE after initial sweat test failure in the CF-NBS guidelines, to keep the proportion of CFSPID low and the time until definite diagnosis short. Copyright © 2016 European Cystic Fibrosis Society. Published by Elsevier B.V. All rights reserved.

  5. Screening for Pancreatic Cancer

    PubMed Central

    Brand, Randall E.

    2007-01-01

    Despite improvements in the clinical and surgical management of pancreatic cancer, limited strides have been made in the early detection of this highly lethal malignancy. The majority of localized pancreatic tumors are asymptomatic, and the recognized presenting symptoms of pancreatic adenocarcinoma are often vague and heterogeneous in nature. These factors, coupled with the lack of a sensitive and noninvasive screening method, have made population-based screening for pancreatic cancer impossible. Nevertheless, at least two large institutions have performed multimodality screening protocols for individuals at high risk of pancreatic cancer based on genetic predisposition and strong family history. Abnormalities noted during these screening protocols prompted further investigation or surgery that resulted in the discovery of benign, potentially malignant, and malignant pancreatic lesions. In addition to ductal epithelial pancreatic intraepithelial neoplasia, greater sensitivity has recently been achieved in the identification and characterization of precancerous mucinous pancreatic tumors. Advancements in proteomics and DNA microarray technology may confirm serum-based biomarkers that could be incorporated into future screening algorithms for pancreatic cancer. PMID:21960811

  6. COMDECOM: predicting the lifetime of screening compounds in DMSO solution.

    PubMed

    Zitha-Bovens, Emrin; Maas, Peter; Wife, Dick; Tijhuis, Johan; Hu, Qian-Nan; Kleinöder, Thomas; Gasteiger, Johann

    2009-06-01

    The technological evolution of the 1990s in both combinatorial chemistry and high-throughput screening created the demand for rapid access to the compound deck to support the screening process. The common strategy within the pharmaceutical industry is to store the screening library in DMSO solution. Several studies have shown that a percentage of these compounds decompose in solution, varying from a few percent of the total to a substantial part of the library. In the COMDECOM (COMpound DECOMposition) project, the stability of screening compounds in DMSO solution is monitored in an accelerated thermal, hydrolytic, and oxidative decomposition program. A large database of stability data has been collected, and from this database, a predictive model is being developed. The aim of this program is to build an algorithm that can flag compounds that are likely to decompose, information that is considered to be of utmost importance, e.g., in the compound acquisition process, when evaluating screening results of library compounds, and in the determination of optimal storage conditions.

  7. Performance of the Alere Determine™ HIV-1/2 Ag/Ab Combo Rapid Test with algorithm-defined acute HIV-1 infection specimens.

    PubMed

    Parker, Monica M; Bennett, S Berry; Sullivan, Timothy J; Fordan, Sally; Wesolowski, Laura G; Wroblewski, Kelly; Gaynor, Anne M

    2018-05-14

    The capacity of HIV Antigen/Antibody (Ag/Ab) immunoassays (IA) to detect HIV-1 p24 antigen has resulted in improved detection of HIV-1 infections in comparison to Ab-only screening assays. Since its introduction in the US, studies have shown that the Determine HIV-1/2 Ag/Ab Combo assay (Determine Ag/Ab) detects HIV infection earlier than laboratory-based IgM/IgG-sensitive IAs, but its sensitivity for HIV-1 p24 Ag detection is reduced compared to laboratory-based Ag/Ab assays. However, further evaluation is needed to assess its capacity to detect acute HIV-1 infection. To assess the performance of Determine Ag/Ab in serum from acute HIV-1 infections. Select serum specimens that screened reactive on a laboratory-based Ag/Ab IA or IgM/IgG Ab-only IA, with a negative or indeterminate supplemental antibody test and detectable HIV-1 RNA were retrospectively tested with Determine Ag/Ab. Results were compared with those of the primary screening immunoassay to evaluate concordance within this set of algorithm-defined acute infections. Of 159 algorithm-defined acute HIV-1 specimens, Determine Ag/Ab was reactive for 105 resulting in 66.0% concordance. Of 125 that were initially detected by a laboratory-based Ag/Ab IA, 81 (64.8%) were reactive by Determine Ag/Ab. A total of 34 acute specimens were initially detected by a laboratory-based IgM/IgG Ab-only IA and 24 (70.6%) of those were reactive by Determine Ag/Ab. Due to their enhanced sensitivity, laboratory-based Ag/Ab IAs continue to be preferred over the Determine Ag/Ab as the screening method used by laboratories conducting HIV diagnostic testing on serum and plasma specimens. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.

  8. In Silico Screening Based on Predictive Algorithms as a Design Tool for Exon Skipping Oligonucleotides in Duchenne Muscular Dystrophy

    PubMed Central

    Echigoya, Yusuke; Mouly, Vincent; Garcia, Luis; Yokota, Toshifumi; Duddy, William

    2015-01-01

    The use of antisense ‘splice-switching’ oligonucleotides to induce exon skipping represents a potential therapeutic approach to various human genetic diseases. It has achieved greatest maturity in exon skipping of the dystrophin transcript in Duchenne muscular dystrophy (DMD), for which several clinical trials are completed or ongoing, and a large body of data exists describing tested oligonucleotides and their efficacy. The rational design of an exon skipping oligonucleotide involves the choice of an antisense sequence, usually between 15 and 32 nucleotides, targeting the exon that is to be skipped. Although parameters describing the target site can be computationally estimated and several have been identified to correlate with efficacy, methods to predict efficacy are limited. Here, an in silico pre-screening approach is proposed, based on predictive statistical modelling. Previous DMD data were compiled together and, for each oligonucleotide, some 60 descriptors were considered. Statistical modelling approaches were applied to derive algorithms that predict exon skipping for a given target site. We confirmed (1) the binding energetics of the oligonucleotide to the RNA, and (2) the distance in bases of the target site from the splice acceptor site, as the two most predictive parameters, and we included these and several other parameters (while discounting many) in an in silico screening process, based on their capacity to predict high or low efficacy in phosphorodiamidate morpholino oligomers (89% correctly predicted) and/or 2′-O-methyl RNA oligonucleotides (76% correctly predicted). Predictions correlated strongly with in vitro testing for sixteen de novo PMO sequences targeting various positions on DMD exons 44 (R² = 0.89) and 53 (R² = 0.89), one of which represents a potential novel candidate for clinical trials. We provide these algorithms together with a computational tool that facilitates screening to predict exon skipping efficacy at each position of a target exon. PMID:25816009
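
    A sketch of a two-descriptor predictive model of the kind described, fitting a logistic regression to hypothetical values of the two confirmed parameters (binding energy and distance from the splice acceptor site); the published models used many more descriptors and different fitting procedures:

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hypothetical training data per oligonucleotide:
    # [binding energy (kcal/mol), distance from splice acceptor (bases)],
    # with label 1 = efficient exon skipping.
    X = np.array([[-38.0, 15], [-35.5, 40], [-30.0, 120], [-28.5, 150],
                  [-40.2, 25], [-26.0, 200], [-37.0, 60], [-27.5, 180]])
    y = np.array([1, 1, 0, 0, 1, 0, 1, 0])

    clf = LogisticRegression(max_iter=1000).fit(X, y)
    candidate = [[-36.0, 30]]
    print(clf.predict(candidate))         # predicted efficacy class
    print(clf.predict_proba(candidate))   # class probabilities
    ```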

  9. Computer-aided diagnosis workstation and teleradiology network system for chest diagnosis using the web medical image conference system with a new information security solution

    NASA Astrophysics Data System (ADS)

    Satoh, Hitoshi; Niki, Noboru; Eguchi, Kenji; Ohmatsu, Hironobu; Kaneko, Masahiro; Kakinuma, Ryutaro; Moriyama, Noriyuki

    2010-03-01

    Diagnostic MDCT imaging requires a considerable number of images to be read, and there is a shortage of doctors available to interpret medical images in Japan. Against this background, we have provided diagnostic assistance to medical screening specialists by developing a lung cancer screening algorithm that automatically detects suspected lung cancers in helical CT images, a coronary artery calcification screening algorithm that automatically detects suspected coronary artery calcification, and a vertebral body analysis algorithm for quantitative evaluation of osteoporosis. We have also developed a teleradiology network system based on a web medical image conference system; in such a network, the security of the information exchanged is a critical concern. Our teleradiology network system allows medical institutions at remote locations to hold web medical image conferences, and we completed a basic proof-of-concept experiment of the web medical image conference system with an information security solution. The screen of the web medical image conference system can be shared by two or more conference terminals at the same time, and opinions can be exchanged using a camera and a microphone connected to the workstation that incorporates the diagnostic assistance methods. Biometric face authentication at the teleradiology site secures file encryption and login, and the privacy and information security technology of our solution ensures compliance with Japanese regulations, so that patients' private information is protected. Based on these diagnostic assistance methods, we have developed a new computer-aided workstation and a new teleradiology network that can display suspected lesions three-dimensionally in a short time. The results of this study indicate that our film-less radiological information system, combining the computer-aided diagnosis workstation with the teleradiology network, can increase diagnostic speed and accuracy and improve the security of medical information.

  10. Training set optimization and classifier performance in a top-down diabetic retinopathy screening system

    NASA Astrophysics Data System (ADS)

    Wigdahl, J.; Agurto, C.; Murray, V.; Barriga, S.; Soliz, P.

    2013-03-01

    Diabetic retinopathy (DR) affects more than 4.4 million Americans aged 40 and over. Automatic screening for DR has been shown to be an efficient and cost-effective way to lower the burden on the healthcare system, by triaging diabetic patients and ensuring timely care for those presenting with DR. Several supervised algorithms have been developed to detect pathologies related to DR, but little work has been done in determining the size of the training set that optimizes an algorithm's performance. In this paper we analyze the effect of the training sample size on the performance of a top-down DR screening algorithm for different types of statistical classifiers. Results are based on partial least squares (PLS), support vector machine (SVM), k-nearest neighbor (kNN), and Naïve Bayes classifiers. Our dataset consisted of digital retinal images collected from a total of 745 cases (595 controls, 150 with DR). We varied the number of normal controls in the training set, while keeping the number of DR samples constant, and repeated the procedure 10 times using randomized training sets to avoid bias. Results show increasing performance in terms of area under the ROC curve (AUC) as the size of the training set increased, with similar trends for each of the classifiers. Of these, PLS and kNN had the highest average AUC. Lower standard deviation and a flattening of the AUC curve give evidence that there is a limit to the learning ability of the classifiers and an optimal number of cases to train on.
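
    A sketch of the kind of learning-curve experiment described, with synthetic data and a kNN classifier: the number of controls in the training set is varied while all DR samples are kept, and AUC is measured on a held-out set (the paper's 10 randomized repetitions are reduced to a single pass here):

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0.0, 1.0, (595, 10)),   # synthetic control features
                   rng.normal(0.7, 1.0, (150, 10))])  # synthetic DR features
    y = np.array([0] * 595 + [1] * 150)
    Xtr, Xte, ytr, yte = train_test_split(X, y, stratify=y, random_state=0)

    for n_controls in (50, 150, 300, 440):
        ctrl = np.flatnonzero(ytr == 0)[:n_controls]  # vary the controls only
        dr = np.flatnonzero(ytr == 1)                 # keep every DR sample
        idx = np.concatenate([ctrl, dr])
        clf = KNeighborsClassifier(n_neighbors=5).fit(Xtr[idx], ytr[idx])
        auc = roc_auc_score(yte, clf.predict_proba(Xte)[:, 1])
        print(n_controls, round(auc, 3))
    ```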

  11. The Applications of Genetic Algorithms in Medicine.

    PubMed

    Ghaheri, Ali; Shoar, Saeed; Naderan, Mohammad; Hoseini, Sayed Shahabuddin

    2015-11-01

    A great wealth of information is hidden amid medical research data that in some cases cannot be easily analyzed, if at all, using classical statistical methods. Inspired by nature, metaheuristic algorithms have been developed to offer optimal or near-optimal solutions to complex data analysis and decision-making tasks in a reasonable time. Due to their powerful features, metaheuristic algorithms have frequently been used in other fields of science. In medicine, however, these algorithms are not well known to physicians, who may well benefit from applying them to solve complex medical problems. Therefore, in this paper, we introduce the genetic algorithm and its applications in medicine. The use of the genetic algorithm has promising implications in various medical specialties including radiology, radiotherapy, oncology, pediatrics, cardiology, endocrinology, surgery, obstetrics and gynecology, pulmonology, infectious diseases, orthopedics, rehabilitation medicine, neurology, pharmacotherapy, and health care management. This review introduces the applications of the genetic algorithm in disease screening, diagnosis, treatment planning, pharmacovigilance, prognosis, and health care management, and enables physicians to envision possible applications of this metaheuristic method in their medical career.

  13. Application of furniture images selection based on neural network

    NASA Astrophysics Data System (ADS)

    Wang, Yong; Gao, Wenwen; Wang, Ying

    2018-05-01

    In constructing a database of two million furniture images, low data quality is a central problem. To address it, a combination of a convolutional neural network (CNN) and a metric learning algorithm is proposed, which makes it possible to quickly and accurately remove duplicate and irrelevant samples from the furniture image database. This overcomes the shortcomings of existing image screening methods, which are complex, insufficiently accurate, and time-consuming. After data quality is improved in this way, the deep learning algorithm achieves excellent image matching ability in actual furniture retrieval applications.
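
    A sketch of the screening idea: embed every image with a CNN, then flag near-duplicates by cosine distance in the embedding space. Random vectors stand in for CNN features here, and the distance threshold is an illustrative assumption:

    ```python
    import numpy as np

    def find_duplicates(embeddings, threshold=0.1):
        """Return index pairs whose normalized embeddings nearly coincide."""
        e = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
        dist = 1.0 - e @ e.T                  # pairwise cosine distances
        i, j = np.triu_indices(len(e), k=1)   # upper-triangle pairs only
        near = dist[i, j] < threshold
        return list(zip(i[near].tolist(), j[near].tolist()))

    rng = np.random.default_rng(0)
    feats = rng.normal(size=(6, 128))               # stand-ins for CNN embeddings
    feats[3] = feats[1] + rng.normal(0, 0.01, 128)  # plant a near-duplicate
    print(find_duplicates(feats))                   # -> [(1, 3)]
    ```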

  14. Self-Organizing Maps for In Silico Screening and Data Visualization.

    PubMed

    Digles, Daniela; Ecker, Gerhard F

    2011-10-01

    Self-organizing maps, which are unsupervised artificial neural networks, have become a very useful tool in a wide range of disciplines, including medicinal chemistry. Here, we focus on two applications of self-organizing maps: their use for in silico screening, and for clustering and visualisation of large datasets. Additionally, the importance of parameter selection is discussed and some modifications to the original algorithm are summarised. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Colorectal cancer screening: An updated review of the available options.

    PubMed

    Issa, Iyad A; Noureddine, Malak

    2017-07-28

    Colorectal cancer (CRC) is a significant cause of morbidity and mortality worldwide. However, colon cancer incidence and mortality have been declining over the past decade owing to the adoption of effective screening programs. Nevertheless, in some parts of the world, CRC incidence and mortality remain on the rise, likely due to factors including a "westernized" diet, lifestyle, and lack of health-care infrastructure and resources. Participation in and adherence to different national screening programs remain obstacles limiting the achievement of screening goals. Different modalities are available, ranging from stool-based tests to radiology and endoscopy, with varying sensitivity and specificity. However, the availability of these tests is limited to areas with high economic resources. Recently, the FDA approved a blood-based test (Epi proColon®) for CRC screening. This blood-based test may serve to increase participation and adherence rates, thereby increasing colon cancer detection and prevention. This article will discuss various CRC screening tests with a particular focus on the data regarding the newly approved blood test. Finally, we will propose an algorithm for a simple, cost-effective CRC screening program.

  16. The clinical implementation of primary HPV screening.

    PubMed

    Mariani, Luciano; Igidbashian, Sarah; Sandri, Maria Teresa; Vici, Patrizia; Landoni, Fabio

    2017-03-01

    To evaluate, from a gynecology perspective, the transition from cytology-based screening to primary HPV screening. Studies examining the switch from cytology-based screening to primary HPV-DNA testing with triaging of patients with positive test results were retrieved and reviewed, with a particular focus on screening in an Italian setting. The increased complexity of patient-management decisions when implementing HPV-based screening was a critical issue discussed in the literature. The change in strategy represents a paradigm shift, moving from a medical perspective of identifying the disease in individual patients to a public-healthcare perspective of excluding HPV from the healthy population and identifying a small sub-group of individuals at increased risk. With knowledge about HPV screening evolving rapidly, new programs and related algorithms need to be sufficiently flexible to be adjusted according to ongoing research and the validation of new assays. The establishment of a national working group (including epidemiologists, gynecologists, pathologists, and healthcare providers) will be necessary to properly implement and govern this important technical and cultural transition. © 2016 International Federation of Gynecology and Obstetrics.

  18. iPhone ECG screening by practice nurses and receptionists for atrial fibrillation in general practice: the GP-SEARCH qualitative pilot study.

    PubMed

    Orchard, Jessica; Freedman, Saul Benedict; Lowres, Nicole; Peiris, David; Neubeck, Lis

    2014-05-01

    Atrial fibrillation (AF) is often asymptomatic and substantially increases stroke risk. A single-lead iPhone electrocardiograph (iECG) with a validated AF algorithm could make systematic AF screening feasible in general practice. A qualitative screening pilot study was conducted in three practices. Receptionists and practice nurses screened patients aged ≥65 years using an iECG (transmitted to a secure website) and general practitioner (GP) review was then provided during the patient's consultation. Fourteen semi-structured interviews with GPs, nurses, receptionists and patients were audio-recorded, transcribed and analysed thematically. Eighty-eight patients (51% male; mean age 74.8 ± 8.8 years) were screened: 17 patients (19%) were in AF (all previously diagnosed). The iECG was well accepted by GPs, nurses and patients. Receptionists were reluctant, whereas nurses were confident in using the device, explaining and providing screening. AF screening in general practice is feasible. A promising model is likely to be one delivered by a practice nurse, but depends on relevant contextual factors for each practice.

  19. Flexible ligand docking using a genetic algorithm

    NASA Astrophysics Data System (ADS)

    Oshiro, C. M.; Kuntz, I. D.; Dixon, J. Scott

    1995-04-01

    Two computational techniques have been developed to explore the orientational and conformational space of a flexible ligand within an enzyme. Both methods use the genetic algorithm (GA) to generate conformationally flexible ligands, in conjunction with algorithms from the DOCK suite of programs to characterize the receptor site. The methods are applied to three enzyme-ligand complexes: dihydrofolate reductase-methotrexate, thymidylate synthase-phenolphthalein and HIV protease-thioketal haloperidol. Conformations and orientations close to the crystallographically determined structures are obtained, as well as alternative structures with low energy. The potential for the GA method to screen a database of compounds is also examined. A collection of ligands is evaluated simultaneously, rather than docking the ligands individually into the enzyme.
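
    A minimal genetic-algorithm loop of the kind used for pose search, with a toy scoring function standing in for the receptor-site evaluation; the chromosomes here encode only a 3-D translation, whereas the actual method also encodes orientation and torsion angles:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    TARGET = np.array([1.2, -0.7, 3.4])   # toy binding-site optimum

    def score(pose):
        """Toy energy, lower is better (a real GA scores receptor contacts)."""
        return float(((pose - TARGET) ** 2).sum())

    pop = rng.uniform(-5, 5, (40, 3))     # initial random population of poses
    for gen in range(100):
        fitness = np.array([score(p) for p in pop])
        parents = pop[fitness.argsort()[:20]]                      # truncation selection
        i = rng.integers(0, 20, (40, 2))                           # random parent pairs
        mask = rng.random((40, 3)) < 0.5
        pop = np.where(mask, parents[i[:, 0]], parents[i[:, 1]])   # uniform crossover
        pop = pop + rng.normal(0, 0.1, pop.shape)                  # Gaussian mutation

    best = min(pop, key=score)
    print(np.round(best, 2), round(score(best), 4))   # near TARGET, energy near 0
    ```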

  20. WAM: an improved algorithm for modelling antibodies on the WEB.

    PubMed

    Whitelegg, N R; Rees, A R

    2000-12-01

    An improved antibody modelling algorithm has been developed which incorporates significant improvements to the earlier versions developed by Martin et al. (1989, 1991), Pedersen et al. (1992) and Rees et al. (1996) and known as AbM (Oxford Molecular). The new algorithm, WAM (for Web Antibody Modelling), has been launched as an online modelling service and is located at URL http://antibody.bath.ac.uk. Here we provide a summary only of the important features of WAM. Readers interested in further details are directed to the website, which gives extensive background information on the methods employed. A brief description of the rationale behind some of the newer methodology (specifically, the knowledge-based screens) is also given.

  1. Plate-based diversity subset screening generation 2: an improved paradigm for high-throughput screening of large compound files.

    PubMed

    Bell, Andrew S; Bradley, Joseph; Everett, Jeremy R; Loesel, Jens; McLoughlin, David; Mills, James; Peakman, Marie-Claire; Sharp, Robert E; Williams, Christine; Zhu, Hongyao

    2016-11-01

    High-throughput screening (HTS) is an effective method for lead and probe discovery that is widely used in industry and academia to identify novel chemical matter and to initiate the drug discovery process. However, HTS can be time consuming and costly and the use of subsets as an efficient alternative to screening entire compound collections has been investigated. Subsets may be selected on the basis of chemical diversity, molecular properties, biological activity diversity or biological target focus. Previously, we described a novel form of subset screening: plate-based diversity subset (PBDS) screening, in which the screening subset is constructed by plate selection (rather than individual compound cherry-picking), using algorithms that select for compound quality and chemical diversity on a plate basis. In this paper, we describe a second-generation approach to the construction of an updated subset: PBDS2, using both plate and individual compound selection, that has an improved coverage of the chemical space of the screening file, whilst only selecting the same number of plates for screening. We describe the validation of PBDS2 and its successful use in hit and lead discovery. PBDS2 screening became the default mode of singleton (one compound per well) HTS for lead discovery in Pfizer.
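
    A sketch of greedy plate-level selection: score each candidate plate by how much new chemical space it adds to the plates already chosen, and pick plates until the budget is reached. The bit-vector plate fingerprints and the coverage criterion are illustrative assumptions, not the actual PBDS2 quality and diversity algorithms:

    ```python
    import numpy as np

    def select_plates(plate_fps, n_plates):
        """Greedily pick plates whose union of fingerprint bits grows fastest."""
        chosen = []
        covered = np.zeros(plate_fps.shape[1], dtype=bool)
        remaining = set(range(len(plate_fps)))
        for _ in range(n_plates):
            gain = {p: int((plate_fps[p] & ~covered).sum()) for p in remaining}
            best = max(gain, key=gain.get)   # plate adding the most new bits
            chosen.append(best)
            covered |= plate_fps[best]
            remaining.remove(best)
        return chosen

    rng = np.random.default_rng(0)
    fps = rng.random((50, 1024)) < 0.05      # 50 plates, sparse feature bits
    print(select_plates(fps, 5))             # indices of the 5 chosen plates
    ```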

  2. Recent trends in digital halftoning

    NASA Astrophysics Data System (ADS)

    Delabastita, Paul A.

    1997-02-01

    Screening is perhaps the oldest form of image processing. The word refers to the mechanical cross-line screens that were used at the beginning of this century for the purpose of photomechanical reproduction. Later on, these mechanical screens were replaced by photographic contact screens that enabled significantly improved process control. In the early eighties, the optical screening on graphic arts scanners was replaced by a combination of laser optics and electronic screening. The algorithms, however, were still digital implementations of the original optical methods. The printing needs of the fast-growing computer and software industry gave birth to a number of alternative printing technologies such as electrophotographic and inkjet printing. Originally these devices were designed only for printing text, but soon people started experimenting and using them for printing images. The relatively low spatial resolutions of these new devices, however, made a complete review of 'the screening issue' necessary to achieve acceptable image quality. In this paper a number of recent developments in screening technology are summarized. Special attention is given to the interaction between halftone screens and the printing devices on which they are rendered, including the color mixing behavior. Improved screening techniques are presented that take advantage of modeling the physical behavior of the rendering device.
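
    As a concrete example of electronic screening, here is a minimal Floyd-Steinberg error-diffusion halftoner, one of the standard digital techniques; it is included for illustration and is not a method specific to this paper:

    ```python
    import numpy as np

    def floyd_steinberg(gray):
        """Halftone a grayscale image (floats in [0, 1]) to a binary image."""
        img = gray.astype(float).copy()
        h, w = img.shape
        out = np.zeros_like(img)
        for y in range(h):
            for x in range(w):
                out[y, x] = 1.0 if img[y, x] >= 0.5 else 0.0
                err = img[y, x] - out[y, x]        # diffuse quantization error
                if x + 1 < w:
                    img[y, x + 1] += err * 7 / 16
                if y + 1 < h and x > 0:
                    img[y + 1, x - 1] += err * 3 / 16
                if y + 1 < h:
                    img[y + 1, x] += err * 5 / 16
                if y + 1 < h and x + 1 < w:
                    img[y + 1, x + 1] += err * 1 / 16
        return out

    ramp = np.tile(np.linspace(0, 1, 64), (16, 1))   # horizontal gray ramp
    print(floyd_steinberg(ramp).mean())              # ~0.5: ink matches mean tone
    ```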

  3. Lightweight Towed Howitzer Demonstrator. Phase 1 and Partial Phase 2. Volume D1. Part 1. Structural Analysis (Less Cradle and System).

    DTIC Science & Technology

    1987-04-01

    [The abstract text of this record is OCR-garbled output from a scanned report; the only recoverable fragments are a header reading "Version 1.1 02/01/85" and a reference to an LTHD muzzle brake thermal load case (steel) with nodal force and moment listings.]

  4. Planetary spacecraft cost modeling utilizing labor estimating relationships

    NASA Technical Reports Server (NTRS)

    Williams, Raymond

    1990-01-01

    A basic computerized technology is presented for estimating labor hours and cost of unmanned planetary and lunar programs. The user-friendly methodology, designated Labor Estimating Relationship/Cost Estimating Relationship (LERCER), organizes the forecasting process according to vehicle subsystem levels. The level of input variables required by the model in predicting cost is consistent with pre-Phase A mission analysis. Twenty-one program categories were used in the modeling. To develop the model, numerous LER and CER studies were surveyed and modified when required. The results of the research, along with components of the LERCER program, are reported.
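
    Cost and labor estimating relationships of this kind are typically simple parametric fits, such as a power law in a subsystem driver like mass. The sketch below fits hypothetical data and is not the LERCER model itself:

    ```python
    import numpy as np

    # Hypothetical subsystem data: mass (kg) versus labor hours.
    mass = np.array([12.0, 30.0, 55.0, 90.0, 140.0])
    hours = np.array([800.0, 1500.0, 2300.0, 3200.0, 4400.0])

    # Fit hours = a * mass^b by linear regression in log-log space.
    b, log_a = np.polyfit(np.log(mass), np.log(hours), 1)
    a = np.exp(log_a)
    print(f"hours = {a:.1f} * mass^{b:.2f}")           # fitted relationship
    print(f"prediction for 70 kg: {a * 70**b:.0f} h")  # apply the LER
    ```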

  5. Versatile reactivities of rare-earth metal dialkyl complexes supported by a neutral pyrrolyl-functionalized β-diketiminato ligand.

    PubMed

    Zhu, Xiancui; Li, Yang; Guo, Dianjun; Wang, Shaowu; Wei, Yun; Zhou, Shuangliu

    2018-03-12

    Herein, rare-earth metal dialkyl complexes supported by a neutral pyrrolyl-functionalized β-diketiminato ligand, with the formula LRE(CH2SiMe3)2(thf) (RE = Y (1a), Dy (1b), Er (1c), Yb (1d); L = MeC(NDipp)CHC(Me)NCH2CH2NC4H2-2,5-Me2, Dipp = 2,6-iPr2C6H3), were synthesized in high yields via the reactions of the β-diketimine HL with the rare-earth metal trialkyl complexes RE(CH2SiMe3)3(thf)2. The reactivities of 1 with pyridine derivatives, unsaturated substrates, and elemental sulfur were investigated, and some interesting chemical transformations were observed. Ligand exchange and activation of sp2 and sp3 C-H bonds occurred during the reactions with pyridine derivatives to afford different types of mononuclear rare-earth metal pyridyl complexes, namely LEr(CH2SiMe3)2(η1-NC5H4) (2c), LRE(η3-CH2-2-NC5H2-4,6-Me2)2 (RE = Y (3a), Er (3c)), and LRE(CH2SiMe3)(η2-(C,N)-2-(2-C6H4NC5H4)) (RE = Er (4c), Yb (4d)). Similarly, activation of the sp C-H bond occurred during the reaction of phenylacetylene with 1c to produce the dinuclear erbium alkynyl complex [LEr(CH2SiMe3)(μ-C≡CPh)]2 (5c). The mixed amidinate-β-diketiminato ytterbium complex LYb[(Dipp)NC(CH2SiMe3)N(Dipp)](CH2SiMe3) (6d) was obtained by the insertion of bis(2,6-diisopropylphenyl)carbodiimide into a Yb-alkyl bond, as well as via the direct alkane elimination of a CH2SiMe3 moiety with bis(2,6-diisopropylphenyl)formamidine to afford the erbium complex LEr(DippNCHNDipp)(CH2SiMe3) (7c). A rare sp2 C-H bond oxidation of the β-diketiminato backbone with elemental sulfur insertion was detected, providing the unprecedented dinuclear rare-earth metal thiolate complexes (LRE)2(μ-SCH2SiMe3)2(μ-SCC(Me)(NDipp)C(Me)NCH2CH2NC4H2Me2-2,5) (RE = Y (8a), Er (8c)) in the reactions of S8 with 1a and 1c, respectively. The molecular structures of complexes 1-8 were determined by single-crystal X-ray diffraction analyses.

  6. Dysferlin quantification in monocytes for rapid screening for dysferlinopathies.

    PubMed

    Sánchez-Chapul, Laura; Ángel-Muñoz, Miguel Del; Ruano-Calderón, Luis; Luna-Angulo, Alexandra; Coral-Vázquez, Ramón; Hernández-Hernández, Óscar; Magaña, Jonathan J; León-Hernández, Saúl R; Escobar-Cedillo, Rosa E; Vargas, Steven

    2016-12-01

    In this study, we determined normal levels of dysferlin expression in CD14+ monocytes by flow cytometry (FC) as a screening tool for dysferlinopathies. Monocytes from 183 healthy individuals and 29 patients were immunolabeled, run on a FACSCalibur flow cytometer, and analyzed with FlowJo software. The relative quantity of dysferlin was expressed as mean fluorescence intensity (MFI). Performance of this diagnostic test was assessed by calculating likelihood ratios at different MFI cut-off points, which allowed definition of 4 disease classification groups in a simplified algorithm. The MFI value may differentiate patients with dysferlinopathy from healthy individuals; it may be a useful marker for screening purposes. Muscle Nerve 54: 1064-1071, 2016. © 2016 Wiley Periodicals, Inc.
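
    The likelihood-ratio arithmetic behind such a cut-off analysis is compact enough to sketch. The following is not the authors' code: the MFI distributions and the cut-off are invented, and a screen is counted positive when MFI falls below the cut-off (reduced dysferlin).

      import numpy as np

      def likelihood_ratios(patient_mfi, control_mfi, cutoff):
          """LR+ and LR- for the rule 'MFI < cutoff is a positive screen'."""
          sens = np.mean(np.asarray(patient_mfi) < cutoff)   # patients screened positive
          spec = np.mean(np.asarray(control_mfi) >= cutoff)  # controls screened negative
          lr_pos = sens / (1.0 - spec) if spec < 1.0 else float("inf")
          lr_neg = (1.0 - sens) / spec if spec > 0.0 else float("inf")
          return sens, spec, lr_pos, lr_neg

      # Illustrative values only: healthy monocytes assumed brighter than patient monocytes.
      rng = np.random.default_rng(1)
      controls = rng.normal(100.0, 15.0, 183)
      patients = rng.normal(40.0, 10.0, 29)
      print(likelihood_ratios(patients, controls, cutoff=70.0))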

  7. Automated image quality evaluation of T2-weighted liver MRI utilizing deep learning architecture.

    PubMed

    Esses, Steven J; Lu, Xiaoguang; Zhao, Tiejun; Shanbhogue, Krishna; Dane, Bari; Bruno, Mary; Chandarana, Hersh

    2018-03-01

    To develop and test a deep learning approach named Convolutional Neural Network (CNN) for automated screening of T2-weighted (T2WI) liver acquisitions for nondiagnostic images, and to compare this automated approach to evaluation by two radiologists. We evaluated 522 liver magnetic resonance imaging (MRI) exams performed at 1.5T and 3T at our institution between November 2014 and May 2016 for CNN training and validation. The CNN consisted of an input layer, convolutional layer, fully connected layer, and output layer. 351 T2WI were anonymized for training. Each case was annotated with a label of being diagnostic or nondiagnostic for detecting lesions and assessing liver morphology. Another independently collected 171 cases were sequestered for a blind test. These 171 T2WI were assessed independently by two radiologists and annotated as being diagnostic or nondiagnostic. These 171 T2WI were presented to the CNN algorithm, and the image quality (IQ) output of the algorithm was compared to that of the two radiologists. There was concordance in IQ label between Reader 1 and the CNN in 79% of cases and between Reader 2 and the CNN in 73%. The sensitivity and specificity of the CNN algorithm in identifying nondiagnostic IQ were 67% and 81% with respect to Reader 1, and 47% and 80% with respect to Reader 2. The negative predictive value of the algorithm for identifying nondiagnostic IQ was 94% and 86% (relative to Readers 1 and 2). We demonstrate a CNN algorithm that yields a high negative predictive value when screening for nondiagnostic T2WI of the liver. Level of Evidence: 2. Technical Efficacy: Stage 2. J. Magn. Reson. Imaging 2018;47:723-728. © 2017 International Society for Magnetic Resonance in Medicine.
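
    The agreement statistics quoted above (concordance, sensitivity, specificity, NPV) all follow from a 2x2 comparison of reader and CNN labels. A minimal sketch, with randomly generated labels standing in for the real annotations (1 = nondiagnostic, reader taken as reference):

      import numpy as np

      def screening_metrics(reader, cnn):
          reader, cnn = np.asarray(reader), np.asarray(cnn)
          tp = np.sum((reader == 1) & (cnn == 1))
          tn = np.sum((reader == 0) & (cnn == 0))
          fp = np.sum((reader == 0) & (cnn == 1))
          fn = np.sum((reader == 1) & (cnn == 0))
          return {
              "concordance": (tp + tn) / reader.size,
              "sensitivity": tp / (tp + fn),   # nondiagnostic cases the CNN catches
              "specificity": tn / (tn + fp),
              "npv": tn / (tn + fn),
          }

      rng = np.random.default_rng(0)
      reader1 = rng.binomial(1, 0.15, 171)   # hypothetical reader labels
      cnn_out = rng.binomial(1, 0.15, 171)   # hypothetical CNN output
      print(screening_metrics(reader1, cnn_out))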

  8. Application of a fall screening algorithm stratified fall risk but missed preventive opportunities in community-dwelling older adults: a prospective study.

    PubMed

    Muir, Susan W; Berg, Katherine; Chesworth, Bert; Klar, Neil; Speechley, Mark

    2010-01-01

    Evaluate the ability of the American and British Geriatrics Society fall prevention guideline's screening algorithm to identify and stratify future fall risk in community-dwelling older adults. Prospective cohort of community-dwelling older adults (n = 117) aged 65 to 90 years. Fall history, balance, and gait were measured during a comprehensive geriatric assessment at baseline. Falls data were collected monthly for 1 year. The outcomes of any fall and any injurious fall were evaluated. The algorithm stratified participants into 4 hierarchical risk categories. Fall risk was 33% and 68% for the "no intervention" and "comprehensive fall evaluation required" groups, respectively. The relative risk estimate for falling comparing participants in the 2 intervention groups was 2.08 (95% CI 1.42-3.05) for any fall and 2.60 (95% CI 1.53-4.42) for any injurious fall. Prognostic accuracy values were: sensitivity of 0.50 (95% CI 0.36-0.64) and specificity of 0.82 (95% CI 0.70-0.90) for any fall; and sensitivity of 0.56 (95% CI 0.38-0.72) and specificity of 0.78 (95% CI 0.67-0.86) for any injurious fall. The algorithm was able to identify and stratify fall risk for each fall outcome, though the values of prognostic accuracy demonstrate moderate clinical utility. The recommendation of fall evaluation for individuals in the highest risk groups appears supported, though the recommendation of no intervention in the lowest risk groups may not address their needs for fall prevention interventions. Further evaluation of the algorithm is recommended to refine the identification of fall risk in community-dwelling older adults.
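
    The relative risks and confidence intervals above follow the standard log-scale large-sample formula. In the sketch below, the cell counts are illustrative values chosen to be consistent with the reported 68% versus 33% risks; they are not the study's actual table:

      import math

      def relative_risk(events_exposed, n_exposed, events_unexposed, n_unexposed):
          """Relative risk with a 95% CI computed on the log scale."""
          rr = (events_exposed / n_exposed) / (events_unexposed / n_unexposed)
          se = math.sqrt(1 / events_exposed - 1 / n_exposed
                         + 1 / events_unexposed - 1 / n_unexposed)
          lo = math.exp(math.log(rr) - 1.96 * se)
          hi = math.exp(math.log(rr) + 1.96 * se)
          return rr, (lo, hi)

      print(relative_risk(27, 40, 25, 77))   # ~2.08 (1.41-3.06), close to the reported values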

  9. GOSAT CO2 retrieval results using TANSO-CAI aerosol information over East Asia

    NASA Astrophysics Data System (ADS)

    KIM, M.; Kim, W.; Jung, Y.; Lee, S.; Kim, J.; Lee, H.; Boesch, H.; Goo, T. Y.

    2015-12-01

    In satellite remote sensing of CO2, incorrect aerosol information can induce large errors, as previous studies have suggested. Factors such as aerosol type, the wavelength dependence of AOD, and aerosol polarization effects have been the main error sources. Because of these aerosol effects, a large fraction of retrieved data is screened out in quality control, or retrieval errors tend to increase if it is not screened out, especially in East Asia where aerosol concentrations are fairly high. To reduce these aerosol-induced errors, a CO2 retrieval algorithm using simultaneous TANSO-CAI aerosol information was developed. This algorithm adopts AOD and aerosol type information as a priori information from the CAI aerosol retrieval algorithm. The CO2 retrieval algorithm is based on the optimal estimation method and VLIDORT, a vector discrete ordinate radiative transfer model. The CO2 algorithm, developed with various state vectors to find accurate CO2 concentrations, shows reasonable results when compared with other datasets. This study concentrates on the validation of retrieved results against ground-based TCCON measurements in East Asia and on comparison with previous retrievals from ACOS, NIES, and UoL. Although the retrieved CO2 concentration is lower than previous results by a few ppm, it shows a similar trend and high correlation with previous results. Retrieved data and TCCON measurements are compared at three stations, Tsukuba, Saga, and Anmyeondo, in East Asia, with collocation criteria of ±2° in latitude/longitude and ±1 hour of GOSAT overpass time. The compared results also show similar trends with good correlation. Based on the TCCON comparison results, a bias correction equation is calculated and applied to the East Asia data.
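
    The collocation step reduces to a spatial and temporal window test. A hedged sketch follows; the record structures and values are invented for illustration, and real GOSAT/TCCON matching operates on full sounding files:

      from datetime import datetime, timedelta

      def collocated(soundings, station, max_deg=2.0, max_dt=timedelta(hours=1)):
          """Keep soundings within +/-2 deg lat/lon and +/-1 h of a TCCON site."""
          return [
              s for s in soundings
              if abs(s["lat"] - station["lat"]) <= max_deg
              and abs(s["lon"] - station["lon"]) <= max_deg
              and abs(s["time"] - station["time"]) <= max_dt
          ]

      tsukuba = {"lat": 36.05, "lon": 140.12, "time": datetime(2012, 5, 1, 13, 0)}
      soundings = [
          {"lat": 36.5, "lon": 139.8, "time": datetime(2012, 5, 1, 13, 40), "xco2": 394.1},
          {"lat": 40.1, "lon": 141.0, "time": datetime(2012, 5, 1, 13, 10), "xco2": 395.3},
      ]
      print(collocated(soundings, tsukuba))   # only the first sounding passes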

  10. Prediction of anti-cancer drug response by kernelized multi-task learning.

    PubMed

    Tan, Mehmet

    2016-10-01

    Chemotherapy and targeted therapy are two of the main treatment options for many types of cancer. Due to the heterogeneous nature of cancer, the success of therapeutic agents differs among patients. In this sense, determination of the chemotherapeutic response of the malignant cells is essential for establishing a personalized treatment protocol and designing new drugs. With the recent technological advances in producing large amounts of pharmacogenomic data, in silico methods have become important tools to achieve this aim. Data produced by using cancer cell lines provide a test bed for machine learning algorithms that try to predict the response of cancer cells to different agents. The potential use of these algorithms in drug discovery/repositioning and personalized treatments motivated us in this study to work on predicting drug response by exploiting the recent pharmacogenomic databases. We aim to improve the prediction of drug response of cancer cell lines. We propose to use a method that employs multi-task learning to improve learning by transfer, and kernels to extract non-linear relationships, to predict drug response. The method outperforms three state-of-the-art algorithms on three anti-cancer drug screen datasets. We achieved a mean squared error of 3.305 and 0.501 on two different large-scale screen datasets. On a recent challenge dataset, we obtained an error of 0.556. We report the methodological comparison results as well as the performance of the proposed algorithm on each single drug. The results show that the proposed method is a strong candidate to predict drug response of cancer cell lines in silico for pre-clinical studies. The source code of the algorithm and data used can be obtained from http://mtan.etu.edu.tr/Supplementary/kMTrace/. Copyright © 2016 Elsevier B.V. All rights reserved.
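
    The abstract does not spell out the model, so as a much simpler single-task point of reference, kernelized (RBF) ridge regression of drug response on cell-line features can be sketched as follows; all data are synthetic and the hyperparameters are arbitrary:

      import numpy as np

      def rbf_kernel(A, B, gamma=0.1):
          d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
          return np.exp(-gamma * d2)

      def fit(X, y, lam=1.0, gamma=0.1):
          K = rbf_kernel(X, X, gamma)
          return np.linalg.solve(K + lam * np.eye(len(X)), y)  # dual coefficients

      def predict(alpha, X_train, X_new, gamma=0.1):
          return rbf_kernel(X_new, X_train, gamma) @ alpha

      rng = np.random.default_rng(0)
      X = rng.normal(size=(100, 20))                                 # toy genomic features
      y = X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.1, size=100)  # toy drug response
      alpha = fit(X, y)
      print(np.mean((y - predict(alpha, X, X)) ** 2))                # training MSE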

  11. Fast localization of optic disc and fovea in retinal images for eye disease screening

    NASA Astrophysics Data System (ADS)

    Yu, H.; Barriga, S.; Agurto, C.; Echegaray, S.; Pattichis, M.; Zamora, G.; Bauman, W.; Soliz, P.

    2011-03-01

    Optic disc (OD) and fovea locations are two important anatomical landmarks in automated analysis of retinal disease in color fundus photographs. This paper presents a new, fast, fully automatic optic disc and fovea localization algorithm developed for diabetic retinopathy (DR) screening. The optic disc localization methodology comprises two steps. First, the OD location is identified using template matching and a directional matched filter. To reduce false positives due to bright areas of pathology, we exploit vessel characteristics inside the optic disc. The location of the fovea is estimated as the point of lowest matched filter response within a search area determined by the optic disc location. Second, optic disc segmentation is performed. Based on the detected optic disc location, a fast hybrid level-set algorithm, which combines region information and edge gradient to drive the curve evolution, is used to segment the optic disc boundary. Extensive evaluation was performed on 1200 images (Messidor) composed of 540 images of healthy retinas, 431 images with DR but no risk of macular edema (ME), and 229 images with DR and risk of ME. The OD location methodology obtained a 98.3% success rate, while fovea location achieved a 95% success rate. The average mean absolute distance (MAD) between the OD segmentation algorithm and the "gold standard" is 10.5% of the estimated OD radius. Qualitatively, 97% of the images achieved Excellent to Fair performance for OD segmentation. The segmentation algorithm performs well even on blurred images.
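
    The template-matching step can be illustrated with plain normalized cross-correlation; the toy image and Gaussian blob below stand in for a fundus photograph and an OD template, and the paper's directional matched filter and vessel test are omitted:

      import numpy as np

      def normalized_cross_correlation(image, template):
          """Slide `template` over `image` and return the NCC score map."""
          th, tw = template.shape
          t = template - template.mean()
          out = np.full((image.shape[0] - th + 1, image.shape[1] - tw + 1), -1.0)
          for i in range(out.shape[0]):
              for j in range(out.shape[1]):
                  w = image[i:i+th, j:j+tw]
                  w = w - w.mean()
                  denom = np.sqrt((w ** 2).sum() * (t ** 2).sum())
                  if denom > 0:
                      out[i, j] = (w * t).sum() / denom
          return out

      rng = np.random.default_rng(0)
      yy, xx = np.mgrid[0:8, 0:8]
      tmpl = np.exp(-((yy - 3.5) ** 2 + (xx - 3.5) ** 2) / 8.0)  # bright disc-like blob
      img = rng.random((64, 64)) * 0.2
      img[20:28, 30:38] += 3.0 * tmpl                            # plant the "optic disc"
      scores = normalized_cross_correlation(img, tmpl)
      print(np.unravel_index(scores.argmax(), scores.shape))     # -> (20, 30)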

  12. Towards a Systematic Screening Tool for Quality Assurance and Semiautomatic Fraud Detection for Images in the Life Sciences.

    PubMed

    Koppers, Lars; Wormer, Holger; Ickstadt, Katja

    2017-08-01

    The quality and authenticity of images is essential for data presentation, especially in the life sciences. Questionable images may often be a first indicator of questionable results, too. Therefore, a tool that uses mathematical methods to detect suspicious images in large image archives can be a helpful instrument to improve quality assurance in publications. As a first step towards a systematic screening tool, especially for journal editors and other staff members who are responsible for quality assurance, such as laboratory supervisors, we propose a basic classification of image manipulation. Based on this classification, we developed and explored some simple algorithms to detect copied areas in images. Using an artificial image and two examples of previously published modified images, we apply quantitative methods such as pixel-wise comparison, a nearest-neighbor algorithm, and a variance algorithm to detect copied-and-pasted areas or duplicated images. We show that our algorithms are able to detect some simple types of image alteration, such as copying and pasting background areas. The variance algorithm detects not only identical but also very similar areas that differ only in brightness. Further types could, in principle, be implemented in a standardized scanning routine. We detected the copied areas in a proven case of image manipulation in Germany and showed the similarity of two images in a retracted paper from the Kato labs, which has been widely discussed on sites such as PubPeer and Retraction Watch.
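
    In the spirit of the pixel-wise comparison described above, a minimal copy-paste detector can fingerprint fixed-size blocks and report collisions. This toy sketch (grid-aligned, exact matches only) is not the published algorithm:

      import numpy as np

      def find_copied_blocks(img, block=8):
          """Return coordinate pairs of grid-aligned blocks with identical pixels."""
          seen, matches = {}, []
          for i in range(0, img.shape[0] - block + 1, block):
              for j in range(0, img.shape[1] - block + 1, block):
                  key = tuple(np.round(img[i:i+block, j:j+block], 6).ravel())
                  if key in seen:
                      matches.append((seen[key], (i, j)))
                  else:
                      seen[key] = (i, j)
          return matches

      rng = np.random.default_rng(0)
      img = rng.random((64, 64))
      img[8:16, 8:16] = img[40:48, 40:48]    # simulate a copy-paste manipulation
      print(find_copied_blocks(img))         # -> [((8, 8), (40, 40))]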

  13. Simulation of bright-field microscopy images depicting pap-smear specimen

    PubMed Central

    Malm, Patrik; Brun, Anders; Bengtsson, Ewert

    2015-01-01

    As digital imaging becomes a fundamental part of medical and biomedical research, the demand for computer-based evaluation using advanced image analysis is growing in many research projects. A common problem when developing new image analysis algorithms is the need for large datasets with ground truth on which the algorithms can be tested and optimized. Generating such datasets is often tedious and introduces subjectivity and interindividual and intraindividual variations. An alternative to manually created ground-truth data is to generate synthetic images where the ground truth is known. The challenge then is to make the images sufficiently similar to the real ones to be useful in algorithm development. One of the first and most widely studied medical image analysis tasks is to automate screening for cervical cancer through Pap-smear analysis. As part of an effort to develop a new-generation cervical cancer screening system, we have developed a framework for the creation of realistic synthetic bright-field microscopy images that can be used for algorithm development and benchmarking. The resulting framework has been assessed through a visual evaluation by experts with extensive experience of Pap-smear images. The results show that images produced using our described methods are realistic enough to be mistaken for real microscopy images. The developed simulation framework is very flexible and can be modified to mimic many other types of bright-field microscopy images. © 2015 The Authors. Published by Wiley Periodicals, Inc. on behalf of ISAC PMID:25573002

  14. Finger tracking for hand-held device interface using profile-matching stereo vision

    NASA Astrophysics Data System (ADS)

    Chang, Yung-Ping; Lee, Dah-Jye; Moore, Jason; Desai, Alok; Tippetts, Beau

    2013-01-01

    Hundreds of millions of people use hand-held devices frequently and control them by touching the screen with their fingers. If this method of operation is used by people who are driving, the probability of accidents and deaths substantially increases. With a non-contact control interface, people do not need to touch the screen; as a result, they will not need to pay as much attention to their phones and can thus drive more safely than they would otherwise. This interface can be achieved with real-time stereo vision. A novel Intensity Profile Shape-Matching Algorithm is able to obtain 3-D information from a pair of stereo images in real time. While this algorithm does have a trade-off between accuracy and processing speed, the results prove the accuracy is sufficient for the practical use of recognizing human poses and tracking finger movement. By choosing an interval of disparity, an object within a certain distance range can be segmented; in other words, we detect the object by its distance to the cameras. The advantage of this profile shape-matching algorithm is that detection of correspondences relies on the shape of the profile and not on intensity values, which are subject to lighting variations. Based on the resulting 3-D information, the movement of fingers in space at a specific distance can be determined. Finger location and movement can then be analyzed for non-contact control of hand-held devices.

  15. A Pipeline To Enhance Ligand Virtual Screening: Integrating Molecular Dynamics and Fingerprints for Ligand and Proteins.

    PubMed

    Spyrakis, Francesca; Benedetti, Paolo; Decherchi, Sergio; Rocchia, Walter; Cavalli, Andrea; Alcaro, Stefano; Ortuso, Francesco; Baroni, Massimo; Cruciani, Gabriele

    2015-10-26

    The importance of taking into account protein flexibility in drug design and virtual ligand screening (VS) has been widely debated in the literature, and molecular dynamics (MD) has been recognized as one of the most powerful tools for investigating intrinsic protein dynamics. Nevertheless, deciphering the amount of information hidden in MD simulations and recognizing a significant minimal set of states to be used in virtual screening experiments can be quite complicated. Here we present an integrated MD-FLAP (molecular dynamics-fingerprints for ligand and proteins) approach, comprising a pipeline of molecular dynamics, clustering and linear discriminant analysis, for enhancing accuracy and efficacy in VS campaigns. We first extracted a limited number of representative structures from tens of nanoseconds of MD trajectories by means of the k-medoids clustering algorithm as implemented in the BiKi Life Science Suite ( http://www.bikitech.com [accessed July 21, 2015]). Then, instead of applying arbitrary selection criteria, that is, RMSD, pharmacophore properties, or enrichment performances, we allowed the linear discriminant analysis algorithm implemented in FLAP ( http://www.moldiscovery.com [accessed July 21, 2015]) to automatically choose the best performing conformational states among medoids and X-ray structures. Retrospective virtual screenings confirmed that ensemble receptor protocols outperform single rigid receptor approaches, proved that computationally generated conformations comprise the same quantity/quality of information included in X-ray structures, and pointed to the MD-FLAP approach as a valuable tool for improving VS performances.

  16. Retrospective study evaluating the performance of a first-trimester combined screening for trisomy 21 in an Italian unselected population

    PubMed Central

    Padula, Francesco; Cignini, Pietro; Giannarelli, Diana; Brizzi, Cristiana; Coco, Claudio; D’Emidio, Laura; Giorgio, Elsa; Giorlandino, Maurizio; Mangiafico, Lucia; Mastrandrea, Marialuisa; Milite, Vincenzo; Mobili, Luisa; Nanni, Cinzia; Raffio, Raffaella; Taramanni, Cinzia; Vigna, Roberto; Mesoraca, Alvaro; Bizzoco, Domenico; Gabrielli, Ivan; Di Giacomo, Gianluca; Barone, Maria Antonietta; Cima, Antonella; Giorlandino, Francesca Romana; Emili, Sabrina; Cupellaro, Marina; Giorlandino, Claudio

    2014-01-01

    Objectives: To assess the performance of a combined first-trimester screening for trisomy 21 in an unselected Italian population referred to a specialized private center for prenatal medicine. Methods: A retrospective validation of first-trimester screening algorithms [risk calculation based on maternal age and nuchal translucency (NT) alone, maternal age and serum parameters (free β-hCG and PAPP-A) alone, and a combination of both] for fetal aneuploidies evaluated in an unselected Italian population at Artemisia Fetal-Maternal Medical Centre in Rome. All measurements were performed between 11+0 and 13+6 weeks of gestation, between April 2007 and December 2008. Results: Of 3,610 single fetuses included in the study, we had a complete follow-up on 2,984. Fourteen of 17 cases of trisomy 21 were detected when a cut-off of 1:300 was applied [detection rate (DR) 82.4%, 95% confidence interval (CI) 64.2–100; false-positive rate (FPR) 4.7%, 95% CI 3.9–5.4; false-negative rate (FNR) 17.6%, 95% CI 0–35.8%]. Conclusion: In our study population the detection rate for trisomy 21, using the combined risk calculation based on maternal age, fetal NT, maternal PAPP-A and free β-hCG levels, was superior to the application of either parameter alone. The algorithm has been validated for first-trimester screening in the Italian population. PMID:26266002
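
    The DR and FPR above, with their 95% confidence intervals, can be checked with the normal approximation; the ≈139 false positives among 2,967 unaffected pregnancies are inferred from the reported percentages rather than taken from the paper's tables:

      import math

      def rate_with_ci(k, n):
          p = k / n
          se = math.sqrt(p * (1 - p) / n)
          return p, (max(0.0, p - 1.96 * se), min(1.0, p + 1.96 * se))

      dr, dr_ci = rate_with_ci(14, 17)       # 14 of 17 trisomy 21 cases detected
      fpr, fpr_ci = rate_with_ci(139, 2967)  # inferred false-positive count
      print(f"DR  = {dr:.1%}, 95% CI {dr_ci[0]:.1%}-{dr_ci[1]:.1%}")    # ~82.4% (64.2-100)
      print(f"FPR = {fpr:.1%}, 95% CI {fpr_ci[0]:.1%}-{fpr_ci[1]:.1%}")  # ~4.7% (3.9-5.4)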

  17. Improvement of the user interface of multimedia applications by automatic display layout

    NASA Astrophysics Data System (ADS)

    Lueders, Peter; Ernst, Rolf

    1995-03-01

    Multimedia research has mainly focused on real-time data capturing and display combined with compression, storage, and transmission of these data. However, there is another problem: selecting and arranging, in real time, a possibly large amount of data from multiple media on the computer screen, together with the textual and graphical data of conventional software. This problem is already known from complex software systems, such as CASE and hypertext, and will be aggravated further in multimedia systems. The aim of our work is to relieve the user of the burden of continuously selecting, placing, and sizing windows and their contents, but without introducing solutions limited to only a few applications. We present an experimental system which controls the computer screen contents and layouts, directed by user- and/or tool-provided information filtering and prioritization. To be application independent, the screen layout is based on general layout optimization algorithms adapted from VLSI layout, which are controlled by application-specific objective functions. In this paper, we discuss the problems of a comprehensible screen layout, including the stability of optical information over time, the information filtering, the layout algorithms, and the adaptation of the objective function to a specific application. We give some examples of different standard applications with layout problems ranging from hierarchical graph layout to window layout. The results show that automatic, tool-independent display layout will be possible in a real-time interactive environment.

  18. Setting up a probe-based, closed-tube real-time PCR assay for focused detection of variable sequence alterations.

    PubMed

    Becságh, Péter; Szakács, Orsolya

    2014-10-01

    During a diagnostic workflow for detecting sequence alterations, it is sometimes important to design an algorithm that combines screening and direct tests. Normally the use of the direct test, which is mainly sequencing, is limited. There is an increased need for effective screening tests that remain "closed tube" during the whole process, thereby decreasing the risk of PCR product contamination. The aim of this study was to design such a closed-tube, detection-probe-based screening assay to detect different kinds of sequence alterations in the exon 11 region of the human c-kit gene. Inside this region there are various possible deletions and single-nucleotide changes. During assay setup, several probe chemistry formats were screened and tested. After some optimization steps, the TaqMan probe format was selected.

  19. tcpl: the ToxCast pipeline for high-throughput screening data.

    PubMed

    Filer, Dayne L; Kothiya, Parth; Setzer, R Woodrow; Judson, Richard S; Martin, Matthew T

    2017-02-15

    Large high-throughput screening (HTS) efforts are widely used in drug development and chemical toxicity screening. Wide use and integration of these data can benefit from an efficient, transparent and reproducible data pipeline. The tcpl R package and its associated MySQL database provide a generalized platform for efficiently storing, normalizing and dose-response modeling of large high-throughput and high-content chemical screening data. The novel dose-response modeling algorithm has been tested against millions of diverse dose-response series, and robustly fits data with outliers and cytotoxicity-related signal loss. tcpl is freely available on the Comprehensive R Archive Network under the GPL-2 license. Published by Oxford University Press 2016. This work is written by US Government employees and is in the public domain in the US.

  20. Active-learning strategies in computer-assisted drug discovery.

    PubMed

    Reker, Daniel; Schneider, Gisbert

    2015-04-01

    High-throughput compound screening is time and resource consuming, and considerable effort is invested into screening compound libraries, profiling, and selecting the most promising candidates for further testing. Active-learning methods assist the selection process by focusing on areas of chemical space that have the greatest chance of success while considering structural novelty. The core feature of these algorithms is their ability to adapt the structure-activity landscapes through feedback. Instead of full-deck screening, only focused subsets of compounds are tested, and the experimental readout is used to refine molecule selection for subsequent screening cycles. Once implemented, these techniques have the potential to reduce costs and save precious materials. Here, we provide a comprehensive overview of the various computational active-learning approaches and outline their potential for drug discovery. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. Postnatal Growth and Retinopathy of Prematurity Study: Rationale, Design, and Subject Characteristics.

    PubMed

    Binenbaum, Gil; Tomlinson, Lauren A

    2017-02-01

    Postnatal-growth-based predictive models demonstrate strong potential for improving the low specificity of retinopathy of prematurity (ROP) screening. Prior studies are limited by inadequate sample size. We sought to study a sufficiently large cohort of at-risk infants to enable development of a model with highly precise estimates of sensitivity for severe ROP. The Postnatal Growth and ROP (G-ROP) Study was a multicenter retrospective cohort study of infants at 30 North American hospitals during 2006-2012. A total of 65 G-ROP-certified abstractors submitted data to a secure, web-based database. Data included ROP examination findings, treatments, complications, daily weight measurements, daily oxygen supplementation, maternal/infant demographics, medical comorbidities, surgical events, and weekly nutrition. Data quality was monitored with system validation rules, data audits, and discrepancy algorithms. Of 11,261 screened infants, 8334 were enrolled, and 2927 had insufficient data due to transfer, discharge, or death. Of the enrolled infants, 90% (7483) had a known ROP outcome and were included in the study. Median birth weight was 1070 g (range 310-3000g) and mean gestational age 28 weeks (range 22-35 weeks). Severe ROP (Early Treatment of Retinopathy type 1 or 2) developed in 931 infants (12.5%). Successful incorporation of a predictive model into ROP screening requires confidence that it will capture cases of severe ROP. This dataset provides power to estimate sensitivity with half-confidence interval width of less than 0.5%, determined by the high number of severe ROP cases. The G-ROP Study represents a large, diverse cohort of at-risk infants undergoing ROP screening. It will facilitate evaluation of growth-based algorithms to improve efficiency of ROP screening.

  2. Screening high-risk patients and assisting in diagnosing anxiety in primary care: the Patient Health Questionnaire evaluated

    PubMed Central

    2013-01-01

    Background Questionnaires may help in detecting and diagnosing anxiety disorders in primary care. However, since utility of these questionnaires in target populations is rarely studied, the Patient Health Questionnaire anxiety modules (PHQ) were evaluated for use as: a) a screener in high-risk patients, and/or b) a case finder for general practitioners (GPs) to assist in diagnosing anxiety disorders. Methods A cross-sectional analysis was performed in 43 primary care practices in the Netherlands. The added value of the PHQ was assessed in two samples: 1) 170 patients at risk of anxiety disorders (or developing them) according to their electronic medical records (high-risk sample); 2) 141 patients identified as a possible ‘anxiety case’ by a GP (GP-identified sample). All patients completed the PHQ and were interviewed using the Mini International Neuropsychiatric interview to classify DSM-IV anxiety disorders. Psychometric properties were calculated, and a logistic regression analysis was performed to assess the diagnostic value of the PHQ. Results Using only the screening questions of the PHQ, the area under the curve was 83% in the high-risk sample. In GP-identified patients the official algorithm showed the best characteristics with an area under the curve of 77%. Positive screening questions significantly increased the odds of an anxiety disorder diagnosis in high-risk patients (odds ratio = 23.4; 95% confidence interval 6.9 to 78.8) as did a positive algorithm in GP-identified patients (odds ratio = 13.9; 95% confidence interval 3.8 to 50.6). Conclusions The PHQ screening questions can be used to screen for anxiety disorders in high-risk primary care patients. In GP-identified patients, the benefit of the PHQ is less evident. PMID:23865984

  3. Radiation dose reduction for CT lung cancer screening using ASIR and MBIR: a phantom study

    PubMed Central

    Mathieu, Kelsey B.; Ai, Hua; Fox, Patricia S.; Godoy, Myrna Cobos Barco; Munden, Reginald F.; de Groot, Patricia M.

    2014-01-01

    The purpose of this study was to reduce the radiation dosage associated with computed tomography (CT) lung cancer screening while maintaining overall diagnostic image quality and definition of ground-glass opacities (GGOs). A lung screening phantom and a multipurpose chest phantom were used to quantitatively assess the performance of two iterative image reconstruction algorithms (adaptive statistical iterative reconstruction (ASIR) and model-based iterative reconstruction (MBIR)) used in conjunction with reduced tube currents relative to a standard clinical lung cancer screening protocol (51 effective mAs (3.9 mGy) and filtered back-projection (FBP) reconstruction). To further assess the algorithms' performances, qualitative image analysis was conducted (in the form of a reader study) using the multipurpose chest phantom, which was implanted with GGOs of two densities. Our quantitative image analysis indicated that tube current, and thus radiation dose, could be reduced by 40% or 80% from ASIR or MBIR, respectively, compared with conventional FBP, while maintaining similar image noise magnitude and contrast-to-noise ratio. The qualitative portion of our study, which assessed reader preference, yielded similar results, indicating that dose could be reduced by 60% (to 20 effective mAs (1.6 mGy)) with either ASIR or MBIR, while maintaining GGO definition. Additionally, the readers' preferences (as indicated by their ratings) regarding overall image quality were equal or better (for a given dose) when using ASIR or MBIR, compared with FBP. In conclusion, combining ASIR or MBIR with reduced tube current may allow for lower doses while maintaining overall diagnostic image quality, as well as GGO definition, during CT lung cancer screening. PMID:24710436
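
    Contrast-to-noise ratio, one of the quantitative endpoints above, has a simple standard form (several conventions exist); the sketch computes it for synthetic HU samples standing in for a GGO insert and lung background:

      import numpy as np

      def cnr(roi_object, roi_background):
          """CNR = |mean difference| / background noise (one common convention)."""
          return abs(np.mean(roi_object) - np.mean(roi_background)) / np.std(roi_background)

      rng = np.random.default_rng(0)
      ggo = rng.normal(-650, 25, 500)     # synthetic ground-glass opacity, HU
      lung = rng.normal(-850, 25, 2000)   # synthetic lung parenchyma, HU
      print(f"CNR = {cnr(ggo, lung):.1f}")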

  4. Visual performance-based image enhancement methodology: an investigation of contrast enhancement algorithms

    NASA Astrophysics Data System (ADS)

    Neriani, Kelly E.; Herbranson, Travis J.; Reis, George A.; Pinkus, Alan R.; Goodyear, Charles D.

    2006-05-01

    While vast numbers of image-enhancing algorithms have already been developed, the majority of these algorithms have not been assessed in terms of their visual performance-enhancing effects using militarily relevant scenarios. The goal of this research was to apply a visual performance-based assessment methodology to evaluate six algorithms that were specifically designed to enhance the contrast of digital images. The image-enhancing algorithms used in this study included three different histogram equalization algorithms, the Autolevels function, the Recursive Rational Filter technique described by Marsi, Ramponi, and Carrato, and the multiscale Retinex algorithm described by Rahman, Jobson, and Woodell. The methodology used in the assessment has been developed to acquire objective human visual performance data as a means of evaluating the contrast enhancement algorithms. Objective performance metrics, response time and error rate, were used to compare algorithm-enhanced images versus two baseline conditions: original non-enhanced images and contrast-degraded images. Observers completed a visual search task using a spatial forced-choice paradigm. Observers searched images for a target (a military vehicle) hidden among foliage and then indicated in which quadrant of the screen the target was located. Response time and percent correct were measured for each observer. Results of the study and future directions are discussed.
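
    One of the six techniques evaluated, global histogram equalization, fits in a few lines; this generic 8-bit grayscale sketch is not the study's implementation:

      import numpy as np

      def histogram_equalize(img):
          """Map gray levels through the normalized cumulative histogram."""
          hist = np.bincount(img.ravel(), minlength=256)
          cdf = hist.cumsum() / img.size              # cumulative distribution
          lut = np.round(255 * cdf).astype(np.uint8)  # level-mapping lookup table
          return lut[img]

      rng = np.random.default_rng(0)
      img = np.clip(rng.normal(90, 20, (64, 64)), 0, 255).astype(np.uint8)
      eq = histogram_equalize(img)
      print(img.std(), "->", eq.std())                # contrast is stretched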

  5. Applying a visual language for image processing as a graphical teaching tool in medical imaging

    NASA Astrophysics Data System (ADS)

    Birchman, James J.; Tanimoto, Steven L.; Rowberg, Alan H.; Choi, Hyung-Sik; Kim, Yongmin

    1992-05-01

    Typical user interaction in image processing is with command line entries, pull-down menus, or text menu selections from a list, and as such is not generally graphical in nature. Although applying these interactive methods to construct more sophisticated algorithms from a series of simple image processing steps may be clear to engineers and programmers, it may not be clear to clinicians. A solution to this problem is to implement a visual programming language using visual representations to express image processing algorithms. Visual representations promote a more natural and rapid understanding of image processing algorithms by providing more visual insight into what the algorithms do than the interactive methods mentioned above can provide. Individuals accustomed to dealing with images will be more likely to understand an algorithm that is represented visually. This is especially true of referring physicians, such as surgeons in an intensive care unit. With the increasing acceptance of picture archiving and communications system (PACS) workstations and the trend toward increasing clinical use of image processing, referring physicians will need to learn more sophisticated concepts than simply image access and display. If the procedures that they perform commonly, such as window width and window level adjustment and image enhancement using unsharp masking, are depicted visually in an interactive environment, it will be easier for them to learn and apply these concepts. The software described in this paper is a visual programming language for image processing which has been implemented on the NeXT computer using NeXTstep user interface development tools and other tools in an object-oriented environment. The concept is based upon the description of a visual language titled 'Visualization of Vision Algorithms' (VIVA). Iconic representations of simple image processing steps are placed into a workbench screen and connected together into a dataflow path by the user. As the user creates and edits a dataflow path, more complex algorithms can be built on the screen. Once the algorithm is built, it can be executed, its results can be reviewed, and operator parameters can be interactively adjusted until an optimized output is produced. The optimized algorithm can then be saved and added to the system as a new operator. This system has been evaluated as a graphical teaching tool for window width and window level adjustment, image enhancement using unsharp masking, and other techniques.

  6. An operational retrieval algorithm for determining aerosol optical properties in the ultraviolet

    NASA Astrophysics Data System (ADS)

    Taylor, Thomas E.; L'Ecuyer, Tristan S.; Slusser, James R.; Stephens, Graeme L.; Goering, Christian D.

    2008-02-01

    This paper describes a number of practical considerations concerning the optimization and operational implementation of an algorithm used to characterize the optical properties of aerosols across part of the ultraviolet (UV) spectrum. The algorithm estimates values of aerosol optical depth (AOD) and aerosol single scattering albedo (SSA) at seven wavelengths in the UV, as well as total column ozone (TOC) and wavelength-independent asymmetry factor (g) using direct and diffuse irradiances measured with a UV multifilter rotating shadowband radiometer (UV-MFRSR). A novel method for cloud screening the irradiance data set is introduced, as well as several improvements and optimizations to the retrieval scheme which yield a more realistic physical model for the inversion and increase the efficiency of the algorithm. Introduction of a wavelength-dependent retrieval error budget generated from rigorous forward model analysis as well as broadened covariances on the a priori values of AOD, SSA and g and tightened covariances of TOC allows sufficient retrieval sensitivity and resolution to obtain unique solutions of aerosol optical properties as demonstrated by synthetic retrievals. Analysis of a cloud screened data set (May 2003) from Panther Junction, Texas, demonstrates that the algorithm produces realistic values of the optical properties that compare favorably with pseudo-independent methods for AOD, TOC and calculated Ångstrom exponents. Retrieval errors of all parameters (except TOC) are shown to be negatively correlated to AOD, while the Shannon information content is positively correlated, indicating that retrieval skill improves with increasing atmospheric turbidity. When implemented operationally on more than thirty instruments in the Ultraviolet Monitoring and Research Program's (UVMRP) network, this retrieval algorithm will provide a comprehensive and internally consistent climatology of ground-based aerosol properties in the UV spectral range that can be used for both validation of satellite measurements as well as regional aerosol and ultraviolet transmission studies.

  7. Strategies to improve the efficiency of celiac disease diagnosis in the laboratory.

    PubMed

    González, Delia Almeida; de Armas, Laura García; Rodríguez, Itahisa Marcelino; Almeida, Ana Arencibia; García, Miriam García; Gannar, Fadoua; de León, Antonio Cabrera

    2017-10-01

    The demand for testing to detect celiac disease (CD) autoantibodies has increased, together with the cost per case diagnosed, resulting in the adoption of measures to restrict laboratory testing. We designed this study to determine whether opportunistic screening to detect CD-associated autoantibodies had advantages compared to efforts to restrict testing, and to identify the most cost-effective diagnostic strategy. We compared a group of 1678 patients in which autoantibody testing was restricted to cases in which the test referral was considered appropriate (G1) with a group of 2140 patients in which test referrals were not reviewed or restricted (G2). Two algorithms, A (quantifying IgA and tissue transglutaminase IgA [TG-IgA] in all patients) and B (quantifying only TG-IgA in all patients), were used in each group, and the cost-effectiveness of each strategy was calculated. TG-IgA autoantibodies were positive in 62 G1 patients and 69 G2 patients. Among those positive for tissue transglutaminase IgA and endomysial IgA autoantibodies, the proportion of patients with de novo autoantibodies was lower (p=0.028) in G1 (11/62) than in G2 (24/69). Algorithm B required fewer determinations than algorithm A in both G1 (2310 vs 3493; p<0.001) and G2 (2196 vs 4435; p<0.001). With algorithm B the proportion of patients in whom IgA was tested was lower (p<0.001) in G2 (29/2140) than in G1 (617/1678). The lowest cost per case diagnosed (4.63 euros/patient) was found with algorithm B in G2. We conclude that opportunistic screening has advantages compared to efforts in the laboratory to restrict CD diagnostic testing. The most cost-effective strategy was based on the use of an appropriate algorithm. Copyright © 2017. Published by Elsevier B.V.
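
    A hedged sketch of a two-step flow in the spirit of algorithm B follows: TG-IgA is measured for everyone, and total IgA only when TG-IgA is negative, to flag IgA deficiency. The thresholds and the exact decision rules are placeholders, not the paper's values:

      def algorithm_b(tg_iga_u_ml, measure_total_iga):
          if tg_iga_u_ml >= 10.0:              # placeholder positivity cut-off
              return "TG-IgA positive: refer for confirmation (EMA/biopsy)"
          iga_g_l = measure_total_iga()        # second determination, only when needed
          if iga_g_l < 0.07:                   # placeholder deficiency threshold
              return "IgA deficient: use IgG-based celiac serology"
          return "Seronegative with normal IgA"

      print(algorithm_b(25.0, lambda: 1.2))
      print(algorithm_b(2.0, lambda: 0.02))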

  8. Low cost automated whole smear microscopy screening system for detection of acid fast bacilli.

    PubMed

    Law, Yan Nei; Jian, Hanbin; Lo, Norman W S; Ip, Margaret; Chan, Mia Mei Yuk; Kam, Kai Man; Wu, Xiaohua

    2018-01-01

    In countries with a high tuberculosis (TB) burden, there is an urgent need for rapid, large-scale screening to detect smear-positive patients. We developed a computer-aided whole-smear screening system that focuses in real time, captures images, and provides diagnostic grading, for both bright-field and fluorescence microscopy, for detection of acid-fast bacilli (AFB) in respiratory specimens. The aim was to evaluate the performance of the dual-mode screening system in AFB diagnostic algorithms on concentrated smears with auramine O (AO) staining, as well as on direct smears with AO and Ziehl-Neelsen (ZN) staining, using mycobacterial culture results as the gold standard. Adult patient sputum samples submitted for M. tuberculosis culture were divided into three batches for staining: direct AO-stained, direct ZN-stained, and concentrated AO-stained smears. All slides were graded by an experienced microscopist, in parallel with the automated whole-smear screening system. Sensitivity and specificity of a TB diagnostic algorithm using the screening system alone, and in combination with a microscopist, were evaluated. Of 488 direct AO-stained smears, 228 were culture positive; these yielded a sensitivity of 81.6% and specificity of 74.2%. Of 334 direct smears with ZN staining, 142 were culture positive, which gave a sensitivity of 70.4% and specificity of 76.6%. Of 505 concentrated smears with AO staining, 250 were culture positive, giving a sensitivity of 86.4% and specificity of 71.0%. To further improve performance, machine grading was confirmed by manual smear grading when the number of AFBs detected fell within an uncertainty range. These combined results gave a significant improvement in specificity (AO-direct: 85.4%; ZN-direct: 85.4%; AO-concentrated: 92.5%) and a slight improvement in sensitivity while requiring only a limited manual workload. Our system achieved high sensitivity without substantially compromising specificity when compared to culture results. A significant improvement in specificity was obtained when uncertain results were confirmed by manual smear grading. This approach has the potential to substantially reduce the workload of microscopists in high-burden countries.

  9. A safe and effective management strategy for blunt cerebrovascular injury: Avoiding unnecessary anticoagulation and eliminating stroke.

    PubMed

    Shahan, Charles P; Magnotti, Louis J; Stickley, Shaun M; Weinberg, Jordan A; Hendrick, Leah E; Uhlmann, Rebecca A; Schroeppel, Thomas J; Hoit, Daniel A; Croce, Martin A; Fabian, Timothy C

    2016-06-01

    Few injuries have produced as much debate with respect to management as have blunt cerebrovascular injuries (BCVIs). Recent work (American Association for the Surgery of Trauma 2013) from our institution suggested that 64-channel multidetector computed tomographic angiography (CTA) could be the primary screening tool for BCVI. Consequently, our screening algorithm changed from digital subtraction angiography (DSA) to CTA, with DSA reserved for definitive diagnosis of BCVI following CTA-positive study results or unexplained neurologic findings. The current study was performed to evaluate outcomes, including the potential for missed clinically significant BCVI, since this new management algorithm was adopted. Patients who underwent DSA (positive CTA finding or unexplained neurologic finding) over an 18-month period subsequent to the previous study were identified. Screening and confirmatory test results, complications, and BCVI-related strokes were reviewed and compared. A total of 228 patients underwent DSA: 64% were male, with mean age and Injury Severity Score (ISS) of 43 years and 22, respectively. A total of 189 patients (83%) had a positive screening CTA result. Of these, DSA confirmed injury in 104 patients (55%); the remaining 85 patients (45%) (false-positive results) were found to have no injury on DSA. Five patients (4.8%) experienced BCVI-related strokes, unchanged from the previous study (3.9%, p = 0.756); two were symptomatic at trauma center presentation, and three occurred while receiving appropriate therapy. No patient with a negative screening CTA result experienced a stroke. This management scheme using 64-channel CTA for screening coupled with DSA for definitive diagnosis was proven to be safe and effective in identifying clinically significant BCVIs and maintaining a low stroke rate. Definitive diagnosis by DSA led to avoidance of potentially harmful anticoagulation in 45% of CTA-positive patients (false-positive results). No strokes resulted from injuries missed by CTA. Diagnostic study, level III.

  10. Health economics of screening for gynaecological cancers.

    PubMed

    Kulasingam, Shalini; Havrilesky, Laura

    2012-04-01

    In this chapter, we summarise findings from recent cost-effectiveness analyses of screening for cervical cancer and ovarian cancer. We begin with a brief summary of key issues that affect the cost-effectiveness of screening, including disease burden and the availability and type of screening tests. For cervical cancer, we discuss the potential effect of human papilloma virus vaccines on screening. Outstanding epidemiological and cost-effectiveness issues are included. For cervical cancer, this includes incorporating the long-term effects of treatment (including adverse birth outcomes in treated women who are of reproductive age) into cost-effectiveness models, using newly available trial data to identify the best strategy for incorporating human papilloma virus tests. A second issue is the need for additional data on human papilloma virus vaccines, such as effectiveness in reducing cancer incidence and mortality, effectiveness in previously exposed women, and coverage. Definitive data on these parameters will allow us to update model-based analyses to include more realistic estimates, and also potentially dramatically alter our approach to screening. For ovarian cancer, outstanding issues include confirming within the context of a trial that screening is effective for reducing mortality, and incorporating tests with high specificity into screening algorithms for ovarian cancer. Copyright © 2011 Elsevier Ltd. All rights reserved.

  11. A first trimester trisomy 13/trisomy 18 risk algorithm combining fetal nuchal translucency thickness, maternal serum free beta-hCG and PAPP-A.

    PubMed

    Spencer, Kevin; Nicolaides, Kypros H

    2002-10-01

    This study examines 45 cases of trisomy 13 and 59 cases of trisomy 18 and reports an algorithm to identify pregnancies with a fetus affected by trisomy 13 or 18 by a combination of maternal age, fetal nuchal translucency (NT) thickness, and maternal serum free beta-hCG and PAPP-A at 11-14 weeks of gestation. In this mixed trisomy group the median NT MoM was increased at 2.819, whilst the median MoMs for free beta-hCG and PAPP-A were reduced at 0.375 and 0.201, respectively. We predict that use of the combined trisomy 13 and 18 algorithm with a risk cut-off of 1 in 150 will, for a 0.3% false-positive rate, allow 95% of these chromosomal defects to be identified at 11-14 weeks. Such algorithms will enhance existing first-trimester screening algorithms for trisomy 21. Copyright 2002 John Wiley & Sons, Ltd.
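
    The multiple-of-the-median (MoM) arithmetic underlying such algorithms is simple; the marker values and gestation-specific medians below are illustrative numbers chosen to reproduce the reported median MoMs, not data from the study:

      def mom(value, ga_median):
          """Express a marker as a multiple of the gestation-specific median."""
          return value / ga_median

      nt_mom = mom(4.8, 1.7)        # NT, mm / median NT for this gestation -> 2.82
      hcg_mom = mom(15.0, 40.0)     # free beta-hCG -> 0.375
      pappa_mom = mom(0.6, 3.0)     # PAPP-A -> 0.20
      print(f"NT {nt_mom:.2f} MoM, hCG {hcg_mom:.2f} MoM, PAPP-A {pappa_mom:.2f} MoM")
      # A full implementation converts the MoM profile to a likelihood ratio and
      # combines it with the maternal-age prior against the 1-in-150 risk cut-off.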

  12. Shadow Detection Based on Regions of Light Sources for Object Extraction in Nighttime Video

    PubMed Central

    Lee, Gil-beom; Lee, Myeong-jin; Lee, Woo-Kyung; Park, Joo-heon; Kim, Tae-Hwan

    2017-01-01

    Intelligent video surveillance systems detect pre-configured surveillance events through background modeling, foreground and object extraction, object tracking, and event detection. Shadow regions inside video frames sometimes appear as foreground objects, interfere with ensuing processes, and finally degrade the event detection performance of the systems. Conventional studies have mostly used intensity, color, texture, and geometric information to perform shadow detection in daytime video, but these methods lack the capability of removing shadows in nighttime video. In this paper, a novel shadow detection algorithm for nighttime video is proposed; this algorithm partitions each foreground object based on the object’s vertical histogram and screens out shadow objects by validating their orientations heading toward regions of light sources. From the experimental results, it can be seen that the proposed algorithm shows more than 93.8% shadow removal and 89.9% object extraction rates for nighttime video sequences, and the algorithm outperforms conventional shadow removal algorithms designed for daytime videos. PMID:28327515

  13. Consistency of Global Modis Aerosol Optical Depths over Ocean on Terra and Aqua Ceres SSF Datasets

    NASA Technical Reports Server (NTRS)

    Ignatov, Alexander; Minnis, Patrick; Miller, Walter F.; Wielicki, Bruce A.; Remer, Lorraine

    2006-01-01

    Aerosol retrievals over ocean from the Moderate Resolution Imaging Spectroradiometer (MODIS) onboard the Terra and Aqua platforms are available from the Clouds and the Earth's Radiant Energy System (CERES) Single Scanner Footprint (SSF) datasets generated at NASA Langley Research Center (LaRC). Two aerosol products are reported side by side. The primary M product is generated by sub-setting and remapping the multi-spectral (0.47-2.1 micrometer) MODIS-produced oceanic aerosol (MOD04/MYD04 for Terra/Aqua) onto CERES footprints. M*D04 processing uses cloud screening and aerosol algorithms developed by the MODIS science team. The secondary AVHRR-like A product is generated in only two MODIS bands, 1 and 6 (on Aqua, bands 1 and 7). The A processing uses the CERES cloud screening algorithm, and NOAA/NESDIS glint identification and single-channel aerosol retrieval algorithms. The M and A products have been documented elsewhere and preliminarily compared using 2 weeks of global Terra CERES SSF Edition 1A data in which the M product was based on MOD04 collection 3. In this study, the comparisons between the M and A aerosol optical depths (AOD) in MODIS band 1 (0.64 micrometers), τ1M and τ1A, are re-examined using 9 days of global CERES SSF Terra Edition 2A and Aqua Edition 1B data from 13-21 October 2002, and extended to include cross-platform comparisons. The M and A products on the new CERES SSF release are generated using the same aerosol algorithms as before, but with different preprocessing and sampling procedures, lending themselves to a simple sensitivity check to non-aerosol factors. Both τ1M and τ1A generally compare well across platforms. However, the M product shows some differences, which increase with ambient cloud amount and towards the solar side of the orbit. The three types of comparisons conducted in this study - cross-platform, cross-product, and cross-release - confirm the previously made observation that the major area for improvement in the current aerosol processing lies in more formalized and standardized sampling (and, most importantly, cloud screening), whereas optimization of the aerosol algorithm is deemed an important yet less critical element.

  14. Screening for prenatal substance use: development of the Substance Use Risk Profile-Pregnancy scale.

    PubMed

    Yonkers, Kimberly A; Gotman, Nathan; Kershaw, Trace; Forray, Ariadna; Howell, Heather B; Rounsaville, Bruce J

    2010-10-01

    To report on the development of a questionnaire to screen for hazardous substance use in pregnant women and to compare the performance of the questionnaire with other drug and alcohol measures. Pregnant women were administered a modified TWEAK (Tolerance, Worried, Eye-openers, Amnesia, K[C] Cut Down) questionnaire, the 4Ps Plus questionnaire, items from the Addiction Severity Index, and two questions about domestic violence (N=2,684). The sample was divided into "training" (n=1,610) and "validation" (n=1,074) subsamples. We applied recursive partitioning class analysis to the responses from individuals in the training subsample, which resulted in a three-item Substance Use Risk Profile-Pregnancy scale. We examined sensitivity, specificity, and the fit of logistic regression models in the validation subsample to compare the performance of the Substance Use Risk Profile-Pregnancy scale with the modified TWEAK and various scoring algorithms of the 4Ps. The Substance Use Risk Profile-Pregnancy scale comprises three informative questions that can be scored for high- or low-risk populations. The Substance Use Risk Profile-Pregnancy scale algorithm for low-risk populations was the most highly predictive of substance use in the validation subsample (Akaike's Information Criterion=579.75, Nagelkerke R2=0.27) with high sensitivity (91%) and adequate specificity (67%). The high-risk algorithm had lower sensitivity (57%) but higher specificity (88%). The Substance Use Risk Profile-Pregnancy scale is simple and flexible with good sensitivity and specificity. The Substance Use Risk Profile-Pregnancy scale can potentially detect a range of substances that may be abused. Clinicians need to further assess women with a positive screen to identify those who require treatment for alcohol or illicit substance use in pregnancy. III.

  15. A mobile platform for automated screening of asthma and chronic obstructive pulmonary disease.

    PubMed

    Chamberlain, Daniel B; Kodgule, Rahul; Fletcher, Richard Ribon

    2016-08-01

    Chronic Obstructive Pulmonary Disease (COPD) and asthma each represent a large proportion of the global disease burden; COPD is the third leading cause of death worldwide, and asthma is one of the most prevalent chronic diseases, afflicting over 300 million people. Much of this burden is concentrated in the developing world, where patients lack access to physicians trained in the diagnosis of pulmonary disease. As a result, these patients experience high rates of underdiagnosis and misdiagnosis. To address this need, we present a mobile platform capable of screening for asthma and COPD. Our solution is based on a mobile smartphone and consists of an electronic stethoscope, a peak flow meter application, and a patient questionnaire. These data are combined with a machine learning algorithm to identify patients with asthma and COPD. To test and validate the design, we collected data from 119 healthy and sick participants using our custom mobile application and ran the analysis on a PC. For comparison, all subjects were examined by an experienced pulmonologist using a full pulmonary testing laboratory. Employing a two-stage logistic regression model, our algorithms were first able to identify patients with either asthma or COPD from the general population, yielding an ROC curve with an AUC of 0.95. Then, after identifying these patients, our algorithm was able to distinguish between patients with asthma and patients with COPD, yielding an ROC curve with an AUC of 0.97. This work represents an important milestone towards creating a self-contained mobile phone-based platform that can be used for screening and diagnosis of pulmonary disease in many parts of the world.

  16. Diagnostic Performance of a Smartphone-Based Photoplethysmographic Application for Atrial Fibrillation Screening in a Primary Care Setting.

    PubMed

    Chan, Pak-Hei; Wong, Chun-Ka; Poh, Yukkee C; Pun, Louise; Leung, Wangie Wan-Chiu; Wong, Yu-Fai; Wong, Michelle Man-Ying; Poh, Ming-Zher; Chu, Daniel Wai-Sing; Siu, Chung-Wah

    2016-07-21

    Diagnosing atrial fibrillation (AF) before ischemic stroke occurs is a priority for stroke prevention in AF. Smartphone camera-based photoplethysmographic (PPG) pulse waveform measurement discriminates between different heart rhythms, but its ability to diagnose AF in real-world situations has not been adequately investigated. We sought to assess the diagnostic performance of a standalone smartphone PPG application, Cardiio Rhythm, for AF screening in a primary care setting. Patients with hypertension, with diabetes mellitus, and/or aged ≥65 years were recruited. A single-lead ECG was recorded by using the AliveCor heart monitor, with tracings reviewed subsequently by 2 cardiologists to provide the reference standard. PPG measurements were performed by using the Cardiio Rhythm smartphone application. AF was diagnosed in 28 (2.76%) of 1013 participants. The diagnostic sensitivity of the Cardiio Rhythm for AF detection was 92.9% (95% CI 77-99%) and was higher than that of the AliveCor automated algorithm (71.4% [95% CI 51-87%]). The specificities of Cardiio Rhythm and the AliveCor automated algorithm were comparable (97.7% [95% CI 97-99%] versus 99.4% [95% CI 99-100%]). The positive predictive value of the Cardiio Rhythm was lower than that of the AliveCor automated algorithm (53.1% [95% CI 38-67%] versus 76.9% [95% CI 56-91%]); both had a very high negative predictive value (99.8% [95% CI 99-100%] versus 99.2% [95% CI 98-100%]). The Cardiio Rhythm smartphone PPG application provides an accurate and reliable means to detect AF in patients at risk of developing AF and has the potential to enable population-based screening for AF. © 2016 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
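
    The reported predictive values follow from sensitivity, specificity, and the 2.76% AF prevalence via Bayes' rule; the sketch below recovers the Cardiio Rhythm figures from the abstract (small differences reflect rounding of the inputs):

      def ppv_npv(sens, spec, prev):
          ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
          npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
          return ppv, npv

      ppv, npv = ppv_npv(sens=0.929, spec=0.977, prev=28 / 1013)
      print(f"PPV = {ppv:.1%}, NPV = {npv:.1%}")   # ~53% and ~99.8%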

  17. Screening for Prenatal Substance Use

    PubMed Central

    Yonkers, Kimberly A.; Gotman, Nathan; Kershaw, Trace; Forray, Ariadna; Howell, Heather B.; Rounsaville, Bruce J.

    2011-01-01

    OBJECTIVE To report on the development of a questionnaire to screen for hazardous substance use in pregnant women and to compare the performance of the questionnaire with other drug and alcohol measures. METHODS Pregnant women were administered a modified TWEAK (Tolerance, Worried, Eye-openers, Amnesia, K[C] Cut Down) questionnaire, the 4Ps Plus questionnaire, items from the Addiction Severity Index, and two questions about domestic violence (N=2,684). The sample was divided into “training” (n=1,610) and “validation” (n=1,074) subsamples. We applied recursive partitioning class analysis to the responses from individuals in the training subsample, which resulted in a three-item Substance Use Risk Profile-Pregnancy scale. We examined sensitivity, specificity, and the fit of logistic regression models in the validation subsample to compare the performance of the Substance Use Risk Profile-Pregnancy scale with the modified TWEAK and various scoring algorithms of the 4Ps. RESULTS The Substance Use Risk Profile-Pregnancy scale comprises three informative questions that can be scored for high- or low-risk populations. The scale’s algorithm for low-risk populations was the most highly predictive of substance use in the validation subsample (Akaike’s Information Criterion=579.75, Nagelkerke R2=0.27), with high sensitivity (91%) and adequate specificity (67%). The high-risk algorithm had lower sensitivity (57%) but higher specificity (88%). CONCLUSION The Substance Use Risk Profile-Pregnancy scale is simple and flexible, with good sensitivity and specificity, and can potentially detect a range of substances that may be abused. Clinicians need to further assess women with a positive screen to identify those who require treatment for alcohol or illicit substance use in pregnancy. PMID:20859145
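
    For readers unfamiliar with recursive partitioning, the sketch below shows the general technique on synthetic data: a shallow decision tree is grown on a “training” subsample and its sensitivity and specificity are checked on a “validation” subsample. The items and outcome are invented, not the actual scale content.

      # Sketch of recursive partitioning for a short screening scale
      # (synthetic items and outcome; not the SURP-Pregnancy data).
      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.tree import DecisionTreeClassifier, export_text

      rng = np.random.default_rng(1)
      items = rng.integers(0, 2, size=(2684, 3))            # three yes/no items
      risk = (items.sum(axis=1) + rng.random(2684) > 2).astype(int)  # toy outcome

      X_train, X_val, y_train, y_val = train_test_split(
          items, risk, train_size=1610, random_state=1)     # "training"/"validation"

      tree = DecisionTreeClassifier(max_depth=3).fit(X_train, y_train)
      pred = tree.predict(X_val)
      sens = (pred[y_val == 1] == 1).mean()
      spec = (pred[y_val == 0] == 0).mean()
      print(export_text(tree, feature_names=["item1", "item2", "item3"]))
      print(f"validation sensitivity={sens:.2f}, specificity={spec:.2f}")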

  18. Operationalizing hippocampal volume as an enrichment biomarker for amnestic mild cognitive impairment trials: effect of algorithm, test-retest variability, and cut point on trial cost, duration, and sample size.

    PubMed

    Yu, Peng; Sun, Jia; Wolz, Robin; Stephenson, Diane; Brewer, James; Fox, Nick C; Cole, Patricia E; Jack, Clifford R; Hill, Derek L G; Schwarz, Adam J

    2014-04-01

    The objective of this study was to evaluate the effect of computational algorithm, measurement variability, and cut point on hippocampal volume (HCV)-based patient selection for clinical trials in mild cognitive impairment (MCI). We used normal control and amnestic MCI subjects from the Alzheimer's Disease Neuroimaging Initiative 1 (ADNI-1) as normative reference and screening cohorts. We evaluated the enrichment performance of 4 widely used hippocampal segmentation algorithms (FreeSurfer, Hippocampus Multi-Atlas Propagation and Segmentation [HMAPS], Learning Embeddings Atlas Propagation [LEAP], and NeuroQuant) in terms of 2-year changes in Mini-Mental State Examination (MMSE), Alzheimer's Disease Assessment Scale-Cognitive Subscale (ADAS-Cog), and Clinical Dementia Rating Sum of Boxes (CDR-SB) scores. We modeled the implications for sample size, screen-fail rates, and trial cost and duration. HCV-based patient selection yielded reduced sample sizes (by ∼40%-60%) and lower trial costs (by ∼30%-40%) across a wide range of cut points. These results provide a guide to the choice of HCV cut point for amnestic MCI clinical trials, allowing an informed tradeoff between statistical and practical considerations. Copyright © 2014 Elsevier Inc. All rights reserved.
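
    The tradeoff the authors quantify can be illustrated with a toy power calculation: tightening the HCV cut point enriches for faster decliners (a larger detectable effect), shrinking the randomized sample, but excludes more screened subjects. All numbers in the sketch are assumed for illustration and are not the paper's estimates.

      # Toy model of the cut-point tradeoff: stricter HCV cuts shrink the
      # randomized sample but raise the screen-fail rate. Effect sizes (delta,
      # in CDR-SB points) and SD are assumptions, not the paper's values.
      import numpy as np
      from scipy.stats import norm

      z = norm.ppf(0.975) + norm.ppf(0.8)      # two-sided alpha=0.05, power=0.8

      for cut_pct in (100, 75, 50, 25):        # include subjects below this HCV percentile
          screen_fail = 1 - cut_pct / 100      # fraction of screened subjects excluded
          delta = 0.5 + 0.4 * screen_fail      # assumed: enrichment boosts effect size
          sigma = 1.8                          # assumed outcome SD
          n_per_arm = int(np.ceil(2 * (z * sigma / delta) ** 2))
          n_screened = int(np.ceil(2 * n_per_arm / (cut_pct / 100)))
          print(f"cut={cut_pct:3d}th pct: n/arm={n_per_arm:4d}, "
                f"screen-fail={screen_fail:.0%}, screened={n_screened:5d}")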

  19. Raman spectroscopy and imaging to detect contaminants for food safety applications

    NASA Astrophysics Data System (ADS)

    Chao, Kuanglin; Qin, Jianwei; Kim, Moon S.; Peng, Yankun; Chan, Diane; Cheng, Yu-Che

    2013-05-01

    This study presents the use of Raman chemical imaging for the screening of dry milk powder for the presence of chemical contaminants and Raman spectroscopy for quantitative assessment of chemical contaminants in liquid milk. For image-based screening, melamine was mixed into dry milk at concentrations (w/w) between 0.2% and 10.0%, and images of the mixtures were analyzed by a spectral information divergence algorithm. Ammonium sulfate, dicyandiamide, and urea were each separately mixed into dry milk at concentrations (w/w) between 0.5% and 5.0%, and an algorithm based on self-modeling mixture analysis was applied to these sample images. The contaminants were successfully detected and the spatial distribution of the contaminants within the sample mixtures was visualized using these algorithms. Liquid milk mixtures were prepared with melamine at concentrations between 0.04% and 0.30%, with ammonium sulfate and with urea at concentrations between 0.1% and 10.0%, and with dicyandiamide at concentrations between 0.1% and 4.0%. Analysis of the Raman spectra from the liquid mixtures showed linear relationships between the Raman intensities and the chemical concentrations. Although further studies are necessary, Raman chemical imaging and spectroscopy show promise for use in detecting and evaluating contaminants in food ingredients.
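
    As a rough sketch of the spectral information divergence (SID) measure named above, the snippet below scores how closely a pixel spectrum matches a contaminant reference; pixels with low SID against the melamine reference would be flagged in the chemical image. The spectra are synthetic, and the band position used is an assumption for illustration.

      # Sketch of spectral information divergence (SID) scoring for
      # contaminant screening (synthetic spectra; band position assumed).
      import numpy as np

      def sid(x, y, eps=1e-12):
          """Symmetric Kullback-Leibler divergence between two spectra,
          each normalized to a probability distribution."""
          p = (x + eps) / (x + eps).sum()
          q = (y + eps) / (y + eps).sum()
          return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

      wn = np.linspace(200, 2000, 512)                  # Raman shift, cm^-1
      reference = np.exp(-((wn - 676) / 15) ** 2)       # assumed melamine band
      pixel = 0.9 * reference + 0.1 * np.random.default_rng(2).random(512)
      print("SID(pixel, reference) =", sid(pixel, reference))  # small => match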

  20. True ion pick (TIPick): a denoising and peak picking algorithm to extract ion signals from liquid chromatography/mass spectrometry data.

    PubMed

    Ho, Tsung-Jung; Kuo, Ching-Hua; Wang, San-Yuan; Chen, Guan-Yuan; Tseng, Yufeng J

    2013-02-01

    Liquid chromatography/time-of-flight mass spectrometry has become an important technique for toxicological screening and metabolomics. We describe TIPick, a novel algorithm that accurately and sensitively detects target compounds in biological samples. TIPick comprises two main steps: background subtraction and peak picking. By subtracting a blank chromatogram, TIPick eliminates the chemical signals of blank injections and reduces false positives. TIPick detects peaks by calculating the S(CC(INI)) values of extracted ion chromatograms (EICs) without considering peak shapes, so it is able to detect tailing and fronting peaks. TIPick also uses duplicate injections to enhance peak signals and thus improve detection power. Split peaks, commonly caused either by saturation of the mass spectrometer detector or by the mathematical background subtraction algorithm, can be resolved by adjusting the mass error tolerance of the EICs and by comparing the EICs before and after background subtraction. The performance of TIPick was tested on a data set containing 297 standard mixtures; the recall, precision, and F-score were 0.99, 0.97, and 0.98, respectively. TIPick was successfully used to construct and analyze the NTU MetaCore metabolomics chemical standards library, and it has been applied in toxicological screening and metabolomics studies. Copyright © 2013 John Wiley & Sons, Ltd.
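
    The two core steps named above (blank subtraction, then peak picking on an EIC) can be illustrated generically. The sketch below uses scipy rather than the authors' S(CC(INI)) scoring, so it is a stand-in for the idea, not TIPick itself.

      # Generic illustration of blank subtraction followed by peak picking
      # on a simulated extracted ion chromatogram (not the TIPick scoring).
      import numpy as np
      from scipy.signal import find_peaks

      rng = np.random.default_rng(3)
      t = np.linspace(0, 10, 1000)                           # retention time, min
      blank = 50 + 5 * np.sin(t) + rng.normal(0, 2, t.size)  # blank injection
      analyte = 400 * np.exp(-((t - 4.2) / 0.05) ** 2)       # target compound peak
      sample = blank + analyte + rng.normal(0, 2, t.size)    # sample injection

      eic = sample - blank                                   # step 1: blank subtraction
      peaks, _ = find_peaks(eic, height=5 * eic.std(), width=3)  # step 2: peak picking
      print("detected peak apex (min):", t[peaks])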
