Sample records for combining statistical detection

  1. Scan statistics with local vote for target detection in distributed system

    NASA Astrophysics Data System (ADS)

    Luo, Junhai; Wu, Qi

    2017-12-01

    Target detection occupies a pivotal position in distributed systems. Scan statistics, one of the most efficient detection methods, have been applied to a variety of anomaly detection problems and significantly improve the probability of detection. However, scan statistics cannot achieve the expected performance when the noise intensity is strong or the signal emitted by the target is weak. The local vote algorithm can also achieve a high target detection rate. After the local vote, the counting rule is usually adopted for decision fusion; however, the counting rule does not use the information about the contiguity of sensors while taking all sensors' data into consideration, which makes the result undesirable. In this paper, we propose a scan statistics with local vote (SSLV) method, which combines scan statistics with the local vote decision. Before the scan statistics are computed, each sensor executes a local vote decision based on its own data and that of its neighbors. By combining the advantages of both, our method obtains a higher detection rate in low signal-to-noise-ratio environments than scan statistics alone. After the local vote decision, the distribution of the sensors that have detected the target becomes more concentrated. To make full use of the local vote decision, we introduce a variable step parameter for the SSLV, which significantly shortens the scan period, especially when the target is absent. Analysis and simulations are presented to demonstrate the performance of our method.
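
    A minimal sketch of the two-stage idea in this record, assuming a one-dimensional sensor field, a three-sensor voting neighborhood, and a fixed scan window; all sizes and thresholds are illustrative, not the authors' settings:

    ```python
    # SSLV sketch (not the authors' code): local majority vote on each sensor's
    # binary decision, then a sliding-window scan statistic over the votes.
    import numpy as np

    rng = np.random.default_rng(0)

    n_sensors = 200
    signal = np.zeros(n_sensors)
    signal[90:110] = 1.2                         # weak target at sensors 90..109
    obs = signal + rng.normal(0, 1, n_sensors)   # low-SNR observations

    local = (obs > 0.8).astype(int)              # each sensor's raw decision

    # Local vote: re-decide from own + immediate neighbours' decisions.
    votes = np.zeros_like(local)
    for i in range(n_sensors):
        lo, hi = max(0, i - 1), min(n_sensors, i + 2)
        votes[i] = 1 if local[lo:hi].sum() >= 2 else 0

    # Scan statistic: maximum vote count in any window of width w.
    w = 20
    counts = np.convolve(votes, np.ones(w, dtype=int), mode="valid")
    print("scan statistic:", counts.max(), "at window start", counts.argmax())
    ```

    The variable step parameter would additionally let the window skip ahead faster over stretches with no votes; it is omitted here for brevity.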

  2. Statistical Signal Process in R Language in the Pharmacovigilance Programme of India.

    PubMed

    Kumar, Aman; Ahuja, Jitin; Shrivastava, Tarani Prakash; Kumar, Vipin; Kalaiselvan, Vivekanandan

    2018-05-01

    The Ministry of Health & Family Welfare, Government of India, initiated the Pharmacovigilance Programme of India (PvPI) in July 2010. The purpose of the PvPI is to collect data on adverse reactions due to medications, analyze them, and use the findings to recommend informed regulatory interventions, besides communicating risk to health care professionals and the public. The goal of the present study was to apply statistical tools to find the relationship between drugs and ADRs for signal detection using R programming. Four statistical parameters were proposed for quantitative signal detection: IC025, PRR and PRRlb, chi-square, and N11; we calculated these 4 values using R programming. We analyzed 78,983 drug-ADR combinations, and the total count of drug-ADR combination reports was 420,060. During the calculation of the statistical parameters, we used 3 variables: (1) N11 (number of counts), (2) N1. (drug margin), and (3) N.1 (ADR margin). The structure and calculation of these 4 statistical parameters in the R language are easily understandable. On the basis of the IC value (IC value >0), out of the 78,983 drug-ADR combinations, we found 8,667 combinations to be significantly associated. The calculation of statistical parameters in the R language is time saving and allows new signals in the Indian ICSR (Individual Case Safety Reports) database to be identified easily.
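
    The four quantities named here are all functions of a 2x2 contingency table. The record's calculations were done in R; the sketch below uses Python for consistency with the other examples in this collection, with hypothetical counts, and approximates IC025 with a shrinkage-based formula rather than the full BCPNN posterior:

    ```python
    # Disproportionality sketch: N11, PRR, chi-square and IC for one
    # drug-ADR pair. Counts are invented for illustration.
    import numpy as np
    from scipy.stats import chi2_contingency

    N = 420060   # total drug-ADR combination reports
    n11 = 25     # reports with the drug of interest AND the ADR of interest
    n1_ = 980    # all reports with the drug (drug margin, "N1.")
    n_1 = 5400   # all reports with the ADR (ADR margin, "N.1")

    # 2x2 table: drug vs other drugs, ADR vs other ADRs
    table = np.array([[n11,       n1_ - n11],
                      [n_1 - n11, N - n1_ - n_1 + n11]])

    prr = (n11 / n1_) / ((n_1 - n11) / (N - n1_))
    chi2 = chi2_contingency(table, correction=False)[0]
    ic = np.log2(n11 * N / (n1_ * n_1))
    ic025 = ic - 3.3 * n11 ** -0.5 - 2.0 * n11 ** -1.5  # rough credibility bound

    print(f"N11={n11} PRR={prr:.2f} chi2={chi2:.1f} IC={ic:.2f} IC025~{ic025:.2f}")
    ```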

  3. HYPOTHESIS SETTING AND ORDER STATISTIC FOR ROBUST GENOMIC META-ANALYSIS.

    PubMed

    Song, Chi; Tseng, George C

    2014-01-01

    Meta-analysis techniques have been widely developed and applied in genomic applications, especially for combining multiple transcriptomic studies. In this paper, we propose an order statistic of p-values (the rth ordered p-value, rOP) across combined studies as the test statistic. We illustrate different hypothesis settings that detect gene markers differentially expressed (DE) "in all studies", "in the majority of studies", or "in one or more studies", and specify rOP as a suitable method for detecting DE genes "in the majority of studies". We develop methods to estimate the parameter r in rOP for real applications. Statistical properties such as its asymptotic behavior and a one-sided testing correction for detecting markers of concordant expression changes are explored. Power calculations and simulation show better performance of rOP compared to the classical Fisher's method, Stouffer's method, minimum p-value method and maximum p-value method under the focused hypothesis setting. Theoretically, rOP is found to be connected to the naïve vote counting method and can be viewed as a generalized form of vote counting with better statistical properties. The method is applied to three microarray meta-analysis examples including major depressive disorder, brain cancer and diabetes. The results demonstrate rOP as a more generalizable, robust and sensitive statistical framework to detect disease-related markers.
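
    The rOP statistic itself is compact: under the joint null hypothesis, the rth smallest of k independent p-values follows a Beta(r, k-r+1) distribution, which directly yields the combined p-value. A sketch built on that textbook fact (not the authors' code, which also estimates r):

    ```python
    # r-th ordered p-value (rOP) test; r=1 recovers the minimum-p method,
    # r=k the maximum-p method.
    from scipy.stats import beta

    def rop_pvalue(pvals, r):
        k = len(pvals)
        p_r = sorted(pvals)[r - 1]          # r-th smallest p-value
        return beta.cdf(p_r, r, k - r + 1)  # Beta CDF of the order statistic

    # Example: a gene tested in 5 studies, requiring DE in a majority (r=3)
    print(rop_pvalue([0.001, 0.004, 0.02, 0.3, 0.6], r=3))
    ```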

  4. Statistically significant performance results of a mine detector and fusion algorithm from an x-band high-resolution SAR

    NASA Astrophysics Data System (ADS)

    Williams, Arnold C.; Pachowicz, Peter W.

    2004-09-01

    Current mine detection research indicates that no single sensor or single look from a sensor will detect mines/minefields in a real-time manner at a performance level suitable for a forward maneuver unit. Hence, the integrated development of detectors and fusion algorithms is of primary importance. A problem in this development process has been the evaluation of these algorithms with relatively small data sets, leading to anecdotal and frequently overtrained results. These anecdotal results are often unreliable and conflicting among various sensors and algorithms. Consequently, the physical phenomena that ought to be exploited and the performance benefits of this exploitation are often ambiguous. The Army RDECOM CERDEC Night Vision and Electronic Sensors Directorate has collected large amounts of multisensor data such that statistically significant evaluations of detection and fusion algorithms can be obtained. Even with these large data sets, care must be taken in algorithm design and data processing to achieve statistically significant performance results for combined detectors and fusion algorithms. This paper discusses statistically significant detection and combined multilook fusion results for the Ellipse Detector (ED) and the Piecewise Level Fusion Algorithm (PLFA). These statistically significant performance results are characterized by ROC curves obtained by processing multilook data from the high-resolution Veridian X-band SAR. We discuss the implications of these results for mine detection and the importance of statistical significance, sample size, ground truth, and algorithm design in performance evaluation.

  5. The lz(p)* Person-Fit Statistic in an Unfolding Model Context.

    PubMed

    Tendeiro, Jorge N

    2017-01-01

    Although person-fit analysis has a long-standing tradition within item response theory, it has been applied in combination with dominance response models almost exclusively. In this article, a popular log likelihood-based parametric person-fit statistic under the framework of the generalized graded unfolding model is used. Results from a simulation study indicate that the person-fit statistic performed relatively well in detecting midpoint response style patterns and not so well in detecting extreme response style patterns.

  6. Damage detection in cylindrical metallic specimens by means of statistical baseline models and updated daily temperature profiles

    NASA Astrophysics Data System (ADS)

    Villamizar-Mejia, Rodolfo; Mujica-Delgado, Luis-Eduardo; Ruiz-Ordóñez, Magda-Liliana; Camacho-Navarro, Jhonatan; Moreno-Beltrán, Gustavo

    2017-05-01

    In previous works, damage detection in metallic specimens exposed to temperature changes has been achieved by using a statistical baseline model based on Principal Component Analysis (PCA) and the piezodiagnostics principle, taking the temperature effect into account by augmenting the baseline model or by using several baseline models according to the current temperature. In this paper a new approach is presented, where damage detection is based on a new index that combines the Q and T2 statistical indices with current temperature measurements. Experimental tests were carried out on a carbon-steel pipe of 1 m length and 1.5 inches diameter, instrumented with piezodevices acting as actuators or sensors. A PCA baseline model was obtained at a temperature of 21° and then the T2 and Q statistical indices were obtained for a 24 h temperature profile. Mass added at different points of the pipe between sensor and actuator was used as damage. By using the combined index, the temperature contribution can be separated and a better differentiation of damaged from undamaged cases can be obtained graphically.
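
    For context on the two indices named above, here is a generic PCA-baseline computation of T2 and Q (squared prediction error) on simulated feature vectors; the component count is an arbitrary choice and the paper's temperature-compensated combined index is not reproduced:

    ```python
    # Hotelling's T2 and Q (SPE) indices from a PCA baseline model.
    import numpy as np

    rng = np.random.default_rng(1)
    baseline = rng.normal(size=(200, 12))        # healthy-state feature vectors

    mu = baseline.mean(axis=0)
    Xc = baseline - mu
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    n_pc = 3
    P = Vt[:n_pc].T                              # retained loadings
    lam = s[:n_pc] ** 2 / (len(baseline) - 1)    # retained eigenvalues

    def t2_q(x):
        xc = x - mu
        t = P.T @ xc                             # scores in the PC subspace
        t2 = np.sum(t ** 2 / lam)                # T2: variation inside the model
        resid = xc - P @ t
        return t2, resid @ resid                 # Q/SPE: variation outside it

    print(t2_q(rng.normal(size=12)))
    ```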

  7. Properties of different selection signature statistics and a new strategy for combining them.

    PubMed

    Ma, Y; Ding, X; Qanbari, S; Weigend, S; Zhang, Q; Simianer, H

    2015-11-01

    Identifying signatures of recent or ongoing selection is of high relevance in livestock population genomics. From a statistical perspective, determining a proper testing procedure and combining various test statistics is challenging. On the basis of extensive simulations in this study, we discuss the statistical properties of eight different established selection signature statistics. In the considered scenario, we show that a reasonable power to detect selection signatures is achieved with high marker density (>1 SNP/kb) as obtained from sequencing, while rather small sample sizes (~15 diploid individuals) appear to be sufficient. Most selection signature statistics, such as the composite likelihood ratio and cross-population extended haplotype homozygosity, have the highest power when fixation of the selected allele is reached, while the integrated haplotype score has the highest power when selection is ongoing. We suggest a novel strategy, called de-correlated composite of multiple signals (DCMS), to combine different statistics for detecting selection signatures while accounting for the correlation between the different selection signature statistics. When examined with simulated data, DCMS consistently has a higher power than most of the single statistics and shows a reliable positional resolution. We illustrate the new statistic on the established selective sweep around the lactase gene in human HapMap data, providing further evidence of its reliability. Then, we apply it to scan for selection signatures in two chicken samples with diverse skin colors. Our analysis suggests that a set of well-known genes such as BCO2, MC1R, ASIP and TYR were involved in the divergent selection for this trait.
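
    A hedged sketch of the DCMS combination as the abstract describes it: p-values from the individual statistics are logit-transformed and summed with weights that down-weight statistics highly correlated with the rest. The simulated inputs and the exact weighting are assumptions based on the abstract, not the authors' code:

    ```python
    # De-correlated composite of multiple signals (DCMS), sketched.
    import numpy as np

    rng = np.random.default_rng(2)
    n_snps, n_stats = 1000, 4
    stats = rng.normal(size=(n_snps, n_stats))
    stats[:, 1] += 0.8 * stats[:, 0]          # two deliberately redundant tests

    # Empirical one-sided p-values per statistic (rank-based)
    ranks = stats.argsort(axis=0).argsort(axis=0)
    pvals = 1 - (ranks + 0.5) / n_snps

    corr = np.corrcoef(stats, rowvar=False)
    weights = 1.0 / np.abs(corr).sum(axis=1)  # redundant tests get less weight

    dcms = (np.log((1 - pvals) / pvals) * weights).sum(axis=1)
    print("top candidate SNPs:", np.argsort(dcms)[-5:][::-1])
    ```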

  8. A novelty detection diagnostic methodology for gearboxes operating under fluctuating operating conditions using probabilistic techniques

    NASA Astrophysics Data System (ADS)

    Schmidt, S.; Heyns, P. S.; de Villiers, J. P.

    2018-02-01

    In this paper, a fault diagnostic methodology is developed which is able to detect, locate and trend gear faults under fluctuating operating conditions when only vibration data from a single transducer, measured on a healthy gearbox, are available. A two-phase feature extraction and modelling process is proposed to infer the operating condition and, based on the operating condition, to detect changes in the machine condition. Information from optimised machine and operating condition hidden Markov models is statistically combined to generate a discrepancy signal, which is post-processed to infer the condition of the gearbox. The discrepancy signal is processed and combined with statistical methods for automatic fault detection and localisation and for fault trending over time. The proposed methodology is validated on experimental data, and a tacholess order tracking methodology is used to enhance the cost-effectiveness of the diagnostic methodology.

  9. Hotspot detection using space-time scan statistics on children under five years of age in Depok

    NASA Astrophysics Data System (ADS)

    Verdiana, Miranti; Widyaningsih, Yekti

    2017-03-01

    Among the problems that affect the health level in Depok are high malnutrition rates from year to year and the increasing spread of infectious and non-communicable diseases in some areas. Children under five years old are a vulnerable part of the population with respect to malnutrition and disease. For this reason, it is important to observe where and when malnutrition in Depok occurred with high intensity. To obtain the location and time of the hotspots of malnutrition and diseases that attack children under five years old, the space-time scan statistics method can be used. The space-time scan statistic is a hotspot detection method in which area and time information are taken into account simultaneously. This method detects a hotspot with a cylindrical scanning window: the base of the cylinder describes the area, and the height of the cylinder describes the time. Each cylinder formed is a candidate hotspot that requires hypothesis testing to decide whether it can be declared a hotspot. Hotspot detection in this study was carried out by forming combinations of several variables. Some combinations of variables provide hotspot detection results that tend to be the same, and so form groups (clusters). With regard to the health level of children under five in Depok city, the Beji health care center region was a hotspot in 2011-2012. Across the combinations of variables used in the detection, Beji health care center appeared most frequently as a hotspot. It is hoped that the local government can adopt the right policies to improve the health level of children under five in the city of Depok.
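
    A toy version of the cylindrical window described above, scoring every cylinder with a Kulldorff-style Poisson log-likelihood ratio; the grid, radii and counts are invented, and the Monte Carlo significance testing the method requires is omitted:

    ```python
    # Space-time scan sketch: disc of cells (cylinder base) x run of
    # consecutive periods (cylinder height), scored by Poisson LLR.
    import numpy as np

    rng = np.random.default_rng(3)
    T, H, W = 8, 10, 10
    pop = np.full((T, H, W), 100.0)
    cases = rng.poisson(5, size=(T, H, W)).astype(float)
    cases[2:5, 4:6, 4:6] += 8                 # planted space-time hotspot

    total_c, total_p = cases.sum(), pop.sum()

    def llr(c, e):
        """High-side Poisson log-likelihood ratio, 0 when not elevated."""
        if c <= e:
            return 0.0
        C = total_c
        return c * np.log(c / e) + (C - c) * np.log((C - c) / (C - e))

    ys, xs = np.mgrid[0:H, 0:W]
    best = (0.0, None)
    for cy in range(H):
        for cx in range(W):
            for r in (1, 2):
                disc = (ys - cy) ** 2 + (xs - cx) ** 2 <= r ** 2
                for t0 in range(T):
                    for t1 in range(t0 + 1, T + 1):
                        c = cases[t0:t1, disc].sum()
                        e = total_c * pop[t0:t1, disc].sum() / total_p
                        if llr(c, e) > best[0]:
                            best = (llr(c, e), (cy, cx, r, t0, t1))
    print("most likely cluster (LLR, (y, x, radius, t_start, t_end)):", best)
    ```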

  10. Systems and methods for detection of blowout precursors in combustors

    DOEpatents

    Lieuwen, Tim C.; Nair, Suraj

    2006-08-15

    The present invention comprises systems and methods for detecting flame blowout precursors in combustors. The blowout precursor detection system comprises a combustor, a pressure measuring device, and a blowout precursor detection unit. A combustion controller may also be used to control combustor parameters. The methods of the present invention comprise receiving pressure data measured by an acoustic pressure measuring device, performing one or a combination of spectral analysis, statistical analysis, and wavelet analysis on the received pressure data, and determining the existence of a blowout precursor based on such analyses. The spectral analysis, statistical analysis, and wavelet analysis further comprise their respective sub-methods to determine the existence of blowout precursors.

  11. Predictive Fusion of Geophysical Waveforms using Fisher's Method, under the Alternative Hypothesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carmichael, Joshua Daniel; Nemzek, Robert James; Webster, Jeremy David

    2017-05-05

    This presentation addresses how to combine different signatures from an event or source in a defensible way. The objective was to build a digital detector that continuously combines detection statistics from recordings of explosions to screen sources of interest from null sources.
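
    Fisher's method, named in the title, is the classical combiner of independent detection p-values: under the joint null, -2 times the sum of their logs is chi-square distributed with 2k degrees of freedom. A generic sketch, with nothing specific to the waveform-fusion system of this record:

    ```python
    # Fisher's method for combining k independent p-values.
    import numpy as np
    from scipy.stats import chi2

    def fisher_combine(pvals):
        stat = -2.0 * np.sum(np.log(pvals))
        return stat, chi2.sf(stat, df=2 * len(pvals))

    print(fisher_combine([0.03, 0.20, 0.01]))  # (statistic, combined p-value)
    ```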

  12. Human Deception Detection from Whole Body Motion Analysis

    DTIC Science & Technology

    2015-12-01

    9.3.2. Prediction Probability. The output reports from SPSS detail the stepwise procedures for each series of analyses using Wald statistic values for... statistical significance in determining replication, but instead used a combination of significance and direction of means to determine partial or... and the independents need not be unbound. All data were analyzed utilizing the Statistical Package for Social Sciences (SPSS, v.19.0, Chicago, IL).

  13. Does Assessing Eye Alignment along with Refractive Error or Visual Acuity Increase Sensitivity for Detection of Strabismus in Preschool Vision Screening?

    PubMed Central

    2007-01-01

    Purpose Preschool vision screenings often include refractive error or visual acuity (VA) testing to detect amblyopia, as well as alignment testing to detect strabismus. The purpose of this study was to determine the effect of combining screening for eye alignment with screening for refractive error or reduced VA on sensitivity for detection of strabismus, with specificity set at 90% and 94%. Methods Over 3 years, 4040 preschool children were screened in the Vision in Preschoolers (VIP) Study, with different screening tests administered each year. Examinations were performed to identify children with strabismus. The best screening tests for detecting children with any targeted condition were noncycloplegic retinoscopy (NCR), Retinomax autorefractor (Right Manufacturing, Virginia Beach, VA), SureSight Vision Screener (Welch-Allyn, Inc., Skaneateles, NY), and Lea Symbols (Precision Vision, LaSalle, IL and Good-Lite Co., Elgin, IL) and HOTV optotypes VA tests. Analyses were conducted with these tests of refractive error or VA paired with the best tests for detecting strabismus (unilateral cover testing, Random Dot “E” [RDE] and Stereo Smile Test II [Stereo Optical, Inc., Chicago, IL]; and MTI PhotoScreener [PhotoScreener, Inc., Palm Beach, FL]). The change in sensitivity that resulted from combining a test of eye alignment with a test of refractive error or VA was determined with specificity set at 90% and 94%. Results Among the 4040 children, 157 were identified as having strabismus. For screening tests conducted by eye care professionals, the addition of a unilateral cover test to a test of refraction generally resulted in a statistically significant increase (range, 15%–25%) in detection of strabismus. For screening tests administered by trained lay screeners, the addition of Stereo Smile II to SureSight resulted in a statistically significant increase (21%) in sensitivity for detection of strabismus. Conclusions The most efficient and low-cost ways to achieve a statistically significant increase in sensitivity for detection of strabismus were by combining the unilateral cover test with the autorefractor (Retinomax) administered by eye care professionals and by combining Stereo Smile II with SureSight administered by trained lay screeners. The decision of whether to include a test of alignment should be based on the screening program’s goals (e.g., targeted visual conditions) and resources. PMID:17591881

  14. Deep learning for media analysis in defense scenarios: an evaluation of an open source framework for object detection in intelligence related image sets

    DTIC Science & Technology

    2017-06-01

    Table 2.1: Training time statistics from Jones' thesis. Table 2.2: Evaluation runtime statistics from Camp's thesis for a single image. Table 2.3: Training and evaluation runtime statistics from Sharpe's thesis. Table 2.4: Sharpe's screenshot detector results for combinations of... training resources available and time required for each algorithm Jones [15] tested.

  15. A signal detection method for temporal variation of adverse effect with vaccine adverse event reporting system data.

    PubMed

    Cai, Yi; Du, Jingcheng; Huang, Jing; Ellenberg, Susan S; Hennessy, Sean; Tao, Cui; Chen, Yong

    2017-07-05

    Identifying safety signals by manual review of individual reports in large surveillance databases is time consuming, and such an approach is very unlikely to reveal complex relationships between medications and adverse events. Since the late 1990s, efforts have been made to develop data mining tools to systematically and automatically search for safety signals in surveillance databases. Influenza vaccines present special challenges to safety surveillance because the vaccine changes every year in response to the influenza strains predicted to be prevalent that year. Therefore, it may be expected that reporting rates of adverse events following flu vaccines (number of reports for a specific vaccine-event combination/number of reports for all vaccine-event combinations) may vary substantially across reporting years. Current surveillance methods seldom consider these variations in signal detection, and reports from different years are typically collapsed together to conduct safety analyses. However, merging reports from different years ignores the potential heterogeneity of reporting rates across years and may miss important safety signals. Reports of adverse events from 1990 to 2013 were extracted from the Vaccine Adverse Event Reporting System (VAERS) database and formatted into a three-dimensional data array with type of vaccine, group of adverse events and reporting time as the three dimensions. We propose a random effects model to test the heterogeneity of reporting rates for a given vaccine-event combination across reporting years. The proposed method provides a rigorous statistical procedure to detect differences in reporting rates among years. We also introduce a new visualization tool to summarize the results of the proposed method when applied to multiple vaccine-adverse event combinations. We applied the proposed method to detect safety signals of FLU3, an influenza vaccine containing three flu strains, in the VAERS database. We showed that it had high statistical power to detect the variation in reporting rates across years. The identified vaccine-event combinations with significantly different reporting rates over years suggested potential safety issues due to changes in vaccines which require further investigation. We developed a statistical model to detect safety signals arising from heterogeneity of reporting rates of a given vaccine-event combination across reporting years. This method detects variation in reporting rates over years with high power. The temporal trend of the reporting rate across years may reveal the impact of vaccine updates on the occurrence of adverse events and provide evidence for further investigations.
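
    The paper fits a random effects model; as a simpler stand-in with the same intent, the sketch below tests the homogeneity of one vaccine-event combination's reporting rate across years with a chi-square test on a years x 2 table. All counts are hypothetical:

    ```python
    # Homogeneity-of-reporting-rate check across years (illustrative only).
    import numpy as np
    from scipy.stats import chi2_contingency

    combo = np.array([12, 15, 9, 48, 11])            # combination reports per year
    total = np.array([900, 1100, 950, 1000, 980])    # all reports per year

    table = np.column_stack([combo, total - combo])
    stat, p, dof, _ = chi2_contingency(table)
    print(f"chi2={stat:.1f}, dof={dof}, p={p:.4g}")  # small p: rates vary by year
    ```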

  16. Data processing of qualitative results from an interlaboratory comparison for the detection of “Flavescence dorée” phytoplasma: How the use of statistics can improve the reliability of the method validation process in plant pathology

    PubMed Central

    Chabirand, Aude; Loiseau, Marianne; Renaudin, Isabelle; Poliakoff, Françoise

    2017-01-01

    A working group established in the framework of the EUPHRESCO European collaborative project aimed to compare and validate diagnostic protocols for the detection of “Flavescence dorée” (FD) phytoplasma in grapevines. Seven molecular protocols were compared in an interlaboratory test performance study where each laboratory had to analyze the same panel of samples consisting of DNA extracts prepared by the organizing laboratory. The tested molecular methods consisted of universal and group-specific real-time and end-point nested PCR tests. Different statistical approaches were applied to this collaborative study. First, there was the standard statistical approach, consisting of analyzing samples which are known to be positive and samples which are known to be negative and reporting the proportion of false-positive and false-negative results to calculate diagnostic specificity and sensitivity, respectively. This approach was supplemented by the calculation of repeatability and reproducibility for qualitative methods based on the notions of accordance and concordance. Other new approaches were also implemented, based, on the one hand, on the probability of detection model, and, on the other hand, on Bayes’ theorem. These various statistical approaches are complementary and give consistent results. Their combination, and in particular the introduction of the new statistical approaches, gives overall information on the performance and limitations of the different methods, and is particularly useful for selecting the most appropriate detection scheme with regard to the prevalence of the pathogen. Three real-time PCR protocols (methods M4, M5 and M6, developed respectively by Hren (2007), by Pelletier (2009) and with patented oligonucleotides) achieved the highest levels of performance for FD phytoplasma detection. This paper also addresses the issue of indeterminate results and the identification of outlier results. The statistical tools presented in this paper and their combination can be applied to many other studies concerning plant pathogens and other disciplines that use qualitative detection methods. PMID:28384335

  17. Detecting changes in dynamic and complex acoustic environments

    PubMed Central

    Boubenec, Yves; Lawlor, Jennifer; Górska, Urszula; Shamma, Shihab; Englitz, Bernhard

    2017-01-01

    Natural sounds, such as wind or rain, are characterized by the statistical occurrence of their constituents. Despite their complexity, listeners readily detect changes in these contexts. Here we address the neural basis of statistical decision-making using a combination of psychophysics, EEG and modelling. In a texture-based, change-detection paradigm, human performance and reaction times improved with longer pre-change exposure, consistent with improved estimation of baseline statistics. Change-locked and decision-related EEG responses were found at a centro-parietal scalp location, whose slope depended on change size, consistent with sensory evidence accumulation. The potential's amplitude scaled with the duration of pre-change exposure, suggesting a time-dependent decision threshold. Auditory cortex-related potentials showed no response to the change. A dual-timescale statistical estimation model accounted for subjects' performance. Furthermore, a decision-augmented auditory cortex model accounted for performance and reaction times, suggesting that the primary cortical representation requires little post-processing to enable change-detection in complex acoustic environments. DOI: http://dx.doi.org/10.7554/eLife.24910.001 PMID:28262095

  18. Distinguishing Positive Selection From Neutral Evolution: Boosting the Performance of Summary Statistics

    PubMed Central

    Lin, Kao; Li, Haipeng; Schlötterer, Christian; Futschik, Andreas

    2011-01-01

    Summary statistics are widely used in population genetics, but they suffer from the drawback that no simple sufficient summary statistic exists that captures all information required to distinguish different evolutionary hypotheses. Here, we apply boosting, a recent statistical method that combines simple classification rules to maximize their joint predictive performance. We show that our implementation of boosting has high power to detect selective sweeps. Demographic events, such as bottlenecks, do not result in a large excess of false positives. A comparison shows that our boosting implementation performs well relative to other neutrality tests. Furthermore, we evaluated the relative contribution of different summary statistics to the identification of selection and found that for recent sweeps integrated haplotype homozygosity is very informative, whereas older sweeps are better detected by Tajima's π. Overall, Watterson's θ was found to contribute the most information for distinguishing between bottlenecks and selection. PMID:21041556

  19. Mass detection, localization and estimation for wind turbine blades based on statistical pattern recognition

    NASA Astrophysics Data System (ADS)

    Colone, L.; Hovgaard, M. K.; Glavind, L.; Brincker, R.

    2018-07-01

    A method for mass change detection on wind turbine blades using natural frequencies is presented. The approach is based on two statistical tests. The first test decides if there is a significant mass change and the second test is a statistical group classification based on Linear Discriminant Analysis. The frequencies are identified by means of Operational Modal Analysis using natural excitation. Based on the assumption of Gaussianity of the frequencies, a multi-class statistical model is developed by combining finite element model sensitivities in 10 classes of change location on the blade, the smallest area being 1/5 of the span. The method is experimentally validated for a full scale wind turbine blade in a test setup and loaded by natural wind. Mass change from natural causes was imitated with sand bags and the algorithm was observed to perform well with an experimental detection rate of 1, localization rate of 0.88 and mass estimation rate of 0.72.

  1. Adaptive Locally Optimum Processing for Interference Suppression from Communication and Undersea Surveillance Signals

    DTIC Science & Technology

    1994-07-01

    1993. "Analysis of the 1730-1732. Track - Before - Detect Approach to Target Detection using Pixel Statistics", to appear in IEEE Transactions Scholz, J...large surveillance arrays. One approach to combining energy in different spatial cells is track - before - detect . References to examples appear in the next... track - before - detect problem. The results obtained are not expected to depend strongly on model details. In particular, the structure of the tracking

  2. Study on chemotherapeutic sensitizing effect of nimotuzumab on different human esophageal squamous carcinoma cells.

    PubMed

    Yang, Xiaoyu; Ji, Yinghua; Kang, Xiaochun; Chen, Meiling; Kou, Weizheng; Jin, Cailing; Lu, Ping

    2016-02-01

    Esophageal cancer is one of the leading causes of mortality worldwide. Although surgery, radio- and chemotherapy are used to treat the disease, the identification of new drugs is crucial to increase the curative effect. The aim of the present study was to examine the chemotherapeutic sensitizing effect of nimotuzumab (h-R3) combined with the cytotoxic drugs cisplatin (DDP) and 5-fluorouracil (5-FU) on esophageal carcinoma cells with two different levels of epidermal growth factor receptor (EGFR) expression. The expression of EGFR was detected in the human EC1 and EC9706 esophageal squamous cell carcinoma cell lines using immunohistochemistry. The inhibitory effect of DDP and 5-FU alone or combined with h-R3 on EC1 or EC9706 cell proliferation was detected using an MTT assay. Flow cytometry and the TUNEL assay were used to determine the effect of single or combined drug treatment on cell apoptosis. The results showed that the expression of EGFR was low in EC1 cells but high in EC9706 cells. The inhibitory effect of h-R3 alone on EC1 or EC9706 cell proliferation was low. The difference between h-R3 alone and h-R3 combined with the chemotherapy drugs was not statistically significant (P>0.05) for the EC1 cell growth rate, but was statistically significant (α=0.05) for the EC9706 cell growth rate. The results detected by flow cytometry and TUNEL assay showed that the difference between h-R3 alone and the control group was statistically significant with regard to the EC1 apoptosis rate (P<0.05), but not for EC9706 (P>0.05). However, statistically significant differences were identified in the apoptotic rate of EC9706 cells between the h-R3 combined chemotherapy group and the single chemotherapy group (P<0.05), but not for EC1 (P>0.05). In conclusion, the sensitization effect of h-R3 on chemotherapy drugs is associated with the expression level of EGFR in EC1 and EC9706 cells. The cell killing effect of the combined use of h-R3 with DDP and 5-FU showed no obvious synergistic effect compared to the single-drug group, but only an additive effect.

  3. Action detection by double hierarchical multi-structure space-time statistical matching model

    NASA Astrophysics Data System (ADS)

    Han, Jing; Zhu, Junwei; Cui, Yiyin; Bai, Lianfa; Yue, Jiang

    2018-03-01

    To address the complex information in videos and low detection efficiency, an action detection model based on neighboring Gaussian structure and 3D LARK features is put forward. We exploit a double hierarchical multi-structure space-time statistical matching model (DMSM) for temporal action localization. First, a neighboring Gaussian structure is presented to describe the multi-scale structural relationship. Then, a space-time statistical matching method is proposed to achieve two similarity matrices on both large and small scales, which combines double hierarchical structural constraints in the model through both the neighboring Gaussian structure and the 3D LARK local structure. Finally, the double hierarchical similarity is fused and analyzed to detect actions. Besides, the multi-scale composite template extends the model's application to multi-view settings. Experimental results of DMSM on the complex visual tracker benchmark data sets and the THUMOS 2014 data sets show promising performance. Compared with other state-of-the-art algorithms, DMSM achieves superior performance.

  4. Combining Statistical and Geometric Features for Colonic Polyp Detection in CTC Based on Multiple Kernel Learning

    PubMed Central

    Wang, Shijun; Yao, Jianhua; Petrick, Nicholas; Summers, Ronald M.

    2010-01-01

    Colon cancer is the second leading cause of cancer-related deaths in the United States. Computed tomographic colonography (CTC) combined with a computer aided detection system provides a feasible approach for improving colonic polyp detection and increasing the use of CTC for colon cancer screening. To distinguish true polyps from false positives, various features extracted from polyp candidates have been proposed. Most of these traditional features try to capture the shape information of polyp candidates or neighborhood knowledge about the surrounding structures (fold, colon wall, etc.). In this paper, we propose a new set of shape descriptors for polyp candidates based on statistical curvature information. These features, called histograms of curvature features, are rotation, translation and scale invariant and can be treated as complementing the existing feature set. Then, in order to make full use of the traditional geometric features (defined as group A) and the new statistical features (group B), which are highly heterogeneous, we employed a multiple kernel learning method based on semi-definite programming to learn an optimized classification kernel from the two groups of features. We conducted a leave-one-patient-out test on a CTC dataset which contained scans from 66 patients. Experimental results show that a support vector machine (SVM) based on the combined feature set and the semi-definite optimization kernel achieved higher FROC performance compared to SVMs using the two groups of features separately. At a rate of five false positives per scan, the sensitivity of the SVM using the combined features improved from 0.77 (group A) and 0.73 (group B) to 0.83 (p ≤ 0.01). PMID:20953299

  5. Compression Algorithm Analysis of In-Situ (S)TEM Video: Towards Automatic Event Detection and Characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teuton, Jeremy R.; Griswold, Richard L.; Mehdi, Beata L.

    Precise analysis of both (S)TEM images and video are time and labor intensive processes. As an example, determining when crystal growth and shrinkage occur during the dynamic process of Li dendrite deposition and stripping involves manually scanning through each frame in the video to extract a specific set of frames/images. For large numbers of images, this process can be very time consuming, so a fast and accurate automated method is desirable. Given this need, we developed software that uses analysis of video compression statistics for detecting and characterizing events in large data sets. This software works by converting the data into a series of images which it compresses into an MPEG-2 video using the open source “avconv” utility [1]. The software does not use the video itself, but rather analyzes the video statistics from the first pass of the video encoding that avconv records in the log file. This file contains statistics for each frame of the video including the frame quality, intra-texture and predicted texture bits, and forward and backward motion vector resolution, among others. In all, avconv records 15 statistics for each frame. By combining different statistics, we have been able to detect events in various types of data. We have developed an interactive tool for exploring the data and the statistics that aids the analyst in selecting useful statistics for each analysis. Going forward, an algorithm for detecting and possibly describing events automatically can be written based on statistic(s) for each data type.

  6. A Database of Computer Attacks for the Evaluation of Intrusion Detection Systems

    DTIC Science & Technology

    1999-06-01

    administrator whenever a system binary file (such as the ps, login, or ls program) is modified. Normal users have no legitimate reason to alter these files... development of EMERALD [46], which combines statistical anomaly detection from NIDES with signature verification. Specification-based intrusion detection... the creation of a single host that can act as many hosts. Daemons that provide network services—including telnetd, ftpd, and login—display banners

  7. Fast microcalcification detection in ultrasound images using image enhancement and threshold adjacency statistics

    NASA Astrophysics Data System (ADS)

    Cho, Baek Hwan; Chang, Chuho; Lee, Jong-Ha; Ko, Eun Young; Seong, Yeong Kyeong; Woo, Kyoung-Gu

    2013-02-01

    The existence of microcalcifications (MCs) is an important marker of malignancy in breast cancer. In spite of its benefits for mass detection in dense breasts, ultrasonography is believed to be unreliable for detecting MCs. For computer aided diagnosis systems, however, accurate detection of MCs has the potential to improve performance both in Breast Imaging-Reporting and Data System (BI-RADS) lexicon description for calcifications and in malignancy classification. We propose a new efficient and effective method for MC detection using image enhancement and threshold adjacency statistics (TAS). The main idea of TAS is to threshold an image and to count the number of white pixels with a given number of adjacent white pixels. Our contribution is to adopt TAS features and apply image enhancement to facilitate MC detection in ultrasound images. We employed fuzzy logic, a tophat filter, and a texture filter to enhance images for MCs. Using a total of 591 images, the classification accuracy of the proposed method in MC detection was 82.75%, which is comparable to that of Haralick texture features (81.38%). When combined, the performance was as high as 85.11%. In addition, our method also showed ability in mass classification when combined with existing features. In conclusion, the proposed method exploiting image enhancement and TAS features has the potential to deal with MC detection in ultrasound images efficiently and to extend to the real-time localization and visualization of MCs.
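
    Threshold adjacency statistics are easy to state in code: threshold the image, then histogram, over the white pixels, how many of each pixel's eight neighbours are also white. The sketch below uses a toy threshold and a synthetic patch; the paper applies enhancement (fuzzy logic, tophat and texture filters) before this step:

    ```python
    # 9-bin threshold adjacency statistics (TAS) feature for a grayscale patch.
    import numpy as np
    from scipy.ndimage import convolve

    def tas(img, thresh):
        binary = (img > thresh).astype(int)
        kernel = np.array([[1, 1, 1],
                           [1, 0, 1],
                           [1, 1, 1]])
        neighbours = convolve(binary, kernel, mode="constant", cval=0)
        counts = np.bincount(neighbours[binary == 1], minlength=9)[:9]
        return counts / max(counts.sum(), 1)       # normalised 9-bin histogram

    rng = np.random.default_rng(4)
    patch = rng.random((64, 64))
    patch[30:33, 30:33] += 1.0                      # bright MC-like spot
    print(tas(patch, thresh=0.9))
    ```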

  8. Mammographic enhancement with combining local statistical measures and sliding band filter for improved mass segmentation in mammograms

    NASA Astrophysics Data System (ADS)

    Kim, Dae Hoe; Choi, Jae Young; Choi, Seon Hyeong; Ro, Yong Man

    2012-03-01

    In this study, a novel mammogram enhancement solution is proposed, aiming to improve the quality of subsequent mass segmentation in mammograms. It is widely accepted that masses are usually hyper-dense or of uniform density with respect to their background, and that their core parts tend to have high intensity values, with intensity decreasing as the distance from the core increases. Based on these observations, we develop a new and effective mammogram enhancement method by combining local statistical measurements and Sliding Band Filtering (SBF). By effectively combining local statistical measurements and SBF, we are able to improve the contrast of the bright and smooth regions (which represent potential mass regions) as well as of regions whose surrounding gradients converge to the centers of regions of interest. In this study, 89 mammograms were collected from the public MAIS database (DB) to demonstrate the effectiveness of the proposed enhancement solution in terms of improving mass segmentation. As the segmentation method, a widely used contour-based segmentation approach was employed. The contour-based method in conjunction with the proposed enhancement solution achieved an overall detection accuracy of 92.4%, with a total of 85 correct cases. Without our enhancement solution, by contrast, the overall detection accuracy of the contour-based method was only 78.3%. In addition, experimental results demonstrated the feasibility of our enhancement solution for improving detection accuracy on mammograms containing dense parenchymal patterns.

  9. Breast cancer statistics and prediction methodology: a systematic review and analysis.

    PubMed

    Dubey, Ashutosh Kumar; Gupta, Umesh; Jain, Sonal

    2015-01-01

    Breast cancer is a menacing cancer, primarily affecting women. Continuous research is going on for detecting breast cancer in the early stage, as the possibility of cure in early stages is bright. There are two main objectives of this current study: first, to establish statistics for breast cancer, and second, to find methodologies which can be helpful in the early stage detection of breast cancer based on previous studies. The breast cancer statistics for incidence and mortality of the UK, US, India and Egypt were considered for this study. The findings of this study show that the overall mortality rates of the UK and US have improved because of awareness, improved medical technology and screening, but in the case of India and Egypt the situation is less positive because of lack of awareness. The methodological findings of this study suggest a combined framework based on data mining and evolutionary algorithms. It provides a strong bridge towards improving the classification and detection accuracy of breast cancer data.

  10. A statistical assessment of pesticide pollution in surface waters using environmental monitoring data: Chlorpyrifos in Central Valley, California.

    PubMed

    Wang, Dan; Singhasemanon, Nan; Goh, Kean S

    2016-11-15

    Pesticides are routinely monitored in surface waters and resultant data are analyzed to assess whether their uses will damage aquatic eco-systems. However, the utility of the monitoring data is limited because of the insufficiency in the temporal and spatial sampling coverage and the inability to detect and quantify trace concentrations. This study developed a novel assessment procedure that addresses those limitations by combining 1) statistical methods capable of extracting information from concentrations below changing detection limits, 2) statistical resampling techniques that account for uncertainties rooted in the non-detects and insufficient/irregular sampling coverage, and 3) multiple lines of evidence that improve confidence in the final conclusion. This procedure was demonstrated by an assessment on chlorpyrifos monitoring data in surface waters of California's Central Valley (2005-2013). We detected a significant downward trend in the concentrations, which cannot be observed by commonly-used statistical approaches. We assessed that the aquatic risk was low using a probabilistic method that works with non-detects and has the ability to differentiate indicator groups with varying sensitivity. In addition, we showed that the frequency of exceedance over ambient aquatic life water quality criteria was affected by pesticide use, precipitation and irrigation demand in certain periods anteceding the water sampling events. Copyright © 2016 Elsevier B.V. All rights reserved.

  11. A Survey of Insider Attack Detection Research

    DTIC Science & Technology

    2008-08-25

    modeling of statistical features, such as the frequency of events, the duration of events, the co-occurrence of multiple events combined through... forms of attack that have been reported. For example: • Unauthorized extraction, duplication, or exfiltration... network level. Schultz pointed out that no one approach will work but that solutions need to be based on multiple sensors to be able to find any combination

  12. A Population Study of Gaseous Exoplanets

    NASA Astrophysics Data System (ADS)

    Tsiaras, A.; Waldmann, I. P.; Zingales, T.; Rocchetto, M.; Morello, G.; Damiano, M.; Karpouzas, K.; Tinetti, G.; McKemmish, L. K.; Tennyson, J.; Yurchenko, S. N.

    2018-04-01

    We present here the analysis of 30 gaseous extrasolar planets, with temperatures between 600 and 2400 K and radii between 0.35 and 1.9 RJup. The quality of the HST/WFC3 spatially scanned data combined with our specialized analysis tools allows us to study the largest and most self-consistent sample of exoplanetary transmission spectra to date and to examine the collective behavior of warm and hot gaseous planets rather than isolated case studies. We define a new metric, the Atmospheric Detectability Index (ADI), to evaluate the statistical significance of an atmospheric detection, and find statistically significant atmospheres in around 16 of the 30 planets analyzed. For most of the Jupiters in our sample, we find the detectability of their atmospheres to be dependent on the planetary radius but not on the planetary mass. This indicates that planetary gravity plays a secondary role in the state of gaseous planetary atmospheres. We detect the presence of water vapour in all of the statistically detectable atmospheres, and we cannot rule out its presence in the atmospheres of the others. In addition, TiO and/or VO signatures are detected with 4σ confidence in WASP-76 b, and they are most likely present in WASP-121 b. We find no correlation between expected signal-to-noise and atmospheric detectability for most targets. This has important implications for future large-scale surveys.

  13. 40 CFR 796.2750 - Sediment and soil adsorption isotherm.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... are highly reproducible. The test provides excellent quantitative data readily amenable to statistical... combination of methods suitable for the identification and quantitative detection of the parent test chemical... quantitative analysis of the parent chemical. (3) Amount of parent test chemical applied, the amount recovered...

  14. 40 CFR 796.2750 - Sediment and soil adsorption isotherm.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... highly reproducible. The test provides excellent quantitative data readily amenable to statistical... combination of methods suitable for the identification and quantitative detection of the parent test chemical... quantitative analysis of the parent chemical. (3) Amount of parent test chemical applied, the amount recovered...

  15. 40 CFR 796.2750 - Sediment and soil adsorption isotherm.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... highly reproducible. The test provides excellent quantitative data readily amenable to statistical... combination of methods suitable for the identification and quantitative detection of the parent test chemical... quantitative analysis of the parent chemical. (3) Amount of parent test chemical applied, the amount recovered...

  16. A comparative analysis of chaotic particle swarm optimizations for detecting single nucleotide polymorphism barcodes.

    PubMed

    Chuang, Li-Yeh; Moi, Sin-Hua; Lin, Yu-Da; Yang, Cheng-Hong

    2016-10-01

    Evolutionary algorithms can overcome the computational limitations of statistical evaluation of large datasets for high-order single nucleotide polymorphism (SNP) barcodes. Previous studies have proposed several chaotic particle swarm optimization (CPSO) methods to detect SNP barcodes for disease analysis (e.g., for breast cancer and chronic diseases). This work evaluated additional chaotic maps combined with the particle swarm optimization (PSO) method to detect SNP barcodes using a high-dimensional dataset, and compared the search ability of all the CPSO methods. Nine chaotic maps were used to improve the results of the PSO method. The XOR and ZZ disease models were used to compare all chaotic maps combined with the PSO method. Efficacy evaluations of the CPSO methods were based on statistical values from the chi-square test (χ2). The results showed that chaotic maps could improve the search ability of the PSO method when the population is trapped in a local optimum. The minor allele frequency (MAF) indicated that, amongst all CPSO methods, the largest numbers of SNPs, sample sizes, and the highest χ2 values in all datasets were found with the Sinai chaotic map combined with the PSO method. We used simple linear regression on the gbest values across all generations to compare all methods. The Sinai chaotic map combined with the PSO method provided the highest β values (β≥0.32 in the XOR disease model and β≥0.04 in the ZZ disease model) and significant p-values (p<0.001 in both the XOR and ZZ disease models). The Sinai chaotic map was found to effectively enhance the fitness values (χ2) of the PSO method, indicating that the Sinai chaotic map combined with the PSO method is more effective at detecting potential SNP barcodes in both the XOR and ZZ disease models. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. Real-time detection of deoxyribonucleic acid bases via their negative differential conductance signature.

    PubMed

    Dragoman, D; Dragoman, M

    2009-08-01

    In this Brief Report, we present a method for the real-time detection of the bases of deoxyribonucleic acid using their signatures in negative differential conductance measurements. Existing methods for the electronic detection of deoxyribonucleic acid bases rely on statistical analysis because the electrical currents of the four bases are weak and do not differ significantly from one base to another. In contrast, we analyze a device that combines the accumulated knowledge in nanopore and scanning tunneling detection and is able to provide very distinctive electronic signatures for the four bases.

  18. Formalized Conflicts Detection Based on the Analysis of Multiple Emails: An Approach Combining Statistics and Ontologies

    NASA Astrophysics Data System (ADS)

    Zakaria, Chahnez; Curé, Olivier; Salzano, Gabriella; Smaïli, Kamel

    In Computer Supported Cooperative Work (CSCW), it is crucial for project leaders to detect conflicting situations as early as possible. Generally, this task is performed manually by studying a set of documents exchanged between team members. In this paper, we propose a full-fledged automatic solution that identifies documents, subjects and actors involved in relational conflicts. Our approach detects conflicts in emails, probably the most popular type of document in CSCW, but the methods used can handle other text-based documents. These methods rely on the combination of statistical and ontological operations. The proposed solution is decomposed into several steps: (i) we enrich a simple negative emotion ontology with terms occurring in the corpus of emails, (ii) we categorize each conflicting email according to the concepts of this ontology and (iii) we identify emails, subjects and team members involved in conflicting emails using possibilistic description logic and a set of proposed measures. Each of these steps is evaluated and validated on concrete examples. Moreover, the approach's framework is generic and can be easily adapted to domains other than conflicts, e.g. security issues, and extended with operations making use of our proposed set of measures.

  19. Predictive inference for best linear combination of biomarkers subject to limits of detection.

    PubMed

    Coolen-Maturi, Tahani

    2017-08-15

    Measuring the accuracy of diagnostic tests is crucial in many application areas including medicine, machine learning and credit scoring. The receiver operating characteristic (ROC) curve is a useful tool to assess the ability of a diagnostic test to discriminate between two classes or groups. In practice, multiple diagnostic tests or biomarkers are combined to improve diagnostic accuracy. Often, biomarker measurements are undetectable either below or above the so-called limits of detection (LoD). In this paper, nonparametric predictive inference (NPI) for the best linear combination of two or more biomarkers subject to limits of detection is presented. NPI is a frequentist statistical method that is explicitly aimed at using few modelling assumptions, enabled through the use of lower and upper probabilities to quantify uncertainty. The NPI lower and upper bounds for the ROC curve subject to limits of detection are derived, where the objective function to maximize is the area under the ROC curve. In addition, the paper discusses the effect of restrictions on the linear combination's coefficients on the analysis. Examples are provided to illustrate the proposed method. Copyright © 2017 John Wiley & Sons, Ltd.

  1. Fusion of ultrasonic and infrared signatures for personnel detection by a mobile robot

    NASA Astrophysics Data System (ADS)

    Carroll, Matthew S.; Meng, Min; Cadwallender, William K.

    1992-04-01

    Passive infrared sensors used for intrusion detection, especially those used on mobile robots, are vulnerable to false alarms caused by clutter objects such as radiators, steam pipes, windows, etc., as well as false alarms caused deliberately by decoy objects. To overcome these sources of false alarms, we are now combining thermal and ultrasonic signals, the result being a more robust system for detecting personnel. Our paper will discuss the fusion strategies used for combining sensor information. Our first strategy uses a statistical classifier with features such as the sonar cross-section, the received thermal energy, and ultrasonic range. Our second strategy uses a three-layer neural classifier trained by backpropagation. The probability of correct classification and the false alarm rate for both strategies will be presented in the paper.
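
    The first fusion strategy is a statistical classifier over the three named features. As a minimal stand-in (the abstract does not specify the classifier family), a Gaussian naive Bayes model over hypothetical feature vectors might look like:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Hypothetical training data: one row per detection event with the three
# features from the abstract (sonar cross-section, received thermal energy,
# ultrasonic range); labels are 1 = person, 0 = clutter/decoy.
rng = np.random.default_rng(0)
X_person  = rng.normal([1.2, 8.0, 3.0], [0.3, 1.5, 1.0], size=(200, 3))
X_clutter = rng.normal([0.4, 9.5, 5.0], [0.3, 2.0, 2.0], size=(200, 3))
X = np.vstack([X_person, X_clutter])
y = np.r_[np.ones(200), np.zeros(200)]

clf = GaussianNB().fit(X, y)                 # simple statistical classifier
print(clf.predict_proba([[1.0, 8.5, 2.5]]))  # columns: P(clutter), P(person)
```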

  2. Optimal combinations of acute phase proteins for detecting infectious disease in pigs.

    PubMed

    Heegaard, Peter M H; Stockmarr, Anders; Piñeiro, Matilde; Carpintero, Rakel; Lampreave, Fermin; Campbell, Fiona M; Eckersall, P David; Toussaint, Mathilda J M; Gruys, Erik; Sorensen, Nanna Skall

    2011-03-17

    The acute phase protein (APP) response is an early systemic sign of disease, detected as substantial changes in APP serum concentrations, and most disease states involving inflammatory reactions give rise to APP responses. To obtain a detailed picture of the general utility of porcine APPs to detect any disease with an inflammatory component, seven porcine APPs were analysed in serum sampled at regular intervals in six different experimental challenge groups of pigs, including three bacterial (Actinobacillus pleuropneumoniae, Streptococcus suis, Mycoplasma hyosynoviae), one parasitic (Toxoplasma gondii) and one viral (porcine respiratory and reproductive syndrome virus) infection and one aseptic inflammation. Immunochemical analyses of seven APPs, four positive (C-reactive protein (CRP), haptoglobin (Hp), pig major acute phase protein (pigMAP) and serum amyloid A (SAA)) and three negative (albumin, transthyretin, and apolipoprotein A1 (apoA1)), were performed on the more than 400 serum samples constituting the serum panel. This was followed by advanced statistical treatment of the data using a multi-step procedure which included defining cut-off values and calculating detection probabilities for single APPs and for APP combinations. Combinations of APPs allowed the detection of disease more sensitively than any individual APP, and the best three-protein combinations were CRP, apoA1, pigMAP and CRP, apoA1, Hp, respectively, closely followed by the two-protein combinations CRP, pigMAP and apoA1, pigMAP, respectively. For the practical use of such combinations, methodology is described for establishing individual APP threshold values beyond which, for any APP in the combination, ongoing infection/inflammation is indicated.
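
    The decision rule described in the last sentence (flag disease if any APP in the combination crosses its threshold, upward for positive APPs and downward for negative ones) can be sketched as follows. The cut-off values and units here are hypothetical, not the paper's calibrated thresholds.

```python
# Hypothetical cut-offs; CRP and pigMAP are positive APPs (rise with
# disease), apoA1 is a negative APP (falls with disease).
cutoffs   = {"CRP": 20.0, "apoA1": 1.1, "pigMAP": 1.0}
direction = {"CRP": +1,   "apoA1": -1,  "pigMAP": +1}

def infection_indicated(sample):
    """Flag ongoing infection/inflammation if ANY APP in the combination
    crosses its threshold in the disease direction."""
    for app, cut in cutoffs.items():
        if direction[app] * (sample[app] - cut) > 0:
            return True
    return False

print(infection_indicated({"CRP": 35.0, "apoA1": 1.4, "pigMAP": 0.6}))  # True
```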

  3. Atmospheric pollution measurement by optical cross correlation methods - A concept

    NASA Technical Reports Server (NTRS)

    Fisher, M. J.; Krause, F. R.

    1971-01-01

    Method combines standard spectroscopy with statistical cross correlation analysis of two narrow light beams for remote sensing to detect foreign matter of given particulate size and consistency. Method is applicable in studies of generation and motion of clouds, nuclear debris, ozone, and radiation belts.

  4. Combining diffusion-weighted MRI with Gd-EOB-DTPA-enhanced MRI improves the detection of colorectal liver metastases.

    PubMed

    Koh, D-M; Collins, D J; Wallace, T; Chau, I; Riddell, A M

    2012-07-01

    To compare the diagnostic accuracy of gadolinium-ethoxybenzyl-diethylenetriaminepentaacetic acid (Gd-EOB-DTPA)-enhanced MRI, diffusion-weighted MRI (DW-MRI) and a combination of both techniques for the detection of colorectal hepatic metastases. 72 patients with suspected colorectal liver metastases underwent Gd-EOB-DTPA MRI and DW-MRI. Images were retrospectively reviewed, together with unenhanced T1- and T2-weighted images, as a Gd-EOB-DTPA image set, a DW-MRI image set and a combined image set by two independent radiologists. Each lesion detected was scored for size, location and likelihood of metastasis, and compared with surgery and follow-up imaging. Diagnostic accuracy was compared using receiver operating characteristics and interobserver agreement by kappa statistics. 417 lesions (310 metastases, 107 benign) were found in 72 patients. For both readers, diagnostic accuracy using the combined image set was higher [area under the curve (Az)=0.96, 0.97] than with the Gd-EOB-DTPA image set (Az=0.86, 0.89) or the DW-MRI image set (Az=0.93, 0.92). Using the combined image set improved identification of liver metastases compared with the Gd-EOB-DTPA image set (p<0.001) or the DW-MRI image set (p<0.001). There was very good interobserver agreement for lesion classification (κ=0.81-0.88). Combining DW-MRI with Gd-EOB-DTPA-enhanced T1-weighted MRI significantly improved the detection of colorectal liver metastases.
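
    The two summary statistics used here are standard and easy to reproduce; a short sketch with hypothetical per-lesion reader data (not the study's data) showing the area under the ROC curve (Az) per image set and the interobserver kappa:

```python
from sklearn.metrics import cohen_kappa_score, roc_auc_score

# Hypothetical per-lesion data: truth (1 = metastasis) and one reader's
# suspicion scores (higher = more suspicious) for two image sets.
truth    = [1, 1, 0, 1, 0, 0, 1, 0]
combined = [5, 4, 2, 5, 1, 2, 4, 1]   # Gd-EOB-DTPA + DW-MRI
gd_only  = [4, 3, 3, 5, 1, 3, 3, 2]

print("Az combined:", roc_auc_score(truth, combined))
print("Az Gd-EOB  :", roc_auc_score(truth, gd_only))

# Interobserver agreement between two readers' binary lesion calls:
reader1 = [1, 1, 0, 1, 0, 0, 1, 0]
reader2 = [1, 1, 0, 1, 0, 1, 1, 0]
print("kappa:", cohen_kappa_score(reader1, reader2))
```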

  5. Infrared maritime target detection using the high order statistic filtering in fractional Fourier domain

    NASA Astrophysics Data System (ADS)

    Zhou, Anran; Xie, Weixin; Pei, Jihong

    2018-06-01

    Accurate detection of maritime targets in infrared imagery under various sea clutter conditions is always a challenging task. The fractional Fourier transform (FRFT) is the extension of the Fourier transform to fractional orders and carries richer spatial-frequency information. By combining it with high order statistic filtering, a new ship detection method is proposed. First, the proper range of the angle parameter is determined to make it easier for the ship components and background to be separated. Second, a new high order statistic curve (HOSC) at each fractional frequency point is designed. It is proved that the maximal peak interval in the HOSC reflects the target information, while the points outside the interval reflect the background, and that the HOSC value for the ship is much larger than that for the sea clutter. The curve's maximal target peak interval is then located and extracted by bandpass filtering in the fractional Fourier domain. The value outside the peak interval of the HOSC decreases rapidly to 0, so the background is effectively suppressed. Finally, the detection result is obtained by double-threshold segmentation and a target region selection method. The results show the proposed method is excellent for maritime target detection under heavy clutter.

  6. ToNER: A tool for identifying nucleotide enrichment signals in feature-enriched RNA-seq data.

    PubMed

    Promworn, Yuttachon; Kaewprommal, Pavita; Shaw, Philip J; Intarapanich, Apichart; Tongsima, Sissades; Piriyapongsa, Jittima

    2017-01-01

    Biochemical methods are available for enriching 5' ends of RNAs in prokaryotes, which are employed in the differential RNA-seq (dRNA-seq) and the more recent Cappable-seq protocols. Computational methods are needed to locate RNA 5' ends from these data by statistical analysis of the enrichment. Although statistical-based analysis methods have been developed for dRNA-seq, they may not be suitable for Cappable-seq data. The more efficient enrichment method employed in Cappable-seq compared with dRNA-seq could affect data distribution and thus algorithm performance. We present Transformation of Nucleotide Enrichment Ratios (ToNER), a tool for statistical modeling of enrichment from RNA-seq data obtained from enriched and unenriched libraries. The tool calculates nucleotide enrichment scores and determines the global transformation for fitting to the normal distribution using the Box-Cox procedure. From the transformed distribution, sites of significant enrichment are identified. To increase power of detection, meta-analysis across experimental replicates is offered. We tested the tool on Cappable-seq and dRNA-seq data for identifying Escherichia coli transcript 5' ends and compared the results with those from the TSSAR tool, which is designed for analyzing dRNA-seq data. When combining results across Cappable-seq replicates, ToNER detects more known transcript 5' ends than TSSAR. In general, the transcript 5' ends detected by ToNER but not TSSAR occur in regions which cannot be locally modeled by TSSAR. ToNER uses a simple yet robust statistical modeling approach, which can be used for detecting RNA 5' ends from Cappable-seq data, in particular when combining information from experimental replicates. The ToNER tool could potentially be applied for analyzing other RNA-seq datasets in which enrichment for other structural features of RNA is employed. The program is freely available for download at the ToNER webpage (http://www4a.biotec.or.th/GI/tools/toner) and GitHub repository (https://github.com/PavitaKae/ToNER).
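
    A loose sketch of the core idea (per-nucleotide enrichment ratios, a global Box-Cox transform toward normality, then significance calls under the fitted normal) follows; this is an illustration of the approach, not the published ToNER algorithm, and the pseudocount handling is an assumption.

```python
import numpy as np
from scipy import stats

def enrichment_sites(enriched, control, alpha=0.05, pseudo=1.0):
    """Enrichment ratios -> Box-Cox transform -> z-scores under a fitted
    normal -> significantly enriched positions (one-sided)."""
    ratio = (enriched + pseudo) / (control + pseudo)
    transformed, lam = stats.boxcox(ratio)        # global Box-Cox fit
    z = (transformed - transformed.mean()) / transformed.std(ddof=1)
    pvals = stats.norm.sf(z)                      # enrichment only
    return np.where(pvals < alpha)[0], lam

enriched = np.array([3, 2, 250, 4, 1, 180, 2, 3], dtype=float)
control  = np.array([2, 3, 4, 5, 2, 3, 1, 2], dtype=float)
print(enrichment_sites(enriched, control))
```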

  7. Statistical assessment on a combined analysis of GRYN-ROMN-UCBN upland vegetation vital signs

    USGS Publications Warehouse

    Irvine, Kathryn M.; Rodhouse, Thomas J.

    2014-01-01

    As of 2013, the Rocky Mountain and Upper Columbia Basin Inventory and Monitoring Networks have multiple years of vegetation data, the Greater Yellowstone Network has three years of vegetation data, and monitoring is ongoing in all three networks. Our primary objective is to assess whether a combined analysis of these data aimed at exploring correlations with climate and weather data is feasible. We summarize the core survey design elements across protocols and point out the major statistical challenges for a combined analysis at present. The dissimilarity in response designs between the ROMN and UCBN-GRYN network protocols presents a statistical challenge that has not been resolved yet. However, the UCBN and GRYN data are compatible as they implement a similar response design; therefore, a combined analysis is feasible and will be pursued in the future. When data collected by different networks are combined, the survey design describing the merged dataset is (likely) a complex survey design. A complex survey design is the result of combining datasets from different sampling designs, and is characterized by unequal probability sampling, varying stratification, and clustering (see Lohr 2010, Chapter 7 for a general overview). Statistical analysis of complex survey data requires modifications to standard methods, one of which is to include survey design weights within a statistical model. We focus on this issue for a combined analysis of upland vegetation from these networks, leaving other topics for future research. We conduct a simulation study on the possible effects of equal versus unequal probability selection of points on parameter estimates of temporal trend using available packages within the R statistical computing environment. We find that, as written, using lmer or lm for trend detection in a continuous response and clm and clmm for visually estimated cover classes with “raw” GRTS design weights specified for the weight argument leads to substantially different results and/or computational instability. However, when only fixed effects are of interest, the survey package (svyglm and svyolr) may be suitable for a model-assisted analysis of trend. We provide possible directions for future research into combined analysis for ordinal and continuous vital sign indicators.

  8. Fault Diagnosis for Rotating Machinery Using Vibration Measurement Deep Statistical Feature Learning.

    PubMed

    Li, Chuan; Sánchez, René-Vinicio; Zurita, Grover; Cerrada, Mariela; Cabrera, Diego

    2016-06-17

    Fault diagnosis is important for the maintenance of rotating machinery. The detection of faults and fault patterns is a challenging part of machinery fault diagnosis. To tackle this problem, a model for deep statistical feature learning from vibration measurements of rotating machinery is presented in this paper. Vibration sensor signals collected from rotating mechanical systems are represented in the time, frequency, and time-frequency domains, each of which is then used to produce a statistical feature set. For learning statistical features, real-valued Gaussian-Bernoulli restricted Boltzmann machines (GRBMs) are stacked to develop a Gaussian-Bernoulli deep Boltzmann machine (GDBM). The suggested approach is applied as a deep statistical feature learning tool for both gearbox and bearing systems. The fault classification performances in experiments using this approach are 95.17% for the gearbox, and 91.75% for the bearing system. The proposed approach is compared to such standard methods as a support vector machine, GRBM and a combination model. In experiments, the best fault classification rate was obtained using the proposed model. The results show that deep learning with statistical feature extraction has an essential improvement potential for diagnosing rotating machinery faults.
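
    For the first stage (turning a raw vibration record into a statistical feature set), a minimal time-domain sketch is shown below; the paper's exact feature list is not reproduced here, so the chosen statistics are illustrative.

```python
import numpy as np
from scipy import stats

def statistical_features(signal):
    """A typical time-domain statistical feature set for one vibration
    record (illustrative; the paper may use a different set)."""
    rms = np.sqrt(np.mean(signal ** 2))
    return {
        "mean": np.mean(signal),
        "std": np.std(signal, ddof=1),
        "rms": rms,
        "skewness": stats.skew(signal),
        "kurtosis": stats.kurtosis(signal),
        "crest_factor": np.max(np.abs(signal)) / rms,
        "peak_to_peak": np.ptp(signal),
    }

rng = np.random.default_rng(1)
vibration = rng.normal(size=4096) + 0.5 * np.sin(np.linspace(0, 60 * np.pi, 4096))
print(statistical_features(vibration))
```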

  9. Fault Diagnosis for Rotating Machinery Using Vibration Measurement Deep Statistical Feature Learning

    PubMed Central

    Li, Chuan; Sánchez, René-Vinicio; Zurita, Grover; Cerrada, Mariela; Cabrera, Diego

    2016-01-01

    Fault diagnosis is important for the maintenance of rotating machinery. The detection of faults and fault patterns is a challenging part of machinery fault diagnosis. To tackle this problem, a model for deep statistical feature learning from vibration measurements of rotating machinery is presented in this paper. Vibration sensor signals collected from rotating mechanical systems are represented in the time, frequency, and time-frequency domains, each of which is then used to produce a statistical feature set. For learning statistical features, real-valued Gaussian-Bernoulli restricted Boltzmann machines (GRBMs) are stacked to develop a Gaussian-Bernoulli deep Boltzmann machine (GDBM). The suggested approach is applied as a deep statistical feature learning tool for both gearbox and bearing systems. The fault classification performances in experiments using this approach are 95.17% for the gearbox, and 91.75% for the bearing system. The proposed approach is compared to such standard methods as a support vector machine, GRBM and a combination model. In experiments, the best fault classification rate was obtained using the proposed model. The results show that deep learning with statistical feature extraction has an essential improvement potential for diagnosing rotating machinery faults. PMID:27322273

  10. [Clinical validation of multiple biomarkers suspension array technology for ovarian cancer].

    PubMed

    Zhao, B B; Yang, Z J; Wang, Q; Pan, Z M; Zhang, W; Li, L

    2017-01-25

    Objective: To investigate the diagnostic value of combined detection of serum CCL18 and CXCL1 antigens and C1D, TM4SF1, FXR1 and TIZ IgG autoantibodies by suspension array for ovarian cancer. Methods: Suspension array was used to detect CCL18 and CXCL1 antigens and C1D, TM4SF1, FXR1 and TIZ IgG autoantibodies in 120 healthy women, 204 patients with benign pelvic tumors, 119 patients with pelvic malignant tumors, and 40 cases with breast cancer, lung cancer or liver cancer, respectively. A diagnostic model combining the six biomarkers was constructed for the diagnosis of ovarian malignant tumors, and a diagnostic model combining the autoantibodies was constructed for the diagnosis of epithelial ovarian cancer. The value of detecting the six biomarkers for diagnosing ovarian malignant tumors and of detecting the autoantibodies for diagnosing epithelial ovarian cancer was analysed, as was the diagnostic value of the six biomarkers for stage Ⅰ and Ⅱ epithelial ovarian cancer. The diagnostic value of the six biomarkers across tissue types and pathologic grading was compared with that of CA125. Results: The model combining the six biomarkers for diagnosing ovarian malignant tumors was logit(P)=-11.151+0.008×C1D+0.011×TM4SF1+0.011×TIZ-0.008×FXR1+0.021×CCL18+0.200×CXCL1. The model combining the autoantibodies for diagnosing epithelial ovarian cancer was logit(P)=-5.137+0.013×C1D+0.014×TM4SF1+0.060×TIZ-0.060×FXR1. Sensitivity and specificity of the six biomarkers for diagnosing ovarian malignant tumors were 90.6% and 98.7%; sensitivity and specificity of the autoantibodies for diagnosing epithelial ovarian cancer were 75.8% and 96.7%. Combined detection of the six biomarkers was statistically no better than CA125 for diagnosing serous and mucinous ovarian cancer (P=0.196 and P=0.602, respectively); there was a significant difference in the diagnosis of ovarian cancer (P=0.023), and no significant difference across pathological grading (P=0.089 and P=0.169, respectively). Conclusions: Diagnostic models were constructed combining the six biomarkers to diagnose ovarian malignant tumors and combining the autoantibodies to diagnose epithelial ovarian cancer. Combined detection of the six biomarkers for the diagnosis of serous and mucinous ovarian tumors is better than CA125 alone.
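
    The six-biomarker model is fully specified by the coefficients reported above, so it can be evaluated directly; the marker readings in the example call below are hypothetical.

```python
import math

def ovarian_tumor_probability(C1D, TM4SF1, TIZ, FXR1, CCL18, CXCL1):
    """Evaluate the reported six-biomarker logistic model:
    logit(P) = -11.151 + 0.008*C1D + 0.011*TM4SF1 + 0.011*TIZ
               - 0.008*FXR1 + 0.021*CCL18 + 0.200*CXCL1"""
    logit_p = (-11.151 + 0.008 * C1D + 0.011 * TM4SF1 + 0.011 * TIZ
               - 0.008 * FXR1 + 0.021 * CCL18 + 0.200 * CXCL1)
    return 1.0 / (1.0 + math.exp(-logit_p))  # inverse logit -> P(malignant)

# Hypothetical marker readings (units as assayed by the suspension array):
print(ovarian_tumor_probability(C1D=300, TM4SF1=150, TIZ=120,
                                FXR1=100, CCL18=90, CXCL1=20))
```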

  11. A New Challenge for Compression Algorithms: Genetic Sequences.

    ERIC Educational Resources Information Center

    Grumbach, Stephane; Tahi, Fariza

    1994-01-01

    Analyzes the properties of genetic sequences that cause the failure of classical algorithms used for data compression. A lossless algorithm, which compresses the information contained in DNA and RNA sequences by detecting regularities such as palindromes, is presented. This algorithm combines substitutional and statistical methods and appears to…

  12. Bootstrap inversion technique for atmospheric trace gas source detection and quantification using long open-path laser measurements

    NASA Astrophysics Data System (ADS)

    Alden, Caroline B.; Ghosh, Subhomoy; Coburn, Sean; Sweeney, Colm; Karion, Anna; Wright, Robert; Coddington, Ian; Rieker, Gregory B.; Prasad, Kuldeep

    2018-03-01

    Advances in natural gas extraction technology have led to increased activity in the production and transport sectors in the United States and, as a consequence, an increased need for reliable monitoring of methane leaks to the atmosphere. We present a statistical methodology in combination with an observing system for the detection and attribution of fugitive emissions of methane from distributed potential source location landscapes such as natural gas production sites. We measure long (> 500 m), integrated open-path concentrations of atmospheric methane using a dual frequency comb spectrometer and combine measurements with an atmospheric transport model to infer leak locations and strengths using a novel statistical method, the non-zero minimum bootstrap (NZMB). The new statistical method allows us to determine whether the empirical distribution of possible source strengths for a given location excludes zero. Using this information, we identify leaking source locations (i.e., natural gas wells) through rejection of the null hypothesis that the source is not leaking. The method is tested with a series of synthetic data inversions with varying measurement density and varying levels of model-data mismatch. It is also tested with field observations of (1) a non-leaking source location and (2) a source location where a controlled emission of 3.1 × 10⁻⁵ kg s⁻¹ of methane gas is released over a period of several hours. This series of synthetic data tests and outdoor field observations using a controlled methane release demonstrates the viability of the approach for the detection and sizing of very small leaks of methane across large distances (4+ km² in synthetic tests). The field tests demonstrate the ability to attribute small atmospheric enhancements of 17 ppb to the emitting source location against a background of combined atmospheric (e.g., background methane variability) and measurement uncertainty of 5 ppb (1σ), when measurements are averaged over 2 min. The results of the synthetic and field data testing show that the new observing system and statistical approach greatly decrease the incidence of false alarms (that is, wrongly identifying a well site to be leaking) compared with the same tests that do not use the NZMB approach, and therefore offer increased leak detection and sizing capabilities.
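
    A simplified reading of the NZMB test for one candidate location is sketched below: resample the inversion's source-strength estimates and flag the location only when the bootstrap interval excludes zero. This is an illustration of the idea, not the authors' implementation, and the estimates are hypothetical.

```python
import numpy as np

def nzmb_leak_test(strength_estimates, n_boot=5000, alpha=0.05, seed=0):
    """Bootstrap the mean source strength for one location; reject the
    null 'source is not leaking' if the interval excludes zero."""
    rng = np.random.default_rng(seed)
    n = len(strength_estimates)
    boot_means = np.array([
        rng.choice(strength_estimates, size=n, replace=True).mean()
        for _ in range(n_boot)
    ])
    lo, hi = np.percentile(boot_means, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return lo > 0.0, (lo, hi)   # True -> flag the location as leaking

estimates = np.array([2.1e-5, 3.4e-5, 2.8e-5, 4.0e-5, 3.1e-5])  # kg/s
print(nzmb_leak_test(estimates))
```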

  13. Surveying Europe's Only Cave-Dwelling Chordate Species (Proteus anguinus) Using Environmental DNA.

    PubMed

    Vörös, Judit; Márton, Orsolya; Schmidt, Benedikt R; Gál, Júlia Tünde; Jelić, Dušan

    2017-01-01

    In surveillance of subterranean fauna, especially in the case of rare or elusive aquatic species, traditional techniques used for epigean species are often not feasible. We developed a non-invasive survey method based on environmental DNA (eDNA) to detect the presence of the red-listed cave-dwelling amphibian, Proteus anguinus, in the caves of the Dinaric Karst. We tested the method in fifteen caves in Croatia, from which the species was previously recorded or expected to occur. We successfully confirmed the presence of P. anguinus from ten caves and detected the species for the first time in five others. Using a hierarchical occupancy model we compared the availability and detection probability of eDNA of two water sampling methods, filtration and precipitation. The statistical analysis showed that both availability and detection probability depended on the method and estimates for both probabilities were higher using filter samples than for precipitation samples. Combining reliable field and laboratory methods with robust statistical modeling will give the best estimates of species occurrence.

  14. Comparative analysis on the selection of number of clusters in community detection

    NASA Astrophysics Data System (ADS)

    Kawamoto, Tatsuro; Kabashima, Yoshiyuki

    2018-02-01

    We conduct a comparative analysis of various estimates of the number of clusters in community detection. An exhaustive comparison requires testing of all possible combinations of frameworks, algorithms, and assessment criteria. In this paper we focus on the framework based on a stochastic block model, and investigate the performance of greedy algorithms, statistical inference, and spectral methods. For the assessment criteria, we consider modularity, map equation, Bethe free energy, prediction errors, and isolated eigenvalues. From the analysis, the tendencies of the assessment criteria and algorithms to overfit or underfit become apparent. In addition, we propose that the alluvial diagram is a suitable tool to visualize statistical inference results and can be useful for determining the number of clusters.

  15. Allelic-based gene-gene interaction associated with quantitative traits.

    PubMed

    Jung, Jeesun; Sun, Bin; Kwon, Deukwoo; Koller, Daniel L; Foroud, Tatiana M

    2009-05-01

    Recent studies have shown that quantitative phenotypes may be influenced not only by multiple single nucleotide polymorphisms (SNPs) within a gene but also by the interaction between SNPs at unlinked genes. We propose a new statistical approach that can detect gene-gene interactions at the allelic level which contribute to the phenotypic variation in a quantitative trait. By testing for the association of allelic combinations at multiple unlinked loci with a quantitative trait, we can detect the SNP allelic interaction whether or not it can be detected as a main effect. Our proposed method assigns a score to unrelated subjects according to their allelic combination inferred from observed genotypes at two or more unlinked SNPs, and then tests for the association of the allelic score with a quantitative trait. To investigate the statistical properties of the proposed method, we performed a simulation study to estimate type I error rates and power and demonstrated that this allelic approach achieves greater power than the more commonly used genotypic approach to test for gene-gene interaction. As an example, the proposed method was applied to data obtained as part of a candidate gene study of sodium retention by the kidney. We found that this method detects an interaction between the calcium-sensing receptor gene (CaSR), the chloride channel gene (CLCNKB) and the Na, K, 2Cl cotransporter gene (SLC12A1) that contributes to variation in diastolic blood pressure.
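
    The scoring-then-testing idea can be sketched in a few lines. The product of minor-allele counts below is a simplified stand-in for the paper's inferred allelic-combination score, and the data are simulated.

```python
import numpy as np
from scipy import stats

def allelic_score_test(geno_a, geno_b, trait):
    """Score each subject by an allelic combination at two unlinked SNPs
    (here: product of minor-allele counts) and test association with a
    quantitative trait by linear regression."""
    score = geno_a * geno_b   # nonzero only when both SNPs carry minor alleles
    res = stats.linregress(score, trait)
    return res.slope, res.pvalue

rng = np.random.default_rng(2)
g1 = rng.binomial(2, 0.30, size=500)        # minor-allele counts at SNP 1
g2 = rng.binomial(2, 0.25, size=500)        # minor-allele counts at SNP 2
y = 0.4 * g1 * g2 + rng.normal(size=500)    # interaction effect on the trait
print(allelic_score_test(g1, g2, y))
```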

  16. The score statistic of the LD-lod analysis: detecting linkage adaptive to linkage disequilibrium.

    PubMed

    Huang, J; Jiang, Y

    2001-01-01

    We study the properties of a modified lod score method for testing linkage that incorporates linkage disequilibrium (LD-lod). By examination of its score statistic, we show that the LD-lod score method adaptively combines two sources of information: (a) the IBD sharing score which is informative for linkage regardless of the existence of LD and (b) the contrast between allele-specific IBD sharing scores which is informative for linkage only in the presence of LD. We also consider the connection between the LD-lod score method and the transmission-disequilibrium test (TDT) for triad data and the mean test for affected sib pair (ASP) data. We show that, for triad data, the recessive LD-lod test is asymptotically equivalent to the TDT; and for ASP data, it is an adaptive combination of the TDT and the ASP mean test. We demonstrate that the LD-lod score method has relatively good statistical efficiency in comparison with the ASP mean test and the TDT for a broad range of LD and the genetic models considered in this report. Therefore, the LD-lod score method is an interesting approach for detecting linkage when the extent of LD is unknown, such as in a genome-wide screen with a dense set of genetic markers. Copyright 2001 S. Karger AG, Basel

  17. Statistical significance of combinatorial regulations

    PubMed Central

    Terada, Aika; Okada-Hatakeyama, Mariko; Tsuda, Koji; Sese, Jun

    2013-01-01

    More than three transcription factors often work together to enable cells to respond to various signals. The detection of combinatorial regulation by multiple transcription factors, however, is not only computationally nontrivial but also extremely unlikely because of multiple testing correction. The exponential growth in the number of tests forces us to set a strict limit on the maximum arity. Here, we propose an efficient branch-and-bound algorithm called the “limitless arity multiple-testing procedure” (LAMP) to count the exact number of testable combinations and calibrate the Bonferroni factor to the smallest possible value. LAMP lists significant combinations without any limit, whereas the family-wise error rate is rigorously controlled under the threshold. In the human breast cancer transcriptome, LAMP discovered statistically significant combinations of as many as eight binding motifs. This method may contribute to uncover pathways regulated in a coordinated fashion and find hidden associations in heterogeneous data. PMID:23882073
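
    The calibration idea (only combinations whose minimum attainable p-value can clear the corrected level are "testable", so the Bonferroni factor shrinks) can be sketched with a Tarone-style search. This is a rough illustration with hypothetical counts; LAMP's branch-and-bound enumeration of combinations is not reproduced here.

```python
from scipy.stats import hypergeom

def tarone_threshold(supports, n_pos, n_total, alpha=0.05):
    """supports: occurrence count n_x of each candidate combination.
    Returns the calibrated significance level alpha/k and factor k."""
    def p_min(n_x):
        a = min(n_x, n_pos)                  # most extreme 2x2 table
        return hypergeom.pmf(a, n_total, n_pos, n_x)

    pmins = [p_min(n_x) for n_x in supports]
    k = 1
    # raise k until the number of testable combinations m(k) is <= k
    while sum(p <= alpha / k for p in pmins) > k:
        k += 1
    return alpha / k, k

# 1000 candidate motif combinations with varying support in 200 samples,
# 60 of which are 'positive' (hypothetical numbers):
supports = [5 + (i % 40) for i in range(1000)]
print(tarone_threshold(supports, n_pos=60, n_total=200))
```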

  18. Comparing distinct ground-based lightning location networks covering the Netherlands

    NASA Astrophysics Data System (ADS)

    de Vos, Lotte; Leijnse, Hidde; Schmeits, Maurice; Beekhuis, Hans; Poelman, Dieter; Evers, Läslo; Smets, Pieter

    2015-04-01

    Lightning can be detected using a ground-based sensor network. The Royal Netherlands Meteorological Institute (KNMI) monitors lightning activity in the Netherlands with the so-called FLITS system, a network combining SAFIR-type sensors that makes use of Very High Frequency (VHF) as well as Low Frequency (LF) sensors. KNMI has recently decided to replace FLITS by data from a sub-continental network operated by Météorage which makes use of LF sensors only (KNMI Lightning Detection Network, or KLDN). KLDN is compared to the FLITS system, as well as the Met Office's long-range Arrival Time Difference network (ATDnet), which measures in the Very Low Frequency (VLF) range. Special focus lies on the ability to detect Cloud to Ground (CG) and Cloud to Cloud (CC) lightning in the Netherlands. The relative detection efficiency for individual flashes, and for lightning activity in a more general sense, is calculated over a period of almost 5 years. Additionally, the detection efficiency of each system is compared to a ground truth constructed from flashes that are detected by both of the other datasets. Finally, infrasound data is used as a fourth lightning data source for several case studies. Relative performance is found to vary strongly with location and time. As expected, it is found that FLITS detects significantly more CC lightning (because of the strong aptitude of VHF antennas for detecting CC), though KLDN and ATDnet detect more CG lightning. We analyze statistics computed over the entire 5-year period, where we look at CG as well as total lightning (CC and CG combined). Statistics that are considered are the Probability of Detection (POD) and the so-called Lightning Activity Detection (LAD). POD is defined as the percentage of reference flashes that the system detects, relative to the total number of detections in the reference. LAD is defined as the fraction of predefined area boxes and time periods in which the system records one or more flashes, given that the reference detects at least one flash there, relative to the total number of such recordings in the reference dataset. The reference for these statistics is taken to be either another dataset, or a dataset consisting of flashes detected by two datasets. Evaluation of an extreme thunderstorm case shows that the weather alert criterion for severe thunderstorms is reached by FLITS when this is not the case for KLDN and ATDnet, suggesting the need for KNMI to modify that weather alert criterion when using KLDN.
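
    A minimal sketch of the POD computation follows; the space-time matching tolerances are illustrative assumptions, not the study's actual matching windows.

```python
import math

def haversine_km(a, b):
    """Great-circle distance between two (lat, lon) points in km."""
    lat1, lon1, lat2, lon2 = map(math.radians,
                                 (a["lat"], a["lon"], b["lat"], b["lon"]))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h))

def probability_of_detection(system, reference, tol_km=10.0, tol_s=1.0):
    """POD: fraction of reference flashes that the system matches within
    a space-time tolerance."""
    hits = sum(
        any(abs(r["t"] - s["t"]) <= tol_s and haversine_km(r, s) <= tol_km
            for s in system)
        for r in reference)
    return hits / len(reference)

ref = [{"lat": 52.1, "lon": 5.2, "t": 0.0}, {"lat": 52.3, "lon": 4.9, "t": 5.0}]
sysd = [{"lat": 52.1, "lon": 5.21, "t": 0.2}]
print(probability_of_detection(sysd, ref))   # 0.5
```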

  19. Concept and Analysis of a Satellite for Space-Based Radio Detection of Ultra-High Energy Cosmic Rays

    NASA Astrophysics Data System (ADS)

    Romero-Wolf, Andrew; Gorham, P.; Booth, J.; Chen, P.; Duren, R. M.; Liewer, K.; Nam, J.; Saltzberg, D.; Schoorlemmer, H.; Wissel, S.; Zairfian, P.

    2014-01-01

    We present a concept for on-orbit radio detection of ultra-high energy cosmic rays (UHECRs) that has the potential to provide collection rates of ~100 events per year for energies above 10^20 eV. The synoptic wideband orbiting radio detector (SWORD) mission's high event statistics at these energies, combined with the pointing capabilities of a space-borne antenna array, could enable charged particle astronomy. The detector concept is based on ANITA's successful detection of UHECRs, where the geosynchrotron radio signal produced by the extended air shower is reflected off the Earth's surface and detected in flight.

  20. Epithelial cancer detection by oblique-incidence optical spectroscopy

    NASA Astrophysics Data System (ADS)

    Garcia-Uribe, Alejandro; Balareddy, Karthik C.; Zou, Jun; Wang, Kenneth K.; Duvic, Madeleine; Wang, Lihong V.

    2009-02-01

    This paper presents a study on non-invasive detection of two common epithelial cancers (skin and esophagus) based on oblique incidence diffuse reflectance spectroscopy (OIDRS). An OIDRS measurement system, which combines fiber optics and MEMS technologies, was developed. In our pilot studies, a total of 137 cases were measured in vivo for skin cancer detection and a total of 20 biopsy samples were measured ex vivo for esophageal cancer detection. To automatically differentiate the cancerous cases from benign ones, a statistical software classification program was also developed. Overall classification accuracies of 90% and 100% were achieved for skin and esophageal cancer classification, respectively.

  1. Optimal Weights Mixed Filter for removing mixture of Gaussian and impulse noises

    PubMed Central

    Grama, Ion; Liu, Quansheng

    2017-01-01

    In this paper we consider the problem of restoration of an image contaminated by a mixture of Gaussian and impulse noises. We propose a new statistic called ROADGI, which improves the well-known Rank-Ordered Absolute Differences (ROAD) statistic for detecting points contaminated with impulse noise in this context. Combining the ROADGI statistic with the method of weights optimization, we obtain a new algorithm called Optimal Weights Mixed Filter (OWMF) to deal with the mixed noise. Our simulation results show that the proposed filter is effective for mixed noises, as well as for single impulse noise and for single Gaussian noise. PMID:28692667

  2. Optimal Weights Mixed Filter for removing mixture of Gaussian and impulse noises.

    PubMed

    Jin, Qiyu; Grama, Ion; Liu, Quansheng

    2017-01-01

    In this paper we consider the problem of restoration of an image contaminated by a mixture of Gaussian and impulse noises. We propose a new statistic called ROADGI, which improves the well-known Rank-Ordered Absolute Differences (ROAD) statistic for detecting points contaminated with impulse noise in this context. Combining the ROADGI statistic with the method of weights optimization, we obtain a new algorithm called Optimal Weights Mixed Filter (OWMF) to deal with the mixed noise. Our simulation results show that the proposed filter is effective for mixed noises, as well as for single impulse noise and for single Gaussian noise.
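
    For reference, the underlying ROAD statistic is straightforward to compute: at each pixel, sum the m smallest absolute differences to its eight neighbours (m = 4 below); impulse pixels stand out with large values. The ROADGI refinement for mixed noise described in the abstract is not reproduced in this sketch.

```python
import numpy as np

def road_statistic(image, m=4):
    """Rank-Ordered Absolute Differences at each interior pixel."""
    H, W = image.shape
    road = np.zeros((H, W))
    offsets = [(di, dj) for di in (-1, 0, 1) for dj in (-1, 0, 1)
               if (di, dj) != (0, 0)]
    for i in range(1, H - 1):
        for j in range(1, W - 1):
            diffs = sorted(abs(float(image[i + di, j + dj]) - float(image[i, j]))
                           for di, dj in offsets)
            road[i, j] = sum(diffs[:m])   # sum of the m smallest differences
    return road

img = np.full((5, 5), 100.0)
img[2, 2] = 255.0                   # an impulse
print(road_statistic(img)[2, 2])    # large value -> flagged as impulse
```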

  3. Vision-based method for detecting driver drowsiness and distraction in driver monitoring system

    NASA Astrophysics Data System (ADS)

    Jo, Jaeik; Lee, Sung Joo; Jung, Ho Gi; Park, Kang Ryoung; Kim, Jaihie

    2011-12-01

    Most driver-monitoring systems have attempted to detect either driver drowsiness or distraction, although both factors should be considered for accident prevention. Therefore, we propose a new driver-monitoring method considering both factors. We make the following contributions. First, if the driver is looking ahead, drowsiness detection is performed; otherwise, distraction detection is performed. Thus, the computational cost and eye-detection error can be reduced. Second, we propose a new eye-detection algorithm that combines adaptive boosting, adaptive template matching, and blob detection with eye validation, thereby reducing the eye-detection error and processing time significantly, which is hardly achievable using a single method. Third, to enhance eye-detection accuracy, eye validation is applied after initial eye detection, using a support vector machine based on appearance features obtained by principal component analysis (PCA) and linear discriminant analysis (LDA). Fourth, we propose a novel eye-state detection algorithm that combines appearance features obtained using PCA and LDA with statistical features such as the sparseness and kurtosis of the histogram from the horizontal edge image of the eye. Experimental results showed that the detection accuracies of the eye region and eye states were 99% and 97%, respectively. Both driver drowsiness and distraction were detected with a success rate of 98%.
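
    The two statistical features named in the fourth contribution can be computed as below. Hoyer's measure is assumed here as the concrete definition of "sparseness" (the paper's exact definition may differ), and the histograms are hypothetical.

```python
import numpy as np
from scipy import stats

def eye_state_features(edge_hist):
    """Sparseness (Hoyer's measure) and kurtosis of the histogram of a
    horizontal-edge eye image, as features for open/closed classification."""
    x = np.asarray(edge_hist, dtype=float)
    n = x.size
    l1, l2 = np.abs(x).sum(), np.sqrt((x ** 2).sum())
    sparseness = (np.sqrt(n) - l1 / l2) / (np.sqrt(n) - 1)
    return sparseness, stats.kurtosis(x)

open_eye   = [0, 1, 2, 30, 40, 3, 1, 0]      # concentrated edges -> sparse
closed_eye = [8, 9, 10, 11, 10, 9, 10, 9]    # flat histogram -> not sparse
print(eye_state_features(open_eye))
print(eye_state_features(closed_eye))
```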

  4. A prospective evaluation of early detection biomarkers for ovarian cancer in the European EPIC cohort

    PubMed Central

    Terry, Kathryn L.; Schock, Helena; Fortner, Renée T.; Hüsing, Anika; Fichorova, Raina N.; Yamamoto, Hidemi S.; Vitonis, Allison F.; Johnson, Theron; Overvad, Kim; Tjønneland, Anne; Boutron-Ruault, Marie-Christine; Mesrine, Sylvie; Severi, Gianluca; Dossus, Laure; Rinaldi, Sabina; Boeing, Heiner; Benetou, Vassiliki; Lagiou, Pagona; Trichopoulou, Antonia; Krogh, Vittorio; Kuhn, Elisabetta; Panico, Salvatore; Bueno-de-Mesquita, H. Bas; Onland-Moret, N. Charlotte; Peeters, Petra H.; Gram, Inger Torhild; Weiderpass, Elisabete; Duell, Eric J.; Sanchez, Maria-Jose; Ardanaz, Eva; Etxezarreta, Nerea; Navarro, Carmen; Idahl, Annika; Lundin, Eva; Jirström, Karin; Manjer, Jonas; Wareham, Nicholas J.; Khaw, Kay-Tee; Byrne, Karl Smith; Travis, Ruth C.; Gunter, Marc J.; Merritt, Melissa A.; Riboli, Elio; Cramer, Daniel W.; Kaaks, Rudolf

    2016-01-01

    Purpose About 60% of ovarian cancers are diagnosed at late stage, when 5-year survival is less than 30% in contrast to 90% for local disease. This has prompted search for early detection biomarkers. For initial testing, specimens taken months or years before ovarian cancer diagnosis are the best source of information to evaluate early detection biomarkers. Here we evaluate the most promising ovarian cancer screening biomarkers in prospectively collected samples from the European Prospective Investigation into Cancer and Nutrition study. Experimental Design We measured CA125, HE4, CA72.4 and CA15.3 in 810 invasive epithelial ovarian cancer cases and 1,939 controls. We calculated the sensitivity at 95% and 98% specificity as well as Area under the Receiver Operator Curve (C-statistic) for each marker individually and in combination. Additionally, we evaluated marker performance by stage at diagnosis and time between blood draw and diagnosis. Results We observed the best discrimination between cases and controls within six months of diagnosis for CA125 (C-statistic=0.92), then HE4 (0.84), CA72.4 (0.77), and CA15.3 (0.73). Marker performance declined with longer time between blood draw and diagnosis and for earlier staged disease. However, assessment of discriminatory ability at early stage was limited by small numbers. Combinations of markers performed modestly, but significantly better than any single marker. Conclusions CA125 remains the single best marker for the early detection of invasive epithelial ovarian cancer, but can be slightly improved by combining with other markers. Identifying novel markers for ovarian cancer will require studies including larger numbers of early stage cases. PMID:27060155

  5. A Prospective Evaluation of Early Detection Biomarkers for Ovarian Cancer in the European EPIC Cohort.

    PubMed

    Terry, Kathryn L; Schock, Helena; Fortner, Renée T; Hüsing, Anika; Fichorova, Raina N; Yamamoto, Hidemi S; Vitonis, Allison F; Johnson, Theron; Overvad, Kim; Tjønneland, Anne; Boutron-Ruault, Marie-Christine; Mesrine, Sylvie; Severi, Gianluca; Dossus, Laure; Rinaldi, Sabina; Boeing, Heiner; Benetou, Vassiliki; Lagiou, Pagona; Trichopoulou, Antonia; Krogh, Vittorio; Kuhn, Elisabetta; Panico, Salvatore; Bueno-de-Mesquita, H Bas; Onland-Moret, N Charlotte; Peeters, Petra H; Gram, Inger Torhild; Weiderpass, Elisabete; Duell, Eric J; Sanchez, Maria-Jose; Ardanaz, Eva; Etxezarreta, Nerea; Navarro, Carmen; Idahl, Annika; Lundin, Eva; Jirström, Karin; Manjer, Jonas; Wareham, Nicholas J; Khaw, Kay-Tee; Byrne, Karl Smith; Travis, Ruth C; Gunter, Marc J; Merritt, Melissa A; Riboli, Elio; Cramer, Daniel W; Kaaks, Rudolf

    2016-09-15

    About 60% of ovarian cancers are diagnosed at late stage, when 5-year survival is less than 30% in contrast to 90% for local disease. This has prompted search for early detection biomarkers. For initial testing, specimens taken months or years before ovarian cancer diagnosis are the best source of information to evaluate early detection biomarkers. Here we evaluate the most promising ovarian cancer screening biomarkers in prospectively collected samples from the European Prospective Investigation into Cancer and Nutrition study. We measured CA125, HE4, CA72.4, and CA15.3 in 810 invasive epithelial ovarian cancer cases and 1,939 controls. We calculated the sensitivity at 95% and 98% specificity as well as area under the receiver operator curve (C-statistic) for each marker individually and in combination. In addition, we evaluated marker performance by stage at diagnosis and time between blood draw and diagnosis. We observed the best discrimination between cases and controls within 6 months of diagnosis for CA125 (C-statistic = 0.92), then HE4 (0.84), CA72.4 (0.77), and CA15.3 (0.73). Marker performance declined with longer time between blood draw and diagnosis and for earlier staged disease. However, assessment of discriminatory ability at early stage was limited by small numbers. Combinations of markers performed modestly, but significantly better than any single marker. CA125 remains the single best marker for the early detection of invasive epithelial ovarian cancer, but can be slightly improved by combining with other markers. Identifying novel markers for ovarian cancer will require studies including larger numbers of early-stage cases. Clin Cancer Res; 22(18); 4664-75. ©2016 American Association for Cancer Research. See related commentary by Skates, p. 4542.

  6. Contrast limited adaptive histogram equalization image processing to improve the detection of simulated spiculations in dense mammograms.

    PubMed

    Pisano, E D; Zong, S; Hemminger, B M; DeLuca, M; Johnston, R E; Muller, K; Braeuning, M P; Pizer, S M

    1998-11-01

    The purpose of this project was to determine whether Contrast Limited Adaptive Histogram Equalization (CLAHE) improves detection of simulated spiculations in dense mammograms. Lines simulating the appearance of spiculations, a common marker of malignancy when visualized with masses, were embedded in dense mammograms digitized at 50 micron pixels, 12 bits deep. Film images with no CLAHE applied were compared to film images with nine different combinations of clip levels and region sizes applied. A simulated spiculation was embedded in a background of dense breast tissue, with the orientation of the spiculation varied. The key variables involved in each trial included the orientation of the spiculation, contrast level of the spiculation and the CLAHE settings applied to the image. Combining the 10 CLAHE conditions, 4 contrast levels and 4 orientations gave 160 combinations. The trials were constructed by pairing 160 combinations of key variables with 40 backgrounds. Twenty student observers were asked to detect the orientation of the spiculation in the image. There was a statistically significant improvement in detection performance for spiculations with CLAHE over unenhanced images when the region size was set at 32 with a clip level of 2, and when the region size was set at 32 with a clip level of 4. The selected CLAHE settings should be tested in the clinic with digital mammograms to determine whether detection of spiculations associated with masses detected at mammography can be improved.
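
    The best-performing settings reported (region size 32 with clip levels 2 and 4) map approximately onto OpenCV's CLAHE parameters, where tileGridSize gives the number of contextual regions and clipLimit the clip level. The sketch below is that approximate correspondence on an 8-bit image, not the study's code; the filename is hypothetical, and the study's 12-bit digitized films would need 16-bit handling.

```python
import cv2

img = cv2.imread("mammogram.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file
# ~32-pixel regions: number of tiles = image size / 32 (width, height)
tiles = (img.shape[1] // 32, img.shape[0] // 32)
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=tiles)
enhanced = clahe.apply(img)
cv2.imwrite("mammogram_clahe.png", enhanced)
```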

  7. The application of signal detection theory to optics

    NASA Technical Reports Server (NTRS)

    Helstrom, C. W.

    1971-01-01

    The restoration of images focused on a photosensitive surface is treated from the standpoint of maximum likelihood estimation, taking into account the Poisson distributions of the observed data, which are the numbers of photoelectrons from various elements of the surface. A detector of an image focused on such a surface utilizes a certain linear combination of those numbers as the optimum detection statistic. Methods for calculating the false alarm and detection probabilities are proposed. It is shown that measuring noncommuting observables in an ideal quantum receiver cannot yield a lower Bayes cost than that attainable by a system measuring only commuting observables.
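
    The "certain linear combination" for Poisson-distributed counts is the log-likelihood ratio, which reduces to a weighted sum of the photoelectron counts. A minimal sketch under assumed background and signal rate maps:

```python
import numpy as np

def poisson_detector(counts, lam0, lam1):
    """Log-likelihood ratio for Poisson counts n_i with rates lam0_i
    (background only) vs lam1_i (image present):
        sum_i [ n_i * log(lam1_i / lam0_i) - (lam1_i - lam0_i) ]
    i.e., a linear combination of the counts plus a constant."""
    weights = np.log(lam1 / lam0)
    constant = np.sum(lam1 - lam0)
    return np.dot(counts, weights) - constant

lam0 = np.full(100, 2.0)                      # background rates per element
lam1 = lam0 + np.linspace(0, 1.5, 100)        # rates with the image present
counts = np.random.default_rng(3).poisson(lam1)
print(poisson_detector(counts, lam0, lam1) > 0)   # threshold at 0 (illustrative)
```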

  8. Recovering incomplete data using Statistical Multiple Imputations (SMI): a case study in environmental chemistry.

    PubMed

    Mercer, Theresa G; Frostick, Lynne E; Walmsley, Anthony D

    2011-10-15

    This paper presents a statistical technique that can be applied to environmental chemistry data where missing values and limit of detection levels prevent the application of statistics. A working example is taken from an environmental leaching study that was set up to determine if there were significant differences in levels of leached arsenic (As), chromium (Cr) and copper (Cu) between lysimeters containing preservative treated wood waste and those containing untreated wood. Fourteen lysimeters were setup and left in natural conditions for 21 weeks. The resultant leachate was analysed by ICP-OES to determine the As, Cr and Cu concentrations. However, due to the variation inherent in each lysimeter combined with the limits of detection offered by ICP-OES, the collected quantitative data was somewhat incomplete. Initial data analysis was hampered by the number of 'missing values' in the data. To recover the dataset, the statistical tool of Statistical Multiple Imputation (SMI) was applied, and the data was re-analysed successfully. It was demonstrated that using SMI did not affect the variance in the data, but facilitated analysis of the complete dataset. Copyright © 2011 Elsevier B.V. All rights reserved.
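
    A rough illustration of the multiple-imputation workflow is given below, using scikit-learn's IterativeImputer as a stand-in for the SMI procedure in the paper (which it is not); the concentration values are hypothetical and np.nan marks missing or below-detection entries.

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

# Hypothetical leachate concentrations (mg/L) for As, Cr, Cu per lysimeter:
data = np.array([
    [0.12, 0.40, np.nan],
    [0.10, np.nan, 0.55],
    [np.nan, 0.35, 0.60],
    [0.15, 0.42, 0.58],
])

# m = 5 imputed datasets; analyses would be run on each and pooled.
imputations = [
    IterativeImputer(sample_posterior=True, random_state=s).fit_transform(data)
    for s in range(5)
]
print(np.mean(imputations, axis=0))   # pooled estimate across imputations
```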

  9. kruX: matrix-based non-parametric eQTL discovery.

    PubMed

    Qi, Jianlong; Asl, Hassan Foroughi; Björkegren, Johan; Michoel, Tom

    2014-01-14

    The Kruskal-Wallis test is a popular non-parametric statistical test for identifying expression quantitative trait loci (eQTLs) from genome-wide data due to its robustness against variations in the underlying genetic model and expression trait distribution, but testing billions of marker-trait combinations one-by-one can become computationally prohibitive. We developed kruX, an algorithm implemented in Matlab, Python and R that uses matrix multiplications to simultaneously calculate the Kruskal-Wallis test statistic for several millions of marker-trait combinations at once. kruX is more than ten thousand times faster than computing associations one-by-one on a typical human dataset. We used kruX and a dataset of more than 500k SNPs and 20k expression traits measured in 102 human blood samples to compare eQTLs detected by the Kruskal-Wallis test to eQTLs detected by the parametric ANOVA and linear model methods. We found that the Kruskal-Wallis test is more robust against data outliers and heterogeneous genotype group sizes and detects a higher proportion of non-linear associations, but is more conservative for calling additive linear associations. kruX enables the use of robust non-parametric methods for massive eQTL mapping without the need for a high-performance computing infrastructure and is freely available from http://krux.googlecode.com.

  10. Testing for Additivity in Chemical Mixtures Using a Fixed-Ratio Ray Design and Statistical Equivalence Testing Methods

    EPA Science Inventory

    Fixed-ratio ray designs have been used for detecting and characterizing interactions of large numbers of chemicals in combination. Single chemical dose-response data are used to predict an “additivity curve” along an environmentally relevant ray. A “mixture curve” is estimated fr...

  11. The image recognition based on neural network and Bayesian decision

    NASA Astrophysics Data System (ADS)

    Wang, Chugege

    2018-04-01

    The artificial neural network began in the 1940s and is an important part of artificial intelligence. At present, it is a hot topic in the fields of neuroscience, computer science, brain science, mathematics, and psychology. Thomas Bayes first reported the Bayesian theory in 1763. After its development in the twentieth century, it has become widespread in all areas of statistics. In recent years, owing to the solution of the problem of high-dimensional integral calculation, Bayesian statistics has been improved theoretically, has solved many problems that cannot be solved by classical statistics, and has been applied in interdisciplinary fields. In this paper, the related concepts and principles of the artificial neural network are introduced. The paper also summarizes the basic content and principles of Bayesian statistics, combines artificial neural network technology with Bayesian decision theory, and implements them in several aspects of image recognition, such as an enhanced face detection method based on a neural network and Bayesian decision, as well as image classification based on Bayesian decision. It can be seen that the combination of artificial intelligence and statistical algorithms has long been a hot research topic.

  12. Detection of foreign substances in food using thermography

    NASA Astrophysics Data System (ADS)

    Meinlschmidt, Peter; Maergner, Volker

    2002-03-01

    This paper gives a short introduction to the possibility of detecting foreign bodies in food using IR thermography. The first results, shown for combinations of cherries and chocolate and for berries contaminated with leaves, stalks, pedicels and thorns, could easily be evaluated manually. Here the differing emissivity coefficients or the different heat conductivities and/or capacities are used for differentiation. Applying pulse thermography, first heat conductivity measurements of different food materials were performed. Calculating the contrast of possible food/contaminant combinations shows the difficulty of differentiating certain materials. A possible automatic evaluation for raisins contaminated with wooden sticks and for almonds blended with stones could be demonstrated. The power of specially adapted algorithms using statistical or morphological analysis is shown in distinguishing the foreign bodies from the foodstuff.

  13. Combining multiple ChIP-seq peak detection systems using combinatorial fusion.

    PubMed

    Schweikert, Christina; Brown, Stuart; Tang, Zuojian; Smith, Phillip R; Hsu, D Frank

    2012-01-01

    Due to the recent rapid development of ChIP-seq technology, which uses high-throughput next-generation DNA sequencing to identify the targets of chromatin immunoprecipitation, an increasing amount of sequencing data is being generated, providing greater opportunity to analyze genome-wide protein-DNA interactions. In particular, we are interested in evaluating and enhancing computational and statistical techniques for locating protein binding sites. Many peak detection systems have been developed; in this study, we utilize the following six: CisGenome, MACS, PeakSeq, QuEST, SISSRs, and TRLocator. We define two methods to merge and rescore the regions of two peak detection systems and analyze the performance based on average precision and coverage of transcription start sites. The results indicate that ChIP-seq peak detection can be improved by fusion using score or rank combination. Our method of combination and fusion analysis provides a means for generic assessment of available technologies and systems and assists researchers in choosing an appropriate system (or fusion method) for analyzing ChIP-seq data. This analysis offers an alternate approach for increasing true positive rates while decreasing false positive rates, hence improving the ChIP-seq peak identification process.
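
    A minimal sketch of rank combination over peak regions shared by two systems follows; the per-region scores are hypothetical. Score combination would instead average min-max-normalized scores.

```python
import numpy as np

def rank_combination(score_lists):
    """Convert each system's region scores to ranks (1 = strongest), then
    average the ranks; a lower combined rank means a stronger peak."""
    ranks = []
    for scores in score_lists:
        order = np.argsort(np.argsort(-np.asarray(scores))) + 1
        ranks.append(order)
    return np.mean(ranks, axis=0)

system_a_scores = [90.1, 55.0, 71.3, 20.2]   # hypothetical per-region scores
system_b_scores = [300.0, 410.0, 120.0, 95.0]
print(rank_combination([system_a_scores, system_b_scores]))
```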

  14. CMDR based differential evolution identifies the epistatic interaction in genome-wide association studies.

    PubMed

    Yang, Cheng-Hong; Chuang, Li-Yeh; Lin, Yu-Da

    2017-08-01

    Detecting epistatic interactions in genome-wide association studies (GWAS) is a computational challenge. The huge number of single-nucleotide polymorphism (SNP) combinations prevents some powerful algorithms from being applied to detect potential epistasis in large-scale SNP datasets. We propose a new algorithm which combines the differential evolution (DE) algorithm with classification-based multifactor-dimensionality reduction (CMDR), termed DECMDR. DECMDR uses CMDR as a fitness measure to evaluate solutions in the DE process for scanning potential statistical epistasis in GWAS. The results indicated that DECMDR outperforms existing algorithms in terms of detection success rate on large-scale simulated data and on real data obtained from the Wellcome Trust Case Control Consortium. In terms of running time, DECMDR can efficiently apply CMDR to detect significant associations between cases and controls amongst all possible SNP combinations in GWAS. DECMDR is freely available at https://goo.gl/p9sLuJ . chuang@isu.edu.tw or e0955767257@yahoo.com.tw. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  15. Gas chimney detection based on improving the performance of combined multilayer perceptron and support vector classifier

    NASA Astrophysics Data System (ADS)

    Hashemi, H.; Tax, D. M. J.; Duin, R. P. W.; Javaherian, A.; de Groot, P.

    2008-11-01

    Seismic object detection is a relatively new field in which 3-D bodies are visualized and spatial relationships between objects of different origins are studied in order to extract geologic information. In this paper, we propose a method for finding an optimal classifier with the help of a statistical feature ranking technique and combining different classifiers. The method, which has general applicability, is demonstrated here on a gas chimney detection problem. First, we evaluate a set of input seismic attributes extracted at locations labeled by a human expert using regularized discriminant analysis (RDA). In order to find the RDA score for each seismic attribute, forward and backward search strategies are used. Subsequently, two non-linear classifiers, a multilayer perceptron (MLP) and a support vector classifier (SVC), are run on the ranked seismic attributes. Finally, to capitalize on the intrinsic differences between both classifiers, the MLP and SVC results are combined using logical rules of maximum, minimum and mean. The proposed method optimizes the ranked feature space size and yields the lowest classification error in the final combined result. We will show that the logical minimum reveals gas chimneys that exhibit both the softness of the MLP and the resolution of the SVC classifier.
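
    The three logical combination rules can be illustrated on the two classifiers' posterior outputs. The sketch below uses synthetic stand-in data rather than seismic attributes, and the classifier hyperparameters are illustrative.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, n_features=10, random_state=0)
X_tr, y_tr, X_te = X[:300], y[:300], X[300:]

mlp = MLPClassifier(max_iter=1000, random_state=0).fit(X_tr, y_tr)
svc = SVC(probability=True, random_state=0).fit(X_tr, y_tr)

p_mlp = mlp.predict_proba(X_te)[:, 1]   # P(chimney) from each classifier
p_svc = svc.predict_proba(X_te)[:, 1]

p_min  = np.minimum(p_mlp, p_svc)       # logical minimum
p_max  = np.maximum(p_mlp, p_svc)       # logical maximum
p_mean = (p_mlp + p_svc) / 2            # logical mean
labels_min = (p_min > 0.5).astype(int)  # decisions from the minimum rule
print(labels_min[:10])
```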

  16. A Novel Fiber Optic Based Surveillance System for Prevention of Pipeline Integrity Threats.

    PubMed

    Tejedor, Javier; Macias-Guarasa, Javier; Martins, Hugo F; Piote, Daniel; Pastor-Graells, Juan; Martin-Lopez, Sonia; Corredera, Pedro; Gonzalez-Herraez, Miguel

    2017-02-12

    This paper presents a novel surveillance system aimed at the detection and classification of threats in the vicinity of a long gas pipeline. The sensing system is based on phase-sensitive optical time domain reflectometry (ϕ-OTDR) technology for signal acquisition and pattern recognition strategies for threat identification. The proposal incorporates contextual information at the feature level and applies a system combination strategy for pattern classification. The contextual information at the feature level is based on the tandem approach (using feature representations produced by discriminatively-trained multi-layer perceptrons) by employing feature vectors that spread different temporal contexts. The system combination strategy is based on a posterior combination of likelihoods computed from different pattern classification processes. The system operates in two different modes: (1) machine + activity identification, which recognizes the activity being carried out by a certain machine, and (2) threat detection, aimed at detecting threats no matter what the real activity being conducted is. In comparison with a previous system based on the same rigorous experimental setup, the results show that the system combination from the contextual feature information improves the results for each individual class in both operational modes, as well as the overall classification accuracy, with statistically-significant improvements.

  17. Detecting Visually Observable Disease Symptoms from Faces.

    PubMed

    Wang, Kuan; Luo, Jiebo

    2016-12-01

    Recent years have witnessed an increasing interest in the application of machine learning to clinical informatics and healthcare systems. A significant amount of research has been done on healthcare systems based on supervised learning. In this study, we present a generalized solution to detect visually observable symptoms on faces using semi-supervised anomaly detection combined with machine vision algorithms. We rely on disease-related statistical facts to detect abnormalities and classify them into multiple categories to narrow down the possible medical causes. Our method contrasts with most existing approaches, which are limited by the availability of labeled training data required for supervised learning, and therefore offers the major advantage of flagging any unusual and visually observable symptoms.

  18. Evaluating the utility of syndromic surveillance algorithms for screening to detect potentially clonal hospital infection outbreaks

    PubMed Central

    Talbot, Thomas R; Schaffner, William; Bloch, Karen C; Daniels, Titus L; Miller, Randolph A

    2011-01-01

    Objective The authors evaluated algorithms commonly used in syndromic surveillance for use as screening tools to detect potentially clonal outbreaks for review by infection control practitioners. Design Study phase 1 applied four aberrancy detection algorithms (CUSUM, EWMA, space-time scan statistic, and WSARE) to retrospective microbiologic culture data, producing a list of past candidate outbreak clusters. In phase 2, four infectious disease physicians categorized the phase 1 algorithm-identified clusters to ascertain algorithm performance. In phase 3, project members combined the algorithms to create a unified screening system and conducted a retrospective pilot evaluation. Measurements The study calculated recall and precision for each algorithm, and created precision-recall curves for various methods of combining the algorithms into a unified screening tool. Results Individual algorithm recall and precision ranged from 0.21 to 0.31 and from 0.053 to 0.29, respectively. Few candidate outbreak clusters were identified by more than one algorithm. The best method of combining the algorithms yielded an area under the precision-recall curve of 0.553. The phase 3 combined system detected all infection control-confirmed outbreaks during the retrospective evaluation period. Limitations Lack of phase 2 reviewers' agreement indicates that subjective expert review was an imperfect gold standard. Less conservative filtering of culture results and alternate parameter selection for each algorithm might have improved algorithm performance. Conclusion Hospital outbreak detection presents different challenges than traditional syndromic surveillance. Nevertheless, algorithms developed for syndromic surveillance have potential to form the basis of a combined system that might perform clinically useful hospital outbreak screening. PMID:21606134
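
    Two of the four aberrancy detection algorithms named above, CUSUM and EWMA, are standard and compact; a minimal sketch on a hypothetical daily count series (parameters k, h, and lambda are illustrative, not the study's settings):

```python
def cusum_alarms(counts, mu0, k=0.5, h=5.0):
    """One-sided CUSUM: accumulate excesses over mu0 + k and alarm when
    the running sum exceeds h."""
    s, alarms = 0.0, []
    for t, x in enumerate(counts):
        s = max(0.0, s + (x - mu0 - k))
        if s > h:
            alarms.append(t)
            s = 0.0                  # reset after an alarm
    return alarms

def ewma(counts, lam=0.3):
    """Exponentially weighted moving average of the same series."""
    z, out = counts[0], []
    for x in counts:
        z = lam * x + (1 - lam) * z
        out.append(z)
    return out

daily_isolates = [1, 0, 2, 1, 1, 6, 7, 8, 2, 1]   # hypothetical culture counts
print(cusum_alarms(daily_isolates, mu0=1.2))       # alarms at t = 6 and 7
```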

  19. Improved spatial regression analysis of diffusion tensor imaging for lesion detection during longitudinal progression of multiple sclerosis in individual subjects

    NASA Astrophysics Data System (ADS)

    Liu, Bilan; Qiu, Xing; Zhu, Tong; Tian, Wei; Hu, Rui; Ekholm, Sven; Schifitto, Giovanni; Zhong, Jianhui

    2016-03-01

    Subject-specific longitudinal DTI study is vital for investigation of pathological changes of lesions and disease evolution. Spatial Regression Analysis of Diffusion tensor imaging (SPREAD) is a non-parametric permutation-based statistical framework that combines spatial regression and resampling techniques to achieve effective detection of localized longitudinal diffusion changes within the whole brain at the individual level without a priori hypotheses. However, boundary blurring and dislocation limit its sensitivity, especially towards detecting lesions of irregular shapes. In the present study, we propose an improved SPREAD method (iSPREAD) that incorporates a three-dimensional (3D) nonlinear anisotropic diffusion filter, which provides edge-preserving image smoothing through a nonlinear scale-space approach. The statistical inference based on iSPREAD was evaluated and compared with the original SPREAD method using both simulated and in vivo human brain data. Results demonstrated that the sensitivity and accuracy of the SPREAD method have been improved substantially by adopting nonlinear anisotropic filtering. iSPREAD identifies subject-specific longitudinal changes in the brain with improved sensitivity, accuracy, and enhanced statistical power, especially when the spatial correlation is heterogeneous among neighboring image pixels in DTI.

  20. Detecting anomalies in CMB maps: a new method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neelakanta, Jayanth T., E-mail: jayanthtn@gmail.com

    2015-10-01

    Ever since WMAP announced its first results, different analyses have shown that there is weak evidence for several large-scale anomalies in the CMB data. While the evidence for each anomaly appears to be weak, the fact that there are multiple seemingly unrelated anomalies makes it difficult to account for them via a single statistical fluke. So, one is led to considering a combination of these anomalies. But, if we "hand-pick" the anomalies (test statistics) to consider, we are making an a posteriori choice. In this article, we propose two statistics that do not suffer from this problem. The statistics are linear and quadratic combinations of the a_{ℓm}'s with random coefficients, and they test the null hypothesis that the a_{ℓm}'s are independent, normally-distributed, zero-mean random variables with an m-independent variance. The motivation for considering multiple modes is this: because most physical models that lead to large-scale anomalies result in coupling multiple ℓ and m modes, the "coherence" of this coupling should get enhanced if a combination of different modes is considered. In this sense, the statistics are thus much more generic than those that have been hitherto considered in the literature. Using fiducial data, we demonstrate that the method works and discuss how it can be used with actual CMB data to make quite general statements about the incompatibility of the data with the null hypothesis.
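
    A minimal Monte Carlo sketch of this idea follows. The ℓ-range, the 1/ℓ variance profile, and the particular random quadratic form are illustrative assumptions, not the paper's exact construction:

```python
import numpy as np

rng = np.random.default_rng(0)
ells = range(2, 11)

def draw_alm():
    """Null-hypothesis a_lm: independent, zero-mean, m-independent variance."""
    return np.concatenate([rng.normal(0.0, 1.0 / l, size=2 * l + 1) for l in ells])

a_obs = draw_alm()                      # stand-in for the observed modes
n = a_obs.size

# Random coefficients, drawn once and then held fixed for every data set.
c = rng.normal(size=n)                  # weights of the linear combination
M = rng.normal(size=(n, n))
M = (M + M.T) / 2.0                     # weights of the quadratic combination

def statistics(a):
    return c @ a, a @ M @ a

obs_lin, obs_quad = statistics(a_obs)

# Null distributions by Monte Carlo; p-values compare observed to null draws.
null = np.array([statistics(draw_alm()) for _ in range(2000)])
p_lin = np.mean(np.abs(null[:, 0]) >= abs(obs_lin))
p_quad = np.mean(np.abs(null[:, 1] - null[:, 1].mean())
                 >= abs(obs_quad - null[:, 1].mean()))
print(f"p_lin = {p_lin:.3f}, p_quad = {p_quad:.3f}")
```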

  1. Association analysis of multiple traits by an approach of combining P values.

    PubMed

    Chen, Lili; Wang, Yong; Zhou, Yajing

    2018-03-01

    Increasing evidence shows that one variant can affect multiple traits, which is a widespread phenomenon in complex diseases. Joint analysis of multiple traits can increase the statistical power of association analysis and uncover the underlying genetic mechanism. Although there are many statistical methods to analyse multiple traits, most of these methods are usually suitable for detecting common variants associated with multiple traits. However, because of the low minor allele frequencies of rare variants, these methods are not optimal for rare variant association analysis. In this paper, we extend an adaptive combination of P values method (termed ADA), originally developed for a single trait, to test association between multiple traits and rare variants in a given region. For a given region, we use a reverse regression model to test each rare variant for association with multiple traits and obtain the P value of the single-variant test. Further, we take the weighted combination of these P values as the test statistic. Extensive simulation studies show that our approach is more powerful than several other comparison methods in most cases and is robust to the inclusion of a high proportion of neutral variants and to different directions of effects of causal variants.
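
    The following sketch shows the two ingredients named in the abstract, reverse regression per variant and a weighted combination of the resulting P values; the weight choice, the F-test form, and the toy data are assumptions for illustration, and the published ADA procedure additionally truncates adaptively and calibrates the statistic by permutation:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def variant_pvalue(genotype, traits):
    """Reverse regression: regress the variant's genotype on all traits
    jointly and test the traits' joint contribution with an F-test."""
    n, k = traits.shape
    X = np.column_stack([np.ones(n), traits])
    beta, *_ = np.linalg.lstsq(X, genotype, rcond=None)
    rss1 = np.sum((genotype - X @ beta) ** 2)
    rss0 = np.sum((genotype - genotype.mean()) ** 2)
    f = ((rss0 - rss1) / k) / (rss1 / (n - k - 1))
    return stats.f.sf(f, k, n - k - 1)

def combined_stat(pvals, weights):
    """Weighted Fisher-style combination of per-variant P values."""
    return -np.sum(weights * np.log(pvals))

# Toy region: 20 rare variants, 3 traits, 500 subjects, no true signal.
n_sub, n_var, n_trait = 500, 20, 3
G = rng.binomial(2, 0.01, size=(n_sub, n_var)).astype(float)
T = rng.normal(size=(n_sub, n_trait))
p = np.array([variant_pvalue(G[:, j], T) for j in range(n_var)])
w = 1.0 / np.maximum(G.mean(axis=0), 1e-6)     # up-weight rarer variants
print(combined_stat(p, w))                      # calibrate by permutation
```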

  2. Technical evaluation of Virtual Touch™ tissue quantification and elastography in benign and malignant breast tumors

    PubMed Central

    JIANG, QUAN; ZHANG, YUAN; CHEN, JIAN; ZHANG, YUN-XIAO; HE, ZHU

    2014-01-01

    The aim of this study was to investigate the diagnostic value of the Virtual Touch™ tissue quantification (VTQ) and elastosonography technologies in benign and malignant breast tumors. Routine preoperative ultrasound, elastosonography and VTQ examinations were performed on 86 patients with breast lesions. The elastosonography score and VTQ speed grouping of each lesion were measured and compared with the pathological findings. The difference in the elastosonography score between the benign and malignant breast tumors was statistically significant (P<0.05). The detection rate for an elastosonography score of 1–3 points in benign tumors was 68.09% and that for an elastosonography score of 4–5 points in malignant tumors was 82.05%. The difference in VTQ speed values between the benign and malignant tumors was also statistically significant (P<0.05). In addition, the diagnostic accuracy of conventional ultrasound, elastosonography, VTQ technology and the combined methods showed statistically significant differences (P<0.05). The use of the three technologies in combination significantly improved the diagnostic accuracy to 91.86%. In conclusion, the combination of conventional ultrasound, elastosonography and VTQ technology can significantly improve accuracy in the diagnosis of breast cancer. PMID:25187797

  3. A targeted change-detection procedure by combining change vector analysis and post-classification approach

    NASA Astrophysics Data System (ADS)

    Ye, Su; Chen, Dongmei; Yu, Jie

    2016-04-01

    In remote sensing, conventional supervised change-detection methods usually require effective training data for multiple change types. This paper introduces a more flexible and efficient procedure that seeks to identify only the changes that users are interested in, hereafter referred to as "targeted change detection". Based on a one-class classifier, Support Vector Domain Description (SVDD), a novel algorithm named "Three-layer SVDD Fusion" (TLSF) is developed specifically for targeted change detection. The proposed algorithm combines one-class classification generated from change vector maps, as well as before- and after-change images, in order to obtain a more reliable detection result. In addition, this paper introduces a detailed workflow for implementing the algorithm. This workflow has been applied to two case studies with different practical monitoring objectives: urban expansion and forest fire assessment. The experimental results of these two case studies show that the overall accuracy of our proposed algorithm is superior (Kappa statistics are 86.3% and 87.8% for Cases 1 and 2, respectively) compared to applying SVDD to change vector analysis and post-classification comparison.
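
    Scikit-learn ships no SVDD implementation, but its one-class SVM is a closely related boundary method, so a single layer of the three-layer fusion might be sketched as follows (the feature values, nu, and gamma are hypothetical):

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)

# Toy stand-in for change-vector features of training pixels known to
# contain the targeted change class (e.g. newly urbanized areas).
target_train = rng.normal(loc=[2.0, 2.0], scale=0.3, size=(200, 2))

# One-class description of the targeted change; nu bounds the fraction of
# training outliers and gamma sets the RBF width (both hypothetical here).
clf = OneClassSVM(kernel="rbf", nu=0.05, gamma=2.0).fit(target_train)

# Score candidate pixels: +1 = targeted change, -1 = everything else.
candidates = np.vstack([rng.normal([2.0, 2.0], 0.3, size=(5, 2)),
                        rng.normal([0.0, 0.0], 0.3, size=(5, 2))])
print(clf.predict(candidates))
```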

  4. ToNER: A tool for identifying nucleotide enrichment signals in feature-enriched RNA-seq data

    PubMed Central

    Promworn, Yuttachon; Kaewprommal, Pavita; Shaw, Philip J.; Intarapanich, Apichart; Tongsima, Sissades

    2017-01-01

    Background Biochemical methods are available for enriching 5′ ends of RNAs in prokaryotes, which are employed in the differential RNA-seq (dRNA-seq) and the more recent Cappable-seq protocols. Computational methods are needed to locate RNA 5′ ends from these data by statistical analysis of the enrichment. Although statistics-based analysis methods have been developed for dRNA-seq, they may not be suitable for Cappable-seq data. The more efficient enrichment method employed in Cappable-seq compared with dRNA-seq could affect the data distribution and thus algorithm performance. Results We present Transformation of Nucleotide Enrichment Ratios (ToNER), a tool for statistical modeling of enrichment from RNA-seq data obtained from enriched and unenriched libraries. The tool calculates nucleotide enrichment scores and determines the global transformation for fitting to the normal distribution using the Box-Cox procedure. From the transformed distribution, sites of significant enrichment are identified. To increase the power of detection, meta-analysis across experimental replicates is offered. We tested the tool on Cappable-seq and dRNA-seq data for identifying Escherichia coli transcript 5′ ends and compared the results with those from the TSSAR tool, which is designed for analyzing dRNA-seq data. When combining results across Cappable-seq replicates, ToNER detects more known transcript 5′ ends than TSSAR. In general, the transcript 5′ ends detected by ToNER but not TSSAR occur in regions which cannot be locally modeled by TSSAR. Conclusion ToNER uses a simple yet robust statistical modeling approach, which can be used for detecting RNA 5′ ends from Cappable-seq data, in particular when combining information from experimental replicates. The ToNER tool could potentially be applied to analyzing other RNA-seq datasets in which enrichment for other structural features of RNA is employed. The program is freely available for download at the ToNER webpage (http://www4a.biotec.or.th/GI/tools/toner) and GitHub repository (https://github.com/PavitaKae/ToNER). PMID:28542466
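
    A minimal sketch of the core idea: compute per-nucleotide enrichment ratios, Box-Cox transform them toward normality, and call significant sites from the fitted distribution. The pseudo-count, threshold, and simulated counts are illustrative assumptions, and the replicate meta-analysis is omitted:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Toy per-nucleotide read counts from an enriched and an unenriched library;
# the +1 pseudo-count keeps every ratio positive for the Box-Cox fit.
enriched = rng.poisson(5.0, size=10_000) + 1
control = rng.poisson(5.0, size=10_000) + 1
enriched[:20] += 80                          # plant a few true 5' ends

ratio = enriched / control                   # nucleotide enrichment score
transformed, lam = stats.boxcox(ratio)       # global transform to normality
z = (transformed - transformed.mean()) / transformed.std()
pvals = stats.norm.sf(z)                     # upper-tail p-values
hits = np.flatnonzero(pvals < 1e-4)
print(f"lambda = {lam:.2f}, {hits.size} candidate enriched sites")
```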

  5. Surveying Europe’s Only Cave-Dwelling Chordate Species (Proteus anguinus) Using Environmental DNA

    PubMed Central

    Márton, Orsolya; Schmidt, Benedikt R.; Gál, Júlia Tünde; Jelić, Dušan

    2017-01-01

    In surveillance of subterranean fauna, especially in the case of rare or elusive aquatic species, traditional techniques used for epigean species are often not feasible. We developed a non-invasive survey method based on environmental DNA (eDNA) to detect the presence of the red-listed cave-dwelling amphibian, Proteus anguinus, in the caves of the Dinaric Karst. We tested the method in fifteen caves in Croatia, from which the species was previously recorded or expected to occur. We successfully confirmed the presence of P. anguinus from ten caves and detected the species for the first time in five others. Using a hierarchical occupancy model we compared the availability and detection probability of eDNA of two water sampling methods, filtration and precipitation. The statistical analysis showed that both availability and detection probability depended on the method and estimates for both probabilities were higher using filter samples than for precipitation samples. Combining reliable field and laboratory methods with robust statistical modeling will give the best estimates of species occurrence. PMID:28129383

  6. CHOBS: Color Histogram of Block Statistics for Automatic Bleeding Detection in Wireless Capsule Endoscopy Video.

    PubMed

    Ghosh, Tonmoy; Fattah, Shaikh Anowarul; Wahid, Khan A

    2018-01-01

    Wireless capsule endoscopy (WCE) is the most advanced technology to visualize the whole gastrointestinal (GI) tract in a non-invasive way. Its major disadvantage, however, is the long reviewing time, which is very laborious because continuous manual intervention is necessary. In order to reduce the burden on the clinician, in this paper, an automatic bleeding detection method for WCE video is proposed based on the color histogram of block statistics, namely CHOBS. A single pixel in a WCE image may be distorted due to the capsule motion in the GI tract. Instead of considering individual pixel values, a block surrounding each pixel is chosen for extracting local statistical features. By combining local block features of the three color planes of RGB color space, an index value is defined. A color histogram, which is extracted from those index values, provides a distinguishable color texture feature. A feature reduction technique utilizing the color histogram pattern and principal component analysis is proposed, which can drastically reduce the feature dimension. For bleeding zone detection, blocks are classified using the already-extracted local features, which incurs no additional computational burden for feature extraction. From extensive experimentation on several WCE videos and 2300 images, which are collected from a publicly available database, a very satisfactory bleeding frame and zone detection performance is achieved in comparison to that obtained by some of the existing methods. In the case of bleeding frame detection, the accuracy, sensitivity, and specificity obtained from the proposed method are 97.85%, 99.47%, and 99.15%, respectively, and in the case of bleeding zone detection, 95.75% precision is achieved. The proposed method offers not only a low feature dimension but also highly satisfactory bleeding detection performance, and it can even effectively detect bleeding frames and zones in continuous WCE video data.
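
    The block-statistics-plus-histogram construction can be sketched as follows; here the block mean stands in for the paper's local statistical features, and the quantization and PCA-based feature reduction are simplified assumptions:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def chobs_like_descriptor(img, block=7, bins=8):
    """Each pixel is summarized by the mean of its surrounding block in every
    RGB plane; the three quantized block means are packed into one index and
    the histogram of indices is the frame's color-texture descriptor."""
    means = uniform_filter(img.astype(float), size=(block, block, 1))
    q = np.clip((means / 256.0 * bins).astype(int), 0, bins - 1)
    index = (q[..., 0] * bins + q[..., 1]) * bins + q[..., 2]
    hist = np.bincount(index.ravel(), minlength=bins ** 3).astype(float)
    return hist / hist.sum()

frame = np.random.default_rng(0).integers(0, 256, size=(128, 128, 3)).astype(np.uint8)
print(chobs_like_descriptor(frame).shape)    # (512,) for 8 bins per plane
```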

  7. Detecting signals of drug-drug interactions in a spontaneous reports database.

    PubMed

    Thakrar, Bharat T; Grundschober, Sabine Borel; Doessegger, Lucette

    2007-10-01

    The spontaneous reports database is widely used for detecting signals of ADRs. We have extended the methodology to include the detection of signals of ADRs that are associated with drug-drug interactions (DDI). In particular, we have investigated two different statistical assumptions for detecting signals of DDI. Using the FDA's spontaneous reports database, we investigated two models, a multiplicative and an additive model, to detect signals of DDI. We applied the models to four known DDIs (methotrexate-diclofenac and bone marrow depression, simvastatin-ciclosporin and myopathy, ketoconazole-terfenadine and torsades de pointes, and cisapride-erythromycin and torsades de pointes) and to four drug-event combinations where there is currently no evidence of a DDI (fexofenadine-ketoconazole and torsades de pointes, methotrexate-rofecoxib and bone marrow depression, fluvastatin-ciclosporin and myopathy, and cisapride-azithromycin and torsades de pointes) and estimated the measure of interaction on the two scales. The additive model correctly identified all four known DDIs by giving a statistically significant (P < 0.05) positive measure of interaction. The multiplicative model identified the first two of the known DDIs as having a statistically significant or borderline significant (P < 0.1) positive measure of interaction term, gave a nonsignificant positive trend for the third interaction (P = 0.27), and a negative trend for the last interaction. Both models correctly identified the four known non-interactions by estimating a negative measure of interaction. The spontaneous reports database is a valuable resource for detecting signals of DDIs. In particular, the additive model is more sensitive in detecting such signals. The multiplicative model may further help qualify the strength of the signal detected by the additive model.
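
    The two assumptions can be made concrete in a small sketch: under the additive model the expected event rate with both drugs is r10 + r01 - r00, and under the multiplicative model it is r10·r01/r00. The counts below are hypothetical, and the significance testing reported in the paper is omitted:

```python
import numpy as np

def ddi_interaction(n_event, n_total):
    """n_event / n_total: 2x2 arrays of report counts indexed as
    [drug A present][drug B present]. Returns the interaction measure on
    the additive scale (excess > 0 suggests a DDI) and the multiplicative
    scale (ratio > 1 suggests a DDI)."""
    r = n_event / n_total                      # event reporting proportions
    additive = r[1, 1] - (r[1, 0] + r[0, 1] - r[0, 0])
    multiplicative = r[1, 1] / (r[1, 0] * r[0, 1] / r[0, 0])
    return additive, multiplicative

# Hypothetical counts: reports with the event / all reports per exposure cell.
n_event = np.array([[40.0, 30.0], [25.0, 60.0]])
n_total = np.array([[100000.0, 8000.0], [9000.0, 2000.0]])
add, mult = ddi_interaction(n_event, n_total)
print(f"additive excess = {add:.4f}, multiplicative ratio = {mult:.2f}")
```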

  8. Detecting signals of drug–drug interactions in a spontaneous reports database

    PubMed Central

    Thakrar, Bharat T; Grundschober, Sabine Borel; Doessegger, Lucette

    2007-01-01

    Aims The spontaneous reports database is widely used for detecting signals of ADRs. We have extended the methodology to include the detection of signals of ADRs that are associated with drug–drug interactions (DDI). In particular, we have investigated two different statistical assumptions for detecting signals of DDI. Methods Using the FDA's spontaneous reports database, we investigated two models, a multiplicative and an additive model, to detect signals of DDI. We applied the models to four known DDIs (methotrexate-diclofenac and bone marrow depression, simvastatin-ciclosporin and myopathy, ketoconazole-terfenadine and torsades de pointes, and cisapride-erythromycin and torsades de pointes) and to four drug-event combinations where there is currently no evidence of a DDI (fexofenadine-ketoconazole and torsades de pointes, methotrexate-rofecoxib and bone marrow depression, fluvastatin-ciclosporin and myopathy, and cisapride-azithromycin and torsades de pointes) and estimated the measure of interaction on the two scales. Results The additive model correctly identified all four known DDIs by giving a statistically significant (P < 0.05) positive measure of interaction. The multiplicative model identified the first two of the known DDIs as having a statistically significant or borderline significant (P < 0.1) positive measure of interaction term, gave a nonsignificant positive trend for the third interaction (P = 0.27), and a negative trend for the last interaction. Both models correctly identified the four known non-interactions by estimating a negative measure of interaction. Conclusions The spontaneous reports database is a valuable resource for detecting signals of DDIs. In particular, the additive model is more sensitive in detecting such signals. The multiplicative model may further help qualify the strength of the signal detected by the additive model. PMID:17506784

  9. Aircraft target detection algorithm based on high resolution spaceborne SAR imagery

    NASA Astrophysics Data System (ADS)

    Zhang, Hui; Hao, Mengxi; Zhang, Cong; Su, Xiaojing

    2018-03-01

    In this paper, an image classification algorithm for airport areas is proposed, based on the statistical features of synthetic aperture radar (SAR) images and the spatial information of pixels. The algorithm combines a Gamma mixture model with a Markov random field (MRF): the Gamma mixture model produces an initial classification, which is then refined with the MRF technique using the spatial correlation between pixels. Additionally, morphology methods are employed to extract the airport region of interest (ROI), in which suspected aircraft target samples are classified to reduce false alarms and increase detection performance. Finally, the aircraft target detection results are verified by simulation tests.

  10. Performance of statistical process control methods for regional surgical site infection surveillance: a 10-year multicentre pilot study.

    PubMed

    Baker, Arthur W; Haridy, Salah; Salem, Joseph; Ilieş, Iulian; Ergai, Awatef O; Samareh, Aven; Andrianas, Nicholas; Benneyan, James C; Sexton, Daniel J; Anderson, Deverick J

    2017-11-24

    Traditional strategies for surveillance of surgical site infections (SSI) have multiple limitations, including delayed and incomplete outbreak detection. Statistical process control (SPC) methods address these deficiencies by combining longitudinal analysis with graphical presentation of data. We performed a pilot study within a large network of community hospitals to evaluate performance of SPC methods for detecting SSI outbreaks. We applied conventional Shewhart and exponentially weighted moving average (EWMA) SPC charts to 10 previously investigated SSI outbreaks that occurred from 2003 to 2013. We compared the results of SPC surveillance to the results of traditional SSI surveillance methods. Then, we analysed the performance of modified SPC charts constructed with different outbreak detection rules, EWMA smoothing factors and baseline SSI rate calculations. Conventional Shewhart and EWMA SPC charts both detected 8 of the 10 SSI outbreaks analysed, in each case prior to the date of traditional detection. Among detected outbreaks, conventional Shewhart chart detection occurred a median of 12 months prior to outbreak onset and 22 months prior to traditional detection. Conventional EWMA chart detection occurred a median of 7 months prior to outbreak onset and 14 months prior to traditional detection. Modified Shewhart and EWMA charts additionally detected several outbreaks earlier than conventional SPC charts. Shewhart and EWMA charts had low false-positive rates when used to analyse separate control-hospital SSI data. Our findings illustrate the potential usefulness and feasibility of real-time SPC surveillance of SSI to rapidly identify outbreaks and improve patient safety. Further study is needed to optimise SPC chart selection and calculation, statistical outbreak detection rules and the process for reacting to signals of potential outbreaks. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
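
    As an illustration of the EWMA chart used here, the sketch below smooths a monthly SSI rate series and flags excursions beyond the upper control limit; the smoothing constant, control-limit multiplier, and baseline values are common textbook choices, not the study's tuned parameters:

```python
import numpy as np

def ewma_chart(rates, baseline_mean, baseline_sd, lam=0.2, L=2.7):
    """EWMA control chart: smooth the series and flag points above the
    upper control limit computed from the exact EWMA variance."""
    z, flags = baseline_mean, []
    for t, x in enumerate(rates):
        z = lam * x + (1 - lam) * z
        var = baseline_sd**2 * lam / (2 - lam) * (1 - (1 - lam) ** (2 * (t + 1)))
        ucl = baseline_mean + L * np.sqrt(var)
        if z > ucl:
            flags.append(t)       # month indices signalling a possible outbreak
    return flags

monthly_ssi_rate = [1.8, 2.1, 1.7, 2.0, 2.2, 3.5, 3.9, 4.2, 4.0, 2.1]
print(ewma_chart(monthly_ssi_rate, baseline_mean=2.0, baseline_sd=0.4))
```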

  11. Improving Glaucoma Detection Using Spatially Correspondent Clusters of Damage and by Combining Standard Automated Perimetry and Optical Coherence Tomography

    PubMed Central

    Raza, Ali S.; Zhang, Xian; De Moraes, Carlos G. V.; Reisman, Charles A.; Liebmann, Jeffrey M.; Ritch, Robert; Hood, Donald C.

    2014-01-01

    Purpose. To improve the detection of glaucoma, techniques for assessing local patterns of damage and for combining structure and function were developed. Methods. Standard automated perimetry (SAP) and frequency-domain optical coherence tomography (fdOCT) data, consisting of macular retinal ganglion cell plus inner plexiform layer (mRGCPL) as well as macular and optic disc retinal nerve fiber layer (mRNFL and dRNFL) thicknesses, were collected from 52 eyes of 52 healthy controls and 156 eyes of 96 glaucoma suspects and patients. In addition to generating simple global metrics, SAP and fdOCT data were searched for contiguous clusters of abnormal points and converted to a continuous metric (pcc). The pcc metric, along with simpler methods, was used to combine the information from the SAP and fdOCT. The performance of different methods was assessed using the area under receiver operating characteristic curves (AROC scores). Results. The pcc metric performed better than simple global measures for both the fdOCT and SAP. The best combined structure-function metric (mRGCPL&SAP pcc, AROC = 0.868 ± 0.032) was significantly better than the best metrics for independent measures of structure and function. When SAP was used as part of the inclusion and exclusion criteria, AROC scores increased for all metrics, including the best combined structure-function metric (AROC = 0.975 ± 0.014). Conclusions. A combined structure-function metric improved the detection of glaucomatous eyes. Overall, the primary sources of value-added for glaucoma detection stem from the continuous cluster search (the pcc), the mRGCPL data, and the combination of structure and function. PMID:24408977

  12. [Effects of Buzhong Yiqi decoction on expression of Bad, NF-κB, caspase-9, Survivin, and mTOR in nude mice with A549/DDP transplantation tumors].

    PubMed

    Liu, Ya-Li; Yi, Jia-Li; Liu, Chun-Ying

    2017-02-01

    This study aimed to explore the effects of Buzhong Yiqi decoction on the expression levels of Bad, NF-κB, caspase-9, Survivin, and mTOR in nude mice with A549/DDP transplantation tumors. Sixty BALB/c mice were randomly divided into a blank control group, a tumor-bearing control group, a cisplatin group, and high-, medium- and low-dose Buzhong Yiqi decoction + cisplatin groups (hereinafter referred to as the high, medium and low combined groups). A549/DDP cells (concentration of 5×10⁶ cells/mL) were cultured and inoculated in the various groups, and tumor formation was observed. Corresponding treatment was given in all groups. Fourteen days later, immunohistochemistry and real-time PCR methods were used to detect the protein and mRNA expression levels of Bad, NF-κB, caspase-9, Survivin and mTOR in the tumors. Results showed that Buzhong Yiqi decoction combined with cisplatin could reduce the volume of transplanted tumors, with a significant difference between the medium and high combined groups (P<0.05). As compared with the tumor-bearing control group, the expression levels of Bad, NF-κB, Survivin and mTOR were significantly reduced in the medium and high combined groups (P<0.05); the protein and mRNA expression levels of caspase-9 were gradually increased in the medium and high combined groups (P<0.05), with a statistically significant difference versus the tumor-bearing control group (P<0.05). There were statistically significant differences in the mRNA expression of Bad, NF-κB and caspase-9 between the medium and high combined groups on the one hand and the cisplatin group, low combined group and tumor-bearing control group on the other (P<0.05), but there was no statistically significant difference among the cisplatin group, low combined group and tumor-bearing control group. In addition, there was no statistically significant difference between the medium and high combined groups in the protein and mRNA expression levels of the various factors. These experimental results showed that Buzhong Yiqi decoction combined with cisplatin can inhibit the growth of A549/DDP transplanted tumors, and the mechanism may be associated with regulating Bad, NF-κB, caspase-9, Survivin, and mTOR levels as well as promoting apoptosis. Copyright© by the Chinese Pharmaceutical Association.

  13. ICPD-a new peak detection algorithm for LC/MS.

    PubMed

    Zhang, Jianqiu; Haskins, William

    2010-12-01

    The identification and quantification of proteins using label-free Liquid Chromatography/Mass Spectrometry (LC/MS) play crucial roles in biological and biomedical research. Increasing evidence has shown that biomarkers are often low-abundance proteins. However, LC/MS systems are subject to considerable noise and sample variability, whose statistical characteristics are still elusive, making computational identification of low-abundance proteins extremely challenging. As a result, the inability to identify low-abundance proteins in a proteomic study is the main bottleneck in protein biomarker discovery. In this paper, we propose a new peak detection method called Information Combining Peak Detection (ICPD) for high resolution LC/MS. In LC/MS, peptides elute during a certain time period and, as a result, peptide isotope patterns are registered in multiple MS scans. The key feature of the new algorithm is that the observed isotope patterns registered in multiple scans are combined together for estimating the likelihood of the peptide's existence. An isotope pattern matching score based on the likelihood probability is provided and utilized for peak detection. The performance of the new algorithm is evaluated on protein standards with 48 known proteins. The evaluation shows better peak detection accuracy for low-abundance proteins than other LC/MS peak detection methods.
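
    The key idea, accumulating isotope-pattern evidence across the scans in which a peptide elutes, can be sketched as below; cosine similarity stands in for the paper's likelihood-based matching score, and the template and scan data are hypothetical:

```python
import numpy as np

def combined_pattern_score(scans, template):
    """Sum pattern-match evidence over the scans in which a peptide elutes:
    each scan contributes the log cosine similarity between its observed
    isotope intensities and the theoretical isotope pattern."""
    template = template / np.linalg.norm(template)
    score = 0.0
    for obs in scans:
        norm = np.linalg.norm(obs)
        if norm == 0:
            continue
        cos = float(obs @ template) / norm
        score += np.log(max(cos, 1e-12))   # repeated good matches raise score
    return score

# Hypothetical averagine-like template and three noisy elution-profile scans.
template = np.array([1.0, 0.8, 0.4, 0.15])
rng = np.random.default_rng(0)
scans = [a * template + 0.05 * rng.random(4) for a in (0.3, 1.0, 0.6)]
print(combined_pattern_score(scans, template))
```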

  14. Testing chirality of primordial gravitational waves with Planck and future CMB data: no hope from angular power spectra

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerbino, Martina; Gruppuso, Alessandro; Natoli, Paolo

    We use the 2015 Planck likelihood in combination with the Bicep2/Keck likelihood (BKP and BK14) to constrain the chirality, χ, of primordial gravitational waves in a scale-invariant scenario. In this framework, the parameter χ always enters the theory coupled to the tensor-to-scalar ratio, r, e.g. in combinations of the form χ·r. Thus, the capability to detect χ critically depends on the value of r. We find that with present data sets χ is de facto unconstrained. We also provide forecasts for χ from future CMB experiments, including COrE+, exploring several fiducial values of r. We find that the current limit on r is tight enough to disfavor a clear detection of χ. For example, in the unlikely case in which r ∼ 0.1 (0.05), the maximal chirality case, i.e. χ = ±1, could be detected with a significance of ∼2.5 (1.5)σ at best. We conclude that the two-point statistics at the basis of CMB likelihood functions are currently unable to constrain chirality and may only provide weak limits on χ in the most optimistic scenarios. Hence, it is crucial to investigate the use of other observables, e.g. those provided by higher-order statistics, to constrain these kinds of parity-violating theories with the CMB.

  15. kruX: matrix-based non-parametric eQTL discovery

    PubMed Central

    2014-01-01

    Background The Kruskal-Wallis test is a popular non-parametric statistical test for identifying expression quantitative trait loci (eQTLs) from genome-wide data due to its robustness against variations in the underlying genetic model and expression trait distribution, but testing billions of marker-trait combinations one-by-one can become computationally prohibitive. Results We developed kruX, an algorithm implemented in Matlab, Python and R that uses matrix multiplications to simultaneously calculate the Kruskal-Wallis test statistic for several million marker-trait combinations at once. kruX is more than ten thousand times faster than computing associations one-by-one on a typical human dataset. We used kruX and a dataset of more than 500k SNPs and 20k expression traits measured in 102 human blood samples to compare eQTLs detected by the Kruskal-Wallis test to eQTLs detected by the parametric ANOVA and linear model methods. We found that the Kruskal-Wallis test is more robust against data outliers and heterogeneous genotype group sizes and detects a higher proportion of non-linear associations, but is more conservative for calling additive linear associations. Conclusion kruX enables the use of robust non-parametric methods for massive eQTL mapping without the need for a high-performance computing infrastructure and is freely available from http://krux.googlecode.com. PMID:24423115
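
    The matrix trick is that, for each genotype group, the rank sums for all trait x marker pairs come from a single matrix product between the trait rank matrix and the group indicator matrix. A minimal sketch without kruX's tie correction:

```python
import numpy as np
from scipy import stats

def kruskal_wallis_matrix(expr, geno):
    """Kruskal-Wallis H for every trait x marker pair at once.
    expr: traits x samples; geno: markers x samples coded 0/1/2.
    Returns H and chi-square p-values (tie correction omitted)."""
    n_traits, n = expr.shape
    ranks = stats.rankdata(expr, axis=1)       # rank each trait's samples
    H = np.zeros((n_traits, geno.shape[0]))
    groups = np.zeros(geno.shape[0])
    for g in (0, 1, 2):
        ind = (geno == g).astype(float)        # markers x samples indicator
        ng = ind.sum(axis=1)                   # group size per marker
        Rg = ranks @ ind.T                     # rank sums: traits x markers
        with np.errstate(divide="ignore", invalid="ignore"):
            H += np.where(ng > 0, Rg**2 / ng, 0.0)
        groups += ng > 0
    H = 12.0 / (n * (n + 1)) * H - 3.0 * (n + 1)
    return H, stats.chi2.sf(H, groups - 1)

rng = np.random.default_rng(0)
expr = rng.normal(size=(50, 102))              # 50 expression traits
geno = rng.integers(0, 3, size=(200, 102))     # 200 SNP markers
H, p = kruskal_wallis_matrix(expr, geno)
print(H.shape, p.min())
```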

  16. Competition can lead to unexpected patterns in tropical ant communities

    NASA Astrophysics Data System (ADS)

    Ellwood, M. D. Farnon; Blüthgen, Nico; Fayle, Tom M.; Foster, William A.; Menzel, Florian

    2016-08-01

    Ecological communities are structured by competitive, predatory, mutualistic and parasitic interactions combined with chance events. Separating deterministic from stochastic processes is possible, but finding statistical evidence for specific biological interactions is challenging. We attempt to solve this problem for ant communities nesting in epiphytic bird's nest ferns (Asplenium nidus) in Borneo's lowland rainforest. By recording the frequencies with which each species and every pairwise combination of species occurred together, we were able to test statistically for patterns associated with interspecific competition. We found evidence for competition, but the resulting co-occurrence pattern was the opposite of what we expected. Rather than detecting species segregation, the classical hallmark of competition, we found species aggregation. Moreover, our approach of testing individual pairwise interactions mostly revealed spatially positive rather than negative associations. Significant negative interactions were only detected among large ants, and among species of the subfamily Ponerinae. Remarkably, the results from this study, and from a corroborating analysis of ant communities known to be structured by competition, suggest that competition within the ants leads to species aggregation rather than segregation. We believe this unexpected result is linked to the displacement of species following asymmetric competition. We conclude that analysing co-occurrence frequencies across complete species assemblages, separately for each species and for each unique pairwise combination of species, represents a subtle yet powerful way of detecting structure and compartmentalisation in ecological communities.

  17. Assessment and improvement of statistical tools for comparative proteomics analysis of sparse data sets with few experimental replicates.

    PubMed

    Schwämmle, Veit; León, Ileana Rodríguez; Jensen, Ole Nørregaard

    2013-09-06

    Large-scale quantitative analyses of biological systems are often performed with few replicate experiments, leading to multiple nonidentical data sets due to missing values. For example, mass spectrometry driven proteomics experiments are frequently performed with few biological or technical replicates due to sample scarcity, duty-cycle or sensitivity constraints, or limited capacity of the available instrumentation, leading to incomplete results where detection of significant feature changes becomes a challenge. This problem is further exacerbated for the detection of significant changes on the peptide level, for example, in phospho-proteomics experiments. In order to assess the extent of this problem and the implications for large-scale proteome analysis, we investigated and optimized the performance of three statistical approaches by using simulated and experimental data sets with varying numbers of missing values. We applied three tools, the standard t test, the moderated t test (also known as limma), and rank products, for the detection of significantly changing features in simulated and experimental proteomics data sets with missing values. The rank products method was improved to work with data sets containing missing values. Extensive analysis of simulated and experimental data sets revealed that the performance of the statistical analysis tools depended on simple properties of the data sets. High-confidence results were obtained by using the limma and rank products methods for analyses of triplicate data sets that exhibited more than 1000 features and more than 50% missing values. The maximum number of differentially represented features was identified by using the limma and rank products methods in a complementary manner. We therefore recommend combined usage of these methods as a novel and optimal way to detect significantly changing features in these data sets. This approach is suitable for large quantitative data sets from stable isotope labeling and mass spectrometry experiments and should be applicable to large data sets of any type. An R script that implements the improved rank products algorithm and the combined analysis is available.
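
    The spirit of a missing-value-tolerant rank products statistic can be sketched as follows; this is not the published R implementation, and in practice significance would be assessed by permutation:

```python
import numpy as np

def rank_product(data):
    """Rank-product statistic per feature across replicate columns,
    skipping missing values (NaN)."""
    ranks = np.full(data.shape, np.nan)
    for j in range(data.shape[1]):
        col = data[:, j]
        ok = ~np.isnan(col)
        # Rank 1 = strongest up-regulation within this replicate.
        ranks[ok, j] = (-col[ok]).argsort().argsort() + 1
    # Geometric mean over available ranks tolerates unequal coverage.
    return np.exp(np.nanmean(np.log(ranks), axis=1))

rng = np.random.default_rng(0)
data = rng.normal(size=(1000, 3))              # features x replicates
data[rng.random(data.shape) < 0.3] = np.nan    # ~30% missing values
data = data[~np.isnan(data).all(axis=1)]       # drop fully missing features
rp = rank_product(data)
print(rp[np.argsort(rp)[:5]])                  # smallest = most consistent
```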

  18. Anomaly Detection in Gamma-Ray Vehicle Spectra with Principal Components Analysis and Mahalanobis Distances

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tardiff, Mark F.; Runkle, Robert C.; Anderson, K. K.

    2006-01-23

    The goal of primary radiation monitoring in support of routine screening and emergency response is to detect characteristics in vehicle radiation signatures that indicate the presence of potential threats. Two conceptual approaches to analyzing gamma-ray spectra for threat detection are isotope identification and anomaly detection. While isotope identification is the time-honored method, an emerging technique is anomaly detection, which uses benign vehicle gamma-ray signatures to define an expectation of the radiation signature for vehicles that do not pose a threat. Newly acquired spectra are then compared to this expectation using statistical criteria that reflect acceptable false alarm rates and probabilities of detection. The gamma-ray spectra analyzed here were collected at a U.S. land Port of Entry (POE) using a NaI-based radiation portal monitor (RPM). The raw data were analyzed to develop a benign vehicle expectation by decimating the original pulse-height channels to 35 energy bins, extracting composite variables via principal components analysis (PCA), and estimating statistically weighted distances from the mean vehicle spectrum with the Mahalanobis distance (MD) metric. This paper reviews the methods used to establish the anomaly identification criteria and presents a systematic analysis of the response of the combined PCA and MD algorithm to modeled mono-energetic gamma-ray sources.
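
    A compact sketch of this pipeline, PCA on benign spectra followed by Mahalanobis distances in the reduced space, is shown below; the simulated spectra, the number of components, and the chi-square alarm threshold are illustrative assumptions:

```python
import numpy as np
from scipy.stats import chi2
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Toy stand-in for decimated gamma-ray spectra: 35 energy bins per vehicle.
benign = rng.gamma(shape=5.0, scale=2.0, size=(5000, 35))

pca = PCA(n_components=5).fit(benign)          # composite variables
scores = pca.transform(benign)
mean = scores.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(scores, rowvar=False))

def mahalanobis_sq(spectrum):
    """Squared Mahalanobis distance of one new spectrum in PCA space."""
    d = pca.transform(spectrum.reshape(1, -1))[0] - mean
    return d @ cov_inv @ d

threshold = chi2.ppf(0.999, df=5)              # false-alarm rate is a choice
new_vehicle = rng.gamma(5.0, 2.0, size=35)
print(mahalanobis_sq(new_vehicle) > threshold)
```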

  19. Modeling forest biomass and growth: Coupling long-term inventory and LiDAR data

    Treesearch

    Chad Babcock; Andrew O. Finley; Bruce D. Cook; Aaron Weiskittel; Christopher W. Woodall

    2016-01-01

    Combining spatially-explicit long-term forest inventory and remotely sensed information from Light Detection and Ranging (LiDAR) datasets through statistical models can be a powerful tool for predicting and mapping above-ground biomass (AGB) at a range of geographic scales. We present and examine a novel modeling approach to improve prediction of AGB and estimate AGB...

  20. Detecting trend on ecological river status - how to deal with short incomplete bioindicator time series? Methodological and operational issues

    NASA Astrophysics Data System (ADS)

    Cernesson, Flavie; Tournoud, Marie-George; Lalande, Nathalie

    2018-06-01

    Among the various parameters monitored in river monitoring networks, bioindicators provide very informative data. Analysing time variations in bioindicator data is tricky for water managers because the data sets are often short, irregular, and non-normally distributed. It is also a challenging methodological issue for scientists, as it is in the Saône basin (30 000 km², France), where, of the 812 IBGN (French macroinvertebrate bioindicator) monitoring stations operating between 1998 and 2010, only 71 time series had more than 10 data values and were studied here. Combining various analytical tools (three parametric and non-parametric statistical tests plus a graphical analysis), 45 IBGN time series were classified as stationary and 26 as non-stationary (only one of which showed a degradation). Series from sampling stations located within the same hydroecoregion showed similar trends, while river size classes seemed to be non-significant in explaining temporal trends. So, from a methodological point of view, combining statistical tests and graphical analysis is a relevant option when striving to improve trend detection. Moreover, it was possible to propose a way to summarise series in order to analyse links between ecological river quality indicators and land use stressors.
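
    The abstract does not name its three trend tests, but the Mann-Kendall test is a standard non-parametric choice for short, non-normal bioindicator series and serves here as an illustrative stand-in (no tie or autocorrelation correction is applied):

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Mann-Kendall trend test (no tie or autocorrelation correction)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    diffs = np.sign(x[None, :] - x[:, None])     # sign(x_j - x_i)
    s = diffs[np.triu_indices(n, k=1)].sum()     # sum over pairs j > i
    var = n * (n - 1) * (2 * n + 5) / 18.0
    z = 0.0 if s == 0 else (s - np.sign(s)) / np.sqrt(var)
    return s, 2 * norm.sf(abs(z))                # S statistic and p-value

ibgn = [12, 11, 13, 12, 14, 15, 14, 16, 15, 17]  # hypothetical short series
print(mann_kendall(ibgn))
```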

  1. Designing a risk-based surveillance program for Mycobacterium avium ssp. paratuberculosis in Norwegian dairy herds using multivariate statistical process control analysis.

    PubMed

    Whist, A C; Liland, K H; Jonsson, M E; Sæbø, S; Sviland, S; Østerås, O; Norström, M; Hopp, P

    2014-11-01

    Surveillance programs for animal diseases are critical to early disease detection and risk estimation and to documenting a population's disease status at a given time. The aim of this study was to describe a risk-based surveillance program for detecting Mycobacterium avium ssp. paratuberculosis (MAP) infection in Norwegian dairy cattle. The included risk factors for detecting MAP were purchase of cattle, combined cattle and goat farming, and location of the cattle farm in counties containing goats with MAP. The risk indicators included production data [culling of animals >3 yr of age, carcass conformation of animals >3 yr of age, milk production decrease in older lactating cows (lactations 3, 4, and 5)], and clinical data (diarrhea, enteritis, or both, in animals >3 yr of age). Except for combined cattle and goat farming and cattle farm location, all data were collected at the cow level and summarized at the herd level. Predefined risk factors and risk indicators were extracted from different national databases and combined in a multivariate statistical process control to obtain a risk assessment for each herd. The ordinary Hotelling's T² statistic was applied as a multivariate, standardized measure of difference between the current observed state and the average state of the risk factors for a given herd. To make the analysis more robust and adapt it to the slowly developing nature of MAP, monthly risk calculations were based on data accumulated during a 24-mo period. Monitoring of these variables was performed to identify outliers that may indicate deviance in one or more of the underlying processes. The highest-ranked herds were scattered all over Norway and clustered in high-density dairy cattle farm areas. The resulting rankings of herds are being used in the national surveillance program for MAP in 2014 to increase the sensitivity of the ongoing surveillance program in which 5 fecal samples for bacteriological examination are collected from 25 dairy herds. The use of multivariate statistical process control for selection of herds will be beneficial when a diagnostic test suitable for mass screening is available and validated on the Norwegian cattle population, thus making it possible to increase the number of sampled herds. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
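
    The per-herd T² computation against the average state of the herd population can be sketched as follows; the five variables and their distribution are hypothetical stand-ins for the program's accumulated 24-month risk factors and indicators:

```python
import numpy as np

def hotelling_t2(X):
    """Hotelling's T2 of each herd against the mean and covariance of the
    herd population; X is a herds x variables matrix."""
    D = X - X.mean(axis=0)
    S_inv = np.linalg.pinv(np.cov(X, rowvar=False))  # pinv guards collinearity
    return np.einsum("ij,jk,ik->i", D, S_inv, D)

rng = np.random.default_rng(0)
# Hypothetical 24-month accumulated herd variables: culling rate, carcass
# conformation, milk-yield change, diarrhea/enteritis rate, purchases.
herds = rng.normal(size=(1000, 5))
t2 = hotelling_t2(herds)
print(np.argsort(t2)[::-1][:25])   # e.g. the 25 highest-risk herds to sample
```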

  2. An empirical strategy to detect bacterial transcript structure from directional RNA-seq transcriptome data.

    PubMed

    Wang, Yejun; MacKenzie, Keith D; White, Aaron P

    2015-05-07

    As sequencing costs continue to fall, RNA-seq has gradually been adopted as the first choice for comparative transcriptome studies in bacteria. Unlike microarrays, RNA-seq can directly detect cDNA derived from mRNA transcripts at single-nucleotide resolution. Not only does this allow researchers to determine the absolute expression level of genes, but it also conveys information about transcript structure. Few automatic software tools have yet been established to investigate large-scale RNA-seq data for bacterial transcript structure analysis. In this study, 54 directional RNA-seq libraries from Salmonella serovar Typhimurium (S. Typhimurium) 14028s were examined for potential relationships between read mapping patterns and transcript structure. We developed an empirical method, combined with statistical tests, to automatically detect key transcript features, including transcriptional start sites (TSSs), transcriptional termination sites (TTSs) and operon organization. Using our method, we obtained 2,764 TSSs and 1,467 TTSs for 1,331 and 844 different genes, respectively. Identification of TSSs facilitated further discrimination of 215 putative sigma 38 regulons and 863 potential sigma 70 regulons. Combining the TSSs and TTSs with intergenic distance and co-expression information, we comprehensively annotated the operon organization in S. Typhimurium 14028s. Our results show that directional RNA-seq can be used to detect transcriptional borders at an acceptable resolution of ±10-20 nucleotides. Technical limitations of the RNA-seq procedure may prevent single-nucleotide resolution. The automatic transcript border detection methods, statistical models and operon organization pipeline that we have described could be widely applied to RNA-seq studies in other bacteria. Furthermore, the TSSs, TTSs, operons, promoters and untranslated regions that we have defined for S. Typhimurium 14028s may constitute valuable resources that can be used for comparative analyses with other Salmonella serotypes.

  3. Detection of liver metastasis: is diffusion-weighted imaging needed in Gd-EOB-DTPA-enhanced MR imaging for evaluation of colorectal liver metastases?

    PubMed

    Tajima, Taku; Akahane, Masaaki; Takao, Hidemasa; Akai, Hiroyuki; Kiryu, Shigeru; Imamura, Hiroshi; Watanabe, Yasushi; Kokudo, Norihiro; Ohtomo, Kuni

    2012-10-01

    We compared the diagnostic ability for detecting hepatic metastases between gadolinium ethoxybenzyl diethylenetriamine pentaacetic acid (Gd-EOB-DTPA)-enhanced magnetic resonance imaging (MRI) and diffusion-weighted imaging (DWI) on a 1.5-T system, and determined whether DWI is necessary in Gd-EOB-DTPA-enhanced MRI for diagnosing colorectal liver metastases. We assessed 29 consecutive prospectively enrolled patients with suspected metachronous colorectal liver metastases; all patients underwent surgery and had preoperative Gd-EOB-DTPA-enhanced MRI. The overall detection rate, sensitivity for detecting metastases and benign lesions, positive predictive value, and diagnostic accuracy (Az value) were compared among three image sets [unenhanced MRI (DWI set), Gd-EOB-DTPA-enhanced MRI excluding DWI (EOB set), and a combined set]. Gd-EOB-DTPA-enhanced MRI yielded a better overall detection rate (77.8-79.0 %) and sensitivity (87.1-89.4 %) for detecting metastases than the DWI set (55.9 % and 64.7 %, respectively) for one observer (P < 0.001). No statistically significant difference was seen between the EOB and combined sets, although several metastases were newly detected on additional DWI. Gd-EOB-DTPA-enhanced MRI yielded a better overall detection rate and higher sensitivity for detecting metastases compared with unenhanced MRI. Additional DWI may be able to reduce oversight of lesions in Gd-EOB-DTPA-enhanced 1.5-T MRI for detecting colorectal liver metastases.

  4. Markov Logic Networks in the Analysis of Genetic Data

    PubMed Central

    Sakhanenko, Nikita A.

    2010-01-01

    Complex, non-additive genetic interactions are common and can be critical in determining phenotypes. Genome-wide association studies (GWAS) and similar statistical studies of linkage data, however, assume additive models of gene interactions in looking for genotype-phenotype associations. These statistical methods view the compound effects of multiple genes on a phenotype as a sum of influences of each gene and often miss a substantial part of the heritable effect. Such methods do not use any biological knowledge about underlying mechanisms. Modeling approaches from the artificial intelligence (AI) field that incorporate deterministic knowledge into models to perform statistical analysis can be applied to include prior knowledge in genetic analysis. We chose to use the most general such approach, Markov Logic Networks (MLNs), for combining deterministic knowledge with statistical analysis. Using simple, logistic regression-type MLNs we can replicate the results of traditional statistical methods, but we also show that we are able to go beyond finding independent markers linked to a phenotype by using joint inference without an independence assumption. The method is applied to genetic data on yeast sporulation, a complex phenotype with gene interactions. In addition to detecting all of the previously identified loci associated with sporulation, our method identifies four loci with smaller effects. Since their effect on sporulation is small, these four loci were not detected with methods that do not account for dependence between markers due to gene interactions. We show how gene interactions can be detected using more complex models, which can be used as a general framework for incorporating systems biology with genetics. PMID:20958249

  5. Effectiveness of Exercise and Local Steroid Injections for the Thoracolumbar Junction Syndrome (The Maigne’s Syndrome) Treatment

    PubMed Central

    Alptekin, Kerem; Örnek, Nurettin Irem; Aydın, Tuğba; Alkan, Mirsad; Toprak, Mehmet; A. Balcı, Leyla; Öncü Alptekin, Jülide

    2017-01-01

    Purpose: Patients diagnosed with thoracolumbar junction syndrome were divided into 3 treatment groups and the results of each modality were compared. Materials and Method: 30 patients were included in the study with a definitive diagnosis of Maigne's Syndrome. The first group received exercise therapy, the second group was treated with local steroid injections, and the third group received the combination therapy of both injection and exercise. Findings: The 30 patients were divided into 3 groups of 10 patients each. The average age of the patients was 23.43 ± 3.75 years. Flattening of the lumbar lordosis was detected in 4 patients of the first group (40%), 6 patients of the second group (60%) and 4 patients of the third group (40%). The average pre- to post-treatment difference in VAS values at rest was lowest in the injection group (2.80) and highest in the combined treatment group (3.30). On the Oswestry scale, the first-month difference (16.10) and the third-month difference (22.40) in the combined treatment group were statistically better than in the other groups. Results: While all three treatment groups showed statistically significant differences in the Oswestry scale and in VAS scores at rest and during movement at the regular assessments before and after treatment, the best results were obtained in the group administered the combined injection and exercise therapy. PMID:28694884

  6. Staging of colorectal liver metastases after preoperative chemotherapy. Diffusion-weighted imaging in combination with Gd-EOB-DTPA MRI sequences increases sensitivity and diagnostic accuracy.

    PubMed

    Macera, Annalisa; Lario, Chiara; Petracchini, Massimo; Gallo, Teresa; Regge, Daniele; Floriani, Irene; Ribero, Dario; Capussotti, Lorenzo; Cirillo, Stefano

    2013-03-01

    To compare the diagnostic accuracy and sensitivity of Gd-EOB-DTPA MRI and diffusion-weighted imaging (DWI) alone and in combination for detecting colorectal liver metastases in patients who had undergone preoperative chemotherapy. Thirty-two consecutive patients with a total of 166 liver lesions were retrospectively enrolled. Of the lesions, 144 (86.8 %) were metastatic at pathology. Three image sets (1, Gd-EOB-DTPA; 2, DWI; 3, combined Gd-EOB-DTPA and DWI) were independently reviewed by two observers. Statistical analysis was performed on a per-lesion basis. Evaluation of image set 1 correctly identified 127/166 lesions (accuracy 76.5 %; 95 % CI 69.3-82.7) and 106/144 metastases (sensitivity 73.6 %, 95 % CI 65.6-80.6). Evaluation of image set 2 correctly identified 108/166 lesions (accuracy 65.1 %, 95 % CI 57.3-72.3) and 87/144 metastases (sensitivity 60.4 %, 95 % CI 51.9-68.5). Evaluation of image set 3 correctly identified 148/166 lesions (accuracy 89.2 %, 95 % CI 83.4-93.4) and 131/144 metastases (sensitivity 91 %, 95 % CI 85.1-95.1). Differences were statistically significant (P < 0.001). Notably, similar results were obtained when analysing only small lesions (<1 cm). The combination of DWI with Gd-EOB-DTPA-enhanced MRI significantly increases diagnostic accuracy and sensitivity in patients with colorectal liver metastases treated with preoperative chemotherapy, and it is particularly effective in the detection of small lesions.

  7. Combining heterogeneous features for colonic polyp detection in CTC based on semi-definite programming

    NASA Astrophysics Data System (ADS)

    Wang, Shijun; Yao, Jianhua; Petrick, Nicholas A.; Summers, Ronald M.

    2009-02-01

    Colon cancer is the second leading cause of cancer-related deaths in the United States. Computed tomographic colonography (CTC) combined with a computer-aided detection system provides a feasible combination for improving colonic polyp detection and increasing the use of CTC for colon cancer screening. To distinguish true polyps from false positives, various features extracted from polyp candidates have been proposed. Most of these features try to capture the shape information of polyp candidates or neighborhood knowledge about the surrounding structures (fold, colon wall, etc.). In this paper, we propose a new set of shape descriptors for polyp candidates based on statistical curvature information. These features, called histogram of curvature features, are rotation, translation and scale invariant and can be treated as complementing our existing feature set. Then, in order to make full use of the traditional features (defined as group A) and the new features (group B), which are highly heterogeneous, we employed a multiple kernel learning method based on semi-definite programming to identify an optimized classification kernel from the combined set of features. We performed a leave-one-patient-out test on a CTC dataset which contained scans from 50 patients (with 90 detections of 6-9 mm polyps). Experimental results show that a support vector machine (SVM) based on the combined feature set and the semi-definite optimization kernel achieved higher FROC performance compared to SVMs using the two groups of features separately. At a rate of 7 false positives per patient, the sensitivity on 6-9 mm polyps using the combined features improved from 0.78 (group A) and 0.73 (group B) to 0.82 (p ≤ 0.01).
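
    The paper learns the kernel weights by semi-definite programming; as a simple stand-in for that optimization, the sketch below scans convex combinations of two precomputed RBF kernels (one per feature group) and keeps the best by cross-validation. The feature matrices, labels, and gamma values are hypothetical:

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical candidates: group A (traditional) and group B (curvature
# histogram) features, with labels polyp / false positive.
n = 300
XA, XB = rng.normal(size=(n, 10)), rng.normal(size=(n, 24))
y = rng.integers(0, 2, size=n)

KA, KB = rbf_kernel(XA, gamma=0.1), rbf_kernel(XB, gamma=0.05)

# Scan convex kernel combinations; cross-validation slices the precomputed
# kernel matrices consistently for SVC(kernel="precomputed").
best = max(
    (cross_val_score(SVC(kernel="precomputed"), mu * KA + (1 - mu) * KB,
                     y, cv=5).mean(), mu)
    for mu in np.linspace(0.0, 1.0, 11)
)
print(f"CV accuracy {best[0]:.2f} at kernel weight mu = {best[1]:.1f}")
```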

  8. A Hybrid Approach to Detect Driver Drowsiness Utilizing Physiological Signals to Improve System Performance and Wearability.

    PubMed

    Awais, Muhammad; Badruddin, Nasreen; Drieberg, Micheal

    2017-08-31

    Driver drowsiness is a major cause of fatal accidents, injury, and property damage, and has become an area of substantial research attention in recent years. The present study proposes a method to detect drowsiness in drivers which integrates features of electrocardiography (ECG) and electroencephalography (EEG) to improve detection performance. The study measures differences between the alert and drowsy states from physiological data collected from 22 healthy subjects in a driving simulator-based study. A monotonous driving environment is used to induce drowsiness in the participants. Various time and frequency domain features were extracted from the EEG, including time domain statistical descriptors, complexity measures and power spectral measures. Features extracted from the ECG signal included heart rate (HR) and heart rate variability (HRV), including low frequency (LF), high frequency (HF) and the LF/HF ratio. Furthermore, a subjective sleepiness scale was also assessed to study its relationship with drowsiness. We used paired t-tests to select only statistically significant features (p < 0.05) that can differentiate between the alert and drowsy states effectively. Significant features of both modalities (EEG and ECG) were then combined to investigate the improvement in performance using a support vector machine (SVM) classifier. The other main contribution of this paper is the study of channel reduction and its impact on detection performance. The proposed method demonstrated that combining EEG and ECG improved the system's performance in discriminating between alert and drowsy states, compared with using them alone. Our channel reduction analysis revealed that an acceptable level of accuracy (80%) could be achieved by combining just two electrodes (one EEG and one ECG), indicating the feasibility of a system with improved wearability compared with existing systems involving many electrodes. Overall, our results demonstrate that the proposed method can be a viable solution for a practical driver drowsiness system that is both accurate and comfortable to wear.
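
    A minimal sketch of the feature-selection-plus-fusion step: paired t-tests keep the significant EEG and ECG features, which are then pooled into one SVM. The simulated feature matrices are hypothetical, and in a rigorous evaluation the selection would be nested inside the cross-validation folds:

```python
import numpy as np
from scipy import stats
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical per-epoch features: columns 0-9 EEG (band powers, complexity),
# columns 10-13 ECG (HR, LF, HF, LF/HF); one row per matched epoch.
n = 200
alert = rng.normal(0.0, 1.0, size=(n, 14))
drowsy = rng.normal(0.3, 1.0, size=(n, 14))

# Keep features whose alert/drowsy difference is significant (p < 0.05).
pvals = stats.ttest_rel(alert, drowsy, axis=0).pvalue
keep = pvals < 0.05

X = np.vstack([alert, drowsy])[:, keep]
y = np.r_[np.zeros(n), np.ones(n)]
acc = cross_val_score(SVC(kernel="rbf"), X, y, cv=5).mean()
print(f"{keep.sum()} features kept, CV accuracy ~ {acc:.2f}")
```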

  9. A Hybrid Approach to Detect Driver Drowsiness Utilizing Physiological Signals to Improve System Performance and Wearability

    PubMed Central

    Awais, Muhammad; Badruddin, Nasreen; Drieberg, Micheal

    2017-01-01

    Driver drowsiness is a major cause of fatal accidents, injury, and property damage, and has become an area of substantial research attention in recent years. The present study proposes a method to detect drowsiness in drivers which integrates features of electrocardiography (ECG) and electroencephalography (EEG) to improve detection performance. The study measures differences between the alert and drowsy states from physiological data collected from 22 healthy subjects in a driving simulator-based study. A monotonous driving environment is used to induce drowsiness in the participants. Various time and frequency domain features were extracted from the EEG, including time domain statistical descriptors, complexity measures and power spectral measures. Features extracted from the ECG signal included heart rate (HR) and heart rate variability (HRV), including low frequency (LF), high frequency (HF) and the LF/HF ratio. Furthermore, a subjective sleepiness scale was also assessed to study its relationship with drowsiness. We used paired t-tests to select only statistically significant features (p < 0.05) that can differentiate between the alert and drowsy states effectively. Significant features of both modalities (EEG and ECG) were then combined to investigate the improvement in performance using a support vector machine (SVM) classifier. The other main contribution of this paper is the study of channel reduction and its impact on detection performance. The proposed method demonstrated that combining EEG and ECG improved the system’s performance in discriminating between alert and drowsy states, compared with using them alone. Our channel reduction analysis revealed that an acceptable level of accuracy (80%) could be achieved by combining just two electrodes (one EEG and one ECG), indicating the feasibility of a system with improved wearability compared with existing systems involving many electrodes. Overall, our results demonstrate that the proposed method can be a viable solution for a practical driver drowsiness system that is both accurate and comfortable to wear. PMID:28858220

  10. Identification of Major Histocompatibility Complex-Regulated Body Odorants by Statistical Analysis of a Comparative Gas Chromatography/Mass Spectrometry Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willse, Alan R.; Belcher, Ann; Preti, George

    2005-04-15

    Gas chromatography (GC), combined with mass spectrometry (MS) detection, is a powerful analytical technique that can be used to separate, quantify, and identify volatile compounds in complex mixtures. This paper examines the application of GC-MS in a comparative experiment to identify volatiles that differ in concentration between two groups. A complex mixture might comprise several hundred or even thousands of volatile compounds. Because their number and location in a chromatogram generally are unknown, and because components overlap in populous chromatograms, the statistical problems offer significant challenges beyond traditional two-group screening procedures. We describe a statistical procedure to compare two-dimensional GC-MS profiles between groups, which entails (1) signal processing: baseline correction and peak detection in single ion chromatograms; (2) aligning chromatograms in time; (3) normalizing differences in overall signal intensities; and (4) detecting chromatographic regions that differ between groups. Compared to existing approaches, the proposed method is robust to errors made at earlier stages of analysis, such as missed peaks or slightly misaligned chromatograms. To illustrate the method, we identify differences in GC-MS chromatograms of ether-extracted urine collected from two nearly identical inbred groups of mice, to investigate the relationship between odor and genetics of the major histocompatibility complex.

  11. A new statistical PCA-ICA algorithm for location of R-peaks in ECG.

    PubMed

    Chawla, M P S; Verma, H K; Kumar, Vinod

    2008-09-16

    The success of ICA in separating independent components from a mixture depends on the properties of the electrocardiogram (ECG) recordings. This paper discusses some of the conditions of independent component analysis (ICA) that could affect the reliability of the separation, and evaluates issues related to the properties of the signals and the number of sources. Principal component analysis (PCA) scatter plots are plotted to indicate the diagnostic features in the presence and absence of base-line wander in interpreting the ECG signals. In this analysis, a newly developed statistical algorithm by the authors, based on the use of combined PCA-ICA for two correlated channels of 12-channel ECG data, is proposed. The ICA technique has been successfully implemented in identifying and removing noise and artifacts from ECG signals. Cleaned ECG signals are obtained using statistical measures like kurtosis and variance of variance after ICA processing. This paper also deals with the detection of QRS complexes in electrocardiograms using the combined PCA-ICA algorithm. The efficacy of the combined PCA-ICA algorithm lies in the fact that the location of the R-peaks is bounded from above and below by the location of the cross-over points, hence none of the peaks are ignored or missed.
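    As an illustration of the combined PCA-ICA idea (not the authors' exact algorithm), the sketch below whitens two correlated synthetic channels with PCA, unmixes them with FastICA, and uses kurtosis, one of the statistical measures named above, to pick out the spiky QRS-like source; the signal model and threshold are invented.

        import numpy as np
        from scipy.stats import kurtosis
        from sklearn.decomposition import FastICA, PCA

        rng = np.random.default_rng(1)
        t = np.linspace(0, 10, 5000)
        qrs = np.sin(2 * np.pi * 1.2 * t) ** 65           # spiky, QRS-like train
        ch1 = qrs + 0.3 * rng.normal(size=t.size)
        ch2 = 0.8 * qrs + 0.3 * rng.normal(size=t.size)   # correlated channel
        X = np.c_[ch1, ch2]

        Xw = PCA(whiten=True).fit_transform(X)            # decorrelate / whiten
        S = FastICA(n_components=2, random_state=1).fit_transform(Xw)

        # Kurtosis singles out the super-Gaussian (spiky) cardiac component,
        # and a simple amplitude threshold marks candidate R-peak samples.
        src = S[:, int(np.argmax(kurtosis(S, axis=0)))]
        r_peaks = np.flatnonzero(np.abs(src) > 4 * src.std())
        print(len(r_peaks), "candidate R-peak samples")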

  12. Chemical entity recognition in patents by combining dictionary-based and statistical approaches

    PubMed Central

    Akhondi, Saber A.; Pons, Ewoud; Afzal, Zubair; van Haagen, Herman; Becker, Benedikt F.H.; Hettne, Kristina M.; van Mulligen, Erik M.; Kors, Jan A.

    2016-01-01

    We describe the development of a chemical entity recognition system and its application in the CHEMDNER-patent track of BioCreative 2015. This community challenge includes a Chemical Entity Mention in Patents (CEMP) recognition task and a Chemical Passage Detection (CPD) classification task. We addressed both tasks by an ensemble system that combines a dictionary-based approach with a statistical one. For this purpose the performance of several lexical resources was assessed using Peregrine, our open-source indexing engine. We combined our dictionary-based results on the patent corpus with the results of tmChem, a chemical recognizer using a conditional random field classifier. To improve the performance of tmChem, we utilized three additional features, viz. part-of-speech tags, lemmas and word-vector clusters. When evaluated on the training data, our final system obtained an F-score of 85.21% for the CEMP task, and an accuracy of 91.53% for the CPD task. On the test set, the best system ranked sixth among 21 teams for CEMP with an F-score of 86.82%, and second among nine teams for CPD with an accuracy of 94.23%. The differences in performance between the best ensemble system and the statistical system separately were small. Database URL: http://biosemantics.org/chemdner-patents PMID:27141091

  13. Calculating Confidence, Uncertainty, and Numbers of Samples When Using Statistical Sampling Approaches to Characterize and Clear Contaminated Areas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piepel, Gregory F.; Matzke, Brett D.; Sego, Landon H.

    2013-04-27

    This report discusses the methodology, formulas, and inputs needed to make characterization and clearance decisions for Bacillus anthracis-contaminated and uncontaminated (or decontaminated) areas using a statistical sampling approach. Specifically, the report includes the methods and formulas for calculating the • number of samples required to achieve a specified confidence in characterization and clearance decisions • confidence in making characterization and clearance decisions for a specified number of samples for two common statistically based environmental sampling approaches. In particular, the report addresses an issue raised by the Government Accountability Office by providing methods and formulas to calculate the confidence that a decision area is uncontaminated (or successfully decontaminated) if all samples collected according to a statistical sampling approach have negative results. Key to addressing this topic is the probability that an individual sample result is a false negative, which is commonly referred to as the false negative rate (FNR). The two statistical sampling approaches currently discussed in this report are 1) hotspot sampling to detect small isolated contaminated locations during the characterization phase, and 2) combined judgment and random (CJR) sampling during the clearance phase. Typically, if contamination is widely distributed in a decision area, it will be detectable via judgment sampling during the characterization phase. Hotspot sampling is appropriate for characterization situations where contamination is not widely distributed and may not be detected by judgment sampling. CJR sampling is appropriate during the clearance phase when it is desired to augment judgment samples with statistical (random) samples. The hotspot and CJR statistical sampling approaches are discussed in the report for four situations: 1. qualitative data (detect and non-detect) when the FNR = 0 or when using statistical sampling methods that account for FNR > 0; 2. qualitative data when the FNR > 0 but statistical sampling methods are used that assume the FNR = 0; 3. quantitative data (e.g., contaminant concentrations expressed as CFU/cm2) when the FNR = 0 or when using statistical sampling methods that account for FNR > 0; 4. quantitative data when the FNR > 0 but statistical sampling methods are used that assume the FNR = 0. For Situation 2, the hotspot sampling approach provides for stating with Z% confidence that a hotspot of specified shape and size with detectable contamination will be found. Also for Situation 2, the CJR approach provides for stating with X% confidence that at least Y% of the decision area does not contain detectable contamination. Forms of these statements for the other three situations are discussed in Section 2.2. Statistical methods that account for FNR > 0 currently only exist for the hotspot sampling approach with qualitative data (or quantitative data converted to qualitative data). This report documents the current status of methods and formulas for the hotspot and CJR sampling approaches. Limitations of these methods are identified. Extensions of the methods that are applicable when FNR = 0 to account for FNR > 0, or to address other limitations, will be documented in future revisions of this report if future funding supports the development of such extensions. For quantitative data, this report also presents statistical methods and formulas for 1. quantifying the uncertainty in measured sample results; 2. estimating the true surface concentration corresponding to a surface sample; 3. quantifying the uncertainty of the estimate of the true surface concentration. All of the methods and formulas discussed in the report were applied to example situations to illustrate application of the methods and interpretation of the results.
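    The CJR clearance statement above has a familiar special case that is easy to compute. Assuming FNR = 0 and simple random sampling (a simplification, not the report's full CJR formulas), the number n of all-negative samples needed to claim with confidence C that at least a fraction y of the area is uncontaminated follows from y**n <= 1 - C:

        import math

        def samples_required(confidence: float, clean_fraction: float) -> int:
            """Smallest n of all-negative random samples supporting the claim
            'with `confidence`, at least `clean_fraction` of the area is clean'."""
            return math.ceil(math.log(1.0 - confidence) / math.log(clean_fraction))

        print(samples_required(0.95, 0.99))   # -> 299
        print(samples_required(0.95, 0.90))   # -> 29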

  14. Detection of Dendritic Spines Using Wavelet Packet Entropy and Fuzzy Support Vector Machine.

    PubMed

    Wang, Shuihua; Li, Yang; Shao, Ying; Cattani, Carlo; Zhang, Yudong; Du, Sidan

    2017-01-01

    The morphology of dendritic spines is highly correlated with neuron function, so studying it is of direct benefit to dendritic spine research. However, manually labeling spine types for statistical analysis is laborious. In this work, we proposed an approach based on the combination of wavelet contour analysis for the backbone detection, wavelet packet entropy, and a fuzzy support vector machine for the spine classification. The experiments show that this approach is promising. The average detection accuracy achieves 97.3% for "MushRoom", 94.6% for "Stubby", and 97.2% for "Thin". Copyright© Bentham Science Publishers.
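    A minimal sketch of the wavelet-packet-entropy feature named above, assuming the PyWavelets package; the 1-D profile is a stand-in for whatever contour signal is extracted from a spine image, and the wavelet family and decomposition depth are arbitrary illustrative choices.

        import numpy as np
        import pywt

        def wavelet_packet_entropy(signal, wavelet="db4", level=3):
            """Shannon entropy of the normalized subband energies at `level`."""
            wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
            energy = np.array([np.sum(node.data ** 2)
                               for node in wp.get_level(level, order="freq")])
            p = energy / energy.sum()
            p = p[p > 0]
            return float(-np.sum(p * np.log2(p)))

        profile = np.random.default_rng(2).normal(size=256)  # stand-in profile
        print(wavelet_packet_entropy(profile))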

  15. Host galaxy identification for binary black hole mergers with long baseline gravitational wave detectors

    NASA Astrophysics Data System (ADS)

    Howell, E. J.; Chan, M. L.; Chu, Q.; Jones, D. H.; Heng, I. S.; Lee, H.-M.; Blair, D.; Degallaix, J.; Regimbau, T.; Miao, H.; Zhao, C.; Hendry, M.; Coward, D.; Messenger, C.; Ju, L.; Zhu, Z.-H.

    2018-03-01

    The detection of black hole binary coalescence events by Advanced LIGO allows the science benefits of future detectors to be evaluated. In this paper, we report the science benefits of one or two 8 km arm length detectors based on the doubling of key parameters in an Advanced LIGO-type detector, combined with realizable enhancements. It is shown that the total detection rate for sources similar to those already detected would increase to ~10^3-10^5 per year. Within 0.4 Gpc, we find that around 10 of these events would be localizable to within ~10^-1 deg^2. This is sufficient to make unique associations, or to rule out a direct association, with the brightest galaxies in optical surveys (at r-band magnitudes of 17 or above), or, for deeper limits (down to r-band magnitudes of 20), to yield statistically significant associations. The combination of angular resolution and event rate would benefit precision testing of formation models, cosmic evolution, and cosmological studies.

  16. Root Cause Analysis of Quality Defects Using HPLC-MS Fingerprint Knowledgebase for Batch-to-batch Quality Control of Herbal Drugs.

    PubMed

    Yan, Binjun; Fang, Zhonghua; Shen, Lijuan; Qu, Haibin

    2015-01-01

    The batch-to-batch quality consistency of herbal drugs has always been an important issue. This study proposes a methodology for batch-to-batch quality control based on HPLC-MS fingerprints and a process knowledgebase. The extraction process of Compound E-jiao Oral Liquid was taken as a case study. After establishing the HPLC-MS fingerprint analysis method, the fingerprints of the extract solutions produced under normal and abnormal operation conditions were obtained. Multivariate statistical models were built for fault detection, and a discriminant analysis model was built using the probabilistic discriminant partial-least-squares method for fault diagnosis. Based on multivariate statistical analysis, process knowledge was acquired and the cause-effect relationship between process deviations and quality defects was revealed. The quality defects were detected successfully by multivariate statistical control charts, and the types of process deviations were diagnosed correctly by discriminant analysis. This work has demonstrated the benefits of combining HPLC-MS fingerprints, process knowledge and multivariate analysis for the quality control of herbal drugs. Copyright © 2015 John Wiley & Sons, Ltd.

  17. Using Peptide-Level Proteomics Data for Detecting Differentially Expressed Proteins.

    PubMed

    Suomi, Tomi; Corthals, Garry L; Nevalainen, Olli S; Elo, Laura L

    2015-11-06

    The expression of proteins can be quantified in a high-throughput manner using different types of mass spectrometers. In recent years, label-free methods for determining protein abundance have emerged. Although the expression is initially measured at the peptide level, a common approach is to combine the peptide-level measurements into protein-level values before differential expression analysis. However, this simple combination is prone to inconsistencies between peptides and may lose valuable information. To this end, we introduce here a method for detecting differentially expressed proteins by combining peptide-level expression-change statistics. Using controlled spike-in experiments, we show that the approach of averaging peptide-level expression changes yields more accurate lists of differentially expressed proteins than does the conventional protein-level approach. This is particularly true when there are only a few replicate samples or the differences between the sample groups are small. The proposed technique is implemented in the Bioconductor package PECA, and it can be downloaded from http://www.bioconductor.org.
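    A rough sketch of the peptide-level idea on synthetic data: a change statistic is computed per peptide and then summarized per protein by its median, rather than rolling intensities up to protein level before testing. The significance calibration of the median statistic, which the PECA package provides, is omitted here.

        import numpy as np
        import pandas as pd
        from scipy.stats import ttest_ind

        rng = np.random.default_rng(3)
        n_prot, peps_per_prot = 50, 4
        df = pd.DataFrame({"protein": np.repeat([f"P{i}" for i in range(n_prot)],
                                                peps_per_prot)})
        for r in range(3):
            df[f"a{r}"] = rng.normal(size=len(df))       # group A replicates
            df[f"b{r}"] = rng.normal(size=len(df))       # group B replicates
        df.loc[df.protein == "P0", ["b0", "b1", "b2"]] += 1.0   # spiked protein

        # Per-peptide t-statistics, then the median per protein as its score.
        t_stat, _ = ttest_ind(df[["b0", "b1", "b2"]].to_numpy(),
                              df[["a0", "a1", "a2"]].to_numpy(), axis=1)
        score = pd.Series(t_stat, index=df.protein).groupby(level=0).median()
        print(score.sort_values().tail(3))               # P0 should rank on top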

  18. First trimester PAPP-A in the detection of non-Down syndrome aneuploidy.

    PubMed

    Ochshorn, Y; Kupferminc, M J; Wolman, I; Orr-Urtreger, A; Jaffa, A J; Yaron, Y

    2001-07-01

    Combined first trimester screening using pregnancy associated plasma protein-A (PAPP-A), free beta-human chorionic gonadotrophin, and nuchal translucency (NT) is currently accepted as probably the best combination for the detection of Down syndrome (DS). Current first trimester algorithms provide computed risks only for DS. However, low PAPP-A is also associated with other chromosome anomalies such as trisomy 13, 18, and sex chromosome aneuploidy. Thus, using currently available algorithms, some chromosome anomalies may not be detected. The purpose of the present study was to establish a low-end cut-off value for PAPP-A that would increase the detection rates for non-DS chromosome anomalies. The study included 1408 patients who underwent combined first trimester screening. To determine a low-end cut-off value for PAPP-A, a Receiver Operating Characteristic (ROC) curve analysis was performed. In the entire study group there were 18 cases of chromosome anomalies (trisomy 21, 13, 18, sex chromosome anomalies), 14 of which were among screen-positive patients, a detection rate of 77.7% for all chromosome anomalies (95% CI: 55.7-99.7%). ROC curve analysis detected a statistically significant cut-off for PAPP-A at 0.25 MoM. If the definition of screen-positive were to also include patients with PAPP-A < 0.25 MoM, the detection rate would increase to 88.8% for all chromosome anomalies (95% CI: 71.6-106%). This low cut-off value may be used until specific algorithms are implemented for non-Down syndrome aneuploidy. Copyright 2001 John Wiley & Sons, Ltd.
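    For illustration, a low-end cut-off of the kind described can be read off an ROC curve; the sketch below uses synthetic MoM values (the study's data are not reproduced) and the Youden index as the optimality criterion, which is one common choice rather than necessarily the authors'.

        import numpy as np
        from sklearn.metrics import roc_curve

        rng = np.random.default_rng(4)
        papp_a = np.r_[rng.lognormal(0.0, 0.4, 1390),    # unaffected, ~1.0 MoM
                       rng.lognormal(-1.2, 0.5, 18)]     # aneuploid, low MoM
        affected = np.r_[np.zeros(1390), np.ones(18)]

        # Low PAPP-A indicates risk, so score by the negated marker value.
        fpr, tpr, thr = roc_curve(affected, -papp_a)
        cutoff = -thr[np.argmax(tpr - fpr)]              # Youden-index optimum
        print(f"suggested cut-off ~ {cutoff:.2f} MoM")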

  19. Temporal asymmetry in precipitation time series and its influence on flow simulations in combined sewer systems

    NASA Astrophysics Data System (ADS)

    Müller, Thomas; Schütze, Manfred; Bárdossy, András

    2017-09-01

    A property of natural processes is temporal irreversibility. However, this property cannot be reflected by most statistics used to describe precipitation time series and, consequently, is not considered in most precipitation models. In this paper, a new statistic, the asymmetry measure, is introduced and applied to precipitation, enabling irreversibility to be detected and quantified. It is used to analyze two different data sets from Singapore and Germany. The data of both locations show a significant asymmetry at high temporal resolutions. The asymmetry is more pronounced for Singapore, where the climate is dominated by convective precipitation events. The impact of irreversibility on applications is analyzed with two different hydrological sewer system models. The results show that the effect of the irreversibility can lead to biases in combined sewer overflow statistics. This bias is of the same order as the effect that can be achieved by real time control of sewer systems. Consequently, wrong conclusions can be drawn if synthetic time series are used for sewer systems when asymmetry is present but not considered in precipitation modeling.
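    The asymmetry measure itself is defined in the paper; as a stand-in, the sketch below computes one standard irreversibility diagnostic, the skewness of lagged increments, which vanishes for a time-reversible series and is strongly negative for the slow-rise/fast-fall pattern typical of event-driven hydrological signals.

        import numpy as np

        def increment_skewness(x, lag=1):
            """Skewness of lagged increments; zero for reversible series."""
            d = x[lag:] - x[:-lag]
            d = d - d.mean()
            return float(np.mean(d ** 3) / np.mean(d ** 2) ** 1.5)

        t = np.arange(10_000)
        sawtooth = (t % 20) / 20.0                     # slow rise, abrupt fall
        noise = np.random.default_rng(5).normal(size=t.size)  # reversible
        print(increment_skewness(sawtooth), increment_skewness(noise))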

  20. Fault detection and diagnosis using neural network approaches

    NASA Technical Reports Server (NTRS)

    Kramer, Mark A.

    1992-01-01

    Neural networks can be used to detect and identify abnormalities in real-time process data. Two basic approaches can be used, the first based on training networks using data representing both normal and abnormal modes of process behavior, and the second based on statistical characterization of the normal mode only. Given data representative of process faults, radial basis function networks can effectively identify failures. This approach is often limited by the lack of fault data, but can be facilitated by process simulation. The second approach employs elliptical and radial basis function neural networks and other models to learn the statistical distributions of process observables under normal conditions. Analytical models of failure modes can then be applied in combination with the neural network models to identify faults. Special methods can be applied to compensate for sensor failures, to produce real-time estimation of missing or failed sensors based on the correlations codified in the neural network.

  1. Automatic detection of slight parameter changes associated to complex biomedical signals using multiresolution q-entropy.

    PubMed

    Torres, M E; Añino, M M; Schlotthauer, G

    2003-12-01

    It is well known that, from a dynamical point of view, sudden variations in the physiological parameters which govern certain diseases can cause qualitative changes in the dynamics of the corresponding physiological process. The purpose of this paper is to introduce a technique that allows the automated temporal localization of slight changes in a parameter of the law that governs the nonlinear dynamics of a given signal. This tool inherits from the multiresolution entropies the ability to show these changes as statistical variations at each scale. These variations are captured in the corresponding principal component. Appropriately combining these techniques with a statistical change detector, a complexity change detection algorithm is obtained. The relevance of the approach, together with its robustness in the presence of moderate noise, is discussed in numerical simulations, and the automatic detector is applied to real and simulated biological signals.

  2. Quantum correlation enhanced super-resolution localization microscopy enabled by a fibre bundle camera

    PubMed Central

    Israel, Yonatan; Tenne, Ron; Oron, Dan; Silberberg, Yaron

    2017-01-01

    Despite advances in low-light-level detection, single-photon methods such as photon correlation have rarely been used in the context of imaging. The few demonstrations, for example of subdiffraction-limited imaging utilizing quantum statistics of photons, have remained in the realm of proof-of-principle demonstrations. This is primarily due to a combination of low values of fill factors, quantum efficiencies, frame rates and signal-to-noise characteristic of most available single-photon sensitive imaging detectors. Here we describe an imaging device based on a fibre bundle coupled to single-photon avalanche detectors that combines a large fill factor, a high quantum efficiency, a low noise and scalable architecture. Our device enables localization-based super-resolution microscopy in a non-sparse non-stationary scene, utilizing information on the number of active emitters, as gathered from non-classical photon statistics. PMID:28287167

  3. Change detection with heterogeneous data using ecoregional stratification, statistical summaries and a land allocation algorithm

    Treesearch

    Kathleen M. Bergen; Daniel G. Brown; James F. Rutherford; Eric J. Gustafson

    2005-01-01

    A ca. 1980 national-scale land-cover classification based on aerial photo interpretation was combined with 2000 AVHRR satellite imagery to derive land cover and land-cover change information for forest, urban, and agriculture categories over a seven-state region in the U.S. To derive useful land-cover change data using a heterogeneous dataset and to validate our...

  4. Maritime Threat Detection Using Probabilistic Graphical Models

    DTIC Science & Technology

    2012-01-01

    A CRF, unlike an HMM, can represent local features and does not require feature concatenation. MLNs: For MLNs, we used Alchemy (Alchemy 2011), an... open source statistical relational learning and probabilistic inferencing package. Alchemy supports generative and discriminative weight learning, and... that Alchemy creates a new formula for every possible combination of the values for a1 and a2 that fit the type specified in their predicate

  5. SIG-VISA: Signal-based Vertically Integrated Seismic Monitoring

    NASA Astrophysics Data System (ADS)

    Moore, D.; Mayeda, K. M.; Myers, S. C.; Russell, S.

    2013-12-01

    Traditional seismic monitoring systems rely on discrete detections produced by station processing software; however, while such detections may constitute a useful summary of station activity, they discard large amounts of information present in the original recorded signal. We present SIG-VISA (Signal-based Vertically Integrated Seismic Analysis), a system for seismic monitoring through Bayesian inference on seismic signals. By directly modeling the recorded signal, our approach incorporates additional information unavailable to detection-based methods, enabling higher sensitivity and more accurate localization using techniques such as waveform matching. SIG-VISA's Bayesian forward model of seismic signal envelopes includes physically-derived models of travel times and source characteristics as well as Gaussian process (kriging) statistical models of signal properties that combine interpolation of historical data with extrapolation of learned physical trends. Applying Bayesian inference, we evaluate the model on earthquakes as well as the 2009 DPRK test event, demonstrating a waveform matching effect as part of the probabilistic inference, along with results on event localization and sensitivity. In particular, we demonstrate increased sensitivity from signal-based modeling, in which the SIG-VISA signal model finds statistical evidence for arrivals even at stations for which the IMS station processing failed to register any detection.

  6. Bounds on the minimum number of recombination events in a sample history.

    PubMed Central

    Myers, Simon R; Griffiths, Robert C

    2003-01-01

    Recombination is an important evolutionary factor in many organisms, including humans, and understanding its effects is an important task facing geneticists. Detecting past recombination events is thus important; this article introduces statistics that give a lower bound on the number of recombination events in the history of a sample, on the basis of the patterns of variation in the sample DNA. Such lower bounds are appropriate, since many recombination events in the history are typically undetectable, so the true number of historical recombinations is unobtainable. The statistics can be calculated quickly by computer and improve upon the earlier bound of Hudson and Kaplan 1985. A method is developed to combine bounds on local regions in the data to produce more powerful improved bounds. The method is flexible to different models of recombination occurrence. The approach gives recombination event bounds between all pairs of sites, to help identify regions with more detectable recombinations, and these bounds can be viewed graphically. Under coalescent simulations, there is a substantial improvement over the earlier method (of up to a factor of 2) in the expected number of recombination events detected by one of the new minima, across a wide range of parameter values. The method is applied to data from a region within the lipoprotein lipase gene and the amount of detected recombination is substantially increased. Further, there is strong clustering of detected recombination events in an area near the center of the region. A program implementing these statistics, which was used for this article, is available from http://www.stats.ox.ac.uk/mathgen/programs.html. PMID:12586723
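    The Hudson and Kaplan (1985) bound that these statistics improve upon is compact enough to sketch: the four-gamete test flags site pairs that cannot be explained without a crossover between them, and a greedy scan counts disjoint flagged intervals. A minimal version for 0/1 haplotype matrices (the example matrix is invented):

        import numpy as np

        def hudson_kaplan_rm(haplotypes):
            """Lower bound Rm for a (sequences x sites) 0/1 haplotype matrix."""
            m = haplotypes.shape[1]
            intervals = []
            for i in range(m - 1):
                for j in range(i + 1, m):
                    gametes = {tuple(row) for row in haplotypes[:, [i, j]]}
                    if len(gametes) == 4:          # 00, 01, 10 and 11 all seen
                        intervals.append((i, j))
            rm, last_right = 0, -1
            for left, right in sorted(intervals, key=lambda iv: iv[1]):
                if left >= last_right:             # disjoint from previous pick
                    rm += 1
                    last_right = right
            return rm

        haps = np.array([[0, 0, 0, 1],
                         [0, 1, 1, 0],
                         [1, 0, 1, 0],
                         [1, 1, 0, 1]])
        print(hudson_kaplan_rm(haps))              # -> 2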

  7. Change Detection of High-Resolution Remote Sensing Images Based on Adaptive Fusion of Multiple Features

    NASA Astrophysics Data System (ADS)

    Wang, G. H.; Wang, H. B.; Fan, W. F.; Liu, Y.; Chen, C.

    2018-04-01

    Traditional change detection algorithms depend mainly on the spectral information of image patches and fail to effectively mine and fuse the complementary strengths of multiple image features. Borrowing ideas from object-oriented analysis, this article proposes a change detection algorithm for remote sensing images based on the adaptive fusion of multiple features. First, image objects are obtained by multi-scale segmentation; then a color histogram and a linear gradient histogram are calculated for each object. The EMD statistical operator is used to measure the color distance and the edge straight-line feature distance between corresponding objects from different periods, and object heterogeneity is constructed as an adaptively weighted combination of the color feature distance and the edge straight-line distance. Finally, curvature histogram analysis of the image patches yields the change detection results. The experimental results show that the method can fully fuse the color and edge line features, thus improving the accuracy of the change detection.

  8. Detecting Biosphere anomalies hotspots

    NASA Astrophysics Data System (ADS)

    Guanche-Garcia, Yanira; Mahecha, Miguel; Flach, Milan; Denzler, Joachim

    2017-04-01

    The current amount of satellite remote sensing measurements available allows data-driven methods to be applied to investigate environmental processes. The detection of anomalies or abnormal events is crucial to monitor the Earth system and to analyze their impacts on ecosystems and society. By means of a combination of statistical methods, this study proposes an intuitive and efficient methodology to detect those areas that present hotspots of anomalies, i.e. higher levels of abnormal or extreme events, or more severe phases, during our historical records. Biosphere variables from a preliminary version of the Earth System Data Cube developed within the CAB-LAB project (http://earthsystemdatacube.net/) have been used in this study. This database comprises several atmosphere and biosphere variables spanning 11 years (2001-2011) with 8-day temporal resolution and 0.25° global spatial resolution. In this study, we have used 10 variables that measure the biosphere. The methodology applied to detect abnormal events follows the intuitive idea that anomalies are assumed to be time steps that are not well represented by a previously estimated statistical model [1]. We combine the use of Autoregressive Moving Average (ARMA) models with a distance metric like the Mahalanobis distance to detect abnormal events in multiple biosphere variables. In a first step we pre-treat the variables by removing the seasonality and normalizing them locally (μ=0, σ=1). Additionally, we have regionalized the area of study into subregions of similar climate conditions by using the Köppen climate classification. For each climate region and variable we have selected the best ARMA parameters by means of a Bayesian criterion. Then we have obtained the residuals by comparing the fitted models with the original data. To detect the extreme residuals across the 10 variables, we have computed the Mahalanobis distance to the data's mean (Hotelling's T^2), which considers the covariance matrix of the joint distribution. The proposed methodology has been applied to different areas around the globe. The results show that the method is able to detect historic events and also provides a useful tool to define sensitive regions. This method and these results have been developed within the framework of the project BACI (http://baci-h2020.eu/), which aims to integrate Earth Observation data to monitor the Earth system and to assess the impacts of terrestrial changes. [1] V. Chandola, A. Banerjee and V. Kumar. Anomaly detection: a survey. ACM Computing Surveys (CSUR), vol. 41, no. 3, 2009. [2] P. Mahalanobis. On the generalised distance in statistics. Proceedings National Institute of Science, vol. 2, pp. 49-55, 1936.
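    A condensed sketch of the residual-based scoring described above, assuming statsmodels: each (already deseasonalized and standardized) variable gets an ARMA fit, and time steps are scored by the Mahalanobis distance of the joint residual vector. The ARMA(1,1) order is fixed here rather than selected by an information criterion, and the data are synthetic.

        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        rng = np.random.default_rng(6)
        data = rng.normal(size=(500, 3))          # stand-in biosphere variables
        data[400:410] += 3.0                      # injected anomalous episode

        # ARMA(1,1) per variable via ARIMA with d = 0; collect the residuals.
        resid = np.column_stack([ARIMA(data[:, k], order=(1, 0, 1)).fit().resid
                                 for k in range(data.shape[1])])

        mu = resid.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(resid, rowvar=False))
        d2 = np.einsum("ij,jk,ik->i", resid - mu, cov_inv, resid - mu)
        print(np.argsort(d2)[-5:])                # most anomalous time steps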

  9. Fixed forced detection for fast SPECT Monte-Carlo simulation

    NASA Astrophysics Data System (ADS)

    Cajgfinger, T.; Rit, S.; Létang, J. M.; Halty, A.; Sarrut, D.

    2018-03-01

    Monte-Carlo simulations of SPECT images are notoriously slow to converge due to the large ratio between the number of photons emitted and detected in the collimator. This work proposes a method to accelerate the simulations based on fixed forced detection (FFD) combined with an analytical response of the detector. FFD is based on a Monte-Carlo simulation but forces the detection of a photon in each detector pixel weighted by the probability of emission (or scattering) and transmission to this pixel. The method was evaluated with numerical phantoms and on patient images. We obtained differences with analog Monte Carlo lower than the statistical uncertainty. The overall computing time gain can reach up to five orders of magnitude. Source code and examples are available in the Gate V8.0 release.

  10. Fixed forced detection for fast SPECT Monte-Carlo simulation.

    PubMed

    Cajgfinger, T; Rit, S; Létang, J M; Halty, A; Sarrut, D

    2018-03-02

    Monte-Carlo simulations of SPECT images are notoriously slow to converge due to the large ratio between the number of photons emitted and detected in the collimator. This work proposes a method to accelerate the simulations based on fixed forced detection (FFD) combined with an analytical response of the detector. FFD is based on a Monte-Carlo simulation but forces the detection of a photon in each detector pixel weighted by the probability of emission (or scattering) and transmission to this pixel. The method was evaluated with numerical phantoms and on patient images. We obtained differences with analog Monte Carlo lower than the statistical uncertainty. The overall computing time gain can reach up to five orders of magnitude. Source code and examples are available in the Gate V8.0 release.

  11. [Do different interpretative methods used for evaluation of checkerboard synergy test affect the results?].

    PubMed

    Ozseven, Ayşe Gül; Sesli Çetin, Emel; Ozseven, Levent

    2012-07-01

    In recent years, owing to the presence of multi-drug resistant nosocomial bacteria, combination therapies are more frequently applied. Thus there is more need to investigate the in vitro activity of drug combinations against multi-drug resistant bacteria. Checkerboard synergy testing is among the most widely used standard techniques to determine the activity of antibiotic combinations. It is based on microdilution susceptibility testing of antibiotic combinations. Although this test has a standardised procedure, there are many different methods for interpreting the results. In many previous studies carried out with multi-drug resistant bacteria, different rates of synergy have been reported with various antibiotic combinations using the checkerboard technique. These differences might be attributed to the different features of the strains. However, different synergy rates detected by the checkerboard method have also been reported in other studies using the same drug combinations and same types of bacteria. It was thought that these differences in synergy rates might be due to the different methods used to interpret the synergy test results. In recent years, multi-drug resistant Acinetobacter baumannii has been the most commonly encountered nosocomial pathogen, especially in intensive-care units. For this reason, multi-drug resistant A.baumannii has been the subject of a considerable amount of research on antimicrobial combinations. In the present study, the in vitro activities of combinations frequently preferred for A.baumannii infections, namely imipenem plus ampicillin/sulbactam and meropenem plus ampicillin/sulbactam, were tested by the checkerboard synergy method against 34 multi-drug resistant A.baumannii isolates. Minimum inhibitory concentration (MIC) values for imipenem, meropenem and ampicillin/sulbactam were determined by the broth microdilution method. Subsequently, the activity of the two combinations was tested over the dilution range of 4 x MIC to 0.03 x MIC in 96-well checkerboard plates. The results were obtained separately using the four different interpretation methods frequently preferred by researchers. Thus, it was aimed to detect to what extent the rates of synergistic, indifferent and antagonistic interactions were affected by different interpretation methods. The differences between the interpretation methods were tested by chi-square analysis for each combination used. Statistically significant differences were detected between the four different interpretation methods for the determination of synergistic and indifferent interactions (p < 0.0001). The highest rates of synergy were observed with both combinations by the method that used the lowest fractional inhibitory concentration index of all the non-turbid wells along the turbidity/non-turbidity interface. There was no statistically significant difference between the four methods for the detection of antagonism (p > 0.05). In conclusion, although there is a standard procedure for checkerboard synergy testing, it fails to yield standard results owing to the different methods used to interpret the results. Thus, there is a need to standardise the interpretation method for checkerboard synergy testing. To determine the most appropriate method of interpretation, further studies are required that investigate the clinical benefits of synergistic combinations and compare the consistency of the results with those obtained from other standard combination tests such as time-kill studies.
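    For reference, the quantity the four interpretation methods read off differently is the fractional inhibitory concentration index (FICI); the MIC values below are hypothetical, and the breakpoints (synergy at FICI <= 0.5, antagonism at FICI > 4) are the commonly used ones.

        def fici(mic_a_combo, mic_a_alone, mic_b_combo, mic_b_alone):
            """Fractional inhibitory concentration index for drugs A and B."""
            return mic_a_combo / mic_a_alone + mic_b_combo / mic_b_alone

        idx = fici(mic_a_combo=2, mic_a_alone=16, mic_b_combo=4, mic_b_alone=8)
        if idx <= 0.5:
            verdict = "synergy"
        elif idx <= 4:
            verdict = "indifference"
        else:
            verdict = "antagonism"
        print(f"FICI = {idx:.3f} -> {verdict}")   # 0.625 -> indifference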

  12. ICPD-A New Peak Detection Algorithm for LC/MS

    PubMed Central

    2010-01-01

    Background: The identification and quantification of proteins using label-free Liquid Chromatography/Mass Spectrometry (LC/MS) play crucial roles in biological and biomedical research. Increasing evidence has shown that biomarkers are often low abundance proteins. However, LC/MS systems are subject to considerable noise and sample variability, whose statistical characteristics are still elusive, making computational identification of low abundance proteins extremely challenging. As a result, the inability to identify low abundance proteins in a proteomic study is the main bottleneck in protein biomarker discovery. Results: In this paper, we propose a new peak detection method called Information Combining Peak Detection (ICPD) for high resolution LC/MS. In LC/MS, peptides elute during a certain time period and, as a result, peptide isotope patterns are registered in multiple MS scans. The key feature of the new algorithm is that the observed isotope patterns registered in multiple scans are combined together for estimating the likelihood of the peptide existence. An isotope pattern matching score based on the likelihood probability is provided and utilized for peak detection. Conclusions: The performance of the new algorithm is evaluated based on protein standards with 48 known proteins. The evaluation shows better peak detection accuracy for low abundance proteins than other LC/MS peak detection methods. PMID:21143790

  13. Advancing the detection of steady-state visual evoked potentials in brain-computer interfaces.

    PubMed

    Abu-Alqumsan, Mohammad; Peer, Angelika

    2016-06-01

    Spatial filtering has proved to be a powerful pre-processing step in detection of steady-state visual evoked potentials and boosted typical detection rates both in offline analysis and online SSVEP-based brain-computer interface applications. State-of-the-art detection methods and the spatial filters used thereby share many common foundations as they all build upon the second order statistics of the acquired Electroencephalographic (EEG) data, that is, its spatial autocovariance and cross-covariance with what is assumed to be a pure SSVEP response. The present study aims at highlighting the similarities and differences between these methods. We consider the canonical correlation analysis (CCA) method as a basis for the theoretical and empirical (with real EEG data) analysis of the state-of-the-art detection methods and the spatial filters used thereby. We build upon the findings of this analysis and prior research and propose a new detection method (CVARS) that combines the power of the canonical variates and that of the autoregressive spectral analysis in estimating the signal and noise power levels. We found that the multivariate synchronization index method and the maximum contrast combination method are variations of the CCA method. All three methods were found to provide relatively unreliable detections in low signal-to-noise ratio (SNR) regimes. CVARS and the minimum energy combination methods were found to provide better estimates for different SNR levels. Our theoretical and empirical results demonstrate that the proposed CVARS method outperforms other state-of-the-art detection methods when used in an unsupervised fashion. Furthermore, when used in a supervised fashion, a linear classifier learned from a short training session is able to estimate the hidden user intention, including the idle state (when the user is not attending to any stimulus), rapidly, accurately and reliably.
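    A bare-bones version of the CCA scoring that the study takes as its reference point: multichannel EEG is correlated against sine/cosine references at each candidate stimulus frequency, and the frequency with the largest canonical correlation wins. The data are synthetic, and the sampling rate, harmonic count, and embedded 12 Hz response are invented for the example.

        import numpy as np
        from sklearn.cross_decomposition import CCA

        fs, dur = 250, 2.0
        t = np.arange(int(fs * dur)) / fs
        rng = np.random.default_rng(7)
        eeg = rng.normal(size=(t.size, 8))                         # 8 channels
        eeg[:, :3] += 0.5 * np.sin(2 * np.pi * 12.0 * t)[:, None]  # 12 Hz SSVEP

        def cca_score(X, f, harmonics=2):
            # Sine/cosine reference set at f and its harmonics.
            ref = np.column_stack([fn(2 * np.pi * f * h * t)
                                   for h in range(1, harmonics + 1)
                                   for fn in (np.sin, np.cos)])
            Xc, Yc = CCA(n_components=1).fit_transform(X, ref)
            return np.corrcoef(Xc[:, 0], Yc[:, 0])[0, 1]

        freqs = [8.0, 10.0, 12.0, 15.0]
        print(max(freqs, key=lambda f: cca_score(eeg, f)))         # -> 12.0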

  14. Current standard rules of combined anteversion prevent prosthetic impingement but ignore osseous contact in total hip arthroplasty.

    PubMed

    Weber, Markus; Woerner, Michael; Craiovan, Benjamin; Voellner, Florian; Worlicek, Michael; Springorum, Hans-Robert; Grifka, Joachim; Renkawitz, Tobias

    2016-12-01

    In this prospective study of 135 patients undergoing cementless total hip arthroplasty (THA) we asked whether six current definitions of combined anteversion prevent impingement and increase postoperative patient individual impingement-free range-of-motion (ROM). Implant position was measured by an independent, external institute on 3D-CT performed six weeks post-operatively. Post-operative ROM was calculated using a CT-based algorithm detecting osseous and/or prosthetic impingement by virtual hip movement. Additionally, clinical ROM was evaluated pre-operatively and one-year post-operatively by a blinded observer. Combined component position of cup and stem according to the definitions of Ranawat, Widmer, Dorr, Hisatome and Yoshimine inhibited prosthetic impingement in over 90 %, while combined osseous and prosthetic impingement still occurred in over 40 % of the cases. The recommendations by Jolles, Widmer, Dorr, Yoshimine and Hisatome enabled higher flexion (p ≤ 0.001) and internal rotation (p ≤ 0.006). Clinically, anteversion rules of Widmer and Yoshimine provided one-year post-operatively statistically but not clinically relevant higher internal rotation (p ≤0.034). Standard rules of combined anteversion detect prosthetic but fail to prevent combined osseous and prosthetic impingement in THA. Future models will have to account for the patient-individual anatomic situation to ensure impingement-free ROM.

  15. Evaluation of Two Statistical Methods Provides Insights into the Complex Patterns of Alternative Polyadenylation Site Switching

    PubMed Central

    Li, Jie; Li, Rui; You, Leiming; Xu, Anlong; Fu, Yonggui; Huang, Shengfeng

    2015-01-01

    Switching between different alternative polyadenylation (APA) sites plays an important role in the fine tuning of gene expression. New technologies for the execution of 3’-end enriched RNA-seq allow genome-wide detection of the genes that exhibit significant APA site switching between different samples. Here, we show that the independence test gives better results than the linear trend test in detecting APA site-switching events. Further examination suggests that the discrepancy between these two statistical methods arises from complex APA site-switching events that cannot be represented by a simple change of average 3’-UTR length. In theory, the linear trend test is only effective in detecting these simple changes. We classify the switching events into four switching patterns: two simple patterns (3’-UTR shortening and lengthening) and two complex patterns. By comparing the results of the two statistical methods, we show that complex patterns account for 1/4 of all observed switching events that happen between normal and cancerous human breast cell lines. Because simple and complex switching patterns may convey different biological meanings, they merit separate study. We therefore propose to combine both the independence test and the linear trend test in practice. First, the independence test should be used to detect APA site switching; second, the linear trend test should be invoked to identify simple switching events; and third, those complex switching events that pass independence testing but fail linear trend testing can be identified. PMID:25875641

  16. Logistic regression applied to natural hazards: rare event logistic regression with replications

    NASA Astrophysics Data System (ADS)

    Guns, M.; Vanacker, V.

    2012-06-01

    Statistical analysis of natural hazards needs particular attention, as most of these phenomena are rare events. This study shows that ordinary rare event logistic regression, as it is now commonly used in geomorphologic studies, does not always lead to a robust detection of controlling factors, as the results can be strongly sample-dependent. In this paper, we introduce some concepts of Monte Carlo simulation into rare event logistic regression. This technique, called rare event logistic regression with replications, combines the strengths of probabilistic and statistical methods, and allows some of the limitations of previous developments to be overcome through robust variable selection. The technique was developed here for the analysis of landslide controlling factors, but the concept is widely applicable to statistical analyses of natural hazards.
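    A schematic version of the replication idea, assuming statsmodels: the logistic model is refit on many random event/non-event subsamples, and the selection frequency of each candidate factor is recorded, so that only factors surviving most replications are retained. The data-generating model below is synthetic; only factor 0 truly matters.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(8)
        n = 5000
        X = rng.normal(size=(n, 3))                    # e.g. slope, wetness, land use
        p = 1 / (1 + np.exp(-(-4.0 + 1.2 * X[:, 0])))  # rare events, factor 0 only
        y = rng.binomial(1, p)

        events = np.flatnonzero(y == 1)
        nonevents = np.flatnonzero(y == 0)
        n_rep, hits = 200, np.zeros(X.shape[1])
        for _ in range(n_rep):
            # Balanced subsample: all events plus an equal number of non-events.
            sub = np.r_[events, rng.choice(nonevents, size=events.size, replace=False)]
            fit = sm.Logit(y[sub], sm.add_constant(X[sub])).fit(disp=0)
            hits += fit.pvalues[1:] < 0.05
        print(hits / n_rep)                            # selection frequency per factor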

  17. Detection of genotoxic effects of drinking water disinfection by-products using Vicia faba bioassay.

    PubMed

    Hu, Yu; Tan, Li; Zhang, Shao-Hui; Zuo, Yu-Ting; Han, Xue; Liu, Na; Lu, Wen-Qing; Liu, Ai-Lin

    2017-01-01

    Plant-based bioassays have gained wide use among toxicological and/or ecotoxicological assessment procedures because of their simplicity, sensitivity, low cost, and reliability. The present study describes the use of the Vicia faba (V. faba) micronucleus (MN) test and the V. faba comet assay in the evaluation of the genotoxic potential of disinfection by-products (DBPs) commonly found in chlorine-disinfected drinking water. Five haloacetic acids and three halogenated acetonitriles were chosen as representatives of DBPs in this study because they are of potentially great public health risk. Results of the MN test indicated that monochloroacetic acid (MCA), monobromoacetic acid (MBA), dichloroacetic acid (DCA), dibromoacetic acid (DBA), trichloroacetic acid (TCA), and trichloroacetonitrile (TCAN) caused a statistically significant increase in MN frequency in V. faba root tip cells. However, no genotoxic response was observed for dichloroacetonitrile (DCAN) and dibromoacetonitrile (DBAN). Results of the comet assay showed that all tested DBPs induced a statistically significant increase in genomic DNA damage in V. faba root tip cells. Considering their capacity to detect genomic damage of a different nature, we suggest that a combination of the V. faba MN test and the V. faba comet assay is a useful tool for the detection of genotoxic effects of DBPs. It is worth assessing the feasibility of using the V. faba comet assay combined with the V. faba MN test to screen for the genotoxic activity of chlorinated drinking water in future work.

  18. Whole vertebral bone segmentation method with a statistical intensity-shape model based approach

    NASA Astrophysics Data System (ADS)

    Hanaoka, Shouhei; Fritscher, Karl; Schuler, Benedikt; Masutani, Yoshitaka; Hayashi, Naoto; Ohtomo, Kuni; Schubert, Rainer

    2011-03-01

    An automatic segmentation algorithm for the vertebrae in human body CT images is presented. In particular, we focused on constructing and utilizing 4 different statistical intensity-shape combined models for the cervical, upper/lower thoracic and lumbar vertebrae, respectively. For this purpose, two previously reported methods were combined: a deformable model-based initial segmentation method and a statistical shape-intensity model-based precise segmentation method. The former is used as a pre-processing step to detect the position and orientation of each vertebra, which determines the initial condition for the latter precise segmentation method. The precise segmentation method needs prior knowledge of both the intensities and the shapes of the objects. After PCA analysis of such shape-intensity expressions obtained from training image sets, vertebrae were parametrically modeled as a linear combination of the principal component vectors. The segmentation of each target vertebra was performed as a fitting of this parametric model to the target image by maximum a posteriori estimation, combined with the geodesic active contour method. In an experiment using 10 cases, the initial segmentation was successful in 6 cases and only partially failed in 4 cases (2 in the cervical area and 2 in the lumbosacral area). In the precise segmentation, the mean error distances were 2.078, 1.416, 0.777, and 0.939 mm for the cervical, upper and lower thoracic, and lumbar spines, respectively. In conclusion, our automatic segmentation algorithm for the vertebrae in human body CT images showed a fair performance for cervical, thoracic and lumbar vertebrae.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sen, Satyabrata; Rao, Nageswara S; Wu, Qishi

    There have been increasingly large deployments of radiation detection networks that require computationally fast algorithms to produce prompt results over ad-hoc sub-networks of mobile devices, such as smart-phones. These algorithms are in sharp contrast to complex network algorithms that necessitate all measurements to be sent to powerful central servers. In this work, at individual sensors, we employ Wald-statistic based detection algorithms which are computationally very fast, and are implemented as one of three Z-tests and four chi-square tests. At the fusion center, we apply K-out-of-N fusion to combine the sensors' hard decisions. We characterize the performance of the detection methods by deriving analytical expressions for the distributions of the underlying test statistics, and by analyzing the fusion performance in terms of K, N, and the false-alarm rates of the individual detectors. We experimentally validate our methods using measurements from indoor and outdoor characterization tests of the Intelligence Radiation Sensors Systems (IRSS) program. In particular, utilizing the outdoor measurements, we construct two important real-life scenarios, boundary surveillance and portal monitoring, and present the results of our algorithms.
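    The K-out-of-N trade-off mentioned above can be made concrete with a binomial tail computation; with independent sensors sharing a per-sensor false-alarm rate p, the network-level false-alarm rate falls rapidly as K grows (the numbers below are illustrative, not from the IRSS tests).

        from scipy.stats import binom

        def fused_false_alarm(p_fa: float, n_sensors: int, k: int) -> float:
            """P(at least k of n independent sensors alarm with no source present)."""
            return binom.sf(k - 1, n_sensors, p_fa)

        for k in (1, 2, 3, 5):
            print(k, fused_false_alarm(p_fa=0.05, n_sensors=10, k=k))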

  20. Statistical evaluation of variables affecting occurrence of hydrocarbons in aquifers used for public supply, California

    USGS Publications Warehouse

    Landon, Matthew K.; Burton, Carmen A.; Davis, Tracy A.; Belitz, Kenneth; Johnson, Tyler D.

    2014-01-01

    The variables affecting the occurrence of hydrocarbons in aquifers used for public supply in California were assessed based on statistical evaluation of three large statewide datasets; gasoline oxygenates also were analyzed for comparison with hydrocarbons. Benzene is the most frequently detected (1.7%) compound among 17 hydrocarbons analyzed at generally low concentrations (median detected concentration 0.024 μg/l) in groundwater used for public supply in California; methyl tert-butyl ether (MTBE) is the most frequently detected (5.8%) compound among seven oxygenates analyzed (median detected concentration 0.1 μg/l). At aquifer depths used for public supply, hydrocarbons and MTBE rarely co-occur and are generally related to different variables; in shallower groundwater, co-occurrence is more frequent and there are similar relations to the density or proximity of potential sources. Benzene concentrations are most strongly correlated with reducing conditions, regardless of groundwater age and depth. Multiple lines of evidence indicate that benzene and other hydrocarbons detected in old, deep, and/or brackish groundwater result from geogenic sources of oil and gas. However, in recently recharged (since ~1950), generally shallower groundwater, higher concentrations and detection frequencies of benzene and hydrocarbons were associated with a greater proportion of commercial land use surrounding the well, likely reflecting effects of anthropogenic sources, particularly in combination with reducing conditions.

  1. 'Silent' and 'noisy' areas: acute flaccid paralysis surveillance at subnational level, Australia, 2001-2015.

    PubMed

    Butler, Michelle; Paterson, Beverley J; Martin, Nicolee; Hobday, Linda; Thorley, Bruce; Durrheim, David N

    2017-05-01

    Acute flaccid paralysis (AFP) surveillance rates are used as an indicator of surveillance sensitivity to detect poliomyelitis with an expected rate of ≥1 case per 100 000 population in children under 15 years of age. The Australian AFP detection rates at sub-national (statistical local area) level were analysed using χ2 goodness of fit tests and exact Poisson probabilities for the combined years 2001-2015 to detect 'silent areas', which may require improved AFP detection efforts, and areas with greater than expected rates, which may indicate unexplained clusters such as those due to enterovirus infection. Eight (n=8/87, 9%) local areas had AFP surveillance detection rates that were less than expected, and eighteen local areas (n=18/87, 21%) had rates that were greater than expected. However, based on available evidence, it is unlikely that these indicated previously unidentified, enterovirus clusters. While Australia has regularly met the national AFP surveillance performance indicators, at the subnational level nine per cent of local areas demonstrated statistically significant lower AFP detection rates. All countries, even those with relatively small populations, should actively identify silent AFP areas to prompt surveillance improvements. © The Author 2017. Published by Oxford University Press on behalf of Royal Society of Tropical Medicine and Hygiene. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
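    A small sketch of the exact Poisson screening on hypothetical area figures: the expected count is the target rate times person-years at risk, and one-sided tail probabilities flag areas detecting significantly fewer ('silent') or more ('noisy') AFP cases than expected.

        from scipy.stats import poisson

        def classify_area(observed, child_pop, years, rate_per_100k=1.0, alpha=0.05):
            expected = rate_per_100k / 1e5 * child_pop * years
            p_low = poisson.cdf(observed, expected)       # P(X <= observed)
            p_high = poisson.sf(observed - 1, expected)   # P(X >= observed)
            if p_low < alpha:
                return "silent", expected
            if p_high < alpha:
                return "noisy", expected
            return "as expected", expected

        # Hypothetical area: 80,000 children under 15 observed over 15 years.
        print(classify_area(observed=2, child_pop=80_000, years=15))  # silent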

  2. Nonparametric rank regression for analyzing water quality concentration data with multiple detection limits.

    PubMed

    Fu, Liya; Wang, You-Gan

    2011-02-15

    Environmental data usually include measurements, such as water quality data, which fall below detection limits because of limitations of the instruments or of certain analytical methods used. The fact that some responses are not detected needs to be properly taken into account in statistical analysis of such data. However, it is well-known that it is challenging to analyze a data set with detection limits, and we often have to rely on the traditional parametric methods or simple imputation methods. Distributional assumptions can lead to biased inference, and justification of distributions is often not possible when the data are correlated and there is a large proportion of data below detection limits. The extent of bias is usually unknown. To draw valid conclusions and hence provide useful advice for environmental management authorities, it is essential to develop and apply an appropriate statistical methodology. This paper proposes rank-based procedures for analyzing non-normally distributed data collected at different sites over a period of time in the presence of multiple detection limits. To take account of temporal correlations within each site, we propose an optimal linear combination of estimating functions and apply the induced smoothing method to reduce the computational burden. Finally, we apply the proposed method to the water quality data collected in the Susquehanna River Basin in the United States of America, which clearly demonstrates the advantages of the rank regression models.

  3. Detecting subtle hydrochemical anomalies with multivariate statistics: an example from homogeneous groundwaters in the Great Artesian Basin, Australia

    NASA Astrophysics Data System (ADS)

    O'Shea, Bethany; Jankowski, Jerzy

    2006-12-01

    The major ion composition of Great Artesian Basin groundwater in the lower Namoi River valley is relatively homogeneous. Traditional graphical techniques have been combined with multivariate statistical methods to determine whether subtle differences in the chemical composition of these waters can be delineated. Hierarchical cluster analysis and principal components analysis were successful in delineating minor variations within the groundwaters of the study area that were not visually identified by the graphical techniques applied. Hydrochemical interpretation allowed geochemical processes to be identified in each statistically defined water type and illustrated how these groundwaters differ from one another. Three main geochemical processes were identified in the groundwaters: ion exchange, precipitation, and mixing between waters from different sources. Both statistical methods delineated an anomalous sample suspected of being influenced by magmatic CO2 input. The use of statistical methods to complement traditional graphical techniques for waters appearing homogeneous is emphasized for all investigations of this type.
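    A minimal sketch of this workflow: standardize a major-ion matrix, cluster the samples hierarchically, and inspect principal-component scores for outlying samples. Synthetic values stand in for the field data, and Ward linkage with three clusters is an arbitrary illustrative choice.

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(9)
        ions = rng.normal(size=(60, 7))      # Na, K, Ca, Mg, Cl, HCO3, SO4
        ions[-1] += 4.0                      # one subtly anomalous sample

        Z = StandardScaler().fit_transform(ions)
        labels = fcluster(linkage(Z, method="ward"), t=3, criterion="maxclust")
        scores = PCA(n_components=2).fit_transform(Z)
        # The anomalous sample separates in both the clustering and PC space.
        print(labels[-1], int(np.argmax(np.abs(scores[:, 0]))))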

  4. Improved Statistical Fault Detection Technique and Application to Biological Phenomena Modeled by S-Systems.

    PubMed

    Mansouri, Majdi; Nounou, Mohamed N; Nounou, Hazem N

    2017-09-01

    In our previous work, we demonstrated the effectiveness of the linear multiscale principal component analysis (PCA)-based moving window (MW)-generalized likelihood ratio test (GLRT) technique over the classical PCA and multiscale principal component analysis (MSPCA)-based GLRT methods. The developed fault detection algorithm provided optimal properties by maximizing the detection probability for a particular false alarm rate (FAR) with different window sizes. However, most real systems are nonlinear, which makes the linear PCA method unable to tackle the issue of nonlinearity to a great extent. Thus, in this paper, first, we apply a nonlinear PCA to obtain an accurate principal component of a set of data and handle a wide range of nonlinearities using the kernel principal component analysis (KPCA) model. KPCA is among the most popular nonlinear statistical methods. Second, we extend the MW-GLRT technique to one that applies exponential weights to the residuals in the moving window (instead of equal weighting), as this might further improve fault detection performance by reducing the FAR using an exponentially weighted moving average (EWMA). The developed detection method, which is called EWMA-GLRT, provides improved properties, such as smaller missed detection rates and FARs and a smaller average run length. The idea behind the developed EWMA-GLRT is to compute a new GLRT statistic that integrates current and previous data information in a decreasing exponential fashion, giving more weight to the more recent data. This provides a more accurate estimation of the GLRT statistic and a stronger memory that enables better decision making with respect to fault detection. Therefore, in this paper, a KPCA-based EWMA-GLRT method is developed and utilized in practice to improve fault detection in biological phenomena modeled by S-systems and to enhance monitoring of the process mean. The idea behind the KPCA-based EWMA-GLRT fault detection algorithm is to combine the advantages brought forward by the proposed EWMA-GLRT fault detection chart with the KPCA model. It is thus used to enhance fault detection in the Cad System in E. coli model through monitoring some of the key variables involved in this model, such as enzymes, transport proteins, regulatory proteins, lysine, and cadaverine. The results demonstrate the effectiveness of the proposed KPCA-based EWMA-GLRT method over the Q, GLRT, EWMA, Shewhart, and moving window-GLRT methods. The detection performance is assessed and evaluated in terms of FAR, missed detection rates, and average run length (ARL1) values.
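    A toy sketch of the KPCA-plus-EWMA ingredients (not the full EWMA-GLRT statistic): the reconstruction error of a kernel PCA model fitted on fault-free data serves as the residual, and an exponentially weighted average of it is monitored against a limit learned from the fault-free portion. All data, the smoothing constant, and the 3-sigma limit are illustrative choices.

        import numpy as np
        from sklearn.decomposition import KernelPCA

        rng = np.random.default_rng(10)
        train = rng.normal(size=(300, 5))                 # fault-free data
        test = rng.normal(size=(200, 5))
        test[120:] += 1.5                                 # fault from t = 120

        kpca = KernelPCA(n_components=3, kernel="rbf",
                         fit_inverse_transform=True).fit(train)
        resid = np.linalg.norm(
            test - kpca.inverse_transform(kpca.transform(test)), axis=1)

        lam = 0.2
        z, ewma = resid[0], []
        for r in resid:                    # z_t = lam * r_t + (1 - lam) * z_{t-1}
            z = lam * r + (1 - lam) * z
            ewma.append(z)
        ewma = np.array(ewma)
        ucl = ewma[:100].mean() + 3 * ewma[:100].std()    # limit from clean part
        print(int(np.argmax(ewma > ucl)))                 # first flagged step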

  5. CHOBS: Color Histogram of Block Statistics for Automatic Bleeding Detection in Wireless Capsule Endoscopy Video

    PubMed Central

    Ghosh, Tonmoy; Wahid, Khan A.

    2018-01-01

    Wireless capsule endoscopy (WCE) is the most advanced technology for visualizing the whole gastrointestinal (GI) tract in a non-invasive way. Its major disadvantage, however, is the long reviewing time, which is laborious because continuous manual intervention is necessary. To reduce the burden on the clinician, this paper proposes an automatic bleeding detection method for WCE video based on the color histogram of block statistics, namely CHOBS. A single pixel in a WCE image may be distorted due to capsule motion in the GI tract. Instead of considering individual pixel values, a block surrounding each pixel is chosen for extracting local statistical features. By combining the local block features of the three color planes of the RGB color space, an index value is defined. A color histogram extracted from those index values provides a distinguishable color texture feature. A feature reduction technique utilizing the color histogram pattern and principal component analysis is proposed, which can drastically reduce the feature dimension. For bleeding zone detection, blocks are classified using the already extracted local features, which adds no computational burden for feature extraction. Extensive experimentation on several WCE videos and 2300 images collected from a publicly available database shows very satisfactory bleeding frame and zone detection performance in comparison to that obtained by some existing methods. For bleeding frame detection, the accuracy, sensitivity, and specificity obtained with the proposed method are 97.85%, 99.47%, and 99.15%, respectively, and for bleeding zone detection, a precision of 95.75% is achieved. The proposed method offers not only a low feature dimension but also highly satisfactory bleeding detection performance, and it can effectively detect bleeding frames and zones in continuous WCE video data. PMID:29468094
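
    To make the block-statistics idea concrete, the toy sketch below takes small-block means in each RGB plane (so that a single distorted pixel matters less), quantizes them, combines the three quantized values into one index, and histograms the indices as a feature vector. Block size, bin count, and the combination rule are illustrative assumptions, not the published CHOBS design.

```python
import numpy as np

def chobs_like_features(img, block=3, bins=8):
    """Toy color histogram of block statistics for an RGB image.

    For each pixel, the mean of a small surrounding block is taken in each
    RGB plane, quantized into a few levels, and the three quantized values
    are combined into one index; the histogram of indices is the feature
    vector. Details differ from the published CHOBS method.
    """
    h, w, _ = img.shape
    pad = block // 2
    padded = np.pad(img.astype(float), ((pad, pad), (pad, pad), (0, 0)), mode="edge")
    means = np.zeros((h, w, 3))
    for dy in range(block):          # accumulate block sums per channel
        for dx in range(block):
            means += padded[dy:dy + h, dx:dx + w, :]
    means /= block * block
    q = np.clip((means / 256.0 * bins).astype(int), 0, bins - 1)
    index = q[..., 0] * bins * bins + q[..., 1] * bins + q[..., 2]
    hist = np.bincount(index.ravel(), minlength=bins ** 3)
    return hist / hist.sum()

img = (np.random.default_rng(1).random((64, 64, 3)) * 255).astype(np.uint8)
print(chobs_like_features(img).shape)  # (512,) feature vector
```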

  6. Finding Statistically Significant Communities in Networks

    PubMed Central

    Lancichinetti, Andrea; Radicchi, Filippo; Ramasco, José J.; Fortunato, Santo

    2011-01-01

    Community structure is one of the main structural features of networks, revealing both their internal organization and the similarity of their elementary units. Despite the large variety of methods proposed to detect communities in graphs, there is a great need for multi-purpose techniques able to handle different types of datasets and the subtleties of community structure. In this paper we present OSLOM (Order Statistics Local Optimization Method), the first method capable of detecting clusters in networks while accounting for edge directions, edge weights, overlapping communities, hierarchies and community dynamics. It is based on the local optimization of a fitness function expressing the statistical significance of clusters with respect to random fluctuations, which is estimated with tools of Extreme and Order Statistics. OSLOM can be used alone or as a refinement procedure for partitions/covers delivered by other techniques. We have also implemented sequential algorithms combining OSLOM with other fast techniques, so that the community structure of very large networks can be uncovered. Our method's performance is comparable to that of the best existing algorithms on artificial benchmark graphs. Several applications to real networks are shown as well. OSLOM is implemented in freely available software (http://www.oslom.org), and we believe it will be a valuable tool in the analysis of networks. PMID:21559480

  7. The role of haemorrhage and exudate detection in automated grading of diabetic retinopathy.

    PubMed

    Fleming, Alan D; Goatman, Keith A; Philip, Sam; Williams, Graeme J; Prescott, Gordon J; Scotland, Graham S; McNamee, Paul; Leese, Graham P; Wykes, William N; Sharp, Peter F; Olson, John A

    2010-06-01

    Automated grading has the potential to improve the efficiency of diabetic retinopathy screening services. While disease/no disease grading can be performed using only microaneurysm detection and image-quality assessment, automated recognition of other types of lesions may be advantageous. This study investigated whether inclusion of automated recognition of exudates and haemorrhages improves the detection of observable/referable diabetic retinopathy. Images from 1253 patients with observable/referable retinopathy and 6333 patients with non-referable retinopathy were obtained from three grading centres. All images were reference-graded, and automated disease/no disease assessments were made based on microaneurysm detection and combined microaneurysm, exudate and haemorrhage detection. Introduction of algorithms for exudates and haemorrhages resulted in a statistically significant increase in the sensitivity for detection of observable/referable retinopathy from 94.9% (95% CI 93.5 to 96.0) to 96.6% (95.4 to 97.4) without affecting manual grading workload. Automated detection of exudates and haemorrhages improved the detection of observable/referable retinopathy.

  8. Multivariate two-part statistics for analysis of correlated mass spectrometry data from multiple biological specimens.

    PubMed

    Taylor, Sandra L; Ruhaak, L Renee; Weiss, Robert H; Kelly, Karen; Kim, Kyoungmi

    2017-01-01

    High-throughput mass spectrometry (MS) is now being used to profile small molecular compounds across multiple biological sample types from the same subjects, with the goal of leveraging information across biospecimens. Multivariate statistical methods that combine information from all biospecimens could be more powerful than the usual univariate analyses. However, missing values are common in MS data, and imputation can affect between-biospecimen correlation and multivariate analysis results. We propose two multivariate two-part statistics that accommodate missing values and combine data from all biospecimens to identify differentially regulated compounds. Statistical significance is determined using a multivariate permutation null distribution. Relative to univariate tests, the multivariate procedures detected more significant compounds in three biological datasets. In a simulation study, we showed that multi-biospecimen testing procedures are more powerful than single-biospecimen methods when compounds are differentially regulated in multiple biospecimens, although univariate methods can be more powerful if compounds are differentially regulated in only one biospecimen. We provide R functions to implement and illustrate our method as supplementary information. Contact: sltaylor@ucdavis.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
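
    The univariate two-part statistic that such procedures build on can be sketched as follows: a z-statistic for the difference in detection proportions is combined with a Welch t-statistic on the observed intensities, and the sum of squares is referred to a chi-square distribution with two degrees of freedom. This is a hedged illustration of the classical building block, not the authors' multivariate permutation procedure.

```python
import numpy as np
from scipy import stats

def two_part_statistic(x, y):
    """Univariate two-part statistic for data with missing (non-detected) values.

    Combines a z-statistic comparing detection proportions with a Welch
    t-statistic comparing the observed intensities; under H0 the sum of
    squares is roughly chi-square with 2 df. Classical univariate form,
    not the multivariate extension proposed in the paper.
    """
    dx, dy = ~np.isnan(x), ~np.isnan(y)
    p = (dx.sum() + dy.sum()) / (len(x) + len(y))   # pooled detection rate
    se = np.sqrt(p * (1 - p) * (1 / len(x) + 1 / len(y)))
    b = (dx.mean() - dy.mean()) / se if se > 0 else 0.0
    t, _ = stats.ttest_ind(x[dx], y[dy], equal_var=False)
    total = b ** 2 + t ** 2
    return total, stats.chi2.sf(total, df=2)

rng = np.random.default_rng(2)
x = np.where(rng.random(40) < 0.3, np.nan, rng.normal(10, 1, 40))
y = np.where(rng.random(40) < 0.6, np.nan, rng.normal(11, 1, 40))
print(two_part_statistic(x, y))  # (statistic, p-value)
```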

  9. Visualizing statistical significance of disease clusters using cartograms.

    PubMed

    Kronenfeld, Barry J; Wong, David W S

    2017-05-15

    Health officials and epidemiological researchers often use maps of disease rates to identify potential disease clusters. Because these maps exaggerate the prominence of low-density districts and hide potential clusters in urban (high-density) areas, many researchers have used density-equalizing maps (cartograms) as a basis for epidemiological mapping. However, no guidelines exist for the visual assessment of statistical uncertainty on such maps. To address this shortcoming, we develop techniques for the visual determination of the statistical significance of clusters spanning one or more districts on a cartogram. We developed the techniques within a geovisual analytics framework that does not rely on automated significance testing and can therefore facilitate visual analysis to detect clusters that automated techniques might miss. On a cartogram of the at-risk population, the statistical significance of a disease cluster can be determined from the rate, area, and shape of the cluster under standard hypothesis testing scenarios. We develop formulae to determine, for a given rate, the area required for statistical significance of a priori and a posteriori designated regions under certain test assumptions. Uniquely, our approach enables dynamic inference for aggregate regions formed by combining individual districts. The method is implemented in interactive tools that provide choropleth mapping, automated legend construction and dynamic search tools to facilitate cluster detection and assessment of the validity of tested assumptions. A case study of leukemia incidence in California demonstrates the ability to visually distinguish between statistically significant and insignificant regions. The proposed geovisual analytics approach enables intuitive visual assessment of the statistical significance of arbitrarily defined regions on a cartogram. Our research prompts a broader discussion of the role of geovisual exploratory analyses in disease mapping and of the appropriate framework for visually assessing the statistical significance of spatial clusters.

  10. Statistical guidelines for assessing marine avian hotspots and coldspots: A case study on wind energy development in the U.S. Atlantic Ocean

    USGS Publications Warehouse

    Zipkin, Elise F.; Kinlan, Brian P.; Sussman, Allison; Rypkema, Diana; Wimer, Mark; O'Connell, Allan F.

    2015-01-01

    Estimating patterns of habitat use is challenging for marine avian species because seabirds tend to aggregate in large groups and it can be difficult to locate both individuals and groups in vast marine environments. We developed an approach to estimate the statistical power of discrete survey events to identify species-specific hotspots and coldspots of long-term seabird abundance in marine environments. We illustrate our approach using historical seabird data from survey transects in the U.S. Atlantic Ocean Outer Continental Shelf (OCS), an area that has been divided into “lease blocks” for proposed offshore wind energy development. For our power analysis, we examined whether discrete lease blocks within the region could be defined as hotspots (3 × mean abundance in the OCS) or coldspots (1/3 ×) for individual species within a given season. For each of 74 species/season combinations, we determined which of eight candidate statistical distributions (ranging in their degree of skewness) best fit the count data. We then used the selected distribution and estimates of regional prevalence to calculate and map the statistical power to detect hotspots and coldspots, and to estimate, via Monte Carlo significance tests, the p-value that specific lease blocks are in fact hotspots or coldspots relative to the regional average abundance. The power to detect species-specific hotspots was higher than that for coldspots for most species because species-specific prevalence was relatively low (mean: 0.111; SD: 0.110). The number of surveys required for adequate power (> 0.6) was large for most species (tens to hundreds) using this hotspot definition. Regulators may need to accept higher proportional effect sizes, combine species into groups, and/or broaden the spatial scale by combining lease blocks in order to determine the optimal placement of wind farms. Our power analysis approach provides a general framework for both retrospective analyses and future avian survey design and is applicable to a broad range of research and conservation problems.
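
    A Monte Carlo power calculation of this general kind can be sketched in a few lines. The sketch assumes negative binomial counts, one plausible member of a candidate distribution set for skewed count data; the mean, dispersion, effect size, and significance level below are invented values, not estimates from the OCS data.

```python
import numpy as np

def hotspot_power(n_surveys, mu=0.5, k=0.3, effect=3.0, alpha=0.05, nsim=2000):
    """Monte Carlo power to flag a lease block as a hotspot (effect x mean).

    Counts are drawn from a negative binomial with mean mu and dispersion k,
    a common choice for highly skewed seabird counts; all parameter values
    here are illustrative, not those estimated in the study.
    """
    rng = np.random.default_rng(3)
    p0 = k / (k + mu)            # null: regional mean abundance
    p1 = k / (k + effect * mu)   # alternative: hotspot at effect x mean
    null_means = rng.negative_binomial(k, p0, (nsim, n_surveys)).mean(axis=1)
    crit = np.quantile(null_means, 1 - alpha)  # one-sided MC critical value
    alt_means = rng.negative_binomial(k, p1, (nsim, n_surveys)).mean(axis=1)
    return (alt_means > crit).mean()

for n in (10, 50, 200):          # power grows with the number of surveys
    print(n, round(hotspot_power(n), 2))
```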

  11. Classification of edible oils by employing 31P and 1H NMR spectroscopy in combination with multivariate statistical analysis. A proposal for the detection of seed oil adulteration in virgin olive oils.

    PubMed

    Vigli, Georgia; Philippidis, Angelos; Spyros, Apostolos; Dais, Photis

    2003-09-10

    A combination of (1)H NMR and (31)P NMR spectroscopy and multivariate statistical analysis was used to classify 192 samples from 13 types of vegetable oils, namely, hazelnut, sunflower, corn, soybean, sesame, walnut, rapeseed, almond, palm, groundnut, safflower, coconut, and virgin olive oils from various regions of Greece. 1,2-Diglycerides, 1,3-diglycerides, the ratio of 1,2-diglycerides to total diglycerides, acidity, iodine value, and fatty acid composition determined upon analysis of the respective (1)H NMR and (31)P NMR spectra were selected as variables to establish a classification/prediction model by employing discriminant analysis. This model, obtained from a training set of 128 samples, resulted in significant discrimination among the different classes of oils, and 100% correct assignments were obtained for the 64 validation samples. Different artificial mixtures of olive-hazelnut, olive-corn, olive-sunflower, and olive-soybean oils were prepared and analyzed by (1)H NMR and (31)P NMR spectroscopy. Subsequent discriminant analysis of the data allowed detection of adulteration as low as 5% w/w, provided that fresh virgin olive oil samples were used, as reflected by their high ratio of 1,2-diglycerides to total diglycerides (D ≥ 0.90).

  12. Multispectral processing based on groups of resolution elements

    NASA Technical Reports Server (NTRS)

    Richardson, W.; Gleason, J. M.

    1975-01-01

    Several nine-point rules are defined and compared with previously studied rules. One of the rules performed well in boundary areas, but with reduced efficiency in field interiors; another combined best performance on field interiors with good sensitivity to boundary detail. The basic threshold gradient and some modifications were investigated as a means of boundary point detection. The hypothesis testing methods of closed-boundary formation were also tested and evaluated. An analysis of the boundary detection problem was initiated, employing statistical signal detection and parameter estimation techniques to analyze various formulations of the problem. These formulations permit the atmospheric and sensor system effects on the data to be thoroughly analyzed. Various boundary features and necessary assumptions can also be investigated in this manner.

  13. Evaluation of 3M™ Molecular Detection Assay (MDA) Listeria for the Detection of Listeria species in Selected Foods and Environmental Surfaces: Collaborative Study, First Action 2014.06.

    PubMed

    Bird, Patrick; Flannery, Jonathan; Crowley, Erin; Agin, James; Goins, David; Monteroso, Lisa; Benesh, DeAnn

    2015-01-01

    The 3M™ Molecular Detection Assay (MDA) Listeria is used with the 3M Molecular Detection System for the detection of Listeria species in food, food-related, and environmental samples after enrichment. The assay utilizes loop-mediated isothermal amplification to rapidly amplify Listeria target DNA with high specificity and sensitivity, combined with bioluminescence to detect the amplification. The 3M MDA Listeria method was evaluated using an unpaired study design in a multilaboratory collaborative study and compared to the AOAC Official Method of Analysis (OMA) 993.12, Listeria monocytogenes in Milk and Dairy Products, reference method for the detection of Listeria species in full-fat (4% milk fat) cottage cheese (25 g test portions). A total of 15 laboratories located in the continental United States and Canada participated. Each matrix had three inoculation levels: an uninoculated control level (0 CFU/test portion) and two levels artificially contaminated with Listeria monocytogenes, a low inoculum level (0.2-2 CFU/test portion) and a high inoculum level (2-5 CFU/test portion), using non-heat-stressed cells. In total, 792 unpaired replicate portions were analyzed. Statistical analysis was conducted according to the probability of detection (POD) model. Results obtained for the low inoculum level test portions produced a difference in cross-laboratory POD value of -0.07 with a 95% confidence interval of (-0.19, 0.06). No statistically significant differences were observed in the number of positive samples detected by the 3M MDA Listeria method versus the AOAC OMA method.

  14. Single-molecule detection of proteins with antigen-antibody interaction using resistive-pulse sensing of submicron latex particles

    NASA Astrophysics Data System (ADS)

    Takakura, T.; Yanagi, I.; Goto, Y.; Ishige, Y.; Kohara, Y.

    2016-03-01

    We developed a resistive-pulse sensor with a solid-state pore and measured the latex agglutination of submicron particles induced by antigen-antibody interaction for single-molecule detection of proteins. We fabricated the pore based on numerical simulation to clearly distinguish between monomer and dimer latex particles. By measuring single dimers agglutinated in the single-molecule regime, we detected single human alpha-fetoprotein molecules. Adjusting the initial particle concentration improves the limit of detection (LOD) to 95 fmol/l. We established a theoretical model of the LOD by combining the reaction kinetics and the counting statistics to explain the effect of initial particle concentration on the LOD. The theoretical model shows how to improve the LOD quantitatively. The single-molecule detection studied here indicates the feasibility of implementing a highly sensitive immunoassay by a simple measurement method using resistive-pulse sensing.

  15. A new method for skin color enhancement

    NASA Astrophysics Data System (ADS)

    Zeng, Huanzhao; Luo, Ronnier

    2012-01-01

    Skin tone is the most important color category among memory colors, and reproducing it pleasingly is an important factor in photographic color reproduction. Moving skin colors toward a preferred skin color center improves perceived skin color preference in photographic color reproduction. Two key factors for successfully enhancing skin colors are a method to detect original skin colors effectively even if they are shifted far away from the regular skin color region, and a method to morph skin colors toward a preferred skin color region properly without introducing artifacts. A skin color enhancement method presented by the authors at the same conference last year applies a static skin color model for skin color detection, which may fail to detect skin colors that are far from regular skin tones. In this paper, a new method combining face detection and statistical skin color modeling is proposed to detect skin pixels and enhance skin colors more effectively.

  16. Real-time monitoring of a coffee roasting process with near infrared spectroscopy using multivariate statistical analysis: A feasibility study.

    PubMed

    Catelani, Tiago A; Santos, João Rodrigo; Páscoa, Ricardo N M J; Pezza, Leonardo; Pezza, Helena R; Lopes, João A

    2018-03-01

    This work proposes the use of near infrared (NIR) spectroscopy in diffuse reflectance mode and multivariate statistical process control (MSPC) based on principal component analysis (PCA) for real-time monitoring of the coffee roasting process. The main objective was the development of an MSPC methodology able to detect disturbances to the roasting process early, based on real-time acquisition of NIR spectra. A total of fifteen roasting batches were defined according to an experimental design to develop the MSPC models. The methodology was tested on a set of five batches in which disturbances of a different nature were imposed to simulate real faulty situations. Some of these batches were used to optimize the model while the remaining ones were used to test the methodology. A modelling strategy based on a sliding time window provided the best results in distinguishing batches with and without disturbances, using typical MSPC charts: Hotelling's T² and the squared prediction error statistic. A PCA model encompassing a time window of four minutes with three principal components was able to efficiently detect all disturbances assayed. NIR spectroscopy combined with the MSPC approach proved to be an adequate auxiliary tool for coffee roasters to detect faults in a conventional roasting process in real time. Copyright © 2017 Elsevier B.V. All rights reserved.
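
    The generic MSPC recipe behind such charts, a PCA model fitted to normal-operating data and monitored with Hotelling's T² and the squared prediction error (SPE/Q), can be sketched as follows; the paper's four-minute sliding window and the derivation of control limits are omitted.

```python
import numpy as np

def mspc_model(X_noc, n_pc=3):
    """Fit a PCA-based MSPC model on normal-operating-condition spectra.

    Returns what is needed to compute Hotelling's T^2 and the squared
    prediction error (SPE/Q) for new observations. A minimal sketch of
    the generic MSPC recipe, not the paper's sliding-window variant.
    """
    mu, sd = X_noc.mean(0), X_noc.std(0) + 1e-12
    Z = (X_noc - mu) / sd
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:n_pc].T                          # loadings
    var = (s[:n_pc] ** 2) / (len(Z) - 1)     # score variances
    return mu, sd, P, var

def mspc_charts(x, model):
    mu, sd, P, var = model
    z = (x - mu) / sd
    t = z @ P                                # scores
    t2 = np.sum(t ** 2 / var)                # Hotelling's T^2
    spe = np.sum((z - t @ P.T) ** 2)         # residual (SPE/Q) statistic
    return t2, spe

rng = np.random.default_rng(4)
X = rng.normal(size=(100, 20)) @ rng.normal(size=(20, 20))  # correlated "spectra"
model = mspc_model(X)
print(mspc_charts(X[0], model))        # in-control sample: small statistics
print(mspc_charts(X[0] + 5.0, model))  # disturbed sample: much larger statistics
```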

  17. Chemical entity recognition in patents by combining dictionary-based and statistical approaches.

    PubMed

    Akhondi, Saber A; Pons, Ewoud; Afzal, Zubair; van Haagen, Herman; Becker, Benedikt F H; Hettne, Kristina M; van Mulligen, Erik M; Kors, Jan A

    2016-01-01

    We describe the development of a chemical entity recognition system and its application in the CHEMDNER-patent track of BioCreative 2015. This community challenge includes a Chemical Entity Mention in Patents (CEMP) recognition task and a Chemical Passage Detection (CPD) classification task. We addressed both tasks with an ensemble system that combines a dictionary-based approach with a statistical one. For this purpose the performance of several lexical resources was assessed using Peregrine, our open-source indexing engine. We combined our dictionary-based results on the patent corpus with the results of tmChem, a chemical recognizer using a conditional random field classifier. To improve the performance of tmChem, we utilized three additional features, viz. part-of-speech tags, lemmas and word-vector clusters. When evaluated on the training data, our final system obtained an F-score of 85.21% for the CEMP task, and an accuracy of 91.53% for the CPD task. On the test set, the best system ranked sixth among 21 teams for CEMP with an F-score of 86.82%, and second among nine teams for CPD with an accuracy of 94.23%. The differences in performance between the best ensemble system and the statistical system alone were small. Database URL: http://biosemantics.org/chemdner-patents. © The Author(s) 2016. Published by Oxford University Press.

  18. Cracking the Language Code: Neural Mechanisms Underlying Speech Parsing

    PubMed Central

    McNealy, Kristin; Mazziotta, John C.; Dapretto, Mirella

    2013-01-01

    Word segmentation, detecting word boundaries in continuous speech, is a critical aspect of language learning. Previous research in infants and adults demonstrated that a stream of speech can be readily segmented based solely on the statistical and speech cues afforded by the input. Using functional magnetic resonance imaging (fMRI), the neural substrate of word segmentation was examined on-line as participants listened to three streams of concatenated syllables, containing either statistical regularities alone, statistical regularities and speech cues, or no cues. Despite the participants’ inability to explicitly detect differences between the speech streams, neural activity differed significantly across conditions, with left-lateralized signal increases in temporal cortices observed only when participants listened to streams containing statistical regularities, particularly the stream containing speech cues. In a second fMRI study, designed to verify that word segmentation had implicitly taken place, participants listened to trisyllabic combinations that occurred with different frequencies in the streams of speech they just heard (“words,” 45 times; “partwords,” 15 times; “nonwords,” once). Reliably greater activity in left inferior and middle frontal gyri was observed when comparing words with partwords and, to a lesser extent, when comparing partwords with nonwords. Activity in these regions, taken to index the implicit detection of word boundaries, was positively correlated with participants’ rapid auditory processing skills. These findings provide a neural signature of on-line word segmentation in the mature brain and an initial model with which to study developmental changes in the neural architecture involved in processing speech cues during language learning. PMID:16855090

  19. Automatic Detection of Diseased Tomato Plants Using Thermal and Stereo Visible Light Images

    PubMed Central

    Raza, Shan-e-Ahmed; Prince, Gillian; Clarkson, John P.; Rajpoot, Nasir M.

    2015-01-01

    Accurate and timely detection of plant diseases can help mitigate the worldwide losses experienced by the horticulture and agriculture industries each year. Thermal imaging provides a fast and non-destructive way of scanning plants for diseased regions and has been used by various researchers to study the effect of disease on the thermal profile of a plant. However, the thermal image of a diseased plant is also influenced by environmental conditions, including leaf angles and the depth of the canopy areas accessible to the thermal imaging camera. In this paper, we combine thermal and visible light image data with depth information and develop a machine learning system to remotely detect plants infected with the tomato powdery mildew fungus Oidium neolycopersici. We extract a novel feature set from the image data using local and global statistics and show that by combining these with the depth information, we can considerably improve the accuracy of detection of the diseased plants. In addition, we show that our novel feature set is capable of identifying plants which were not originally inoculated with the fungus at the start of the experiment but which subsequently developed disease through natural transmission. PMID:25861025

  20. Detection of cervical intraepithelial neoplasia by using optical coherence tomography in combination with microscopy

    NASA Astrophysics Data System (ADS)

    Gallwas, Julia; Jalilova, Aydan; Ladurner, Roland; Kolben, Theresa Maria; Kolben, Thomas; Ditsch, Nina; Homann, Christian; Lankenau, Eva; Dannecker, Christian

    2017-01-01

    Optical coherence tomography (OCT) is a noninvasive high-resolution imaging technique that permits the detection of cancerous and precancerous lesions of the uterine cervix. The purpose of this study was to evaluate a new system that integrates an OCT device into a microscope. OCT images were taken from loop electrosurgical excision procedure (LEEP) specimens under microscopic guidance. The images were blinded with respect to their origin within the microscopic image, analyzed independently by two investigators using initially defined criteria, and later compared to the corresponding histology. Sensitivity and specificity were calculated with respect to the correct identification of high-grade squamous intraepithelial lesions (HSIL). Interinvestigator agreement was assessed using Cohen's kappa statistic. A total of 160 OCT images were obtained from 20 LEEP specimens. Sixty randomly chosen images were used to define reproducible criteria for evaluation. The assessment of the remaining 100 images showed a sensitivity of 88% (second investigator 84%) and a specificity of 69% (65%) for detecting HSIL. Surgical microscopy-guided OCT appears to be a promising technique for immediate assessment of microanatomical changes. In the gynecological setting, the combination of OCT with a colposcope may improve the detection of high-grade squamous intraepithelial lesions.
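
    Cohen's kappa, used here for interinvestigator agreement, corrects the observed agreement for the agreement expected by chance. The sketch below uses invented binary HSIL/non-HSIL ratings, not the study's data.

```python
import numpy as np

def cohens_kappa(a, b):
    """Cohen's kappa for two raters' categorical calls (illustrative data)."""
    a, b = np.asarray(a), np.asarray(b)
    po = (a == b).mean()  # observed agreement
    # chance agreement: sum over categories of the product of marginal rates
    pe = sum((a == c).mean() * (b == c).mean() for c in np.unique(np.r_[a, b]))
    return (po - pe) / (1 - pe)

rater1 = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]
rater2 = [1, 0, 0, 0, 1, 0, 1, 1, 1, 0]
print(round(cohens_kappa(rater1, rater2), 2))  # 0.6 for these toy ratings
```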

  1. Multipath detection with the combination of SNR measurements - Example from urban environment

    NASA Astrophysics Data System (ADS)

    Špánik, Peter; Hefty, Ján

    2017-12-01

    Multipath is one of the most severe station-dependent error sources in both static and kinematic positioning. A relatively new and simple detection technique using signal-to-noise ratio (SNR) measurements on three frequencies is presented, based on the idea of Strode and Groves. Exploiting SNR measurements is beneficial especially because of their unambiguous character. The method rests on the fact that SNR values are closely linked to the estimation of pseudo-ranges and phase measurements during signal correlation processing. Because of this connection, a combination of SNR values can be used to detect anomalous behavior in the received signal, although a calibration in a low-multipath environment has to be performed beforehand. In the case of multipath, phase measurements on different frequencies are not affected in the same manner: specular multipath, e.g., from a building wall, introduces an additional path delay that is interpreted differently on each carrier used, owing to the different wavelengths. Experimental results of multipath detection in an urban environment are presented. The originally proposed method is designed to work with three different frequencies in each epoch, so only GPS Block II-F and Galileo satellites can be used. A simplification of the detection statistic to use only two frequencies is made, and results using the GPS and GLONASS systems are presented along with results obtained using the original formula.

  2. Niche harmony search algorithm for detecting complex disease associated high-order SNP combinations.

    PubMed

    Tuo, Shouheng; Zhang, Junying; Yuan, Xiguo; He, Zongzhen; Liu, Yajun; Liu, Zhaowen

    2017-09-14

    Genome-wide association study is especially challenging for detecting high-order disease-causing models due to model diversity, possible low or even absent marginal effects of the model, and the extraordinary search and computation involved. In this paper, we propose a niche harmony search algorithm in which joint entropy is utilized as a heuristic factor to guide the search for low- or no-marginal-effect models, and two computationally lightweight scores are selected to evaluate and adapt to diverse disease models. In order to obtain all possible suspected pathogenic models, the niche technique is merged with harmony search (HS), serving as a taboo region that prevents HS from being trapped in local search. From the resultant set of candidate SNP combinations, we use the G-test statistic to test for true positives. Experiments were performed on twenty typical simulated datasets, of which 12 models have marginal effects and eight have none. Our results indicate that the proposed algorithm has very high detection power for finding suspected disease models in the first stage and is superior to several typical existing approaches in both detection power and CPU runtime on all these datasets. Application to age-related macular degeneration (AMD) demonstrates that our method is promising for detecting high-order disease-causing models.
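
    The G-test used for the final true-positive testing stage is the likelihood-ratio counterpart of the chi-square test of independence. The sketch below applies it to an invented genotype-combination-by-status contingency table.

```python
import numpy as np
from scipy import stats

def g_test(table):
    """G-test of independence: G = 2 * sum(O * ln(O / E)).

    Under H0, G is asymptotically chi-square with (rows-1)*(cols-1)
    degrees of freedom. The counts below are illustrative, not data
    from the paper.
    """
    O = np.asarray(table, dtype=float)
    E = O.sum(1, keepdims=True) @ O.sum(0, keepdims=True) / O.sum()
    mask = O > 0                       # convention: 0 * ln(0) = 0
    g = 2.0 * np.sum(O[mask] * np.log(O[mask] / E[mask]))
    df = (O.shape[0] - 1) * (O.shape[1] - 1)
    return g, stats.chi2.sf(g, df)

# rows: 9 two-locus genotype combinations; columns: cases, controls
counts = [[30, 12], [25, 20], [10, 18], [22, 19], [18, 21],
          [9, 16], [14, 15], [11, 17], [8, 12]]
print(g_test(counts))  # (G statistic, p-value)
```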

  3. A population study of hot Jupiter atmospheres

    NASA Astrophysics Data System (ADS)

    Tsiaras, A.; Waldmann, I. P.; Zingales, T.; Rocchetto, M.; Damiano, M.; Karpouzas, K.; Tinetti, G.; McKemmish, L. K.; Tennyson, J.; Yurchenko, S. N.

    2017-09-01

    In the past two decades, we have learnt that, on average, every star hosts more than one planet. While the hunt for new exoplanets is on-going, the current sample of more than 3500 confirmed planets reveals a wide spectrum of planetary characteristics. While small planets appear to be the most common, big gaseous planets play a key role in the process of planetary formation. We present here the analysis of 30 gaseous extra-solar planets, with temperatures between 600 and 2400 K and radii between 0.35 and 1.9 Jupiter radii. These planets were observed spectroscopically with the Wide Field Camera 3 on board the Hubble Space Telescope, currently one of the most successful instruments for observing exoplanetary atmospheres. The quality of the HST/WFC3 spatially scanned data, combined with our specialised analysis tools, allows us to create the largest and most self-consistent sample of exoplanetary transmission spectra to date and to study the collective behaviour of warm and hot gaseous planets rather than isolated case studies. We define a new metric, the Atmospheric Detectability Index (ADI), to evaluate the statistical significance of an atmospheric detection, and we find statistically significant atmospheres around 16 planets. For most of the Jupiters in our sample, we find the detectability of their atmospheres to depend on the planetary radius but not on the planetary mass, indicating that planetary gravity is a secondary factor in the evolution of planetary atmospheres. We detect the presence of water vapour in all the statistically detectable atmospheres, and we cannot rule out its presence in the atmospheres of the others. In addition, TiO and/or VO signatures are detected with 4σ confidence in WASP-76 b, and they are most likely present in WASP-121 b. We find no correlation between expected signal-to-noise and atmospheric detectability for most targets. This has important implications for future large-scale surveys.

  4. Meta-analysis methods for combining multiple expression profiles: comparisons, statistical characterization and an application guideline

    PubMed Central

    2013-01-01

    Background As high-throughput genomic technologies become accurate and affordable, an increasing number of data sets have accumulated in the public domain, and genomic information integration and meta-analysis have become routine in biomedical research. In this paper, we focus on microarray meta-analysis, where multiple microarray studies with relevant biological hypotheses are combined in order to improve candidate marker detection. Many methods have been developed and applied in the literature, but their performance and properties have only been minimally investigated. There is currently no clear conclusion or guideline as to the proper choice of a meta-analysis method for a given application; the decision essentially requires both statistical and biological considerations. Results We applied 12 microarray meta-analysis methods to combine multiple simulated expression profiles; these methods can be categorized by hypothesis setting: (1) HS A : DE genes with non-zero effect sizes in all studies, (2) HS B : DE genes with non-zero effect sizes in one or more studies and (3) HS r : DE genes with non-zero effect sizes in the "majority" of studies. We then performed a comprehensive comparative analysis through six large-scale real applications using four quantitative statistical evaluation criteria: detection capability, biological association, stability and robustness. We elucidated the hypothesis settings behind the methods and further applied multi-dimensional scaling (MDS) and an entropy measure to characterize the meta-analysis methods and the data structure, respectively. Conclusions The aggregated results from the simulation study categorized the 12 methods into three hypothesis settings (HS A , HS B , and HS r ). Evaluation on real data and results from the MDS and entropy analyses provide an insightful and practical guideline for choosing the most suitable method in a given application. All source files for simulation and real data are available on the author’s publication website. PMID:24359104

  5. Meta-analysis methods for combining multiple expression profiles: comparisons, statistical characterization and an application guideline.

    PubMed

    Chang, Lun-Ching; Lin, Hui-Min; Sibille, Etienne; Tseng, George C

    2013-12-21

    As high-throughput genomic technologies become accurate and affordable, an increasing number of data sets have accumulated in the public domain, and genomic information integration and meta-analysis have become routine in biomedical research. In this paper, we focus on microarray meta-analysis, where multiple microarray studies with relevant biological hypotheses are combined in order to improve candidate marker detection. Many methods have been developed and applied in the literature, but their performance and properties have only been minimally investigated. There is currently no clear conclusion or guideline as to the proper choice of a meta-analysis method for a given application; the decision essentially requires both statistical and biological considerations. We applied 12 microarray meta-analysis methods to combine multiple simulated expression profiles; these methods can be categorized by hypothesis setting: (1) HS(A): DE genes with non-zero effect sizes in all studies, (2) HS(B): DE genes with non-zero effect sizes in one or more studies and (3) HS(r): DE genes with non-zero effect sizes in the "majority" of studies. We then performed a comprehensive comparative analysis through six large-scale real applications using four quantitative statistical evaluation criteria: detection capability, biological association, stability and robustness. We elucidated the hypothesis settings behind the methods and further applied multi-dimensional scaling (MDS) and an entropy measure to characterize the meta-analysis methods and the data structure, respectively. The aggregated results from the simulation study categorized the 12 methods into three hypothesis settings (HS(A), HS(B), and HS(r)). Evaluation on real data and results from the MDS and entropy analyses provide an insightful and practical guideline for choosing the most suitable method in a given application. All source files for simulation and real data are available on the author's publication website.

  6. A flexibly shaped space-time scan statistic for disease outbreak detection and monitoring.

    PubMed

    Takahashi, Kunihiko; Kulldorff, Martin; Tango, Toshiro; Yih, Katherine

    2008-04-11

    Early detection of disease outbreaks enables public health officials to implement disease control and prevention measures at the earliest possible time. A time periodic geographical disease surveillance system based on a cylindrical space-time scan statistic has been used extensively for disease surveillance along with the SaTScan software. In the purely spatial setting, many different methods have been proposed to detect spatial disease clusters. In particular, some spatial scan statistics are aimed at detecting irregularly shaped clusters which may not be detected by the circular spatial scan statistic. Based on the flexible purely spatial scan statistic, we propose a flexibly shaped space-time scan statistic for early detection of disease outbreaks. The performance of the proposed space-time scan statistic is compared with that of the cylindrical scan statistic using benchmark data. In order to compare their performances, we have developed a space-time power distribution by extending the purely spatial bivariate power distribution. Daily syndromic surveillance data in Massachusetts, USA, are used to illustrate the proposed test statistic. The flexible space-time scan statistic is well suited for detecting and monitoring disease outbreaks in irregularly shaped areas.
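
    At the core of such scan statistics is a Poisson log-likelihood ratio evaluated for every candidate space-time window; the window with the largest ratio is the most likely cluster, and its significance is then assessed by Monte Carlo replication. The sketch below shows only that kernel, with the cylindrical or flexible window enumeration and the Monte Carlo step omitted.

```python
import numpy as np

def poisson_llr(c, e_c, c_total):
    """Log-likelihood ratio for one scanning window under the Poisson model.

    c: observed cases inside the window, e_c: expected cases inside,
    c_total: all observed cases. This is the standard scan-statistic
    kernel; how windows are enumerated (cylinders, flexible shapes)
    is what distinguishes the methods compared in the paper.
    """
    if c <= e_c:
        return 0.0  # only windows with elevated rates are of interest
    return (c * np.log(c / e_c)
            + (c_total - c) * np.log((c_total - c) / (c_total - e_c)))

# Toy scan over candidate windows as (observed, expected), 500 total cases
windows = [(20, 12.0), (35, 15.0), (8, 9.5)]
scores = [poisson_llr(c, e, 500) for c, e in windows]
print(scores)  # the largest LLR marks the most likely cluster
```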

  7. Correlation between the different therapeutic properties of Chinese medicinal herbs and delayed luminescence.

    PubMed

    Pang, Jingxiang; Fu, Jialei; Yang, Meina; Zhao, Xiaolei; van Wijk, Eduard; Wang, Mei; Fan, Hua; Han, Jinxiang

    2016-03-01

    In the practice and principles of Chinese medicine, herbal materials are classified according to their therapeutic properties. 'Cold' and 'heat' are the most important classes of Chinese medicinal herbs according to the theory of traditional Chinese medicine (TCM). In this work, delayed luminescence (DL) was measured for different samples of Chinese medicinal herbs using a sensitive photomultiplier detection system. A comparison of DL parameters, including mean intensity and statistical entropy, was undertaken to discriminate between the 'cold' and 'heat' properties of Chinese medicinal herbs. The results suggest that there are significant differences in mean intensity and statistical entropy, and that this method combined with statistical analysis may provide novel parameters for the characterization of Chinese medicinal herbs in relation to their energetic properties. Copyright © 2015 John Wiley & Sons, Ltd.

  8. Experimental design and statistical methods for improved hit detection in high-throughput screening.

    PubMed

    Malo, Nathalie; Hanley, James A; Carlile, Graeme; Liu, Jing; Pelletier, Jerry; Thomas, David; Nadon, Robert

    2010-09-01

    Identification of active compounds in high-throughput screening (HTS) contexts can be substantially improved by applying classical experimental design and statistical inference principles to all phases of HTS studies. The authors present both experimental and simulated data to illustrate how true-positive rates can be maximized without increasing false-positive rates by the following analytical process. First, the use of robust data preprocessing methods reduces unwanted variation by removing row, column, and plate biases. Second, replicate measurements allow estimation of the magnitude of the remaining random error and the use of formal statistical models to benchmark putative hits relative to what is expected by chance. Receiver Operating Characteristic (ROC) analyses revealed superior power for data preprocessed by a trimmed-mean polish method combined with the RVM t-test, particularly for small- to moderate-sized biological hits.
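
    The first step, removing row and column biases, can be illustrated with a trimmed-mean polish of the plate matrix; the sketch below is a generic version of that preprocessing idea and does not reproduce the authors' full pipeline or the RVM t-test.

```python
import numpy as np
from scipy.stats import trim_mean

def polish(plate, n_iter=10, trim=0.1):
    """Trimmed-mean polish of a plate matrix to remove row/column biases.

    Iteratively subtracts trimmed row and column means (in the spirit of
    a B-score-style polish). The paper pairs such preprocessing with a
    variance-moderated (RVM) t-test, which is not reproduced here.
    """
    r = plate.astype(float).copy()
    for _ in range(n_iter):
        r -= trim_mean(r, trim, axis=1)[:, None]  # remove row effects
        r -= trim_mean(r, trim, axis=0)[None, :]  # remove column effects
    return r

rng = np.random.default_rng(5)
plate = rng.normal(0, 1, (8, 12)) + np.linspace(0, 2, 12)[None, :]  # column drift
print(np.abs(polish(plate).mean(axis=0)).max())  # residual column bias is small
```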

  9. An experimental validation of a statistical-based damage detection approach.

    DOT National Transportation Integrated Search

    2011-01-01

    In this work, a previously-developed, statistical-based, damage-detection approach was validated for its ability to : autonomously detect damage in bridges. The damage-detection approach uses statistical differences in the actual and : predicted beha...

  10. Introductory comments on the USGS geographic applications program

    NASA Technical Reports Server (NTRS)

    Gerlach, A. C.

    1970-01-01

    The third phase of remote sensing technologies and potentials applied to the operations of the U.S. Geological Survey is introduced. Remote sensing data are combined with multidisciplinary spatial data from traditional sources and with geographic theory and techniques of environmental modeling. These combined inputs are subjected to four sequential activities: (1) thematic mapping of land use and environmental factors; (2) the dynamics of change detection; (3) environmental surveillance to identify sudden changes and general trends; and (4) preparation of statistical models and analytical reports. Geography program functions, products, clients, and goals are presented in graphical form, along with aircraft photo missions, geography test sites, and FY-70.

  11. The evolution of supernova remnants in different galactic environments, and its effects on supernova statistics

    NASA Technical Reports Server (NTRS)

    Kafatos, M.; Sofia, S.; Bruhweiler, F.; Gull, T. R.

    1980-01-01

    Examination of the interaction between supernova (SN) ejecta and the various environments in which the explosive event might occur shows that only a small fraction of the many SNs produce observable supernova remnants (SNRs). This fraction, which is found to depend weakly upon the lower mass limit of the SN progenitors and more strongly on the specific characteristics of the associated interstellar medium, decreases from approximately 15 percent near the galactic center to 10 percent at Rgal of approximately 10 kpc, and drops nearly to zero beyond Rgal of about 15 kpc. Generally, whether a SNR is detectable is determined by the density of the ambient interstellar medium in which it is embedded. The presence of large, low-density cavities around stellar associations, due to the combined effects of stellar winds and supernova shells, strongly suggests that a large portion of the detectable SNRs have runaway stars as their progenitors. These results explain the differences between the substantially larger SN rates in the galaxy derived both from pulsar statistics and from observations of SN events in external galaxies, and the substantially smaller SN rates derived from galactic SNR statistics.

  12. Detecting correlation changes in multivariate time series: A comparison of four non-parametric change point detection methods.

    PubMed

    Cabrieto, Jedelyn; Tuerlinckx, Francis; Kuppens, Peter; Grassmann, Mariel; Ceulemans, Eva

    2017-06-01

    Change point detection in multivariate time series is a complex task since, next to the mean, the correlation structure of the monitored variables may also alter when change occurs. DeCon was recently developed to detect such changes in mean and/or correlation by combining a moving windows approach and robust PCA. However, in the literature, several other methods have been proposed that employ other non-parametric tools: E-divisive, Multirank, and KCP. Since these methods use different statistical approaches, two issues need to be tackled. First, applied researchers may find it hard to appraise the differences between the methods. Second, a direct comparison of the relative performance of all these methods for capturing change points signaling correlation changes is still lacking. Therefore, we present the basic principles behind DeCon, E-divisive, Multirank, and KCP and the corresponding algorithms, to make them more accessible to readers. We further compared their performance through extensive simulations using the settings of Bulteel et al. (Biological Psychology, 98 (1), 29-42, 2014), implying changes in mean and in correlation structure, and those of Matteson and James (Journal of the American Statistical Association, 109 (505), 334-345, 2014), implying different numbers of (noise) variables. KCP emerged as the best method in almost all settings. However, in the case of more than two noise variables, only DeCon performed adequately in detecting correlation changes.
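
    A minimal moving-windows illustration of correlation change detection, far simpler than DeCon, E-divisive, Multirank, or KCP, compares correlation matrices estimated in adjacent windows and flags the largest discrepancy:

```python
import numpy as np

def correlation_change_curve(X, w=50):
    """Sliding-window statistic for correlation change in a multivariate series.

    Compares correlation matrices estimated in adjacent windows using a
    Frobenius-norm distance; peaks suggest candidate change points. A
    generic moving-windows illustration, not any of the four methods
    compared in the paper.
    """
    n = len(X)
    scores = np.zeros(n)
    for t in range(w, n - w):
        c1 = np.corrcoef(X[t - w:t].T)
        c2 = np.corrcoef(X[t:t + w].T)
        scores[t] = np.linalg.norm(c1 - c2)
    return scores

rng = np.random.default_rng(6)
a = rng.normal(size=(200, 2))                            # uncorrelated regime
shared = rng.normal(size=200)
b = np.c_[shared, shared + 0.3 * rng.normal(size=200)]   # strongly correlated
X = np.vstack([a, b])
print(int(np.argmax(correlation_change_curve(X))))       # near the change at 200
```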

  13. Quantitative analysis of trace levels of surface contamination by X-ray photoelectron spectroscopy Part I: statistical uncertainty near the detection limit.

    PubMed

    Hill, Shannon B; Faradzhev, Nadir S; Powell, Cedric J

    2017-12-01

    We discuss the problem of quantifying common sources of statistical uncertainties for analyses of trace levels of surface contamination using X-ray photoelectron spectroscopy. We examine the propagation of error for peak-area measurements using common forms of linear and polynomial background subtraction including the correlation of points used to determine both background and peak areas. This correlation has been neglected in previous analyses, but we show that it contributes significantly to the peak-area uncertainty near the detection limit. We introduce the concept of relative background subtraction variance (RBSV) which quantifies the uncertainty introduced by the method of background determination relative to the uncertainty of the background area itself. The uncertainties of the peak area and atomic concentration and of the detection limit are expressed using the RBSV, which separates the contributions from the acquisition parameters, the background-determination method, and the properties of the measured spectrum. These results are then combined to find acquisition strategies that minimize the total measurement time needed to achieve a desired detection limit or atomic-percentage uncertainty for a particular trace element. Minimization of data-acquisition time is important for samples that are sensitive to x-ray dose and also for laboratories that need to optimize throughput.
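
    The central point, that the fitted background and the peak area share statistical uncertainty because the same background fit is subtracted at every peak channel, can be sketched with a weighted straight-line background and Poisson counting errors. This simplified propagation stands in for, but does not reproduce, the paper's RBSV formalism.

```python
import numpy as np

def peak_area_uncertainty(counts, bg_idx, peak_idx):
    """Counting-statistics uncertainty of a background-subtracted peak area.

    A straight line is fitted to the background channels and subtracted
    under the peak; the variance propagates both the Poisson error of the
    peak channels and that of the background fit, including the covariance
    introduced by reusing the same fit at every peak channel. Simplified
    relative to the paper's treatment.
    """
    x = np.arange(len(counts), dtype=float)
    A = np.c_[np.ones_like(x[bg_idx]), x[bg_idx]]
    W = np.diag(1.0 / np.maximum(counts[bg_idx], 1.0))  # Poisson weights
    cov = np.linalg.inv(A.T @ W @ A)                    # cov of (intercept, slope)
    coef = cov @ A.T @ W @ counts[bg_idx]
    bg = coef[0] + coef[1] * x[peak_idx]
    area = counts[peak_idx].sum() - bg.sum()
    # summed design over peak channels -> variance of the summed background
    s = np.array([len(peak_idx), x[peak_idx].sum()])
    var = counts[peak_idx].sum() + s @ cov @ s
    return area, np.sqrt(var)

rng = np.random.default_rng(7)
true = 50.0 + np.zeros(40)
true[15:25] += 30.0                                     # flat background + peak
counts = rng.poisson(true).astype(float)
bg = np.r_[np.arange(0, 10), np.arange(30, 40)]
print(peak_area_uncertainty(counts, bg, np.arange(15, 25)))
```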

  14. A Complex Approach to UXO Discrimination: Combining Advanced EMI Forward Models and Statistical Signal Processing

    DTIC Science & Technology

    2012-01-01

    ... detection and discrimination at live-UXO sites. Namely, under this project we first developed and implemented advanced, physically complete forward EMI models, such as the ... Shubitidze of Sky Research and Dartmouth College conceived, implemented, and tested most of the approaches presented in this report. He developed ...

  15. Binary Programming Models of Spatial Pattern Recognition: Applications in Remote Sensing Image Analysis

    DTIC Science & Technology

    1991-12-01

    [Table of contents excerpt: 2.6.1 Multi-Shape Detection; 2.6.2 Line Segment Extraction and Re-Combination; 2.6.3 Planimetric Feature Extraction; 2.6.4 Line Segment Extraction From Statistical Texture Analysis; 2.6.5 Edge Following as Graph ...] ... image after image, could benefit due to the fact that major spatial characteristics of subregions could be extracted, and minor spatial changes could be ...

  16. [Correlation of chromosome 1p and 19q status and expression of R132H mutant IDH1 protein in oligodendroglial tumors].

    PubMed

    Yao, Kun; Duan, Zejun; Hu, Zeliang; Bian, Yu; Qi, Xueling

    2014-10-01

    To correlate the presence of chromosome 1p/19q deletion with the expression status of R132H mutant IDH1 in oligodendroglial tumors, and to explore molecular markers for predicting the chemosensitivity of oligodendroglial tumors. The study included 75 oligodendroglial tumors (38 oligodendrogliomas and 37 oligoastrocytomas). Immunohistochemistry was used to detect the expression of R132H mutant IDH1 protein, and fluorescence in situ hybridization (FISH) was employed to detect 1p/19q deletion. Deletion of chromosome 1p and/or 19q was detected in 37 cases (37/75, 49.3%), among which co-deletion of 1p and 19q was seen in 34 cases (closely correlated, P < 0.01). Oligodendrogliomas WHO II had a slightly higher deletion rate than oligodendrogliomas WHO III, although without statistical significance. Oligodendrogliomas WHO II and WHO III had a significantly higher deletion rate of chromosome 1p/19q than oligoastrocytomas WHO II and WHO III (P < 0.05). While combined loss of 1p/19q was always detected in oligodendrogliomas when FISH was positive, isolated 1p or 19q deletion was found only in oligoastrocytomas. The expression of R132H mutant IDH1 was detected in 51 of 75 cases (68.0%), with oligodendrogliomas having a higher positive rate than oligoastrocytomas. Statistical analysis demonstrated a significant correlation between the expression of R132H mutant IDH1 protein and the presence of combined 1p/19q deletion in oligodendrogliomas (P < 0.05). A significant correlation was observed between the expression of R132H mutant protein and 1p/19q LOH. Expression of R132H mutant IDH1 protein is a potential biomarker for predicting the presence of 1p/19q deletion in oligodendrogliomas.

  17. Comparison of 1.5- and 3-T MR imaging for evaluating the articular cartilage of the knee.

    PubMed

    Van Dyck, Pieter; Kenis, Christoph; Vanhoenacker, Filip M; Lambrecht, Valérie; Wouters, Kristien; Gielen, Jan L; Dossche, Lieven; Parizel, Paul M

    2014-06-01

    The aim of this prospective study was to compare routine MRI scans of the knee at 1.5 and 3 T obtained in the same individuals in terms of their performance in the diagnosis of cartilage lesions. One hundred patients underwent MRI of the knee at 1.5 and 3 T and subsequent knee arthroscopy. All MR examinations consisted of multiplanar 2D turbo spin-echo sequences. Three radiologists independently graded all articular surfaces of the knee joint seen at MRI. With arthroscopy as the reference standard, the sensitivity, specificity, and accuracy of 1.5- and 3-T MRI for detecting cartilage lesions, and the proportion of correctly graded cartilage lesions within the knee joint, were determined and compared using resampling statistics. For all readers and surfaces combined, the respective sensitivity, specificity, and accuracy for detecting all grades of cartilage lesions in the knee joint were 60, 96, and 87% at 1.5 T and 69, 96, and 90% at 3 T. There was a statistically significant improvement in sensitivity (p < 0.05), but not in specificity or accuracy (n.s.), for the detection of cartilage lesions at 3 T. There was also a statistically significant (p < 0.05) improvement in the proportion of correctly graded cartilage lesions at 3 T as compared to 1.5 T. A 3-T MR protocol significantly improves diagnostic performance in detecting cartilage lesions within the knee joint when compared with a similar protocol performed at 1.5 T. Level of evidence: III.

  18. Gene Level Meta-Analysis of Quantitative Traits by Functional Linear Models.

    PubMed

    Fan, Ruzong; Wang, Yifan; Boehnke, Michael; Chen, Wei; Li, Yun; Ren, Haobo; Lobach, Iryna; Xiong, Momiao

    2015-08-01

    Meta-analysis of genetic data must account for differences among studies including study designs, markers genotyped, and covariates. The effects of genetic variants may differ from population to population, i.e., heterogeneity. Thus, meta-analysis of combining data of multiple studies is difficult. Novel statistical methods for meta-analysis are needed. In this article, functional linear models are developed for meta-analyses that connect genetic data to quantitative traits, adjusting for covariates. The models can be used to analyze rare variants, common variants, or a combination of the two. Both likelihood-ratio test (LRT) and F-distributed statistics are introduced to test association between quantitative traits and multiple variants in one genetic region. Extensive simulations are performed to evaluate empirical type I error rates and power performance of the proposed tests. The proposed LRT and F-distributed statistics control the type I error very well and have higher power than the existing methods of the meta-analysis sequence kernel association test (MetaSKAT). We analyze four blood lipid levels in data from a meta-analysis of eight European studies. The proposed methods detect more significant associations than MetaSKAT and the P-values of the proposed LRT and F-distributed statistics are usually much smaller than those of MetaSKAT. The functional linear models and related test statistics can be useful in whole-genome and whole-exome association studies. Copyright © 2015 by the Genetics Society of America.

  19. Economic correlates of violent death rates in forty countries, 1962-2008: A cross-typological analysis.

    PubMed

    Lee, Bandy X; Marotta, Phillip L; Blay-Tofey, Morkeh; Wang, Winnie; de Bourmont, Shalila

    2014-01-01

    Our goal was to identify whether there might be advantages to combining two major public health concerns, i.e., homicides and suicides, in an analysis with well-established macro-level economic determinants, i.e., unemployment and inequality. Mortality data, unemployment statistics, and inequality measures were obtained for 40 countries for the years 1962-2008. Rates of combined homicide and suicide, the ratio of suicide to combined violent death, and the ratio between homicide and suicide were graphed and analyzed. A fixed effects regression model was then fitted for unemployment rates and Gini coefficients on homicide, suicide, and combined death rates. For a majority of nation states, suicide comprised a substantial proportion (mean 75.51%; range 0-99%) of the combined rate of homicide and suicide. When combined, a small but significant relationship emerged between the logged Gini coefficient and combined death rates (0.0066, p < 0.05), suggesting that the combined rate improves the ability to detect a significant relationship compared to either rate measurement alone. Results were duplicated by age group, whereby combining death rates into a single measure improved statistical power, provided that the association was strong. Violent deaths, when combined, were associated with an increase in unemployment and an increase in the Gini coefficient, creating a more robust variable. As the effects of macro-level factors (e.g., social and economic policies) on violent death rates in a population are shown to be more significant than those of micro-level influences (e.g., individual characteristics), these associations may be useful to discover. An expansion of socioeconomic variables and the inclusion of other forms of violence in future research could help elucidate long-term trends.

  20. Economic correlates of violent death rates in forty countries, 1962–2008: A cross-typological analysis

    PubMed Central

    Lee, Bandy X.; Marotta, Phillip L.; Blay-Tofey, Morkeh; Wang, Winnie; de Bourmont, Shalila

    2015-01-01

    Objectives Our goal was to identify whether there might be advantages to combining two major public health concerns, i.e., homicides and suicides, in an analysis with well-established macro-level economic determinants, i.e., unemployment and inequality. Methods Mortality data, unemployment statistics, and inequality measures were obtained for 40 countries for the years 1962–2008. Rates of combined homicide and suicide, the ratio of suicide to combined violent death, and the ratio between homicide and suicide were graphed and analyzed. A fixed effects regression model was then fitted for unemployment rates and Gini coefficients on homicide, suicide, and combined death rates. Results For a majority of nation states, suicide comprised a substantial proportion (mean 75.51%; range 0–99%) of the combined rate of homicide and suicide. When combined, a small but significant relationship emerged between the logged Gini coefficient and combined death rates (0.0066, p < 0.05), suggesting that the combined rate improves the ability to detect a significant relationship compared to either rate measurement alone. Results were duplicated by age group, whereby combining death rates into a single measure improved statistical power, provided that the association was strong. Conclusions Violent deaths, when combined, were associated with an increase in unemployment and an increase in the Gini coefficient, creating a more robust variable. As the effects of macro-level factors (e.g., social and economic policies) on violent death rates in a population are shown to be more significant than those of micro-level influences (e.g., individual characteristics), these associations may be useful to discover. An expansion of socioeconomic variables and the inclusion of other forms of violence in future research could help elucidate long-term trends. PMID:26028985

  1. Robust automatic line scratch detection in films.

    PubMed

    Newson, Alasdair; Almansa, Andrés; Gousseau, Yann; Pérez, Patrick

    2014-03-01

    Line scratch detection in old films is a particularly challenging problem due to the variable spatiotemporal characteristics of this defect. Some of the main problems include sensitivity to noise and texture, and false detections due to thin vertical structures belonging to the scene. We propose a robust and automatic algorithm for frame-by-frame line scratch detection in old films, as well as a temporal algorithm for the filtering of false detections. In the frame-by-frame algorithm, we relax some of the hypotheses used in previous algorithms in order to detect a wider variety of scratches. This step's robustness and lack of external parameters are ensured by the combined use of an a contrario methodology and local statistical estimation. In this manner, over-detection in textured or cluttered areas is greatly reduced. The temporal filtering algorithm eliminates false detections due to thin vertical structures by exploiting the coherence of their motion with that of the underlying scene. Experiments demonstrate the ability of the resulting detection procedure to deal with difficult situations, in particular in the presence of noise, texture, and slanted or partial scratches. Comparisons show significant advantages over previous work.

  2. Safety Concerns Reported by Patients Identified in a Collaborative Signal Detection Workshop using VigiBase: Results and Reflections from Lareb and Uppsala Monitoring Centre.

    PubMed

    Watson, Sarah; Chandler, Rebecca E; Taavola, Henric; Härmark, Linda; Grundmark, Birgitta; Zekarias, Alem; Star, Kristina; van Hunsel, Florence

    2018-02-01

    Patient reporting in pharmacovigilance is important and contributes to signal detection. However, descriptions of methodologies for using patient reports in signal detection are scarce, and published experiences of how patient reports are used in pharmacovigilance are limited to a few individual countries. Our objective was to explore the contribution of patient reports to global signal detection in VigiBase. Data were retrieved from VigiBase in September 2016. Drug-event-combination series were restricted to those with >50% patient reports, defined as reporter type "Consumer/non-health professional" per E2B reporting standard. vigiRank was applied to patient reports to prioritize combinations for assessment. Product information for healthcare professionals (HCPs) as well as patient information leaflets (PILs) were used as reference for information on adverse drug reactions (ADRs). Staff from the Uppsala Monitoring Centre and the Netherlands Pharmacovigilance Centre Lareb categorized the combinations. Potential signals proceeded to a more in-depth clinical review to determine whether the safety concern should be communicated as a "signal." Of the 212 combinations assessed, 20 (9%) resulted in eight signals communicated within the World Health Organization (WHO) programme for international drug monitoring. Review of PILs revealed insufficient ADR descriptions for patients and examples of poor consistency with product information for HCPs. Patient narratives provided details regarding the experience and impact of ADRs and evidence that patients make causality and personal risk assessments. Safety concerns described in patient reports can be identified in a global database including previously unknown ADRs as well as new aspects of known ADRs. Patient reports provide unique information valuable in signal assessment and should be included in signal detection. Novel approaches to highlighting patient reports in statistical signal detection can further improve the contribution of patient reports to pharmacovigilance.

  3. THE SCREENING AND RANKING ALGORITHM FOR CHANGE-POINTS DETECTION IN MULTIPLE SAMPLES

    PubMed Central

    Song, Chi; Min, Xiaoyi; Zhang, Heping

    2016-01-01

    The chromosome copy number variation (CNV) is the deviation of genomic regions from their normal copy number states, which may associate with many human diseases. Current genetic studies usually collect hundreds to thousands of samples to study the association between CNV and diseases. CNVs can be called by detecting the change-points in mean for sequences of array-based intensity measurements. Although multiple samples are of interest, the majority of the available CNV calling methods are single sample based. Only a few multiple sample methods have been proposed using scan statistics that are computationally intensive and designed toward either common or rare change-points detection. In this paper, we propose a novel multiple sample method by adaptively combining the scan statistic of the screening and ranking algorithm (SaRa), which is computationally efficient and is able to detect both common and rare change-points. We prove that asymptotically this method can find the true change-points with almost certainty and show in theory that multiple sample methods are superior to single sample methods when shared change-points are of interest. Additionally, we report extensive simulation studies to examine the performance of our proposed method. Finally, using our proposed method as well as two competing approaches, we attempt to detect CNVs in the data from the Primary Open-Angle Glaucoma Genes and Environment study, and conclude that our method is faster and requires less information while our ability to detect the CNVs is comparable or better. PMID:28090239
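
    The single-sample building block of SaRa admits a compact sketch: a local diagnostic contrasts the means of the two windows flanking each position, and local maxima of its absolute value above a threshold become candidate change-points. The adaptive combination across samples is the paper's contribution and is omitted here; the window size and threshold below are illustrative:

      import numpy as np

      def local_diagnostic(y, h):
          """D(x): mean of the h points right of x minus mean of the h points left."""
          D = np.zeros(len(y))
          for x in range(h, len(y) - h):
              D[x] = y[x:x + h].mean() - y[x - h:x].mean()
          return D

      def screen_candidates(D, h, threshold):
          """Keep positions where |D| is a local maximum within +/- h and exceeds
          the threshold; these are the candidates to be ranked and tested."""
          absD = np.abs(D)
          return [x for x in range(h, len(D) - h)
                  if absD[x] == absD[x - h:x + h + 1].max() and absD[x] > threshold]

      rng = np.random.default_rng(0)
      y = np.concatenate([np.zeros(200), np.full(100, 0.8), np.zeros(200)])
      y += rng.normal(0, 1, len(y))
      print(screen_candidates(local_diagnostic(y, h=30), h=30, threshold=0.5))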

  4. A Context-sensitive Approach to Anonymizing Spatial Surveillance Data: Impact on Outbreak Detection

    PubMed Central

    Cassa, Christopher A.; Grannis, Shaun J.; Overhage, J. Marc; Mandl, Kenneth D.

    2006-01-01

    Objective: The use of spatially based methods and algorithms in epidemiology and surveillance presents privacy challenges for researchers and public health agencies. We describe a novel method for anonymizing individuals in public health data sets by transposing their spatial locations through a process informed by the underlying population density. Further, we measure the impact of the skew on detection of spatial clustering as measured by a spatial scanning statistic. Design: Cases were emergency department (ED) visits for respiratory illness. Baseline ED visit data were injected with artificially created clusters ranging in magnitude, shape, and location. The geocoded locations were then transformed using a de-identification algorithm that accounts for the local underlying population density. Measurements: A total of 12,600 separate weeks of case data with artificially created clusters were combined with control data and the impact on detection of spatial clustering identified by a spatial scan statistic was measured. Results: The anonymization algorithm produced an expected skew of cases that resulted in high values of data set k-anonymity. De-identification that moves points an average distance of 0.25 km lowers the spatial cluster detection sensitivity by less than 4% and lowers the detection specificity less than 1%. Conclusion: A population-density–based Gaussian spatial blurring markedly decreases the ability to identify individuals in a data set while only slightly decreasing the performance of a standardly used outbreak detection tool. These findings suggest new approaches to anonymizing data for spatial epidemiology and surveillance. PMID:16357353
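
    A hedged sketch of the central idea, population-density-informed Gaussian blurring: each case is displaced by 2D Gaussian noise whose scale shrinks as the local population density grows, so dense areas need less displacement for the same k-anonymity. The sigma formula is an illustrative choice, not the paper's calibrated one:

      import numpy as np

      def anonymize(points_xy, local_density, k=5, rng=None):
          """points_xy: (n, 2) case coordinates in km; local_density: persons per
          km^2 around each case. Sigma ~ radius of a disc holding ~k persons."""
          rng = rng or np.random.default_rng()
          sigma = np.sqrt(k / (np.pi * np.asarray(local_density, float)))
          return points_xy + rng.normal(0.0, sigma[:, None], size=points_xy.shape)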

  5. Untargeted Metabolic Quantitative Trait Loci Analyses Reveal a Relationship between Primary Metabolism and Potato Tuber Quality

    PubMed Central

    Carreno-Quintero, Natalia; Acharjee, Animesh; Maliepaard, Chris; Bachem, Christian W.B.; Mumm, Roland; Bouwmeester, Harro; Visser, Richard G.F.; Keurentjes, Joost J.B.

    2012-01-01

    Recent advances in -omics technologies such as transcriptomics, metabolomics, and proteomics along with genotypic profiling have permitted dissection of the genetics of complex traits represented by molecular phenotypes in nonmodel species. To identify the genetic factors underlying variation in primary metabolism in potato (Solanum tuberosum), we have profiled primary metabolite content in a diploid potato mapping population, derived from crosses between S. tuberosum and wild relatives, using gas chromatography-time of flight-mass spectrometry. In total, 139 polar metabolites were detected, of which we identified metabolite quantitative trait loci for approximately 72% of the detected compounds. In order to obtain an insight into the relationships between metabolic traits and classical phenotypic traits, we also analyzed statistical associations between them. The combined analysis of genetic information through quantitative trait locus coincidence and the application of statistical learning methods provide information on putative indicators associated with the alterations in metabolic networks that affect complex phenotypic traits. PMID:22223596

  6. Detection of colorectal neoplasia: Combination of eight blood-based, cancer-associated protein biomarkers.

    PubMed

    Wilhelmsen, Michael; Christensen, Ib J; Rasmussen, Louise; Jørgensen, Lars N; Madsen, Mogens R; Vilandt, Jesper; Hillig, Thore; Klaerke, Michael; Nielsen, Knud T; Laurberg, Søren; Brünner, Nils; Gawel, Susan; Yang, Xiaoqing; Davis, Gerard; Heijboer, Annemieke; Martens, Frans; Nielsen, Hans J

    2017-03-15

    Serological biomarkers may be an option for early detection of colorectal cancer (CRC). The present study assessed eight cancer-associated protein biomarkers in plasma from subjects undergoing first time ever colonoscopy due to symptoms attributable to colorectal neoplasia. Plasma AFP, CA19-9, CEA, hs-CRP, CyFra21-1, Ferritin, Galectin-3 and TIMP-1 were determined in EDTA-plasma using the Abbott ARCHITECT® automated immunoassay platform. Primary endpoints were detection of (i) CRC and high-risk adenoma and (ii) CRC. Logistic regression was performed. Final reduced models were constructed selecting the four biomarkers with the highest likelihood scores. Subjects (N = 4,698) were consecutively included during 2010-2012. Colonoscopy detected 512 CRC patients, 319 colonic cancer and 193 rectal cancer. Extra colonic malignancies were detected in 177 patients, 689 had adenomas of which 399 were high-risk, 1,342 had non-neoplastic bowel disease and 1,978 subjects had 'clean' colorectum. Univariable analysis demonstrated that all biomarkers were statistically significant. Multivariate logistic regression demonstrated that the blood-based biomarkers in combination significantly predicted the endpoints. The reduced model resulted in the selection of CEA, hs-CRP, CyFra21-1 and Ferritin for the two endpoints; AUCs were 0.76 and 0.84, respectively. The positive predictive value at 90% sensitivity was 25% for endpoint 1 and the negative predictive value was 93%. For endpoint 2, the positive predictive value was 18% and the negative predictive value was 97%. Combinations of serological protein biomarkers provided a significant identification of subjects with high risk of the presence of colorectal neoplasia. The present set of biomarkers could become an important adjunct in early detection of CRC. © 2016 UICC.
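
    In the spirit of the reduced model above, a sketch of combining four plasma markers in a logistic regression scored by AUC; the data are synthetic and the marker order merely follows the abstract, with no attempt to match the reported coefficients:

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(1)
      X = rng.normal(size=(1000, 4))     # columns: CEA, hs-CRP, CyFra21-1, Ferritin
      y = rng.binomial(1, 1 / (1 + np.exp(-(X @ np.array([1.0, 0.5, 0.8, 0.3])))))

      clf = LogisticRegression().fit(X, y)
      print("AUC:", roc_auc_score(y, clf.predict_proba(X)[:, 1]))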

  7. Methods for Assessment of Memory Reactivation.

    PubMed

    Liu, Shizhao; Grosmark, Andres D; Chen, Zhe

    2018-04-13

    It has been suggested that reactivation of previously acquired experiences or stored information in declarative memories in the hippocampus and neocortex contributes to memory consolidation and learning. Understanding memory consolidation depends crucially on the development of robust statistical methods for assessing memory reactivation. To date, several statistical methods have been established for assessing memory reactivation based on bursts of ensemble neural spike activity during offline states. Using population-decoding methods, we propose a new statistical metric, the weighted distance correlation, to assess hippocampal memory reactivation (i.e., spatial memory replay) during quiet wakefulness and slow-wave sleep. The new metric can be combined with an unsupervised population decoding analysis, which is invariant to latent state labeling and allows us to detect statistical dependency beyond linearity in memory traces. We validate the new metric using two rat hippocampal recordings in spatial navigation tasks. Our proposed analysis framework may have a broader impact on assessing memory reactivations in other brain regions under different behavioral tasks.
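
    For orientation, a sketch of plain (unweighted) distance correlation, the quantity underlying the proposed weighted metric; the replay-specific weighting is the paper's contribution and is not reproduced:

      import numpy as np
      from scipy.spatial.distance import pdist, squareform

      def distance_correlation(x, y):
          """Sample distance correlation of two equal-length samples."""
          def centered(a):
              d = squareform(pdist(np.asarray(a, float).reshape(len(a), -1)))
              return d - d.mean(0) - d.mean(1)[:, None] + d.mean()
          A, B = centered(x), centered(y)
          dcov2 = (A * B).mean()
          return np.sqrt(dcov2 / np.sqrt((A * A).mean() * (B * B).mean()))

      t = np.linspace(0, 1, 200)
      print(distance_correlation(t, np.sin(2 * np.pi * t)))  # nonlinear dependence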

  8. Detection rates of the MODIS active fire product in the United States

    USGS Publications Warehouse

    Hawbaker, T.J.; Radeloff, V.C.; Syphard, A.D.; Zhu, Z.; Stewart, S.I.

    2008-01-01

    MODIS active fire data offer new information about global fire patterns. However, uncertainties in detection rates can render satellite-derived fire statistics difficult to interpret. We evaluated the MODIS 1 km daily active fire product to quantify detection rates for both Terra and Aqua MODIS sensors, examined how cloud cover and fire size affected detection rates, and estimated how detection rates varied across the United States. MODIS active fire detections were compared to 361 reference fires (≥18 ha) that had been delineated using pre- and post-fire Landsat imagery. Reference fires were considered detected if at least one MODIS active fire pixel occurred within 1 km of the edge of the fire. When active fire data from both Aqua and Terra were combined, 82% of all reference fires were found, but detection rates were less for Aqua and Terra individually (73% and 66% respectively). Fires not detected generally had more cloudy days, but not when the Aqua data were considered exclusively. MODIS detection rates decreased with fire size, and the size at which 50% of all fires were detected was 105 ha when combining Aqua and Terra (195 ha for Aqua and 334 ha for Terra alone). Across the United States, detection rates were greatest in the West, lower in the Great Plains, and lowest in the East. The MODIS active fire product captures large fires in the U.S. well, but may under-represent fires in areas with frequent cloud cover or rapidly burning, small, and low-intensity fires. We recommend that users of the MODIS active fire data perform individual validations to ensure that all relevant fires are included. © 2008 Elsevier Inc. All rights reserved.

  9. Higher-order statistical moments and a procedure that detects potentially anomalous years as two alternative methods describing alterations in continuous environmental data

    USGS Publications Warehouse

    Arismendi, Ivan; Johnson, Sherri L.; Dunham, Jason B.

    2015-01-01

    Statistics of central tendency and dispersion may not capture relevant or desired characteristics of the distribution of continuous phenomena and, thus, they may not adequately describe temporal patterns of change. Here, we present two methodological approaches that can help to identify temporal changes in environmental regimes. First, we use higher-order statistical moments (skewness and kurtosis) to examine potential changes of empirical distributions at decadal extents. Second, we adapt a statistical procedure combining a non-metric multidimensional scaling technique and higher density region plots to detect potentially anomalous years. We illustrate the use of these approaches by examining long-term stream temperature data from minimally and highly human-influenced streams. In particular, we contrast predictions about thermal regime responses to changing climates and human-related water uses. Using these methods, we effectively diagnose years with unusual thermal variability and patterns in variability through time, as well as spatial variability linked to regional and local factors that influence stream temperature. Our findings highlight the complexity of responses of thermal regimes of streams and reveal their differential vulnerability to climate warming and human-related water uses. The two approaches presented here can be applied with a variety of other continuous phenomena to address historical changes, extreme events, and their associated ecological responses.
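
    The first approach above reduces to summarizing the record's higher-order moments per decade; a sketch with hypothetical columns (date, temp_c) for a daily stream temperature series:

      import pandas as pd
      from scipy.stats import kurtosis, skew

      df = pd.read_csv("stream_temperature.csv", parse_dates=["date"])
      df["decade"] = (df["date"].dt.year // 10) * 10
      print(df.groupby("decade")["temp_c"].agg(
          mean="mean", sd="std",
          skewness=lambda s: skew(s, bias=False),
          kurt=lambda s: kurtosis(s, bias=False),   # excess kurtosis
      ))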

  10. Normal Distribution of CD8+ T-Cell-Derived ELISPOT Counts within Replicates Justifies the Reliance on Parametric Statistics for Identifying Positive Responses.

    PubMed

    Karulin, Alexey Y; Caspell, Richard; Dittrich, Marcus; Lehmann, Paul V

    2015-03-02

    Accurate assessment of positive ELISPOT responses for low frequencies of antigen-specific T-cells is controversial. In particular, it is still unknown whether ELISPOT counts within replicate wells follow a theoretical distribution function, and thus whether high power parametric statistics can be used to discriminate between positive and negative wells. We studied experimental distributions of spot counts for up to 120 replicate wells of IFN-γ production by CD8+ T-cells responding to EBV LMP2A (426-434) peptide in human PBMC. The cells were tested in serial dilutions covering a wide range of average spot counts per condition, from just a few to hundreds of spots per well. Statistical analysis of the data using diagnostic Q-Q plots and the Shapiro-Wilk normality test showed that, across the entire dynamic range of ELISPOT, spot counts within replicate wells followed a normal distribution. This result implies that the Student t-Test and ANOVA are suited to identify positive responses. We also show experimentally that borderline responses can be reliably detected by involving more replicate wells, plating higher numbers of PBMC, addition of IL-7, or a combination of these. Furthermore, we have experimentally verified that the number of replicates needed for detection of weak responses can be calculated using parametric statistics.
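
    A sketch of the testing logic this result licenses: check the replicate spot counts with the Shapiro-Wilk test, then compare antigen-stimulated against medium-control wells with a one-sided t-test (synthetic counts; replicate numbers are illustrative):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      antigen = rng.normal(35, 6, size=12)   # spot counts, antigen-stimulated wells
      control = rng.normal(20, 5, size=12)   # medium-only control wells

      print("Shapiro-Wilk p:", stats.shapiro(antigen).pvalue)
      t, p = stats.ttest_ind(antigen, control, alternative="greater")
      print("one-sided t-test p:", p)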

  11. Alternative indices of glucose homeostasis as biochemical diagnostic tests for abnormal glucose tolerance in an African setting.

    PubMed

    Kengne, Andre Pascal; Erasmus, Rajiv T; Levitt, Naomi S; Matsha, Tandi E

    2017-04-01

    Accurate diabetes diagnosis is important in Africa, where rates are increasing and the disease remains largely undiagnosed. The cumbersome oral glucose tolerance test (OGTT) remains the reference standard, while alternative diagnostic methods are not yet established in Africans. We assessed the ability of fasting plasma glucose (FPG), HbA1c and fructosamine to diagnose OGTT-based abnormal glucose tolerance in mixed-ancestry South Africans. Mixed-ancestry adults residing in Cape Town were examined between February and November 2015. OGTT values were used to classify glucose tolerance status as: screen-detected diabetes, prediabetes, dysglycaemia (combination of diabetes and prediabetes) and normal glucose tolerance. Of the 793 participants included, 65 (8.2%) had screen-detected diabetes, 157 (19.8%) prediabetes and 571 (72.0%) normal glucose tolerance. Correlations of FPG and 2-h glucose with HbA1c (r=0.51 and 0.52) were higher than those with fructosamine (0.34 and 0.30), both p<0.0001. The highest c-statistic for the prediction of abnormal glucose tolerance was recorded with 2-h glucose [c-statistic=0.997 (screen-detected diabetes), 0.979 (prediabetes) and 0.984 (dysglycaemia)] and the lowest with fructosamine (0.865, 0.596 and 0.677). At recommended or data-specific optimal cut-offs, no combination of FPG, HbA1c and fructosamine did better than 2-h glucose, while FPG was better than HbA1c and fructosamine on a range of performance measures. Abnormal glucose tolerance in this population is overwhelmingly expressed through abnormalities in 2-h glucose, and no combination of FPG, HbA1c and fructosamine was effective at accurately discriminating OGTT-defined abnormal glucose tolerance. The tested non-glucose-based strategies are unreliable alternatives to OGTT for dysglycaemia diagnosis in this population. Copyright © 2017 Primary Care Diabetes Europe. Published by Elsevier Ltd. All rights reserved.

  12. The statistical properties of vortex flows in the solar atmosphere

    NASA Astrophysics Data System (ADS)

    Wedemeyer, Sven; Kato, Yoshiaki; Steiner, Oskar

    2015-08-01

    Rotating magnetic field structures associated with vortex flows on the Sun, also known as “magnetic tornadoes”, may serve as waveguides for MHD waves and transport mass and energy upwards through the atmosphere. Magnetic tornadoes may therefore potentially contribute to the heating of the upper atmospheric layers in quiet Sun regions. Magnetic tornadoes are observed over a large range of spatial and temporal scales in different layers in quiet Sun regions. However, their statistical properties such as size, lifetime, and rotation speed are not well understood yet because observations of these small-scale events are technically challenging and limited by the spatial and temporal resolution of current instruments. Better statistics based on a combination of high-resolution observations and state-of-the-art numerical simulations is the key to a reliable estimate of the energy input in the lower layers and of the energy deposition in the upper layers. For this purpose, we have developed a fast and reliable tool for the determination and visualization of the flow field in (observed) image sequences. This technique, which combines local correlation tracking (LCT) and line integral convolution (LIC), facilitates the detection and study of dynamic events on small scales, such as propagating waves. Here, we present statistical properties of vortex flows in different layers of the solar atmosphere and try to give realistic estimates of the energy flux which is potentially available for heating of the upper solar atmosphere.

  13. Efficient strategy for detecting gene × gene joint action and its application in schizophrenia.

    PubMed

    Won, Sungho; Kwon, Min-Seok; Mattheisen, Manuel; Park, Suyeon; Park, Changsoon; Kihara, Daisuke; Cichon, Sven; Ophoff, Roel; Nöthen, Markus M; Rietschel, Marcella; Baur, Max; Uitterlinden, Andre G; Hofmann, A; Lange, Christoph

    2014-01-01

    We propose a new approach to detect gene × gene joint action in genome-wide association studies (GWASs) for case-control designs. This approach offers an exhaustive search for all two-way joint action (including, as a special case, single gene action) that is computationally feasible at the genome-wide level and has reasonable statistical power under most genetic models. We found that the presence of any gene × gene joint action may imply differences in three types of genetic components: the minor allele frequencies and the amounts of Hardy-Weinberg disequilibrium at each locus may differ between cases and controls, and the degree of linkage disequilibrium between the two genetic loci may differ between cases and controls. Using Fisher's method, it is possible to combine the different sources of genetic information in an overall test for detecting gene × gene joint action. The proposed statistical analysis is efficient and its simplicity makes it applicable to GWASs. In the current study, we applied the proposed approach to a GWAS on schizophrenia and found several potential gene × gene interactions. Our application illustrates the practical advantage of the proposed method. © 2013 WILEY PERIODICALS, INC.
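
    A minimal sketch of Fisher's method as used above to pool the three sources of evidence (differences in minor allele frequency, Hardy-Weinberg disequilibrium, and linkage disequilibrium); the example p-values are placeholders:

      import numpy as np
      from scipy.stats import chi2

      def fishers_method(pvalues):
          """X = -2 * sum(ln p_i) ~ chi-square with 2k df under the joint null."""
          p = np.asarray(pvalues, float)
          stat = -2.0 * np.log(p).sum()
          return stat, chi2.sf(stat, df=2 * len(p))

      print(fishers_method([0.03, 0.20, 0.01]))  # placeholder locus-pair p-values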

  14. An online sleep apnea detection method based on recurrence quantification analysis.

    PubMed

    Nguyen, Hoa Dinh; Wilkins, Brek A; Cheng, Qi; Benjamin, Bruce Allen

    2014-07-01

    This paper introduces an online sleep apnea detection method based on heart rate complexity as measured by recurrence quantification analysis (RQA) statistics of heart rate variability (HRV) data. RQA statistics can capture nonlinear dynamics of a complex cardiorespiratory system during obstructive sleep apnea. In order to obtain a more robust measurement of the nonstationarity of the cardiorespiratory system, we use several fixed-amount-of-neighbors thresholds for the recurrence plot calculation. We integrate a feature selection algorithm based on conditional mutual information to select the most informative RQA features for classification, and hence to speed up the real-time classification process without degrading the performance of the system. Two types of binary classifiers, i.e., support vector machine and neural network, are used to differentiate apnea from normal sleep. A soft decision fusion rule is developed to combine the results of these classifiers in order to improve the classification performance of the whole system. Experimental results show that our proposed method achieves better classification results compared with the previous recurrence analysis-based approach. We also show that our method is flexible and a strong candidate for a real-time, efficient sleep apnea detection system.

  15. Detection of the kinematic Sunyaev–Zel'dovich effect with DES Year 1 and SPT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soergel, B.; Flender, S.; Story, K. T.

    Here, we detect the kinematic Sunyaev-Zel'dovich (kSZ) effect with a statistical significance of $4.2\sigma$ by combining a cluster catalogue derived from the first year data of the Dark Energy Survey (DES) with CMB temperature maps from the South Pole Telescope Sunyaev-Zel'dovich (SPT-SZ) Survey. This measurement is performed with a differential statistic that isolates the pairwise kSZ signal, providing the first detection of the large-scale, pairwise motion of clusters using redshifts derived from photometric data. By fitting the pairwise kSZ signal to a theoretical template we measure the average central optical depth of the cluster sample, $\bar{\tau}_e = (3.75 \pm 0.89) \cdot 10^{-3}$. We compare the extracted signal to realistic simulations and find good agreement with respect to the signal-to-noise, the constraint on $\bar{\tau}_e$, and the corresponding gas fraction. High-precision measurements of the pairwise kSZ signal with future data will be able to place constraints on the baryonic physics of galaxy clusters, and could be used to probe gravity on scales $\gtrsim 100$ Mpc.

  16. Detection of the kinematic Sunyaev–Zel'dovich effect with DES Year 1 and SPT

    DOE PAGES

    Soergel, B.; Flender, S.; Story, K. T.; ...

    2016-06-17

    Here, we detect the kinematic Sunyaev-Zel'dovich (kSZ) effect with a statistical significance of $4.2\sigma$ by combining a cluster catalogue derived from the first year data of the Dark Energy Survey (DES) with CMB temperature maps from the South Pole Telescope Sunyaev-Zel'dovich (SPT-SZ) Survey. This measurement is performed with a differential statistic that isolates the pairwise kSZ signal, providing the first detection of the large-scale, pairwise motion of clusters using redshifts derived from photometric data. By fitting the pairwise kSZ signal to a theoretical template we measure the average central optical depth of the cluster sample, $\bar{\tau}_e = (3.75 \pm 0.89) \cdot 10^{-3}$. We compare the extracted signal to realistic simulations and find good agreement with respect to the signal-to-noise, the constraint on $\bar{\tau}_e$, and the corresponding gas fraction. High-precision measurements of the pairwise kSZ signal with future data will be able to place constraints on the baryonic physics of galaxy clusters, and could be used to probe gravity on scales $\gtrsim 100$ Mpc.
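
    A hedged sketch of the pairwise estimator behind such a differential statistic, using the commonly cited Hand et al.-style geometric weights; separation binning, photometric-redshift corrections, and covariance estimation are all omitted:

      import numpy as np

      def pairwise_ksz_amplitude(T, r, cos_theta):
          """T: (n,) aperture temperatures; r: (n,) comoving distances;
          cos_theta: (n, n) cosines of angular separations between clusters."""
          i, j = np.triu_indices(len(T), k=1)
          cij = cos_theta[i, j]
          sep = np.sqrt(r[i]**2 + r[j]**2 - 2 * r[i] * r[j] * cij)
          c = (r[i] - r[j]) * (1 + cij) / (2 * sep)   # geometric pair weight
          # Real analyses evaluate this per bin of comoving separation `sep`.
          return -np.sum((T[i] - T[j]) * c) / np.sum(c**2)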

  17. Which Statistic Should Be Used to Detect Item Preknowledge When the Set of Compromised Items Is Known?

    PubMed

    Sinharay, Sandip

    2017-09-01

    Benefiting from item preknowledge is a major type of fraudulent behavior during educational assessments. Belov suggested the posterior shift statistic for detection of item preknowledge and showed its performance to be better on average than that of seven other statistics for detection of item preknowledge for a known set of compromised items. Sinharay suggested a statistic based on the likelihood ratio test for detection of item preknowledge; the advantage of the statistic is that its null distribution is known. Results from simulated and real data and adaptive and nonadaptive tests are used to demonstrate that the Type I error rate and power of the statistic based on the likelihood ratio test are very similar to those of the posterior shift statistic. Thus, the statistic based on the likelihood ratio test appears promising in detecting item preknowledge when the set of compromised items is known.

  18. DASS: efficient discovery and p-value calculation of substructures in unordered data.

    PubMed

    Hollunder, Jens; Friedel, Maik; Beyer, Andreas; Workman, Christopher T; Wilhelm, Thomas

    2007-01-01

    Pattern identification in biological sequence data is one of the main objectives of bioinformatics research. However, few methods are available for detecting patterns (substructures) in unordered datasets. Data mining algorithms mainly developed outside the realm of bioinformatics have been adapted for that purpose, but typically do not determine the statistical significance of the identified patterns. Moreover, these algorithms do not exploit the often modular structure of biological data. We present the algorithm DASS (Discovery of All Significant Substructures) that first identifies all substructures in unordered data (DASS(Sub)) in a manner that is especially efficient for modular data. In addition, DASS calculates the statistical significance of the identified substructures, for sets with at most one element of each type (DASS(P(set))), or for sets with multiple occurrences of elements (DASS(P(mset))). The power and versatility of DASS is demonstrated by four examples: combinations of protein domains in multi-domain proteins, combinations of proteins in protein complexes (protein subcomplexes), combinations of transcription factor target sites in promoter regions and evolutionarily conserved protein interaction subnetworks. The program code and additional data are available at http://www.fli-leibniz.de/tsb/DASS

  19. Procedures for determination of detection limits: application to high-performance liquid chromatography analysis of fat-soluble vitamins in human serum.

    PubMed

    Browne, Richard W; Whitcomb, Brian W

    2010-07-01

    Problems in the analysis of laboratory data commonly arise in epidemiologic studies in which biomarkers subject to lower detection thresholds are used. Various thresholds exist including limit of detection (LOD), limit of quantification (LOQ), and limit of blank (LOB). Choosing appropriate strategies for dealing with data affected by such limits relies on proper understanding of the nature of the detection limit and its determination. In this paper, we demonstrate experimental and statistical procedures generally used for estimating different detection limits according to standard procedures in the context of analysis of fat-soluble vitamins and micronutrients in human serum. Fat-soluble vitamins and micronutrients were analyzed by high-performance liquid chromatography with diode array detection. A simulated serum matrix blank was repeatedly analyzed for determination of LOB, parametrically by using the observed blank distribution as well as nonparametrically by using ranks. The LOD was determined by combining information regarding the LOB with data from repeated analysis of standard reference materials (SRMs) diluted to low levels (from the LOB to 2-3 times the LOB). The LOQ was determined experimentally by plotting the observed relative standard deviation (RSD) of SRM replicates against concentration, with the LOQ defined as the concentration at which the RSD equals 20%. Experimental approaches and example statistical procedures are given for determination of LOB, LOD, and LOQ. These quantities are reported for each measured analyte. For many analytes, there is considerable information available below the LOQ. Epidemiologic studies must understand the nature of these detection limits and how they have been estimated for appropriate treatment of affected data.
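
    The standard parametric estimates described above have simple closed forms (in the style of CLSI EP17); a sketch, where blank holds replicate blank measurements and low holds replicates of the diluted SRM:

      import numpy as np

      def lob_parametric(blank):        # 95th percentile under a normal assumption
          return np.mean(blank) + 1.645 * np.std(blank, ddof=1)

      def lob_nonparametric(blank):     # rank-based 95th percentile of the blanks
          return np.percentile(blank, 95)

      def lod(blank, low):              # LOB plus 1.645 SD of the low-level sample
          return lob_parametric(blank) + 1.645 * np.std(low, ddof=1)

      # LOQ: read off the RSD-vs-concentration plot as the concentration at which
      # the replicate RSD first falls to 20%.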

  20. The Objective Identification and Quantification of Interstitial Lung Abnormalities in Smokers.

    PubMed

    Ash, Samuel Y; Harmouche, Rola; Ross, James C; Diaz, Alejandro A; Hunninghake, Gary M; Putman, Rachel K; Onieva, Jorge; Martinez, Fernando J; Choi, Augustine M; Lynch, David A; Hatabu, Hiroto; Rosas, Ivan O; Estepar, Raul San Jose; Washko, George R

    2017-08-01

    Previous investigation suggests that visually detected interstitial changes in the lung parenchyma of smokers are highly clinically relevant and predict outcomes, including death. Visual subjective analysis to detect these changes is time-consuming, insensitive to subtle changes, and requires training to enhance reproducibility. Objective detection of such changes could provide a method of disease identification without these limitations. The goal of this study was to develop and test a fully automated image processing tool to objectively identify radiographic features associated with interstitial abnormalities in the computed tomography scans of a large cohort of smokers. An automated tool that uses local histogram analysis combined with distance from the pleural surface was used to detect radiographic features consistent with interstitial lung abnormalities in computed tomography scans from 2257 individuals from the Genetic Epidemiology of COPD study, a longitudinal observational study of smokers. The sensitivity and specificity of this tool was determined based on its ability to detect the visually identified presence of these abnormalities. The tool had a sensitivity of 87.8% and a specificity of 57.5% for the detection of interstitial lung abnormalities, with a c-statistic of 0.82, and was 100% sensitive and 56.7% specific for the detection of the visual subtype of interstitial abnormalities called fibrotic parenchymal abnormalities, with a c-statistic of 0.89. In smokers, a fully automated image processing tool is able to identify those individuals who have interstitial lung abnormalities with moderate sensitivity and specificity. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.

  1. A Fully Automated Method to Detect and Segment a Manufactured Object in an Underwater Color Image

    NASA Astrophysics Data System (ADS)

    Barat, Christian; Phlypo, Ronald

    2010-12-01

    We propose a fully automated active contours-based method for the detection and the segmentation of a moored manufactured object in an underwater image. Detection of objects in underwater images is difficult due to the variable lighting conditions and shadows on the object. The proposed technique is based on the information contained in the color maps and uses the visual attention method, combined with a statistical approach for the detection and an active contour for the segmentation of the object to overcome the above problems. In the classical active contour method the region descriptor is fixed and the convergence of the method depends on the initialization. With our approach, this dependence is overcome with an initialization using the visual attention results and a criterion to select the best region descriptor. This approach improves the convergence and the processing time while providing the advantages of a fully automated method.

  2. Multi-stage learning for robust lung segmentation in challenging CT volumes.

    PubMed

    Sofka, Michal; Wetzl, Jens; Birkbeck, Neil; Zhang, Jingdan; Kohlberger, Timo; Kaftan, Jens; Declerck, Jérôme; Zhou, S Kevin

    2011-01-01

    Simple algorithms for segmenting healthy lung parenchyma in CT are unable to deal with high density tissue common in pulmonary diseases. To overcome this problem, we propose a multi-stage learning-based approach that combines anatomical information to predict an initialization of a statistical shape model of the lungs. The initialization first detects the carina of the trachea, and uses this to detect a set of automatically selected stable landmarks on regions near the lung (e.g., ribs, spine). These landmarks are used to align the shape model, which is then refined through boundary detection to obtain fine-grained segmentation. Robustness is obtained through hierarchical use of discriminative classifiers that are trained on a range of manually annotated data of diseased and healthy lungs. We demonstrate fast detection (35s per volume on average) and segmentation of 2 mm accuracy on challenging data.

  3. Improved biliary detection and diagnosis through intelligent machine analysis.

    PubMed

    Logeswaran, Rajasvaran

    2012-09-01

    This paper reports on work undertaken to improve automated detection of bile ducts in magnetic resonance cholangiopancreatography (MRCP) images, with the objective of conducting preliminary classification of the images for diagnosis. The proposed I-BDeDIMA (Improved Biliary Detection and Diagnosis through Intelligent Machine Analysis) scheme is a multi-stage framework consisting of successive phases of image normalization, denoising, structure identification, object labeling, feature selection and disease classification. A combination of multiresolution wavelet, dynamic intensity thresholding, segment-based region growing, region elimination, statistical analysis and neural networks, is used in this framework to achieve good structure detection and preliminary diagnosis. Tests conducted on over 200 clinical images with known diagnosis have shown promising results of over 90% accuracy. The scheme outperforms related work in the literature, making it a viable framework for computer-aided diagnosis of biliary diseases. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  4. RAId_DbS: Peptide Identification using Database Searches with Realistic Statistics

    PubMed Central

    Alves, Gelio; Ogurtsov, Aleksey Y; Yu, Yi-Kuo

    2007-01-01

    Background The key to mass-spectrometry-based proteomics is peptide identification. A major challenge in peptide identification is to obtain realistic E-values when assigning statistical significance to candidate peptides. Results Using a simple scoring scheme, we propose a database search method with theoretically characterized statistics. Taking into account possible skewness in the random variable distribution and the effect of finite sampling, we provide a theoretical derivation for the tail of the score distribution. For every experimental spectrum examined, we collect the scores of peptides in the database, and find good agreement between the collected score statistics and our theoretical distribution. Using Student's t-tests, we quantify the degree of agreement between the theoretical distribution and the score statistics collected. These t-tests may be used to measure the reliability of the reported statistics. When combined with the reported P-value for a peptide hit under a score distribution model, this new measure prevents exaggerated statistics. Another feature of RAId_DbS is its capability of detecting multiple co-eluted peptides. The peptide identification performance and statistical accuracy of RAId_DbS are assessed and compared with several other search tools. The executables and data related to RAId_DbS are freely available upon request. PMID:17961253

  5. Order statistics applied to the most massive and most distant galaxy clusters

    NASA Astrophysics Data System (ADS)

    Waizmann, J.-C.; Ettori, S.; Bartelmann, M.

    2013-06-01

    In this work, we present an analytic framework for calculating the individual and joint distributions of the nth most massive or nth highest redshift galaxy cluster for a given survey characteristic, allowing us to formulate Λ cold dark matter (ΛCDM) exclusion criteria. We show that the cumulative distribution functions steepen with increasing order, giving them a higher constraining power with respect to the extreme value statistics. Additionally, we find that the order statistics in mass (being dominated by clusters at lower redshifts) is sensitive to the matter density and the normalization of the matter fluctuations, whereas the order statistics in redshift is particularly sensitive to the geometric evolution of the Universe. For a fixed cosmology, both order statistics are efficient probes of the functional shape of the mass function at the high-mass end. To allow a quick assessment of both order statistics, we provide fits as a function of the survey area that allow percentile estimation with an accuracy better than 2 per cent. Furthermore, we discuss the joint distributions in the two-dimensional case and find that for the combination of the largest and the second largest observation, it is most likely to find them realized with similar values with a broadly peaked distribution. When combining the largest observation with higher orders, it is more likely to find a larger gap between the observations, and when combining higher orders in general, the joint probability density function peaks more strongly. Having introduced the theory, we apply the order statistical analysis to the South Pole Telescope (SPT) massive cluster sample and the meta-catalogue of X-ray detected clusters of galaxies (MCXC) and find that the 10 most massive clusters in the sample are consistent with ΛCDM and the Tinker mass function. For the order statistics in redshift, we find a discrepancy between the data and the theoretical distributions, which could in principle indicate a deviation from the standard cosmology. However, we attribute this deviation to the uncertainty in the modelling of the SPT survey selection function. In turn, by assuming the ΛCDM reference cosmology, order statistics can also be utilized for consistency checks of the completeness of the observed sample and of the modelling of the survey selection function.
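
    The exclusion criteria rest on a simple order-statistics identity: if the number of clusters above mass m in the survey area is Poisson distributed with expectation N(>m) from the assumed mass function, the CDF of the r-th most massive cluster is a truncated Poisson sum, as sketched below (mass-function inputs assumed given):

      import numpy as np
      from scipy.stats import poisson

      def cdf_rth_largest(expected_counts_above_m, r):
          """P(M_(r) <= m) = P(at most r-1 clusters above m), evaluated on a grid
          of masses m for which expected_counts_above_m[i] = N(>m_i)."""
          return poisson.cdf(r - 1, np.asarray(expected_counts_above_m, float))

      # r = 1 recovers the extreme value statistics case: P = exp(-N(>m)).
      print(cdf_rth_largest([10.0, 3.0, 0.5, 0.05], r=1))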

  6. In-situ determination of metallic variation and multi-association in single particles by combining synchrotron microprobe, sequential chemical extraction and multivariate statistical analysis.

    PubMed

    Zhu, Yu-Min; Zhang, Hua; Fan, Shi-Suo; Wang, Si-Jia; Xia, Yi; Shao, Li-Ming; He, Pin-Jing

    2014-07-15

    Due to the heterogeneity of metal distribution, it is challenging to identify the speciation, source and fate of metals in solid samples at micro scales. To overcome these challenges single particles of air pollution control residues were detected in situ by synchrotron microprobe after each step of chemical extraction and analyzed by multivariate statistical analysis. Results showed that Pb, Cu and Zn co-existed as acid soluble fractions during chemical extraction, regardless of their individual distribution as chlorides or oxides in the raw particles. Besides the forms of Fe2O3, MnO2 and FeCr2O4, Fe, Mn, Cr and Ni were closely associated with each other, mainly as reducible fractions. In addition, the two groups of metals had interrelations with the Si-containing insoluble matrix. The binding could not be directly detected by micro-X-ray diffraction (μ-XRD) and XRD, suggesting their partial existence as amorphous forms or in the solid solution. The combined method on single particles can effectively determine metallic multi-associations and various extraction behaviors that could not be identified by XRD, μ-XRD or X-ray absorption spectroscopy. The results are useful for further source identification and migration tracing of heavy metals. Copyright © 2014 Elsevier B.V. All rights reserved.

  7. [Application of a mathematical algorithm for the detection of electroneuromyographic results in the pathogenesis study of facial dyskinesia].

    PubMed

    Gribova, N P; Iudel'son, Ia B; Golubev, V L; Abramenkova, I V

    2003-01-01

    To carry out a differential diagnosis of two facial dyskinesia (FD) models--facial hemispasm (FH) and facial paraspasm (FP)--a combined program of electroneuromyographic (ENMG) examination was created, using statistical analyses that included object identification based on a hybrid neural network with an adaptive fuzzy logic method, alongside standard statistical tests (Wilcoxon, Student). In FH, a lesion of the peripheral facial neuromotor apparatus with augmentation of interneuron function at the segmental and suprasegmental stem levels predominated. In FP, primary afferent strengthening in the mimic muscles was accompanied by increased motor neuron activity and reciprocal augmentation of the interneurons inhibiting the motor portion of the fifth cranial nerve (V). The mathematical algorithm for ENMG result recognition worked out in the study provides a precise differentiation of the two FD models and opens possibilities for differential diagnosis of other facial motor disorders.

  8. Baseline estimation in flame's spectra by using neural networks and robust statistics

    NASA Astrophysics Data System (ADS)

    Garces, Hugo; Arias, Luis; Rojas, Alejandro

    2014-09-01

    This work presents a baseline estimation method for flame spectra based on an artificial intelligence structure, a neural network, combining robust statistics with multivariate analysis to automatically discriminate measured wavelengths belonging to the continuous feature for model adaptation, thereby removing the restriction of having to measure the target baseline for training. The main contributions of this paper are: to analyze a flame spectra database by computing Jolliffe statistics from principal component analysis, detecting wavelengths not correlated with most of the measured data and therefore corresponding to the baseline; to systematically determine the optimal number of neurons in the hidden layers based on Akaike's Final Prediction Error; to estimate the baseline over the full wavelength range of the sampled spectra; and to train a neural network that generalizes the relation between measured and baseline spectra. The main application of our research is to compute total radiation with baseline information, allowing diagnosis of the combustion process state for optimization at early stages.

  9. Application of advanced cytometric and molecular technologies to minimal residual disease monitoring

    NASA Astrophysics Data System (ADS)

    Leary, James F.; He, Feng; Reece, Lisa M.

    2000-04-01

    Minimal residual disease monitoring presents a number of theoretical and practical challenges. Recently it has been possible to meet some of these challenges by combining a number of new advanced biotechnologies. Monitoring the number of residual tumor cells requires complex cocktails of molecular probes that collectively provide sensitivities of detection on the order of one residual tumor cell per million total cells. Ultra-high-speed, multiparameter flow cytometry is capable of analyzing cells at rates in excess of 100,000 cells/sec. Residual tumor selection marker cocktails can be optimized by use of receiver operating characteristic analysis. New data-minimizing techniques, when combined with multivariate statistical or neural network classifications of tumor cells, can more accurately predict residual tumor cell frequencies. The combination of these techniques can, under at least some circumstances, detect frequencies of tumor cells as low as one cell in a million with an accuracy of over 98 percent correct classification. Detection of mutations in tumor suppressor genes requires isolation of these rare tumor cells and single-cell DNA sequencing. Rare residual tumor cells can be isolated at the single-cell level by high-resolution single-cell sorting. Molecular characterization of tumor suppressor gene mutations can be accomplished using a combination of single-cell polymerase chain reaction amplification of specific gene sequences followed by TA cloning techniques and DNA sequencing. Mutations as small as a single base pair in a tumor suppressor gene of a single sorted tumor cell have been detected using these methods. Using new amplification procedures and DNA microarrays it should be possible to extend the capabilities shown in this paper to screening for multiple DNA mutations in tumor suppressor and other genes in small numbers of sorted metastatic tumor cells.

  10. Role of intercellular communications in breast cancer multicellular tumor spheroids after chemotherapy.

    PubMed

    Oktem, G; Bilir, A; Ayla, S; Yavasoglu, A; Goksel, G; Saydam, G; Uysal, A

    2006-01-01

    Tumor heterogeneity is an important feature that is especially involved in tumor aggressiveness. Multicellular tumor spheroids (MTS) may provide some benefits in different steps for investigation of the aggregation, organization, differentiation, and network formation of tumor cells in 3D space. This model offers a unique opportunity for improvements in the capability of a current strategy to detect the effect of an appropriate anticancer agent. The aim of this study was to investigate the cellular interactions and morphological changes following chemotherapy in a 3D breast cancer spheroid model. Distribution of the gap junction protein "connexin-43" and the tight junction protein "occludin" was investigated by immunohistochemistry. Cellular interactions were examined by using transmission and scanning electron microscopies as well as light microscopy with Giemsa staining after treating cells with doxorubicin, docetaxel, and doxorubicin/docetaxel combination. Statistical analyses showed significant changes and various alterations that were observed in all groups; however, the most prominent effect was detected in the doxorubicin/docetaxel combination group. Distinct composition as a vessel-like structure and a pseudoglandular pattern of control spheroids were detected in drug-administered groups. Immunohistochemical results were consistent with the ultrastructural changes. In conclusion, doxorubicin/docetaxel combination may be more effective than the single drug usage as shown in a 3D model. The MTS model has been found to be an appropriate and reliable method for the detection of the changes in the expression of cellular junction proteins as well as other cellular proteins occurring after chemotherapy. The MTS model can be used to validate the effects of various combinations or new chemotherapeutic agents as well as documentation of possible mechanisms of new drugs.

  11. Advancing the detection of steady-state visual evoked potentials in brain-computer interfaces

    NASA Astrophysics Data System (ADS)

    Abu-Alqumsan, Mohammad; Peer, Angelika

    2016-06-01

    Objective. Spatial filtering has proved to be a powerful pre-processing step in detection of steady-state visual evoked potentials and boosted typical detection rates both in offline analysis and online SSVEP-based brain-computer interface applications. State-of-the-art detection methods and the spatial filters used thereby share many common foundations as they all build upon the second order statistics of the acquired Electroencephalographic (EEG) data, that is, its spatial autocovariance and cross-covariance with what is assumed to be a pure SSVEP response. The present study aims at highlighting the similarities and differences between these methods. Approach. We consider the canonical correlation analysis (CCA) method as a basis for the theoretical and empirical (with real EEG data) analysis of the state-of-the-art detection methods and the spatial filters used thereby. We build upon the findings of this analysis and prior research and propose a new detection method (CVARS) that combines the power of the canonical variates and that of the autoregressive spectral analysis in estimating the signal and noise power levels. Main results. We found that the multivariate synchronization index method and the maximum contrast combination method are variations of the CCA method. All three methods were found to provide relatively unreliable detections in low signal-to-noise ratio (SNR) regimes. CVARS and the minimum energy combination methods were found to provide better estimates for different SNR levels. Significance. Our theoretical and empirical results demonstrate that the proposed CVARS method outperforms other state-of-the-art detection methods when used in an unsupervised fashion. Furthermore, when used in a supervised fashion, a linear classifier learned from a short training session is able to estimate the hidden user intention, including the idle state (when the user is not attending to any stimulus), rapidly, accurately and reliably.

  12. Sampling design considerations for demographic studies: a case of colonial seabirds

    USGS Publications Warehouse

    Kendall, William L.; Converse, Sarah J.; Doherty, Paul F.; Naughton, Maura B.; Anders, Angela; Hines, James E.; Flint, Elizabeth

    2009-01-01

    For the purposes of making many informed conservation decisions, the main goal for data collection is to assess population status and allow prediction of the consequences of candidate management actions. Reducing the bias and variance of estimates of population parameters reduces uncertainty in population status and projections, thereby reducing the overall uncertainty under which a population manager must make a decision. In capture-recapture studies, imperfect detection of individuals, unobservable life-history states, local movement outside study areas, and tag loss can cause bias or precision problems with estimates of population parameters. Furthermore, excessive disturbance to individuals during capture-recapture sampling may be of concern because disturbance may have demographic consequences. We address these problems using as an example a monitoring program for Black-footed Albatross (Phoebastria nigripes) and Laysan Albatross (Phoebastria immutabilis) nesting populations in the northwestern Hawaiian Islands. To mitigate these estimation problems, we describe a synergistic combination of sampling design and modeling approaches. Solutions include multiple capture periods per season and multistate, robust design statistical models, dead recoveries and incidental observations, telemetry and data loggers, buffer areas around study plots to neutralize the effect of local movements outside study plots, and double banding and statistical models that account for band loss. We also present a variation on the robust capture-recapture design and a corresponding statistical model that minimizes disturbance to individuals. For the albatross case study, this less invasive robust design was more time efficient and, when used in combination with a traditional robust design, reduced the standard error of detection probability by 14% with only two hours of additional effort in the field. These field techniques and associated modeling approaches are applicable to studies of most taxa being marked and in some cases have individually been applied to studies of birds, fish, herpetofauna, and mammals.

  13. Identifying significant gene‐environment interactions using a combination of screening testing and hierarchical false discovery rate control

    PubMed Central

    Shen, Li; Saykin, Andrew J.; Williams, Scott M.; Moore, Jason H.

    2016-01-01

    Although gene-environment (G×E) interactions play an important role in many biological systems, detecting these interactions within genome-wide data can be challenging due to the loss in statistical power incurred by multiple hypothesis correction. To address the challenge of poor power and the limitations of existing multistage methods, we recently developed a screening-testing approach for G×E interaction detection that combines elastic net penalized regression with joint estimation to support a single omnibus test for the presence of G×E interactions. In our original work on this technique, however, we did not assess type I error control or power and evaluated the method using just a single, small bladder cancer data set. In this paper, we extend the original method in two important directions and provide a more rigorous performance evaluation. First, we introduce a hierarchical false discovery rate approach to formally assess the significance of individual G×E interactions. Second, to support the analysis of truly genome-wide data sets, we incorporate a score statistic-based prescreening step to reduce the number of single nucleotide polymorphisms prior to fitting the first stage penalized regression model. To assess the statistical properties of our method, we compare the type I error rate and statistical power of our approach with competing techniques using both simple simulation designs as well as designs based on real disease architectures. Finally, we demonstrate the ability of our approach to identify biologically plausible SNP-education interactions relative to Alzheimer's disease status using genome-wide association study data from the Alzheimer's Disease Neuroimaging Initiative (ADNI). PMID:27578615
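
    A rough sketch of the screening-testing pipeline on synthetic inputs: marginal prescreening shrinks the SNP set, an elastic net is fit over main effects plus G×E products, and a flat Benjamini-Hochberg step stands in for the hierarchical FDR stage. The p-values passed to the FDR step are placeholders, since the paper derives them from its omnibus testing framework:

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from statsmodels.stats.multitest import multipletests

      def screen_and_test(X_snps, env, y, keep=50):
          # Stage 0: keep the SNPs most correlated (marginally) with the outcome.
          score = np.abs(np.corrcoef(X_snps.T, y)[:-1, -1])
          top = np.argsort(score)[::-1][:keep]
          # Stage 1: elastic net over main effects and G x E product terms.
          design = np.hstack([X_snps[:, top], env[:, None],
                              X_snps[:, top] * env[:, None]])
          enet = LogisticRegression(penalty="elasticnet", solver="saga",
                                    l1_ratio=0.5, C=0.1, max_iter=5000).fit(design, y)
          # Stage 2: FDR control on interaction p-values (placeholders here).
          pvals = np.random.default_rng(0).uniform(size=keep)
          reject, qvals, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
          return top, enet, reject

      rng = np.random.default_rng(3)
      X = rng.integers(0, 3, size=(300, 2000)).astype(float)   # SNP dosage matrix
      screen_and_test(X, rng.normal(size=300), rng.integers(0, 2, size=300))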

  14. Impact of Early and Late Visual Deprivation on the Structure of the Corpus Callosum: A Study Combining Thickness Profile with Surface Tensor-Based Morphometry.

    PubMed

    Shi, Jie; Collignon, Olivier; Xu, Liang; Wang, Gang; Kang, Yue; Leporé, Franco; Lao, Yi; Joshi, Anand A; Leporé, Natasha; Wang, Yalin

    2015-07-01

    Blindness represents a unique model to study how visual experience may shape the development of brain organization. Exploring how the structure of the corpus callosum (CC) reorganizes following visual deprivation is of particular interest due to its important functional implication in vision (e.g., via the splenium of the CC). Moreover, comparing early versus late visually deprived individuals has the potential to unravel the existence of a sensitive period for reshaping the CC structure. Here, we develop a novel framework to capture a complete set of shape differences in the CC between congenitally blind (CB), late blind (LB) and sighted control (SC) groups. The CCs were manually segmented from T1-weighted brain MRI and modeled by 3D tetrahedral meshes. We statistically compared the combination of local area and thickness at each point between subject groups. Differences in area are found using surface tensor-based morphometry; thickness is estimated by tracing the streamlines in the volumetric harmonic field. Group differences were assessed on this combined measure using Hotelling's T² test. Interestingly, we observed that the total callosal volume did not differ between the groups. However, our fine-grained analysis reveals significant differences mostly localized around the splenium areas between both blind groups and the sighted group (general effects of blindness) and, importantly, specific dissimilarities between the LB and CB groups, illustrating the existence of a sensitive period for reorganization. The new multivariate statistics also gave better effect sizes for detecting morphometric differences, relative to other statistics. They may boost statistical power for CC morphometric analyses.
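
    As a pointer to the statistics involved, here is a minimal sketch of a two-sample Hotelling's T² test on a combined (area, thickness) measure at a single vertex. The data, group sizes, and effect below are hypothetical; the study's actual pipeline (tetrahedral meshes, surface tensor-based morphometry, streamline thickness) is far more involved.

```python
import numpy as np
from scipy import stats

def hotelling_t2(x, y):
    """Two-sample Hotelling's T^2 test for a multivariate mean difference.

    x, y: (n_samples, p) arrays, e.g. per-subject (log area ratio, thickness)
    at one corpus-callosum vertex. Returns (T2, F, p_value).
    """
    nx, ny, p = len(x), len(y), x.shape[1]
    dmean = x.mean(axis=0) - y.mean(axis=0)
    # Pooled within-group covariance
    s_pooled = ((nx - 1) * np.cov(x, rowvar=False)
                + (ny - 1) * np.cov(y, rowvar=False)) / (nx + ny - 2)
    t2 = (nx * ny) / (nx + ny) * dmean @ np.linalg.solve(s_pooled, dmean)
    # Exact F transformation of T^2
    f = (nx + ny - p - 1) / (p * (nx + ny - 2)) * t2
    p_value = stats.f.sf(f, p, nx + ny - p - 1)
    return t2, f, p_value

rng = np.random.default_rng(0)
blind = rng.normal([0.0, 2.4], 0.3, size=(15, 2))    # hypothetical (area, thickness)
sighted = rng.normal([0.1, 2.6], 0.3, size=(15, 2))
print(hotelling_t2(blind, sighted))
```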

  16. C-band Joint Active/Passive Dual Polarization Sea Ice Detection

    NASA Astrophysics Data System (ADS)

    Keller, M. R.; Gifford, C. M.; Winstead, N. S.; Walton, W. C.; Dietz, J. E.

    2017-12-01

    A technique for synergistically combining high-resolution SAR returns with like-frequency passive microwave emissions to detect thin (<30 cm) ice under the difficult conditions of late melt and freeze-up is presented. As the Arctic sea ice cover thins and shrinks, the algorithm offers an approach to adapting existing sensors monitoring thicker ice to provide continuing coverage. Lower resolution (10-26 km) ice detections with spaceborne radiometers and scatterometers are challenged by rapidly changing thin ice. Synthetic Aperture Radar (SAR) is high resolution (5-100 m), but because of cross-section ambiguities, automated algorithms have had difficulty separating thin ice types from water. The radiometric emissivity of thin ice versus water at microwave frequencies is generally unambiguous in the early stages of ice growth. The method, developed using RADARSAT-2 and AMSR-E data, uses higher-order statistics. For the SAR, the COV (coefficient of variation, ratio of standard deviation to mean) has fewer ambiguities between ice and water than cross sections, but breaking waves still produce ice-like signatures for both polarizations. For the radiometer, the PRIC (polarization ratio ice concentration) identifies areas that are unambiguously water. Applying cumulative statistics to co-located COV levels adaptively determines an ice/water threshold. Outcomes from extensive testing with Sentinel and AMSR-2 data are shown in the results. The detection algorithm was applied to the freeze-up in the Beaufort, Chukchi, Barents, and East Siberian Seas in 2015 and 2016, spanning mid-September to early November of both years. At the end of the melt, 6 GHz PRIC values are 5-10% greater than those reported by radiometric algorithms at 19 and 37 GHz. During freeze-up, COV separates grease ice (<5 cm thick) from water. As the ice thickens, the COV is less reliable, but adding a mask based on either the PRIC or the cross-pol/co-pol SAR ratio corrects for COV deficiencies. In general, the dual-sensor detection algorithm reports 10-15% higher total ice concentrations than operational scatterometer or radiometer algorithms, mostly from ice edge and coastal areas. In conclusion, the algorithm presented combines high-resolution SAR returns with passive microwave emissions for automated ice detection at SAR resolutions.
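
    The COV computation at the heart of the SAR side of the algorithm is straightforward to sketch. The window size, the 99th-percentile cutoff, and the direction of the ice/water exceedance below are illustrative assumptions rather than the paper's calibrated rule, and the water mask stands in for PRIC-derived unambiguous water.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_cov(img, size=9):
    """Sliding-window coefficient of variation (std/mean) of a SAR amplitude image."""
    mean = uniform_filter(img, size)
    mean_sq = uniform_filter(img * img, size)
    var = np.maximum(mean_sq - mean * mean, 0.0)
    return np.sqrt(var) / np.maximum(mean, 1e-12)

def detect_ice(sar, water_mask, size=9, tail=99.0):
    """Threshold COV adaptively from pixels the radiometer marks as
    unambiguous water; pixels whose COV exceeds the water population's
    upper tail are flagged as ice (exceedance direction assumed here)."""
    cov = local_cov(sar, size)
    thresh = np.percentile(cov[water_mask], tail)
    return cov > thresh

rng = np.random.default_rng(1)
sar = rng.gamma(4.0, 1.0, (256, 256))            # toy speckled amplitude scene
water = np.zeros_like(sar, bool)
water[:, :64] = True                             # stand-in for a PRIC water mask
print(detect_ice(sar, water).mean())
```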

  17. Statistical approaches to the analysis of point count data: A little extra information can go a long way

    USGS Publications Warehouse

    Farnsworth, G.L.; Nichols, J.D.; Sauer, J.R.; Fancy, S.G.; Pollock, K.H.; Shriner, S.A.; Simons, T.R.; Ralph, C. John; Rich, Terrell D.

    2005-01-01

    Point counts are a standard sampling procedure for many bird species, but lingering concerns still exist about the quality of information produced from the method. It is well known that variation in observer ability and environmental conditions can influence the detection probability of birds in point counts, but many biologists have been reluctant to abandon point counts in favor of more intensive approaches to counting. However, over the past few years a variety of statistical and methodological developments have begun to provide practical ways of overcoming some of the problems with point counts. We describe some of these approaches, and show how they can be integrated into standard point count protocols to greatly enhance the quality of the information. Several tools now exist for estimation of detection probability of birds during counts, including distance sampling, double observer methods, time-depletion (removal) methods, and hybrid methods that combine these approaches. Many counts are conducted in habitats that make auditory detection of birds much more likely than visual detection. As a framework for understanding detection probability during such counts, we propose separating two components of the probability a bird is detected during a count into (1) the probability a bird vocalizes during the count and (2) the probability this vocalization is detected by an observer. In addition, we propose that some measure of the area sampled during a count is necessary for valid inferences about bird populations. This can be done by employing fixed-radius counts or more sophisticated distance-sampling models. We recommend any studies employing point counts be designed to estimate detection probability and to include a measure of the area sampled.
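
    As one concrete example of the estimation tools mentioned, a minimal two-interval removal (time-depletion) estimator can be written in a few lines; the counts below are hypothetical.

```python
def removal_estimate(n1, n2):
    """Two-interval removal (time-depletion) estimator.

    n1, n2: birds first detected in the first and second halves of the count.
    Assuming a constant per-interval detection probability p, the expected
    ratio E[n2]/E[n1] = 1 - p, which gives a closed-form estimator.
    """
    if n1 <= n2:
        raise ValueError("removal estimator requires n1 > n2")
    p = 1.0 - n2 / n1                  # per-interval detection probability
    p_total = 1.0 - (1.0 - p) ** 2     # detected in at least one interval
    n_hat = (n1 + n2) / p_total        # abundance within the surveyed area
    return p, n_hat

# Hypothetical counts: 30 birds first detected early in the count, 12 late.
print(removal_estimate(30, 12))        # equivalent to the classic n1^2 / (n1 - n2)
```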

  18. Robust Combining of Disparate Classifiers Through Order Statistics

    NASA Technical Reports Server (NTRS)

    Tumer, Kagan; Ghosh, Joydeep

    2001-01-01

    Integrating the outputs of multiple classifiers via combiners or meta-learners has led to substantial improvements in several difficult pattern recognition problems. In this article we investigate a family of combiners based on order statistics, for robust handling of situations where there are large discrepancies in performance of individual classifiers. Based on a mathematical modeling of how the decision boundaries are affected by order statistic combiners, we derive expressions for the reductions in error expected when simple output combination methods based on the median, the maximum and, in general, the ith order statistic are used. Furthermore, we analyze the trim and spread combiners, both based on linear combinations of the ordered classifier outputs, and show that in the presence of uneven classifier performance, they often provide substantial gains over both linear and simple order statistics combiners. Experimental results on both real world data and standard public domain data sets corroborate these findings.
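
    The combiners described reduce to a sort along the classifier axis. A minimal sketch, with hypothetical posterior outputs:

```python
import numpy as np

def order_statistic_combiner(outputs, k=None, trim=0):
    """Combine classifier posteriors via order statistics.

    outputs: (n_classifiers, n_samples, n_classes) array of class posteriors.
    k:       which order statistic to keep per class (0 = min, -1 = max,
             middle index = median); None selects the median.
    trim:    alternatively, drop the `trim` lowest and highest outputs and
             average the rest (the "trim" combiner; trim=0 is plain averaging).
    """
    ordered = np.sort(outputs, axis=0)        # sort across classifiers
    if trim:
        combined = ordered[trim:-trim].mean(axis=0)
    else:
        if k is None:
            k = outputs.shape[0] // 2
        combined = ordered[k]
    return combined.argmax(axis=1)            # predicted class per sample

# Five hypothetical classifiers, 4 samples, 3 classes.
rng = np.random.default_rng(2)
post = rng.dirichlet(np.ones(3), size=(5, 4))
print(order_statistic_combiner(post))          # median combiner
print(order_statistic_combiner(post, k=-1))    # max combiner
print(order_statistic_combiner(post, trim=1))  # trimmed-mean combiner
```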

  19. Comparing and combining process-based crop models and statistical models with some implications for climate change

    NASA Astrophysics Data System (ADS)

    Roberts, Michael J.; Braun, Noah O.; Sinclair, Thomas R.; Lobell, David B.; Schlenker, Wolfram

    2017-09-01

    We compare predictions of a simple process-based crop model (Soltani and Sinclair 2012), a simple statistical model (Schlenker and Roberts 2009), and a combination of both models to actual maize yields on a large, representative sample of farmer-managed fields in the Corn Belt region of the United States. After statistical post-model calibration, the process model (Simple Simulation Model, or SSM) predicts actual outcomes slightly better than the statistical model, but the combined model performs significantly better than either model. The SSM, statistical model and combined model all show similar relationships with precipitation, while the SSM better accounts for temporal patterns of precipitation, vapor pressure deficit and solar radiation. The statistical and combined models show a more negative impact associated with extreme heat for which the process model does not account. Due to the extreme heat effect, predicted impacts under uniform climate change scenarios are considerably more severe for the statistical and combined models than for the process-based model.
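
    One simple way to realize such a combination is a stacked regression: recalibrate the process-model prediction alongside the statistical model's weather covariates. The sketch below uses this form and entirely synthetic numbers; the paper's exact specification may differ.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
n = 500
# Hypothetical inputs: an SSM-style process-model yield prediction plus the
# weather covariates a Schlenker-Roberts-style statistical model would use.
ssm_pred = rng.normal(10.0, 1.5, n)              # t/ha from the process model
extreme_heat = rng.gamma(2.0, 5.0, n)            # degree-days above a heat cutoff
precip = rng.normal(500.0, 80.0, n)              # growing-season precipitation, mm
yield_obs = (0.8 * ssm_pred - 0.03 * extreme_heat
             + 0.002 * precip + rng.normal(0, 0.8, n))

X = np.column_stack([ssm_pred, extreme_heat, precip])
combined = LinearRegression().fit(X, yield_obs)  # post-model calibration + stacking
print(dict(zip(["ssm", "heat", "precip"], combined.coef_.round(3))))
```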

  20. Deep belief networks for false alarm rejection in forward-looking ground-penetrating radar

    NASA Astrophysics Data System (ADS)

    Becker, John; Havens, Timothy C.; Pinar, Anthony; Schulz, Timothy J.

    2015-05-01

    Explosive hazards are one of the most deadly threats in modern conflicts. The U.S. Army is interested in a reliable way to detect these hazards at range. A promising way of accomplishing this task is using a forward-looking ground-penetrating radar (FLGPR) system. Recently, the Army has been testing a system that utilizes both L-band and X-band radar arrays on a vehicle mounted platform. Using data from this system, we sought to improve the performance of a constant false-alarm-rate (CFAR) prescreener through the use of a deep belief network (DBN). DBNs have also been shown to perform exceptionally well at generalized anomaly detection. They combine unsupervised pre-training with supervised fine-tuning to generate low-dimensional representations of high-dimensional input data. We seek to take advantage of these two properties by training a DBN on the features of the CFAR prescreener's false alarms (FAs) and then use that DBN to separate FAs from true positives. Our analysis shows that this method improves the detection statistics significantly. By training the DBN on a combination of image features, we were able to significantly increase the probability of detection while maintaining a nominal number of false alarms per square meter. Our research shows that DBNs are a good candidate for improving detection rates in FLGPR systems.
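
    A full DBN stacks several restricted Boltzmann machines; the sketch below uses a single BernoulliRBM layer followed by logistic regression to illustrate the unsupervised-pretraining / supervised-fine-tuning idea on hypothetical prescreener-alarm features. It illustrates the approach, not the authors' network.

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler

# Hypothetical prescreener-alarm feature rows; label 1 = true target,
# 0 = false alarm.
rng = np.random.default_rng(4)
X = rng.random((300, 64))
y = (X[:, :8].sum(axis=1) > 4).astype(int)

model = Pipeline([
    ("scale", MinMaxScaler()),                 # RBM expects values in [0, 1]
    ("rbm", BernoulliRBM(n_components=32, learning_rate=0.05,
                         n_iter=20, random_state=0)),   # unsupervised layer
    ("clf", LogisticRegression(max_iter=1000)),          # supervised fine-tuning
])
model.fit(X, y)
print("training accuracy:", model.score(X, y))
```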

  1. Combined Dust Detection Algorithm by Using MODIS Infrared Channels over East Asia

    NASA Technical Reports Server (NTRS)

    Park, Sang Seo; Kim, Jhoon; Lee, Jaehwa; Lee, Sukjo; Kim, Jeong Soo; Chang, Lim Seok; Ou, Steve

    2014-01-01

    A new dust detection algorithm is developed by combining the results of multiple dust detection methods using IR channels onboard the MODerate resolution Imaging Spectroradiometer (MODIS). Brightness Temperature Difference (BTD) between two wavelength channels has been used widely in previous dust detection methods. However, BTD methods have limitations in identifying the offset values of the BTD to discriminate clear-sky areas. The current algorithm overcomes the disadvantages of previous dust detection methods by considering the Brightness Temperature Ratio (BTR) values of the dual wavelength channels with 30-day composite, the optical properties of the dust particles, the variability of surface properties, and the cloud contamination. Therefore, the current algorithm shows improvements in detecting the dust loaded region over land during daytime. Finally, the confidence index of the current dust algorithm is shown in 10 × 10 pixels of the MODIS observations. From January to June, 2006, the results of the current algorithm are within 64 to 81% of those found using the fine mode fraction (FMF) and aerosol index (AI) from the MODIS and Ozone Monitoring Instrument (OMI). The agreement between the results of the current algorithm and the OMI AI over the non-polluted land also ranges from 60 to 67% to avoid errors due to the anthropogenic aerosol. In addition, the developed algorithm shows statistically significant results at four AErosol RObotic NETwork (AERONET) sites in East Asia.
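
    A toy version of the split-window test is easy to write down; the channel pair, offset, and ratio cutoff below are illustrative assumptions standing in for the paper's BTR thresholds with 30-day composites.

```python
import numpy as np

def dust_flag(bt11, bt12, btd_offset=0.0, btr_min=0.995):
    """Toy split-window dust test on brightness temperatures (K).

    Airborne dust tends to drive BT(11um) - BT(12um) negative, unlike most
    clear or cloudy scenes; the ratio test stands in for the paper's BTR.
    Channel choice, offset, and cutoff are illustrative assumptions only.
    """
    btd = bt11 - bt12
    btr = bt11 / bt12
    return (btd < btd_offset) & (btr < btr_min)

bt11 = np.array([285.0, 292.0, 290.0])
bt12 = np.array([287.5, 291.0, 289.0])
print(dust_flag(bt11, bt12))   # -> [ True False False]
```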

  2. Parameterisation of non-homogeneities in buried object detection by means of thermography

    NASA Astrophysics Data System (ADS)

    Stepanić, Josip; Malinovec, Marina; Švaić, Srećko; Krstelj, Vjera

    2004-05-01

    Landmines and their natural environment form a system of complex dynamics with variable characteristics. Within the context of thermography-based landmine detection, that complexity manifests as excessive noise in thermograms, which has severely limited the application of thermography to humanitarian demining. (Humanitarian demining is to be distinguished from military demining and from demining for military operations other than war [Land Mine Detection: DOD's Research Program Needs a Comprehensive Evaluation Strategy, US GAO Report GAO-01-239, 2001; International Mine Action Standards, Chapter 4, Glossary. Available at: <http://www.mineactionstandards.org/IMAS_archive/Final/04.10.pdf>].) The discrepancy between the existing role and the actual potential of thermography in humanitarian demining motivated a systematic study of the sources of noise in thermograms of buried objects. These sources are variations in mine orientation relative to the soil normal, which modify the shape of the mine signature on thermograms, and non-homogeneities in the soil and vegetation layer above the mine, which degrade the overall quality of thermograms. This paper analyses the influence of variations in the orientation of a mine, and more generally of any axially symmetric buried object, on the quality of its thermogram signature. Two angles were extracted as parameters describing the variation in orientation: (i) θ, the angle between the local vertical axis and the mine symmetry axis, and (ii) ψ, the angle between the local vertical axis and the soil surface normal. Their influence is compared with that of (iii) d, the change in object depth, which serves as a control parameter. The influences were quantified and ranked within a statistically planned experiment. The analysis showed that, among the parameters listed, the most influential term is the statistical interaction dψ, followed by the interaction dθ; statistical tests identified these two combinations as the most significant influences. The results show that current thermographic analysis for humanitarian demining must be broadened to include variations in mine orientation; otherwise the probability of mine detection decreases through the presence of a systematic error.

  3. Clairvoyant fusion: a new methodology for designing robust detection algorithms

    NASA Astrophysics Data System (ADS)

    Schaum, Alan

    2016-10-01

    Many realistic detection problems cannot be solved with simple statistical tests for known alternative probability models. Uncontrollable environmental conditions, imperfect sensors, and other uncertainties transform simple detection problems with likelihood ratio solutions into composite hypothesis (CH) testing problems. Recently many multi- and hyperspectral sensing CH problems have been addressed with a new approach. Clairvoyant fusion (CF) integrates the optimal detectors ("clairvoyants") associated with every unspecified value of the parameters appearing in a detection model. For problems with discrete parameter values, logical rules emerge for combining the decisions of the associated clairvoyants. For many problems with continuous parameters, analytic methods of CF have been found that produce closed-form solutions, or approximations for intractable problems. Here the principles of CF are reviewed, and mathematical insights that have proven useful in deriving solutions are described. It is also shown how a second-stage fusion procedure can be used to create theoretically superior detection algorithms for all discrete-parameter problems.

  4. Incorporating thyroid markers in Down syndrome screening protocols.

    PubMed

    Dhaifalah, Ishraq; Salek, Tomas; Langova, Dagmar; Cuckle, Howard

    2017-05-01

    The article aimed to assess the benefit of incorporating maternal serum thyroid disease marker levels (thyroid-stimulating hormone and free thyroxine) into first trimester Down syndrome screening protocols. Statistical modelling was used to predict performance with and without the thyroid markers. Two protocols were considered: the combined test and the contingent cell-free DNA (cfDNA) test, where 15-40% of women are selected for cfDNA because of increased risk based on combined test results. Published parameters were used for the combined test, cfDNA and the Down syndrome means for thyroid-stimulating hormone and free thyroxine; other parameters were derived from a series of 5230 women screened for both thyroid disease and Down syndrome. Combined test: For a fixed 85% detection rate, the predicted false positive rate was reduced from 5.3% to 3.6% with the addition of the thyroid markers. Contingent cfDNA test: For a fixed 95% detection rate, the proportion of women selected for cfDNA was reduced from 25.6% to 20.2%. When screening simultaneously for maternal thyroid disease and Down syndrome, thyroid marker levels should be used in the calculation of Down syndrome risk. The benefit is modest but can be achieved with no additional cost. © 2017 John Wiley & Sons, Ltd.

  5. Meta-analysis of the efficacy and safety of combined surgery in the management of eyes with coexisting cataract and open angle glaucoma.

    PubMed

    Jiang, Nan; Zhao, Gui-Qiu; Lin, Jing; Hu, Li-Ting; Che, Cheng-Ye; Wang, Qian; Xu, Qiang; Li, Cui; Zhang, Jie

    2018-01-01

    To conduct a systematic review and quantitative meta-analysis of the efficacy and safety of combined surgery for eyes with coexisting cataract and open angle glaucoma, we performed a systematic search of the related literature in the Cochrane Library, PubMed, EMBASE, Web of Science, CNKI, CBM and Wan Fang databases, with no limitations on language or publication date. The primary efficacy estimates were the weighted mean differences in the percentage of intraocular pressure reduction (IOPR%) from baseline to end-point and in the percentage reduction in the number of glaucoma medications from pre- to post-operation; secondary efficacy evaluations used the odds ratio (OR) and 95% confidence interval (CI) for the complete and qualified success rates. ORs were also applied to assess the tolerability of adverse events. Fixed- or random-effects meta-analyses were performed in RevMan 5.2 to pool the results. Heterogeneity was evaluated by the chi-squared test and the I² statistic. Ten studies enrolling 3108 patients were included. The pooled results indicated that both glaucoma surgery alone and combined cataract and glaucoma surgery significantly decreased IOP. For deep sclerectomy vs deep sclerectomy plus phacoemulsification and canaloplasty vs phaco-canaloplasty, the differences in IOPR% were not all statistically significant, while trabeculotomy achieved a quantitatively greater IOPR% than trabeculotomy plus phacoemulsification. Furthermore, for trabeculotomy vs trabeculotomy plus phacoemulsification there were no statistically significant differences in the complete and qualified success rates or in the rates of adverse events. Compared with trabeculotomy plus phacoemulsification, trabeculectomy alone is more effective in lowering IOP and the number of glaucoma medications, while the two procedures did not differ significantly in complete success rate, qualified success rate, or incidence of adverse events.
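
    For readers unfamiliar with the pooling machinery, a minimal fixed-effect inverse-variance pooling with Cochran's Q and I², on hypothetical study-level numbers, looks like this (RevMan implements this and much more):

```python
import numpy as np
from scipy import stats

def fixed_effect_meta(effects, variances):
    """Inverse-variance fixed-effect pooling with Cochran's Q and I^2.

    effects, variances: per-study estimates (e.g., IOPR% mean differences)
    and their squared standard errors.
    """
    effects = np.asarray(effects, float)
    w = 1.0 / np.asarray(variances, float)
    theta = np.sum(w * effects) / np.sum(w)        # pooled estimate
    se = np.sqrt(1.0 / np.sum(w))
    q = np.sum(w * (effects - theta) ** 2)         # Cochran's Q
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    p_q = stats.chi2.sf(q, df)                     # heterogeneity p-value
    return theta, (theta - 1.96 * se, theta + 1.96 * se), i2, p_q

effects = [4.1, 2.5, 3.8, 1.9]       # hypothetical IOPR% differences
variances = [1.2, 0.8, 1.5, 0.6]
print(fixed_effect_meta(effects, variances))
```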

  6. Forecasting neutrino masses from combining KATRIN and the CMB observations: Frequentist and Bayesian analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Host, Ole; Lahav, Ofer; Abdalla, Filipe B.

    We present a showcase for deriving bounds on the neutrino masses from laboratory experiments and cosmological observations. We compare the frequentist and Bayesian bounds on the effective electron neutrino mass m_β which the KATRIN neutrino mass experiment is expected to obtain, using both an analytical likelihood function and Monte Carlo simulations of KATRIN. Assuming a uniform prior in m_β, we find that a null result yields an upper bound of about 0.17 eV at 90% confidence in the Bayesian analysis, to be compared with the frequentist KATRIN reference value of 0.20 eV. This is a significant difference when judged relative to the systematic and statistical uncertainties of the experiment. On the other hand, an input m_β = 0.35 eV, which is the KATRIN 5σ detection threshold, would be detected at virtually the same level. Finally, we combine the simulated KATRIN results with cosmological data in the form of present (post-WMAP) and future (simulated Planck) observations. If an input of m_β = 0.2 eV is assumed in our simulations, KATRIN alone excludes a zero neutrino mass at 2.2σ. Adding Planck data increases the probability of detection to a median 2.7σ. The analysis highlights the importance of combining cosmological and laboratory data on an equal footing.
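
    The frequentist/Bayesian contrast can be reproduced in caricature: assume the experiment effectively measures m_β² with Gaussian error and put a flat prior on m_β ≥ 0. The error value below is an assumption for illustration, not KATRIN's published sensitivity.

```python
import numpy as np

def bayes_upper_bound(x_obs, sigma, cl=0.90, m_max=2.0):
    """Toy Bayesian upper bound on the neutrino mass m (eV).

    Assumes the experiment measures m^2 with Gaussian error sigma (eV^2)
    and a uniform prior in m >= 0; a caricature of the KATRIN likelihood,
    for illustration only.
    """
    m = np.linspace(0.0, m_max, 4001)
    post = np.exp(-0.5 * ((x_obs - m ** 2) / sigma) ** 2)  # flat prior in m
    cdf = np.cumsum(post)
    cdf /= cdf[-1]
    return m[np.searchsorted(cdf, cl)]

# Null result: central value 0 for m^2, with an assumed sigma of 0.025 eV^2.
print(bayes_upper_bound(0.0, 0.025))
```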

  7. A robust multifactor dimensionality reduction method for detecting gene-gene interactions with application to the genetic analysis of bladder cancer susceptibility

    PubMed Central

    Gui, Jiang; Andrew, Angeline S.; Andrews, Peter; Nelson, Heather M.; Kelsey, Karl T.; Karagas, Margaret R.; Moore, Jason H.

    2010-01-01

    A central goal of human genetics is to identify and characterize susceptibility genes for common complex human diseases. An important challenge in this endeavor is the modeling of gene-gene interaction, or epistasis, that can result in non-additivity of genetic effects. The multifactor dimensionality reduction (MDR) method was developed as a machine learning alternative to parametric logistic regression for detecting interactions in the absence of significant marginal effects. The goal of MDR is to reduce the dimensionality inherent in modeling combinations of polymorphisms using a computational approach called constructive induction. Here, we propose a Robust Multifactor Dimensionality Reduction (RMDR) method that performs constructive induction using Fisher's exact test rather than a predetermined threshold. The advantage of this approach is that only those genotype combinations that are determined to be statistically significant are considered in the MDR analysis. We use two simulation studies to demonstrate that this approach will increase the success rate of MDR when there are only a few genotype combinations that are significantly associated with case-control status. We show that there is no loss of success rate when this is not the case. We then apply the RMDR method to the detection of gene-gene interactions in genotype data from a population-based study of bladder cancer in New Hampshire. PMID:21091664
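
    The cell-labeling step that RMDR modifies can be sketched directly: test each genotype combination against the remainder of the sample with Fisher's exact test and label only the significant cells. The counts below are hypothetical.

```python
from scipy.stats import fisher_exact

def rmdr_cells(counts, alpha=0.05):
    """Label genotype combinations in the spirit of RMDR (sketch).

    counts: dict mapping a genotype combination, e.g. ('AA', 'Bb'), to
    (n_cases, n_controls) in that cell. Each cell is tested against the
    rest of the sample; only significant cells are labeled high- or
    low-risk, all others stay 'unclassified'.
    """
    total_cases = sum(c for c, _ in counts.values())
    total_ctrls = sum(c for _, c in counts.values())
    labels = {}
    for cell, (ca, co) in counts.items():
        table = [[ca, co], [total_cases - ca, total_ctrls - co]]
        odds, p = fisher_exact(table)
        if p < alpha:
            labels[cell] = "high-risk" if odds > 1 else "low-risk"
        else:
            labels[cell] = "unclassified"
    return labels

# Hypothetical two-SNP cell counts: (cases, controls).
counts = {("AA", "BB"): (40, 10), ("AA", "Bb"): (22, 25),
          ("Aa", "BB"): (12, 30), ("Aa", "Bb"): (26, 35)}
print(rmdr_cells(counts))
```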

  8. Older Adults’ Functional Performance and Health Knowledge After a Combination Exercise, Health Education, and Bingo Game

    PubMed Central

    Crandall, K. Jason; Steenbergen, Katryn I.

    2015-01-01

    Combining exercise, health education, and the game of bingo may help older adults remain independent. The objective was to determine whether a 10-week health promotion program (Bingocize®) improves functional performance and health knowledge in older adults. Participants were assigned to experimental (n = 13) or control (n = 14) groups. The intervention was administered twice per week at two independent living facilities. Functional performance and health knowledge were measured before and after the intervention. Mixed between-within subject ANOVA was used to detect differences between groups (p < .05). Improvements were found in all dependent variables except lower body flexibility, systolic blood pressure, and health knowledge. Adherence was 97.31% ± 2.59%. Bingocize® has the potential to help older adults remain independent by improving functional performance. Statistical improvements in health knowledge were not found, but future researchers may explore modifying the health education component or using a different measure of health knowledge to detect changes. PMID:28138476

  9. TreSpEx—Detection of Misleading Signal in Phylogenetic Reconstructions Based on Tree Information

    PubMed Central

    Struck, Torsten H

    2014-01-01

    Phylogenies of species or genes are commonplace nowadays in many areas of comparative biological studies. However, phylogenetic reconstructions must contend with artificial signals such as paralogy, long-branch attraction, saturation, or conflict between different datasets. These signals can mislead the reconstruction even in phylogenomic studies employing hundreds of genes. Unfortunately, no program has allowed the detection of such effects while also being suited to integration into automatic processing pipelines. TreSpEx (Tree Space Explorer) now combines different approaches (including statistical tests) that utilize tree-based information such as nodal support or patristic distances (PDs) to identify misleading signals. The program enables the parallel analysis of hundreds of trees and/or predefined gene partitions, and, being command-line driven, it can be integrated into automatic processing pipelines. TreSpEx is implemented in Perl and supported on Linux, Mac OS X, and MS Windows. Source code, binaries, and additional material are freely available at http://www.annelida.de/research/bioinformatics/software.html. PMID:24701118
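
    One of the tree-based signals TreSpEx exploits, long terminal branches as long-branch-attraction suspects, can be illustrated with root-to-tip patristic distances on a toy tree. TreSpEx itself consumes Newick trees and support values; the names, branch lengths, and z-score cutoff here are hypothetical.

```python
import numpy as np

def root_to_tip(parent, blen, node):
    """Sum branch lengths from a node back to the root."""
    d = 0.0
    while node is not None:
        d += blen[node]
        node = parent[node]
    return d

# Hypothetical rooted tree as parent pointers with branch lengths.
parent = {"root": None, "n1": "root", "n2": "root",
          "taxA": "n1", "taxB": "n1", "taxC": "n2", "taxD": "n2"}
blen = {"root": 0.0, "n1": 0.1, "n2": 0.1,
        "taxA": 0.05, "taxB": 0.07, "taxC": 0.06, "taxD": 0.90}

leaves = ["taxA", "taxB", "taxC", "taxD"]
dists = np.array([root_to_tip(parent, blen, t) for t in leaves])
z = (dists - dists.mean()) / dists.std()
flagged = [t for t, zi in zip(leaves, z) if zi > 1.5]
print(flagged)   # taxD's long terminal branch makes it an LBA suspect
```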

  10. Detection and classification of retinal lesions for grading of diabetic retinopathy.

    PubMed

    Usman Akram, M; Khalid, Shehzad; Tariq, Anam; Khan, Shoab A; Azam, Farooque

    2014-02-01

    Diabetic Retinopathy (DR) is an eye abnormality in which the human retina is damaged as a complication of diabetes. The early detection and diagnosis of DR are vital to save the vision of diabetes patients. The early signs of DR which appear on the surface of the retina are microaneurysms, haemorrhages, and exudates. In this paper, we propose a system consisting of a novel hybrid classifier for the detection of retinal lesions. The proposed system consists of preprocessing, extraction of candidate lesions, feature set formulation, and classification. In preprocessing, the system eliminates background pixels and extracts the blood vessels and optic disc from the digital retinal image. The candidate lesion detection phase extracts, using filter banks, all regions which may possibly have any type of lesion. A feature set based on different descriptors, such as shape, intensity, and statistics, is formulated for each possible candidate region: this further helps in classifying that region. This paper presents an extension of the m-Mediods based modeling approach, and combines it with a Gaussian Mixture Model in an ensemble to form a hybrid classifier to improve the accuracy of the classification. The proposed system is assessed using standard fundus image databases with the help of performance parameters, such as sensitivity, specificity, accuracy, and the Receiver Operating Characteristics curves for statistical analysis. Copyright © 2013 Elsevier Ltd. All rights reserved.

  11. The effect of exercise on venous gas emboli and decompression sickness in human subjects at 4.3 psia

    NASA Technical Reports Server (NTRS)

    Conkin, Johnny; Waligora, James M.; Horrigan, David J., Jr.; Hadley, Arthur T., III

    1987-01-01

    The contribution of upper body exercise to altitude decompression sickness while at 4.3 psia after 3.5 or 4.0 hours of 100% oxygen prebreathing at 14.7 psia was determined by comparing the incidence and patterns of venous gas emboli (VGE), and the incidence of Type 1 decompression sickness (DCS), in 43 exercising male subjects and 9 less active male Doppler Technicians (DTs). Each subject exercised for 4 minutes at each of 3 exercise stations while at 4.3 psia. An additional 4 minutes were spent monitoring for VGE by the DT while the subject was supine on an examination cot. In the combined 3.5 and 4.0 hour oxygen prebreathe data, 13 subjects complained of Type 1 DCS compared to 9 complaints from DTs. VGE were detected in 28 subjects compared to 14 detections from DTs. A chi-square analysis of proportions showed no statistically significant difference in the incidence of Type 1 DCS or VGE between the two groups; however, the average times to detect VGE and to report Type 1 DCS symptoms were statistically different. It was concluded that 4 to 6 hours of upper body exercise at metabolic rates simulating EVA metabolic rates hastens the initial detection of VGE and the time to report Type 1 DCS symptoms as compared to DTs.

  12. Good initialization model with constrained body structure for scene text recognition

    NASA Astrophysics Data System (ADS)

    Zhu, Anna; Wang, Guoyou; Dong, Yangbo

    2016-09-01

    Scene text recognition has gained significant attention in the computer vision community. Character detection and recognition are the premise of text recognition and affect the overall performance to a large extent. We propose a good initialization model for scene character recognition from cropped text regions. We use constrained character body structures with deformable part-based models to detect and recognize characters against various backgrounds. The character body structures are obtained by an unsupervised discriminative clustering approach followed by a statistical model and a self-built minimum spanning tree model. Our method utilizes part appearance and location information, and combines character detection and recognition within the cropped text region. The evaluation results on benchmark datasets demonstrate that our proposed scheme outperforms state-of-the-art methods on both scene character recognition and word recognition.

  13. Detection of calcification clusters in digital breast tomosynthesis slices at different dose levels utilizing a SRSAR reconstruction and JAFROC

    NASA Astrophysics Data System (ADS)

    Timberg, P.; Dustler, M.; Petersson, H.; Tingberg, A.; Zackrisson, S.

    2015-03-01

    Purpose: To investigate detection performance for calcification clusters in reconstructed digital breast tomosynthesis (DBT) slices at different dose levels using a Super Resolution and Statistical Artifact Reduction (SRSAR) reconstruction method. Method: Simulated calcifications with irregular profile (0.2 mm diameter) were combined to form clusters that were added to projection images (1-3 per abnormal image) acquired on a DBT system (Mammomat Inspiration, Siemens). The projection images were dose reduced by software to form 35 abnormal cases and 25 normal cases as if acquired at 100%, 75% and 50% dose levels (AGD of approximately 1.6 mGy for a 53 mm standard breast, measured according to EUREF v0.15). A standard FBP and a SRSAR reconstruction method (utilizing IRIS (iterative reconstruction filters) and outlier detection using Maximum-Intensity Projections and Average-Intensity Projections) were used to reconstruct single central slices for use in a free-response task (60 images per observer and dose level). Six observers participated; their task was to detect the clusters and assign a confidence rating in randomly presented images from the whole image set (balanced by dose level). Trials were separated by one week to reduce possible memory bias. The outcome was analyzed for statistical differences using Jackknifed Alternative Free-response Receiver Operating Characteristics. Results: The results indicate that it is possible to reduce the dose by 50% with SRSAR without jeopardizing cluster detection. Conclusions: The detection performance for clusters can be maintained at a lower dose level by using SRSAR reconstruction.

  14. Detecting transitions in protein dynamics using a recurrence quantification analysis based bootstrap method.

    PubMed

    Karain, Wael I

    2017-11-28

    Proteins undergo conformational transitions over different time scales. These transitions are closely intertwined with the protein's function. Numerous standard techniques such as principal component analysis are used to detect these transitions in molecular dynamics simulations. In this work, we add a new method that detects transitions in dynamics based on the recurrences in the dynamical system. It combines bootstrapping and recurrence quantification analysis. We start from the assumption that a protein has a "baseline" recurrence structure over a given period of time. Any statistically significant deviation from this recurrence structure, as inferred from complexity measures provided by recurrence quantification analysis, is considered a transition in the dynamics of the protein. We apply this technique to a 132 ns long molecular dynamics simulation of the β-Lactamase Inhibitory Protein BLIP. We are able to detect conformational transitions in the nanosecond range in the recurrence dynamics of the BLIP protein during the simulation. The results compare favorably to those extracted using the principal component analysis technique. The recurrence quantification analysis based bootstrap technique is able to detect transitions between different dynamics states for a protein over different time scales. It is not limited to linear dynamics regimes, and can be generalized to any time scale. It also has the potential to be used to cluster frames in molecular dynamics trajectories according to the nature of their recurrence dynamics. One shortcoming of this method is the need to have large enough time windows to ensure good statistical quality for the recurrence complexity measures needed to detect the transitions.
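
    The core quantities are simple to compute. The sketch below builds a recurrence matrix, computes the recurrence rate, and bootstraps a baseline interval for it; the trajectory is synthetic and the procedure is a simplification of the paper's windowed test.

```python
import numpy as np

def recurrence_rate(traj, eps):
    """Fraction of recurrent point pairs: the simplest RQA complexity measure."""
    d = np.linalg.norm(traj[:, None, :] - traj[None, :, :], axis=-1)
    return (d < eps).mean()

def bootstrap_rr(traj, eps, n_boot=500, rng=None):
    """Bootstrap the recurrence rate by resampling frames with replacement,
    giving a 'baseline' interval; windows whose recurrence rate falls
    outside it would be treated as dynamical transitions."""
    if rng is None:
        rng = np.random.default_rng(0)
    n = len(traj)
    rrs = [recurrence_rate(traj[rng.integers(0, n, n)], eps)
           for _ in range(n_boot)]
    return np.percentile(rrs, [2.5, 97.5])

rng = np.random.default_rng(5)
traj = rng.normal(size=(200, 3))   # hypothetical 3-D projection of MD frames
lo, hi = bootstrap_rr(traj, eps=1.0)
print(recurrence_rate(traj, 1.0), (lo, hi))
```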

  15. Fast Identification of Biological Pathways Associated with a Quantitative Trait Using Group Lasso with Overlaps

    PubMed Central

    Silver, Matt; Montana, Giovanni

    2012-01-01

    Where causal SNPs (single nucleotide polymorphisms) tend to accumulate within biological pathways, the incorporation of prior pathways information into a statistical model is expected to increase the power to detect true associations in a genetic association study. Most existing pathways-based methods rely on marginal SNP statistics and do not fully exploit the dependence patterns among SNPs within pathways. We use a sparse regression model, with SNPs grouped into pathways, to identify causal pathways associated with a quantitative trait. Notable features of our “pathways group lasso with adaptive weights” (P-GLAW) algorithm include the incorporation of all pathways in a single regression model, an adaptive pathway weighting procedure that accounts for factors biasing pathway selection, and the use of a bootstrap sampling procedure for the ranking of important pathways. P-GLAW takes account of the presence of overlapping pathways and uses a novel combination of techniques to optimise model estimation, making it fast to run, even on whole genome datasets. In a comparison study with an alternative pathways method based on univariate SNP statistics, our method demonstrates high sensitivity and specificity for the detection of important pathways, showing the greatest relative gains in performance where marginal SNP effect sizes are small. PMID:22499682
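
    A bare-bones group lasso via proximal gradient descent shows the group-level selection at work; it omits P-GLAW's adaptive weights, overlap expansion, and bootstrap ranking. Overlapping pathways can be handled, as in latent group lasso formulations, by duplicating shared SNP columns so each copy belongs to one group.

```python
import numpy as np

def group_lasso(X, y, groups, lam, lr=None, n_iter=500):
    """Proximal-gradient group lasso for non-overlapping groups (sketch).

    groups: list of column-index arrays, one per pathway.
    Minimizes 0.5/n * ||y - X b||^2 + lam * sum_g ||b_g||_2.
    """
    n, p = X.shape
    if lr is None:                            # 1 / Lipschitz constant of gradient
        lr = n / np.linalg.norm(X, 2) ** 2
    beta = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y) / n
        z = beta - lr * grad
        for g in groups:                      # group soft-thresholding
            norm = np.linalg.norm(z[g])
            z[g] = 0.0 if norm == 0 else max(0.0, 1 - lr * lam / norm) * z[g]
        beta = z
    return beta

rng = np.random.default_rng(6)
X = rng.normal(size=(100, 12))
beta_true = np.r_[np.ones(4), np.zeros(8)]    # only "pathway" 0 is causal
y = X @ beta_true + rng.normal(0, 0.5, 100)
groups = [np.arange(0, 4), np.arange(4, 8), np.arange(8, 12)]
print(np.round(group_lasso(X, y, groups, lam=0.3), 2))
```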

  16. Multi-scale structure and topological anomaly detection via a new network statistic: The onion decomposition.

    PubMed

    Hébert-Dufresne, Laurent; Grochow, Joshua A; Allard, Antoine

    2016-08-18

    We introduce a network statistic that measures structural properties at the micro-, meso-, and macroscopic scales, while still being easy to compute and interpretable at a glance. Our statistic, the onion spectrum, is based on the onion decomposition, which refines the k-core decomposition, a standard network fingerprinting method. The onion spectrum is exactly as easy to compute as the k-cores: It is based on the stages at which each vertex gets removed from a graph in the standard algorithm for computing the k-cores. Yet, the onion spectrum reveals much more information about a network, and at multiple scales; for example, it can be used to quantify node heterogeneity, degree correlations, centrality, and tree- or lattice-likeness. Furthermore, unlike the k-core decomposition, the combined degree-onion spectrum immediately gives a clear local picture of the network around each node which allows the detection of interesting subgraphs whose topological structure differs from the global network organization. This local description can also be leveraged to easily generate samples from the ensemble of networks with a given joint degree-onion distribution. We demonstrate the utility of the onion spectrum for understanding both static and dynamic properties on several standard graph models and on many real-world networks.
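
    The peeling algorithm is as simple as advertised; a direct implementation on an adjacency dictionary:

```python
def onion_decomposition(adj):
    """Return (core_number, onion_layer) per node.

    adj: dict node -> set of neighbours. Peels the graph as in the k-core
    algorithm but records the pass at which each node is removed: all
    nodes with degree <= the current core value form one layer.
    """
    deg = {v: len(nbrs) for v, nbrs in adj.items()}
    alive = set(adj)
    core, layer = {}, {}
    k, current_layer = 0, 0
    while alive:
        k = max(k, min(deg[v] for v in alive))   # core value never decreases
        current_layer += 1
        shell = [v for v in alive if deg[v] <= k]
        for v in shell:
            core[v], layer[v] = k, current_layer
            alive.discard(v)
        for v in shell:                          # update remaining degrees
            for u in adj[v]:
                if u in alive:
                    deg[u] -= 1
    return core, layer

# A triangle with a two-node tail: the tail peels off in the early layers.
adj = {1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4}, 4: {3, 5}, 5: {4}}
print(onion_decomposition(adj))
```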

  17. Detecting associated single-nucleotide polymorphisms on the X chromosome in case control genome-wide association studies.

    PubMed

    Chen, Zhongxue; Ng, Hon Keung Tony; Li, Jing; Liu, Qingzhong; Huang, Hanwen

    2017-04-01

    In the past decade, hundreds of genome-wide association studies have been conducted to detect the significant single-nucleotide polymorphisms that are associated with certain diseases. However, most of the data from the X chromosome were not analyzed, and only a few significant associated single-nucleotide polymorphisms on the X chromosome have been identified from genome-wide association studies. This is mainly due to the lack of powerful statistical tests. In this paper, we propose a novel statistical approach that combines the information of single-nucleotide polymorphisms on the X chromosome from both males and females in an efficient way. The proposed approach avoids the need of making strong assumptions about the underlying genetic models. Our proposed statistical test is a robust method that only makes the assumption that the risk allele is the same for both females and males if the single-nucleotide polymorphism is associated with the disease for both genders. Through a simulation study and a real data application, we show that the proposed procedure is robust and has excellent performance compared to existing methods. We expect that many more associated single-nucleotide polymorphisms on the X chromosome will be identified if the proposed approach is applied to currently available genome-wide association studies data.
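
    One simple sex-combined analysis in the same spirit, though not the authors' exact statistic, codes males 0/2 (hemizygous) and females 0/1/2 so the same risk allele carries a comparable additive effect, then tests the genotype term in a sex-adjusted logistic model. The data below are simulated.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 1000
sex = rng.integers(0, 2, n)                    # 1 = male
g_f = rng.binomial(2, 0.3, n)                  # female genotypes 0/1/2
g_m = 2 * rng.binomial(1, 0.3, n)              # male genotypes coded 0/2
geno = np.where(sex == 1, g_m, g_f)
logit_p = -1.0 + 0.25 * geno                   # additive risk-allele effect
y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(np.column_stack([geno, sex]))
fit = sm.Logit(y, X).fit(disp=0)               # logistic model adjusted for sex
print("genotype p-value:", fit.pvalues[1])
```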

  18. Sensitivity and Specificity of Interictal EEG-fMRI for Detecting the Ictal Onset Zone at Different Statistical Thresholds

    PubMed Central

    Tousseyn, Simon; Dupont, Patrick; Goffin, Karolien; Sunaert, Stefan; Van Paesschen, Wim

    2014-01-01

    There is currently a lack of knowledge about electroencephalography (EEG)-functional magnetic resonance imaging (fMRI) specificity. Our aim was to define sensitivity and specificity of blood oxygen level dependent (BOLD) responses to interictal epileptic spikes during EEG-fMRI for detecting the ictal onset zone (IOZ). We studied 21 refractory focal epilepsy patients who had a well-defined IOZ after a full presurgical evaluation and interictal spikes during EEG-fMRI. Areas of spike-related BOLD changes overlapping the IOZ in patients were considered as true positives; if no overlap was found, they were treated as false-negatives. Matched healthy case-controls had undergone similar EEG-fMRI in order to determine true-negative and false-positive fractions. The spike-related regressor of the patient was used in the design matrix of the healthy case-control. Suprathreshold BOLD changes in the brain of controls were considered as false positives, absence of these changes as true negatives. Sensitivity and specificity were calculated for different statistical thresholds at the voxel level combined with different cluster size thresholds and represented in receiver operating characteristic (ROC)-curves. Additionally, we calculated the ROC-curves based on the cluster containing the maximal significant activation. We achieved a combination of 100% specificity and 62% sensitivity, using a Z-threshold in the interval 3.4–3.5 and cluster size threshold of 350 voxels. We could obtain higher sensitivity at the expense of specificity. Similar performance was found when using the cluster containing the maximal significant activation. Our data provide a guideline for different EEG-fMRI settings with their respective sensitivity and specificity for detecting the IOZ. The unique cluster containing the maximal significant BOLD activation was a sensitive and specific marker of the IOZ. PMID:25101049

  19. Detection of early pancreatic ductal adenocarcinoma using thrombospondin-2 and CA19-9 blood markers

    PubMed Central

    Kim, Jungsun; Bamlet, William R.; Oberg, Ann L.; Chaffee, Kari G.; Donahue, Greg; Cao, Xing-Jun; Chari, Suresh; Garcia, Benjamin A.; Petersen, Gloria M.; Zaret, Kenneth S.

    2017-01-01

    Markers are needed to facilitate early detection of pancreatic ductal adenocarcinoma (PDAC), which is often diagnosed too late for effective therapy. Starting with a PDAC cell reprogramming model that recapitulated the progression of human PDAC, we identified secreted proteins and tested and validated a subset of them as potential markers of PDAC. We optimized an ELISA assay using plasma samples from patients with various stages of PDAC, from individuals with benign pancreatic disease, and from healthy controls. Clinical studies including a phase 1 discovery study (N=20 patients), a phase 2a validation study (N=189), and a second phase 2b validation study (N=537) revealed that concentrations of plasma thrombospondin-2 (THBS2) discriminated among all stages of PDAC consistently over the three studies with a Receiver Operating Characteristic (ROC) c-statistic of 0.76 in Phase 1, 0.842 in Phase 2a, and 0.875 in Phase 2b. The concentration of THBS2 in plasma performed as well at discriminating resectable stage I cancer as stage III/IV PDAC. THBS2 concentrations combined with those for CA19-9, a previously identified PDAC marker, yielded a c-statistic of 0.956 in the Phase 2a study and 0.970 in the Phase 2b study. THBS2 data improved the ability of CA19-9 to distinguish PDAC from pancreatitis. With a specificity of 98%, the combination of THBS2 and CA19-9 yielded a sensitivity of 87% for PDAC in the Phase 2b study. Given this, a THBS2 and CA19-9 panel assessed in human blood using a conventional ELISA assay may improve the detection of patients at high risk for PDAC. PMID:28701476
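
    Combining two markers and reading off a c-statistic and a sensitivity at fixed specificity is routine; the sketch below does so on synthetic concentrations (the study's numbers come from its ELISA cohorts, not from this toy).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, roc_curve

# Hypothetical plasma concentrations for PDAC cases (y=1) and controls (y=0).
rng = np.random.default_rng(8)
n = 400
y = rng.integers(0, 2, n)
thbs2 = rng.lognormal(mean=np.where(y == 1, 3.2, 2.8), sigma=0.4)
ca199 = rng.lognormal(mean=np.where(y == 1, 3.5, 2.6), sigma=0.8)

X = np.log(np.column_stack([thbs2, ca199]))
score = LogisticRegression().fit(X, y).decision_function(X)
print("panel c-statistic:", round(roc_auc_score(y, score), 3))

# Sensitivity at ~98% specificity, as at the paper's operating point.
fpr, tpr, _ = roc_curve(y, score)
print("sensitivity at 98% specificity:", tpr[fpr <= 0.02].max())
```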

  20. Analysis of trends in selected streamflow statistics for the Concho River Basin, Texas, 1916-2009

    USGS Publications Warehouse

    Barbie, Dana L.; Wehmeyer, Loren L.; May, Jayne E.

    2012-01-01

    Six U.S. Geological Survey streamflow-gaging stations were selected for analysis. Streamflow-gaging station 08128000 South Concho River at Christoval has downward trends for annual maximum daily discharge and annual instantaneous peak discharge for the combined period 1931-95, 2002-9. Streamflow-gaging station 08128400 Middle Concho River above Tankersley has downward trends for annual maximum daily discharge and annual instantaneous peak discharge for the combined period 1962-95, 2002-9. Streamflow-gaging station 08128500 Middle Concho River near Tankersley has no significant trends in the streamflow statistics considered for the period 1931-60. Streamflow-gaging station 08134000 North Concho River near Carlsbad has downward trends for annual mean daily discharge, annual 7-day minimum daily discharge, annual maximum daily discharge, and annual instantaneous peak discharge for the period 1925-2009. Streamflow-gaging stations 08136000 Concho River at San Angelo and 08136500 Concho River at Paint Rock have downward trends for 1916-2009 for all streamflow statistics calculated, but streamflow-gaging station 08136000 Concho River at San Angelo has an upward trend for annual maximum daily discharge during 1964-2009. The downward trends detected during 1916-2009 for the Concho River at San Angelo are not unexpected because of three reservoirs impounding and profoundly regulating streamflow.
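
    Trend analyses of annual streamflow statistics of this kind are typically Kendall-tau based; a minimal version on synthetic data, without the seasonal or serial-correlation adjustments a formal USGS analysis may apply:

```python
import numpy as np
from scipy.stats import kendalltau

def trend_test(years, flows, alpha=0.05):
    """Kendall's tau trend test on an annual streamflow statistic."""
    tau, p = kendalltau(years, flows)
    direction = "downward" if tau < 0 else "upward"
    return (direction if p < alpha else "no significant"), tau, p

years = np.arange(1916, 2010)
rng = np.random.default_rng(9)
# Synthetic annual series with an imposed decline, standing in for a gage record.
flows = 200 - 0.8 * (years - 1916) + rng.normal(0, 25, years.size)
print(trend_test(years, flows))
```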

  1. Accuracy Evaluation of the Unified P-Value from Combining Correlated P-Values

    PubMed Central

    Alves, Gelio; Yu, Yi-Kuo

    2014-01-01

    Meta-analysis methods that combine p-values into a single unified p-value are frequently employed to improve confidence in hypothesis testing. An assumption made by most meta-analysis methods is that the p-values to be combined are independent, which may not always be true. To investigate the accuracy of the unified p-value from combining correlated p-values, we have evaluated a family of statistical methods that combine independent, weighted independent, correlated, and weighted correlated p-values. Statistical accuracy evaluation by combining simulated correlated p-values showed that correlation among p-values can have a significant effect on the accuracy of the combined p-value obtained. Among the statistical methods evaluated, those that weight p-values compute more accurate combined p-values than those that do not. Also, statistical methods that utilize the correlation information have the best performance, producing significantly more accurate combined p-values. In our study we have demonstrated that statistical methods that combine p-values based on the assumption of independence can produce inaccurate p-values when combining correlated p-values, even when the p-values are only weakly correlated. Therefore, to prevent drawing false conclusions during hypothesis testing, our study advises caution when interpreting the p-value obtained from combining p-values of unknown correlation. However, when the correlation information is available, the weighting-capable statistical method, first introduced by Brown and recently modified by Hou, seems to perform the best amongst the methods investigated. PMID:24663491
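
    The covariance-corrected extension of Fisher's method due to Brown, one of the correlation-aware approaches discussed, fits in a few lines; with an identity-scaled covariance it reduces to Fisher's method.

```python
import numpy as np
from scipy.stats import chi2

def browns_method(pvals, cov):
    """Combine correlated p-values: Fisher's statistic with Brown's
    scaled chi-square correction.

    pvals: the k p-values to combine.
    cov:   k x k covariance matrix of the -2*ln(p) terms (in practice
           estimated from the correlation of the underlying tests).
    """
    pvals = np.asarray(pvals, float)
    k = len(pvals)
    x = -2.0 * np.sum(np.log(pvals))      # Fisher's statistic
    mean = 2.0 * k                         # E[X] under the null
    var = np.sum(cov)                      # Var[X] including covariances
    c = var / (2.0 * mean)                 # scale factor
    df = 2.0 * mean ** 2 / var             # effective degrees of freedom
    return chi2.sf(x / c, df)

# Under independence each -2*ln(p) term has variance 4, so cov = 4*I
# and Brown's method reduces exactly to Fisher's method.
pvals = [0.01, 0.04, 0.20]
print(browns_method(pvals, 4.0 * np.eye(3)))
```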

  2. Effectiveness of DIAGNOdent in Detecting Root Caries Without Dental Scaling Among Community-dwelling Elderly.

    PubMed

    Zhang, Wen; McGrath, Colman; Lo, Edward C M

    The purpose of this clinical research was to analyze the effectiveness of DIAGNOdent in detecting root caries without dental scaling. The status of 750 exposed, unfilled root surfaces was assessed by visual-tactile examination and DIAGNOdent before and after root scaling. The sensitivity and specificity of different cut-off DIAGNOdent values in diagnosing root caries with reference to visual-tactile criteria were evaluated on those root surfaces without visible plaque/calculus. The DIAGNOdent values from sound and carious root surfaces were compared using the nonparametric Mann-Whitney U-test. The level of statistical significance was set at 0.05. On root surfaces without plaque/calculus, significantly different (p < 0.05) DIAGNOdent readings were obtained from sound root surfaces (12.2 ± 11.1), active carious root surfaces (37.6 ± 31.7) and inactive carious root surfaces (20.9 ± 10.5) before scaling. On root surfaces with visible plaque, DIAGNOdent readings obtained from active carious root surfaces (29.6 ± 20.8) and inactive carious root surfaces (27.0 ± 7.2) were not statistically significantly different (p > 0.05). Furthermore, on root surfaces with visible calculus, all DIAGNOdent readings obtained from sound root surfaces were > 50, which might be misinterpreted as carious. After scaling, the DIAGNOdent readings obtained from sound root surfaces (8.1 ± 11.3), active carious root surfaces (37.9 ± 31.9) and inactive carious root surfaces (24.9 ± 11.5) presented significant differences (p < 0.05). A cut-off value between 10 and 15 yielded the highest combined sensitivity and specificity in detecting root caries on root surfaces without visible plaque/calculus before scaling, but the combined sensitivity and specificity are both around 70%. These findings suggest that on exposed, unfilled root surfaces without visible plaque/calculus, DIAGNOdent can be used as an adjunct to the visual-tactile criteria in detecting root-surface status without pre-treatment by dental scaling.

  3. Monitoring Poisson observations using combined applications of Shewhart and EWMA charts

    NASA Astrophysics Data System (ADS)

    Abujiya, Mu'azu Ramat

    2017-11-01

    The Shewhart and exponentially weighted moving average (EWMA) charts for nonconformities are the most widely used procedures for monitoring Poisson observations in modern industries. Individually, the Shewhart and EWMA charts are sensitive only to large and small shifts, respectively. To enhance the detection abilities of the two schemes in monitoring all kinds of shifts in Poisson count data, this study examines the performance of combined applications of the Shewhart and EWMA Poisson control charts. Furthermore, the study proposes modifications based on a well-structured statistical data-collection technique, ranked set sampling (RSS), to detect shifts in the mean of a Poisson process more quickly. The relative performance of the proposed Shewhart-EWMA Poisson location charts is evaluated in terms of the average run length (ARL), standard deviation of the run length (SDRL), median run length (MRL), average ratio ARL (ARARL), average extra quadratic loss (AEQL) and performance comparison index (PCI). Consequently, all the new Poisson control charts based on the RSS method are generally superior to most of the existing schemes for monitoring Poisson processes. The use of these combined Shewhart-EWMA Poisson charts is illustrated with an example to demonstrate the practical implementation of the design procedure.
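
    The combined scheme signals when either chart trips. A minimal sketch with illustrative chart constants (the paper tunes these, and adds ranked set sampling, which is omitted here):

```python
import numpy as np

def shewhart_ewma_poisson(counts, mu0, lam=0.2, L=2.7, shewhart_k=3.0):
    """Combined Shewhart-EWMA scheme for Poisson counts (sketch).

    Signals when either the Shewhart c-chart (large shifts) or the EWMA
    chart (small shifts) is violated; lam, L, and k are illustrative.
    """
    z = mu0
    shew_hi = mu0 + shewhart_k * np.sqrt(mu0)
    shew_lo = max(mu0 - shewhart_k * np.sqrt(mu0), 0.0)
    for i, x in enumerate(counts, start=1):
        z = lam * x + (1 - lam) * z
        # Time-varying EWMA variance for Poisson data with in-control mean mu0
        var_z = lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)) * mu0
        lim = L * np.sqrt(var_z)
        if not (shew_lo <= x <= shew_hi) or abs(z - mu0) > lim:
            return i                       # index of the first signal
    return None

rng = np.random.default_rng(10)
counts = np.r_[rng.poisson(4.0, 30), rng.poisson(5.5, 30)]  # shift at t = 31
print("signal at sample:", shewhart_ewma_poisson(counts, mu0=4.0))
```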

  4. Symbolic Processing Combined with Model-Based Reasoning

    NASA Technical Reports Server (NTRS)

    James, Mark

    2009-01-01

    A computer program for the detection of present and prediction of future discrete states of a complex, real-time engineering system utilizes a combination of symbolic processing and numerical model-based reasoning. One of the biggest weaknesses of a purely symbolic approach is that it enables prediction of only future discrete states while missing all unmodeled states or leading to incorrect identification of an unmodeled state as a modeled one. A purely numerical approach is based on a combination of statistical methods and mathematical models of the applicable physics and necessitates development of a complete model to the level of fidelity required for prediction. In addition, a purely numerical approach does not afford the ability to qualify its results without some form of symbolic processing. The present software implements numerical algorithms to detect unmodeled events and symbolic algorithms to predict expected behavior, correlate the expected behavior with the unmodeled events, and interpret the results in order to predict future discrete states. The approach embodied in this software differs from that of the BEAM methodology (aspects of which have been discussed in several prior NASA Tech Briefs articles), which provides for prediction of future measurements in the continuous-data domain.

  5. Prospective Comparison of 99mTc-MDP Scintigraphy, Combined 18F-NaF and 18F-FDG PET/CT, and Whole-Body MRI in Patients with Breast and Prostate Cancer.

    PubMed

    Minamimoto, Ryogo; Loening, Andreas; Jamali, Mehran; Barkhodari, Amir; Mosci, Camila; Jackson, Tatianie; Obara, Piotr; Taviani, Valentina; Gambhir, Sanjiv Sam; Vasanawala, Shreyas; Iagaru, Andrei

    2015-12-01

    We prospectively evaluated the use of combined (18)F-NaF/(18)F-FDG PET/CT in patients with breast and prostate cancer and compared the results with those for (99m)Tc-MDP bone scintigraphy and whole-body MRI. Thirty patients (15 women with breast cancer and 15 men with prostate cancer) referred for standard-of-care bone scintigraphy were prospectively enrolled in this study. (18)F-NaF/(18)F-FDG PET/CT and whole-body MRI were performed after bone scintigraphy. The whole-body MRI protocol consisted of both unenhanced and contrast-enhanced sequences. Lesions detected with each test were tabulated, and the results were compared. For extraskeletal lesions, (18)F-NaF/(18)F-FDG PET/CT and whole-body MRI had no statistically significant differences in sensitivity (92.9% vs. 92.9%, P = 1.00), positive predictive value (81.3% vs. 86.7%, P = 0.68), or accuracy (76.5% vs. 82.4%, P = 0.56). However, (18)F-NaF/(18)F-FDG PET/CT showed significantly higher sensitivity and accuracy than whole-body MRI (96.2% vs. 81.4%, P < 0.001, 89.8% vs. 74.7%, P = 0.01) and bone scintigraphy (96.2% vs. 64.6%, P < 0.001, 89.8% vs. 65.9%, P < 0.001) for the detection of skeletal lesions. Overall, (18)F-NaF/(18)F-FDG PET/CT showed higher sensitivity and accuracy than whole-body MRI (95.7% vs. 83.3%, P < 0.002, 87.6% vs. 76.0%, P < 0.02) but not statistically significantly so when compared with a combination of whole-body MRI and bone scintigraphy (95.7% vs. 91.6%, P = 0.17, 87.6% vs. 83.0%, P = 0.53). (18)F-NaF/(18)F-FDG PET/CT showed no significant difference from a combination of (18)F-NaF/(18)F-FDG PET/CT and whole-body MRI. No statistically significant differences in positive predictive value were noted among the 3 examinations. (18)F-NaF/(18)F-FDG PET/CT is superior to whole-body MRI and (99m)Tc-MDP scintigraphy for evaluation of skeletal disease extent. Further, (18)F-NaF/(18)F-FDG PET/CT and whole-body MRI detected extraskeletal disease that may change the management of these patients. (18)F-NaF/(18)F-FDG PET/CT provides diagnostic ability similar to that of a combination of whole-body MRI and bone scintigraphy in patients with breast and prostate cancer. Larger cohorts are needed to confirm these preliminary findings, ideally using the newly introduced simultaneous PET/MRI scanners. © 2015 by the Society of Nuclear Medicine and Molecular Imaging, Inc.

  6. [Comparison of detection sensitivity in rapid-diagnosis influenza virus kits].

    PubMed

    Tokuno, Osamu; Fujiwara, Miki; Nakajoh, Yoshimi; Yamanouchi, Sumika; Adachi, Masayo; Ikeda, Akiko; Kitayama, Shigeo; Takahashi, Toshio; Kase, Tetsuo; Kinoshita, Shouhiro; Kumagai, Shunichi

    2009-09-01

    Rapid-diagnosis kits that detect influenza A and B virus by immunochromatography, developed by different manufacturers, are useful in early diagnosis but may vary widely in detection sensitivity. We compared sensitivity results for eight virus-detection kits in current use--Quick Chaser FluA, B (Mizuho Medy), Espline Influenza A & B-N (Fujirebio), Capilia Flu A + B (Nippon Beckton Dickinson & Alfesa Pharma), Poctem Influenza A/B (Otsuka Pharma & Sysmex), BD Flu Examan (Nippon Beckton Dickinson), Quick Ex-Flu "Seiken" (Denka Seiken), Quick Vue Rapid SP Influ (DP Pharma Biomedical), and Rapid Testa FLU stick (Daiichi Pure Chemicals)--against influenza virus stocks containing five vaccination strains (one A/H1N1, two A/H3N2, and two B) and six clinical strains (two A/H1N1, two A/H3N2, and two B). Minimum detection concentrations giving immunologically positive signals in serial dilution, and RNA copies in positive dilutions by real-time reverse transcriptase-polymerase chain reaction (RT-PCR), were assayed for all kit and virus stock combinations. RNA log10 copy numbers/mL in dilutions within detection limits were 5.68-7.02, 6.37-7.17, and 6.50-8.13 for A/H1N1, A/H3N2, and B, respectively. Statistically significant differences in sensitivity were observed between some kit combinations. Detection sensitivity tended to be relatively higher for influenza A than for influenza B virus. This is assumed to be due to differences in kit design, such as the monoclonal antibodies used, specimen-extraction conditions, and other unknown factors.

  7. A Single-Exposure Dual-Energy Computed Radiography Technique for Improved Nodule Detection and Classification in Chest Imaging

    NASA Astrophysics Data System (ADS)

    Zink, Frank Edward

    The detection and classification of pulmonary nodules is of great interest in chest radiography. Nodules are often indicative of primary cancer, and their detection is particularly important in asymptomatic patients. The ability to classify nodules as calcified or non-calcified is important because calcification is a positive indicator that the nodule is benign. Dual-energy methods offer the potential to improve both the detection and classification of nodules by allowing the formation of material-selective images. Tissue-selective images can improve detection by virtue of the elimination of obscuring rib structure. Bone-selective images are essentially calcium images, allowing classification of the nodule. A dual-energy technique is introduced which uses a computed radiography system to acquire dual-energy chest radiographs in a single exposure. All aspects of the dual-energy technique are described, with particular emphasis on the scatter-correction, beam-hardening correction, and noise-reduction algorithms. The adaptive noise-reduction algorithm employed improves material-selective signal-to-noise ratio by up to a factor of seven with minimal sacrifice in selectivity. A clinical comparison study is described, undertaken to compare the dual-energy technique to conventional chest radiography for the tasks of nodule detection and classification. Observer performance data were collected using the Free Response Observer Characteristic (FROC) method and the bi-normal Alternative FROC (AFROC) performance model. Results of the comparison study, analyzed using two common multiple-observer statistical models, showed that the dual-energy technique was superior to conventional chest radiography for detection of nodules at a statistically significant level (p < .05). Discussion of the comparison study emphasizes the unique combination of data collection and analysis techniques employed, as well as the limitations of comparison techniques in the larger context of technology assessment.
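
    For readers unfamiliar with the dual-energy step, the following is a generic weighted log-subtraction sketch of material-selective imaging, assuming calibrated cancellation weights; it omits the scatter, beam-hardening, and noise-reduction corrections that the thesis emphasizes, and the weights shown are illustrative only.

    ```python
    import numpy as np

    def material_selective(low_kvp, high_kvp, w_tissue=0.5, w_bone=1.5):
        """Combine log-transformed low/high-energy images into selective images.

        The cancellation weights w_tissue and w_bone are placeholders; in
        practice they are calibrated from phantom measurements.
        """
        log_l = np.log(np.clip(low_kvp, 1e-6, None))
        log_h = np.log(np.clip(high_kvp, 1e-6, None))
        bone_selective = log_l - w_tissue * log_h    # cancels soft tissue
        tissue_selective = log_l - w_bone * log_h    # cancels bone
        return tissue_selective, bone_selective
    ```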

  8. Comparison between Urine and Cervical Samples for HPV DNA Detection and Typing in Young Women in Colombia.

    PubMed

    Cómbita, Alba Lucía; Gheit, Tarik; González, Paula; Puerto, Devi; Murillo, Raúl Hernando; Montoya, Luisa; Vorsters, Alex; Van Keer, Severien; Van Damme, Pierre; Tommasino, Massimo; Hernández-Suárez, Gustavo; Sánchez, Laura; Herrero, Rolando; Wiesner, Carolina

    2016-09-01

    Urine sampling for HPV DNA detection has been proposed as an effective method for monitoring the impact of HPV vaccination programs; however, conflicting results have been reported. The goal of this study was to evaluate the performance of optimized urine HPV DNA testing in women aged 19 to 25 years. Optimization process included the use of first void urine, immediate mixing of urine with DNA preservative, and the concentration of all HPV DNA, including cell-free DNA fragments. Urine and cervical samples were collected from 535 young women attending cervical screening at health centers from two Colombian cities. HPV DNA detection and genotyping was performed using an HPV type-specific multiplex genotyping assay, which combines multiplex polymerase chain reaction with bead-based Luminex technology. Concordance between HPV DNA detection in urine and cervical samples was determined using kappa statistics and McNemar tests. The accuracy of HPV DNA testing in urine samples was evaluated measuring sensitivity and specificity using as reference the results obtained from cervical samples. Statistical analysis was performed using STATA11.2 software. The findings revealed an overall HPV prevalence of 60.00% in cervical samples and 64.72% in urine samples, HPV-16 being the most frequent HPV type detected in both specimens. Moreover, our results indicate that detection of HPV DNA in first void urine provides similar results to those obtained with cervical samples and can be used to monitor HPV vaccination trials and programs as evidenced by the substantial concordance found for the detection of the four vaccine types. Cancer Prev Res; 9(9); 766-71. ©2016 AACR. ©2016 American Association for Cancer Research.
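
    The agreement measures named in this record are standard; a small sketch of Cohen's kappa and an exact McNemar test on paired positive/negative results might look as follows (the ten paired results are invented, not study data).

    ```python
    import numpy as np
    from scipy.stats import binomtest

    def cohen_kappa(a, b):
        """Chance-corrected agreement between two binary result vectors."""
        a, b = np.asarray(a, bool), np.asarray(b, bool)
        po = np.mean(a == b)                                   # observed agreement
        pe = a.mean() * b.mean() + (1 - a.mean()) * (1 - b.mean())  # chance agreement
        return (po - pe) / (1 - pe)

    def mcnemar_exact(a, b):
        """Exact McNemar test: a binomial test on the discordant pairs."""
        a, b = np.asarray(a, bool), np.asarray(b, bool)
        n01, n10 = int(np.sum(~a & b)), int(np.sum(a & ~b))
        return binomtest(n10, n10 + n01, 0.5).pvalue

    urine    = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
    cervical = [1, 0, 0, 1, 0, 1, 1, 0, 1, 0]
    print(cohen_kappa(urine, cervical), mcnemar_exact(urine, cervical))
    ```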

  9. Hidden in the background: a local approach to CMB anomalies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sánchez, Juan C. Bueno, E-mail: juan.c.bueno@correounivalle.edu.co

    2016-09-01

    We investigate a framework aiming to provide a common origin for the large-angle anomalies detected in the Cosmic Microwave Background (CMB), which are hypothesized as the result of the statistical inhomogeneity developed by different isocurvature fields of mass m ∼ H present during inflation. The inhomogeneity arises as the combined effect of (i) the initial conditions for isocurvature fields (obtained after a fast-roll stage finishing many e-foldings before cosmological scales exit the horizon), (ii) their inflationary fluctuations and (iii) their coupling to other degrees of freedom. Our case of interest is when these fields (interpreted as the precursors of large-angle anomalies) leave an observable imprint only in isolated patches of the Universe. When the latter intersect the last scattering surface, such imprints arise in the CMB. Nevertheless, due to their statistically inhomogeneous nature, these imprints are difficult to detect, for they become hidden in the background similarly to the Cold Spot. We then compute the probability that a single isocurvature field becomes inhomogeneous at the end of inflation and find that, if the appropriate conditions are given (which depend exclusively on the preexisting fast-roll stage), this probability is at the percent level. Finally, we discuss several mechanisms (including the curvaton and the inhomogeneous reheating) to investigate whether an initial statistically inhomogeneous isocurvature field fluctuation might give rise to some of the observed anomalies. In particular, we focus on the Cold Spot, the power deficit at low multipoles and the breaking of statistical isotropy.

  10. The imprint of f(R) gravity on weak gravitational lensing - II. Information content in cosmic shear statistics

    NASA Astrophysics Data System (ADS)

    Shirasaki, Masato; Nishimichi, Takahiro; Li, Baojiu; Higuchi, Yuichi

    2017-04-01

    We investigate the information content of various cosmic shear statistics on the theory of gravity. Focusing on the Hu-Sawicki-type f(R) model, we perform a set of ray-tracing simulations and measure the convergence bispectrum, peak counts and Minkowski functionals. We first show that while the convergence power spectrum does have sensitivity to the current value of extra scalar degree of freedom |fR0|, it is largely compensated by a change in the present density amplitude parameter σ8 and the matter density parameter Ωm0. With accurate covariance matrices obtained from 1000 lensing simulations, we then examine the constraining power of the three additional statistics. We find that these probes are indeed helpful to break the parameter degeneracy, which cannot be resolved from the power spectrum alone. We show that especially the peak counts and Minkowski functionals have the potential to rigorously (marginally) detect the signature of modified gravity with the parameter |fR0| as small as 10-5 (10-6) if we can properly model them on small (˜1 arcmin) scale in a future survey with a sky coverage of 1500 deg2. We also show that the signal level is similar among the additional three statistics and all of them provide complementary information to the power spectrum. These findings indicate the importance of combining multiple probes beyond the standard power spectrum analysis to detect possible modifications to general relativity.

  11. Statistical Traffic Anomaly Detection in Time-Varying Communication Networks

    DTIC Science & Technology

    2015-02-01

    Our methods perform better than their vanilla counterparts, which assume that normal traffic is stationary. The proposed statistics are useful not only for anomaly detection but also for understanding the normal traffic in time-varying networks.

  12. Statistical Traffic Anomaly Detection in Time Varying Communication Networks

    DTIC Science & Technology

    2015-02-01

    Our methods perform better than their vanilla counterparts, which assume that normal traffic is stationary. The proposed statistics are useful not only for anomaly detection but also for understanding the normal traffic in time-varying networks.

  13. Anatomy of the Higgs fits: A first guide to statistical treatments of the theoretical uncertainties

    NASA Astrophysics Data System (ADS)

    Fichet, Sylvain; Moreau, Grégory

    2016-04-01

    The studies of the Higgs boson couplings based on the recent and upcoming LHC data open up a new window on physics beyond the Standard Model. In this paper, we propose a statistical guide to the consistent treatment of the theoretical uncertainties entering the Higgs rate fits. Both the Bayesian and frequentist approaches are systematically analysed in a unified formalism. We present analytical expressions for the marginal likelihoods, useful to implement simultaneously the experimental and theoretical uncertainties. We review the various origins of the theoretical errors (QCD, EFT, PDF, production mode contamination…). All these individual uncertainties are thoroughly combined with the help of moment-based considerations. The theoretical correlations among Higgs detection channels appear to affect the location and size of the best-fit regions in the space of Higgs couplings. We discuss the recurrent question of the shape of the prior distributions for the individual theoretical errors and find that a nearly Gaussian prior arises from the error combinations. We also develop the bias approach, which is an alternative to marginalisation providing more conservative results. The statistical framework to apply the bias principle is introduced and two realisations of the bias are proposed. Finally, depending on the statistical treatment, the Standard Model prediction for the Higgs signal strengths is found to lie within either the 68% or 95% confidence level region obtained from the latest analyses of the 7 and 8 TeV LHC datasets.

  14. Clinical value of combining transvaginal contrast-enhanced ultrasonography with serum human epididymisprotein-4 and the resistance index for early-stage epithelial ovarian cancer

    PubMed Central

    Meng, Wu; Ying, Wang; Qichao, Zheng; Ping, Li; Jie, Tang

    2017-01-01

    Objectives: To increase the accuracy of detection and differential diagnosis of early epithelial ovarian cancer (EOC) with transvaginal contrast-enhanced ultrasonography (TVCEUS) combined with serum human epididymis protein 4 (HE4) and the resistance index (RI). Methods: This retrospective case-control study reviewed 230 patients with ovarian tumors at the Department of Gynecology and Obstetrics, Zhongnan Hospital, Wuhan University, Wuhan, China between June 2008 and September 2015. Before operation, tumor vascular contrast-enhanced ultrasonography morphology scores (U), the time-intensity curve (TIC) of contrast-enhanced ultrasonography, HE4, and RI were measured in 110 patients with EOC (Group A) and 120 patients with benign ovarian tumors (Group B). Results were compared with the histopathological analysis results. Results: The ultrasonography morphology scores, the peak intensity (PI) and enhancement rate (ER) parameters of the TIC, and HE4 were higher in Group A than in Group B, and the RI was lower. The detection rates for all indexes in the benign and malignant groups were determined and compared with the histopathological results. The detection rate differences for HE4 (p=0.001), RI (p=0.001), U (p=0.001), PI (p=0.001), and ER (p=0.001) were all statistically significant (p<0.05). Conclusion: Combined TVCEUS, HE4, and RI detection has high clinical value and can increase the sensitivity of the diagnosis and differential diagnosis of early EOC. PMID:28578437

  15. Combing signals from spontaneous reports and electronic health records for detection of adverse drug reactions

    PubMed Central

    Harpaz, Rave; Vilar, Santiago; DuMouchel, William; Salmasian, Hojjat; Haerian, Krystl; Shah, Nigam H; Chase, Herbert S; Friedman, Carol

    2013-01-01

    Objective Data-mining algorithms that can produce accurate signals of potentially novel adverse drug reactions (ADRs) are a central component of pharmacovigilance. We propose a signal-detection strategy that combines the adverse event reporting system (AERS) of the Food and Drug Administration and electronic health records (EHRs) by requiring signaling in both sources. We claim that this approach leads to improved accuracy of signal detection when the goal is to produce a highly selective ranked set of candidate ADRs. Materials and methods Our investigation was based on over 4 million AERS reports and information extracted from 1.2 million EHR narratives. Well-established methodologies were used to generate signals from each source. The study focused on ADRs related to three high-profile serious adverse reactions. A reference standard of over 600 established and plausible ADRs was created and used to evaluate the proposed approach against a comparator. Results The combined signaling system achieved a statistically significant large improvement over AERS (baseline) in the precision of top ranked signals. The average improvement ranged from 31% to almost threefold for different evaluation categories. Using this system, we identified a new association between the agent, rasburicase, and the adverse event, acute pancreatitis, which was supported by clinical review. Conclusions The results provide promising initial evidence that combining AERS with EHRs via the framework of replicated signaling can improve the accuracy of signal detection for certain operating scenarios. The use of additional EHR data is required to further evaluate the capacity and limits of this system and to extend the generalizability of these results. PMID:23118093

  16. A mathematical model approach toward combining information from multiple image projections of the same patient

    NASA Astrophysics Data System (ADS)

    Chawla, Amarpreet S.; Samei, Ehsan; Abbey, Craig

    2007-03-01

    In this study, we used a mathematical observer model to combine information obtained from multiple angular projections of the same breast to determine the overall detection performance of a multi-projection breast imaging system in detectability of a simulated mass. 82 subjects participated in the study and 25 angular projections of each breast were acquired. Projections from a simulated 3 mm 3-D lesion were added to the projection images. The lesion was assumed to be embedded in the compressed breast at a distance of 3 cm from the detector. Hotelling observer with Laguerre-Gauss channels (LG CHO) was applied to each image. Detectability was analyzed in terms of ROC curves and the area under ROC curves (AUC). The critical question studied is how to best integrate the individual decision variables across multiple (correlated) views. Towards that end, three different methods were investigated. Specifically, 1) ROCs from different projections were simply averaged; 2) the test statistics from different projections were averaged; and 3) a Bayesian decision fusion rule was used. Finally, AUC of the combined ROC was used as a parameter to optimize the acquisition parameters to maximize the performance of the system. It was found that the Bayesian decision fusion technique performs better than the other two techniques and likely offers the best approximation of the diagnostic process. Furthermore, if the total dose level is held constant at 1/25th of dual-view mammographic screening dose, the highest detectability performance is observed when considering only two projections spread along an angular span of 11.4°.
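
    To make the fusion rules concrete, here is a toy comparison of rule 2 (averaging the per-projection test statistics) and rule 3 (a likelihood-ratio fusion, one simple reading of the Bayesian rule) under an assumed Gaussian model in which signal strength varies across views; all numbers are invented and this is not the authors' exact model.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_views = 25
    mu = np.linspace(0.1, 0.8, n_views)                # per-view signal strength (assumed)
    absent  = rng.normal(0.0, 1.0, (1000, n_views))    # decision variables, no lesion
    present = rng.normal(mu, 1.0, (1000, n_views))     # decision variables, lesion

    def fuse_average(t):
        """Rule 2: average the per-projection test statistics."""
        return t.mean(axis=1)

    def fuse_likelihood_ratio(t):
        """Rule 3: Gaussian log-likelihood-ratio fusion reduces to a mu-weighted sum."""
        return t @ mu

    def auc(neg, pos):
        """Mann-Whitney estimate of the area under the ROC curve."""
        return (pos[:, None] > neg[None, :]).mean()

    for fuse in (fuse_average, fuse_likelihood_ratio):
        print(fuse.__name__, round(auc(fuse(absent), fuse(present)), 3))
    ```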

  17. [The application of the prospective space-time statistic in early warning of infectious disease].

    PubMed

    Yin, Fei; Li, Xiao-Song; Feng, Zi-Jian; Ma, Jia-Qi

    2007-06-01

    To investigate the application of the prospective space-time scan statistic in the early detection of infectious disease outbreaks. The prospective space-time scan statistic was tested by mimicking daily prospective analyses of bacillary dysentery data of Chengdu city in 2005 (3212 cases in 102 towns and villages), and the results were compared with those of the purely temporal scan statistic. The prospective space-time scan statistic could give specific information in both space and time. The results for June indicated that the prospective space-time scan statistic could promptly detect the outbreaks that started from the local site, and the early warning message was powerful (P = 0.007). The purely temporal scan statistic detected the outbreak two days later, and its signal was less powerful (P = 0.039). The prospective space-time scan statistic makes full use of the spatial and temporal information in infectious disease data and can promptly and effectively detect outbreaks that start from local sites. It could be an important tool for local and national CDCs setting up early detection surveillance systems.
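
    A toy version of the Poisson space-time scan may help fix ideas: candidate cylinders combine a spatial zone with the most recent d days, and each is scored by its log-likelihood ratio. The Monte Carlo significance testing that the method requires is omitted, and the zone definitions below are placeholders.

    ```python
    import numpy as np

    def poisson_llr(c, e, C):
        """Kulldorff-style LLR for c observed vs. e expected cases in a cylinder."""
        if c <= e:
            return 0.0
        out = c * np.log(c / e)
        if C > c:
            out += (C - c) * np.log((C - c) / (C - e))
        return out

    def prospective_scan(counts, expected, zones, max_days=7):
        """counts, expected: (n_sites, n_days) arrays; zones: lists of site indices."""
        C = counts.sum()
        best_score, best_cylinder = 0.0, None
        for sites in zones:
            for d in range(1, max_days + 1):           # windows ending "today"
                c = counts[sites, -d:].sum()
                e = expected[sites, -d:].sum()
                score = poisson_llr(c, e, C)
                if score > best_score:
                    best_score, best_cylinder = score, (tuple(sites), d)
        return best_score, best_cylinder

    rng = np.random.default_rng(0)
    counts = rng.poisson(2.0, (5, 30)).astype(float)
    counts[0:2, -3:] += 6                              # seeded outbreak in sites 0-1
    expected = np.full((5, 30), 2.0)
    print(prospective_scan(counts, expected, zones=[[0], [0, 1], [2, 3, 4]]))
    ```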

  18. Multi-scales region segmentation for ROI separation in digital mammograms

    NASA Astrophysics Data System (ADS)

    Zhang, Dapeng; Zhang, Di; Li, Yue; Wang, Wei

    2017-02-01

    Mammography is currently the most effective imaging modality used by radiologists for the screening of breast cancer. Segmentation is one of the key steps in developing anatomical models for the calculation of safe medical radiation doses. This paper explores the potential of the statistical region merging (SRM) segmentation technique for breast segmentation in digital mammograms. First, the mammograms are pre-processed for region enhancement; then the enhanced images are segmented using SRM at multiple scales; finally, these segmentations are combined for region of interest (ROI) separation and edge detection. The proposed algorithm uses multi-scale region segmentation to separate the breast region from the background, detect region edges, and separate ROIs. The experiments are performed using a data set of mammograms from different patients, demonstrating the validity of the proposed criterion. Results show that the statistical region merging algorithm can segment medical images more accurately than other methods and has great potential to become a method of choice for the segmentation of mammograms.

  19. Enumerating Sparse Organisms in Ships’ Ballast Water: Why Counting to 10 Is Not So Easy

    PubMed Central

    2011-01-01

    To reduce ballast water-borne aquatic invasions worldwide, the International Maritime Organization and United States Coast Guard have each proposed discharge standards specifying maximum concentrations of living biota that may be released in ships’ ballast water (BW), but these regulations still lack guidance for standardized type approval and compliance testing of treatment systems. Verifying whether BW meets a discharge standard poses significant challenges. Properly treated BW will contain extremely sparse numbers of live organisms, and robust estimates of rare events require extensive sampling efforts. A balance of analytical rigor and practicality is essential to determine the volume of BW that can be reasonably sampled and processed, yet yield accurate live counts. We applied statistical modeling to a range of sample volumes, plankton concentrations, and regulatory scenarios (i.e., levels of type I and type II errors), and calculated the statistical power of each combination to detect noncompliant discharge concentrations. The model expressly addresses the roles of sampling error, BW volume, and burden of proof on the detection of noncompliant discharges in order to establish a rigorous lower limit of sampling volume. The potential effects of recovery errors (i.e., incomplete recovery and detection of live biota) in relation to sample volume are also discussed. PMID:21434685
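
    A sketch of the statistical-power idea: counts in a sampled volume are modeled as Poisson, and we ask how often a noncompliant concentration would be caught. The 10 organisms per cubic meter standard matches the abstract, but the simple decision rule below is an illustration, not the authors' model.

    ```python
    from scipy.stats import poisson

    def power(true_conc, volume_m3, standard=10.0, alpha=0.05):
        """P(reject compliance) when the true concentration is true_conc."""
        # smallest count k with P(N >= k | compliant at the standard) <= alpha
        k_crit = poisson.ppf(1 - alpha, standard * volume_m3) + 1
        return 1 - poisson.cdf(k_crit - 1, true_conc * volume_m3)

    for v in (0.1, 1.0, 7.0):
        print(f"{v:4} m^3 sampled -> power {power(15.0, v):.2f} at 15 organisms/m^3")
    ```

    Even at 1.5 times the standard, small sample volumes leave the test nearly powerless, which is the paper's central point about rigorous lower limits on sampling volume.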

  20. Automated oil spill detection with multispectral imagery

    NASA Astrophysics Data System (ADS)

    Bradford, Brian N.; Sanchez-Reyes, Pedro J.

    2011-06-01

    In this publication we present an automated detection method for ocean surface oil, like that which existed in the Gulf of Mexico as a result of the April 20, 2010 Deepwater Horizon drilling rig explosion. Regions of surface oil in airborne imagery are isolated using red, green, and blue bands from multispectral data sets. The oil shape isolation procedure involves a series of image processing functions to draw out the visual phenomenological features of the surface oil. These functions include selective color band combinations, contrast enhancement and histogram warping. An image segmentation process then separates out contiguous regions of oil to provide a raster mask to an analyst. We automate the detection algorithm to allow large volumes of data to be processed in a short time period, which can provide timely oil coverage statistics to response crews. Geo-referenced and mosaicked data sets enable the largest identified oil regions to be mapped to exact geographic coordinates. In our simulation, multispectral imagery came from multiple sources including first-hand data collected from the Gulf. Results of the simulation show the oil spill coverage area as a raster mask, along with histogram statistics of the oil pixels. A rough square footage estimate of the coverage is reported if the image ground sample distance is available.

  1. Analysis of a Smartphone-Based Architecture with Multiple Mobility Sensors for Fall Detection with Supervised Learning

    PubMed Central

    Santoyo-Ramón, José Antonio

    2018-01-01

    This paper describes a wearable Fall Detection System (FDS) based on a body-area network consisting of four nodes provided with inertial sensors and Bluetooth wireless interfaces. The signals captured by the nodes are sent to a smartphone which simultaneously acts as another sensing point. In contrast to many FDSs proposed in the literature (which only consider a single sensor), the multisensory nature of the prototype is utilized to investigate the impact of the number and positions of the sensors on the effectiveness of the fall detection decision. In particular, the study assesses the capability of four popular machine learning algorithms to discriminate the dynamics of the Activities of Daily Living (ADLs) and falls generated by a set of experimental subjects when the combined use of sensors located on different parts of the body is considered. Prior to this, the selection of the statistics that optimize the characterization of the acceleration signals and the efficacy of the FDS is also investigated. As another important methodological novelty in this field, the statistical significance of all the results (an aspect which is usually neglected by other works) is validated by an analysis of variance (ANOVA). PMID:29642638

  2. Nondestructive Detection and Quantification of Blueberry Bruising using Near-infrared (NIR) Hyperspectral Reflectance Imaging

    NASA Astrophysics Data System (ADS)

    Jiang, Yu; Li, Changying; Takeda, Fumiomi

    2016-10-01

    Currently, blueberry bruising is evaluated by either human visual/tactile inspection or firmness measurement instruments. These methods are destructive, time-consuming, and subjective. The goal of this paper was to develop a non-destructive approach for blueberry bruising detection and quantification. Experiments were conducted on 300 samples of southern highbush blueberry (Camellia, Rebel, and Star) and on 1500 samples of northern highbush blueberry (Bluecrop, Jersey, and Liberty) for hyperspectral imaging analysis, firmness measurement, and human evaluation. An algorithm was developed to automatically calculate a bruise ratio index (ratio of bruised to whole fruit area) for bruise quantification. The spectra of bruised and healthy tissues were statistically separated and the separation was independent of cultivars. Support vector machine (SVM) classification of the spectra from the regions of interest (ROIs) achieved over 94%, 92%, and 96% accuracy on the training set, independent testing set, and combined set, respectively. The statistical results showed that the bruise ratio index was equivalent to the measured firmness but better than the predicted firmness in regard to effectiveness of bruise quantification, and the bruise ratio index had a strong correlation with human assessment (R2 = 0.78 - 0.83). Therefore, the proposed approach and the bruise ratio index are effective to non-destructively detect and quantify blueberry bruising.

  3. Nondestructive Detection and Quantification of Blueberry Bruising using Near-infrared (NIR) Hyperspectral Reflectance Imaging.

    PubMed

    Jiang, Yu; Li, Changying; Takeda, Fumiomi

    2016-10-21

    Currently, blueberry bruising is evaluated by either human visual/tactile inspection or firmness measurement instruments. These methods are destructive, time-consuming, and subjective. The goal of this paper was to develop a non-destructive approach for blueberry bruising detection and quantification. Experiments were conducted on 300 samples of southern highbush blueberry (Camellia, Rebel, and Star) and on 1500 samples of northern highbush blueberry (Bluecrop, Jersey, and Liberty) for hyperspectral imaging analysis, firmness measurement, and human evaluation. An algorithm was developed to automatically calculate a bruise ratio index (ratio of bruised to whole fruit area) for bruise quantification. The spectra of bruised and healthy tissues were statistically separated and the separation was independent of cultivars. Support vector machine (SVM) classification of the spectra from the regions of interest (ROIs) achieved over 94%, 92%, and 96% accuracy on the training set, independent testing set, and combined set, respectively. The statistical results showed that the bruise ratio index was equivalent to the measured firmness but better than the predicted firmness in regard to effectiveness of bruise quantification, and the bruise ratio index had a strong correlation with human assessment (R2 = 0.78 - 0.83). Therefore, the proposed approach and the bruise ratio index are effective to non-destructively detect and quantify blueberry bruising.
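
    As an illustration of the SVM step only, this sketch trains a classifier on synthetic "spectra" standing in for the bruised and healthy ROI pixels; the real study used measured hyperspectral reflectance, and all parameters here are invented.

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(1)
    n, bands = 600, 120
    healthy = rng.normal(0.55, 0.05, (n, bands))
    # bruised tissue: slightly lower reflectance with a mild spectral tilt
    bruised = rng.normal(0.50, 0.05, (n, bands)) * np.linspace(1.0, 0.9, bands)
    X = np.vstack([healthy, bruised])
    y = np.r_[np.zeros(n), np.ones(n)]

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)).fit(X_tr, y_tr)
    print(f"test accuracy: {clf.score(X_te, y_te):.3f}")
    ```

    The bruise ratio index of the abstract would then be the fraction of fruit-area pixels the classifier labels as bruised.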

  4. Accounting for undetected compounds in statistical analyses of mass spectrometry 'omic studies.

    PubMed

    Taylor, Sandra L; Leiserowitz, Gary S; Kim, Kyoungmi

    2013-12-01

    Mass spectrometry is an important high-throughput technique for profiling small molecular compounds in biological samples and is widely used to identify potential diagnostic and prognostic compounds associated with disease. Commonly, data generated by mass spectrometry have many missing values, which result when a compound is absent from a sample or is present at a concentration below the detection limit. Several strategies are available for statistically analyzing data with missing values. The accelerated failure time (AFT) model assumes all missing values result from censoring below a detection limit. Under a mixture model, missing values can result from a combination of censoring and the absence of a compound. We compare power and estimation of a mixture model to an AFT model. Based on simulated data, we found the AFT model to have greater power to detect differences in means and point mass proportions between groups. However, the AFT model yielded biased estimates, with the bias increasing as the proportion of observations in the point mass increased, while estimates were unbiased with the mixture model unless all missing observations came from censoring. These findings suggest using the AFT model for hypothesis testing and the mixture model for estimation. We demonstrated this approach through application to glycomics data of serum samples from women with ovarian cancer and matched controls.
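
    To spell out the contrast between the two models for a single compound in one group, here are illustrative log-likelihoods (not the authors' exact formulation): the AFT view left-censors every missing value at the detection limit, while the mixture adds a point mass for true absence. Either can be maximized with a generic optimizer such as scipy.optimize.minimize.

    ```python
    import numpy as np
    from scipy.stats import norm

    def loglik_aft(mu, sigma, logx, n_missing, log_lod):
        """All missing values are treated as censored below the limit of detection."""
        return (norm.logpdf(logx, mu, sigma).sum()
                + n_missing * norm.logcdf((log_lod - mu) / sigma))

    def loglik_mixture(mu, sigma, pi_absent, logx, n_missing, log_lod):
        """Missing = truly absent with prob pi_absent, else censored below the LOD."""
        p_missing = pi_absent + (1 - pi_absent) * norm.cdf((log_lod - mu) / sigma)
        return (len(logx) * np.log(1 - pi_absent)
                + norm.logpdf(logx, mu, sigma).sum()
                + n_missing * np.log(p_missing))

    x_obs = np.array([1.2, 0.8, 1.5])          # log-abundances above the LOD (made up)
    print(loglik_aft(0.0, 1.0, x_obs, n_missing=5, log_lod=0.5))
    print(loglik_mixture(0.0, 1.0, 0.3, x_obs, n_missing=5, log_lod=0.5))
    ```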

  5. Learning-dependent plasticity with and without training in the human brain.

    PubMed

    Zhang, Jiaxiang; Kourtzi, Zoe

    2010-07-27

    Long-term experience through development and evolution and shorter-term training in adulthood have both been suggested to contribute to the optimization of visual functions that mediate our ability to interpret complex scenes. However, the brain plasticity mechanisms that mediate the detection of objects in cluttered scenes remain largely unknown. Here, we combine behavioral and functional MRI (fMRI) measurements to investigate the human-brain mechanisms that mediate our ability to learn statistical regularities and detect targets in clutter. We show two different routes to visual learning in clutter with discrete brain plasticity signatures. Specifically, opportunistic learning of regularities typical in natural contours (i.e., collinearity) can occur simply through frequent exposure, generalize across untrained stimulus features, and shape processing in occipitotemporal regions implicated in the representation of global forms. In contrast, learning to integrate discontinuities (i.e., elements orthogonal to contour paths) requires task-specific training (bootstrap-based learning), is stimulus-dependent, and enhances processing in intraparietal regions implicated in attention-gated learning. We propose that long-term experience with statistical regularities may facilitate opportunistic learning of collinear contours, whereas learning to integrate discontinuities entails bootstrap-based training for the detection of contours in clutter. These findings provide insights in understanding how long-term experience and short-term training interact to shape the optimization of visual recognition processes.

  6. Enumerating sparse organisms in ships' ballast water: why counting to 10 is not so easy.

    PubMed

    Miller, A Whitman; Frazier, Melanie; Smith, George E; Perry, Elgin S; Ruiz, Gregory M; Tamburri, Mario N

    2011-04-15

    To reduce ballast water-borne aquatic invasions worldwide, the International Maritime Organization and United States Coast Guard have each proposed discharge standards specifying maximum concentrations of living biota that may be released in ships' ballast water (BW), but these regulations still lack guidance for standardized type approval and compliance testing of treatment systems. Verifying whether BW meets a discharge standard poses significant challenges. Properly treated BW will contain extremely sparse numbers of live organisms, and robust estimates of rare events require extensive sampling efforts. A balance of analytical rigor and practicality is essential to determine the volume of BW that can be reasonably sampled and processed, yet yield accurate live counts. We applied statistical modeling to a range of sample volumes, plankton concentrations, and regulatory scenarios (i.e., levels of type I and type II errors), and calculated the statistical power of each combination to detect noncompliant discharge concentrations. The model expressly addresses the roles of sampling error, BW volume, and burden of proof on the detection of noncompliant discharges in order to establish a rigorous lower limit of sampling volume. The potential effects of recovery errors (i.e., incomplete recovery and detection of live biota) in relation to sample volume are also discussed.

  7. Sequential Monte Carlo tracking of the marginal artery by multiple cue fusion and random forest regression.

    PubMed

    Cherry, Kevin M; Peplinski, Brandon; Kim, Lauren; Wang, Shijun; Lu, Le; Zhang, Weidong; Liu, Jianfei; Wei, Zhuoshi; Summers, Ronald M

    2015-01-01

    Given the potential importance of marginal artery localization in automated registration in computed tomography colonography (CTC), we have devised a semi-automated method of marginal vessel detection employing sequential Monte Carlo tracking (also known as particle filter tracking) by multiple cue fusion based on intensity, vesselness, organ detection, and minimum spanning tree information for poorly enhanced vessel segments. We then employed a random forest algorithm for intelligent cue fusion and decision making, which achieved high sensitivity and robustness. After applying a vessel pruning procedure to the tracking results, we achieved statistically significantly improved precision compared to a baseline Hessian detection method (75.2% versus 2.7% for the baseline, p<0.001). The method also showed a statistically significantly improved recall rate compared to a 2-cue baseline method using fewer vessel cues (67.7% versus 30.7%, p<0.001). These results demonstrate that marginal artery localization on CTC is feasible by combining a discriminative classifier (i.e., random forest) with a sequential Monte Carlo tracking mechanism. In so doing, we present the effective application of an anatomical probability map to vessel pruning as well as a supplementary spatial coordinate system for colonic segmentation and registration when this task has been confounded by colon lumen collapse. Published by Elsevier B.V.

  8. Quality classification of Spanish olive oils by untargeted gas chromatography coupled to hybrid quadrupole-time of flight mass spectrometry with atmospheric pressure chemical ionization and metabolomics-based statistical approach.

    PubMed

    Sales, C; Cervera, M I; Gil, R; Portolés, T; Pitarch, E; Beltran, J

    2017-02-01

    The novel atmospheric pressure chemical ionization (APCI) source has been used in combination with gas chromatography (GC) coupled to hybrid quadrupole time-of-flight (QTOF) mass spectrometry (MS) for determination of volatile components of olive oil, enhancing its potential for classification of olive oil samples according to their quality using a metabolomics-based approach. The full-spectrum acquisition has allowed the detection of volatile organic compounds (VOCs) in olive oil samples, including Extra Virgin, Virgin and Lampante qualities. A dynamic headspace extraction with cartridge solvent elution was applied. The metabolomics strategy consisted of three different steps: a full mass spectral alignment of GC-MS data using MzMine 2.0, a multivariate analysis using Ez-Info and the creation of the statistical model with combinations of responses for molecular fragments. The model was finally validated using blind samples, obtaining an accuracy in oil classification of 70%, taking the official established method, "PANEL TEST", as reference. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Predictive criteria for prostate cancer detection in men with serum PSA concentration of 2.0 to 4.0 ng/mL.

    PubMed

    Kravchick, Sergey; Peled, Ronit; Dorfman, Dov; Agulansky, Leonid; Ben-Dor, David; Cytron, Shmuel

    2005-09-01

    To assess the usefulness of measuring testosterone, free testosterone, and the free/total (f/t) prostate-specific antigen (PSA) ratio with the intention of reducing the number of unnecessary biopsies in patients with PSA values between 2.0 and 4.0 ng/mL. Cancer detection is not rare among patients with PSA values between 2.0 and 4.0 ng/mL. A total of 171 men with serum PSA levels of 2.0 to 4.0 ng/mL were enrolled in this study. The f/t PSA ratio and total and free testosterone levels were quantified. All patients underwent transrectal ultrasound-guided biopsy. The cancer detection rate, clinical and pathologic features of the cancers detected, and the probability of cancer detection in relation to the f/t PSA ratio and total and free testosterone levels were estimated. Two-step statistical analysis was used for descriptive purposes and in the detection of cancer predictors. Statistical significance was set at P ≤ 0.05. The mean patient age was 63.3 years. Cancer was detected in 39 (22.8%) of the 171 patients. Only 15.4% of our patients had insignificant cancer. The f/t PSA ratio and total and free testosterone levels were significantly lower in the patients with prostate cancer (19.3%, 13.68 nmol/L, and 28.4 pmol/L, respectively; P < 0.001). The f/t PSA ratio and free testosterone were the strongest predictors of cancer detection (P < 0.001). The results of our study show that a substantial number of cancers can be detected in the PSA range of 2.0 to 4.0 ng/mL. The great majority of cancers detected have the features of medically significant tumors. The combination of the f/t PSA ratio and free testosterone measurements may reveal those patients who require biopsy.

  10. The effect of swab sample choice on the detection of avian influenza in apparently healthy wild ducks

    USGS Publications Warehouse

    Ip, Hon S.; Dusek, Robert J.; Heisey, Dennis M.

    2012-01-01

    Historically, avian influenza viruses have been isolated from cloacal swab specimens, but recent data suggest that the highly pathogenic avian influenza (HPAI) H5N1 virus can be better detected from respiratory tract specimens. To better understand how swab sample type affects the detection ability of low pathogenic avian influenza (LPAI) viruses, we collected and tested four swab types: oropharyngeal swabs (OS), cloacal swabs (CS), the two swab types combined in the laboratory (LCS), and the two swab types combined in the field (FCS). A total of 1968 wild waterfowl were sampled by each of these four methods and tested for avian influenza virus using matrix gene reverse-transcription (RT)-PCR. The highest detection rate occurred with the FCS (4.3%), followed by the CS (4.0%). Although this difference did not achieve traditional statistical significance, Bayesian analysis indicated that FCS was superior to CS with an 82% probability. The detection rates for both the LCS (2.4%) and the OS (0.4%) were significantly different from the FCS. In addition, every swab type that was matrix RT-PCR positive was also tested for recovery of viable influenza virus. This protocol reduced the detection rate, but the ordering of swab types remained the same: 1.73% FCS, 1.42% CS, 0.81% LCS, and 0% OS. Our data suggest that the FCS performed at least as well as any other swab type for detecting LPAI viruses in the wild ducks tested. When considering recent studies showing that HPAI H5N1 can be better detected in the respiratory tract, the FCS is the most appropriate sample to collect for HPAI H5N1 surveillance while not compromising LPAI studies.
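
    The Bayesian statement in this record (an 82% probability that FCS beats CS) can be approximated, under simplifying assumptions, by comparing independent Beta posteriors. Note this ignores the paired design of the study, so it will not reproduce the published figure exactly, and the counts are back-calculated from the reported rates.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1968
    pos_fcs, pos_cs = round(0.043 * n), round(0.040 * n)   # from reported rates

    # Beta(1 + successes, 1 + failures) posteriors under uniform priors
    theta_fcs = rng.beta(1 + pos_fcs, 1 + n - pos_fcs, 100_000)
    theta_cs  = rng.beta(1 + pos_cs, 1 + n - pos_cs, 100_000)
    print("P(FCS rate > CS rate) ~", (theta_fcs > theta_cs).mean())
    ```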

  11. Are concurrent systematic cores needed at the time of targeted biopsy in patients with prior negative prostate biopsies?

    PubMed

    Albisinni, S; Aoun, F; Noel, A; El Rassy, E; Lemort, M; Paesmans, M; van Velthoven, R; Roumeguère, T; Peltier, A

    2018-01-01

    MRI-guided targeted biopsies are advised in patients who have undergone an initial series of negative systematic biopsies but in whom prostate cancer (PCa) suspicion remains elevated. The aim of the study was to evaluate whether, in men with prior negative prostate biopsies, systematic cores are also warranted at the time of an MRI-targeted repeat biopsy. We enrolled patients with prior negative biopsies undergoing real-time MRI/TRUS fusion-guided prostate biopsy at our institute between 2014 and 2016. Patients with at least one index lesion on multiparametric MRI were included. All eligible patients underwent both systematic random biopsies (12-14 cores) and targeted biopsies (2-4 cores). The study included 74 men with a median age of 65 years, PSA level of 9.27 ng/mL, and prostatic volume of 45 ml. The overall PCa detection rate and the clinically significant cancer detection rate were 56.7% and 39.2%, respectively. Targeted cores demonstrated a clinically significant PCa detection rate similar to that of systematic cores (33.8% vs. 28.4%, P=0.38) with significantly less tissue sampling. Indeed, a combination approach was significantly superior to a targeted-only approach in overall PCa detection (+16.7% overall detection rate, P=0.007). Although differences in clinically significant PCa detection were statistically non-significant (P=0.13), a combination approach did allow the detection of 7 additional clinically significant PCas (+13.8%). In patients with elevated PSA and prior negative biopsies, concurrent systematic sampling may be needed at the time of targeted biopsy in order to maximize the PCa detection rate. Larger studies are needed to validate our findings. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  12. SVM based colon polyps classifier in a wireless active stereo endoscope.

    PubMed

    Ayoub, J; Granado, B; Mhanna, Y; Romain, O

    2010-01-01

    This work focuses on the recognition of three-dimensional colon polyps captured by an active stereo vision sensor. The detection algorithm consists of an SVM classifier trained on robust feature descriptors. The study is related to Cyclope, a prototype sensor that allows real-time 3D object reconstruction and continues to be optimized to improve its classification task of differentiating between hyperplastic and adenomatous polyps. Experimental results were encouraging and show a correct classification rate of approximately 97%. The work contains detailed statistics about the detection rate and the computing complexity. Inspired by the intensity histogram, the work presents a new approach that extracts a set of features based on the depth histogram and combines stereo measurements with SVM classifiers to correctly classify benign and malignant polyps.

  13. Experimental design and statistical analysis for three-drug combination studies.

    PubMed

    Fang, Hong-Bin; Chen, Xuerong; Pei, Xin-Yan; Grant, Steven; Tan, Ming

    2017-06-01

    Drug combination is a critically important therapeutic approach for complex diseases such as cancer and HIV due to its potential for efficacy at lower, less toxic doses and the need to move new therapies rapidly into clinical trials. One of the key issues is to identify which combinations are additive, synergistic, or antagonistic. While the value of multidrug combinations has been well recognized in the cancer research community, to the best of our knowledge all existing experimental studies rely on fixing the dose of one drug to reduce the dimensionality, e.g. looking at pairwise two-drug combinations, a suboptimal design. Hence, there is an urgent need to develop experimental design and analysis methods for studying multidrug combinations directly. Because the complexity of the problem increases exponentially with the number of constituent drugs, there has been little progress in the development of methods for the design and analysis of high-dimensional drug combinations. In fact, contrary to common mathematical reasoning, the case of three-drug combinations is fundamentally more difficult than that of two-drug combinations. Finding the doses of the combination, the number of combinations, and the replicates needed to detect departures from additivity depends on the dose-response shapes of the individual constituent drugs. Thus, different classes of drugs with different dose-response shapes need to be treated as separate cases. Our application and case studies develop dose-finding and sample-size methods for detecting departures from additivity with several common (linear and log-linear) classes of single dose-response curves. Furthermore, utilizing the geometric features of the interaction index, we propose a nonparametric model to estimate the interaction index surface by B-spline approximation and derive its asymptotic properties. Utilizing the method, we designed and analyzed a combination study of three anticancer drugs, PD184, HA14-1, and CEP3891, inhibiting the myeloma H929 cell line. To the best of our knowledge, this is the first-ever three-drug combination study performed based on the original 4D dose-response surface formed by the dose ranges of three drugs.
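
    A compact sketch of the Loewe-additivity interaction index that underlies such analyses: II(E) is the sum over drugs of the mixture dose divided by the single-drug dose producing the same effect E. The full paper estimates an index surface by B-spline approximation; here we only evaluate the index at one dose combination, with invented Hill-curve parameters. II < 1 suggests synergy, II = 1 additivity, II > 1 antagonism.

    ```python
    import numpy as np

    def hill_inverse(effect, ec50, slope):
        """Dose of a single drug producing the given fractional effect (Hill model)."""
        return ec50 * (effect / (1.0 - effect)) ** (1.0 / slope)

    def interaction_index(doses, effect, ec50s, slopes):
        """Loewe interaction index: sum_i d_i / D_i(E)."""
        return sum(d / hill_inverse(effect, e, s)
                   for d, e, s in zip(doses, ec50s, slopes))

    # three-drug combination at observed effect E = 0.5 (hypothetical numbers)
    ii = interaction_index(doses=[0.3, 0.2, 0.1], effect=0.5,
                           ec50s=[1.0, 0.8, 0.5], slopes=[1.2, 1.0, 0.9])
    print(f"interaction index: {ii:.2f}")   # < 1 here, i.e. synergy
    ```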

  14. Power analysis as a tool to identify statistically informative indicators for monitoring coral reef disturbances.

    PubMed

    Van Wynsberge, Simon; Gilbert, Antoine; Guillemot, Nicolas; Heintz, Tom; Tremblay-Boyer, Laura

    2017-07-01

    Extensive biological field surveys are costly and time consuming. To optimize sampling and ensure regular long-term monitoring, identifying informative indicators of anthropogenic disturbances is a priority. In this study, we generated 1800 candidate indicators by combining metrics measured from coral, fish, and macro-invertebrate assemblages surveyed from 2006 to 2012 in the vicinity of an ongoing mining project in the Voh-Koné-Pouembout lagoon, New Caledonia. We performed a power analysis to identify a subset of indicators which would best discriminate temporal changes due to a simulated chronic anthropogenic impact. Only 4% of the tested indicators were likely to detect a 10% annual decrease in values with sufficient power (>0.80). Corals generally yielded higher statistical power than macro-invertebrates and fishes because of lower natural variability and higher occurrence. For the same reasons, higher taxonomic ranks provided higher power than lower taxonomic ranks. Nevertheless, a number of families of common sedentary or sessile macro-invertebrates and fishes also performed well in detecting changes: Echinometridae, Isognomidae, Muricidae, Tridacninae, Arcidae, and Turbinidae for macro-invertebrates and Pomacentridae, Labridae, and Chaetodontidae for fishes. Interestingly, these families did not provide high power in all geomorphological strata, suggesting that the ability of indicators to detect anthropogenic impacts is closely linked to reef geomorphology. This study provides a first operational step toward identifying statistically relevant indicators of anthropogenic disturbances in New Caledonia's coral reefs, which can be useful in similar tropical reef ecosystems where little information is available regarding the responses of ecological indicators to anthropogenic disturbances.
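
    The power analysis can be mimicked by simulation: impose a 10% annual decline on an indicator with lognormal sampling noise and count how often a log-linear trend test rejects. The coefficient-of-variation values below are invented and merely illustrate why low natural variability drives power, as the abstract reports for corals.

    ```python
    import numpy as np
    from scipy.stats import linregress

    def trend_power(cv=0.4, years=7, decline=0.10, n_sim=2000, alpha=0.05, seed=0):
        rng = np.random.default_rng(seed)
        t = np.arange(years)
        mean = (1 - decline) ** t                        # true declining mean
        hits = 0
        for _ in range(n_sim):
            obs = mean * rng.lognormal(0.0, cv, years)   # multiplicative noise
            res = linregress(t, np.log(obs))
            hits += (res.pvalue < alpha) and (res.slope < 0)
        return hits / n_sim

    print(trend_power(cv=0.2), trend_power(cv=0.8))      # low vs. high variability
    ```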

  15. Vehicle track segmentation using higher order random fields

    DOE PAGES

    Quach, Tu -Thach

    2017-01-09

    Here, we present an approach to segment vehicle tracks in coherent change detection images, a product of combining two synthetic aperture radar images taken at different times. The approach uses multiscale higher order random field models to capture track statistics, such as curvatures and their parallel nature, that are not currently utilized in existing methods. These statistics are encoded as 3-by-3 patterns at different scales. The model can complete disconnected tracks often caused by sensor noise and various environmental effects. Coupling the model with a simple classifier, our approach is effective at segmenting salient tracks. We improve the F-measure on a standard vehicle track data set to 0.963, up from 0.897 obtained by the current state-of-the-art method.

  16. Vehicle track segmentation using higher order random fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quach, Tu -Thach

    Here, we present an approach to segment vehicle tracks in coherent change detection images, a product of combining two synthetic aperture radar images taken at different times. The approach uses multiscale higher order random field models to capture track statistics, such as curvatures and their parallel nature, that are not currently utilized in existing methods. These statistics are encoded as 3-by-3 patterns at different scales. The model can complete disconnected tracks often caused by sensor noise and various environmental effects. Coupling the model with a simple classifier, our approach is effective at segmenting salient tracks. We improve the F-measure on a standard vehicle track data set to 0.963, up from 0.897 obtained by the current state-of-the-art method.

  17. Mathematical Model of Cardiovascular and Metabolic Responses to Umbilical Cord Occlusions in Fetal Sheep.

    PubMed

    Wang, Qiming; Gold, Nathan; Frasch, Martin G; Huang, Huaxiong; Thiriet, Marc; Wang, Xiaogang

    2015-12-01

    Fetal acidemia during labor is associated with an increased risk of brain injury and lasting neurological deficits. This is in part due to the repetitive occlusions of the umbilical cord (UCO) induced by uterine contractions. Whereas fetal heart rate (FHR) monitoring is widely used clinically, it fails to detect fetal acidemia. Hence, new approaches are needed for early detection of fetal acidemia during labor. We built a mathematical model of the UCO effects on FHR, mean arterial blood pressure (MABP), oxygenation and metabolism. Mimicking fetal experiments, our in silico model reproduces salient features of experimentally observed fetal cardiovascular and metabolic behavior including FHR overshoot, gradual MABP decrease and mixed metabolic and respiratory acidemia during UCO. Combined with statistical analysis, our model provides valuable insight into the labor-like fetal distress and guidance for refining FHR monitoring algorithms to improve detection of fetal acidemia and cardiovascular decompensation.

  18. Hybrid Model-Based and Data-Driven Fault Detection and Diagnostics for Commercial Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frank, Stephen; Heaney, Michael; Jin, Xin

    Commercial buildings often experience faults that produce undesirable behavior in building systems. Building faults waste energy, decrease occupants' comfort, and increase operating costs. Automated fault detection and diagnosis (FDD) tools for buildings help building owners discover and identify the root causes of faults in building systems, equipment, and controls. Proper implementation of FDD has the potential to simultaneously improve comfort, reduce energy use, and narrow the gap between actual and optimal building performance. However, conventional rule-based FDD requires expensive instrumentation and valuable engineering labor, which limit deployment opportunities. This paper presents a hybrid, automated FDD approach that combines building energy models and statistical learning tools to detect and diagnose faults noninvasively, using minimal sensors, with little customization. We compare and contrast the performance of several hybrid FDD algorithms for a small security building. Our results indicate that the algorithms can detect and diagnose several common faults, but more work is required to reduce false positive rates and improve diagnosis accuracy.

  19. Hybrid Model-Based and Data-Driven Fault Detection and Diagnostics for Commercial Buildings: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frank, Stephen; Heaney, Michael; Jin, Xin

    Commercial buildings often experience faults that produce undesirable behavior in building systems. Building faults waste energy, decrease occupants' comfort, and increase operating costs. Automated fault detection and diagnosis (FDD) tools for buildings help building owners discover and identify the root causes of faults in building systems, equipment, and controls. Proper implementation of FDD has the potential to simultaneously improve comfort, reduce energy use, and narrow the gap between actual and optimal building performance. However, conventional rule-based FDD requires expensive instrumentation and valuable engineering labor, which limit deployment opportunities. This paper presents a hybrid, automated FDD approach that combines building energy models and statistical learning tools to detect and diagnose faults noninvasively, using minimal sensors, with little customization. We compare and contrast the performance of several hybrid FDD algorithms for a small security building. Our results indicate that the algorithms can detect and diagnose several common faults, but more work is required to reduce false positive rates and improve diagnosis accuracy.
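
    One simple reading of the hybrid idea, sketched with invented numbers: a building energy model supplies the expected load, and a statistical test flags residual outliers as candidate faults. The threshold rule and the fault-free baseline window are our simplifying assumptions, not the paper's algorithm.

    ```python
    import numpy as np

    def detect_faults(measured, predicted, window=24, z_thresh=3.0):
        """Flag hours whose model residual is a statistical outlier."""
        residual = np.asarray(measured) - np.asarray(predicted)
        baseline = residual[:window]                   # assume a fault-free window
        mu, sd = baseline.mean(), baseline.std(ddof=1)
        z = (residual - mu) / sd
        return np.flatnonzero(np.abs(z) > z_thresh)    # indices of suspect hours

    rng = np.random.default_rng(0)
    pred = 50 + 10 * np.sin(np.arange(72) * 2 * np.pi / 24)   # model prediction (kW)
    meas = pred + rng.normal(0, 1, 72)
    meas[60:] += 8                                     # simulated stuck-damper fault
    print(detect_faults(meas, pred))                   # flags hours 60 onward
    ```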

  20. Hazard avoidance via descent images for safe landing

    NASA Astrophysics Data System (ADS)

    Yan, Ruicheng; Cao, Zhiguo; Zhu, Lei; Fang, Zhiwen

    2013-10-01

    In planetary or lunar landing missions, hazard avoidance is critical for landing safety. Therefore, it is very important to correctly detect hazards and effectively find a safe landing area during the last stage of descent. In this paper, we propose a passive-sensing-based HDA (hazard detection and avoidance) approach via descent images to lower the landing risk. In the hazard detection stage, a statistical probability model based on hazard similarity is adopted to evaluate the image and detect hazardous areas, so that a binary hazard image can be generated. Afterwards, a safety coefficient, which jointly utilizes the proportion of hazards in the local region and their internal distribution, is proposed to find potential regions with fewer hazards in the binary hazard image. By using the safety coefficient in a coarse-to-fine procedure and combining it with the local ISD (intensity standard deviation) measure, the safe landing area is determined. The algorithm is evaluated and verified with many simulated descent downward-looking images rendered from lunar orbital satellite images.
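
    A toy version of the region-scoring step, in which the safety coefficient, the ISD weighting, and the coarse-to-fine search are simplified away to a plain hazard-proportion scan over windows; the window size and test image are invented.

    ```python
    import numpy as np

    def safest_region(hazard, win=16):
        """Return (row, col) and score of the window with the lowest hazard proportion."""
        h, w = hazard.shape
        best, best_rc = np.inf, (0, 0)
        for r in range(0, h - win + 1, win // 2):       # half-window stride
            for c in range(0, w - win + 1, win // 2):
                frac = hazard[r:r + win, c:c + win].mean()
                if frac < best:
                    best, best_rc = frac, (r, c)
        return best_rc, best

    rng = np.random.default_rng(2)
    hazard_img = (rng.random((64, 64)) < 0.2).astype(float)   # binary hazard image
    hazard_img[40:56, 8:24] = 0.0                             # a clear patch
    print(safest_region(hazard_img))
    ```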

  1. Identifying WIMP dark matter from particle and astroparticle data

    NASA Astrophysics Data System (ADS)

    Bertone, Gianfranco; Bozorgnia, Nassim; Kim, Jong Soo; Liem, Sebastian; McCabe, Christopher; Otten, Sydney; Ruiz de Austri, Roberto

    2018-03-01

    One of the most promising strategies to identify the nature of dark matter consists in the search for new particles at accelerators and with so-called direct detection experiments. Working within the framework of simplified models, and making use of machine learning tools to speed up statistical inference, we address the question of what we can learn about dark matter from a detection at the LHC and a forthcoming direct detection experiment. We show that with a combination of accelerator and direct detection data, it is possible to identify newly discovered particles as dark matter, by reconstructing their relic density assuming they are weakly interacting massive particles (WIMPs) thermally produced in the early Universe, and demonstrating that it is consistent with the measured dark matter abundance. An inconsistency between these two quantities would instead point either towards additional physics in the dark sector, or towards a non-standard cosmology, with a thermal history substantially different from that of the standard cosmological model.

  2. Strategies for Testing Statistical and Practical Significance in Detecting DIF with Logistic Regression Models

    ERIC Educational Resources Information Center

    Fidalgo, Angel M.; Alavi, Seyed Mohammad; Amirian, Seyed Mohammad Reza

    2014-01-01

    This study examines three controversial aspects in differential item functioning (DIF) detection by logistic regression (LR) models: first, the relative effectiveness of different analytical strategies for detecting DIF; second, the suitability of the Wald statistic for determining the statistical significance of the parameters of interest; and…
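
    The logistic-regression DIF procedure the abstract examines can be illustrated with a short simulation: the item response is modeled on ability plus a group term, and the Wald statistic for the group coefficient tests for uniform DIF. This is the textbook LR DIF setup on simulated data, not the study's analyses.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 1000
    theta = rng.normal(size=n)              # ability (e.g., rest-score proxy)
    group = rng.integers(0, 2, size=n)      # 0 = reference, 1 = focal group
    # simulate uniform DIF: the item is harder for the focal group
    p = 1 / (1 + np.exp(-(0.8 * theta - 0.5 * group)))
    y = rng.binomial(1, p)

    X = sm.add_constant(np.column_stack([theta, group]))
    fit = sm.Logit(y, X).fit(disp=0)
    wald_z = fit.params[2] / fit.bse[2]     # Wald test of the group (DIF) term
    print(f"group coefficient = {fit.params[2]:.3f}, Wald z = {wald_z:.2f}")
    ```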

  3. Lower reference limits of quantitative cord glucose-6-phosphate dehydrogenase estimated from healthy term neonates according to the clinical and laboratory standards institute guidelines: a cross sectional retrospective study

    PubMed Central

    2013-01-01

    Background Previous studies have reported the lower reference limit (LRL) of quantitative cord glucose-6-phosphate dehydrogenase (G6PD), but they have not used approved international statistical methodology. Using common standards is expected to yield more valid findings. Therefore, we aimed to estimate the LRL of quantitative G6PD detection in healthy term neonates by using statistical analyses endorsed by the International Federation of Clinical Chemistry (IFCC) and the Clinical and Laboratory Standards Institute (CLSI) for reference interval estimation. Methods This cross-sectional retrospective study was performed at King Abdulaziz Hospital, Saudi Arabia, between March 2010 and June 2012. The study monitored consecutive neonates born to mothers from one Arab Muslim tribe that was assumed to have a low prevalence of G6PD deficiency. Neonates that satisfied the following criteria were included: full-term birth (37 weeks); no admission to the special care nursery; no phototherapy treatment; negative direct antiglobulin test; and fathers of female neonates were from the same mothers’ tribe. The G6PD activity (Units/gram Hemoglobin) was measured spectrophotometrically by an automated kit. This study used statistical analyses endorsed by IFCC and CLSI for reference interval estimation. The 2.5th percentiles and the corresponding 95% confidence intervals (CI) were estimated as LRLs, both in the presence and absence of outliers. Results 207 male and 188 female term neonates who had cord blood quantitative G6PD testing met the inclusion criteria. Horn's method detected 20 G6PD values as outliers (8 males and 12 females). The quantitative cord G6PD values were normally distributed only in the absence of the outliers. The Harris-Boyd method and proportion criteria revealed that combined-gender LRLs were reliable. The combined bootstrap LRL in the presence of the outliers was 10.0 (95% CI: 7.5-10.7) and the combined parametric LRL in the absence of the outliers was 11.0 (95% CI: 10.5-11.3). Conclusion These results contribute to the LRL of quantitative cord G6PD detection in full-term neonates. They are transferable to another laboratory when pre-analytical factors and testing methods are comparable and the IFCC-CLSI requirements of transference are satisfied. We suggest using the LRL estimated in the absence of the outliers, as mislabeling G6PD-deficient neonates as normal is intolerable, whereas mislabeling G6PD-normal neonates as deficient is tolerable. PMID:24016342
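
    The core of the CLSI-style nonparametric estimate, the 2.5th percentile with a bootstrap confidence interval, can be sketched as follows; the simulated activities are purely illustrative and are not the study's data.

    ```python
    import numpy as np

    def lower_reference_limit(values, q=2.5, n_boot=2000, seed=1):
        """Nonparametric 2.5th-percentile LRL with a bootstrap 95% CI."""
        rng = np.random.default_rng(seed)
        values = np.asarray(values, dtype=float)
        lrl = np.percentile(values, q)
        boots = [np.percentile(rng.choice(values, size=values.size), q)
                 for _ in range(n_boot)]
        lo, hi = np.percentile(boots, [2.5, 97.5])
        return lrl, (lo, hi)

    # illustrative only: simulated G6PD activities (U/g Hb) for 395 neonates
    g6pd = np.random.default_rng(2).normal(loc=15.0, scale=2.0, size=395)
    print(lower_reference_limit(g6pd))
    ```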

  4. A resilient and efficient CFD framework: Statistical learning tools for multi-fidelity and heterogeneous information fusion

    NASA Astrophysics Data System (ADS)

    Lee, Seungjoon; Kevrekidis, Ioannis G.; Karniadakis, George Em

    2017-09-01

    Exascale-level simulations require fault-resilient algorithms that are robust against repeated and expected software and/or hardware failures during computations, which may render the simulation results unsatisfactory. If each processor can share some global information about the simulation from a coarse, limited-accuracy but relatively costless auxiliary simulator, we can effectively fill in the missing spatial data at the required times by a statistical learning technique - multi-level Gaussian process regression - on the fly; this has been demonstrated in previous work [1]. Building on that work, we also employ another (nonlinear) statistical learning technique, Diffusion Maps, which detects computational redundancy in time and hence accelerates the simulation by projective time integration, giving the overall computation a "patch dynamics" flavor. Furthermore, we are now able to perform information fusion with multi-fidelity and heterogeneous data (including stochastic data). Finally, we set the foundations of a new framework in CFD, called patch simulation, that combines information fusion techniques from, in principle, multiple fidelity and resolution simulations (and even experiments) with a new adaptive time-step refinement technique. We present two benchmark problems (the heat equation and the Navier-Stokes equations) to demonstrate the new capability that statistical learning tools can bring to traditional scientific computing algorithms. For each problem, we rely on heterogeneous and multi-fidelity data, either from a coarse simulation of the same equation or from a stochastic, particle-based, more "microscopic" simulation. We consider, as such "auxiliary" models, a Monte Carlo random walk for the heat equation and a dissipative particle dynamics (DPD) model for the Navier-Stokes equations. More broadly, in this paper we demonstrate the symbiotic and synergistic combination of statistical learning, domain decomposition, and scientific computing in exascale simulations.

  5. Accuracy of combined dynamic contrast-enhanced magnetic resonance imaging and diffusion-weighted imaging for breast cancer detection: a meta-analysis.

    PubMed

    Zhang, Li; Tang, Min; Min, Zhiqian; Lu, Jun; Lei, Xiaoyan; Zhang, Xiaoling

    2016-06-01

    Magnetic resonance imaging (MRI) is increasingly being used to examine patients with suspected breast cancer. The aim was to determine the diagnostic performance of combined dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) and diffusion-weighted imaging (DWI) for breast cancer detection. A comprehensive search of the PUBMED, EMBASE, Web of Science, and Cochrane Library databases was performed up to September 2014. Statistical analysis included pooling of sensitivity and specificity, positive likelihood ratio (PLR), negative likelihood ratio (NLR), diagnostic odds ratio (DOR), and diagnostic accuracy using the summary receiver operating characteristic (SROC) curve. All analyses were conducted using the STATA (version 12.0), RevMan (version 5.2), and Meta-Disc 1.4 software programs. Fourteen studies were analyzed, which included a total of 1140 patients with 1276 breast lesions. The pooled sensitivity and specificity of combined DCE-MRI and DWI were 91.6% and 85.5%, respectively. The pooled sensitivity and specificity of DWI were 86.0% and 75.6%, respectively. The pooled sensitivity and specificity of DCE-MRI were 93.2% and 71.1%. The area under the SROC curve (AUC-SROC) was 0.94 for combined DCE-MRI and DWI and 0.85 for DCE-MRI alone. Deeks' test confirmed no significant publication bias across the studies. Combined DCE-MRI and DWI had higher diagnostic accuracy than either DCE-MRI or DWI alone for the diagnosis of breast cancer. © The Foundation Acta Radiologica 2015.
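
    As a simplified illustration of how such pooled indices relate to each other, the sketch below pools 2x2 counts directly; a published meta-analysis would typically use study-level weighting or a bivariate model, and the counts here are invented for illustration.

    ```python
    import numpy as np

    # per-study 2x2 counts (TP, FN, TN, FP); invented numbers, not the meta-analysis data
    studies = np.array([
        (45, 4, 30, 6),
        (80, 7, 55, 10),
        (60, 6, 40, 8),
    ])
    tp, fn, tn, fp = studies.sum(axis=0)
    sens = tp / (tp + fn)                  # pooled sensitivity (naive pooling)
    spec = tn / (tn + fp)                  # pooled specificity
    plr = sens / (1 - spec)                # positive likelihood ratio
    nlr = (1 - sens) / spec                # negative likelihood ratio
    dor = plr / nlr                        # diagnostic odds ratio
    print(f"sens={sens:.3f} spec={spec:.3f} PLR={plr:.2f} NLR={nlr:.2f} DOR={dor:.1f}")
    ```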

  6. Hepatic MR imaging for in vivo differentiation of steatosis, iron deposition and combined storage disorder: single-ratio in/opposed phase analysis vs. dual-ratio Dixon discrimination.

    PubMed

    Bashir, Mustafa R; Merkle, Elmar M; Smith, Alastair D; Boll, Daniel T

    2012-02-01

    To assess whether in vivo dual-ratio Dixon discrimination can improve detection of diffuse liver disease, specifically steatosis, iron deposition and combined disease, over traditional single-ratio in/opposed-phase analysis. Seventy-one patients with biopsy-proven (17.7 ± 17.0 days) hepatic steatosis (n = 16), iron deposition (n = 11), combined deposition (n = 3) or neither disease (n = 41) underwent MR examinations. Dual-echo in/opposed-phase MR sequences with Dixon water/fat reconstructions were acquired. Analysis consisted of: (a) single-ratio hepatic region-of-interest (ROI)-based assessment of in/opposed ratios; (b) dual-ratio hepatic ROI assessment of in/opposed and fat/water ratios; (c) computer-aided dual-ratio assessment evaluating all hepatic voxels. Disease-specific thresholds were determined; statistical analyses assessed disease-dependent voxel ratios based on the single-ratio (a) and dual-ratio (b and c) techniques. Single-ratio discrimination succeeded in distinguishing iron deposition (I/O iron threshold < 0.88) and steatosis (I/O fat threshold > 1.15) from normal parenchyma, with sensitivity 70.0%; it failed to detect combined disease. Dual-ratio discrimination succeeded in identifying abnormal hepatic parenchyma (F/W normal threshold > 0.05), with sensitivity 96.7%; a logarithmic discriminator for iron deposition (I/O iron discriminator = e^((F/W fat - 0.01)/0.48)) differentiated combined from isolated diseases, with sensitivity 100.0%; computer-aided dual-ratio analysis was comparably sensitive but less specific, 90.2% vs. 97.6%. MR two-point Dixon imaging using dual-ratio post-processing based on in/opposed and fat/water ratios improved in vivo detection of hepatic steatosis, iron deposition, and combined storage disease beyond traditional in/opposed analysis. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  7. Detection of questionable occlusal carious lesions using an electrical bioimpedance method with fractional electrical model

    NASA Astrophysics Data System (ADS)

    Morais, A. P.; Pino, A. V.; Souza, M. N.

    2016-08-01

    This in vitro study evaluated the diagnostic performance of an alternative electric bioimpedance spectroscopy technique (BIS-STEP) to detect questionable occlusal carious lesions. Six specialists carried out visual (V), radiographic (R), and combined (VR) exams of 57 teeth that were sound or had non-cavitated occlusal carious lesions, classifying the occlusal surfaces as sound (H), enamel caries (EC), or dentinal caries (DC). Measurements were based on the current response to a step voltage excitation (BIS-STEP). A fractional electrical model was used to predict the current response in the time domain and to estimate the model parameters: Rs and Rp (resistive parameters), and C and α (fractional parameters). Histological analysis showed a caries prevalence of 33.3%, of which 15.8% were hidden caries. The combined examination obtained the best traditional diagnostic results, with specificity = 59.0%, sensitivity = 70.9%, and accuracy = 60.8%. There were statistically significant differences in bioimpedance parameters between the H and EC groups (p = 0.016) and between the H and DC groups (Rs, p = 0.006; Rp, p = 0.022; and α, p = 0.041). Using a suitable threshold for Rs, we obtained specificity = 60.7%, sensitivity = 77.9%, accuracy = 73.2%, and 100% detection of deep lesions. It can be concluded that the BIS-STEP method could be an important tool to improve the detection and management of occlusal non-cavitated primary caries and pigmented sites.

  8. A flexible spatial scan statistic with a restricted likelihood ratio for detecting disease clusters.

    PubMed

    Tango, Toshiro; Takahashi, Kunihiko

    2012-12-30

    Spatial scan statistics are widely used tools for detection of disease clusters. In particular, the circular spatial scan statistic proposed by Kulldorff (1997) has been utilized in a wide variety of epidemiological studies and disease surveillance. However, as it cannot detect noncircular, irregularly shaped clusters, many authors have proposed different spatial scan statistics, including the elliptic version of Kulldorff's scan statistic. The flexible spatial scan statistic proposed by Tango and Takahashi (2005) has also been used for detecting irregularly shaped clusters. However, this method imposes a practical limit of a maximum of 30 nearest neighbors for searching candidate clusters because of the heavy computational load. In this paper, we present a flexible spatial scan statistic implemented with the restricted likelihood ratio proposed by Tango (2008), which (1) eliminates the limitation of 30 nearest neighbors and (2) requires far less computational time than the original flexible spatial scan statistic. As a side effect, Monte Carlo simulation shows that it can also detect clusters of any shape reasonably well as the relative risk of the cluster becomes large. We illustrate the proposed spatial scan statistic with data on mortality from cerebrovascular disease in the Tokyo Metropolitan area, Japan. Copyright © 2012 John Wiley & Sons, Ltd.
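
    The likelihood ratio at the heart of such Poisson scan statistics is compact enough to write out. The sketch below is the standard Kulldorff-style log-likelihood ratio for a single candidate cluster; the restricted version adds a screening condition on the individual regions, which is omitted here.

    ```python
    import numpy as np

    def poisson_scan_llr(c, e_c, C):
        """Kulldorff-style Poisson log-likelihood ratio for one candidate cluster.

        c: observed cases in the cluster; e_c: expected cases in the cluster
        under the null; C: total observed cases in the whole study region.
        """
        if c <= e_c:
            return 0.0  # only elevated-risk clusters are scored
        return c * np.log(c / e_c) + (C - c) * np.log((C - c) / (C - e_c))

    # toy example: 30 cases observed where 12 were expected, out of 200 total
    print(poisson_scan_llr(30, 12.0, 200))
    ```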

  9. Comparison Between Various Color Spectra and Conventional Grayscale Imaging for Detection of Parenchymal Liver Lesions With B-Mode Sonography.

    PubMed

    Merkel, Daniel; Brinkmann, Eckard; Kämmer, Joerg C; Köhler, Miriam; Wiens, Daniel; Derwahl, Karl-Michael

    2015-09-01

    The electronic colorization of grayscale B-mode sonograms using various color schemes aims to enhance the adaptability and practicability of B-mode sonography in daylight conditions. The purpose of this study was to determine the diagnostic effectiveness and importance of colorized B-mode sonography. Fifty-three video sequences of sonographic examinations of the liver were digitized and subsequently colorized in 2 different color combinations (yellow-brown and blue-white). The set of 53 images consisted of 33 with isoechoic masses, 8 with obvious lesions of the liver (hypoechoic or hyperechoic), and 12 with inconspicuous reference images of the liver. The video sequences were combined in a random order and edited into half-hour video clips. Isoechoic liver lesions were successfully detected in 58% of the yellow-brown video sequences and in 57% of the grayscale video sequences (P = .74, not significant). Fifty percent of the isoechoic liver lesions were successfully detected in the blue-white video sequences, as opposed to a 55% detection rate in the corresponding grayscale video sequences (P = .11, not significant). In 2 subgroups, significantly more liver lesions were detected with grayscale sonography compared to blue-white sonography. Yellow-brown-colorized B-mode sonography appears to be as effective as traditional grayscale sonography for the detection of isoechoic parenchymal liver lesions. Blue-white colorization in B-mode sonography is probably not as effective as grayscale sonography, although a statistically significant disadvantage was shown only in the subgroup of hyperechoic liver lesions. © 2015 by the American Institute of Ultrasound in Medicine.

  10. Development of a novel diagnostic algorithm to predict NASH in HCV-positive patients.

    PubMed

    Gallotta, Andrea; Paneghetti, Laura; Mrázová, Viera; Bednárová, Adriana; Kružlicová, Dáša; Frecer, Vladimir; Miertus, Stanislav; Biasiolo, Alessandra; Martini, Andrea; Pontisso, Patrizia; Fassina, Giorgio

    2018-05-01

    Non-alcoholic steato-hepatitis (NASH) is a severe disease characterised by liver inflammation and progressive hepatic fibrosis, which may progress to cirrhosis and hepatocellular carcinoma. Clinical evidence suggests that, in hepatitis C virus patients, steatosis and NASH are associated with faster fibrosis progression and hepatocellular carcinoma. A safe and reliable non-invasive diagnostic method to detect NASH at its early stages is still needed to prevent progression of the disease. We prospectively enrolled 91 hepatitis C virus-positive patients with histologically proven chronic liver disease; 77 patients were included in our study, of whom 10 had NASH. For each patient, various clinical and serological variables were collected. Different algorithms combining squamous cell carcinoma antigen-immunoglobulin-M (SCCA-IgM) levels with other common clinical data were created to provide the probability of having NASH. Our analysis revealed a statistically significant correlation between the histological presence of NASH and SCCA-IgM, insulin, homeostasis model assessment, haemoglobin, high-density lipoprotein and ferritin levels, and smoking. Compared to the use of a single marker, algorithms that combined four, six or seven variables identified NASH with higher accuracy. The best diagnostic performance was obtained with the logistic regression combination, which included all seven variables correlated with NASH. The combination of SCCA-IgM with common clinical data shows promising diagnostic performance for the detection of NASH in hepatitis C virus patients.

  11. A systematic Chandra study of Sgr A⋆ - I. X-ray flare detection

    NASA Astrophysics Data System (ADS)

    Yuan, Qiang; Wang, Q. Daniel

    2016-02-01

    Daily X-ray flaring represents an enigmatic phenomenon of Sagittarius A⋆ (Sgr A⋆) - the supermassive black hole at the centre of our Galaxy. We report initial results from a systematic X-ray study of this phenomenon, based on extensive Chandra observations obtained from 1999 to 2012, totalling about 4.5 Ms. We detect flares using a combination of the maximum likelihood and Markov Chain Monte Carlo methods, which allow for a direct accounting for the pileup effect in the modelling of the flare light curves and an optimal use of the data, as well as the measurements of flare parameters, including their uncertainties. A total of 82 flares are detected. About one-third of them are relatively faint and were not detected previously. The observation-to-observation variation of the quiescent emission has an average root-mean-square of 6-14 per cent, including the Poisson statistical fluctuation of faint flares below our detection limits. We find no significant long-term variation in the quiescent emission and the flare rate over the 14 years. In particular, we see no evidence of changing quiescent emission and flare rate around the pericentre passage of the S2 star around 2002. We show clear evidence of a short-term clustering for the Advanced CCD Imaging Spectrometer - Spectroscopy array/high energy transmission gratings 0th-order flares on time-scales of 20-70 ks. We further conduct detailed simulations to characterize the detection incompleteness and bias, which is critical to a comprehensive follow-up statistical analysis of flare properties. These studies together will help to establish Sgr A⋆ as a unique laboratory to understand the astrophysics of prevailing low-luminosity black holes in the Universe.

  12. Regularization design for high-quality cone-beam CT of intracranial hemorrhage using statistical reconstruction

    NASA Astrophysics Data System (ADS)

    Dang, H.; Stayman, J. W.; Xu, J.; Sisniega, A.; Zbijewski, W.; Wang, X.; Foos, D. H.; Aygun, N.; Koliatsos, V. E.; Siewerdsen, J. H.

    2016-03-01

    Intracranial hemorrhage (ICH) is associated with pathologies such as hemorrhagic stroke and traumatic brain injury. Multi-detector CT is the current front-line imaging modality for detecting ICH (fresh blood contrast 40-80 HU, down to 1 mm). Flat-panel detector (FPD) cone-beam CT (CBCT) offers a potential alternative with a smaller scanner footprint, greater portability, and lower cost potentially well suited to deployment at the point of care outside standard diagnostic radiology and emergency room settings. Previous studies have suggested reliable detection of ICH down to 3 mm in CBCT using high-fidelity artifact correction and penalized weighted least-squares (PWLS) image reconstruction with a post-artifact-correction noise model. However, ICH reconstructed by traditional image regularization exhibits nonuniform spatial resolution and noise due to interaction between the statistical weights and regularization, which potentially degrades the detectability of ICH. In this work, we propose three regularization methods designed to overcome these challenges. The first two compute spatially varying certainty for uniform spatial resolution and noise, respectively. The third computes spatially varying regularization strength to achieve uniform "detectability," combining both spatial resolution and noise in a manner analogous to a delta-function detection task. Experiments were conducted on a CBCT test-bench, and image quality was evaluated for simulated ICH in different regions of an anthropomorphic head. The first two methods improved the uniformity in spatial resolution and noise compared to traditional regularization. The third exhibited the highest uniformity in detectability among all methods and best overall image quality. The proposed regularization provides a valuable means to achieve uniform image quality in CBCT of ICH and is being incorporated in a CBCT prototype for ICH imaging.

  13. Detecting measurement outliers: remeasure efficiently

    NASA Astrophysics Data System (ADS)

    Ullrich, Albrecht

    2010-09-01

    Shrinking structures, advanced optical proximity correction (OPC) and complex measurement strategies continually challenge critical dimension (CD) metrology tools and recipe creation processes. One important quality-ensuring task is the control of measurement outlier behavior. Outliers can trigger false-positive alarms for specification violations, impacting cycle time or potentially yield. A constantly high level of outliers not only deteriorates cycle time but also puts unnecessary stress on tool operators, eventually leading to human error. At tool level, the sources of outliers are natural variations (e.g. beam current etc.), drifts, contrast conditions, focus determination or pattern recognition issues, etc. Some of these can result from suboptimal or even wrong recipe settings, like focus position or measurement box size. Such outliers, created by an automatic recipe creation process faced with more complicated structures, manifest themselves as systematic variation of measurements rather than as 'pure' tool variation. I analyzed several statistical methods to detect outliers. These range from classical outlier tests for extrema and robust metrics like the interquartile range (IQR) to methods evaluating the distribution of different populations of measurement sites, like the Cochran test. The latter especially suits the detection of systematic effects. The next level of outlier detection combines additional information about the mask and the manufacturing process with the measurement results. The methods were reviewed for measured variations assumed to be normally distributed with zero mean, but also for the presence of a statistically significant spatial process signature. I arrive at the conclusion that intelligent outlier detection can greatly improve the efficiency and cycle time of CD metrology. In combination with process information like target, typical platform variation and signature, one can tailor the detection to the needs of the photomask at hand. By monitoring the outlier behavior carefully, weaknesses of the automatic recipe creation process can be spotted.
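
    Of the methods mentioned, the IQR rule is the simplest to state in code; a minimal version for a batch of CD measurements follows, with the conventional fence factor k = 1.5 as an assumption.

    ```python
    import numpy as np

    def iqr_outliers(cd_measurements, k=1.5):
        """Flag measurements outside [Q1 - k*IQR, Q3 + k*IQR]."""
        x = np.asarray(cd_measurements, dtype=float)
        q1, q3 = np.percentile(x, [25, 75])
        iqr = q3 - q1
        return (x < q1 - k * iqr) | (x > q3 + k * iqr)

    cds = [64.9, 65.1, 65.0, 64.8, 65.2, 68.4, 65.0]  # nm; 68.4 is an outlier
    print(iqr_outliers(cds))
    ```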

  14. Fragment size distribution statistics in dynamic fragmentation of laser shock-loaded tin

    NASA Astrophysics Data System (ADS)

    He, Weihua; Xin, Jianting; Zhao, Yongqiang; Chu, Genbai; Xi, Tao; Shui, Min; Lu, Feng; Gu, Yuqiu

    2017-06-01

    This work investigates a geometric statistics method to characterize the size distribution of tin fragments produced in the laser shock-loaded dynamic fragmentation process. In the shock experiments, the ejecta of the tin sample, which has a V-shaped groove etched in its free surface, are collected by a soft-recovery technique. The produced fragments are then automatically detected with fine post-shot analysis techniques, including X-ray micro-tomography and an improved watershed method. To characterize the size distributions of the fragments, a theoretical random geometric statistics model based on Poisson mixtures is derived for the dynamic heterogeneous fragmentation problem, which yields a linear combination of exponential distributions. The experimental data on fragment size distributions of the laser shock-loaded tin sample are examined with the proposed theoretical model, and its fitting performance is compared with that of other state-of-the-art fragment size distribution models. The comparison proves that our proposed model provides a far more reasonable fit for the laser shock-loaded tin.

  15. Statistical analysis of water-quality data containing multiple detection limits: S-language software for regression on order statistics

    USGS Publications Warehouse

    Lee, L.; Helsel, D.

    2005-01-01

    Trace contaminants in water, including metals and organics, often are measured at sufficiently low concentrations to be reported only as values below the instrument detection limit. Interpretation of these "less thans" is complicated when multiple detection limits occur. Statistical methods for multiply censored, or multiple-detection limit, datasets have been developed for medical and industrial statistics, and can be employed to estimate summary statistics or model the distributions of trace-level environmental data. We describe S-language-based software tools that perform robust linear regression on order statistics (ROS). The ROS method has been evaluated as one of the most reliable procedures for developing summary statistics of multiply censored data. It is applicable to any dataset that has 0 to 80% of its values censored. These tools are a part of a software library, or add-on package, for the R environment for statistical computing. This library can be used to generate ROS models and associated summary statistics, plot modeled distributions, and predict exceedance probabilities of water-quality standards. © 2005 Elsevier Ltd. All rights reserved.
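
    The essence of ROS can be condensed to the single-detection-limit case: regress the logs of the detected values against normal quantiles of their plotting positions, impute the censored observations from the fitted line, and summarize the combined data. The sketch below is that simplified core (the described tools handle multiple limits and other details), with Blom plotting positions as an assumption.

    ```python
    import numpy as np
    from scipy import stats

    def ros_summary(detects, n_censored, det_limit):
        """Simplified robust ROS for one detection limit."""
        assert min(detects) >= det_limit
        n = len(detects) + n_censored
        order = np.sort(np.asarray(detects, dtype=float))
        # plotting positions: censored values occupy the lowest ranks
        pp_det = (np.arange(n_censored + 1, n + 1) - 0.375) / (n + 0.25)
        slope, intercept, *_ = stats.linregress(stats.norm.ppf(pp_det),
                                                np.log(order))
        pp_cen = (np.arange(1, n_censored + 1) - 0.375) / (n + 0.25)
        imputed = np.exp(intercept + slope * stats.norm.ppf(pp_cen))
        full = np.concatenate([imputed, order])
        return full.mean(), full.std(ddof=1)

    print(ros_summary([0.8, 1.2, 2.0, 3.5, 6.1], n_censored=4, det_limit=0.5))
    ```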

  16. Quantile regression for the statistical analysis of immunological data with many non-detects.

    PubMed

    Eilers, Paul H C; Röder, Esther; Savelkoul, Huub F J; van Wijk, Roy Gerth

    2012-07-07

    Immunological parameters are hard to measure. A well-known problem is the occurrence of values below the detection limit, the non-detects. Non-detects are a nuisance, because classical statistical analyses, like ANOVA and regression, cannot be applied. The more advanced statistical techniques currently available for the analysis of datasets with non-detects can only be used if a small percentage of the data are non-detects. Quantile regression, a generalization of percentiles to regression models, models the median or higher percentiles and tolerates very high numbers of non-detects. We present a non-technical introduction and illustrate it with an implementation to real data from a clinical trial. We show that by using quantile regression, groups can be compared and that meaningful linear trends can be computed, even if more than half of the data consists of non-detects. Quantile regression is a valuable addition to the statistical methods that can be used for the analysis of immunological datasets with non-detects.
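
    A median regression with a typical non-detect pattern can be run in a few lines with standard tooling; the data below are simulated for illustration, with non-detects set to the detection limit (their exact values never enter the median fit as long as well under half the data are censored).

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(3)
    n = 200
    dose = rng.uniform(0, 10, n)
    titer = np.exp(0.5 + 0.15 * dose + rng.normal(0, 0.8, n))
    lod = 2.0
    observed = np.maximum(titer, lod)       # non-detects reported at the LOD
    print(f"non-detects: {(titer < lod).sum()} of {n}")

    df = pd.DataFrame({"log_y": np.log(observed), "dose": dose})
    fit = smf.quantreg("log_y ~ dose", df).fit(q=0.5)  # median regression
    print(fit.params)
    ```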

  17. Hail statistics in Western Europe based on a hybrid cell-tracking algorithm combining radar signals with hailstone observations

    NASA Astrophysics Data System (ADS)

    Fluck, Elody

    2015-04-01

    With hail damage from a single event estimated at over a billion euros (e.g., hailstorm Andreas on 27/28 July 2013), hail constitutes one of the major atmospheric risks in various parts of Europe. The project HAMLET (Hail Model for Europe), in cooperation with the insurance company Tokio Millennium Re, aims at estimating hail probability, hail hazard and, combined with vulnerability, hail risk for several European countries (Germany, Switzerland, France, Netherlands, Austria, Belgium and Luxembourg). Hail signals are obtained from radar reflectivity, since this proxy is available with high temporal and spatial resolution. The focus in the first step is on Germany and France for the periods 2005-2013 and 1999-2013, respectively. In the next step, the methods will be transferred and extended to other regions. The cell-tracking algorithm TRACE2D was adjusted and applied to two-dimensional radar reflectivity data from different radars operated by European weather services such as the German weather service (DWD) and the French weather service (Météo-France). Strong convective cells are detected as groups of at least 3 connected pixels above 45 dBZ (reflectivity cores, RCs) in a radar scan; the algorithm then searches for the same RCs in the next 5-minute radar scan and thus tracks the RC centers over time and space. Additional information about hailstone diameters provided by the ESWD (European Severe Weather Database) is used to determine the hail intensity of the detected hail swaths. Maximum hailstone diameters are interpolated along and close to the individual hail tracks, giving an estimate of mean diameters for the detected hail swaths. Furthermore, a stochastic event set is created by randomizing the parameters obtained from the tracking of the historical event catalogue (length, width, orientation, diameter). This stochastic event set will be used to quantify hail risk and to estimate probable maximum loss (e.g., PML200) for a given industry motor or property (building) portfolio.
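
    The detection step described above (at least 3 connected pixels above 45 dBZ) maps directly onto a connected-component pass; a sketch under those stated thresholds follows, with the connectivity choice and the random test scan being assumptions.

    ```python
    import numpy as np
    from scipy import ndimage

    def find_reflectivity_cores(dbz, threshold=45.0, min_pixels=3):
        """Return centers of connected regions of >= min_pixels above threshold."""
        mask = dbz > threshold
        labels, n = ndimage.label(mask)         # 4-connected components
        sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
        keep = np.flatnonzero(sizes >= min_pixels) + 1
        return ndimage.center_of_mass(mask, labels, list(keep))

    scan = np.random.default_rng(4).uniform(0, 60, size=(100, 100))
    cores = find_reflectivity_cores(scan)
    print(f"{len(cores)} reflectivity cores detected")
    # tracking would then match these centers against the next 5-minute scan
    ```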

  18. The sequentially discounting autoregressive (SDAR) method for on-line automatic seismic event detecting on long term observation

    NASA Astrophysics Data System (ADS)

    Wang, L.; Toshioka, T.; Nakajima, T.; Narita, A.; Xue, Z.

    2017-12-01

    In recent years, more and more Carbon Capture and Storage (CCS) studies have focused on seismicity monitoring. For the safety management of geological CO2 storage at Tomakomai, Hokkaido, Japan, an Advanced Traffic Light System (ATLS) combining different seismic messages (magnitudes, phases, distributions, etc.) is proposed for injection control. The primary task for ATLS is seismic event detection in a long-term, sustained time-series record. The time-varying signal-to-noise ratio (SNR) of such a record and the uneven energy distribution of seismic event waveforms make automatic detection difficult, so in this work an improved probabilistic autoregressive (AR) method for automatic seismic event detection is applied. This algorithm, called sequentially discounting AR learning (SDAR), identifies effective seismic events in the time series through change point detection (CPD) on the seismic record. In this framing, an anomalous signal (a seismic event) is treated as a change point in the time series: the statistical model of the signal in the neighborhood of the event point changes when a seismic event occurs, so SDAR aims to find the statistical irregularities of the record through CPD. SDAR has three advantages. (1) Noise robustness: SDAR does not use waveform features (such as amplitude, energy, or polarization) for detection, so it is an appropriate technique for low-SNR data. (2) Real-time estimation: when new data appear in the record, the probability distribution models are automatically updated for on-line processing. (3) Discounting: SDAR introduces a discounting parameter to decrease the influence of past statistics on future data, which makes it a robust algorithm for non-stationary signal processing. With these three advantages, SDAR can handle non-stationary, time-varying long-term series and achieve real-time monitoring. Finally, we apply SDAR to a synthetic model and to Tomakomai Ocean Bottom Cable (OBC) baseline data to demonstrate the feasibility and advantages of our method.
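
    A bare-bones version of the SDAR idea, an AR(1) model whose sufficient statistics are updated with a discounting rate r and whose per-sample negative log-likelihood serves as the change score, can be sketched as follows; the discounting rate and initialization are assumptions, and the authors' implementation likely differs in detail.

    ```python
    import numpy as np

    def sdar_scores(x, r=0.02):
        """Per-sample change scores from a discounted AR(1) model."""
        mu, c0, c1, var = 0.0, 1.0, 0.0, 1.0
        scores = np.zeros(len(x))
        for t in range(1, len(x)):
            mu = (1 - r) * mu + r * x[t]                           # discounted mean
            c0 = (1 - r) * c0 + r * (x[t] - mu) ** 2               # lag-0 moment
            c1 = (1 - r) * c1 + r * (x[t] - mu) * (x[t - 1] - mu)  # lag-1 moment
            pred = mu + (c1 / c0) * (x[t - 1] - mu)                # AR(1) forecast
            var = (1 - r) * var + r * (x[t] - pred) ** 2           # residual var
            scores[t] = 0.5 * (np.log(2 * np.pi * var)
                               + (x[t] - pred) ** 2 / var)         # -log likelihood
        return scores

    rng = np.random.default_rng(5)
    sig = np.concatenate([rng.normal(0, 1, 500), rng.normal(0, 5, 100)])
    print(sdar_scores(sig)[495:505].round(2))   # scores jump after sample 500
    ```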

  19. Passive in-vehicle driver breath alcohol detection using advanced sensor signal acquisition and fusion.

    PubMed

    Ljungblad, Jonas; Hök, Bertil; Allalou, Amin; Pettersson, Håkan

    2017-05-29

    The research objective of the present investigation is to demonstrate the present status of passive in-vehicle driver breath alcohol detection and highlight the necessary conditions for large-scale implementation of such a system. Completely passive detection has remained a challenge mainly because of the requirements on signal resolution combined with the constraints of vehicle integration. The work is part of the Driver Alcohol Detection System for Safety (DADSS) program aiming at massive deployment of alcohol sensing systems that could potentially save thousands of American lives annually. The work reported here builds on earlier investigations, in which it has been shown that detection of alcohol vapor in the proximity of a human subject may be traced to that subject by means of simultaneous recording of carbon dioxide (CO2) at the same location. Sensors based on infrared spectroscopy were developed to detect and quantify low concentrations of alcohol and CO2. In the present investigation, alcohol and CO2 were recorded at various locations in a vehicle cabin while human subjects were performing normal in-step procedures and driving preparations. A video camera directed to the driver position was recording images of the driver's upper body parts, including the face, and the images were analyzed with respect to features of significance to the breathing behavior and breath detection, such as mouth opening and head direction. Improvement of the sensor system with respect to signal resolution including algorithm and software development, and fusion of the sensor and camera signals was successfully implemented and tested before starting the human study. In addition, experimental tests and simulations were performed with the purpose of connecting human subject data with repeatable experimental conditions. The results include occurrence statistics of detected breaths by signal peaks of CO2 and alcohol. From the statistical data, the accuracy of breath alcohol estimation and timing related to initial driver routines (door opening, taking a seat, door closure, buckling up, etc.) can be estimated. The investigation confirmed the feasibility of passive driver breath alcohol detection using our present system. Trade-offs between timing and sensor signal resolution requirements will become critical. Further improvement of sensor resolution and system ruggedness is required before the results can be industrialized. It is concluded that a further important step toward completely passive detection of driver breath alcohol has been taken. If required, the sniffer function with alcohol detection capability can be combined with a subsequent highly accurate breath test to confirm the driver's legal status using the same sensor device. The study is relevant to crash avoidance, in particular driver monitoring systems and driver-vehicle interface design.

  20. Statistics for characterizing data on the periphery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Theiler, James P; Hush, Donald R

    2010-01-01

    We introduce a class of statistics for characterizing the periphery of a distribution, and show that these statistics are particularly valuable for problems in target detection. Because so many detection algorithms are rooted in Gaussian statistics, we concentrate on ellipsoidal models of high-dimensional data distributions (that is to say: covariance matrices), but we recommend several alternatives to the sample covariance matrix that more efficiently model the periphery of a distribution, and can more effectively detect anomalous data samples.
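
    For context, the baseline that such peripheral statistics compete against is Mahalanobis-distance anomaly detection with the sample covariance; a compact version is below, with the threshold chosen arbitrarily for illustration. The paper's point is that alternative covariance estimators can replace `np.cov` in exactly this pipeline.

    ```python
    import numpy as np

    def mahalanobis_anomalies(X, threshold):
        """Flag rows of X whose squared Mahalanobis distance exceeds threshold."""
        mu = X.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
        d2 = np.einsum("ij,jk,ik->i", X - mu, cov_inv, X - mu)
        return d2 > threshold

    X = np.random.default_rng(7).normal(size=(1000, 5))
    print(mahalanobis_anomalies(X, threshold=20.0).sum())  # expect few flags
    ```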

  1. Optimal filtering and Bayesian detection for friction-based diagnostics in machines.

    PubMed

    Ray, L R; Townsend, J R; Ramasubramanian, A

    2001-01-01

    Non-model-based diagnostic methods typically rely on measured signals that must be empirically related to process behavior or incipient faults. The difficulty in interpreting a signal that is indirectly related to the fundamental process behavior is significant. This paper presents an integrated non-model and model-based approach to detecting when process behavior varies from a proposed model. The method, which is based on nonlinear filtering combined with maximum likelihood hypothesis testing, is applicable to dynamic systems whose constitutive model is well known, and whose process inputs are poorly known. Here, the method is applied to friction estimation and diagnosis during motion control in a rotating machine. A nonlinear observer estimates friction torque in a machine from shaft angular position measurements and the known input voltage to the motor. The resulting friction torque estimate can be analyzed directly for statistical abnormalities, or it can be directly compared to friction torque outputs of an applicable friction process model in order to diagnose faults or model variations. Nonlinear estimation of friction torque provides a variable on which to apply diagnostic methods that is directly related to model variations or faults. The method is evaluated experimentally by its ability to detect normal load variations in a closed-loop controlled motor driven inertia with bearing friction and an artificially-induced external line contact. Results show an ability to detect statistically significant changes in friction characteristics induced by normal load variations over a wide range of underlying friction behaviors.

  2. Classification of the medicinal plants of the genus Atractylodes using high-performance liquid chromatography with diode array and tandem mass spectrometry detection combined with multivariate statistical analysis.

    PubMed

    Cho, Hyun-Deok; Kim, Unyong; Suh, Joon Hyuk; Eom, Han Young; Kim, Junghyun; Lee, Seul Gi; Choi, Yong Seok; Han, Sang Beom

    2016-04-01

    Analytical methods using high-performance liquid chromatography with diode array and tandem mass spectrometry detection were developed for the discrimination of the rhizomes of four Atractylodes medicinal plants: A. japonica, A. macrocephala, A. chinensis, and A. lancea. A quantitative study was performed, selecting five bioactive components, including atractylenolide I, II, III, eudesma-4(14),7(11)-dien-8-one and atractylodin, on twenty-six Atractylodes samples of various origins. Sample extraction was optimized to sonication with 80% methanol for 40 min at room temperature. High-performance liquid chromatography with diode array detection was established using a C18 column with a water/acetonitrile gradient system at a flow rate of 1.0 mL/min, and the detection wavelength was set at 236 nm. Liquid chromatography with tandem mass spectrometry was applied to certify the reliability of the quantitative results. The developed methods were validated by ensuring specificity, linearity, limit of quantification, accuracy, precision, recovery, robustness, and stability. Results showed that cangzhu contained higher amounts of atractylenolide I and atractylodin than baizhu, with atractylodin contents showing the greatest variation between baizhu and cangzhu. Multivariate statistical analyses, such as principal component analysis and hierarchical cluster analysis, were also employed for further classification of the Atractylodes plants. The established method is suitable for quality control of the Atractylodes plants. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Detection of Anomalies in Hydrometric Data Using Artificial Intelligence Techniques

    NASA Astrophysics Data System (ADS)

    Lauzon, N.; Lence, B. J.

    2002-12-01

    This work focuses on the detection of anomalies in hydrometric data sequences, such as 1) outliers, which are individual data having statistical properties that differ from those of the overall population; 2) shifts, which are sudden changes over time in the statistical properties of the historical records of data; and 3) trends, which are systematic changes over time in the statistical properties. For the purpose of the design and management of water resources systems, it is important to be aware of these anomalies in hydrometric data, for they can induce a bias in the estimation of water quantity and quality parameters. These anomalies may be viewed as specific patterns affecting the data, and therefore pattern recognition techniques can be used for identifying them. However, the number of possible patterns is very large for each type of anomaly and consequently large computing capacities are required to account for all possibilities using the standard statistical techniques, such as cluster analysis. Artificial intelligence techniques, such as the Kohonen neural network and fuzzy c-means, are clustering techniques commonly used for pattern recognition in several areas of engineering and have recently begun to be used for the analysis of natural systems. They require much less computing capacity than the standard statistical techniques, and therefore are well suited for the identification of outliers, shifts and trends in hydrometric data. This work constitutes a preliminary study, using synthetic data representing hydrometric data that can be found in Canada. The analysis of the results obtained shows that the Kohonen neural network and fuzzy c-means are reasonably successful in identifying anomalies. This work also addresses the problem of uncertainties inherent to the calibration procedures that fit the clusters to the possible patterns for both the Kohonen neural network and fuzzy c-means. Indeed, for the same database, different sets of clusters can be established with these calibration procedures. A simple method for analyzing uncertainties associated with the Kohonen neural network and fuzzy c-means is developed here. The method combines the results from several sets of clusters, either from the Kohonen neural network or fuzzy c-means, so as to provide an overall diagnosis as to the identification of outliers, shifts and trends. The results indicate an improvement in the performance for identifying anomalies when the method of combining cluster sets is used, compared with when only one cluster set is used.

  4. Tanshinone IIA combined with adriamycin inhibited malignant biological behaviors of NSCLC A549 cell line in a synergistic way.

    PubMed

    Xie, Jun; Liu, Jia-Hui; Liu, Heng; Liao, Xiao-Zhong; Chen, Yuling; Lin, Mei-Gui; Gu, Yue-Yu; Liu, Tao-Li; Wang, Dong-Mei; Ge, Hui; Mo, Sui-Lin

    2016-11-18

    The study was designed to develop a platform to verify whether herb extracts combined with chemotherapy drugs play a synergistic role in anti-tumor effects, and to provide experimental evidence and a theoretical reference for finding new effective sensitizers. The inhibitory effects of tanshinone IIA and adriamycin on the proliferation of A549, PC9 and HLF cells were assessed by CCK8 assays. The combination index (CI) was calculated with the Chou-Talalay method, based on the median-effect principle. The migration and invasion ability of A549 cells was determined by wound-healing and transwell assays. Flow cytometry was used to detect cell apoptosis and the distribution of cell cycles. TUNEL staining was used to detect apoptotic cells. Immunofluorescence staining was used to detect the expression of cleaved caspase-3. Western blotting was used to detect the protein expression of the relevant apoptotic signaling pathways. The CDOCKER module in DS 2.5 was used to examine the binding modes of the drugs to the proteins. Both tanshinone IIA and adriamycin inhibited the growth of A549, PC9, and HLF cells in a dose- and time-dependent manner, although the anti-proliferative effect of tanshinone IIA was much weaker than that of adriamycin. Unlike the cancer cells, HLF cells displayed a stronger sensitivity to adriamycin and a weaker sensitivity to tanshinone IIA. When tanshinone IIA was combined with adriamycin at a ratio of 20:1, the two exhibited a synergistic anti-proliferation effect on A549 and PC9 cells, but not on HLF cells. Tanshinone IIA combined with adriamycin synergistically inhibited migration, induced apoptosis and arrested the cell cycle at the S and G2 phases in A549 cells. Both the single-drug treatments and the drug combination up-regulated the expression of cleaved caspase-3 and Bax, and down-regulated the expression of VEGF, VEGFR2, p-PI3K, p-Akt, Bcl-2, and caspase-3 protein; the changes in the combination groups were significantly larger than in the single-drug groups. Molecular docking indicated that tanshinone IIA could dock into the active sites of all the tested proteins through H-bond and aromatic interactions, in contrast with adriamycin. Tanshinone IIA can be developed as a novel agent in postoperative adjuvant therapy combined with other anti-tumor agents, and may improve the sensitivity of chemotherapeutics for non-small cell lung cancer with fewer side effects. In addition, this experiment not only provides a reference for the development of more effective anti-tumor medicine ingredients, but also builds a platform for evaluating the anti-tumor effects of Chinese herbal medicines in combination with chemotherapy drugs.
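
    The Chou-Talalay calculation referenced above reduces to the median-effect equation; a sketch follows, in which all parameter values are invented for illustration (Dm and m would come from fitting each drug's dose-effect curve).

    ```python
    def chou_talalay_ci(d1, d2, fa, dm1, m1, dm2, m2):
        """Combination index: CI < 1 synergy, CI = 1 additive, CI > 1 antagonism.

        Dx = Dm * (fa / (1 - fa)) ** (1 / m) is the single-drug dose that alone
        produces effect fraction fa (median-effect equation).
        """
        dx1 = dm1 * (fa / (1 - fa)) ** (1 / m1)
        dx2 = dm2 * (fa / (1 - fa)) ** (1 / m2)
        return d1 / dx1 + d2 / dx2

    # e.g., a 20:1 tanshinone IIA:adriamycin mixture producing 50% inhibition
    print(chou_talalay_ci(d1=20.0, d2=1.0, fa=0.5, dm1=60.0, m1=1.2, dm2=2.5, m2=1.1))
    ```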

  5. Automatic detection of the macula in retinal fundus images using seeded mode tracking approach.

    PubMed

    Wong, Damon W K; Liu, Jiang; Tan, Ngan-Meng; Yin, Fengshou; Cheng, Xiangang; Cheng, Ching-Yu; Cheung, Gemmy C M; Wong, Tien Yin

    2012-01-01

    The macula is the part of the eye responsible for central high-acuity vision. Detection of the macula is an important task in retinal image processing as a landmark for subsequent disease assessment, such as for age-related macula degeneration. In this paper, we present an approach to automatically determine the macula centre in retinal fundus images. First, contextual information from the image is combined with a statistical model to obtain an approximate localization of the macula region of interest. Subsequently, we propose the use of a seeded mode-tracking technique to locate the macula centre. The proposed approach is tested on a large dataset composed of 482 normal images and 162 glaucoma images from the ORIGA database and an additional 96 AMD images. The results show an ROI detection rate of 97.5% and correct detection of the macula within 1/3DD of a manual reference in 90.5% of cases, which outperforms other current methods. The results are promising for the use of the proposed approach to locate the macula for the detection of macula diseases from retinal images.

  6. TU-FG-201-03: Automatic Pre-Delivery Verification Using Statistical Analysis of Consistencies in Treatment Plan Parameters by the Treatment Site and Modality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, S; Wu, Y; Chang, X

    Purpose: A novel computer software system, namely APDV (Automatic Pre-Delivery Verification), has been developed for verifying patient treatment plan parameters immediately prior to treatment delivery in order to automatically detect and prevent catastrophic errors. Methods: APDV is designed to continuously monitor new DICOM plan files on the TMS computer at the treatment console. When new plans to be delivered are detected, APDV checks the consistency of plan parameters and high-level plan statistics using underlying rules and statistical properties for the given treatment site, technique and modality. These rules were quantitatively derived by retrospectively analyzing all the EBRT treatment plans of the past 8 years at the authors' institution. Therapists and physicists are notified with a warning message displayed on the TMS computer if any critical errors are detected, and check results, confirmation, and dismissal actions are saved to a database for further review. Results: APDV was implemented as a stand-alone program using C# to ensure the required real-time performance. Mean values and standard deviations were quantitatively derived for various plan parameters including MLC usage, MU/cGy ratio, beam SSD, beam weighting, and beam gantry angles (only for lateral targets) per treatment site, technique and modality. 2D rules for the combined MU/cGy ratio and averaged SSD values were also derived using joint probabilities of confidence error ellipses. The statistics of these major treatment plan parameters quantitatively evaluate the consistency of any treatment plan, which facilitates automatic APDV checking procedures. Conclusion: APDV could be useful in detecting and preventing catastrophic errors immediately before treatment delivery. Future plans include automatic patient identity and patient setup checks after patient daily images are acquired by the machine and become available on the TMS computer. This project is supported by the Agency for Healthcare Research and Quality (AHRQ) under award 1R01HS0222888. The senior author received research grants from ViewRay Inc. and Varian Medical System.

  7. Intraoperative end-tidal concentration of isoflurane in cats undergoing ovariectomy that received tramadol, buprenorphine or a combination of both.

    PubMed

    Bellini, Luca; Mollo, Antonio; Contiero, Barbara; Busetto, Roberto

    2017-02-01

    Objectives The aim of the study was to evaluate the end-tidal concentration of isoflurane required to maintain heart and respiratory rates within ±20% of baseline measurements in cats undergoing ovariectomy that received buprenorphine, tramadol or a combination of both. Methods Thirty cats, divided into three groups, were enrolled in a simple operator-blinded, randomised study. Cats received acepromazine (0.03 mg/kg) and one of the following treatments: buprenorphine (0.02 mg/kg), tramadol (2 mg/kg) or a combination of both. Anaesthesia was induced with propofol and maintained with isoflurane titrated to maintain heart and respiratory rates within the target values recorded before premedication. Results Groups were similar for age, weight, dose of propofol administered, and sedation and recovery scores. Cats receiving tramadol with buprenorphine were extubated earlier after isoflurane discontinuation. No statistical differences were detected in the end-tidal fraction of isoflurane between buprenorphine alone and buprenorphine with tramadol. In cats that received tramadol or buprenorphine alone, ovarian pedicle traction caused a statistically significant increase in end-tidal isoflurane concentration compared with that measured during incision and suture of the skin. In cats that received the combination of tramadol plus buprenorphine, no differences among surgical time points were observed. Conclusions and relevance Tramadol added to buprenorphine did not provide any advantage in decreasing the end-tidal fraction of isoflurane compared with buprenorphine alone, although it is speculated there may be an infra-additive interaction between tramadol and buprenorphine in cats.

  8. Assessing segmentation processes by click detection: online measure of statistical learning, or simple interference?

    PubMed

    Franco, Ana; Gaillard, Vinciane; Cleeremans, Axel; Destrebecqz, Arnaud

    2015-12-01

    Statistical learning can be used to extract the words from continuous speech. Gómez, Bion, and Mehler (Language and Cognitive Processes, 26, 212-223, 2011) proposed an online measure of statistical learning: They superimposed auditory clicks on a continuous artificial speech stream made up of a random succession of trisyllabic nonwords. Participants were instructed to detect these clicks, which could be located either within or between words. The results showed that, over the length of exposure, reaction times (RTs) increased more for within-word than for between-word clicks. This result has been accounted for by means of statistical learning of the between-word boundaries. However, even though statistical learning occurs without an intention to learn, it nevertheless requires attentional resources. Therefore, this process could be affected by a concurrent task such as click detection. In the present study, we evaluated the extent to which the click detection task indeed reflects successful statistical learning. Our results suggest that the emergence of RT differences between within- and between-word click detection is neither systematic nor related to the successful segmentation of the artificial language. Therefore, instead of being an online measure of learning, the click detection task seems to interfere with the extraction of statistical regularities.

  9. Modularity-like objective function in annotated networks

    NASA Astrophysics Data System (ADS)

    Xie, Jia-Rong; Wang, Bing-Hong

    2017-12-01

    We ascertain the modularity-like objective function whose optimization is equivalent to the maximum likelihood in annotated networks. We demonstrate that the modularity-like objective function is a linear combination of modularity and conditional entropy. In contrast with statistical inference methods, in our method, the influence of the metadata is adjustable; when its influence is strong enough, the metadata can be recovered. Conversely, when it is weak, the detection may correspond to another partition. Between the two, there is a transition. This paper provides a concept for expanding the scope of modularity methods.

  10. The Performance Analysis Based on SAR Sample Covariance Matrix

    PubMed Central

    Erten, Esra

    2012-01-01

    Multi-channel systems appear in several fields of application in science. In the Synthetic Aperture Radar (SAR) context, multi-channel systems may refer to different domains, such as multi-polarization, multi-interferometric or multi-temporal data, or even a combination of them. Due to the inherent speckle phenomenon present in SAR images, a statistical description of the data is almost mandatory for its utilization. The complex images acquired over natural media present in general zero-mean circular Gaussian characteristics. In this case, second-order statistics such as the multi-channel covariance matrix fully describe the data. For practical situations, however, the covariance matrix has to be estimated using a limited number of samples, and this sample covariance matrix follows the complex Wishart distribution. In this context, the eigendecomposition of the multi-channel covariance matrix has been shown to be of high relevance in different areas regarding the physical properties of the imaged scene. Specifically, the maximum eigenvalue of the covariance matrix has been frequently used in different applications such as target or change detection, estimation of the dominant scattering mechanism in polarimetric data, moving target indication, etc. In this paper, the statistical behavior of the maximum eigenvalue derived from the eigendecomposition of the sample multi-channel covariance matrix in terms of multi-channel SAR images is simplified for the SAR community. Validation is performed against simulated data, and examples of estimation and detection problems using the analytical expressions are given as well. PMID:22736976

  11. Revealing cell cycle control by combining model-based detection of periodic expression with novel cis-regulatory descriptors

    PubMed Central

    Andersson, Claes R; Hvidsten, Torgeir R; Isaksson, Anders; Gustafsson, Mats G; Komorowski, Jan

    2007-01-01

    Background We address the issue of explaining the presence or absence of phase-specific transcription in budding yeast cultures under different conditions. To this end we use a model-based detector of gene expression periodicity to divide genes into classes depending on their behavior in experiments using different synchronization methods. While computational inference of gene regulatory circuits typically relies on expression similarity (clustering) in order to find classes of potentially co-regulated genes, this method instead takes advantage of known time profile signatures related to the studied process. Results We explain the regulatory mechanisms of the inferred periodic classes with cis-regulatory descriptors that combine upstream sequence motifs with experimentally determined binding of transcription factors. By systematic statistical analysis we show that periodic classes are best explained by combinations of descriptors rather than single descriptors, and that different combinations correspond to periodic expression in different classes. We also find evidence for additive regulation in that the combinations of cis-regulatory descriptors associated with genes periodically expressed in fewer conditions are frequently subsets of combinations associated with genes periodically expression in more conditions. Finally, we demonstrate that our approach retrieves combinations that are more specific towards known cell-cycle related regulators than the frequently used clustering approach. Conclusion The results illustrate how a model-based approach to expression analysis may be particularly well suited to detect biologically relevant mechanisms. Our new approach makes it possible to provide more refined hypotheses about regulatory mechanisms of the cell cycle and it can easily be adjusted to reveal regulation of other, non-periodic, cellular processes. PMID:17939860

  12. Predicting Flowering Behavior and Exploring Its Genetic Determinism in an Apple Multi-family Population Based on Statistical Indices and Simplified Phenotyping.

    PubMed

    Durand, Jean-Baptiste; Allard, Alix; Guitton, Baptiste; van de Weg, Eric; Bink, Marco C A M; Costes, Evelyne

    2017-01-01

    Irregular flowering over years is commonly observed in fruit trees. The early prediction of tree behavior is highly desirable in breeding programmes. This study aims at performing such predictions, combining simplified phenotyping and statistical methods. Sequences of vegetative vs. floral annual shoots (AS) were observed along axes in trees belonging to five related full-sib apple families. Sequences were analyzed using Markovian and linear mixed models including year and site effects. Indices of flowering irregularity, periodicity and synchronicity were estimated, at tree and axis scales. They were used to predict tree behavior and detect QTL with a Bayesian pedigree-based analysis, using an integrated genetic map containing 6,849 SNPs. The combination of a Biennial Bearing Index (BBI) with an autoregressive coefficient (γg) efficiently predicted and classified the genotype behaviors, despite a few misclassifications. Four QTLs common to BBIs and γg and one for synchronicity were highlighted, revealing the complex genetic architecture of the traits. Irregularity resulted from high AS synchronism, whereas regularity resulted from either asynchronous locally alternating or continual regular AS flowering. A relevant and time-saving method, based on a posteriori sampling of axes and statistical indices, is proposed, which is efficient for evaluating tree breeding values for flowering regularity and could be transferred to other species.
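
    The two key indices are simple to compute from a yearly flowering (or yield) sequence. The sketch below uses one widely used BBI variant (mean normalized year-to-year change) and a lag-1 autocorrelation as the autoregressive coefficient; the paper's exact definitions may differ, and the sequences are invented.

    ```python
    # Irregularity indices for a yearly bearing sequence (illustrative variants).
    import numpy as np

    def bbi(y):
        """Mean normalized year-to-year change: 0 = regular, 1 = strict alternation."""
        y = np.asarray(y, dtype=float)
        return np.mean(np.abs(np.diff(y)) / (y[1:] + y[:-1]))

    def lag1_autocorr(y):
        y = np.asarray(y, dtype=float) - np.mean(y)
        return np.sum(y[1:] * y[:-1]) / np.sum(y * y)

    alternating = [120, 10, 130, 8, 140, 12]   # strongly biennial tree
    regular = [70, 75, 68, 72, 74, 71]         # regular bearer
    for name, y in [("alternating", alternating), ("regular", regular)]:
        print(f"{name}: BBI = {bbi(y):.2f}, gamma = {lag1_autocorr(y):.2f}")
    ```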

  13. Comparison of non-invasive assessment to diagnose liver fibrosis in chronic hepatitis B and C patients.

    PubMed

    Stibbe, Krista J M; Verveer, Claudia; Francke, Jan; Hansen, Bettina E; Zondervan, Pieter E; Kuipers, Ernst J; de Knegt, Robert J; van Vuuren, Anneke J

    2011-07-01

    Chronic viral hepatitis B and C cause liver fibrosis, leading to cirrhosis. Fibrosis assessment is essential to establish prognosis and treatment indication. We compared seven non-invasive tests, separately and in combination, in chronic hepatitis patients to detect early stages of fibrosis according to the Metavir score in liver biopsy. Galactose and methacetin breath tests (GBT and MBT), biomarkers (hyaluronic acid (HA), aspartate aminotransferase platelet ratio index (APRI), FibroTest, and Fib-4) and transient elastography (TE) were evaluated in 89 patients. Additionally, 31 healthy controls were included for evaluation of breath tests and biomarkers. Serum markers (HA, APRI, FibroTest, and Fib-4) and elastography significantly distinguished non-cirrhotic (F0123) from cirrhotic (F4) patients (p < 0.001, p = 0.015, p < 0.001, p = 0.005, p = 0.006, respectively). GBT, HA, APRI, FibroTest, Fib-4, and TE distinguished F01 from F234 (p = 0.04, p = 0.011, p = 0.009, p < 0.001, p < 0.001, and p < 0.001, respectively). A combination of different tests (TE, HA, and FibroTest) improved performance significantly, with an area under the curve (AUC) of 0.87 for F234, 0.92 for F34, and 0.90 for F4. HA, APRI, FibroTest, Fib-4, and TE reliably distinguish non-cirrhotic from cirrhotic patients. Except for MBT, all tests discriminate between mild and moderate fibrosis. As single tests, FibroTest, Fib-4, and TE were the most accurate for detecting early fibrosis; combining different non-invasive tests increased the accuracy of liver fibrosis detection to such an extent that it might be acceptable as a replacement for liver biopsy.
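
    Combining several markers into one score is typically done with a logistic model evaluated by AUC. A minimal sketch (Python/scikit-learn; the marker names, effect sizes, and data are synthetic stand-ins, not the study's data or exact model):

    ```python
    # Combine noninvasive markers (e.g., TE, HA, FibroTest) into one classifier
    # and score it with a cross-validated AUC.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(2)
    n = 120
    fibrosis = rng.integers(0, 2, n)                    # 1 = significant fibrosis
    te = 5 + 6 * fibrosis + rng.normal(0, 3, n)         # elastography, kPa
    ha = 30 + 40 * fibrosis + rng.normal(0, 25, n)      # hyaluronic acid, ug/L
    ft = 0.3 + 0.3 * fibrosis + rng.normal(0, 0.15, n)  # FibroTest-like score
    X = np.column_stack([te, ha, ft])

    # cross-validated probabilities avoid an optimistic in-sample AUC
    probs = cross_val_predict(LogisticRegression(max_iter=1000), X, fibrosis,
                              cv=5, method="predict_proba")[:, 1]
    print(f"cross-validated AUC of the combined test: {roc_auc_score(fibrosis, probs):.2f}")
    ```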

  14. Detecting epistasis with the marginal epistasis test in genetic mapping studies of quantitative traits

    PubMed Central

    Zeng, Ping; Mukherjee, Sayan; Zhou, Xiang

    2017-01-01

    Epistasis, commonly defined as the interaction between multiple genes, is an important genetic component underlying phenotypic variation. Many statistical methods have been developed to model and identify epistatic interactions between genetic variants. However, because of the large combinatorial search space of interactions, most epistasis mapping methods face enormous computational challenges and often suffer from low statistical power due to multiple test correction. Here, we present a novel, alternative strategy for mapping epistasis: instead of directly identifying individual pairwise or higher-order interactions, we focus on mapping variants that have non-zero marginal epistatic effects—the combined pairwise interaction effects between a given variant and all other variants. By testing marginal epistatic effects, we can identify candidate variants that are involved in epistasis without the need to identify the exact partners with which the variants interact, thus potentially alleviating much of the statistical and computational burden associated with standard epistatic mapping procedures. Our method is based on a variance component model, and relies on a recently developed variance component estimation method for efficient parameter inference and p-value computation. We refer to our method as the “MArginal ePIstasis Test”, or MAPIT. With simulations, we show how MAPIT can be used to estimate and test marginal epistatic effects, produce calibrated test statistics under the null, and facilitate the detection of pairwise epistatic interactions. We further illustrate the benefits of MAPIT in a QTL mapping study by analyzing the gene expression data of over 400 individuals from the GEUVADIS consortium. PMID:28746338

  15. Contextual Interactions in Grating Plaid Configurations Are Explained by Natural Image Statistics and Neural Modeling

    PubMed Central

    Ernst, Udo A.; Schiffer, Alina; Persike, Malte; Meinhardt, Günter

    2016-01-01

    Processing natural scenes requires the visual system to integrate local features into global object descriptions. To achieve coherent representations, the human brain uses statistical dependencies to guide weighting of local feature conjunctions. Pairwise interactions among feature detectors in early visual areas may form the early substrate of these local feature bindings. To investigate local interaction structures in visual cortex, we combined psychophysical experiments with computational modeling and natural scene analysis. We first measured contrast thresholds for 2 × 2 grating patch arrangements (plaids), which differed in spatial frequency composition (low, high, or mixed), number of grating patch co-alignments (0, 1, or 2), and inter-patch distances (1° and 2° of visual angle). Contrast thresholds for the different configurations were compared to the prediction of probability summation (PS) among detector families tuned to the four retinal positions. For 1° distance the thresholds for all configurations were larger than predicted by PS, indicating inhibitory interactions. For 2° distance, thresholds were significantly lower compared to PS when the plaids were homogeneous in spatial frequency and orientation, but not when spatial frequencies were mixed or there was at least one misalignment. Next, we constructed a neural population model with horizontal laminar structure, which reproduced the detection thresholds after adaptation of connection weights. Consistent with prior work, contextual interactions comprised medium-range inhibition and long-range, orientation-specific excitation. However, the inclusion of orientation-specific inhibitory interactions between populations with different spatial frequency preferences was crucial for explaining detection thresholds. Finally, for all plaid configurations we computed their likelihood of occurrence in natural images. The likelihoods turned out to be inversely related to the detection thresholds obtained at larger inter-patch distances. However, likelihoods were almost independent of inter-patch distance, implying that natural image statistics could not explain the crowding-like results at short distances. This failure of natural image statistics to resolve the patch distance modulation of plaid visibility remains a challenge to the approach. PMID:27757076
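
    The probability-summation baseline against which the thresholds are compared has a one-line form: if each of the four patch detectors independently detects with probability p_i, PS predicts joint detection 1 - prod(1 - p_i). A tiny sketch (Python; the per-patch probabilities are invented):

    ```python
    # Probability-summation (PS) prediction for independent patch detectors.
    import numpy as np

    def ps_prediction(p_single):
        p_single = np.asarray(p_single, dtype=float)
        return 1.0 - np.prod(1.0 - p_single)

    p_patch = [0.30, 0.30, 0.30, 0.30]  # per-patch detection probability
    print(f"PS-predicted joint detection probability: {ps_prediction(p_patch):.2f}")
    # Measured performance clearly above this baseline suggests facilitation
    # (excitatory interactions); clearly below suggests inhibition.
    ```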

  16. Tables of square-law signal detection statistics for Hann spectra with 50 percent overlap

    NASA Technical Reports Server (NTRS)

    Deans, Stanley R.; Cullers, D. Kent

    1991-01-01

    The Search for Extraterrestrial Intelligence, currently being planned by NASA, will require that an enormous amount of data be analyzed in real time by special purpose hardware. It is expected that overlapped Hann data windows will play an important role in this analysis. In order to understand the statistical implication of this approach, it has been necessary to compute detection statistics for overlapped Hann spectra. Tables of signal detection statistics are given for false alarm rates from 10^-14 to 10^-1 and signal detection probabilities from 0.50 to 0.99; the number of computed spectra ranges from 4 to 2000.
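
    For orientation, the single-spectrum, non-overlapped case can be computed directly: under noise, the square-law statistic in one bin is chi-square with 2 degrees of freedom, and signal-plus-noise follows a noncentral chi-square. The sketch below (Python/SciPy) solves for the signal energy meeting a given false alarm rate and detection probability; it deliberately ignores the inter-spectrum correlation introduced by 50%-overlapped Hann windows, which is exactly what the paper's tables account for.

    ```python
    # Back-of-envelope square-law detection calculation for one spectral bin.
    from scipy.stats import chi2, ncx2
    from scipy.optimize import brentq

    pfa, pd = 1e-9, 0.99
    threshold = chi2.isf(pfa, df=2)            # false-alarm threshold
    # noncentrality (~signal energy) that yields the desired detection rate
    lam = brentq(lambda l: ncx2.sf(threshold, 2, l) - pd, 1e-6, 1e4)
    print(f"threshold = {threshold:.1f}, required noncentrality = {lam:.1f}")
    ```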

  17. A space-time scan statistic for detecting emerging outbreaks.

    PubMed

    Tango, Toshiro; Takahashi, Kunihiko; Kohriyama, Kazuaki

    2011-03-01

    As a major analytical method for outbreak detection, Kulldorff's space-time scan statistic (2001, Journal of the Royal Statistical Society, Series A 164, 61-72) has been implemented in many syndromic surveillance systems. Since, however, it is based on circular windows in space, it has difficulty correctly detecting actual noncircular clusters. Takahashi et al. (2008, International Journal of Health Geographics 7, 14) proposed a flexible space-time scan statistic with the capability of detecting noncircular areas. It seems to us, however, that the detection of the most likely cluster defined in these space-time scan statistics is not the same as the detection of localized emerging disease outbreaks because the former compares the observed number of cases with the conditional expected number of cases. In this article, we propose a new space-time scan statistic which compares the observed number of cases with the unconditional expected number of cases, takes a time-to-time variation of Poisson mean into account, and implements an outbreak model to capture localized emerging disease outbreaks more timely and correctly. The proposed models are illustrated with data from weekly surveillance of the number of absentees in primary schools in Kitakyushu-shi, Japan, 2006. © 2010, The International Biometric Society.
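
    The building block shared by these scan statistics is a window-level likelihood ratio. A minimal sketch (Python; the window labels and counts are invented, and the paper's unconditional outbreak model is not reproduced) of the Poisson log-likelihood-ratio score:

    ```python
    # Poisson log-likelihood-ratio score for one space-time scan window,
    # comparing observed cases c with the expected count mu under the baseline.
    import numpy as np

    def poisson_llr(c, mu):
        """LLR in favor of elevated risk in the window (0 unless c > mu)."""
        if c <= mu:
            return 0.0
        return c * np.log(c / mu) + (mu - c)

    windows = [("school A, weeks 10-12", 14, 5.2),
               ("school B, weeks 10-12", 6, 5.8)]
    for name, observed, expected in windows:
        print(f"{name}: LLR = {poisson_llr(observed, expected):.2f}")
    # In a full scan statistic, the maximum LLR over all candidate windows is
    # compared against its Monte Carlo distribution under the null.
    ```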

  18. Toward unsupervised outbreak detection through visual perception of new patterns

    PubMed Central

    Lévy, Pierre P; Valleron, Alain-Jacques

    2009-01-01

    Background Statistical algorithms are routinely used to detect outbreaks of well-defined syndromes, such as influenza-like illness. These methods cannot be applied to the detection of emerging diseases for which no preexisting information is available. This paper presents a method aimed at facilitating the detection of outbreaks, when there is no a priori knowledge of the clinical presentation of cases. Methods The method uses a visual representation of the symptoms and diseases coded during a patient consultation according to the International Classification of Primary Care 2nd version (ICPC-2). The surveillance data are transformed into color-coded cells, ranging from white to red, reflecting the increasing frequency of observed signs. They are placed in a graphic reference frame mimicking body anatomy. Simple visual observation of color-change patterns over time, concerning a single code or a combination of codes, enables detection in the setting of interest. Results The method is demonstrated through retrospective analyses of two data sets: description of the patients referred to the hospital by their general practitioners (GPs) participating in the French Sentinel Network and description of patients directly consulting at a hospital emergency department (HED). Informative image color-change alert patterns emerged in both cases: the health consequences of the August 2003 heat wave were visualized with GPs' data (but passed unnoticed with conventional surveillance systems), and the flu epidemics, which are routinely detected by standard statistical techniques, were recognized visually with HED data. Conclusion Using human visual pattern-recognition capacities to detect the onset of unexpected health events implies a convenient image representation of epidemiological surveillance and well-trained "epidemiology watchers". Once these two conditions are met, one could imagine that the epidemiology watchers could signal epidemiological alerts, based on "image walls" presenting the local, regional and/or national surveillance patterns, with specialized field epidemiologists assigned to validate the signals detected. PMID:19515246

  19. Investigating different skin and gastrointestinal tract (GIT) pathologies ex vivo by autofluorescence spectroscopy and optical imaging

    NASA Astrophysics Data System (ADS)

    Zhelyazkova, A.; Kuzmina, I.; Borisova, E.; Penkov, N.; Genova, Ts.; Spigulis, J.; Avramov, L.

    2016-01-01

    Skin neoplasias rank second in worldwide cancer incidence statistics, and gastrointestinal tract (GIT) tumours are also in the "top ten" list. For most cutaneous and gastrointestinal tumours, better prognoses could be obtained if earlier and more precise diagnostic procedures were applied. One of the most promising approaches for the development of improved diagnostic techniques is based on optical detection and analysis of the signatures of biological tissues to reveal the presence of pathological alterations in the investigated objects. It is important to develop and combine novel diagnostic techniques for accurate early-stage diagnosis to improve the chances of successful treatment of skin and GIT tumours. Optical techniques are very promising methods for such noninvasive diagnosis of skin and mucosa tumours, offering the advantages of large imaging depth, high resolution, fast imaging speed, and a noninvasive mode of detection. In this study we combine autofluorescence spectroscopy and optical imaging techniques to develop a more precise evaluation of the investigated tissue pathologies. We obtain chromophore maps for GIT and cutaneous samples, with better visualization of tumour borders and margins. In addition, fluorescence spectra give us information about early changes in chromophore content in the tissues during neoplasia growth.

  20. An Optimal Bahadur-Efficient Method in Detection of Sparse Signals with Applications to Pathway Analysis in Sequencing Association Studies.

    PubMed

    Dai, Hongying; Wu, Guodong; Wu, Michael; Zhi, Degui

    2016-01-01

    Next-generation sequencing data pose a severe curse of dimensionality, complicating traditional "single marker-single trait" analysis. We propose a two-stage combined p-value method for pathway analysis. The first stage is at the gene level, where we integrate effects within a gene using the Sequence Kernel Association Test (SKAT). The second stage is at the pathway level, where we perform a correlated Lancaster procedure to detect joint effects from multiple genes within a pathway. We show that the Lancaster procedure is optimal in Bahadur efficiency among all combined p-value methods. The Bahadur efficiency, [Formula: see text], compares sample sizes among different statistical tests when signals become sparse in sequencing data, i.e., ε → 0. The optimal Bahadur efficiency ensures that the Lancaster procedure asymptotically requires a minimal sample size to detect sparse signals ([Formula: see text]). The Lancaster procedure can also be applied to meta-analysis. Extensive empirical assessments of exome sequencing data show that the proposed method outperforms Gene Set Enrichment Analysis (GSEA). We applied the competitive Lancaster procedure to meta-analysis data generated by the Global Lipids Genetics Consortium to identify pathways significantly associated with high-density lipoprotein cholesterol, low-density lipoprotein cholesterol, triglycerides, and total cholesterol.
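
    In its independence form, Lancaster's procedure transforms each p-value into a chi-square deviate with a chosen number of degrees of freedom (the weight) and sums them; the sum is chi-square with the total degrees of freedom under the global null, and all weights equal to 2 recovers Fisher's method. The sketch below (Python/SciPy) shows this basic form only; the correlation correction used in the paper is omitted, and the weights are illustrative.

    ```python
    # Lancaster's weighted combination of independent p-values.
    import numpy as np
    from scipy.stats import chi2

    def lancaster(pvals, weights):
        pvals = np.asarray(pvals, dtype=float)
        weights = np.asarray(weights, dtype=float)
        t = np.sum(chi2.isf(pvals, df=weights))  # inverse survival transform
        return chi2.sf(t, df=weights.sum())      # combined p-value

    gene_pvals = [0.01, 0.40, 0.03, 0.75]
    gene_weights = [4, 2, 6, 2]                  # all weights 2 = Fisher's method
    print(f"pathway p-value: {lancaster(gene_pvals, gene_weights):.4f}")
    ```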

  1. Chromosome aberration analysis in atomic bomb survivors and Thorotrast patients using two- and three-colour chromosome painting of chromosomal subsets.

    PubMed

    Tanaka, K; Popp, S; Fischer, C; Van Kaick, G; Kamada, N; Cremer, T; Cremer, C

    1996-07-01

    Chromosomal translocations in peripheral lymphocytes of three healthy Hiroshima atomic (A)-bomb survivors, as well as three Thorotrast patients and two non-irradiated age-matched control persons from the German Thorotrast study, were studied by two- and three-colour fluorescence in situ hybridization (chromosome painting) with various combinations of whole chromosome composite probes, including chromosomes 1, 2, 3, 4, 6, 7, 8, 9 and 12. Translocation frequencies detected by chromosome painting in cells of the A-bomb survivors were compared with results obtained by G-banding. A direct comparison was made, i.e., only those cells with simple translocations or complex aberrations detected by G-banding that could, in principle, also be detected with the respective painting combination were taken into consideration. The statistical analysis revealed no significant differences from a 1:1 relationship between the frequencies of aberrant cells obtained by both methods. The use of genomic translocation frequencies estimated from subsets of chromosomes for biological dosimetry is discussed in the light of evidence that chromosomes occupy distinct territories and are variably arranged in human lymphocyte nuclei. This territorial organization of interphase chromosomes implies that translocations will be restricted to chromatin located at the periphery of adjacent chromosome territories.

  2. Automatic Detection of Regions in Spinach Canopies Responding to Soil Moisture Deficit Using Combined Visible and Thermal Imagery

    PubMed Central

    Raza, Shan-e-Ahmed; Smith, Hazel K.; Clarkson, Graham J. J.; Taylor, Gail; Thompson, Andrew J.; Clarkson, John; Rajpoot, Nasir M.

    2014-01-01

    Thermal imaging has been used in the past for remote detection of regions of canopy showing symptoms of stress, including water deficit stress. Stress indices derived from thermal images have been used as an indicator of canopy water status, but these depend on the choice of reference surfaces and environmental conditions and can be confounded by variations in complex canopy structure. Therefore, in this work, instead of using stress indices, information from thermal and visible light imagery was combined along with machine learning techniques to identify regions of canopy showing a response to soil water deficit. Thermal and visible light images of a spinach canopy with different levels of soil moisture were captured. Statistical measurements from these images were extracted and used to classify between canopies growing in well-watered soil or under soil moisture deficit using Support Vector Machines (SVM) and Gaussian Processes Classifier (GPC) and a combination of both the classifiers. The classification results show a high correlation with soil moisture. We demonstrate that regions of a spinach crop responding to soil water deficit can be identified by using machine learning techniques with a high accuracy of 97%. This method could, in principle, be applied to any crop at a range of scales. PMID:24892284

  3. Combination of DNA-based and conventional methods to detect human leukocyte antigen polymorphism and its use for paternity testing.

    PubMed

    Keresztury, László; Rajczy, Katalin; Lászik, András; Gyódi, Eva; Pénzes, Mária; Falus, András; Petrányi, Gyõzõ G

    2002-03-01

    In cases of disputed paternity, the scientific goal is to promote either the exclusion of a falsely accused man or the affiliation of the alleged father. Until now, in addition to anthropologic characteristics, the genetic markers determined for this purpose have included human leukocyte antigen gene variants, erythrocyte antigens, and serum proteins. Recombinant DNA techniques provided a new set of highly variable genetic markers based on DNA nucleotide sequence polymorphism. From the practical standpoint, the application of these techniques to paternity testing provides greater versatility than do conventional genetic marker systems. The use of methods to detect the polymorphism of human leukocyte antigen loci significantly increases the chance of validating ambiguous results in paternity testing. The outcome of 2384 paternity cases investigated by serologic and/or DNA-based human leukocyte antigen typing was statistically analyzed. Different cases solved by DNA typing are presented, involving cases with one or two accused men, exclusions and nonexclusions, and tests of the paternity of a deceased man. The results provide evidence for the advantage of the combined application of various techniques in forensic diagnostics and emphasize the outstanding possibilities of DNA-based assays. Representative examples demonstrate the strength of combined techniques in paternity testing.

  4. Simplified estimation of age-specific reference intervals for skewed data.

    PubMed

    Wright, E M; Royston, P

    1997-12-30

    Age-specific reference intervals are commonly used in medical screening and clinical practice, where interest lies in the detection of extreme values. Many different statistical approaches have been published on this topic. The advantages of a parametric method are that it necessarily produces smooth centile curves, estimates the entire density, and provides an explicit formula for the centiles. The method proposed here is a simplified version of a recent approach proposed by Royston and Wright. Basic transformations of the data and multiple regression techniques are combined to model the mean, standard deviation and skewness. Using these simple tools, which are implemented in almost all statistical computer packages, age-specific reference intervals may be obtained. The scope of the method is illustrated by fitting models to several real data sets and assessing each model using goodness-of-fit techniques.
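
    The core idea is simple enough to sketch: model the mean and the standard deviation as smooth functions of age, then form centiles as mean(age) + z × SD(age). The sketch below (Python; synthetic data, and the skewness modeling of the full method is omitted) uses the fact that E|residual| = σ√(2/π) for normal errors to fit an age-dependent SD by regression.

    ```python
    # Simplified age-specific reference-interval sketch (mean/SD regression).
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(3)
    age = rng.uniform(20, 80, 400)
    y = 2.0 + 0.03 * age + rng.normal(0, 0.2 + 0.01 * age)  # SD grows with age

    mean_coef = np.polyfit(age, y, deg=1)                   # regression for the mean
    resid = y - np.polyval(mean_coef, age)
    # E|resid| = sigma * sqrt(2/pi) for normal errors, so rescale before fitting
    sd_coef = np.polyfit(age, np.abs(resid) / np.sqrt(2 / np.pi), deg=1)

    def reference_interval(a, level=0.95):
        z = norm.ppf(0.5 + level / 2)
        mu, sd = np.polyval(mean_coef, a), np.polyval(sd_coef, a)
        return mu - z * sd, mu + z * sd

    print("95% reference interval at age 30:", reference_interval(30))
    print("95% reference interval at age 70:", reference_interval(70))
    ```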

  5. Comparison of statistical methods for detection of serum lipid biomarkers for mesothelioma and asbestos exposure.

    PubMed

    Xu, Rengyi; Mesaros, Clementina; Weng, Liwei; Snyder, Nathaniel W; Vachani, Anil; Blair, Ian A; Hwang, Wei-Ting

    2017-07-01

    We compared three statistical methods for selecting a panel of serum lipid biomarkers for mesothelioma and asbestos exposure. Serum samples from mesothelioma patients, asbestos-exposed subjects, and controls (40 per group) were analyzed. Three variable selection methods were considered: top-ranked predictors from univariate models, stepwise selection, and the least absolute shrinkage and selection operator (LASSO). Cross-validated area under the receiver operating characteristic curve was used to compare prediction performance. Lipids with high cross-validated area under the curve were identified. A lipid with a mass-to-charge ratio of 372.31 was selected by all three methods comparing mesothelioma versus control. Lipids with mass-to-charge ratios of 1464.80 and 329.21 were selected by two models for asbestos exposure versus control. Different methods selected a similar set of serum lipids. Combining candidate biomarkers can improve prediction.
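
    One of the compared strategies, L1-penalized (LASSO) selection with cross-validated AUC, can be sketched in a few lines (Python/scikit-learn). The feature matrix, group sizes, and effect sizes below are synthetic stand-ins, not the study's lipidomics data or exact implementation.

    ```python
    # LASSO-style sparse biomarker selection scored by cross-validated AUC.
    import numpy as np
    from sklearn.linear_model import LogisticRegressionCV
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import cross_val_predict
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(4)
    n, p = 80, 200                        # 40 cases + 40 controls, 200 lipids
    y = np.repeat([0, 1], n // 2)
    X = rng.normal(size=(n, p))
    X[y == 1, :3] += 0.9                  # three truly informative lipids

    model = make_pipeline(
        StandardScaler(),
        LogisticRegressionCV(penalty="l1", solver="liblinear", Cs=10, cv=5),
    )
    probs = cross_val_predict(model, X, y, cv=5, method="predict_proba")[:, 1]
    model.fit(X, y)
    selected = np.flatnonzero(model[-1].coef_.ravel() != 0)
    print(f"selected {selected.size} lipids, CV AUC = {roc_auc_score(y, probs):.2f}")
    ```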

  6. Monitoring of an antigen manufacturing process.

    PubMed

    Zavatti, Vanessa; Budman, Hector; Legge, Raymond; Tamer, Melih

    2016-06-01

    Fluorescence spectroscopy in combination with multivariate statistical methods was employed as a tool for monitoring the manufacturing process of pertactin (PRN), one of the virulence factors of Bordetella pertussis utilized in whooping cough vaccines. Fluorophores such as amino acids and co-enzymes were detected throughout the process. The fluorescence data collected at different stages of the fermentation and purification process were treated employing principal component analysis (PCA). Through PCA, it was feasible to identify sources of variability in PRN production. Then, partial least squares (PLS) regression was employed to correlate the fluorescence spectra obtained from pure PRN samples with the final protein content of these samples measured by a Kjeldahl test. Given that a statistically significant correlation was found between fluorescence and PRN levels, this approach could further be used as a method to predict the final protein content.
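
    The two chemometric steps compose naturally in code. A minimal sketch (Python/scikit-learn; the spectra, protein values, and component counts are simulated assumptions, not the plant data):

    ```python
    # PCA to map batch-to-batch variability, then PLS to relate spectra to the
    # Kjeldahl protein content.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(5)
    n_batches, n_wavelengths = 30, 150
    spectra = rng.normal(size=(n_batches, n_wavelengths))
    protein = 1.0 + 0.05 * spectra[:, :10].sum(axis=1) + rng.normal(0, 0.05, n_batches)

    scores = PCA(n_components=2).fit_transform(spectra)  # batch map for diagnostics
    pls = PLSRegression(n_components=3).fit(spectra, protein)
    r2 = pls.score(spectra, protein)
    print(f"first batch PCA scores: {scores[0].round(2)}, PLS R^2 = {r2:.2f}")
    ```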

  7. Decision support methods for the detection of adverse events in post-marketing data.

    PubMed

    Hauben, M; Bate, A

    2009-04-01

    Spontaneous reporting is a crucial component of post-marketing drug safety surveillance despite its significant limitations. The size and complexity of some spontaneous reporting system databases represent a challenge for drug safety professionals who traditionally have relied heavily on the scientific and clinical acumen of the prepared mind. Computer algorithms that calculate statistical measures of reporting frequency for huge numbers of drug-event combinations are increasingly used to support pharmacovigilance analysts screening large spontaneous reporting system databases. After an overview of pharmacovigilance and spontaneous reporting systems, we discuss the theory and application of contemporary computer algorithms in regular use, those under development, and the practical considerations involved in the implementation of computer algorithms within a comprehensive and holistic drug safety signal detection program.
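
    One of the routine disproportionality measures behind such algorithms is the proportional reporting ratio (PRR), computed from a 2×2 table of spontaneous reports. A tiny sketch (Python; the counts are invented for illustration):

    ```python
    # Proportional reporting ratio for a drug-event pair.
    def prr(a, b, c, d):
        """a: drug & event, b: drug & other events,
        c: other drugs & event, d: other drugs & other events."""
        return (a / (a + b)) / (c / (c + d))

    # e.g., 20 reports of the event among 500 reports for the drug, versus
    # 150 among 100,000 reports for all other drugs
    print(f"PRR = {prr(20, 480, 150, 99850):.1f}")
    ```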

  8. Resource-constrained Data Collection and Fusion for Identifying Weak Distributed Patterns in Networks

    DTIC Science & Technology

    2013-10-15

    [Abstract not extracted; the record's text consists of bibliography fragments, including citations to AISTATS papers on graph scan statistics ("Detecting activity in graphs via the Graph Ellipsoid Scan Statistic", 2013) and to "Near-optimal anomaly detection in graphs using Lovász Extended Scan Statistic" (Neural Information Processing Systems).]

  9. Atomic Bose-Hubbard Systems with Single-Particle Control

    NASA Astrophysics Data System (ADS)

    Preiss, Philipp Moritz

    Experiments with ultracold atoms in optical lattices provide outstanding opportunities to realize exotic quantum states due to a high degree of tunability and control. In this thesis, I present experiments that extend this control from global parameters to the level of individual particles. Using a quantum gas microscope for 87Rb, we have developed a single-site addressing scheme based on digital amplitude holograms. The system self-corrects for aberrations in the imaging setup and creates arbitrary beam profiles. We are thus able to shape optical potentials on the scale of single lattice sites and control the dynamics of individual atoms. We study the role of quantum statistics and interactions in the Bose-Hubbard model on the fundamental level of two particles. Bosonic quantum statistics are apparent in the Hong-Ou-Mandel interference of massive particles, which we observe in tailored double-well potentials. These underlying statistics, in combination with tunable repulsive interactions, dominate the dynamics in single- and two-particle quantum walks. We observe highly coherent position-space Bloch oscillations, bosonic bunching in Hanbury Brown-Twiss interference and the fermionization of strongly interacting bosons. Many-body states of indistinguishable quantum particles are characterized by large-scale spatial entanglement, which is difficult to detect in itinerant systems. Here, we extend the concept of Hong-Ou-Mandel interference from individual particles to many-body states to directly quantify entanglement entropy. We perform collective measurements on two copies of a quantum state and detect entanglement entropy through many-body interference. We measure the second order Renyi entropy in small Bose-Hubbard systems and detect the buildup of spatial entanglement across the superfluid-insulator transition. Our experiments open new opportunities for the single-particle-resolved preparation and characterization of many-body quantum states.

  10. The effect of ion-exchange purification on the determination of plutonium at the New Brunswick Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, W.G.; Spaletto, M.I.; Lewis, K.

    The method of plutonium (Pu) determination at the New Brunswick Laboratory (NBL) consists of a combination of ion-exchange purification followed by controlled-potential coulometric analysis (IE/CPC). The present report's purpose is to quantify any detectable Pu loss occurring in the ion-exchange (IE) purification step, which would cause a negative bias in the NBL method for Pu analysis. The magnitude of any such loss would be contained within the reproducibility (0.05%) of the IE/CPC method, which utilizes a state-of-the-art autocoulometer developed at NBL. When the NBL IE/CPC method is used for Pu analysis, any loss in ion-exchange purification (<0.05%) is confounded with the repeatability of the ion exchange and the precision of the CPC analysis technique (<0.05%). Consequently, detecting a bias in the IE/CPC method due to the IE alone using the IE/CPC method itself requires that many randomized analyses of a single material be performed over time and that statistical analysis of the data be carried out. The initial approach described in this report to quantify any IE loss was an independent method, Isotope Dilution Mass Spectrometry; however, the number of analyses performed was insufficient to assign a statistically significant value to the IE loss (<0.02% of 10 mg samples of Pu). The second method used for quantifying any IE loss of Pu was multiple ion exchanges of the same Pu aliquant; the small number of analyses possible per individual IE, together with the column-to-column variability over multiple ion exchanges, prevented statistical detection of any loss of <0.05%. 12 refs.

  11. Biomarker combinations for diagnosis and prognosis in multicenter studies: Principles and methods.

    PubMed

    Meisner, Allison; Parikh, Chirag R; Kerr, Kathleen F

    2017-01-01

    Many investigators are interested in combining biomarkers to predict a binary outcome or detect underlying disease. This endeavor is complicated by the fact that many biomarker studies involve data from multiple centers. Depending upon the relationship between center, the biomarkers, and the target of prediction, care must be taken when constructing and evaluating combinations of biomarkers. We introduce a taxonomy to describe the role of center and consider how a biomarker combination should be constructed and evaluated. We show that ignoring center, which is frequently done by clinical researchers, is often not appropriate. The limited statistical literature proposes using random intercept logistic regression models, an approach that we demonstrate is generally inadequate and may be misleading. We instead propose using fixed intercept logistic regression, which appropriately accounts for center without relying on untenable assumptions. After constructing the biomarker combination, we recommend using performance measures that account for the multicenter nature of the data, namely the center-adjusted area under the receiver operating characteristic curve. We apply these methods to data from a multicenter study of acute kidney injury after cardiac surgery. Appropriately accounting for center, both in construction and evaluation, may increase the likelihood of identifying clinically useful biomarker combinations.
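
    The recommended construction and evaluation can be sketched directly: fit a logistic model with a fixed intercept per center (center dummy variables) and report a center-adjusted AUC, here taken as the simple average of within-center AUCs. All data below are synthetic (Python/scikit-learn), and the averaging scheme is one plausible variant rather than the paper's exact estimator.

    ```python
    # Fixed-intercept (per-center) logistic combination of two biomarkers,
    # evaluated with a center-adjusted AUC.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(6)
    n, n_centers = 300, 3
    center = rng.integers(0, n_centers, n)
    y = rng.integers(0, 2, n)
    x1 = 0.8 * y + 0.5 * center + rng.normal(size=n)  # marker with center shift
    x2 = 0.5 * y + rng.normal(size=n)
    X = np.column_stack([x1, x2, np.eye(n_centers)[center]])  # markers + dummies

    model = LogisticRegression(fit_intercept=False, max_iter=1000).fit(X, y)
    probs = model.predict_proba(X)[:, 1]
    within = [roc_auc_score(y[center == k], probs[center == k])
              for k in range(n_centers)]
    print(f"center-adjusted AUC: {np.mean(within):.2f}")
    ```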

  12. Fusion of Local Statistical Parameters for Buried Underwater Mine Detection in Sonar Imaging

    NASA Astrophysics Data System (ADS)

    Maussang, F.; Rombaut, M.; Chanussot, J.; Hétet, A.; Amate, M.

    2008-12-01

    Detection of buried underwater objects, and especially mines, is a crucial current strategic task. Images provided by sonar systems capable of penetrating the sea floor, such as synthetic aperture sonars (SASs), are of great interest for the detection and classification of such objects. However, the signal-to-noise ratio is fairly low, and advanced information processing is required for a correct and reliable detection of the echoes generated by the objects. The detection method proposed in this paper is based on a data-fusion architecture using belief theory. The input data of this architecture are local statistical characteristics extracted from SAS data, corresponding to the first-, second-, third-, and fourth-order statistical properties of the sonar images, respectively. The interest of these parameters is derived from a statistical model of the sonar data. Numerical criteria are also proposed to estimate the detection performance and to validate the method.

  13. A Fast Framework for Abrupt Change Detection Based on Binary Search Trees and Kolmogorov Statistic

    PubMed Central

    Qi, Jin-Peng; Qi, Jie; Zhang, Qing

    2016-01-01

    Change-Point (CP) detection has attracted considerable attention in the fields of data mining and statistics; it is very meaningful to discuss how to quickly and efficiently detect abrupt change from large-scale bioelectric signals. Currently, most of the existing methods, like Kolmogorov-Smirnov (KS) statistic and so forth, are time-consuming, especially for large-scale datasets. In this paper, we propose a fast framework for abrupt change detection based on binary search trees (BSTs) and a modified KS statistic, named BSTKS (binary search trees and Kolmogorov statistic). In this method, first, two binary search trees, termed as BSTcA and BSTcD, are constructed by multilevel Haar Wavelet Transform (HWT); second, three search criteria are introduced in terms of the statistic and variance fluctuations in the diagnosed time series; last, an optimal search path is detected from the root to leaf nodes of two BSTs. The studies on both the synthetic time series samples and the real electroencephalograph (EEG) recordings indicate that the proposed BSTKS can detect abrupt change more quickly and efficiently than KS, t-statistic (t), and Singular-Spectrum Analyses (SSA) methods, with the shortest computation time, the highest hit rate, the smallest error, and the highest accuracy out of four methods. This study suggests that the proposed BSTKS is very helpful for useful information inspection on all kinds of bioelectric time series signals. PMID:27413364

  15. Evaluation of statistical treatments of left-censored environmental data using coincident uncensored data sets: I. Summary statistics

    USGS Publications Warehouse

    Antweiler, Ronald C.; Taylor, Howard E.

    2008-01-01

    The main classes of statistical treatment of below-detection limit (left-censored) environmental data for the determination of basic statistics that have been used in the literature are substitution methods, maximum likelihood, regression on order statistics (ROS), and nonparametric techniques. These treatments, along with using all instrument-generated data (even those below detection), were evaluated by examining data sets in which the true values of the censored data were known. It was found that for data sets with less than 70% censored data, the best technique overall for determination of summary statistics was the nonparametric Kaplan-Meier technique. ROS and the two substitution methods of assigning one-half the detection limit value to censored data or assigning a random number between zero and the detection limit to censored data were adequate alternatives. The use of these two substitution methods, however, requires a thorough understanding of how the laboratory censored the data. The technique of employing all instrument-generated data - including numbers below the detection limit - was found to be less adequate than the above techniques. At high degrees of censoring (greater than 70% censored data), no technique provided good estimates of summary statistics. Maximum likelihood techniques were found to be far inferior to all other treatments except substituting zero or the detection limit value to censored data.
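
    The substitution treatments evaluated above are easy to reproduce on data where the truth is known. A toy comparison (Python; the lognormal population and 40% censoring rate are assumptions, and the Kaplan-Meier and ROS treatments are omitted for brevity):

    ```python
    # Compare two substitution treatments for left-censored values against
    # the known truth: DL/2 substitution and uniform(0, DL) substitution.
    import numpy as np

    rng = np.random.default_rng(7)
    true_vals = rng.lognormal(mean=0.0, sigma=1.0, size=1000)
    dl = np.quantile(true_vals, 0.4)          # censor the lowest 40%
    censored = true_vals < dl

    half_dl = np.where(censored, dl / 2, true_vals)
    uniform = np.where(censored, rng.uniform(0, dl, true_vals.size), true_vals)

    print(f"true mean            = {true_vals.mean():.3f}")
    print(f"DL/2 substitution    = {half_dl.mean():.3f}")
    print(f"uniform substitution = {uniform.mean():.3f}")
    ```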

  16. Statistical parsimony networks and species assemblages in Cephalotrichid nemerteans (nemertea).

    PubMed

    Chen, Haixia; Strand, Malin; Norenburg, Jon L; Sun, Shichun; Kajihara, Hiroshi; Chernyshev, Alexey V; Maslakova, Svetlana A; Sundberg, Per

    2010-09-21

    It has been suggested that statistical parsimony network analysis could be used to get an indication of species represented in a set of nucleotide data, and the approach has been used to discuss species boundaries in some taxa. Based on 635 base pairs of the mitochondrial protein-coding gene cytochrome c oxidase I (COI), we analyzed 152 nemertean specimens using statistical parsimony network analysis with the connection probability set to 95%. The analysis revealed 15 distinct networks together with seven singletons. Statistical parsimony yielded three networks supporting the species status of Cephalothrix rufifrons, C. major and C. spiralis as they currently have been delineated by morphological characters and geographical location. Many other networks contained haplotypes from nearby geographical locations. Cladistic structure by maximum likelihood analysis overall supported the network analysis, but indicated a false positive result where subnetworks should have been connected into one network/species. This probably is caused by undersampling of the intraspecific haplotype diversity. Statistical parsimony network analysis provides a rapid and useful tool for detecting possible undescribed/cryptic species among cephalotrichid nemerteans based on COI gene. It should be combined with phylogenetic analysis to get indications of false positive results, i.e., subnetworks that would have been connected with more extensive haplotype sampling.

  17. [Road Extraction in Remote Sensing Images Based on Spectral and Edge Analysis].

    PubMed

    Zhao, Wen-zhi; Luo, Li-qun; Guo, Zhou; Yue, Jun; Yu, Xue-ying; Liu, Hui; Wei, Jing

    2015-10-01

    Roads are typically man-made objects in urban areas. Road extraction from high-resolution images has important applications for urban planning and transportation development. However, due to confusable spectral characteristics, it is difficult to distinguish roads from other objects by merely using traditional classification methods that depend mainly on spectral information. Edges are an important feature for the identification of linear objects (e.g., roads), and the distribution patterns of edges vary greatly among different objects. It is therefore crucial to merge edge statistical information with spectral information. In this study, a new method that combines spectral information and edge statistical features is proposed. First, edge detection is conducted by using a self-adaptive mean-shift algorithm on the panchromatic band, which greatly reduces pseudo-edges and noise effects. Then, edge statistical features are obtained from an edge statistical model, which measures the length and angle distribution of edges. Finally, by integrating the spectral and edge statistical features, the SVM algorithm is used to classify the image and roads are ultimately extracted. A series of experiments shows that the overall accuracy of the proposed method is 93%, compared with only 78% for the traditional approach. The results demonstrate that the proposed method is efficient and valuable for road extraction, especially from high-resolution images.

  18. Detection of Single Standing Dead Trees from Aerial Color Infrared Imagery by Segmentation with Shape and Intensity Priors

    NASA Astrophysics Data System (ADS)

    Polewski, P.; Yao, W.; Heurich, M.; Krzystek, P.; Stilla, U.

    2015-03-01

    Standing dead trees, known as snags, are an essential factor in maintaining biodiversity in forest ecosystems. Combined with their role as carbon sinks, this makes for a compelling reason to study their spatial distribution. This paper presents an integrated method to detect and delineate individual dead tree crowns from color infrared aerial imagery. Our approach consists of two steps which incorporate statistical information about prior distributions of both the image intensities and the shapes of the target objects. In the first step, we perform a Gaussian Mixture Model clustering in the pixel color space with priors on the cluster means, obtaining up to 3 components corresponding to dead trees, living trees, and shadows. We then refine the dead tree regions using a level set segmentation method enriched with a generative model of the dead trees' shape distribution as well as a discriminative model of their pixel intensity distribution. The iterative application of the statistical shape template yields the set of delineated dead crowns. The prior information enforces the consistency of the template's shape variation with the shape manifold defined by manually labeled training examples, which makes it possible to separate crowns located in close proximity and prevents the formation of large crown clusters. Also, the statistical information built into the segmentation gives rise to an implicit detection scheme, because the shape template evolves towards an empty contour if not enough evidence for the object is present in the image. We test our method on 3 sample plots from the Bavarian Forest National Park with reference data obtained by manually marking individual dead tree polygons in the images. Our results are scenario-dependent and range from a correctness/completeness of 0.71/0.81 up to 0.77/1, with an average center-of-gravity displacement of 3-5 pixels between the detected and reference polygons.
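
    The first step, mixture clustering in color space with prior knowledge of the class means, can be approximated in a few lines. In the sketch below (Python/scikit-learn; the CIR color values and class means are invented), the paper's priors on the cluster means are approximated simply by informative initialization, which is weaker than a true Bayesian prior.

    ```python
    # Gaussian-mixture clustering of pixel colors with informative initial
    # means for the dead-tree, living-tree, and shadow classes.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(8)
    # fake CIR pixels (NIR, red, green), one blob per class
    means_true = np.array([[0.3, 0.5, 0.4],   # dead trees
                           [0.8, 0.3, 0.3],   # living trees
                           [0.1, 0.1, 0.1]])  # shadows
    pixels = np.vstack([m + 0.05 * rng.normal(size=(500, 3)) for m in means_true])

    gmm = GaussianMixture(n_components=3, means_init=means_true, random_state=0)
    labels = gmm.fit_predict(pixels)
    print("estimated class means:\n", gmm.means_.round(2))
    ```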

  19. Shared Gene Expression Alterations in Nasal and Bronchial Epithelium for Lung Cancer Detection.

    PubMed

    2017-07-01

    We previously derived and validated a bronchial epithelial gene expression biomarker to detect lung cancer in current and former smokers. Given that bronchial and nasal epithelial gene expression are similarly altered by cigarette smoke exposure, we sought to determine if cancer-associated gene expression might also be detectable in the more readily accessible nasal epithelium. Nasal epithelial brushings were prospectively collected from current and former smokers undergoing diagnostic evaluation for pulmonary lesions suspicious for lung cancer in the AEGIS-1 (n = 375) and AEGIS-2 (n = 130) clinical trials and gene expression profiled using microarrays. All statistical tests were two-sided. We identified 535 genes that were differentially expressed in the nasal epithelium of AEGIS-1 patients diagnosed with lung cancer vs those with benign disease after one year of follow-up (P < .001). Using bronchial gene expression data from the AEGIS-1 patients, we found statistically significant concordant cancer-associated gene expression alterations between the two airway sites (P < .001). Differentially expressed genes in the nose were enriched for genes associated with the regulation of apoptosis and immune system signaling. A nasal lung cancer classifier derived in the AEGIS-1 cohort that combined clinical factors (age, smoking status, time since quit, mass size) and nasal gene expression (30 genes) had statistically significantly higher area under the curve (0.81; 95% confidence interval [CI] = 0.74 to 0.89, P = .01) and sensitivity (0.91; 95% CI = 0.81 to 0.97, P = .03) than a clinical-factor only model in independent samples from the AEGIS-2 cohort. These results support that the airway epithelial field of lung cancer-associated injury in ever smokers extends to the nose and demonstrate the potential of using nasal gene expression as a noninvasive biomarker for lung cancer detection. © The Author 2017. Published by Oxford University Press. All rights reserved.

  20. Multi-fault clustering and diagnosis of gear system mined by spectrum entropy clustering based on higher order cumulants

    NASA Astrophysics Data System (ADS)

    Shao, Renping; Li, Jing; Hu, Wentao; Dong, Feifei

    2013-02-01

    Higher order cumulants (HOC) are a modern theory and technique for signal analysis. Spectrum entropy clustering (SEC) is a statistical data mining method that extracts useful characteristics from a mass of nonlinear and non-stationary data. Following a discussion of the characteristics of HOC theory and the SEC method, this paper introduces the associated signal processing techniques and the unique merits of nonlinear coupling characteristic analysis in processing random and non-stationary signals. A new clustering analysis and diagnosis method for detecting multiple damage types on gears is then proposed by introducing the combination of HOC and SEC into damage detection and diagnosis of the gear system. Noise is restrained by HOC, coupling features are extracted, and the characteristic signal is separated at different speeds and frequency bands. Under these circumstances, weak signal characteristics in the system are emphasized and multi-fault characteristics are extracted. The SEC data-mining method is used to analyze and diagnose six signals (no fault, short crack in the tooth root, long crack in the tooth root, short crack at the pitch circle, long crack at the pitch circle, and tooth wear) at running speeds of 300 r/min, 900 r/min, 1200 r/min, and 1500 r/min. The research shows that this combined detection and diagnosis method can also identify the degree of damage for some faults. On this basis, a virtual instrument for damage detection and fault diagnosis of the gear system is developed by combining the advantages of MATLAB and VC++, employing component object model technology, adopting mixed programming methods, and calling the program transformed from an *.m file under VC++. This software system provides functions for collecting and importing gear vibration signals, analyzing and processing signals, extracting features, visualizing graphics, detecting and diagnosing faults, and monitoring. Finally, testing and verification show that the developed system can effectively detect and diagnose faults in an actual operating gear transmission system.

  2. An expert panel-based study on recognition of gastro-esophageal reflux in difficult esophageal pH-impedance tracings.

    PubMed

    Smits, M J; Loots, C M; van Wijk, M P; Bredenoord, A J; Benninga, M A; Smout, A J P M

    2015-05-01

    Despite existing criteria for scoring gastro-esophageal reflux (GER) in esophageal multichannel pH-impedance measurement (pH-I) tracings, inter- and intra-rater variability is large and agreement with automated analysis is poor. We aimed to identify parameters of difficult-to-analyze pH-I patterns and to combine these into a statistical model that can identify GER episodes, with an international consensus as the gold standard. Twenty-one experts from 10 countries were asked to mark the presence of GER for adult and pediatric pH-I patterns in an online pre-assessment. During a consensus meeting, experts voted on patterns not reaching majority consensus (>70% agreement). Agreement was calculated between raters, between consensus and individual raters, and between consensus and software-generated automated analysis. With eight selected parameters, multiple logistic regression analysis was performed to derive an algorithm sensitive and specific for detection of GER. Majority consensus was reached for 35/79 episodes in the online pre-assessment (interrater κ = 0.332). Mean agreement between pre-assessment scores and final consensus was moderate (κ = 0.466). Combining eight pH-I parameters did not result in a statistically significant model able to identify the presence of GER. Recognizing a pattern as retrograde is the best indicator of GER, with 100% sensitivity and 81% specificity against expert consensus as the gold standard. Agreement between experts scoring difficult impedance patterns for the presence or absence of GER is poor. Combining several characteristics into a statistical model did not improve diagnostic accuracy. Only the parameter 'retrograde propagation pattern' is an indicator of GER in difficult pH-I patterns. © 2015 John Wiley & Sons Ltd.

  3. Improving the detection of pathways in genome-wide association studies by combined effects of SNPs from Linkage Disequilibrium blocks.

    PubMed

    Zhao, Huiying; Nyholt, Dale R; Yang, Yuanhao; Wang, Jihua; Yang, Yuedong

    2017-06-14

    Genome-wide association studies (GWAS) have successfully identified single variants associated with diseases. To increase the power of GWAS, gene-based and pathway-based tests are commonly employed to detect more risk factors. However, gene- and pathway-based association tests may be biased towards genes or pathways containing a large number of single-nucleotide polymorphisms (SNPs) with small P-values caused by high linkage disequilibrium (LD) correlations. To address such bias, numerous pathway-based methods have been developed. Here we propose a novel method, DGAT-path, that divides all SNPs assigned to genes in each pathway into LD blocks and sums the chi-square statistics of the LD blocks to assess the significance of the pathway by permutation tests. The method proved robust, with a type I error rate more than 1.6 times lower than that of other methods. Meanwhile, the method displays higher power and is not biased by pathway size. Applications to GWAS summary statistics for schizophrenia and breast cancer indicate that the detected top pathways contain more genes close to associated SNPs than those found by other methods. As a result, the method identified 17 and 12 significant pathways containing 20 and 21 novel associated genes, respectively, for the two diseases. The method is available online at http://sparks-lab.org/server/DGAT-path .
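
    The scoring idea, summing per-LD-block chi-square statistics over a pathway and assessing significance by permutation, can be sketched briefly. In the sketch below (Python), the block statistics are simulated rather than derived from GWAS summary statistics, and the block counts and signal sizes are assumptions.

    ```python
    # Pathway score = sum of per-LD-block chi-square statistics, with a
    # permutation null built from random block sets of the same size.
    import numpy as np

    rng = np.random.default_rng(9)
    genome_blocks = rng.chisquare(df=1, size=5000)  # all LD-block statistics
    pathway_idx = rng.choice(genome_blocks.size, size=40, replace=False)
    genome_blocks[pathway_idx[:5]] += 8.0           # plant some association signal

    observed = genome_blocks[pathway_idx].sum()
    null = np.array([
        genome_blocks[rng.choice(genome_blocks.size, size=40, replace=False)].sum()
        for _ in range(5000)
    ])
    p = (1 + np.sum(null >= observed)) / (1 + null.size)
    print(f"pathway statistic = {observed:.1f}, permutation p = {p:.4f}")
    ```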

  4. Reliability of unstable periodic orbit based control strategies in biological systems.

    PubMed

    Mishra, Nagender; Hasse, Maria; Biswal, B; Singh, Harinder P

    2015-04-01

    Presence of recurrent and statistically significant unstable periodic orbits (UPOs) in time series obtained from biological systems is now routinely used as evidence for low dimensional chaos. Extracting accurate dynamical information from the detected UPO trajectories is vital for successful control strategies that either aim to stabilize the system near the fixed point or steer the system away from the periodic orbits. A hybrid UPO detection method from return maps that combines topological recurrence criterion, matrix fit algorithm, and stringent criterion for fixed point location gives accurate and statistically significant UPOs even in the presence of significant noise. Geometry of the return map, frequency of UPOs visiting the same trajectory, length of the data set, strength of the noise, and degree of nonstationarity affect the efficacy of the proposed method. Results suggest that establishing determinism from unambiguous UPO detection is often possible in short data sets with significant noise, but derived dynamical properties are rarely accurate and adequate for controlling the dynamics around these UPOs. A repeat chaos control experiment on epileptic hippocampal slices through more stringent control strategy and adaptive UPO tracking is reinterpreted in this context through simulation of similar control experiments on an analogous but stochastic computer model of epileptic brain slices. Reproduction of equivalent results suggests that far more stringent criteria are needed for linking apparent success of control in such experiments with possible determinism in the underlying dynamics.

  5. Nondestructive Detection and Quantification of Blueberry Bruising using Near-infrared (NIR) Hyperspectral Reflectance Imaging

    PubMed Central

    Jiang, Yu; Li, Changying; Takeda, Fumiomi

    2016-01-01

    Currently, blueberry bruising is evaluated by either human visual/tactile inspection or firmness measurement instruments. These methods are destructive, time-consuming, and subjective. The goal of this paper was to develop a non-destructive approach for blueberry bruising detection and quantification. Experiments were conducted on 300 samples of southern highbush blueberry (Camellia, Rebel, and Star) and on 1500 samples of northern highbush blueberry (Bluecrop, Jersey, and Liberty) for hyperspectral imaging analysis, firmness measurement, and human evaluation. An algorithm was developed to automatically calculate a bruise ratio index (ratio of bruised to whole fruit area) for bruise quantification. The spectra of bruised and healthy tissues were statistically separated and the separation was independent of cultivars. Support vector machine (SVM) classification of the spectra from the regions of interest (ROIs) achieved over 94%, 92%, and 96% accuracy on the training set, independent testing set, and combined set, respectively. The statistical results showed that the bruise ratio index was equivalent to the measured firmness but better than the predicted firmness in regard to effectiveness of bruise quantification, and the bruise ratio index had a strong correlation with human assessment (R2 = 0.78 − 0.83). Therefore, the proposed approach and the bruise ratio index are effective to non-destructively detect and quantify blueberry bruising. PMID:27767050

  6. A T1 and DTI fused 3D corpus callosum analysis in pre- vs. post-season contact sports players

    NASA Astrophysics Data System (ADS)

    Lao, Yi; Law, Meng; Shi, Jie; Gajawelli, Niharika; Haas, Lauren; Wang, Yalin; Leporé, Natasha

    2015-01-01

    Sports-related traumatic brain injury (TBI) is a worldwide public health issue, and damage to the corpus callosum (CC) has been considered an important indicator of TBI. However, contact sports players suffer repeated hits to the head during the course of a season even in the absence of diagnosed concussion, and less is known about their effect on callosal anatomy. In addition, T1-weighted and diffusion tensor imaging (DTI) brain magnetic resonance data have typically been analyzed separately, but a joint analysis of both types of data may increase statistical power and give a more complete understanding of the anatomical correlates of subclinical concussions in these athletes. Here, for the first time, we fuse T1 surface-based morphometry and a new DTI analysis on 3D surface representations of the CC into a single statistical analysis of these subjects. Our combined method successfully increases the power to detect differences between pre- and post-season contact sports players. Alterations are found in the ventral genu, isthmus, and splenium of the CC. Our findings may inform future health assessments of contact sports players. The new method is also the first truly multimodal diffusion and T1-weighted analysis of the CC, and may be useful for detecting anatomical changes in the corpus callosum in other multimodal datasets.

  7. Application of Space Borne CO2 and Fluorescence Measurements to Detect Urban CO2 Emissions and Anthropogenic Influence on Vegetation

    NASA Astrophysics Data System (ADS)

    Paetzold, Johannes C.; Chen, Jia; Ruisinger, Veronika

    2017-04-01

    The Orbiting Carbon Observatory 2 (OCO-2) is a NASA satellite mission dedicated to making global, space-based observations of atmospheric, column-averaged carbon dioxide (XCO2). In addition, OCO-2 measures solar-induced chlorophyll fluorescence (SIF). In our research we have studied the combination of OCO-2's XCO2 and SIF measurements for numerous urban areas on different continents. Applying GIS and KML visualization techniques as well as statistical approaches, we are able to reliably detect anthropogenic CO2 emissions as CO2 column concentration enhancements over urban areas. Moreover, we detect SIF decreases over urban areas compared with their rural vicinities. We obtain these findings for urban areas on different continents and of diverse sizes, topographies, and urban construction. Our statistical analysis finds robust XCO2 enhancements of up to 3 ppm for urban areas in Europe, Asia and North America. Furthermore, the analysis of SIF indicates that urban construction, population density and seasonality influence urban vegetation in ways that can be observed from space. Additionally, we find that OCO-2's SIF measurements have the potential to identify and approximately map green areas within cities. For Berlin's Grunewald Forest as well as Mumbai's Sanjay Gandhi and Tungareshwar National Parks, we observe enhanced SIF at sub-city scales.

  8. Reliability of unstable periodic orbit based control strategies in biological systems

    NASA Astrophysics Data System (ADS)

    Mishra, Nagender; Hasse, Maria; Biswal, B.; Singh, Harinder P.

    2015-04-01

    The presence of recurrent and statistically significant unstable periodic orbits (UPOs) in time series obtained from biological systems is now routinely used as evidence for low-dimensional chaos. Extracting accurate dynamical information from detected UPO trajectories is vital for successful control strategies that either aim to stabilize the system near the fixed point or steer it away from the periodic orbits. A hybrid UPO detection method from return maps that combines a topological recurrence criterion, a matrix fit algorithm, and a stringent criterion for fixed point location yields accurate and statistically significant UPOs even in the presence of significant noise. The geometry of the return map, the frequency of UPOs visiting the same trajectory, the length of the data set, the strength of the noise, and the degree of nonstationarity all affect the efficacy of the proposed method. Results suggest that establishing determinism from unambiguous UPO detection is often possible in short data sets with significant noise, but the derived dynamical properties are rarely accurate and adequate for controlling the dynamics around these UPOs. A repeat chaos control experiment on epileptic hippocampal slices, using a more stringent control strategy and adaptive UPO tracking, is reinterpreted in this context through simulation of similar control experiments on an analogous but stochastic computer model of epileptic brain slices. Reproduction of equivalent results suggests that far more stringent criteria are needed before apparent success of control in such experiments can be linked to possible determinism in the underlying dynamics.

  9. A Practical Guide to Check the Consistency of Item Response Patterns in Clinical Research Through Person-Fit Statistics: Examples and a Computer Program.

    PubMed

    Meijer, Rob R; Niessen, A Susan M; Tendeiro, Jorge N

    2016-02-01

    Although there are many studies devoted to person-fit statistics for detecting inconsistent item score patterns, most are difficult to understand for nonspecialists. The aim of this tutorial is to explain the principles of these statistics for researchers and clinicians who are interested in applying them. In particular, we first explain how invalid test scores can be detected using person-fit statistics; second, we provide the reader with practical examples of existing studies that used person-fit statistics to detect and interpret inconsistent item score patterns; and third, we discuss a new R package that can be used to identify and interpret inconsistent score patterns. © The Author(s) 2015.
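
    As a flavor of what such statistics compute, here is a self-contained R illustration of the standardized lz person-fit statistic under a Rasch model with known item parameters; the item difficulties and response patterns are invented, and a real analysis would use a dedicated package rather than this toy function.

    ```r
    # Toy lz person-fit statistic: standardized log-likelihood of a response
    # pattern under Rasch item response probabilities.
    lz_statistic <- function(x, theta, b) {
      p  <- plogis(theta - b)                       # P(correct) per item
      l0 <- sum(x * log(p) + (1 - x) * log(1 - p))  # observed log-likelihood
      e  <- sum(p * log(p) + (1 - p) * log(1 - p))  # its expectation
      v  <- sum(p * (1 - p) * log(p / (1 - p))^2)   # its variance
      (l0 - e) / sqrt(v)                # large negative values flag misfit
    }

    set.seed(1)
    b     <- seq(-2, 2, length.out = 20)            # item difficulties
    theta <- 0.5                                    # examinee ability
    x_ok  <- rbinom(20, 1, plogis(theta - b))       # model-consistent pattern
    x_odd <- rev(x_ok)                              # reversed: likely aberrant
    c(consistent = lz_statistic(x_ok, theta, b),
      aberrant   = lz_statistic(x_odd, theta, b))
    ```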

  10. Genome-Wide Specific Selection in Three Domestic Sheep Breeds.

    PubMed

    Wang, Huihua; Zhang, Li; Cao, Jiaxve; Wu, Mingming; Ma, Xiaomeng; Liu, Zhen; Liu, Ruizao; Zhao, Fuping; Wei, Caihong; Du, Lixin

    2015-01-01

    Commercial sheep raised for mutton grow faster than traditional Chinese sheep breeds. Here, we aimed to evaluate genetic selection among three different sheep breeds: two well-known commercial mutton breeds and one indigenous Chinese breed. We first combined locus-specific branch length and di statistical methods to detect candidate regions targeted by selection in the three populations. The results showed that the genetic distances reached at least medium divergence for each pairwise combination. We found the two methods were highly correlated, and we identified many growth-related candidate genes undergoing artificial selection. For production traits, APOBR and FTO are associated with body mass index. For meat traits, ALDOA, STK32B and FAM190A are related to marbling. For reproduction traits, CCNB2 and SLC8A3 affect oocyte development. We also found that two well-known genes, GHR (which affects meat production and quality) and EDAR (associated with hair thickness), were detected in German mutton merino sheep. Furthermore, four genes (POL, RPL7, MSL1 and SHISA9) were associated with pre-weaning gain in our previous genome-wide association study. Our results indicate that combining locus-specific branch lengths with the di statistic can narrow the search regions for breed-specific selection, and the approach yielded many credible candidate genes that not only confirm previous reports but also provide a suite of novel candidates in defined breeds to guide crossbreeding.

  11. The assessment of data sources for influenza virologic surveillance in New York State.

    PubMed

    Escuyer, Kay L; Waters, Christine L; Gowie, Donna L; Maxted, Angie M; Farrell, Gregory M; Fuschino, Meghan E; St George, Kirsten

    2017-03-01

    Following the 2013 USA release of the Influenza Virologic Surveillance Right Size Roadmap, the New York State Department of Health (NYSDOH) embarked on an evaluation of data sources for influenza virologic surveillance. The objective was to assess NYS data sources, in addition to the data generated by the state public health laboratory (PHL), that could enhance influenza surveillance at the state and national level. Potential sources of laboratory test data for influenza were analyzed for quantity and quality. Computer models, designed to assess sample sizes and the confidence of data for statistical representation of influenza activity, were used to compare PHL test data with results from clinical and commercial laboratories reported between June 8, 2013 and May 31, 2014. Sample sizes tested for influenza at the state PHL were sufficient for situational awareness surveillance with optimal confidence levels only during peak weeks of the influenza season. Influenza data pooled from NYS PHLs and clinical laboratories generated optimal confidence levels for situational awareness throughout the influenza season. For novel influenza virus detection in NYS, combined real-time (rt) RT-PCR data from state and regional PHLs achieved ≥85% confidence during peak influenza activity, and ≥95% confidence for most of the low season and all of the off-season. In NYS, combined data from clinical, commercial, and public health laboratories generated optimal influenza surveillance for situational awareness throughout the season. Statistical confidence for novel virus detection, which relies on PHL data alone, was achieved for most of the year. © 2016 The Authors. Influenza and Other Respiratory Viruses Published by John Wiley & Sons Ltd.

  12. Automatic brain tumor detection in MRI: methodology and statistical validation

    NASA Astrophysics Data System (ADS)

    Iftekharuddin, Khan M.; Islam, Mohammad A.; Shaik, Jahangheer; Parra, Carlos; Ogg, Robert

    2005-04-01

    Automated brain tumor segmentation and detection are immensely important in medical diagnostics because they provide information about anatomical structures and potentially abnormal tissue that is necessary for surgical planning. In this work, we propose a novel automated brain tumor segmentation technique based on multiresolution texture information that combines fractional Brownian motion (fBm) and wavelet multiresolution analysis. Our wavelet-fractal technique combines the excellent multiresolution localization property of wavelets with the texture-characterization strength of fractals. We demonstrate the efficacy of our technique by successfully segmenting pediatric brain MR images (MRIs) from St. Jude Children's Research Hospital. We use a self-organizing map (SOM) as our clustering tool, exploiting both pixel intensity and multiresolution texture features to obtain the segmented tumor. Our test results show that the technique successfully segments abnormal brain tissues in a set of T1 images. In the next step, we design a classifier using a feed-forward (FF) neural network to statistically validate the presence of tumor in MRI using both the multiresolution texture and pixel intensity features. We estimate the receiver operating characteristic (ROC) curve from the true positive and false positive fractions obtained from our classifier at different threshold values. The ROC curve, which can be considered a gold standard for assessing the competence of a classifier, is used to ascertain the sensitivity and specificity of our classifier. We observe that at a threshold of 0.4 we achieve a true positive fraction of 1.0 (100%) while sacrificing a false positive fraction of only 0.16 (16%) for the set of 50 T1 MRIs analyzed in this experiment.
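
    The ROC construction described here is simple to reproduce; the sketch below sweeps a threshold over simulated classifier outputs and plots true versus false positive fractions (the scores are random stand-ins, not outputs of the authors' network).

    ```r
    # Sketch of an empirical ROC: sweep a threshold over classifier scores
    # and record true/false positive fractions at each setting.
    set.seed(1)
    truth  <- rep(c(1, 0), each = 25)                    # 1 = tumor present
    scores <- c(rnorm(25, 0.7, 0.15), rnorm(25, 0.4, 0.15))

    thresholds <- seq(0, 1, by = 0.05)
    roc <- t(sapply(thresholds, function(th) {
      pred <- as.numeric(scores >= th)
      c(tpf = mean(pred[truth == 1] == 1),  # true positive fraction
        fpf = mean(pred[truth == 0] == 1))  # false positive fraction
    }))
    plot(roc[, "fpf"], roc[, "tpf"], type = "b",
         xlab = "False positive fraction", ylab = "True positive fraction")
    ```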

  13. Imaging, object detection, and change detection with a polarized multistatic GPR array

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beer, N. Reginald; Paglieroni, David W.

    A polarized detection system performs imaging, object detection, and change detection factoring in the orientation of an object relative to the orientation of transceivers. The polarized detection system may operate in one of several modes of operation based on whether the imaging, object detection, or change detection is performed separately for each transceiver orientation. In combined change mode, the polarized detection system performs imaging, object detection, and change detection separately for each transceiver orientation, and then combines changes across polarizations. In combined object mode, the polarized detection system performs imaging and object detection separately for each transceiver orientation, and then combines objects across polarizations and performs change detection on the result. In combined image mode, the polarized detection system performs imaging separately for each transceiver orientation, and then combines images across polarizations and performs object detection followed by change detection on the result.

  14. Testing statistical self-similarity in the topology of river networks

    USGS Publications Warehouse

    Troutman, Brent M.; Mantilla, Ricardo; Gupta, Vijay K.

    2010-01-01

    Recent work has demonstrated that the topological properties of real river networks deviate significantly from predictions of Shreve's random model. At the same time, the property of mean self-similarity postulated by Tokunaga's model is well supported by data. Recently, a new class of network model called random self-similar networks (RSN), which combines self-similarity and randomness, has been introduced to replicate important topological features observed in real river networks. We investigate whether the hypothesis of statistical self-similarity in the RSN model is supported by data on a set of 30 basins located across the continental United States that encompass a wide range of hydroclimatic variability. We demonstrate that the generators of the RSN model obey a geometric distribution, and that self-similarity holds in a statistical sense in 26 of these 30 basins. The parameters describing the distributions of interior and exterior generators are shown to be statistically different, and the difference is shown to produce the well-known Hack's law. The inter-basin variability of RSN parameters is found to be statistically significant. We also test generator dependence on two climatic indices: mean annual precipitation and radiative index of dryness. Some indication of climatic influence on the generators is detected, but this influence is not statistically significant with the sample size available. Finally, two key applications of the RSN model to hydrology and geomorphology are briefly discussed.

  15. Statistical methods to estimate treatment effects from multichannel electroencephalography (EEG) data in clinical trials.

    PubMed

    Ma, Junshui; Wang, Shubing; Raubertas, Richard; Svetnik, Vladimir

    2010-07-15

    With the increasing popularity of using electroencephalography (EEG) to reveal treatment effects in drug development clinical trials, the vast volume and complex nature of EEG data make its analysis an intriguing but challenging topic. In this paper, the statistical analysis methods recommended by the EEG community, along with methods frequently used in the published literature, are first reviewed. A straightforward adjustment of the existing methods to handle multichannel EEG data is then introduced. In addition, based on the spatial smoothness of EEG data, a new category of statistical methods is proposed. The new methods use a linear combination of low-degree spherical harmonic (SPHARM) basis functions to represent a spatially smoothed version of the EEG data on the scalp, which is close to a sphere in shape. In total, seven statistical methods, including both existing and newly proposed methods, are applied to two clinical datasets to compare their power to detect a drug effect. Contrary to the EEG community's recommendations, our results suggest that (1) the nonparametric method does not outperform its parametric counterpart, and (2) including baseline data in the analysis does not always improve statistical power. In addition, our results recommend that (3) simple paired statistical tests should be avoided due to their poor power, and (4) the proposed spatially smoothed methods perform better than their unsmoothed versions. Copyright 2010 Elsevier B.V. All rights reserved.

  16. Detecting temporal change in freshwater fisheries surveys: statistical power and the important linkages between management questions and monitoring objectives

    USGS Publications Warehouse

    Wagner, Tyler; Irwin, Brian J.; James R. Bence,; Daniel B. Hayes,

    2016-01-01

    Monitoring to detect temporal trends in biological and habitat indices is a critical component of fisheries management. Thus, it is important that management objectives are linked to monitoring objectives. This linkage requires a definition of what constitutes a management-relevant “temporal trend.” It is also important to develop expectations for the amount of time required to detect a trend (i.e., statistical power) and to choose an appropriate statistical model for analysis. We provide an overview of temporal trends commonly encountered in fisheries management, review published studies that evaluated the statistical power of long-term trend detection, and illustrate dynamic linear models in a Bayesian context as an additional analytical approach focused on shorter-term change. We show that monitoring programs generally have low statistical power for detecting linear temporal trends and argue that management should often focus on different definitions of trends, some of which can be better addressed by alternative analytical approaches.
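
    A simulation-based power calculation of the kind reviewed here is short to write down; the following R sketch estimates the power to detect a linear trend in an annual index by refitting a regression to simulated series (the slope, noise level, and series lengths are hypothetical).

    ```r
    # Power to detect a linear temporal trend via simulation: fraction of
    # simulated series in which the slope is significant at alpha = 0.05.
    set.seed(1)
    power_for_trend <- function(n_years, slope, sigma, n_sim = 2000) {
      mean(replicate(n_sim, {
        t <- 1:n_years
        y <- 10 + slope * t + rnorm(n_years, sd = sigma)
        summary(lm(y ~ t))$coefficients["t", "Pr(>|t|)"] < 0.05
      }))
    }

    # Power grows slowly with series length when interannual noise is large.
    sapply(c(5, 10, 20), power_for_trend, slope = 0.1, sigma = 1)
    ```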

  17. Simulation of a Real-Time Brain Computer Interface for Detecting a Self-Paced Hitting Task.

    PubMed

    Hammad, Sofyan H; Kamavuako, Ernest N; Farina, Dario; Jensen, Winnie

    2016-12-01

    An invasive brain-computer interface (BCI) is a promising neurorehabilitation device for severely disabled patients. Although some systems have been shown to work well in restricted laboratory settings, their utility must be tested in less controlled, real-time environments. Our objective was to investigate whether a specific motor task could be reliably detected from multiunit intracortical signals from freely moving animals in a simulated real-time setting. Intracortical signals were first obtained from electrodes placed in the primary motor cortex of four rats that were trained to hit a retractable paddle (defined as a "Hit"). In the simulated real-time setting, the signal-to-noise ratio was first increased by wavelet denoising. Action potentials were then detected, and features were extracted (spike count, mean absolute value, entropy, and combinations of these features) within pre-defined time windows (200 ms, 300 ms, and 400 ms) to classify the occurrence of a "Hit." We found higher detection accuracy of a "Hit" (73.1%, 73.4%, and 67.9% for the three window sizes, respectively) when the decision was based on a combination of features rather than on a single feature. However, the effect of window length was not statistically significant (p = 0.5). Our results showed the feasibility of detecting a motor task in real time in a less restricted environment than those commonly applied in invasive BCI research, and of using information extracted from multiunit recordings, thereby avoiding the time-consuming and complex task of extracting and sorting single units. © 2016 International Neuromodulation Society.
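
    The window features named above are straightforward to compute; this R sketch extracts spike count, mean absolute value, and an amplitude-histogram entropy from 200-ms windows of a simulated trace (the threshold rule, bin count, and sampling rate are assumptions for illustration).

    ```r
    # Per-window feature extraction: spike count (threshold crossings),
    # mean absolute value, and Shannon entropy of the amplitude histogram.
    set.seed(1)
    fs     <- 1000                       # assumed sampling rate, Hz
    signal <- rnorm(2 * fs)              # 2 s of simulated band-passed signal

    window_features <- function(x, thresh = 2.5 * sd(x), n_bins = 16) {
      crossings <- which(diff(x > thresh) == 1)    # upward threshold crossings
      p <- hist(x, breaks = n_bins, plot = FALSE)$counts
      p <- p[p > 0] / sum(p)                       # amplitude-bin probabilities
      c(spike_count = length(crossings),
        mav         = mean(abs(x)),
        entropy     = -sum(p * log2(p)))
    }

    win <- 0.2 * fs                      # 200-ms windows, one size from the study
    idx <- split(seq_along(signal), ceiling(seq_along(signal) / win))
    t(sapply(idx, function(i) window_features(signal[i])))
    ```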

  18. LROC Investigation of Three Strategies for Reducing the Impact of Respiratory Motion on the Detection of Solitary Pulmonary Nodules in SPECT

    NASA Astrophysics Data System (ADS)

    Smyczynski, Mark S.; Gifford, Howard C.; Dey, Joyoni; Lehovich, Andre; McNamara, Joseph E.; Segars, W. Paul; King, Michael A.

    2016-02-01

    The objective of this investigation was to determine the effectiveness of three motion-reducing strategies in diminishing the degrading impact of respiratory motion on the detection of small solitary pulmonary nodules (SPNs) in single-photon emission computed tomographic (SPECT) imaging, in comparison with a standard clinical acquisition and the ideal case of imaging in the absence of respiratory motion. To do this, nonuniform rational B-spline cardiac-torso (NCAT) phantoms based on human-volunteer CT studies were generated spanning the respiratory cycle for a normal background distribution of Tc-99m NeoTect. Similarly, spherical phantoms of 1.0-cm diameter were generated to model small SPNs for each of 150 uniquely located sites within the lungs, whose respiratory motion was based on the motion of normal structures in the volunteer CT studies. The SIMIND Monte Carlo program was used to produce SPECT projection data from these. Normal and single-lesion-containing SPECT projection sets with a clinically realistic Poisson noise level were created for the cases of 1) the end-expiration (EE) frame with all counts, 2) respiration-averaged motion with all counts, 3) one fourth of the 32 frames centered around EE (Quarter Binning), 4) one half of the 32 frames centered around EE (Half Binning), and 5) eight temporally binned frames spanning the respiratory cycle. Each set of combined projection data was reconstructed with RBI-EM with system spatial-resolution compensation (RC). Based on the known motion for each of the 150 different lesions, the reconstructed volumes of respiratory bins were shifted so as to superimpose the locations of the SPN onto that in the first bin (Reconstruct and Shift). Five human observers performed localization receiver operating characteristic (LROC) studies of SPN detection. The observer results were analyzed for statistically significant differences in SPN detection accuracy among the three correction strategies, the standard acquisition, and the ideal case of the absence of respiratory motion. Our human-observer LROC study determined that the Quarter Binning and Half Binning strategies resulted in SPN detection accuracy statistically significantly below that of the standard clinical acquisition, whereas the Reconstruct and Shift strategy resulted in detection accuracy not statistically significantly different from that of the ideal case. This investigation demonstrates that tumor detection based on acquisitions using fewer than all of the available counts may result in poorer detection despite limiting the motion of the lesion. The Reconstruct and Shift method results in tumor detection that is equivalent to ideal motion correction.

  19. Artificial neural networks in gynaecological diseases: current and potential future applications.

    PubMed

    Siristatidis, Charalampos S; Chrelias, Charalampos; Pouliakis, Abraham; Katsimanis, Evangelos; Kassanos, Dimitrios

    2010-10-01

    Current (and probably future) practice of medicine is largely concerned with prediction and accurate diagnosis. In clinical practice especially, there is increasing interest in constructing and using valid models for diagnosis and prediction. Artificial neural networks (ANNs) are mathematical systems used as a prospective tool for reliable, flexible and quick assessment. They demonstrate high power in evaluating multifactorial data, assimilating information from multiple sources and detecting subtle and complex patterns. Their capability, and their difference from other statistical techniques, lies in performing nonlinear statistical modelling. They represent a new alternative to logistic regression, the method most commonly used to develop predictive models for outcomes in medicine. In combination with other non-algorithmic artificial intelligence techniques, they provide useful software engineering tools for the development of systems in quantitative medicine. Our paper first presents a brief introduction to ANNs; then, using what we consider the best available evidence, presented through paradigms, we evaluate the ability of these networks to serve as first-line detection and prediction techniques in some of the most crucial fields in gynaecology. Finally, through analysis of their current applications, we explore their potential for future use.

  20. Spiral CT scanning technique in the detection of aspiration of LEGO foreign bodies.

    PubMed

    Applegate, K E; Dardinger, J T; Lieber, M L; Herts, B R; Davros, W J; Obuchowski, N A; Maneker, A

    2001-12-01

    Radiolucent foreign bodies (FBs) such as plastic objects and toys remain difficult to identify on conventional radiographs of the neck and chest. Children may present with a variety of respiratory complaints, which may or may not be due to an FB. Our aims were to determine whether radiolucent FBs such as plastic LEGOs and peanuts can be seen in the tracheobronchial tree or esophagus using low-dose spiral CT and, if visible, to determine the optimal CT imaging technique. Multiple spiral sequences were performed while varying the CT parameters and the presence and location of FBs in either the trachea or the esophagus, first on a neck phantom and then on a cadaver. Sequences were rated by three radiologists blinded to the presence of an FB using a single scoring system. The LEGO was well visualized in the trachea by all three readers (both lung and soft-tissue windowing: combined sensitivity 89%, combined specificity 89%) and to a lesser extent in the esophagus (combined sensitivity 31%, combined specificity 100%). The peanut was not well visualized (combined sensitivity < 35%). The optimal technique for visualizing the LEGO was 120 kV, 90 mA, 3-mm collimation, 0.75 s/revolution, and 2.0 pitch. This allowed coverage of the cadaver tracheobronchial tree (approximately 11 cm) in about 18 s. Although statistical power was low for detecting significant differences, all three readers noted higher average confidence ratings with lung windowing among the 18 LEGO-in-trachea scans. Rapid, low-dose spiral CT may be used to visualize LEGO FBs in the airway or esophagus. Peanuts were not well visualized.

  1. Comparison of two passive warming devices for prevention of perioperative hypothermia in dogs.

    PubMed

    Potter, J; Murrell, J; MacFarlane, P

    2015-09-01

    To compare the effects of two passive warming methods, combined with a resistive heating mat, on perioperative hypothermia in dogs. Fifty-two dogs were enrolled and randomly allocated to receive a reflective blanket (Blizzard Blanket) or a fabric blanket (VetBed). In addition, in the operating room all dogs were placed onto a table with a resistive heating mat covered with a fabric blanket. Rectal temperature measurements were taken at defined points. Statistical analysis compared all Blizzard Blanket-treated with all VetBed-treated dogs; VetBed with Blizzard Blanket dogs within the spay and castrate groups; spay with castrate groups; and groups under versus over 10 kg bodyweight. Data from 39 dogs were used for analysis. All dogs showed a reduction in perioperative rectal temperature. No statistically significant differences were detected between treatments or between the different groups. This study supports previous data on the prevalence of hypothermia during surgery. The combination of active and passive warming methods used in this study prevented the development of severe hypothermia, but there were no differences between treatment groups. © 2015 British Small Animal Veterinary Association.

  2. Decompression models: review, relevance and validation capabilities.

    PubMed

    Hugon, J

    2014-01-01

    For more than a century, several types of mathematical models have been proposed to describe tissue desaturation mechanisms in order to limit decompression sickness (DCS). These models are statistically assessed against DCS cases and, over time, have gradually incorporated the biophysics of bubble formation. This paper reviews this evolution and discusses its limitations. The review is organized around a comparison of decompression models' biophysical criteria and theoretical foundations. The DCS-predictive capability was then analyzed to assess whether it could be improved by combining different approaches. Most operational decompression models have a neo-Haldanian form. Nevertheless, bubble modeling has been gaining popularity, and the circulating bubble amount has become a major output. By merging both views, it seems possible to build a relevant global decompression model that simulates bubble production while predicting DCS risk for all types of exposure and decompression profiles. A statistical approach combining both DCS and bubble detection databases must be developed to calibrate such a global decompression model. Doppler ultrasound and DCS data are essential: i. to make the correlation and validation phases reliable; ii. to adjust the biophysical criteria to best fit the observed bubble kinetics; and iii. to build a relevant risk function.

  3. Ensemble stacking mitigates biases in inference of synaptic connectivity.

    PubMed

    Chambers, Brendan; Levy, Maayan; Dechery, Joseph B; MacLean, Jason N

    2018-01-01

    A promising alternative to directly measuring the anatomical connections in a neuronal population is inferring the connections from the activity. We employ simulated spiking neuronal networks to compare and contrast commonly used inference methods that identify likely excitatory synaptic connections using statistical regularities in spike timing. We find that simple adjustments to standard algorithms improve inference accuracy: A signing procedure improves the power of unsigned mutual-information-based approaches and a correction that accounts for differences in mean and variance of background timing relationships, such as those expected to be induced by heterogeneous firing rates, increases the sensitivity of frequency-based methods. We also find that different inference methods reveal distinct subsets of the synaptic network and each method exhibits different biases in the accurate detection of reciprocity and local clustering. To correct for errors and biases specific to single inference algorithms, we combine methods into an ensemble. Ensemble predictions, generated as a linear combination of multiple inference algorithms, are more sensitive than the best individual measures alone, and are more faithful to ground-truth statistics of connectivity, mitigating biases specific to single inference methods. These weightings generalize across simulated datasets, emphasizing the potential for the broad utility of ensemble-based approaches.
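
    A minimal version of the stacking step might look like the R sketch below: several noisy "inference scores" for candidate connections are combined by a logistic regression fit on held-out ground truth. The scores, labels, and the choice of logistic stacking are illustrative assumptions, not the authors' exact pipeline.

    ```r
    # Ensemble stacking sketch: learn a linear combination of inference
    # scores against ground-truth connections, then score held-out pairs.
    set.seed(1)
    n_pairs <- 1000
    truth   <- rbinom(n_pairs, 1, 0.1)           # 1 = true synaptic connection

    # Three imperfect "inference methods", each informative but noisy.
    d <- data.frame(truth,
                    s1 = truth + rnorm(n_pairs, sd = 1.0),
                    s2 = truth + rnorm(n_pairs, sd = 1.5),
                    s3 = truth + rnorm(n_pairs, sd = 2.0))

    train <- sample(n_pairs, 500)
    stack <- glm(truth ~ s1 + s2 + s3, family = binomial, data = d[train, ])

    # Held-out ensemble score: a learned weighting of the individual methods.
    ens <- predict(stack, d[-train, ], type = "response")
    cor(ens, d$truth[-train])
    ```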

  4. Indirect Estimation of Radioactivity in Containerized Cargo

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jarman, Kenneth D.; Scherrer, Chad; Smith, Eric L.

    Detecting illicit nuclear and radiological material in containerized cargo challenges the state of the art in detection systems. Current systems are being evaluated and new systems envisioned to address the need for the high probability of detection necessary to thwart potential threats, together with the extremely low nuisance and false alarm rates necessary to maintain the flow of commerce given the enormous volume of commodities imported in shipping containers. Maintaining the flow of commerce also means that primary inspection must be rapid, requiring relatively indirect measurements of cargo from outside the containers. With increasing information content in such indirect measurements, it is natural to ask how the information might be combined to improve detection. Toward this end, we present an approach to estimating the isotopic activity of naturally occurring radioactive material in cargo grouped by commodity type, combining container manifest data with radiography and gamma spectroscopy aligned to location along the container. The heart of this approach is our statistical model of gamma counts within peak regions of interest, which captures the effects of background suppression, counting noise, convolution of neighboring cargo contributions, and down-scattered photons to provide physically constrained estimates of counts due to decay of specific radioisotopes in cargo alone. Coupled to that model, we use a mechanistic model of self-attenuated radiation flux to estimate the isotopic activity within cargo, segmented by location within each container, that produces those counts. We demonstrate our approach by applying it to a set of measurements taken at the Port of Seattle in 2006. This approach to synthesizing disparate available data streams and extracting cargo characteristics holds the potential to improve primary inspection using current detection capabilities and to enable simulation-based evaluation of new candidate detection systems.

  5. The value of subtraction MRI in detection of amyloid-related imaging abnormalities with oedema or effusion in Alzheimer's patients: An interobserver study.

    PubMed

    Martens, Roland M; Bechten, Arianne; Ingala, Silvia; van Schijndel, Ronald A; Machado, Vania B; de Jong, Marcus C; Sanchez, Esther; Purcell, Derk; Arrighi, Michael H; Brashear, Robert H; Wattjes, Mike P; Barkhof, Frederik

    2018-03-01

    Immunotherapeutic treatments targeting amyloid-β plaques in Alzheimer's disease (AD) are associated with the presence of amyloid-related imaging abnormalities with oedema or effusion (ARIA-E), whose detection and classification are crucial for evaluating subjects enrolled in clinical trials. The aim was to investigate the applicability of subtraction MRI to ARIA-E detection using an established ARIA-E rating scale. We included 75 AD patients receiving bapineuzumab treatment, including 29 ARIA-E cases. Five neuroradiologists rated their brain MRI scans with and without subtraction images. The accuracy of evaluating the presence of ARIA-E, the intraclass correlation coefficient (ICC), and specific agreement were calculated. Subtraction resulted in higher sensitivity (0.966) and lower specificity (0.970) than native images (0.959 and 0.991, respectively). Individual rater detection was excellent. ICC scores ranged from excellent to good, except for gyral swelling (moderate). Excellent negative and good positive specific agreement among all ARIA-E imaging features was found in both groups. Combining sulcal hyperintensity and gyral swelling significantly increased positive agreement for subtraction images. Subtraction MRI has potential as a visual aid to increase the sensitivity of ARIA-E assessment; however, isotropic acquisition and enhanced training are required to improve its usefulness. The ARIA-E rating scale may benefit from combining sulcal hyperintensity and swelling. • Subtraction can improve detection of amyloid-related imaging abnormalities with oedema/effusion in Alzheimer's patients. • The value of ARIA-E detection, classification and monitoring using subtraction was assessed. • Validation of an established ARIA-E rating scale, with recommendations for improvement, is reported. • Complementary statistical methods were employed to measure accuracy, inter-rater reliability and specific agreement.

  6. Effects of liquid chromatography mobile phases and buffer salts on phosphorus inductively coupled plasma atomic emission and mass spectrometries utilizing ultrasonic nebulization and membrane desolvation.

    PubMed

    Carr, John E; Kwok, Kaho; Webster, Gregory K; Carnahan, Jon W

    2006-01-23

    Atomic spectrometry, specifically inductively coupled plasma atomic emission spectrometry (ICP-AES) and mass spectrometry (ICP-MS), shows promise for heteroatom-based detection of pharmaceutical compounds. The combination of ultrasonic nebulization (USN) with membrane desolvation (MD) greatly enhances detection limits with these approaches. Because pharmaceutical analyses often incorporate liquid chromatography, the study herein was performed to examine the effects of solvent composition on the analytical behavior of these approaches. The target analyte was phosphorus, introduced as phosphomycin. AES response was examined at the 253.7 nm atom line, and mass 31 ions were monitored for the MS experiments. With pure aqueous solutions, detection limits of 5 ppb (0.5 ng in 0.1 mL injection volumes) were obtained with ICP-MS. The ICP-AES detection limit was 150 ppb. Solvent compositions were varied from 0 to 80% organic (acetonitrile and methanol) with nine buffers at concentrations typically used in liquid chromatography. In general, solvents and buffers had statistically significant, albeit small, effects on ICP-AES sensitivities. A few exceptions occurred in cases where typical liquid chromatography buffer concentrations produced higher mass loadings on the plasma. Indications are that isocratic separations can be reliably performed. Within reasonable accuracy tolerances, it appears that gradient chromatography can be performed without the need for signal response normalization. Organic solvent and buffer effects were more significant with ICP-MS. Sensitivities varied significantly with different buffers and organic solvent content. In these cases, gradient chromatography will require careful analytical calibration as solvent and buffer content is varied. However, for most buffer and solvent combinations, signal and detection limits are only moderately affected. Isocratic separations and detection are feasible.

  7. Comparative hygienic assessment of active ingredients content in the air environment after treatment of cereal spiked crops by combined fungicides.

    PubMed

    Kondratiuk, Mykola; Blagaia, Anna; Pelo, Ihor

    2018-01-01

    Introduction: The quality of the air environment significantly affects the health of the population. In spring and summer, chemical plant protection products may be the main pollutants of the air environment in rural areas. Chemical plant protection products are dangerous substances of anthropogenic origin, and when pesticides are applied at high concentrations, the risk of poisoning increases for workers in direct contact with the active ingredients. The aim: To perform a comparative hygienic assessment of active-ingredient content in the air environment after treatment of cereal spiked crops with combined fungicides. Materials and methods: The materials of the research were the active ingredients of the studied combined fungicides, samples of air, and swabs from workers' skin and strips from overalls. Methods of full-scale in-field hygienic experiment, gas-liquid chromatography, high-performance liquid chromatography, as well as statistical and bibliographic methods, were used. Results and conclusions: Active ingredients of the studied combined fungicides were not detected in the working-zone air or atmospheric air at levels exceeding the limits of detection of the appropriate chromatography methods. The findings confirm the safety of the air environment for agricultural workers and the rural population if the studied combined fungicides are applied at the hygienically approved application rates and in accordance with good agricultural practice. However, the possible complex risk for workers after application of certain studied fungicides may be higher than acceptable due to elevated values for dermal effects. The complex risk was higher than acceptable in the case of aerial spraying of both studied fungicides, while only one combination of active ingredients posed a possible risk for workers applying fungicides by the rod method of cereal spiked crop treatment.

  8. Detection of Test Collusion via Kullback-Leibler Divergence

    ERIC Educational Resources Information Center

    Belov, Dmitry I.

    2013-01-01

    The development of statistical methods for detecting test collusion is a new research direction in the area of test security. Test collusion may be described as large-scale sharing of test materials, including answers to test items. Current methods of detecting test collusion are based on statistics also used in answer-copying detection.…

  9. Statistical Lamb wave localization based on extreme value theory

    NASA Astrophysics Data System (ADS)

    Harley, Joel B.

    2018-04-01

    Guided wave localization methods based on delay-and-sum imaging, matched field processing, and other techniques have been designed and researched to create images that locate and describe structural damage. The maximum value of such an image typically represents an estimated damage location. Yet it is often unclear whether this maximum value, or any other value in the image, is a statistically significant indicator of damage. Furthermore, there are currently few, if any, approaches for assessing the statistical significance of guided wave localization images. We therefore present statistical delay-and-sum and statistical matched field processing localization methods that create statistically significant images of damage. Our framework uses constant false alarm rate statistics and extreme value theory to detect damage with little prior information. We demonstrate our methods with in situ guided wave data from an aluminum plate to detect two 0.75-cm-diameter holes. Our results show the expected improvement in statistical significance as the number of sensors increases. With seventeen sensors, both methods successfully detect damage with statistical significance.
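
    The extreme-value step can be sketched compactly: fit a Gumbel law to the maxima of damage-free images and declare a new image maximum significant if it exceeds a high quantile of that law. The noise model, number of baseline images, and moment-based fit below are assumptions for illustration.

    ```r
    # Extreme value threshold sketch: Gumbel fit to image maxima from a
    # damage-free baseline, then a 99% detection threshold.
    set.seed(1)
    n_pixels    <- 5000
    null_maxima <- replicate(200, max(rnorm(n_pixels)))  # baseline image maxima

    # Method-of-moments Gumbel fit (Euler-Mascheroni constant ~ 0.5772).
    beta <- sd(null_maxima) * sqrt(6) / pi
    mu   <- mean(null_maxima) - 0.5772 * beta
    threshold <- mu - beta * log(-log(0.99))             # Gumbel 99% quantile

    new_image_max <- max(rnorm(n_pixels)) + 1.5          # simulated damage bump
    new_image_max > threshold                            # significant detection?
    ```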

  10. A scan statistic for binary outcome based on hypergeometric probability model, with an application to detecting spatial clusters of Japanese encephalitis.

    PubMed

    Zhao, Xing; Zhou, Xiao-Hua; Feng, Zijian; Guo, Pengfei; He, Hongyan; Zhang, Tao; Duan, Lei; Li, Xiaosong

    2013-01-01

    As a useful tool for geographical cluster detection of events, the spatial scan statistic is widely applied in many fields and plays an increasingly important role. The classic version of the spatial scan statistic for binary outcomes was developed by Kulldorff, based on the Bernoulli or the Poisson probability model. In this paper, we apply the hypergeometric probability model to construct the likelihood function under the null hypothesis. Compared with existing methods, constructing the likelihood function under the null hypothesis is an alternative, indirect way to identify the potential cluster, and the test statistic is the extreme value of that likelihood function. As with Kulldorff's methods, we adopt a Monte Carlo test for the assessment of significance. Both methods are applied to detect spatial clusters of Japanese encephalitis in Sichuan province, China, in 2009, and the detected clusters are identical. Simulation on independent benchmark data indicates that the test statistic based on the hypergeometric model outperforms Kulldorff's statistics for clusters of high population density or large size; otherwise, Kulldorff's statistics are superior.
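
    The following toy R sketch conveys the mechanics (it is not the paper's exact construction): each candidate window is scored by the hypergeometric probability of its case count given its population share, and the most extreme score is calibrated by Monte Carlo replication under the null. Region counts, window shapes, and replicate numbers are invented.

    ```r
    # Toy hypergeometric scan: score windows by -log dhyper() and calibrate
    # the maximum score with Monte Carlo simulation under the null.
    set.seed(1)
    n_regions <- 30
    pop   <- rpois(n_regions, 2000)              # region populations
    cases <- rbinom(n_regions, pop, 0.01)        # observed cases
    C <- sum(cases); N <- sum(pop)

    window_score <- function(idx, cases, pop)
      -dhyper(sum(cases[idx]), C, N - C, sum(pop[idx]), log = TRUE)

    # Candidate windows: runs of 1 to 5 adjacent regions.
    windows <- unlist(lapply(1:5, function(w)
      lapply(1:(n_regions - w + 1), function(s) s:(s + w - 1))),
      recursive = FALSE)
    T_obs <- max(sapply(windows, window_score, cases = cases, pop = pop))

    # Null: the C cases fall on regions in proportion to population.
    T_null <- replicate(499, {
      sim <- rmultinom(1, C, pop)[, 1]
      max(sapply(windows, window_score, cases = sim, pop = pop))
    })
    (sum(T_null >= T_obs) + 1) / 500             # Monte Carlo p-value
    ```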

  11. Explanatory Supplement to the AllWISE Data Release Products

    NASA Astrophysics Data System (ADS)

    Cutri, R. M.; Wright, E. L.; Conrow, T.; Fowler, J. W.; Eisenhardt, P. R. M.; Grillmair, C.; Kirkpatrick, J. D.; Masci, F.; McCallon, H. L.; Wheelock, S. L.; Fajardo-Acosta, S.; Yan, L.; Benford, D.; Harbut, M.; Jarrett, T.; Lake, S.; Leisawitz, D.; Ressler, M. E.; Stanford, S. A.; Tsai, C. W.; Liu, F.; Helou, G.; Mainzer, A.; Gettings, D.; Gonzalez, A.; Hoffman, D.; Marsh, K. A.; Padgett, D.; Skrutskie, M. F.; Beck, R. P.; Papin, M.; Wittman, M.

    2013-11-01

    The AllWISE program builds upon the successful Wide-field Infrared Survey Explorer (WISE; Wright et al. 2010) mission by combining data from all WISE and NEOWISE (Mainzer et al. 2011) survey phases to form the most comprehensive view of the mid-infrared sky currently available. By combining the data from two complete sky coverage epochs in an advanced data processing system, AllWISE has generated new products that have enhanced photometric sensitivity and accuracy, and improved astrometric precision compared with the earlier WISE All-Sky Data Release. Exploiting the 6 month baseline between the WISE sky coverage epochs enables AllWISE to measure source motions for the first time, and to compute improved flux variability statistics. AllWISE data release products include: a Source Catalog that contains 4-band fluxes, positions, apparent motion measurements, and flux variability statistics for over 747 million objects detected at SNR>5 in the combined exposures; a Multiepoch Photometry Database containing over 42 billion time-tagged, single-exposure fluxes for each object detected on the combined exposures; and an Image Atlas of 18,240 4-band calibrated FITS images, depth-of-coverage and noise maps that cover the sky produced by coadding nearly 7.9 million single-exposure images from the cryogenic and post-cryogenic survey phases. The Explanatory Supplement to the AllWISE Data Release Products is a general guide for users of the AllWISE data. The Supplement contains detailed descriptions of the format and characteristics of the AllWISE data products, as well as a summary of cautionary notes that describe known limitations. The Supplement is an on-line document that is updated frequently to provide the most current information for users of the AllWISE data products. The Explanatory Supplement is maintained at: http://wise2.ipac.caltech.edu/docs/release/allwise/expsup/index.html AllWISE makes use of data from WISE, which is a joint project of the University of California, Los Angeles, and the Jet Propulsion Laboratory/California Institute of Technology, and NEOWISE, which is a project of the Jet Propulsion Laboratory/California Institute of Technology. WISE and NEOWISE are funded by the National Aeronautics and Space Administration.

  12. Using Statistical Process Control for detecting anomalies in multivariate spatiotemporal Earth Observations

    NASA Astrophysics Data System (ADS)

    Flach, Milan; Mahecha, Miguel; Gans, Fabian; Rodner, Erik; Bodesheim, Paul; Guanche-Garcia, Yanira; Brenning, Alexander; Denzler, Joachim; Reichstein, Markus

    2016-04-01

    The number of available Earth observations (EOs) is currently increasing substantially. Detecting anomalous patterns in these multivariate time series is an important step in identifying changes in the underlying dynamical system. Likewise, data quality issues may produce anomalous multivariate data constellations that must be identified before they corrupt subsequent analyses. In industrial applications, a common strategy is to monitor production chains with several sensors coupled to a statistical process control (SPC) algorithm. The basic idea is to raise an alarm when the sensor data depict an anomalous pattern according to the SPC, i.e., the production chain is considered 'out of control'. These industrial applications are conceptually similar to the on-line monitoring of EOs. However, algorithms used in the context of SPC or process monitoring are rarely considered for supervising multivariate spatio-temporal Earth observations. The objective of this study is to exploit the potential and transferability of SPC concepts for Earth system applications. We compare a range of algorithms typically applied in SPC systems and evaluate their capability to detect, e.g., known extreme events in land surface processes. Specifically, two main issues are addressed: (1) identifying the most suitable combination of data pre-processing and detection algorithm for a specific type of event, and (2) analyzing the limits of the individual approaches with respect to the magnitude and spatio-temporal size of the event as well as the data's signal-to-noise ratio. Extensive artificial data sets that represent the typical properties of Earth observations are used in this study. Our results show that the majority of the algorithms can be considered for the detection of multivariate spatiotemporal events and can be directly transferred to real Earth observation data, as currently assembled in different projects at the European scale, e.g. http://baci-h2020.eu/index.php/ and http://earthsystemdatacube.net/. Known anomalies such as the Russian heatwave are detected, as well as anomalies that are not detectable with univariate methods.
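
    As a concrete example of transferring an SPC tool, the sketch below runs a Hotelling T-squared chart over a simulated three-variable EO stream: time steps whose joint state is far from an in-control baseline are flagged. The dimensionality, control limit, and injected anomaly are illustrative assumptions.

    ```r
    # Hotelling T-squared control chart on a simulated multivariate stream.
    set.seed(1)
    p <- 3; n_ref <- 300; n_new <- 100
    reference <- matrix(rnorm(n_ref * p), ncol = p)   # in-control baseline
    incoming  <- matrix(rnorm(n_new * p), ncol = p)
    incoming[60:70, ] <- incoming[60:70, ] + 3        # injected anomaly

    center <- colMeans(reference)
    S      <- cov(reference)
    T2     <- mahalanobis(incoming, center, S)        # squared distances

    limit <- qchisq(0.999, df = p)                    # approximate control limit
    which(T2 > limit)                                 # "out of control" steps
    ```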

  13. Object recognition with hierarchical discriminant saliency networks.

    PubMed

    Han, Sunhyoung; Vasconcelos, Nuno

    2014-01-01

    The benefits of integrating attention and object recognition are investigated. While attention is frequently modeled as a pre-processor for recognition, we investigate the hypothesis that attention is an intrinsic component of recognition and vice-versa. This hypothesis is tested with a recognition model, the hierarchical discriminant saliency network (HDSN), whose layers are top-down saliency detectors, tuned for a visual class according to the principles of discriminant saliency. As a model of neural computation, the HDSN has two possible implementations. In a biologically plausible implementation, all layers comply with the standard neurophysiological model of visual cortex, with sub-layers of simple and complex units that implement a combination of filtering, divisive normalization, pooling, and non-linearities. In a convolutional neural network implementation, all layers are convolutional and implement a combination of filtering, rectification, and pooling. The rectification is performed with a parametric extension of the now popular rectified linear units (ReLUs), whose parameters can be tuned for the detection of target object classes. This enables a number of functional enhancements over neural network models that lack a connection to saliency, including optimal feature denoising mechanisms for recognition, modulation of saliency responses by the discriminant power of the underlying features, and the ability to detect both feature presence and absence. In either implementation, each layer has a precise statistical interpretation, and all parameters are tuned by statistical learning. Each saliency detection layer learns more discriminant saliency templates than its predecessors and higher layers have larger pooling fields. This enables the HDSN to simultaneously achieve high selectivity to target object classes and invariance. The performance of the network in saliency and object recognition tasks is compared to those of models from the biological and computer vision literatures. This demonstrates benefits for all the functional enhancements of the HDSN, the class tuning inherent to discriminant saliency, and saliency layers based on templates of increasing target selectivity and invariance. Altogether, these experiments suggest that there are non-trivial benefits in integrating attention and recognition.

  14. Combining the Power of Statistical Analyses and Community Interviews to Identify Adoption Barriers for Stormwater Best-Management Practices

    NASA Astrophysics Data System (ADS)

    Hoover, F. A.; Bowling, L. C.; Prokopy, L. S.

    2015-12-01

    Urban stormwater is an ongoing management concern in municipalities of all sizes. In both combined and separated sewer systems, pollutants from stormwater runoff enter the natural waterway system during heavy rain events. Urban flooding during frequent and more intense storms is also a growing concern. Therefore, stormwater best-management practices (BMPs) are being implemented in efforts to reduce and manage stormwater pollution and overflow. The majority of BMP water quality studies focus on the small-scale, individual effects of the BMP and the change in water quality directly in the runoff from these infrastructures. At the watershed scale, it is difficult to establish statistically whether or not these BMPs are making a difference in water quality, given that watershed-scale monitoring is often costly and time-consuming, relying on significant funding that a city may not have. Hence, there is a need to quantify the level of sampling needed to detect the water quality impact of BMPs at the watershed scale. In this study, a power analysis was performed on data from an urban watershed in Lafayette, Indiana, to determine the frequency of sampling required to detect a significant change in water quality measurements. Using the R platform, results indicate that detecting a significant change in watershed-level water quality would require hundreds of weekly measurements, even when improvement is present. The second part of this study investigates whether the difficulty in demonstrating water quality change represents a barrier to adoption of stormwater BMPs. Semi-structured interviews with community residents and organizations in Chicago, IL are being used to investigate residents' understanding of water quality and best-management practices and to identify their attitudes and perceptions towards stormwater BMPs. Second-round interviews will examine how information on uncertainty in water quality improvements influences their BMP attitudes and perceptions.
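
    For readers who want the flavor of such a calculation, a back-of-envelope R version follows; the effect size and variance are hypothetical stand-ins, not the study's estimates, but they reproduce the order of magnitude of the conclusion.

    ```r
    # Sample size for detecting a small improvement in a noisy water quality
    # measure (hypothetical effect size of 0.2 standard deviations).
    power.t.test(delta = 0.2, sd = 1, sig.level = 0.05, power = 0.8)
    # n is roughly 400 per group -- on a weekly schedule, years of monitoring,
    # consistent with the "hundreds of weekly measurements" reported above.
    ```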

  15. Effects of hydrogeological and anthropogenic factors on the distribution of CVOCs in eogenetic karst aquifers

    NASA Astrophysics Data System (ADS)

    Torres Torres, N. I.; Padilla, I. Y.; Rivera, V. L.

    2016-12-01

    Eogenetic karst aquifers are characterized by well-developed conduit networks within a rock matrix having significant primary porosity and permeability. These aquifers are highly productive and serve as an important source of water for multiple uses. As a consequence, eogenetic karst regions are attractive for industrial, urban, and agricultural development that can introduce contamination sources to the aquifers. It is hypothesized that the distribution of contaminants in these aquifers is influenced by combined characteristics of sources and hydrogeological features. This research assesses the spatio-temporal distribution of chlorinated volatile organic compounds (CVOCs) in the eogenetic karst aquifers of northern Puerto Rico (NPR) and studies the correlation between hydrogeological and anthropogenic variables and groundwater contamination using Geographic Information System and statistical methods. CVOCs, which are used as dry cleaning and industrial solvents, degreasers, and paint or spot removers, are among the most commonly found groundwater contaminants in the world. The NPR karst aquifers have been heavily impacted by land development and groundwater contamination, particularly by CVOCs, with trichloroethylene, tetrachloroethylene, and carbon tetrachloride among the most frequently detected contaminants. The analysis shows that 62% of the samples and 78% of the sites sampled show the presence of one or more CVOCs, and that concentrations vary with time. Detections and concentrations of certain CVOCs are associated with some sources of known contamination. Significant presence of CVOCs is also found near developed and agricultural land uses. The shallow aquifer shows greater presence of CVOCs (66%) than the confined aquifer (16%), with most detections occurring in areas of low and medium sinkhole coverage and medium hydraulic conductivity. Multivariate statistical analysis indicates that the distribution of CVOCs in the karst aquifers of NPR is indeed influenced by a combination of contaminant sources and hydrogeological factors. These factors, which facilitate the entry of contaminants into the system and promote their transport and storage, have resulted in extensive spatial and temporal contamination of eogenetic karst groundwater systems, such as those found in northern Puerto Rico.

  16. Detection of early pancreatic ductal adenocarcinoma with thrombospondin-2 and CA19-9 blood markers.

    PubMed

    Kim, Jungsun; Bamlet, William R; Oberg, Ann L; Chaffee, Kari G; Donahue, Greg; Cao, Xing-Jun; Chari, Suresh; Garcia, Benjamin A; Petersen, Gloria M; Zaret, Kenneth S

    2017-07-12

    Markers are needed to facilitate early detection of pancreatic ductal adenocarcinoma (PDAC), which is often diagnosed too late for effective therapy. Starting with a PDAC cell reprogramming model that recapitulated the progression of human PDAC, we identified secreted proteins and tested a subset as potential markers of PDAC. We optimized an enzyme-linked immunosorbent assay (ELISA) using plasma samples from patients with various stages of PDAC, from individuals with benign pancreatic disease, and from healthy controls. A phase 1 discovery study (n = 20), a phase 2a validation study (n = 189), and a second phase 2b validation study (n = 537) revealed that concentrations of plasma thrombospondin-2 (THBS2) discriminated among all stages of PDAC consistently. The receiver operating characteristic (ROC) c-statistic was 0.76 in the phase 1 study, 0.84 in the phase 2a study, and 0.87 in the phase 2b study. The plasma concentration of THBS2 was able to discriminate resectable stage I cancer as readily as stage III/IV PDAC tumors. THBS2 plasma concentrations combined with those for CA19-9, a previously identified PDAC marker, yielded a c-statistic of 0.96 in the phase 2a study and 0.97 in the phase 2b study. THBS2 data improved the ability of CA19-9 to distinguish PDAC from pancreatitis. With a specificity of 98%, the combination of THBS2 and CA19-9 yielded a sensitivity of 87% for PDAC in the phase 2b study. A THBS2 and CA19-9 blood marker panel measured with a conventional ELISA may improve the detection of patients at high risk for PDAC. Copyright © 2017 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.
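
    The c-statistic reported throughout is the area under the ROC curve; a minimal R illustration using the Wilcoxon rank identity is below (the marker values are simulated, and the logistic combination in the final comment is an assumed, not the authors', formulation).

    ```r
    # ROC c-statistic via the rank identity AUC = P(case score > control score).
    set.seed(1)
    cases    <- rnorm(60, mean = 1)     # e.g., plasma THBS2 in PDAC (simulated)
    controls <- rnorm(60, mean = 0)     # healthy/benign (simulated)

    c_statistic <- function(pos, neg) {
      r <- rank(c(pos, neg))[seq_along(pos)]       # ranks of the case values
      (sum(r) - length(pos) * (length(pos) + 1) / 2) /
        (length(pos) * length(neg))
    }
    c_statistic(cases, controls)

    # A two-marker panel could be combined with a logistic score, e.g.:
    # glm(y ~ thbs2 + ca199, family = binomial)    # hypothetical variable names
    ```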

  17. Detection of questionable occlusal carious lesions using an electrical bioimpedance method with fractional electrical model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morais, A. P.; Pino, A. V.

    This in vitro study evaluated the diagnostic performance of an alternative electric bioimpedance spectroscopy technique (BIS-STEP) to detect questionable occlusal carious lesions. Six specialists carried out the visual (V), radiography (R), and combined (VR) exams of 57 teeth that were sound or had non-cavitated occlusal carious lesions, classifying the occlusal surfaces as sound surface (H), enamel caries (EC), or dentinal caries (DC). Measurements were based on the current response to a step voltage excitation (BIS-STEP). A fractional electrical model was used to predict the current response in the time domain and to estimate the model parameters: Rs and Rp (resistive parameters), and C and α (fractional parameters). Histological analysis showed a caries prevalence of 33.3%, of which 15.8% were hidden caries. Combined examination obtained the best traditional diagnostic results with specificity = 59.0%, sensitivity = 70.9%, and accuracy = 60.8%. There were statistically significant differences in bioimpedance parameters between the H and EC groups (p = 0.016) and between the H and DC groups (Rs, p = 0.006; Rp, p = 0.022, and α, p = 0.041). Using a suitable threshold for Rs, we obtained specificity = 60.7%, sensitivity = 77.9%, accuracy = 73.2%, and 100% detection for deep lesions. It can be concluded that the BIS-STEP method could be an important tool to improve the detection and management of occlusal non-cavitated primary caries and pigmented sites.
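
    A hedged sketch of fitting a fractional-order circuit to a step-voltage current response. The paper's exact circuit is not reproduced; this assumes a common fractional-RC form (series Rs feeding Rp in parallel with a constant-phase element), whose relaxation follows a Mittag-Leffler decay, and uses a truncated series valid only for the moderate arguments seen here. All data are synthetic.

        import numpy as np
        from scipy.optimize import curve_fit
        from scipy.special import gamma

        def mittag_leffler(alpha, z, terms=80):
            # Truncated series E_alpha(z) = sum_k z^k / Gamma(alpha*k + 1);
            # adequate for moderate |z|, not a general-purpose implementation.
            k = np.arange(terms)
            return np.sum(z[..., None] ** k / gamma(alpha * k + 1), axis=-1)

        def step_current(t, Rs, Rp, C, alpha, V=1.0):
            # Current jumps to V/Rs at t=0+ and relaxes toward V/(Rs+Rp);
            # reduces to the usual exponential RC response for alpha = 1.
            tau_a = C * Rs * Rp / (Rs + Rp)              # generalized time scale
            z = np.clip(-(t ** alpha) / tau_a, -30.0, 0.0)  # clip for safety
            return V / (Rs + Rp) + (V / Rs - V / (Rs + Rp)) * mittag_leffler(alpha, z)

        rng = np.random.default_rng(2)
        t = np.linspace(1e-6, 1.4e-4, 200)               # seconds
        i_meas = step_current(t, 100.0, 500.0, 1e-6, 0.8) + rng.normal(0, 1e-4, t.size)
        popt, _ = curve_fit(step_current, t, i_meas, p0=(80, 400, 2e-6, 0.9),
                            bounds=([1, 1, 1e-9, 0.3], [1e4, 1e5, 1e-3, 1.0]))
        print(dict(zip(["Rs", "Rp", "C", "alpha"], popt)))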

  18. The Application of a Technique for Vector Correlation to Problems in Meteorology and Oceanography.

    NASA Astrophysics Data System (ADS)

    Breaker, L. C.; Gemmill, W. H.; Crosby, D. S.

    1994-11-01

    In a recent study, Crosby et al. proposed a definition for vector correlation that has not been commonly used in meteorology or oceanography. This definition has both a firm theoretical basis and a rather complete set of desirable statistical properties. In this study, the authors apply the definition to practical problems arising in meteorology and oceanography. In the first of two case studies, vector correlations were calculated between subsurface currents for five locations along the southeastern shore of Lake Erie. Vector correlations for one sample size were calculated for all current meter combinations, first including the seiche frequency and then with the seiche frequency removed. Removal of the seiche frequency, which was easily detected in the current spectra, had only a small effect on the vector correlations. Under reasonable assumptions, the vector correlations were in most cases statistically significant and revealed considerable fine structure in the vector correlation sequences. In some cases, major variations in vector correlation coincided with changes in surface wind. The vector correlations for the various current meter combinations decreased rapidly with increasing spatial separation. For one current meter combination, canonical correlations were also calculated; the first canonical correlation tended to retain the underlying trend, whereas the second canonical correlation retained the peaks in the vector correlations. In the second case study, vector correlations were calculated between marine surface winds derived from the National Meteorological Center's Global Data Assimilation System and observed winds acquired from the network of National Data Buoy Center buoys that are located off the continental United States and in the Gulf of Alaska. Results of this comparison indicated that 1) there was a significant decrease in correlation between the predicted and observed winds with increasing forecast interval out to 72 h, 2) the technique provides a sensitive indicator for detecting bad buoy reports, and 3) there was no obvious seasonal cycle in the monthly vector correlations for the period of observation.
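
    A hedged sketch of the vector-correlation statistic as it is commonly attributed to Crosby et al.: the sum of the squared canonical correlations between two 2-D vector series, which ranges from 0 to 2. Synthetic data stand in for current-meter or wind records.

        import numpy as np

        def vector_correlation(w1, w2):
            """w1, w2: arrays of shape (n, 2) holding (u, v) components."""
            n = w1.shape[0]
            X = w1 - w1.mean(axis=0)
            Y = w2 - w2.mean(axis=0)
            S11 = X.T @ X / n
            S22 = Y.T @ Y / n
            S12 = X.T @ Y / n
            # rho^2 = tr(S11^-1 S12 S22^-1 S21), i.e. the sum of the squared
            # canonical correlations between the two vector series.
            M = np.linalg.solve(S11, S12) @ np.linalg.solve(S22, S12.T)
            return float(np.trace(M))

        rng = np.random.default_rng(3)
        w1 = rng.normal(size=(500, 2))
        w2 = 0.8 * w1 + 0.2 * rng.normal(size=(500, 2))  # strongly related series
        print(vector_correlation(w1, w2))                # close to 2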

  19. Identification of in vitro cytochrome P450 modulators to detect induction by prototype inducers in the mallard duckling (Anas platyrhynchos).

    USGS Publications Warehouse

    Renauld, A.E.; Melancon, M.J.; Sordillo, L.M.

    1999-01-01

    Seven modulators of mammalian monooxygenase activity were screened for their ability to selectively stimulate or inhibit in vitro monooxygenase activities of hepatic microsomes from mallard ducklings treated with phenobarbital, β-naphthoflavone, 3,3′,4,4′,5-pentachlorobiphenyl or vehicle. Microsomes were assayed fluorometrically for four monooxygenases: benzyloxy-, ethoxy-, methoxy-, and pentoxyresorufin-O-dealkylase, in combination with each of the seven modulators. Four combinations: α-naphthoflavone and 2-methylbenzimidazole with benzyloxyresorufin, and Proadifen with methoxy- and ethoxyresorufin, respectively, were evaluated further. β-Naphthoflavone-treated groups were clearly distinguished from the corn oil vehicle control group by all of the assays and by the effects of the modulators in three of the four assay/modulator combinations. Enzyme activities of the phenobarbital and saline groups were statistically similar (P≥0.05) when assayed without modulator added, but each assay/modulator combination distinguished between these groups. The PCB-treated group was distinguished from the corn oil vehicle control group only for BROD activity, with or without the presence of modulator. Graphing of per cent modulation of BROD activity versus initial BROD activity provided the clearest distinction between all of the study groups. Identification of these selective in vitro modulators may improve detection and measurement of low level cytochrome P450 induction in avian species. Also, both the monooxygenase activities induced and the impacts of the modulators indicated differences between mammalian and avian cytochromes P450.

  20. Probability of detection of internal voids in structural ceramics using microfocus radiography

    NASA Technical Reports Server (NTRS)

    Baaklini, G. Y.; Roth, D. J.

    1986-01-01

    The reliability of microfocus X-radiography for detecting subsurface voids in structural ceramic test specimens was statistically evaluated. The microfocus system was operated in the projection mode using low X-ray photon energies (20 keV) and a 10 μm focal spot. The statistics were developed for implanted subsurface voids in green and sintered silicon carbide and silicon nitride test specimens. These statistics were compared with previously obtained statistics for implanted surface voids in similar specimens. Problems associated with void implantation are discussed. Statistical results are given as probability-of-detection curves at a 95 percent confidence level for voids ranging in size from 20 to 528 μm in diameter.
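
    A hedged sketch of one standard way to build a probability-of-detection (POD) curve from hit/miss trials: logistic regression on (log) void diameter. The data are synthetic, and the study's actual POD/confidence-bound procedure may differ.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(4)
        diam = rng.uniform(20, 528, 300)                # void diameter, micrometers
        p_true = 1 / (1 + np.exp(-(diam - 150) / 40))   # synthetic detectability
        hit = (rng.random(300) < p_true).astype(float)

        X = sm.add_constant(np.log(diam))               # log-size is a common choice
        fit = sm.Logit(hit, X).fit(disp=False)
        grid = np.linspace(20, 528, 6)
        pod = fit.predict(sm.add_constant(np.log(grid)))
        for d, p in zip(grid, pod):
            print(f"diameter {d:6.1f} um: POD = {p:.2f}")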

  1. Probability of detection of internal voids in structural ceramics using microfocus radiography

    NASA Technical Reports Server (NTRS)

    Baaklini, G. Y.; Roth, D. J.

    1985-01-01

    The reliability of microfocus x-radiography for detecting subsurface voids in structural ceramic test specimens was statistically evaluated. The microfocus system was operated in the projection mode using low X-ray photon energies (20 keV) and a 10 μm focal spot. The statistics were developed for implanted subsurface voids in green and sintered silicon carbide and silicon nitride test specimens. These statistics were compared with previously obtained statistics for implanted surface voids in similar specimens. Problems associated with void implantation are discussed. Statistical results are given as probability-of-detection curves at a 95 percent confidence level for voids ranging in size from 20 to 528 μm in diameter.

  2. Insect-gene-activity detection system for chemical and biological warfare agents and toxic industrial chemicals

    NASA Astrophysics Data System (ADS)

    Mackie, Ryan S.; Schilling, Amanda S.; Lopez, Arturo M.; Rayms-Keller, Alfredo

    2002-02-01

    Detection of multiple chemical and biological weapons (CBW) agents and/or complex mixtures of toxic industrial chemicals (TIC) is imperative for both the commercial and military sectors. In a military scenario, a multi-CBW attack would create confusion, thereby delaying decontamination and therapeutic efforts. In the commercial sector, polluted sites invariably contain a mixture of TIC. Novel detection systems capable of detecting CBW and TIC are sorely needed. While it may be impossible to build a detector capable of discriminating all the possible combinations of CBW, a detection system capable of statistically predicting the most likely composition of a given mixture is within the reach of current emerging technologies. Aquatic insect-gene activity may prove to be a sensitive, discriminating, and elegant paradigm for the detection of CBW and TIC. We propose to systematically establish the expression patterns of selected protein markers in insects exposed to specific mixtures of chemical and biological warfare agents to generate a library of biosignatures of exposure. The predicting capabilities of an operational library of biosignatures of exposures will allow the detection of emerging novel or genetically engineered agents, as well as complex mixtures of chemical and biological weapons agents. CBW and TIC are discussed in the context of war, terrorism, and pollution.

  3. Structural damage detection in wind turbine blades based on time series representations of dynamic responses

    NASA Astrophysics Data System (ADS)

    Hoell, Simon; Omenzetter, Piotr

    2015-03-01

    The development of large wind turbines that make it possible to harvest energy more efficiently is a consequence of the increasing demand for renewables in the world. To optimize the potential energy output, light and flexible wind turbine blades (WTBs) are designed. However, the higher flexibilities and lower buckling capacities adversely affect the long-term safety and reliability of WTBs, and the increased operation and maintenance costs thus reduce the expected revenue. Effective structural health monitoring techniques can help to counteract this by limiting inspection efforts and avoiding unplanned maintenance actions. Vibration-based methods deserve high attention due to the moderate instrumentation efforts and their applicability to in-service measurements. The present paper proposes the use of cross-correlations (CCs) of acceleration responses between sensors at different locations for structural damage detection in WTBs. CCs were successfully applied in the past for damage detection in numerical and experimental beam structures, utilizing only single lags between the signals. The present approach uses vectors of CC coefficients for multiple lags between measurements of two selected sensors, taken from multiple possible combinations of sensors. To reduce the dimensionality of the damage sensitive feature (DSF) vectors, principal component analysis is performed. The optimal number of principal components (PCs) is chosen with respect to a statistical threshold. Finally, the detection phase uses the selected PCs of the healthy structure to calculate scores from a current DSF vector, and statistical hypothesis testing is performed to make a decision about the current structural state. The method is applied to laboratory experiments conducted on a small WTB with non-destructive damage scenarios.
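
    A hedged sketch of the feature pipeline described above: multi-lag cross-correlation coefficients between two sensors, PCA compression on healthy data, and a chi-square-type score for hypothesis testing. Synthetic signals emulate the sensors, and "damage" is simulated as a change in the transfer-path delay, which is an assumption for illustration only.

        import numpy as np
        from sklearn.decomposition import PCA

        def cc_feature(a, b, max_lag=20):
            # Normalized cross-correlation coefficients for lags 0..max_lag-1.
            a = (a - a.mean()) / a.std()
            b = (b - b.mean()) / b.std()
            n = len(a)
            return np.array([a[:n - k] @ b[k:] / (n - k) for k in range(max_lag)])

        def simulate_pair(rng, delay=3, n=2048):
            # Two sensors see a shared response with a path delay plus local noise.
            common = rng.normal(size=n)
            s1 = common + 0.3 * rng.normal(size=n)
            s2 = np.roll(common, delay) + 0.3 * rng.normal(size=n)
            return s1, s2

        rng = np.random.default_rng(5)
        train = np.array([cc_feature(*simulate_pair(rng)) for _ in range(100)])
        pca = PCA(n_components=5).fit(train)

        def t2_score(x):
            # Hotelling-T2-style score in the healthy PC space; roughly
            # chi-square(5) distributed for healthy feature vectors.
            z = pca.transform(x.reshape(1, -1))[0] / np.sqrt(pca.explained_variance_)
            return float(np.sum(z ** 2))

        print("healthy :", round(t2_score(cc_feature(*simulate_pair(rng))), 1))
        print("damaged :", round(t2_score(cc_feature(*simulate_pair(rng, delay=6))), 1))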

  4. Defining syndromes using cattle meat inspection data for syndromic surveillance purposes: a statistical approach with the 2005-2010 data from ten French slaughterhouses.

    PubMed

    Dupuy, Céline; Morignat, Eric; Maugey, Xavier; Vinard, Jean-Luc; Hendrikx, Pascal; Ducrot, Christian; Calavas, Didier; Gay, Emilie

    2013-04-30

    The slaughterhouse is a central processing point for food animals and thus a source of both demographic data (age, breed, sex) and health-related data (reason for condemnation and condemned portions) that are not available through other sources. Using these data for syndromic surveillance is therefore tempting. However, many possible reasons for condemnation and condemned portions exist, making the definition of relevant syndromes challenging. The objective of this study was to determine a typology of cattle with at least one portion of the carcass condemned in order to define syndromes. Multiple factor analysis (MFA) in combination with clustering methods was performed using both health-related data and demographic data. Analyses were performed on 381,186 cattle with at least one portion of the carcass condemned among the 1,937,917 cattle slaughtered in ten French abattoirs. Results of the MFA and clustering methods led to 12 clusters considered as stable according to year of slaughter and slaughterhouse. One cluster was specific to a disease of public health importance (cysticercosis). Two clusters were linked to the slaughtering process (fecal contamination of heart or lungs and deterioration lesions). Two clusters respectively characterized by chronic liver lesions and chronic peritonitis could be linked to diseases of economic importance to farmers. Three clusters could be linked respectively to reticulo-pericarditis, fatty liver syndrome and farmer's lung syndrome, which are related to both diseases of economic importance to farmers and herd management issues. Three clusters respectively characterized by arthritis, myopathy and Dark Firm Dry (DFD) meat could notably be linked to animal welfare issues. Finally, one cluster, characterized by bronchopneumonia, could be linked to both animal health and herd management issues. The statistical approach of combining multiple factor analysis with cluster analysis showed its relevance for the detection of syndromes using available large and complex slaughterhouse data. The advantages of this statistical approach are to i) define groups of reasons for condemnation based on meat inspection data, ii) help grouping reasons for condemnation among a list of various possible reasons for condemnation for which a consensus among experts could be difficult to reach, iii) assign each animal to a single syndrome which allows the detection of changes in trends of syndromes to detect unusual patterns in known diseases and emergence of new diseases.
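
    A much-simplified stand-in for the "reduce, then cluster" structure of the paper's pipeline: plain PCA followed by k-means on binary condemnation indicators. Real MFA (available, e.g., in R's FactoMineR) balances groups of variables, which this sketch does not attempt; the indicators and latent syndromes below are hypothetical.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(6)
        n = 1000
        # Hypothetical binary indicators per carcass:
        # [liver lesion, peritonitis, pericarditis, arthritis]
        X = rng.random((n, 4)) < 0.05
        latent = rng.integers(0, 3, n)    # hidden "syndrome" labels
        X[latent == 1, 0] = True          # syndrome 1: chronic liver lesions
        X[latent == 2, 1] = True          # syndrome 2: peritonitis...
        X[latent == 2, 2] = True          # ...with pericarditis
        X = X.astype(float)

        scores = PCA(n_components=3).fit_transform(X)
        labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
        for k in range(3):
            print(f"cluster {k}: {np.mean(labels == k):.1%} of carcasses, "
                  f"mean indicator profile {X[labels == k].mean(axis=0).round(2)}")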

  5. Application of Scan Statistics to Detect Suicide Clusters in Australia

    PubMed Central

    Cheung, Yee Tak Derek; Spittal, Matthew J.; Williamson, Michelle Kate; Tung, Sui Jay; Pirkis, Jane

    2013-01-01

    Background Suicide clustering occurs when multiple suicide incidents take place in a small area and/or within a short period of time. In spite of the multi-national research attention and particular efforts in preparing guidelines for tackling suicide clusters, the broader picture of the epidemiology of suicide clustering remains unclear. This study aimed to develop techniques for using scan statistics to detect clusters, with the detection of suicide clusters in Australia as an example. Methods and Findings Scan statistics were applied to detect clusters among suicides occurring between 2004 and 2008. Manipulation of parameter settings and change of area for scan statistics were performed to remedy shortcomings in existing methods. In total, 243 suicides out of 10,176 (2.4%) were identified as belonging to 15 suicide clusters. These clusters were mainly located in the Northern Territory, the northern part of Western Australia, and the northern part of Queensland. Among the 15 clusters, 4 (26.7%) were detected by both national and state cluster detections, 8 (53.3%) were only detected by the state cluster detection, and 3 (20%) were only detected by the national cluster detection. Conclusions These findings illustrate that the majority of spatial-temporal clusters of suicide were located in the inland northern areas, with socio-economic deprivation and higher proportions of indigenous people. Discrepancies between national and state/territory cluster detection by scan statistics were due to the contrast of the underlying suicide rates across states/territories. Performing both small-area and large-area analyses, and applying multiple parameter settings may yield the maximum benefits for exploring clusters. PMID:23342098
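
    A hedged sketch of a purely spatial circular scan in the spirit of scan statistics: each candidate circle is scored with a Kulldorff-style Poisson likelihood ratio against a homogeneous baseline. Locations, risks, and radii are synthetic; in practice significance comes from Monte Carlo replication, which is omitted here.

        import numpy as np

        def poisson_llr(c, e, C):
            # LLR for c observed vs e expected cases in the window, with C total
            # cases and total expected calibrated to C.
            if c <= e:
                return 0.0
            llr = c * np.log(c / e)
            if C > c:
                llr += (C - c) * np.log((C - c) / (C - e))
            return llr

        rng = np.random.default_rng(7)
        pts = rng.uniform(0, 100, size=(500, 2))          # individuals' locations
        cases = rng.random(500) < 0.05                    # baseline risk
        hot = np.linalg.norm(pts - [30, 70], axis=1) < 8  # implanted cluster zone
        cases |= hot & (rng.random(500) < 0.25)

        C = cases.sum()
        best = (0.0, None, None)
        for center in pts[cases]:                         # candidate circle centers
            d = np.linalg.norm(pts - center, axis=1)
            for r in (5.0, 10.0, 15.0):
                inside = d < r
                llr = poisson_llr(cases[inside].sum(), C * inside.mean(), C)
                if llr > best[0]:
                    best = (llr, center, r)
        print(f"max LLR {best[0]:.2f} at center {best[1].round(1)}, radius {best[2]}")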

  6. Detecting Statistically Significant Communities of Triangle Motifs in Undirected Networks

    DTIC Science & Technology

    2016-04-26

    Final report covering 15 Oct 2014 to 14 Jan 2015. The work extends Perry et al. [6] by developing a statistical framework that supports the detection of triangle motif-based clusters in complex networks, and by developing an algorithm for clustering undirected networks based on the triangle configuration.

  7. Arc detection for the ICRF system on ITER

    NASA Astrophysics Data System (ADS)

    D'Inca, R.

    2011-12-01

    The ICRF system for ITER is designed to respect the high-voltage breakdown limits. However, arcs can still occasionally occur and must be quickly detected and suppressed by shutting down the RF power. For the conception of a reliable and efficient detector, the analysis of the mechanism of arcs is necessary to find their unique signature. Numerous systems have been conceived to address the issue of arc detection: VSWR-based detectors, RF noise detectors, sound detectors, optical detectors, and S-matrix-based detectors. Until now, none of them has succeeded in demonstrating the fulfillment of all requirements, and the studies for ITER now follow three directions: improvement of the existing concepts to fix their flaws, development of new, theoretically fully compliant detectors (like the GUIDAR), and combination of several detectors to benefit from the advantages of each of them. Together with the physical and engineering challenges, the development of an arc detection system for ITER raises methodological concerns regarding how to extrapolate results from basic experiments and present machines to the ITER-scale ICRF system and how to conduct a relevant risk analysis.

  8. Parameter-space metric of semicoherent searches for continuous gravitational waves

    NASA Astrophysics Data System (ADS)

    Pletsch, Holger J.

    2010-08-01

    Continuous gravitational-wave (CW) signals such as emitted by spinning neutron stars are an important target class for current detectors. However, the enormous computational demand prohibits fully coherent broadband all-sky searches for prior unknown CW sources over wide ranges of parameter space and for yearlong observation times. More efficient hierarchical “semicoherent” search strategies divide the data into segments much shorter than one year, which are analyzed coherently; then detection statistics from different segments are combined incoherently. To optimally perform the incoherent combination, understanding of the underlying parameter-space structure is requisite. This problem is addressed here by using new coordinates on the parameter space, which yield the first analytical parameter-space metric for the incoherent combination step. This semicoherent metric applies to broadband all-sky surveys (also embedding directed searches at fixed sky position) for isolated CW sources. Furthermore, the additional metric resolution attained through the combination of segments is studied. From the search parameters (sky position, frequency, and frequency derivatives), solely the metric resolution in the frequency derivatives is found to significantly increase with the number of segments.

  9. Revisiting the reported signal of acute pancreatitis with rasburicase: an object lesson in pharmacovigilance

    PubMed Central

    Hauben, Manfred; Hung, Eric Y.

    2016-01-01

    Introduction: There is an interest in methodologies to expeditiously detect credible signals of drug-induced pancreatitis. An example is the reported signal of pancreatitis with rasburicase emerging from a study [the ‘index publication’ (IP)] combining quantitative signal detection findings from a spontaneous reporting system (SRS) and electronic health records (EHRs). The signal was reportedly supported by a clinical review with a case series manuscript in progress. The reported signal is noteworthy, being initially classified as a false-positive finding for the chosen reference standard, but reclassified as a ‘clinically supported’ signal. Objective: This paper has dual objectives: to revisit the signal of rasburicase and acute pancreatitis and extend the original analysis via reexamination of its findings, in light of more contemporary data; and to motivate discussions on key issues in signal detection and evaluation, including recent findings from a major international pharmacovigilance research initiative. Methodology: We used the same methodology as the IP, including the same disproportionality analysis software/dataset for calculating observed to expected reporting frequencies (O/Es), Medical Dictionary for Regulatory Activities Preferred Term, and O/E metric/threshold combination defining a signal of disproportionate reporting. Baseline analysis results prompted supplementary analyses using alternative analytical choices. We performed a comprehensive literature search to identify additional published case reports of rasburicase and pancreatitis. Results: We could not replicate positive findings (e.g. a signal or statistic of disproportionate reporting) from the SRS data using the same algorithm, software, dataset and vendor specified in the IP. The reporting association was statistically highlighted in default and supplemental analysis when more sensitive forms of disproportionality analysis were used. Two of three reports in the FAERS database were assessed as likely duplicate reports. We did not identify any additional reports in the FAERS corresponding to the three cases identified in the IP using EHRs. We did not identify additional published reports of pancreatitis associated with rasburicase. Discussion: Our exercise stimulated interesting discussions of key points in signal detection and evaluation, including causality assessment, signal detection algorithm performance, pharmacovigilance terminology, duplicate reporting, mechanisms for communicating signals, the structure of the FAERS database, and recent results from a major international pharmacovigilance research initiative. PMID:27298720
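
    A hedged sketch of a basic disproportionality screen on a 2x2 contingency table of spontaneous reports; the counts are illustrative only, not FAERS data, and the IP's specific software and dataset are not reproduced.

        import numpy as np
        from scipy.stats import chi2_contingency

        a = 3      # reports: drug of interest with event of interest
        b = 120    # drug of interest, other events
        c = 4500   # other drugs, event of interest
        d = 2.0e6  # other drugs, other events

        prr = (a / (a + b)) / (c / (c + d))
        chi2, p, _, _ = chi2_contingency(np.array([[a, b], [c, d]]))
        print(f"PRR = {prr:.2f}, chi-square = {chi2:.2f}, p = {p:.3g}")
        # A commonly cited screening threshold: PRR >= 2, chi-square >= 4,
        # and at least 3 reports of the drug-event pair.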

  10. A comparison of large-scale climate signals and the North American Multi-Model Ensemble (NMME) for drought prediction in China

    NASA Astrophysics Data System (ADS)

    Xu, Lei; Chen, Nengcheng; Zhang, Xiang

    2018-02-01

    Drought is an extreme natural disaster that can lead to huge socioeconomic losses. Drought prediction months ahead is helpful for early drought warning and preparations. In this study, we developed a statistical model, two weighted dynamic models and a statistical-dynamic (hybrid) model for 1-6 month lead drought prediction in China. Specifically, the statistical component weights climate signals by support vector regression (SVR), the dynamic components consist of the ensemble mean (EM) and Bayesian model averaging (BMA) of the North American Multi-Model Ensemble (NMME) climate models, and the hybrid part combines the statistical and dynamic components by assigning weights based on their historical performances. The results indicate that the statistical and hybrid models show better rainfall predictions than the NMME-EM and NMME-BMA models, which have good predictability only in southern China. In the 2011 China winter-spring drought event, the statistical model well predicted the spatial extent and severity of drought nationwide, although the severity was underestimated in the mid-lower reaches of Yangtze River (MLRYR) region. The NMME-EM and NMME-BMA models largely overestimated rainfall in northern and western China in the 2011 drought. In the 2013 China summer drought, the NMME-EM model forecasted the drought extent and severity in eastern China well, while the statistical and hybrid models falsely detected negative precipitation anomaly (NPA) in some areas. Model ensembles such as multiple statistical approaches, multiple dynamic models or multiple hybrid models for drought predictions were highlighted. These conclusions may be helpful for drought prediction and early drought warnings in China.

  11. F-8C adaptive flight control extensions. [for maximum likelihood estimation

    NASA Technical Reports Server (NTRS)

    Stein, G.; Hartmann, G. L.

    1977-01-01

    An adaptive concept which combines gain-scheduled control laws with explicit maximum likelihood estimation (MLE) identification to provide the scheduling values is described. The MLE algorithm was improved by incorporating attitude data, estimating gust statistics for setting filter gains, and improving parameter tracking during changing flight conditions. A lateral MLE algorithm was designed to improve true air speed and angle of attack estimates during lateral maneuvers. Relationships between the pitch axis sensors inherent in the MLE design were examined and used for sensor failure detection. Design details and simulation performance are presented for each of the three areas investigated.

  12. Model-independent test for scale-dependent non-Gaussianities in the cosmic microwave background.

    PubMed

    Räth, C; Morfill, G E; Rossmanith, G; Banday, A J; Górski, K M

    2009-04-03

    We present a model-independent method to test for scale-dependent non-Gaussianities in combination with scaling indices as test statistics. To this end, surrogate data sets are generated, in which the power spectrum of the original data is preserved, while the higher order correlations are partly randomized by applying a scale-dependent shuffling procedure to the Fourier phases. We apply this method to the Wilkinson Microwave Anisotropy Probe data of the cosmic microwave background and find signatures for non-Gaussianities on large scales. Further tests are required to elucidate the origin of the detected anomalies.
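
    A hedged sketch of the surrogate idea in its simplest form: preserve the Fourier amplitudes (the power spectrum) while fully randomizing the phases. The paper's scale-dependent shuffle would randomize only selected bands, which this sketch does not attempt.

        import numpy as np

        def phase_surrogate(x, rng):
            # Keep the power spectrum of x, randomize the Fourier phases.
            F = np.fft.rfft(x)
            phases = rng.uniform(0.0, 2.0 * np.pi, F.size)
            phases[0] = 0.0                  # preserve the mean
            if x.size % 2 == 0:
                phases[-1] = 0.0             # keep the Nyquist bin real
            return np.fft.irfft(np.abs(F) * np.exp(1j * phases), n=x.size)

        rng = np.random.default_rng(8)
        g = rng.normal(size=1024)
        x = g + 0.5 * g ** 2                 # toy signal with non-Gaussian tails
        s = phase_surrogate(x, rng)
        # Same power spectrum (up to roundoff), different higher-order structure:
        print(np.allclose(np.abs(np.fft.rfft(x)), np.abs(np.fft.rfft(s)), rtol=1e-8))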

  13. Statistical iterative reconstruction to improve image quality for digital breast tomosynthesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Shiyu, E-mail: shiyu.xu@gmail.com; Chen, Ying, E-mail: adachen@siu.edu; Lu, Jianping

    2015-09-15

    Purpose: Digital breast tomosynthesis (DBT) is a novel modality with the potential to improve early detection of breast cancer by providing three-dimensional (3D) imaging with a low radiation dose. 3D image reconstruction presents some challenges: cone-beam and flat-panel geometry, and highly incomplete sampling. A promising means to overcome these challenges is statistical iterative reconstruction (IR), since it provides the flexibility of accurate physics modeling and a general description of system geometry. The authors’ goal was to develop techniques for applying statistical IR to tomosynthesis imaging data. Methods: These techniques include the following: a physics model with a local voxel-pair based prior with flexible parameters to fine-tune image quality; a precomputed parameter λ in the prior, to remove data dependence and to achieve a uniform resolution property; an effective ray-driven technique to compute the forward and backprojection; and an oversampled, ray-driven method to perform high resolution reconstruction with a practical region-of-interest technique. To assess the performance of these techniques, the authors acquired phantom data on the stationary DBT prototype system. To solve the estimation problem, the authors proposed an optimization-transfer based algorithm framework that potentially allows fewer iterations to achieve an acceptably converged reconstruction. Results: IR improved the detectability of low-contrast and small microcalcifications, reduced cross-plane artifacts, improved spatial resolution, and lowered noise in reconstructed images. Conclusions: Although the computational load remains a significant challenge for practical development, the superior image quality provided by statistical IR, combined with advancing computational techniques, may bring benefits to screening, diagnostics, and intraoperative imaging in clinical applications.

  14. Hyperspectral Imaging in Tandem with R Statistics and Image Processing for Detection and Visualization of pH in Japanese Big Sausages Under Different Storage Conditions.

    PubMed

    Feng, Chao-Hui; Makino, Yoshio; Yoshimura, Masatoshi; Thuyet, Dang Quoc; García-Martín, Juan Francisco

    2018-02-01

    The potential of hyperspectral imaging with wavelengths of 380 to 1000 nm was used to determine the pH of cooked sausages after different storage conditions (4 °C for 1 d, 35 °C for 1, 3, and 5 d). The mean spectra of the sausages were extracted from the hyperspectral images and a partial least squares regression (PLSR) model was developed to relate spectral profiles to the pH of the cooked sausages. Eleven important wavelengths were selected based on the regression coefficient values. The PLSR model established using the optimal wavelengths showed good precision, with a prediction coefficient of determination (Rp^2) of 0.909 and a root mean square error of prediction of 0.035. The prediction map illustrating pH indices in sausages was for the first time developed with R statistics. The overall results suggested that hyperspectral imaging combined with PLSR and R statistics is capable of quantifying and visualizing the evolution of sausage pH under different storage conditions. In this paper, hyperspectral imaging is for the first time used to detect pH in cooked sausages using R statistics, which provides useful information for researchers who do not have access to Matlab. Eleven optimal wavelengths were successfully selected and used to simplify the PLSR model established on the full set of wavelengths. This simplified model achieved a high Rp^2 (0.909) and a low root mean square error of prediction (0.035), which can be useful for the design of multispectral imaging systems. © 2017 Institute of Food Technologists®.
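
    A hedged sketch of the PLSR workflow (here in Python rather than R, for consistency with the other sketches): fit on full spectra, rank wavelengths by absolute regression coefficients, and refit on the selected subset. Spectra and pH values are synthetic.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import r2_score, mean_squared_error

        rng = np.random.default_rng(9)
        wavelengths = np.linspace(380, 1000, 200)
        X = rng.normal(size=(120, 200)).cumsum(axis=1)   # smooth toy "spectra"
        pH = 5.8 + 0.02 * X[:, 80] - 0.01 * X[:, 150] + rng.normal(0, 0.05, 120)

        Xtr, Xte, ytr, yte = train_test_split(X, pH, random_state=0)
        pls = PLSRegression(n_components=8).fit(Xtr, ytr)
        top = np.argsort(np.abs(pls.coef_.ravel()))[-11:]   # 11 key wavelengths
        pls11 = PLSRegression(n_components=5).fit(Xtr[:, top], ytr)
        pred = pls11.predict(Xte[:, top]).ravel()
        print("selected nm:", np.sort(wavelengths[top]).round(0))
        print(f"Rp^2 = {r2_score(yte, pred):.3f}, "
              f"RMSEP = {np.sqrt(mean_squared_error(yte, pred)):.3f}")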

  15. A survey about methods dedicated to epistasis detection.

    PubMed

    Niel, Clément; Sinoquet, Christine; Dina, Christian; Rocheleau, Ghislain

    2015-01-01

    During the past decade, findings of genome-wide association studies (GWAS) improved our knowledge and understanding of disease genetics. To date, thousands of SNPs have been associated with diseases and other complex traits. Statistical analysis typically looks for association between a phenotype and a SNP taken individually via single-locus tests. However, geneticists admit this is an oversimplified approach to tackle the complexity of underlying biological mechanisms. Interaction between SNPs, namely epistasis, must be considered. Unfortunately, epistasis detection gives rise to analytic challenges since analyzing every SNP combination is at present impractical at a genome-wide scale. In this review, we will present the main strategies recently proposed to detect epistatic interactions, along with their operating principle. Some of these methods are exhaustive, such as multifactor dimensionality reduction, likelihood ratio-based tests or receiver operating characteristic curve analysis; some are non-exhaustive, such as machine learning techniques (random forests, Bayesian networks) or combinatorial optimization approaches (ant colony optimization, computational evolution system).

  16. Model-Based Anomaly Detection for a Transparent Optical Transmission System

    NASA Astrophysics Data System (ADS)

    Bengtsson, Thomas; Salamon, Todd; Ho, Tin Kam; White, Christopher A.

    In this chapter, we present an approach for anomaly detection at the physical layer of networks where detailed knowledge about the devices and their operations is available. The approach combines physics-based process models with observational data models to characterize the uncertainties and derive the alarm decision rules. We formulate and apply three different methods based on this approach for a well-defined problem in optical network monitoring that features many typical challenges for this methodology. Specifically, we address the problem of monitoring optically transparent transmission systems that use dynamically controlled Raman amplification systems. We use models of amplifier physics together with statistical estimation to derive alarm decision rules and use these rules to automatically discriminate between measurement errors, anomalous losses, and pump failures. Our approach has led to an efficient tool for systematically detecting anomalies in the system behavior of a deployed network, where pro-active measures to address such anomalies are key to preventing unnecessary disturbances to the system's continuous operation.

  17. Colour and luminance contrasts predict the human detection of natural stimuli in complex visual environments.

    PubMed

    White, Thomas E; Rojas, Bibiana; Mappes, Johanna; Rautiala, Petri; Kemp, Darrell J

    2017-09-01

    Much of what we know about human colour perception has come from psychophysical studies conducted in tightly controlled laboratory settings. An enduring challenge, however, lies in extrapolating this knowledge to the noisy conditions that characterize our actual visual experience. Here we combine statistical models of visual perception with empirical data to explore how chromatic (hue/saturation) and achromatic (luminance) information underpins the detection and classification of stimuli in a complex forest environment. The data best support a simple linear model of stimulus detection as an additive function of both luminance and saturation contrast. The strength of each predictor is modest yet consistent across gross variation in viewing conditions, which accords with expectation based upon general primate psychophysics. Our findings implicate simple visual cues in the guidance of perception amidst natural noise, and highlight the potential for informing human vision via a fusion between psychophysical modelling and real-world behaviour. © 2017 The Author(s).

  18. Practical steganalysis of digital images: state of the art

    NASA Astrophysics Data System (ADS)

    Fridrich, Jessica; Goljan, Miroslav

    2002-04-01

    Steganography is the art of hiding the very presence of communication by embedding secret messages into innocuous looking cover documents, such as digital images. Detection of steganography, estimation of message length, and its extraction belong to the field of steganalysis. Steganalysis has recently received a great deal of attention both from law enforcement and the media. In our paper, we classify and review current stego-detection algorithms that can be used to trace popular steganographic products. We recognize several qualitatively different approaches to practical steganalysis - visual detection, detection based on first order statistics (histogram analysis), dual statistics methods that use spatial correlations in images and higher-order statistics (RS steganalysis), universal blind detection schemes, and special cases, such as JPEG compatibility steganalysis. We also present some new results regarding our previously proposed detection of LSB embedding using sensitive dual statistics. The recent steganalytic methods indicate that the most common paradigm in image steganography - the bit-replacement or bit substitution - is inherently insecure with safe capacities far smaller than previously thought.

  19. Photon strength and the low-energy enhancement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiedeking, M.; Bernstein, L. A.; Bleuel, D. L.

    2014-08-14

    Several measurements in medium mass nuclei have reported a low-energy enhancement in the photon strength function. Although much effort has been invested in unraveling the mysteries of this effect, its physical origin is still not conclusively understood. Here, a completely model-independent experimental approach to investigate the existence of this enhancement is presented. The experiment was designed to study statistical feeding from the quasi-continuum (below the neutron separation energy) to individual low-lying discrete levels in 95Mo produced in the (d, p) reaction. A key aspect to successfully study gamma decay from the region of high-level density is the detection and extraction of correlated particle-gamma-gamma events, which was accomplished using an array of Clover HPGe detectors and large area annular silicon detectors. The entrance channel excitation energy into the residual nucleus produced in the reaction was inferred from the detected proton energies in the silicon detectors. Gating on gamma-transitions originating from low-lying discrete levels specifies the state fed by statistical gamma-rays. Any particle-gamma-gamma event in combination with specific energy sum requirements ensures a clean and unambiguous determination of the initial and final state of the observed gamma rays. With these requirements the statistical feeding to individual discrete levels is extracted on an event-by-event basis. The results are presented and compared to 95Mo photon strength function data measured at the University of Oslo.

  20. Detecting Noisy Events Using Waveform Cross-Correlation at Superarrays of Seismic Stations

    NASA Astrophysics Data System (ADS)

    von Seggern, D. H.; Tibuleac, I. M.

    2007-12-01

    Cross-correlation using master events, followed by stacking of the correlation series, has been shown to dramatically improve detection thresholds of small-to-medium seismic arrays. With the goal of lowering the detection threshold, determining relative magnitudes or moments, and characterizing sources by empirical Green's functions, we extend the cross-correlation methodology to include "superarrays" of seismic stations. The superarray concept naturally brings further benefits over conventional arrays and single-stations due to the fact that many distances and azimuths can be sampled. This extension is straightforward given the ease with which regional or global data from various stations or arrays can be currently accessed and combined into a single database. We demonstrate the capability of superarrays to detect and analyze events which lie below the detection threshold. This is aided by applying an F-statistic detector to the superarray cross-correlation stack and its components. Our first example illustrates the use of a superarray consisting of the Southern Great Basin Digital Seismic Network, a small-aperture array (NVAR) in Mina, Nevada and the Earthscope Transportable Array to detect events in California-Nevada areas. In our second example, we use a combination of small-to-medium arrays and single stations to study the rupture of the great Sumatra earthquake of 26 December 2004 and to detect its early aftershocks. The location and times of "detected" events are confirmed using a frequency-wavenumber method at the small-to-medium arrays. We propose that ad hoc superarrays can be used in many studies where conventional approaches previously used only single arrays or groups of single stations. The availability of near-real-time data from many networks and of archived data from, for instance, IRIS makes possible the easy assembly of superarrays. Furthermore, the continued improvement of seismic data availability and the continued growth in the number of world-wide seismic sensors will increasingly make superarrays an attractive choice for many studies.
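
    A hedged sketch of master-event correlation detection with stacking across the stations of a "superarray": correlate each trace with its master waveform, stack the (assumed moveout-aligned) correlation series, and pick the peak. Signals are synthetic, and the normalization is simplified relative to a production detector.

        import numpy as np

        def corr_series(trace, master):
            # Sliding correlation of a roughly normalized master against the
            # trace; a real detector would normalize per window, not globally.
            m = (master - master.mean()) / (master.std() * master.size)
            t = trace - trace.mean()
            return np.correlate(t, m, mode="valid") / (t.std() + 1e-12)

        rng = np.random.default_rng(10)
        nsamp = 100
        master = np.sin(2 * np.pi * np.arange(nsamp) / 20.0) * np.hanning(nsamp)
        n_sta, N, onset = 5, 2000, 1200

        stack = np.zeros(N - nsamp + 1)
        for _ in range(n_sta):
            trace = rng.normal(0.0, 1.0, N)
            trace[onset:onset + nsamp] += 0.5 * master  # low-SNR buried arrival
            stack += corr_series(trace, master)         # assume moveout-aligned
        stack /= n_sta
        print("stack peak at sample", int(np.argmax(stack)),
              "value", round(float(stack.max()), 3))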

  1. A combined pre-clinical meta-analysis and randomized confirmatory trial approach to improve data validity for therapeutic target validation.

    PubMed

    Kleikers, Pamela W M; Hooijmans, Carlijn; Göb, Eva; Langhauser, Friederike; Rewell, Sarah S J; Radermacher, Kim; Ritskes-Hoitinga, Merel; Howells, David W; Kleinschnitz, Christoph; Schmidt, Harald H H W

    2015-08-27

    Biomedical research suffers from dramatically poor translational success. For example, in ischemic stroke, a condition with a high medical need, over a thousand experimental drug targets were unsuccessful. Here, we adopt methods from clinical research for a late-stage pre-clinical meta-analysis (MA) and randomized confirmatory trial (pRCT) approach. A profound body of literature suggests NOX2 to be a major therapeutic target in stroke. Systematic review and MA of all available NOX2(-/y) studies revealed a positive publication bias and a lack of statistical power to detect a relevant reduction in infarct size. A fully powered multi-center pRCT rejects NOX2 as a target to improve neurofunctional outcomes or achieve a translationally relevant infarct size reduction. Thus, stringent statistical thresholds, reporting negative data and a MA-pRCT approach can ensure biomedical data validity and overcome risks of bias.

  2. Coupling Sensing Hardware with Data Interrogation Software for Structural Health Monitoring

    DOE PAGES

    Farrar, Charles R.; Allen, David W.; Park, Gyuhae; ...

    2006-01-01

    The process of implementing a damage detection strategy for aerospace, civil and mechanical engineering infrastructure is referred to as structural health monitoring (SHM). The authors' approach is to address the SHM problem in the context of a statistical pattern recognition paradigm. In this paradigm, the process can be broken down into four parts: (1) Operational Evaluation, (2) Data Acquisition and Cleansing, (3) Feature Extraction and Data Compression, and (4) Statistical Model Development for Feature Discrimination. These processes must be implemented through hardware or software and, in general, some combination of these two approaches will be used. This paper will discuss each portion of the SHM process with particular emphasis on the coupling of a general purpose data interrogation software package for structural health monitoring with a modular wireless sensing and processing platform. More specifically, this paper will address the need to take an integrated hardware/software approach to developing SHM solutions.

  3. Applying a statistical PTB detection procedure to complement the gold standard.

    PubMed

    Noor, Norliza Mohd; Yunus, Ashari; Bakar, S A R Abu; Hussin, Amran; Rijal, Omar Mohd

    2011-04-01

    This paper investigates a novel statistical discrimination procedure to detect PTB when the gold standard requirement is taken into consideration. Archived data were used to establish two groups of patients, the control group and the test group. The control group was used to develop the statistical discrimination procedure, using four vectors of wavelet coefficients as feature vectors for the detection of pulmonary tuberculosis (PTB), lung cancer (LC), and normal lung (NL). This discrimination procedure was investigated using the test group, where the numbers of sputum-positive and sputum-negative cases that were correctly classified as PTB cases were noted. The proposed statistical discrimination method is able to detect PTB and LC patients with a high true-positive fraction. The method is also able to detect PTB patients that are sputum negative and therefore may be used as a complement to the gold standard. Copyright © 2010 Elsevier Ltd. All rights reserved.

  4. Comparative SEM analysis of nine F22 aligner cleaning strategies.

    PubMed

    Lombardo, Luca; Martini, Marco; Cervinara, Francesca; Spedicato, Giorgio Alfredo; Oliverio, Teresa; Siciliani, Giuseppe

    2017-12-01

    The orthodontics industry has paid great attention to the aesthetics of orthodontic appliances, seeking to make them as invisible as possible. There are several advantages to clear aligner systems, including aesthetics, comfort, chairside time reduction, and the fact that they can be removed for meals and oral hygiene procedures. Five patients were each given a series of F22 aligners, each to be worn for 14 days and nights, with the exception of meal and brushing times. Patients were instructed to clean each aligner using a prescribed strategy, and sections of the used aligners were observed under SEM. One grey-scale SEM image was saved per aligner in JPEG format with an 8-bit colour depth, and a total of 45 measurements on the grey scale ("Value" variable) were made. This dataset was analysed statistically via repeated measures ANOVA to determine the effect of each of the nine cleaning strategies in each of the five patients. A statistically significant difference in the efficacy of the cleaning strategies was detected. Specifically, rinsing with water alone was significantly less efficacious, and a combination of cationic detergent solution and ultrasonication was significantly more efficacious than the other methods (p < 0.05). Of the nine cleaning strategies examined, only that involving 5 min of ultrasonication at 42 kHz combined with a 0.3% germicidal cationic detergent was observed to be statistically effective at removing the bacterial biofilm from the surface of F22 aligners.

  5. Absolute plate motions relative to deep mantle plumes

    NASA Astrophysics Data System (ADS)

    Wang, Shimin; Yu, Hongzheng; Zhang, Qiong; Zhao, Yonghong

    2018-05-01

    Advances in whole waveform seismic tomography have revealed the presence of broad mantle plumes rooted at the base of the Earth's mantle beneath major hotspots. Hotspot tracks associated with these deep mantle plumes provide ideal constraints for inverting absolute plate motions as well as testing the fixed hotspot hypothesis. In this paper, 27 observed hotspot trends associated with 24 deep mantle plumes are used together with the MORVEL model for relative plate motions to determine an absolute plate motion model, in terms of a maximum likelihood optimization for angular data fitting, combined with an outlier data detection procedure based on statistical tests. The obtained T25M model fits 25 observed trends of globally distributed hotspot tracks to the statistically required level, while the other two hotspot trend data (Comores on Somalia and Iceland on Eurasia) are identified as outliers, which are significantly incompatible with other data. For most hotspots with rate data available, T25M predicts plate velocities significantly lower than the observed rates of hotspot volcanic migration, which cannot be fully explained by biased errors in observed rate data. Instead, the apparent hotspot motions derived by subtracting the observed hotspot migration velocities from the T25M plate velocities exhibit a combined pattern of being opposite to plate velocities and moving towards mid-ocean ridges. The newly estimated net rotation of the lithosphere is statistically compatible with three recent estimates, but differs significantly from 30 of 33 prior estimates.

  6. Enhancing the mathematical properties of new haplotype homozygosity statistics for the detection of selective sweeps.

    PubMed

    Garud, Nandita R; Rosenberg, Noah A

    2015-06-01

    Soft selective sweeps represent an important form of adaptation in which multiple haplotypes bearing adaptive alleles rise to high frequency. Most statistical methods for detecting selective sweeps from genetic polymorphism data, however, have focused on identifying hard selective sweeps in which a favored allele appears on a single haplotypic background; these methods might be underpowered to detect soft sweeps. Among exceptions is the set of haplotype homozygosity statistics introduced for the detection of soft sweeps by Garud et al. (2015). These statistics, examining frequencies of multiple haplotypes in relation to each other, include H12, a statistic designed to identify both hard and soft selective sweeps, and H2/H1, a statistic that, conditional on high H12 values, seeks to distinguish between hard and soft sweeps. A challenge in the use of H2/H1 is that its range depends on the associated value of H12, so that equal H2/H1 values might provide different levels of support for a soft sweep model at different values of H12. Here, we enhance the H12 and H2/H1 haplotype homozygosity statistics for selective sweep detection by deriving the upper bound on H2/H1 as a function of H12, thereby generating a statistic that normalizes H2/H1 to lie between 0 and 1. Through a reanalysis of resequencing data from inbred lines of Drosophila, we show that the enhanced statistic both strengthens interpretations obtained with the unnormalized statistic and leads to empirical insights that are less readily apparent without the normalization. Copyright © 2015 Elsevier Inc. All rights reserved.
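
    A hedged sketch of the underlying haplotype homozygosity statistics, following the definitions in Garud et al. (2015): H1 = sum p_i^2, H12 = (p1 + p2)^2 + sum over i >= 3 of p_i^2, and H2 = H1 - p1^2. The frequency vectors are toy examples.

        import numpy as np

        def haplotype_stats(freqs):
            p = np.sort(np.asarray(freqs))[::-1]   # descending haplotype freqs
            H1 = np.sum(p ** 2)
            H12 = (p[0] + p[1]) ** 2 + np.sum(p[2:] ** 2)
            H2 = H1 - p[0] ** 2
            return H1, H12, H2 / H1

        hard = [0.70, 0.06, 0.06, 0.06, 0.06, 0.06]  # one dominant haplotype
        soft = [0.40, 0.38, 0.06, 0.06, 0.05, 0.05]  # two sweeping haplotypes
        for name, f in [("hard", hard), ("soft", soft)]:
            H1, H12, ratio = haplotype_stats(f)
            print(f"{name}: H12 = {H12:.3f}, H2/H1 = {ratio:.3f}")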

  7. Unsupervised change detection of multispectral images based on spatial constraint chi-squared transform and Markov random field model

    NASA Astrophysics Data System (ADS)

    Shi, Aiye; Wang, Chao; Shen, Shaohong; Huang, Fengchen; Ma, Zhenli

    2016-10-01

    Chi-squared transform (CST), as a statistical method, can describe the difference degree between vectors. The CST-based methods operate directly on information stored in the difference image and are simple and effective methods for detecting changes in remotely sensed images that have been registered and aligned. However, the technique does not take spatial information into consideration, which leads to much noise in the result of change detection. An improved unsupervised change detection method is proposed based on spatial constraint CST (SCCST) in combination with a Markov random field (MRF) model. First, the mean and variance matrix of the difference image of bitemporal images are estimated by an iterative trimming method. In each iteration, spatial information is injected to reduce scattered changed points (also known as "salt and pepper" noise). To determine the key parameter of the SCCST method, the confidence level, a pseudotraining dataset is constructed to estimate its optimal value. Then, the result of SCCST, as an initial solution of change detection, is further improved by the MRF model. The experiments on simulated and real multitemporal and multispectral images indicate that the proposed method performs well in comprehensive indices compared with other methods.
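
    A hedged sketch of the chi-squared transform at the heart of such methods: the Mahalanobis norm of each difference-image pixel is approximately chi-square distributed (with one degree of freedom per band) under "no change", giving a natural threshold. The spatial-constraint and MRF refinements of the paper are omitted, and the image is synthetic.

        import numpy as np
        from scipy.stats import chi2

        rng = np.random.default_rng(11)
        H, W, B = 64, 64, 4
        diff = rng.normal(0, 1, (H, W, B))    # no-change difference image
        diff[20:30, 20:30] += 2.5             # implanted change region

        pix = diff.reshape(-1, B)
        mu = pix.mean(axis=0)
        cov = np.cov(pix, rowvar=False)
        md2 = np.einsum("ij,jk,ik->i", pix - mu, np.linalg.inv(cov), pix - mu)
        changed = (md2 > chi2.ppf(0.99, df=B)).reshape(H, W)
        print("flagged pixels:", int(changed.sum()), "of", H * W)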

  8. Glaucoma risk index: automated glaucoma detection from color fundus images.

    PubMed

    Bock, Rüdiger; Meier, Jörg; Nyúl, László G; Hornegger, Joachim; Michelson, Georg

    2010-06-01

    Glaucoma as a neurodegeneration of the optic nerve is one of the most common causes of blindness. Because revitalization of the degenerated nerve fibers of the optic nerve is impossible, early detection of the disease is essential. This can be supported by a robust and automated mass-screening. We propose a novel automated glaucoma detection system that operates on inexpensive-to-acquire and widely used digital color fundus images. After a glaucoma-specific preprocessing, different generic feature types are compressed by an appearance-based dimension reduction technique. Subsequently, a probabilistic two-stage classification scheme combines these feature types to extract the novel Glaucoma Risk Index (GRI) that shows a reasonable glaucoma detection performance. On a sample set of 575 fundus images a classification accuracy of 80% has been achieved in a 5-fold cross-validation setup. The GRI gains a competitive area under ROC (AUC) of 88% compared to the established topography-based glaucoma probability score of scanning laser tomography with AUC of 87%. The proposed color fundus image-based GRI achieves a competitive and reliable detection performance on a low-priced modality by the statistical analysis of entire images of the optic nerve head. Copyright (c) 2010 Elsevier B.V. All rights reserved.

  9. Remote Structural Health Monitoring and Advanced Prognostics of Wind Turbines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Douglas Brown; Bernard Laskowski

    The prospect of substantial investment in wind energy generation represents a significant capital investment strategy. In order to maximize the life-cycle of wind turbines, associated rotors, gears, and structural towers, a capability to detect and predict (prognostics) the onset of mechanical faults at a sufficiently early stage for maintenance actions to be planned would significantly reduce both maintenance and operational costs. Advancement towards this effort has been made through the development of anomaly detection, fault detection and fault diagnosis routines to identify selected fault modes of a wind turbine based on available sensor data preceding an unscheduled emergency shutdown. The anomaly detection approach employs spectral techniques to find an approximation of the data using a combination of attributes that capture the bulk of variability in the data. Fault detection and diagnosis (FDD) is performed using a neural network-based classifier trained from baseline and fault data recorded during known failure conditions. The approach has been evaluated for known baseline conditions and three selected failure modes: pitch rate failure, low oil pressure failure and a gearbox gear-tooth failure. Experimental results demonstrate the approach can distinguish between these failure modes and normal baseline behavior within a specified statistical accuracy.

  10. Automated thematic mapping and change detection of ERTS-A images. [digital interpretation of Arizona imagery

    NASA Technical Reports Server (NTRS)

    Gramenopoulos, N. (Principal Investigator)

    1973-01-01

    The author has identified the following significant results. For the recognition of terrain types, spatial signatures are developed from the diffraction patterns of small areas of ERTS-1 images. This knowledge is exploited for the measurements of a small number of meaningful spatial features from the digital Fourier transforms of ERTS-1 image cells containing 32 x 32 picture elements. Using these spatial features and a heuristic algorithm, the terrain types in the vicinity of Phoenix, Arizona were recognized by the computer with a high accuracy. Then, the spatial features were combined with spectral features and using the maximum likelihood criterion the recognition accuracy of terrain types increased substantially. It was determined that the recognition accuracy with the maximum likelihood criterion depends on the statistics of the feature vectors. Nonlinear transformations of the feature vectors are required so that the terrain class statistics become approximately Gaussian. It was also determined that for a given geographic area the statistics of the classes remain invariable for a period of a month but vary substantially between seasons.

  11. Pathogenicity of facultative and obligate anaerobic bacteria in monoculture and combined with either Prevotella intermedia or Prevotella nigrescens.

    PubMed

    Siqueira, J F; Magalhães, F A; Lima, K C; de Uzeda, M

    1998-12-01

    The pathogenicity of obligate and facultative anaerobic bacteria commonly found in endodontic infections was tested using a mouse model. The capacity of inducing abscesses was evaluated seven days after subcutaneous injection of the bacteria in pure culture and in combinations with either Prevotella intermedia or Prevotella nigrescens. Nine of the fifteen bacterial strains tested were pathogenic in pure culture. No statistically significant differences were detected between these strains in pure culture and in mixtures with either P. intermedia or P. nigrescens. Synergism between the bacterial strains was only apparent when associating Porphyromonas endodontalis with P. intermedia or P. nigrescens. Histopathological examination of tissue sections from induced abscesses revealed an acute inflammatory reaction, dominated by polymorphonuclear leukocytes. Sections from the control group using sterile medium showed no evidence of inflammatory reaction.

  12. Texture and haptic cues in slant discrimination: reliability-based cue weighting without statistically optimal cue combination

    NASA Astrophysics Data System (ADS)

    Rosas, Pedro; Wagemans, Johan; Ernst, Marc O.; Wichmann, Felix A.

    2005-05-01

    A number of models of depth-cue combination suggest that the final depth percept results from a weighted average of independent depth estimates based on the different cues available. The weight of each cue in such an average is thought to depend on the reliability of each cue. In principle, such a depth estimation could be statistically optimal in the sense of producing the minimum-variance unbiased estimator that can be constructed from the available information. Here we test such models by using visual and haptic depth information. Different texture types produce differences in slant-discrimination performance, thus providing a means for testing a reliability-sensitive cue-combination model with texture as one of the cues to slant. Our results show that the weights for the cues were generally sensitive to their reliability but fell short of statistically optimal combination - we find reliability-based reweighting but not statistically optimal cue combination.
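
    A hedged sketch of the minimum-variance ("statistically optimal") cue combination that such models test against: each cue's weight is proportional to the inverse variance of its single-cue estimate. The slant values and variances below are illustrative only.

        import numpy as np

        def optimal_combination(estimates, variances):
            v = np.asarray(variances, dtype=float)
            w = (1.0 / v) / np.sum(1.0 / v)        # inverse-variance weights
            combined = float(np.dot(w, estimates))
            combined_var = 1.0 / np.sum(1.0 / v)   # variance of the combination
            return combined, combined_var, w

        # Example: haptic slant 30 deg (var 4), texture-based slant 38 deg (var 16)
        slant, var, w = optimal_combination([30.0, 38.0], [4.0, 16.0])
        print(f"weights = {w.round(2)}, combined slant = {slant:.1f} deg, var = {var:.1f}")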

  13. Statistical Track-Before-Detect Methods Applied to Faint Optical Observations of Resident Space Objects

    NASA Astrophysics Data System (ADS)

    Fujimoto, K.; Yanagisawa, T.; Uetsuhara, M.

    Automated detection and tracking of faint objects in optical, or bearing-only, sensor imagery is a topic of immense interest in space surveillance. Robust methods in this realm will lead to better space situational awareness (SSA) while reducing the cost of sensors and optics. They are especially relevant in the search for high area-to-mass ratio (HAMR) objects, as their apparent brightness can change significantly over time. A track-before-detect (TBD) approach has been shown to be suitable for faint, low signal-to-noise ratio (SNR) images of resident space objects (RSOs). TBD does not rely upon the extraction of feature points within the image based on some thresholding criteria, but rather directly takes as input the intensity information from the image file. Not only is all of the available information from the image used, but TBD also avoids the computational intractability of the conventional feature-based line detection (i.e., "string of pearls") approach to track detection for low SNR data. An implementation of TBD rooted in finite set statistics (FISST) theory has recently been proposed by Vo et al. Compared to other TBD methods applied so far to SSA, such as the stacking method or multi-pass multi-period denoising, the FISST approach is statistically rigorous and has been shown to be more computationally efficient, thus paving the path toward on-line processing. In this paper, we intend to apply a multi-Bernoulli filter to actual CCD imagery of RSOs. The multi-Bernoulli filter can explicitly account for the birth and death of multiple targets in a measurement arc. TBD is achieved via a sequential Monte Carlo implementation. Preliminary results with simulated single-target data indicate that a Bernoulli filter can successfully track and detect objects with measurement SNR as low as 2.4. Although the advent of fast-cadence scientific CMOS sensors has made the automation of faint object detection a realistic goal, it remains a difficult one, as measurement arcs in space surveillance are often both short and sparse. FISST methodologies have been applied to the general problem of SSA by many authors, but they generally focus on tracking scenarios with long arcs or assume that line detection is tractable. We instead focus this work on estimating the sensor-level kinematics of RSOs from low SNR, too-short arc observations. Once such an estimate is available, track association and simultaneous initial orbit determination may be achieved via any number of proposed solutions to the too-short arc problem, such as those incorporating the admissible region. We show that the benefit of combining FISST-based TBD with too-short arc association goes both ways: the former provides consistent statistics regarding bearing-only measurements, whereas the latter makes better use of the precise dynamical models nominally applicable to RSOs in orbit determination.

  14. A spatial scan statistic for multiple clusters.

    PubMed

    Li, Xiao-Zhou; Wang, Jin-Feng; Yang, Wei-Zhong; Li, Zhong-Jie; Lai, Sheng-Jie

    2011-10-01

    Spatial scan statistics are commonly used for geographical disease surveillance and cluster detection. When multiple clusters coexist in the study area, they become difficult to detect because of their shadowing effect on each other. The recently proposed sequential method showed better power for detecting the second, weaker cluster, but did not improve the ability to detect the first, stronger cluster, which is more important than the second one. We propose a new extension of the spatial scan statistic which can be used to detect multiple clusters. By constructing two or more clusters in the alternative hypothesis, our proposed method accounts for other coexisting clusters in the detection and evaluation process. The performance of the proposed method is compared to the sequential method through an intensive simulation study, in which our proposed method shows better power both in rejecting the null hypothesis and in accurately detecting the coexisting clusters. In a real study of hand-foot-mouth disease data in Pingdu city, our proposed method successfully detected a true cluster town that could not be evaluated as statistically significant by the standard method because of another cluster's shadowing effect. Copyright © 2011 Elsevier Inc. All rights reserved.
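
    The building block of these variants is the single-zone likelihood ratio; a minimal sketch of the usual Poisson formulation (the multiple-cluster extension proposed here evaluates several such zones jointly in the alternative hypothesis):

    ```python
    import numpy as np

    def poisson_scan_llr(n_in, e_in, n_total, e_total):
        """Kulldorff log-likelihood ratio for one candidate zone:
        n_in cases observed versus e_in expected inside the zone."""
        n_out, e_out = n_total - n_in, e_total - e_in
        if n_in / e_in <= n_out / e_out:   # scan only for elevated rates
            return 0.0
        return n_in * np.log(n_in / e_in) + n_out * np.log(n_out / e_out)

    # The most likely cluster maximizes this LLR over all scanned zones;
    # significance is then assessed by Monte Carlo replication under the null.
    ```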

  15. Anomaly detection of turbopump vibration in Space Shuttle Main Engine using statistics and neural networks

    NASA Technical Reports Server (NTRS)

    Lo, C. F.; Wu, K.; Whitehead, B. A.

    1993-01-01

    Statistical and neural network methods were applied to investigate the feasibility of detecting anomalies in the turbopump vibration of the SSME. The anomalies are detected based on the amplitude of peaks of fundamental and harmonic frequencies in the power spectral density. These data are reduced to the proper format from sensor data measured by strain gauges and accelerometers. Both methods proved feasible for detecting the vibration anomalies. The statistical method requires sufficient data points to establish a reasonable statistical distribution data bank; it is applicable for on-line operation. The neural network method likewise needs enough data to train the networks. The testing procedure can be utilized at any time as long as the characteristics of the components remain unchanged.

  16. Web-based GIS for spatial pattern detection: application to malaria incidence in Vietnam.

    PubMed

    Bui, Thanh Quang; Pham, Hai Minh

    2016-01-01

    There is great concern about how to build an interoperable health information system spanning public health and health information technology within the development of public information and health surveillance programmes. Technically, some major issues remain regarding health data visualization, spatial processing of health data, health information dissemination, data sharing and the access of local communities to health information. In combination with GIS, we propose a technical framework for web-based health data visualization and spatial analysis. Data were collected from open map-servers and geocoded with the Open Data Kit package and data geocoding tools. The web-based system is designed on open-source frameworks and libraries. The system provides a web-based analysis tool for pattern detection through three spatial tests: nearest neighbour, K function, and spatial autocorrelation. The result is a web-based GIS through which end users can detect disease patterns by selecting an area and spatial test parameters, and can communicate findings to managers and decision makers. The end users can be health practitioners, educators, local communities, health sector authorities and decision makers. This web-based system allows for the improvement of health-related services to public sector users as well as citizens in a secure manner. The combination of spatial statistics and web-based GIS can be a solution that helps empower health practitioners in direct and specific intersectional actions, thus providing for better analysis, control and decision-making.
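
    Of the three spatial tests named, global spatial autocorrelation is the most compact to sketch; a minimal Moran's I, assuming a precomputed spatial weights matrix (names illustrative):

    ```python
    import numpy as np

    def morans_i(values, weights):
        """Global Moran's I for area incidence `values` under a spatial
        weights matrix `weights` (w[i, j] > 0 where areas i and j are neighbors)."""
        x = np.asarray(values, dtype=float)
        w = np.asarray(weights, dtype=float)
        z = x - x.mean()
        return len(x) / w.sum() * (w * np.outer(z, z)).sum() / (z @ z)
    ```

    Values well above the null expectation of roughly -1/(n-1) indicate clustering of similar incidence rates; values near it indicate spatial randomness.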

  17. Presepsin (Scd14) as a Marker of Serious Bacterial Infections in Chemotherapy Induced Severe Neutropenia

    PubMed Central

    Olad, Elham; Sedighi, Iraj; Mehrvar, Azim; Tashvighi, Maryam; Fallahazad, Vahid; Hedayatiasl, Amirabbas; Esfahani, Hossein

    2014-01-01

    Objective: Timely detection of serious bacterial infections or prediction of sepsis and death is of paramount importance in neutropenic patients, especially in oncology settings. The aim of this study was to identify a rapid and reliable predictor of sepsis in severely neutropenic children with cancer. Methods: In addition to blood culture, we evaluated serum soluble CD14 (sCD14) for this role, measuring it in 39 neutropenic episodes in the Mahak pediatric oncology center from September 2012 to January 2013. Fifteen episodes had positive bacterial cultures and 18 had fever. The mean sCD14 values were compared in the presence or absence of fever, positive blood culture and other clinical conditions. Mean levels were also compared across different white cell counts and across the four combinations of fever and blood culture status. Findings: Mean sCD14 was statistically higher in febrile episodes; in the presence of oral mucositis, indwelling catheter infection, otitis media, and post-toxic-epidermal-necrolysis sepsis; and in instances of death within 15 days. Leukocyte count did not affect the sCD14 level, and across the combinations of fever and blood culture, mean sCD14 values were ranked as follows: febrile culture-negatives, febrile culture-positives, afebrile culture-positives and afebrile culture-negatives. Conclusion: Although sCD14 was not sensitive in the detection of bacteremia, in the absence of a clinically detectable source of infection it was significantly higher in culture-positives. PMID:26019777

  18. Detection of nonlinear transfer functions by the use of Gaussian statistics

    NASA Technical Reports Server (NTRS)

    Sheppard, J. G.

    1972-01-01

    The possibility of using on-line signal statistics to detect electronic equipment nonlinearities is discussed. The results of an investigation using Gaussian statistics are presented, and a nonlinearity test that uses ratios of the moments of a Gaussian random variable is developed and discussed. An outline for further investigation is presented.
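
    The core observation can be illustrated directly: for a zero-mean Gaussian input, the fourth-to-second moment ratio E[x⁴]/E[x²]² equals 3 and is preserved by any linear transfer function, so a departure of the output ratio from 3 flags a nonlinearity. A hedged sketch (the limiter and all parameters are illustrative, not the report's test setup):

    ```python
    import numpy as np

    def moment_ratio(x):
        """Fourth-to-second moment ratio; 3.0 for a zero-mean Gaussian."""
        x = np.asarray(x, dtype=float) - np.mean(x)
        return np.mean(x**4) / np.mean(x**2) ** 2

    rng = np.random.default_rng(0)
    drive = rng.normal(size=100_000)           # Gaussian test signal
    print(moment_ratio(2.0 * drive))           # linear gain: ratio stays near 3
    print(moment_ratio(np.clip(2.0 * drive, -3, 3)))  # soft limiter: ratio < 3
    ```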

  19. Fusion of Remote Sensing Methods, UAV Photogrammetry and LiDAR Scanning products for monitoring fluvial dynamics

    NASA Astrophysics Data System (ADS)

    Lendzioch, Theodora; Langhammer, Jakub; Hartvich, Filip

    2015-04-01

    Fusion of remote sensing data is a common and rapidly developing discipline which combines data from multiple sources with different spatial and spectral resolution, from satellite sensors, aircraft and ground platforms. Fused data contain more detailed information than any single source, enhance the interpretation performance and accuracy of the source data, and produce a high-quality visualisation of the final data. In fluvial geomorphology especially, it is essential to obtain imagery at sub-meter resolution in order to derive high-quality 2D and 3D information for detailed identification, extraction and description of channel features of different river regimes and to perform rapid mapping of changes in river topography. In order to design, test and evaluate a new approach for the detection of river morphology, we combine different research techniques, from remote sensing products to drone-based photogrammetry and LiDAR products (aerial LiDAR scanner and TLS). Topographic information (e.g. changes in river channel morphology, surface roughness, evaluation of floodplain inundation, mapping of gravel bars and slope characteristics) will be extracted either from a single layer or from combined layers in order to detect fluvial topographic changes before and after flood events. Besides statistical approaches for predictive geomorphological mapping and the determination of errors and uncertainties in the data, we will also provide 3D modelling of small fluvial features.

  20. Evaluation of a combined triple method to detect causative HPV in oral and oropharyngeal squamous cell carcinomas: p16 Immunohistochemistry, Consensus PCR HPV-DNA, and In Situ Hybridization

    PubMed Central

    2012-01-01

    Background Recent emerging evidence identifies Human Papillomavirus (HPV)-related Head and Neck squamous cell carcinomas (HN-SCCs) as a separate subgroup among Head and Neck Cancers, with different epidemiology, histopathological characteristics, therapeutic response to chemo-radiation treatment and clinical outcome. However, there is no worldwide consensus on the methods to be used in clinical practice. The endpoint of this study was to demonstrate the reliability of a triple method combining evaluation of: 1. p16 protein expression by immunohistochemistry (p16-IHC); 2. HPV-DNA genotyping by consensus HPV-DNA PCR methods (Consensus PCR); and 3. viral integration into the host by the in situ hybridization method (ISH). This triple method was applied to HN-SCCs originating from the oral cavity (OSCC) and oropharynx (OPSCC), the two anatomical sites in which high-risk (HR) HPVs have been clearly implicated as etiologic factors. Methylation-Specific PCR (MSP) was performed to study inactivation of the p16-CDKN2a locus by epigenetic events. Reliability across the multiple methods was measured by kappa statistics. Results All the HN-SCCs confirmed HPV positive by PCR and/or ISH were also p16 positive by IHC, with the latter showing a very high level of sensitivity as a single test (100% in both OSCC and OPSCC) but a lower level of specificity (74% in OSCC and 93% in OPSCC). Concordance analysis between ISH and Consensus PCR showed fair agreement in OPSCC (κ = 0.38) and moderate agreement in OSCC (κ = 0.44). Furthermore, requiring a double positive score (ISH positive and Consensus PCR positive) significantly increased the specificity of HR-HPV detection on formalin-fixed paraffin-embedded (FFPE) samples (100% in OSCC and 78.5% in OPSCC), but reduced the sensitivity (33% in OSCC and 60% in OPSCC). The reduction in sensitivity with the double method was compensated by the very high sensitivity of p16-IHC detection in the triple approach. Conclusions Although HR-HPV detection is of utmost importance in clinical settings for Head and Neck Cancer patients, there is no consensus on which of the numerous detection methods available, either as single tests or in combination, should be considered the 'gold standard'. Until recently, quantitative E6 RNA PCR was considered the 'gold standard', since it was demonstrated to have a very high accuracy level and very high statistical significance in association with prognostic parameters. By contrast, quantitative E6 DNA PCR has proven to have a very high level of accuracy but a weaker prognostic association with clinical outcome than the HPV E6 oncoprotein RNA PCR. Although it is theoretically possible to perform quantitative PCR detection on FFPE samples as well, these methods reach their maximum accuracy on fresh-frozen tissue, and diagnostic laboratories worldwide do not all have the same ability to analyze both FFPE and fresh tissues simultaneously with these quantitative molecular detection methods. Therefore, in current clinical practice a p16-IHC test is considered sufficient for HPV diagnosis, in accordance with the recently published Head and Neck Cancer international guidelines. Although p16-IHC may serve as a good prognostic indicator, our study clearly demonstrated that it is not satisfactory when used exclusively as the sole HPV detection method. Adding ISH, although known to be less sensitive than PCR-based detection methods, has the advantage of preserving the morphological context of HPV-DNA signals in FFPE samples and thus increases the overall specificity of the p16/Consensus PCR combination tests. PMID:22376902
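
    For reference, the kappa statistic used above for the ISH/Consensus PCR concordance analysis reduces, for two binary tests, to a few lines; a minimal sketch (not the paper's code):

    ```python
    import numpy as np

    def cohens_kappa(a, b):
        """Cohen's kappa for two binary tests scored over the same samples."""
        a, b = np.asarray(a), np.asarray(b)
        p_obs = np.mean(a == b)
        pa, pb = a.mean(), b.mean()
        p_exp = pa * pb + (1 - pa) * (1 - pb)   # chance agreement from marginals
        return (p_obs - p_exp) / (1 - p_exp)
    ```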

  1. Fusion of local and global detection systems to detect tuberculosis in chest radiographs.

    PubMed

    Hogeweg, Laurens; Mol, Christian; de Jong, Pim A; Dawson, Rodney; Ayles, Helen; van Ginneken, Bram

    2010-01-01

    Automatic detection of tuberculosis (TB) on chest radiographs is a difficult problem because of the diverse presentation of the disease. A combination of detection systems for abnormalities and normal anatomy is used to improve detection performance. A textural abnormality detection system operating at the pixel level is combined with a clavicle detection system to suppress false positive responses. The output of a shape abnormality detection system operating at the image level is combined in a subsequent step to further improve performance by reducing false negatives. Strategies for combining systems based on serial and parallel configurations were evaluated using the minimum, maximum, product, and mean probability combination rules. The performance of TB detection, as measured by the area under the ROC curve, increased from 0.67 for the textural abnormality detection system alone to 0.86 when the three systems were combined. The best result was achieved using the sum and product rules in a parallel combination of outputs.
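
    The fixed combination rules evaluated above are one-liners over the per-system output probabilities; a minimal sketch, with `probabilities` holding one column per detection system (names illustrative):

    ```python
    import numpy as np

    def combine(probabilities, rule="mean"):
        """Fuse per-system abnormality probabilities with a fixed rule."""
        p = np.asarray(probabilities, dtype=float)
        return {
            "min": p.min(axis=1),       # conservative: all systems must agree
            "max": p.max(axis=1),       # sensitive: any single system suffices
            "product": p.prod(axis=1),
            "mean": p.mean(axis=1),
        }[rule]
    ```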

  2. An automated multi-scale network-based scheme for detection and location of seismic sources

    NASA Astrophysics Data System (ADS)

    Poiata, N.; Aden-Antoniow, F.; Satriano, C.; Bernard, P.; Vilotte, J. P.; Obara, K.

    2017-12-01

    We present a recently developed method, BackTrackBB (Poiata et al. 2016), that allows imaging of energy radiation from different seismic sources (e.g., earthquakes, LFEs, tremors) in different tectonic environments using continuous seismic records. The method exploits multi-scale, frequency-selective coherence in the wave field recorded by regional seismic networks or local arrays. The detection and location scheme is based on space-time reconstruction of the seismic sources through an imaging function built from the sum of station-pair time-delay likelihood functions, projected onto theoretical 3D time-delay grids. This imaging function is interpreted as the location likelihood of the seismic source. A signal pre-processing step constructs a multi-band statistical representation of the non-stationary signal, i.e. the time series, by means of higher-order statistics or energy-envelope characteristic functions. Such signal processing is designed to detect signal transients in time - of different scales and a priori unknown predominant frequency - potentially associated with a variety of sources (e.g., earthquakes, LFEs, tremors), and to improve the performance and robustness of the detection-and-location step. The initial detection-location, based on a single-phase analysis with the P- or S-phase only, can then be improved recursively in a station selection scheme. This scheme - exploiting the 3-component records - makes use of P- and S-phase characteristic functions, extracted after a polarization analysis of the event waveforms, and combines the single-phase imaging functions with the S-P differential imaging functions. The performance of the method is demonstrated here in different tectonic environments: (1) analysis of the one-year-long precursory phase of the 2014 Iquique earthquake in Chile; (2) detection and location of tectonic tremor sources and low-frequency earthquakes during multiple episodes of tectonic tremor activity in southwestern Japan.

  3. Linear and Order Statistics Combiners for Pattern Classification

    NASA Technical Reports Server (NTRS)

    Tumer, Kagan; Ghosh, Joydeep; Lau, Sonie (Technical Monitor)

    2001-01-01

    Several researchers have experimentally shown that substantial improvements can be obtained in difficult pattern recognition problems by combining or integrating the outputs of multiple classifiers. This chapter provides an analytical framework to quantify the improvements in classification results due to combining. The results apply to both linear combiners and order statistics combiners. We first show that, to a first-order approximation, the error rate obtained over and above the Bayes error rate is directly proportional to the variance of the actual decision boundaries around the Bayes optimum boundary. Combining classifiers in output space reduces this variance, and hence reduces the 'added' error. If N unbiased classifiers are combined by simple averaging, the added error rate can be reduced by a factor of N if the individual errors in approximating the decision boundaries are uncorrelated. Expressions are then derived for linear combiners which are biased or correlated, and the effect of output correlations on ensemble performance is quantified. For order statistics based non-linear combiners, we derive expressions that indicate how much the median, the maximum and, in general, the i-th order statistic can improve classifier performance. The analysis presented here facilitates the understanding of the relationships among error rates, classifier boundary distributions, and combining in output space. Experimental results on several public domain data sets are provided to illustrate the benefits of combining and to support the analytical results.
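
    The 1/N variance reduction for uncorrelated, unbiased combiners is easy to check numerically; a synthetic illustration (not the chapter's experiments):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N, trials = 10, 100_000
    # Each classifier's boundary estimate = true boundary + independent error.
    errors = rng.normal(scale=1.0, size=(trials, N))
    print(errors[:, 0].var() / errors.mean(axis=1).var())      # approx N = 10
    # An order-statistic combiner replaces the mean with, e.g., the median;
    # the chapter derives how much each order statistic reduces the added error.
    print(errors[:, 0].var() / np.median(errors, axis=1).var())
    ```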

  4. Multimodal fiber-probe spectroscopy for the diagnostics and classification of bladder tumors

    NASA Astrophysics Data System (ADS)

    Anand, Suresh; Cicchi, Riccardo; Fantechi, Riccardo; Gacci, Mauro; Nesi, Gabriella; Carini, Marco; Pavone, Francesco S.

    2017-02-01

    The gold standard for the detection of bladder cancer is white light cystoscopy, followed by an invasive biopsy and pathological examination. Tissue pathology is time consuming and often prone to sampling errors. Recently, optical spectroscopy techniques have emerged as promising tools for the detection of neoplasia. The specific goal of this study is to evaluate the application of combined autofluorescence (excited using 378 nm and 445 nm wavelengths) and diffuse reflectance spectroscopy to discriminate normal bladder tissue from tumors of different grades. The fluorescence spectra at both excitation wavelengths showed increased spectral intensity in tumors with respect to normal tissues. Reflectance data indicated increased reflectance in the wavelength range 610-700 nm for different grades of tumors compared to normal tissues. The spectral data were further analyzed using principal component analysis to evaluate the sensitivity and specificity for diagnosing tumors. The spectral differences observed between various grades of tumors provide a strong basis for future evaluation on a larger patient population to achieve statistical significance. This study indicates that a combined spectroscopic strategy, incorporating fluorescence and reflectance spectroscopy, could improve the capability for diagnosing bladder tumors as well as for differentiating tumors of different grades.

  5. [Genetic polymorphism and forensic application of 30 InDel loci of Han population in Beijing].

    PubMed

    Bai, Ru-Feng; Jiang, Li-Zhe; Zhang, Zhong; Shi, Mei-Sen

    2013-12-01

    To study the genetic diversity of 30 insertion-deletion (InDel) polymorphic loci in the Han population in Beijing and to evaluate their forensic application, 210 unrelated healthy individuals of the Han population in Beijing were investigated to determine the distributions of allele frequencies using the Investigator DIP system. The PCR products were detected with an ABI 3130 XL Genetic Analyzer. Forensic parameters were calculated with relevant statistical analysis software. As a result, after Bonferroni correction at a 95% significance level, there were no significant departures from Hardy-Weinberg equilibrium and no significant linkage disequilibrium between the loci. The power of discrimination (DP) varied between 0.2690 (HLD118) and 0.6330 (HLD45), and the combined discrimination power (TDP) for the 30 InDel loci was 0.999999999985. The combined power of exclusion was 0.98771049 in trio cases (CPE(trio)) and 0.94579456 in duo cases (CPE(duo)). Parentage testing of 32 cases revealed no mutations at the 30 InDel loci. Multiplex detection of the 30 InDel loci revealed a highly polymorphic genetic distribution in the Beijing Han population, which represents a complementary tool in human identification studies, especially in challenging DNA cases.
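
    For the combined parameters quoted above, the standard convention is that the combined power over independent loci is one minus the product of the per-locus complements; a minimal sketch of that arithmetic (illustrative, not the authors' software):

    ```python
    import numpy as np

    def combined_power(per_locus_powers):
        """Combined power (e.g. TDP from per-locus DP values), assuming
        independent loci: 1 - prod(1 - power_i)."""
        p = np.asarray(per_locus_powers, dtype=float)
        return 1.0 - np.prod(1.0 - p)
    ```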

  6. [Preliminary study of bonding strength between diatomite-based dental ceramic and veneering porcelains].

    PubMed

    Lu, Xiao-li; Gao, Mei-qin; Cheng, Yu-ye; Zhang, Fei-min

    2015-04-01

    In order to choose the best veneering porcelain for a diatomite-based dental ceramic substrate, the bonding strength between diatomite-based dental ceramics and veneering porcelains was measured, and the microstructure and element distribution of the interface were analyzed. The coefficient of thermal expansion (CTE) of the diatomite-based dental ceramics was determined by dilatometry. Three veneering porcelain materials with the best CTE match were selected: alumina veneering porcelain (group A), titanium veneering porcelain (group B), and E-max veneering porcelain (group C). Shear bonding strength was measured, and SEM and EDS were used to observe the interface microstructure and element distribution. Statistical analysis was performed using the SPSS 17.0 software package. The CTE of the diatomite-based dental ceramics at 25-500 degrees centigrade was 8.85×10⁻⁶ K⁻¹. The diatomite-based substrate ceramics combined best with group C. Shear bonding strength differed significantly between groups A and C and between groups B and C (P<0.05). SEM and EDS showed that the interface of group C was tightly sintered and that elements permeated both sides of the interface. The diatomite-based substrate ceramics thus combine best with the E-max veneering porcelain.

  7. Dynamics of hepatitis C under optimal therapy and sampling based analysis

    NASA Astrophysics Data System (ADS)

    Pachpute, Gaurav; Chakrabarty, Siddhartha P.

    2013-08-01

    We examine two models for hepatitis C viral (HCV) dynamics, one for monotherapy with interferon (IFN) and the other for combination therapy with IFN and ribavirin. Optimal therapy for both models is determined using the steepest gradient method, by defining an objective functional which minimizes infected hepatocyte levels, the virion population and the side effects of the drug(s). The optimal therapies for both models show an initial period of high efficacy, followed by a gradual decline. The period of high efficacy coincides with a significant decrease in the viral load, whereas the efficacy drops after hepatocyte levels are restored. We use the Latin hypercube sampling technique to randomly generate a large number of patient scenarios and study the dynamics of each set under the optimal therapy already determined. Results show an increase in the percentage of responders (indicated by a drop in viral load below detection levels) with combination therapy (72%) as compared to monotherapy (57%). Statistical tests performed to study correlations between sample parameters and the time required for the viral load to fall below the detection level show a strong monotonic correlation with the death rate of infected hepatocytes, identifying it as an important factor in deciding individual drug regimens.

  8. Statistical limitations in functional neuroimaging. I. Non-inferential methods and statistical models.

    PubMed Central

    Petersson, K M; Nichols, T E; Poline, J B; Holmes, A P

    1999-01-01

    Functional neuroimaging (FNI) provides experimental access to the intact living brain, making it possible to study higher cognitive functions in humans. In this review and in a companion paper in this issue, we discuss some common methods used to analyse FNI data. The emphasis in both papers is on the assumptions and limitations of the methods reviewed. Several methods are available to analyse FNI data, indicating that none is optimal for all purposes. In order to make optimal use of the methods available, it is important to know their limits of applicability. For the interpretation of FNI results it is also important to take into account the assumptions, approximations and inherent limitations of the methods used. This paper gives a brief overview of some non-inferential descriptive methods and common statistical models used in FNI. Issues relating to the complex problem of model selection are discussed. In general, proper model selection is a necessary prerequisite for the validity of the subsequent statistical inference. The non-inferential section describes methods that, combined with inspection of parameter estimates and other simple measures, can aid in the process of model selection and verification of assumptions. The section on statistical models covers approaches to global normalization and some aspects of univariate, multivariate, and Bayesian models. Finally, approaches to functional connectivity and effective connectivity are discussed. In the companion paper we review issues related to signal detection and statistical inference. PMID:10466149

  9. Relative risk estimates from spatial and space-time scan statistics: Are they biased?

    PubMed Central

    Prates, Marcos O.; Kulldorff, Martin; Assunção, Renato M.

    2014-01-01

    The purely spatial and space-time scan statistics have been successfully used by many scientists to detect and evaluate geographical disease clusters. Although the scan statistic has high power in correctly identifying a cluster, no study has considered the estimates of the cluster relative risk in the detected cluster. In this paper we evaluate whether there is any bias in these estimated relative risks. Intuitively, one may expect the estimated relative risks to have an upward bias, since the scan statistic cherry-picks high-rate areas to include in the cluster. We show that this intuition is correct for clusters with low statistical power, but with medium to high power the bias becomes negligible. The same behaviour is not observed for the prospective space-time scan statistic, where there is an increasingly conservative downward bias of the relative risk as the power to detect the cluster increases. PMID:24639031

  10. Autoregressive statistical pattern recognition algorithms for damage detection in civil structures

    NASA Astrophysics Data System (ADS)

    Yao, Ruigen; Pakzad, Shamim N.

    2012-08-01

    Statistical pattern recognition has recently emerged as a promising set of complementary methods to system identification for automatic structural damage assessment. Its essence is to use well-known concepts in statistics to define the boundaries of different pattern classes, such as those for damaged and undamaged structures. In this paper, several statistical pattern recognition algorithms using autoregressive models, including statistical control charts and hypothesis testing, are reviewed as potentially competitive damage detection techniques. To enhance the performance of statistical methods, new feature extraction techniques using model spectra and residual autocorrelation, together with resampling-based threshold construction methods, are proposed. Subsequently, simulated acceleration data from a multi-degree-of-freedom system are generated to test and compare the efficiency of the existing and proposed algorithms. Data from laboratory experiments conducted on a truss and a large-scale bridge slab model are then used to further validate the damage detection methods and demonstrate the superior performance of the proposed algorithms.
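
    A minimal sketch of the autoregressive core of such methods, under illustrative assumptions (least-squares AR fit, Shewhart-style residual limit; not the authors' exact algorithms):

    ```python
    import numpy as np

    def fit_ar(x, p):
        """Least-squares AR(p) fit: x[t] ~ sum_k a[k] * x[t-1-k].
        Returns coefficients and one-step-ahead residuals."""
        X = np.column_stack([x[p - k - 1:-k - 1] for k in range(p)])
        y = x[p:]
        a, *_ = np.linalg.lstsq(X, y, rcond=None)
        return a, y - X @ a

    # Stand-in for acceleration measured on the undamaged structure.
    healthy = np.random.default_rng(0).normal(size=5000)
    coeffs, residuals = fit_ar(healthy, p=10)
    limit = 3 * residuals.std()   # simple control-chart limit on residuals
    # New data are filtered with the baseline coefficients; residuals that
    # repeatedly exceed the limit indicate a change in the structure.
    ```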

  11. Sensitivity to structure in action sequences: An infant event-related potential study.

    PubMed

    Monroy, Claire D; Gerson, Sarah A; Domínguez-Martínez, Estefanía; Kaduk, Katharina; Hunnius, Sabine; Reid, Vincent

    2017-05-06

    Infants are sensitive to structure and patterns within continuous streams of sensory input. This sensitivity relies on statistical learning, the ability to detect predictable regularities in spatial and temporal sequences. Recent evidence has shown that infants can detect statistical regularities in action sequences they observe, but little is known about the neural processes that give rise to this ability. In the current experiment, we combined electroencephalography (EEG) with eye-tracking to identify electrophysiological markers that indicate whether 8-11-month-old infants detect violations of learned regularities in action sequences, and to relate these markers to behavioral measures of anticipation during learning. In a learning phase, infants observed an actor performing a sequence featuring two deterministic pairs embedded within an otherwise random sequence. Thus, the first action of each pair was predictive of what would occur next. One of the pairs caused an action-effect, whereas the second did not. In a subsequent test phase, infants observed another sequence that included deviant pairs, violating the previously observed action pairs. Event-related potential (ERP) responses were analyzed and compared between the deviant and the original action pairs. Findings reveal that infants demonstrated a greater negative central (Nc) ERP response to the deviant actions for the pair that caused the action-effect, consistent with their visual anticipations during the learning phase. Findings are discussed in terms of the neural and behavioral processes underlying perception and learning of structured action sequences. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Quality Matters: Systematic Analysis of Endpoints Related to “Cellular Life” in Vitro Data of Radiofrequency Electromagnetic Field Exposure

    PubMed Central

    Simkó, Myrtill; Remondini, Daniel; Zeni, Olga; Scarfi, Maria Rosaria

    2016-01-01

    Possible hazardous effects of radiofrequency electromagnetic fields (RF-EMF) at low exposure levels are controversially discussed due to inconsistent study findings. Therefore, the main focus of the present study is to detect whether any statistical association exists between RF-EMF and cellular responses, considering the cell proliferation and apoptosis endpoints both separately and combined as a group of "cellular life" to increase the statistical power of the analysis. We searched the PubMed database for publications on RF-EMF in vitro studies for the period 1995-2014 and extracted the data for the relevant parameters, such as cell culture type, frequency, exposure duration, SAR, and five exposure-related quality criteria. These parameters were used for an association study with the experimental outcome in terms of the defined endpoints. We identified 104 published articles, from which 483 different experiments were extracted and analyzed. Cellular responses after exposure to RF-EMF were significantly associated with cell lines rather than with primary cells. No other experimental parameter was significantly associated with cellular responses. A highly significant negative association between exposure-condition quality and cellular responses was detected, showing that the more fully the quality criteria were satisfied, the smaller the number of detected cellular responses. To our knowledge, this is the first systematic analysis of specific RF-EMF bio-effects in association with exposure quality, highlighting the need for more stringent quality procedures for the exposure conditions. PMID:27420084

  13. Reliability of unstable periodic orbit based control strategies in biological systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mishra, Nagender; Singh, Harinder P.; Hasse, Maria

    2015-04-15

    Presence of recurrent and statistically significant unstable periodic orbits (UPOs) in time series obtained from biological systems is now routinely used as evidence for low dimensional chaos. Extracting accurate dynamical information from the detected UPO trajectories is vital for successful control strategies that either aim to stabilize the system near the fixed point or steer the system away from the periodic orbits. A hybrid UPO detection method from return maps that combines a topological recurrence criterion, a matrix fit algorithm, and a stringent criterion for fixed point location gives accurate and statistically significant UPOs even in the presence of significant noise. Geometry of the return map, frequency of UPOs visiting the same trajectory, length of the data set, strength of the noise, and degree of nonstationarity affect the efficacy of the proposed method. Results suggest that establishing determinism from unambiguous UPO detection is often possible in short data sets with significant noise, but derived dynamical properties are rarely accurate and adequate for controlling the dynamics around these UPOs. A repeat chaos control experiment on epileptic hippocampal slices through a more stringent control strategy and adaptive UPO tracking is reinterpreted in this context through simulation of similar control experiments on an analogous but stochastic computer model of epileptic brain slices. Reproduction of equivalent results suggests that far more stringent criteria are needed for linking the apparent success of control in such experiments with possible determinism in the underlying dynamics.

  14. Can There Ever Be Enough to Impact Water Quality? Evaluating BMPs in Elliot Ditch, Indiana Using the LTHIA-LID Model

    NASA Astrophysics Data System (ADS)

    Rahman, M. S.; Hoover, F. A.; Bowling, L. C.

    2017-12-01

    Elliot Ditch is an urban/urbanizing watershed located in the city of Lafayette, IN, USA. The city continues to struggle with stormwater management and combined sewer overflow (CSO) events. Several best-management practices (BMPs) such as rain gardens, green roofs, and bioswales have been implemented in the watershed, but the level of adoption needed to achieve meaningful impact is currently unknown. This study's goal is to determine what level of BMP coverage is needed to impact water quality, whether meaningful impact is defined as achieving water quality targets or as statistical significance. A power analysis was performed using water quality data for total suspended solids (TSS), E. coli, total phosphorus (TP) and nitrate (NO3-N) from Elliot Ditch from 2011 to 2015. The minimum detectable difference (MDD) was calculated as the percent reduction in load needed to detect a significant change in the watershed. The water quality targets were proposed by stakeholders as part of a watershed management planning process. The water quality targets and the MDD percentages were then compared to simulated load reductions due to BMP implementation using the Long-term Hydrologic Impact Assessment-Low Impact Development (LTHIA-LID) model. Seven baseline model scenarios were simulated by implementing the maximum number of each of six types of BMPs (rain barrels, permeable patios, green roofs, grassed swales/bioswales, bioretention/rain gardens, and porous pavement), as well as all the practices combined, in the watershed. These provide the baseline for targeted implementation scenarios designed to determine whether statistically and physically meaningful load reductions can be achieved through BMP implementation alone.

  15. Controlled, prospective, randomized, clinical split-mouth evaluation of partial ceramic crowns luted with a new, universal adhesive system/resin cement: results after 18 months.

    PubMed

    Vogl, Vanessa; Hiller, Karl-Anton; Buchalla, Wolfgang; Federlin, Marianne; Schmalz, Gottfried

    2016-12-01

    A new universal adhesive with a corresponding luting composite was recently marketed which can be used in either a self-etch or an etch-and-rinse mode. In this study, the clinical performance of partial ceramic crowns (PCCs) inserted with this adhesive and the corresponding luting material, used in a self-etch or selective-etch approach, was compared with that of a self-adhesive universal luting material. Three PCCs were placed in a split-mouth design in 50 patients. Two PCCs were luted with a combination of a universal adhesive/resin cement (Scotchbond Universal/RelyX Ultimate, 3M ESPE) with (SB+E)/without (SB-E) selective enamel etching. The third PCC was luted with a self-adhesive resin cement (RelyX Unicem 2, 3M ESPE; RXU2). Forty-eight patients were evaluated clinically according to FDI criteria at baseline and at 6, 12 and 18 months. For statistical analyses, the chi-square test (α = 0.05) and Kaplan-Meier analysis were applied. Clinically, no statistically significant differences between groups were detected over time. Within groups, a clinically significant increase in the criterion "marginal staining" was detected for SB-E over 18 months. Kaplan-Meier analysis revealed significantly higher retention rates for SB+E (97.8%) and SB-E (95.6%) in comparison to RXU2 (75.6%). The 18-month clinical performance of the new universal adhesive/composite combination showed no differences with respect to bonding strategy, and the combination may be recommended for luting PCCs. Longer-term evaluation is needed to confirm the superiority of SB+E over SB-E. At 18 months, the new multi-mode adhesive, Scotchbond Universal, showed clinically reliable results when used for luting PCCs.

  16. A novel Brain Computer Interface for classification of social joint attention in autism and comparison of 3 experimental setups: A feasibility study.

    PubMed

    Amaral, Carlos P; Simões, Marco A; Mouga, Susana; Andrade, João; Castelo-Branco, Miguel

    2017-10-01

    We present a novel virtual-reality P300-based Brain Computer Interface (BCI) paradigm using social cues to direct the focus of attention. We combined interactive immersive virtual-reality (VR) technology with the properties of P300 signals in a training tool which can be used in social attention disorders such as autism spectrum disorder (ASD). We tested the novel social attention training paradigm (a P300-based BCI paradigm for rehabilitation of joint-attention skills) in 13 healthy participants, with 3 EEG systems. The most suitable setup was then tested online with 4 ASD subjects. Classification accuracy was assessed based on the detection of the P300, using spatial filtering and a Naïve Bayes classifier. We compared: 1 - g.Mobilab+ (active dry electrodes, wireless transmission); 2 - g.Nautilus (active electrodes, wireless transmission); 3 - V-Amp with actiCAP Xpress dry electrodes. Statistically significant classification was achieved with all systems. g.Nautilus proved to be the best-performing system in terms of accuracy in the detection of the P300, preparation time, speed and reported comfort. Proof-of-concept tests in ASD participants showed that this setup is feasible for training joint-attention skills in ASD. This work provides a unique combination of easy-to-use BCI systems with new technologies such as VR to train joint-attention skills in autism. Our P300 BCI paradigm is feasible for future Phase I/II clinical trials to train joint-attention skills, with successful classification within few trials, online, in ASD participants. The g.Nautilus system is the best-performing one to use with the developed BCI setup. Copyright © 2017 Elsevier B.V. All rights reserved.
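
    The classification step described above (spatial filtering followed by a Naïve Bayes decision on P300 epochs) can be sketched with generic tools; the shapes and names below are hypothetical stand-ins for the filtered EEG features, not the authors' pipeline:

    ```python
    import numpy as np
    from sklearn.naive_bayes import GaussianNB

    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(400, 64))     # epochs x spatially filtered features
    y_train = rng.integers(0, 2, size=400)   # 1 = attended stimulus (P300 present)

    clf = GaussianNB().fit(X_train, y_train)
    p300_scores = clf.predict_proba(X_train)[:, 1]   # per-epoch P300 probability
    # Online, such scores are accumulated over stimulus repetitions and the
    # highest-scoring candidate target is selected.
    ```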

  17. Evaluation of the Gini Coefficient in Spatial Scan Statistics for Detecting Irregularly Shaped Clusters

    PubMed Central

    Kim, Jiyu; Jung, Inkyung

    2017-01-01

    Spatial scan statistics with circular or elliptic scanning windows are commonly used for cluster detection in various applications, such as the identification of geographical disease clusters from epidemiological data. It has been pointed out that the method may have difficulty in correctly identifying non-compact, arbitrarily shaped clusters. In this paper, we evaluated the Gini coefficient for detecting irregularly shaped clusters through a simulation study. The Gini coefficient, the use of which in spatial scan statistics was recently proposed, is a criterion measure for optimizing the maximum reported cluster size. Our simulation study results showed that using the Gini coefficient works better than the original spatial scan statistic for identifying irregularly shaped clusters, by reporting an optimized and refined collection of clusters rather than a single larger cluster. We have provided a real data example that seems to support the simulation results. We think that using the Gini coefficient in spatial scan statistics can be helpful for the detection of irregularly shaped clusters. PMID:28129368

  18. Statistical power to detect violation of the proportional hazards assumption when using the Cox regression model.

    PubMed

    Austin, Peter C

    2018-01-01

    The use of the Cox proportional hazards regression model is widespread. A key assumption of the model is that of proportional hazards. Analysts frequently test the validity of this assumption using statistical significance testing. However, the statistical power of such assessments is frequently unknown. We used Monte Carlo simulations to estimate the statistical power of two different methods for detecting violations of this assumption. When the covariate was binary, we found that a model-based method had greater power than a method based on cumulative sums of martingale residuals. Furthermore, the parametric nature of the distribution of event times had an impact on power when the covariate was binary. Statistical power to detect a strong violation of the proportional hazards assumption was low to moderate even when the number of observed events was high. In many data sets, power to detect a violation of this assumption is likely to be low to modest.

  19. Statistical power to detect violation of the proportional hazards assumption when using the Cox regression model

    PubMed Central

    Austin, Peter C.

    2017-01-01

    The use of the Cox proportional hazards regression model is widespread. A key assumption of the model is that of proportional hazards. Analysts frequently test the validity of this assumption using statistical significance testing. However, the statistical power of such assessments is frequently unknown. We used Monte Carlo simulations to estimate the statistical power of two different methods for detecting violations of this assumption. When the covariate was binary, we found that a model-based method had greater power than a method based on cumulative sums of martingale residuals. Furthermore, the parametric nature of the distribution of event times had an impact on power when the covariate was binary. Statistical power to detect a strong violation of the proportional hazards assumption was low to moderate even when the number of observed events was high. In many data sets, power to detect a violation of this assumption is likely to be low to modest. PMID:29321694

  20. Local multiplicity adjustment for the spatial scan statistic using the Gumbel distribution.

    PubMed

    Gangnon, Ronald E

    2012-03-01

    The spatial scan statistic is an important and widely used tool for cluster detection. It is based on the simultaneous evaluation of the statistical significance of the maximum likelihood ratio test statistic over a large collection of potential clusters. In most cluster detection problems, there is variation in the extent of local multiplicity across the study region. For example, using a fixed maximum geographic radius for clusters, urban areas typically have many overlapping potential clusters, whereas rural areas have relatively few. The spatial scan statistic does not account for local multiplicity variation. We describe a previously proposed local multiplicity adjustment based on a nested Bonferroni correction and propose a novel adjustment based on a Gumbel distribution approximation to the distribution of a local scan statistic. We compare the performance of all three statistics in terms of power and a novel unbiased cluster detection criterion. These methods are then applied to the well-known New York leukemia dataset and a Wisconsin breast cancer incidence dataset. © 2011, The International Biometric Society.
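
    The Gumbel-based adjustment rests on fitting a Gumbel distribution to the Monte Carlo null maxima of each local scan statistic and reading the local p-value from its upper tail; a minimal sketch (not the paper's implementation):

    ```python
    import numpy as np
    from scipy import stats

    def gumbel_pvalue(observed, null_maxima):
        """Upper-tail p-value of an observed local scan statistic, using a
        Gumbel fit to its Monte Carlo null maxima."""
        loc, scale = stats.gumbel_r.fit(np.asarray(null_maxima, dtype=float))
        return stats.gumbel_r.sf(observed, loc=loc, scale=scale)
    ```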

  1. Local multiplicity adjustment for the spatial scan statistic using the Gumbel distribution

    PubMed Central

    Gangnon, Ronald E.

    2011-01-01

    Summary The spatial scan statistic is an important and widely used tool for cluster detection. It is based on the simultaneous evaluation of the statistical significance of the maximum likelihood ratio test statistic over a large collection of potential clusters. In most cluster detection problems, there is variation in the extent of local multiplicity across the study region. For example, using a fixed maximum geographic radius for clusters, urban areas typically have many overlapping potential clusters, while rural areas have relatively few. The spatial scan statistic does not account for local multiplicity variation. We describe a previously proposed local multiplicity adjustment based on a nested Bonferroni correction and propose a novel adjustment based on a Gumbel distribution approximation to the distribution of a local scan statistic. We compare the performance of all three statistics in terms of power and a novel unbiased cluster detection criterion. These methods are then applied to the well-known New York leukemia dataset and a Wisconsin breast cancer incidence dataset. PMID:21762118

  2. 3D automatic anatomy recognition based on iterative graph-cut-ASM

    NASA Astrophysics Data System (ADS)

    Chen, Xinjian; Udupa, Jayaram K.; Bagci, Ulas; Alavi, Abass; Torigian, Drew A.

    2010-02-01

    We call the computerized assistive process of recognizing, delineating, and quantifying organs and tissue regions in medical imaging, occurring automatically during clinical image interpretation, automatic anatomy recognition (AAR). The AAR system we are developing includes five main parts: model building, object recognition, object delineation, pathology detection, and organ system quantification. In this paper, we focus on the delineation part. For the modeling part, we employ the active shape model (ASM) strategy. For recognition and delineation, we integrate several hybrid strategies that combine purely image-based methods with ASM. In this paper, an iterative Graph-Cut ASM (IGCASM) method is proposed for object delineation. An algorithm called GC-ASM, which attempted to synergistically combine ASM and GC, was presented at this symposium last year for object delineation in 2D images. Here, we extend this method to 3D medical image delineation. The IGCASM method effectively combines the rich statistical shape information embodied in ASM with the globally optimal delineation capability of the GC method. We propose a new GC cost function, which effectively integrates the specific image information with the ASM shape model information. The proposed methods are tested on a clinical abdominal CT data set. The preliminary results show that: (a) it is feasible to explicitly bring prior 3D statistical shape information into the GC framework; and (b) the 3D IGCASM delineation method improves on ASM and GC and can provide practical operational times on clinical images.

  3. Simulated performance of an order statistic threshold strategy for detection of narrowband signals

    NASA Technical Reports Server (NTRS)

    Satorius, E.; Brady, R.; Deich, W.; Gulkis, S.; Olsen, E.

    1988-01-01

    The application of order statistics to signal detection is becoming an increasingly active area of research. This is due to the inherent robustness of rank estimators in the presence of large outliers that would significantly degrade more conventional mean-level-based detection systems. A detection strategy is presented in which the threshold estimate is obtained using order statistics. The performance of this algorithm in the presence of simulated interference and broadband noise is evaluated. In this way, the robustness of the proposed strategy in the presence of the interference can be fully assessed as a function of the interference, noise, and detector parameters.
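
    A minimal sketch of the idea, under illustrative parameters (median-based threshold with a fixed scale factor; not the paper's exact strategy):

    ```python
    import numpy as np

    def order_statistic_threshold(spectrum, quantile=0.5, scale=3.0):
        """Detection threshold from an order statistic of the spectrum.
        The median is insensitive to strong narrowband outliers that would
        inflate a mean-level threshold."""
        return scale * np.quantile(spectrum, quantile)

    rng = np.random.default_rng(2)
    spectrum = np.abs(np.fft.rfft(rng.normal(size=4096))) ** 2
    detections = np.flatnonzero(spectrum > order_statistic_threshold(spectrum))
    ```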

  4. UWB pulse detection and TOA estimation using GLRT

    NASA Astrophysics Data System (ADS)

    Xie, Yan; Janssen, Gerard J. M.; Shakeri, Siavash; Tiberius, Christiaan C. J. M.

    2017-12-01

    In this paper, a novel statistical approach is presented for time-of-arrival (TOA) estimation based on first path (FP) pulse detection using a sub-Nyquist sampling ultra-wide band (UWB) receiver. The TOA measurement accuracy, which cannot be improved by averaging of the received signal, can be enhanced by the statistical processing of a number of TOA measurements. The TOA statistics are modeled and analyzed for a UWB receiver using threshold crossing detection of a pulse signal with noise. The detection and estimation scheme based on the Generalized Likelihood Ratio Test (GLRT) detector, which captures the full statistical information of the measurement data, is shown to achieve accurate TOA estimation and allows for a trade-off between the threshold level, the noise level, the amplitude and the arrival time of the first path pulse, and the accuracy of the obtained final TOA.

  5. Outliers in Questionnaire Data: Can They Be Detected and Should They Be Removed?

    ERIC Educational Resources Information Center

    Zijlstra, Wobbe P.; van der Ark, L. Andries; Sijtsma, Klaas

    2011-01-01

    Outliers in questionnaire data are unusual observations, which may bias statistical results, and outlier statistics may be used to detect such outliers. The authors investigated the effect outliers have on the specificity and the sensitivity of each of six different outlier statistics. The Mahalanobis distance and the item-pair based outlier…
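
    A minimal sketch of the classical (non-robust) Mahalanobis screening mentioned above, flagging respondents whose squared distance exceeds a chi-square critical value; parameters are illustrative:

    ```python
    import numpy as np
    from scipy import stats

    def mahalanobis_outliers(X, alpha=0.01):
        """Flag rows of X whose squared Mahalanobis distance from the mean
        exceeds the chi-square(1 - alpha) critical value."""
        X = np.asarray(X, dtype=float)
        diff = X - X.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
        d2 = np.einsum('ij,jk,ik->i', diff, cov_inv, diff)
        return d2 > stats.chi2.ppf(1 - alpha, df=X.shape[1])
    ```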

  6. Why Are People Bad at Detecting Randomness? A Statistical Argument

    ERIC Educational Resources Information Center

    Williams, Joseph J.; Griffiths, Thomas L.

    2013-01-01

    Errors in detecting randomness are often explained in terms of biases and misconceptions. We propose and provide evidence for an account that characterizes the contribution of the inherent statistical difficulty of the task. Our account is based on a Bayesian statistical analysis, focusing on the fact that a random process is a special case of…

  7. Decisions that Make a Difference in Detecting Differential Item Functioning

    ERIC Educational Resources Information Center

    Sireci, Stephen G.; Rios, Joseph A.

    2013-01-01

    There are numerous statistical procedures for detecting items that function differently across subgroups of examinees that take a test or survey. However, in endeavouring to detect items that may function differentially, selection of the statistical method is only one of many important decisions. In this article, we discuss the important decisions…

  8. Monitoring the soil degradation by Metastatistical Analysis

    NASA Astrophysics Data System (ADS)

    Oleschko, K.; Gaona, C.; Tarquis, A.

    2009-04-01

    The effectiveness of the fractal toolbox in capturing the critical behavior of soil structural patterns during chemical and physical degradation has been documented by our numerous experiments (Oleschko et al., 2008a; 2008b). The spatio-temporal dynamics of these patterns was measured and mapped with high precision in terms of fractal descriptors. All tested fractal techniques were able to detect statistically significant differences in structure between the perfect spongy and massive patterns of uncultivated and sodium-saline agricultural soils, respectively. For instance, the Hurst exponent, extracted from Chernozem micromorphological images and from time series of its physical and mechanical properties measured in situ, detected the decrease in roughness (and therefore the increase in H from 0.17 to 0.30 for images) derived from the loss of the original structural complexity. The combined use of different fractal descriptors brings statistical precision into the quantification of natural system degradation and provides a means for objective soil structure comparison (Oleschko et al., 2000). The ability of fractal parameters to capture critical behavior and phase transitions was documented for contrasting situations ranging from Andosol deforestation and erosion to Vertisol fracturing and consolidation. The Hurst exponent is used to measure the type of persistence and the degree of complexity of structure dynamics. We conclude that there is an urgent need to select and adopt a standardized toolbox for fractal analysis and complexity measures in the Earth Sciences. We propose to use second-order (meta-)statistics as subtle measures of complexity (Atmanspacher et al., 1997). A high degree of correlation was documented between the fractal descriptors and the higher-order statistical descriptors (the four central moments of a stochastic variable's distribution) used for the analysis of system heterogeneity and variability. We propose to call this combined fractal/statistical toolbox Metastatistical Analysis and recommend it for projects directed at soil degradation monitoring.
    References:
    1. Oleschko, K., Figueroa, B.S., Miranda, M.E., Vuelvas, M.A., Solleiro, E.R., 2000. Soil & Tillage Research 55, 43.
    2. Oleschko, K., Korvin, G., Figueroa, S.B., Vuelvas, M.A., Balankin, A., Flores, L., Carreño, D., 2003. Fractal radar scattering from soil. Physical Review E 67, 041403.
    3. Zamora-Castro, S., Oleschko, K., Flores, L., Ventura, E. Jr., Parrot, J.-F., 2008. Fractal mapping of pore and solid attributes. Vadose Zone Journal 7(2), 473-492.
    4. Oleschko, K., Korvin, G., Muñoz, A., Velásquez, J., Miranda, M.E., Carreon, D., Flores, L., Martínez, M., Velásquez-Valle, M., Brambilla, F., Parrot, J.-F., Ronquillo, G., 2008. Fractal mapping of soil moisture content from remote sensed multi-scale data. Nonlinear Processes in Geophysics 15, 711-725.
    5. Atmanspacher, H., Räth, Ch., Wiedenmann, G., 1997. Statistics and meta-statistics in the concept of complexity. Physica A 234, 819-829.
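
    The Hurst exponent quoted above can be estimated by rescaled-range (R/S) analysis; a minimal sketch, with window sizes as illustrative parameters:

    ```python
    import numpy as np

    def hurst_rs(x, window_sizes=(16, 32, 64, 128, 256)):
        """Hurst exponent via rescaled-range analysis: the slope of
        log(R/S) against log(window size)."""
        x = np.asarray(x, dtype=float)
        rs = []
        for w in window_sizes:
            chunks = x[: len(x) // w * w].reshape(-1, w)
            dev = np.cumsum(chunks - chunks.mean(axis=1, keepdims=True), axis=1)
            r = dev.max(axis=1) - dev.min(axis=1)   # range of cumulative deviations
            s = chunks.std(axis=1)                  # per-window standard deviation
            rs.append(np.mean(r / s))
        slope, _ = np.polyfit(np.log(window_sizes), np.log(rs), 1)
        return slope
    ```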

  9. Infants with Williams syndrome detect statistical regularities in continuous speech.

    PubMed

    Cashon, Cara H; Ha, Oh-Ryeong; Graf Estes, Katharine; Saffran, Jenny R; Mervis, Carolyn B

    2016-09-01

    Williams syndrome (WS) is a rare genetic disorder associated with delays in language and cognitive development. The reasons for the language delay are unknown. Statistical learning is a domain-general mechanism recruited for early language acquisition. In the present study, we investigated whether infants with WS were able to detect the statistical structure in continuous speech. Eighteen 8- to 20-month-olds with WS were familiarized with 2 min of a continuous stream of synthesized nonsense words; the statistical structure of the speech was the only cue to word boundaries. They were tested on their ability to discriminate statistically defined "words" and "part-words" (which crossed word boundaries) in the artificial language. Despite significant cognitive and language delays, infants with WS were able to detect the statistical regularities in the speech stream. These findings suggest that an inability to track the statistical properties of speech is unlikely to be the primary basis for the delays in the onset of language observed in infants with WS. These results provide the first evidence of statistical learning by infants with developmental delays. Copyright © 2016 Elsevier B.V. All rights reserved.
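
    The statistical structure referred to above is usually operationalized as forward transitional probabilities between adjacent syllables, which are high within words and low across word boundaries. The sketch below computes them for a toy stream; the nonsense words and the transition_probabilities helper are illustrative assumptions, not the study's stimuli.

```python
from collections import Counter

def transition_probabilities(syllables):
    """Forward transitional probabilities TP(a->b) = count(ab) / count(a)."""
    pair_counts = Counter(zip(syllables, syllables[1:]))
    first_counts = Counter(syllables[:-1])
    return {pair: c / first_counts[pair[0]] for pair, c in pair_counts.items()}

# Hypothetical nonsense words 'pabiku' and 'golatu' concatenated into a continuous stream
stream = "pa bi ku go la tu pa bi ku pa bi ku go la tu go la tu".split()
for pair, tp in sorted(transition_probabilities(stream).items()):
    print(pair, round(tp, 2))  # within-word pairs have TP 1.0; boundary pairs are lower
```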

  10. Landslide susceptibility mapping using GIS-based statistical models and Remote sensing data in tropical environment

    PubMed Central

    Shahabi, Himan; Hashim, Mazlan

    2015-01-01

    This research presents the results of GIS-based statistical models for landslide susceptibility mapping in the Cameron Highlands area of Malaysia, using geographic information system (GIS) and remote-sensing data. Ten factors including slope, aspect, soil, lithology, NDVI, land cover, distance to drainage, precipitation, distance to fault, and distance to road were extracted from SAR data, SPOT 5 and WorldView-1 images. The relationships between the detected landslide locations and these ten factors were identified using GIS-based statistical models including the analytical hierarchy process (AHP), weighted linear combination (WLC) and spatial multi-criteria evaluation (SMCE) models. The landslide inventory map, which contains a total of 92 landslide locations, was created from numerous resources such as digital aerial photographs, AIRSAR data, WorldView-1 images, and field surveys. Then, 80% of the landslide inventory was used for training the statistical models and the remaining 20% for validation. The validation results using the relative landslide density index (R-index) and the receiver operating characteristic (ROC) demonstrated that the SMCE model (96% accuracy) predicts better than the AHP (91% accuracy) and WLC (89% accuracy) models. These landslide susceptibility maps would be useful for hazard mitigation and regional planning. PMID:25898919

  11. Landslide susceptibility mapping using GIS-based statistical models and Remote sensing data in tropical environment.

    PubMed

    Shahabi, Himan; Hashim, Mazlan

    2015-04-22

    This research presents the results of GIS-based statistical models for landslide susceptibility mapping in the Cameron Highlands area of Malaysia, using geographic information system (GIS) and remote-sensing data. Ten factors including slope, aspect, soil, lithology, NDVI, land cover, distance to drainage, precipitation, distance to fault, and distance to road were extracted from SAR data, SPOT 5 and WorldView-1 images. The relationships between the detected landslide locations and these ten factors were identified using GIS-based statistical models including the analytical hierarchy process (AHP), weighted linear combination (WLC) and spatial multi-criteria evaluation (SMCE) models. The landslide inventory map, which contains a total of 92 landslide locations, was created from numerous resources such as digital aerial photographs, AIRSAR data, WorldView-1 images, and field surveys. Then, 80% of the landslide inventory was used for training the statistical models and the remaining 20% for validation. The validation results using the relative landslide density index (R-index) and the receiver operating characteristic (ROC) demonstrated that the SMCE model (96% accuracy) predicts better than the AHP (91% accuracy) and WLC (89% accuracy) models. These landslide susceptibility maps would be useful for hazard mitigation and regional planning.
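
    As one concrete illustration of the weighted linear combination (WLC) idea and the ROC-based validation described in these two records, the sketch below combines synthetic conditioning factors with assumed expert weights and scores the result with an ROC AUC. All factor names, weights and data are hypothetical stand-ins, not the paper's layers.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 1000
# Hypothetical stand-ins for three normalized conditioning factors (0-1 per map cell)
slope, ndvi_inv, dist_fault_inv = rng.random((3, n))

# Weighted linear combination (WLC): susceptibility = sum of weighted factor scores
weights = np.array([0.5, 0.3, 0.2])          # assumed expert weights, not the paper's
susceptibility = weights @ np.vstack([slope, ndvi_inv, dist_fault_inv])

# Synthetic landslide inventory loosely correlated with the susceptibility score
landslide = (susceptibility + rng.normal(0, 0.15, n)) > np.quantile(susceptibility, 0.9)
print("validation ROC AUC:", round(roc_auc_score(landslide, susceptibility), 3))
```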

  12. Multifactor-Dimensionality Reduction Reveals High-Order Interactions among Estrogen-Metabolism Genes in Sporadic Breast Cancer

    PubMed Central

    Ritchie, Marylyn D.; Hahn, Lance W.; Roodi, Nady; Bailey, L. Renee; Dupont, William D.; Parl, Fritz F.; Moore, Jason H.

    2001-01-01

    One of the greatest challenges facing human geneticists is the identification and characterization of susceptibility genes for common complex multifactorial human diseases. This challenge is partly due to the limitations of parametric-statistical methods for detection of gene effects that are dependent solely or partially on interactions with other genes and with environmental exposures. We introduce multifactor-dimensionality reduction (MDR) as a method for reducing the dimensionality of multilocus information, to improve the identification of polymorphism combinations associated with disease risk. The MDR method is nonparametric (i.e., no hypothesis about the value of a statistical parameter is made), is model-free (i.e., it assumes no particular inheritance model), and is directly applicable to case-control and discordant-sib-pair studies. Using simulated case-control data, we demonstrate that MDR has reasonable power to identify interactions among two or more loci in relatively small samples. When it was applied to a sporadic breast cancer case-control data set, in the absence of any statistically significant independent main effects, MDR identified a statistically significant high-order interaction among four polymorphisms from three different estrogen-metabolism genes. To our knowledge, this is the first report of a four-locus interaction associated with a common complex multifactorial disease. PMID:11404819
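
    The core MDR step can be sketched as follows: each multilocus genotype cell is labelled high- or low-risk by comparing its case:control ratio with the overall ratio, collapsing the multilocus table into one binary attribute whose classification accuracy is then evaluated. This is a minimal training-set version on simulated data, omitting the cross-validation and permutation testing used in MDR proper.

```python
import numpy as np
from itertools import combinations

def mdr_accuracy(genotypes, case, loci):
    """Pool multilocus genotype cells into high/low risk and return balanced accuracy."""
    keys = [tuple(row) for row in genotypes[:, loci]]
    cells = {}
    for k, y in zip(keys, case):
        cells.setdefault(k, [0, 0])[y] += 1                     # [controls, cases] per cell
    threshold = case.sum() / max(1, len(case) - case.sum())     # overall case:control ratio
    high_risk = {k: (v[1] / max(1, v[0])) > threshold for k, v in cells.items()}
    pred = np.array([high_risk[k] for k in keys])
    sens = pred[case == 1].mean()
    spec = 1 - pred[case == 0].mean()
    return (sens + spec) / 2

rng = np.random.default_rng(2)
geno = rng.integers(0, 3, size=(400, 4))                 # 4 loci, genotypes coded 0/1/2 (simulated)
case = ((geno[:, 0] * geno[:, 1]) % 2 == 1).astype(int)  # purely interactive effect, no main effects
best = max(combinations(range(4), 2), key=lambda l: mdr_accuracy(geno, case, list(l)))
print("best locus pair:", best)
```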

  13. Statistics for the Relative Detectability of Chemicals in Weak Gaseous Plumes in LWIR Hyperspectral Imagery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Metoyer, Candace N.; Walsh, Stephen J.; Tardiff, Mark F.

    2008-10-30

    The detection and identification of weak gaseous plumes using thermal imaging data is complicated by many factors. These include variability due to atmosphere, ground and plume temperature, and background clutter. This paper presents an analysis of one formulation of the physics-based model that describes the at-sensor observed radiance. The motivating question for the analyses performed in this paper is as follows: given a set of backgrounds, is there a way to predict the background over which the probability of detecting a given chemical will be the highest? Two statistics were developed to address this question. These statistics incorporate data from the long-wave infrared band to predict the background over which chemical detectability will be the highest, and they can be computed prior to data collection. As a preliminary exploration into the predictive ability of these statistics, analyses were performed on synthetic hyperspectral images. Each image contained one chemical (either carbon tetrachloride or ammonia) spread across six distinct background types. The statistics were used to generate predictions for the background ranks. Then, the predicted ranks were compared to the empirical ranks obtained from the analyses of the synthetic images. For the simplified images under consideration, the predicted and empirical ranks showed a promising amount of agreement. One statistic accurately predicted the best and worst background for detection in all of the images. Future work may include explorations of more complicated plume ingredients, background types, and noise structures.

  14. A statistical retrieval of cloud parameters for the millimeter wave Ice Cloud Imager on board MetOp-SG

    NASA Astrophysics Data System (ADS)

    Prigent, Catherine; Wang, Die; Aires, Filipe; Jimenez, Carlos

    2017-04-01

    The meteorological observations from satellites in the microwave domain are currently limited to frequencies below 190 GHz. However, the next generation of the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT) Polar System, the Second Generation (EPS-SG), will carry an instrument, the Ice Cloud Imager (ICI), with frequencies up to 664 GHz to improve the characterization of the cloud frozen phase. In this paper, a statistical retrieval of cloud parameters for ICI is developed, trained on a synthetic database derived from the coupling of a mesoscale cloud model and radiative transfer calculations. The hydrometeor profiles simulated with the Weather Research and Forecasting (WRF) model for twelve diverse European mid-latitude situations are used to simulate the brightness temperatures with the Atmospheric Radiative Transfer Simulator (ARTS) to prepare the retrieval database. The WRF+ARTS simulations have been compared to Special Sensor Microwave Imager/Sounder (SSMIS) observations up to 190 GHz: this successful evaluation gives us confidence in the simulations at the ICI channels from 183 to 664 GHz. Statistical analyses performed on this simulated retrieval database show that it is not only physically realistic but also statistically satisfactory for retrieval purposes. A first neural network (NN) classifier is used to detect the cloud presence. A second NN is developed to retrieve the liquid and ice integrated cloud quantities over sea and land separately. The detection and retrieval of the hydrometeor quantities (i.e., ice, snow, graupel, rain, and liquid cloud) are performed with ICI only, and with ICI combined with observations from the MicroWave Imager (MWI, with frequencies from 19 to 190 GHz, also on board MetOp-SG). The ICI channels have been optimized for the detection and quantification of the cloud frozen phases: adding the MWI channels improves the performance for the vertically integrated hydrometeor contents, especially for the cloud liquid phase. The relative error of the retrieved integrated frozen water content (FWP, i.e., ice+snow+graupel) is below 40% for 0.1 kg/m2 < FWP < 0.5 kg/m2 and below 20% for FWP > 0.5 kg/m2.

  15. Evaluating and implementing temporal, spatial, and spatio-temporal methods for outbreak detection in a local syndromic surveillance system

    PubMed Central

    Mathes, Robert W.; Lall, Ramona; Levin-Rector, Alison; Sell, Jessica; Paladini, Marc; Konty, Kevin J.; Olson, Don; Weiss, Don

    2017-01-01

    The New York City Department of Health and Mental Hygiene has operated an emergency department syndromic surveillance system since 2001, using temporal and spatial scan statistics run on a daily basis for cluster detection. Since the system was originally implemented, a number of new methods have been proposed for use in cluster detection. We evaluated six temporal and four spatial/spatio-temporal detection methods using syndromic surveillance data spiked with simulated injections. The algorithms were compared on several metrics, including sensitivity, specificity, positive predictive value, coherence, and timeliness. We also evaluated each method’s implementation, programming time, run time, and the ease of use. Among the temporal methods, at a set specificity of 95%, a Holt-Winters exponential smoother performed the best, detecting 19% of the simulated injects across all shapes and sizes, followed by an autoregressive moving average model (16%), a generalized linear model (15%), a modified version of the Early Aberration Reporting System’s C2 algorithm (13%), a temporal scan statistic (11%), and a cumulative sum control chart (<2%). Of the spatial/spatio-temporal methods we tested, a spatial scan statistic detected 3% of all injects, a Bayes regression found 2%, and a generalized linear mixed model and a space-time permutation scan statistic detected none at a specificity of 95%. Positive predictive value was low (<7%) for all methods. Overall, the detection methods we tested did not perform well in identifying the temporal and spatial clusters of cases in the inject dataset. The spatial scan statistic, our current method for spatial cluster detection, performed slightly better than the other tested methods across different inject magnitudes and types. Furthermore, we found the scan statistics, as applied in the SaTScan software package, to be the easiest to program and implement for daily data analysis. PMID:28886112

  16. Evaluating and implementing temporal, spatial, and spatio-temporal methods for outbreak detection in a local syndromic surveillance system.

    PubMed

    Mathes, Robert W; Lall, Ramona; Levin-Rector, Alison; Sell, Jessica; Paladini, Marc; Konty, Kevin J; Olson, Don; Weiss, Don

    2017-01-01

    The New York City Department of Health and Mental Hygiene has operated an emergency department syndromic surveillance system since 2001, using temporal and spatial scan statistics run on a daily basis for cluster detection. Since the system was originally implemented, a number of new methods have been proposed for use in cluster detection. We evaluated six temporal and four spatial/spatio-temporal detection methods using syndromic surveillance data spiked with simulated injections. The algorithms were compared on several metrics, including sensitivity, specificity, positive predictive value, coherence, and timeliness. We also evaluated each method's implementation, programming time, run time, and the ease of use. Among the temporal methods, at a set specificity of 95%, a Holt-Winters exponential smoother performed the best, detecting 19% of the simulated injects across all shapes and sizes, followed by an autoregressive moving average model (16%), a generalized linear model (15%), a modified version of the Early Aberration Reporting System's C2 algorithm (13%), a temporal scan statistic (11%), and a cumulative sum control chart (<2%). Of the spatial/spatio-temporal methods we tested, a spatial scan statistic detected 3% of all injects, a Bayes regression found 2%, and a generalized linear mixed model and a space-time permutation scan statistic detected none at a specificity of 95%. Positive predictive value was low (<7%) for all methods. Overall, the detection methods we tested did not perform well in identifying the temporal and spatial clusters of cases in the inject dataset. The spatial scan statistic, our current method for spatial cluster detection, performed slightly better than the other tested methods across different inject magnitudes and types. Furthermore, we found the scan statistics, as applied in the SaTScan software package, to be the easiest to program and implement for daily data analysis.
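
    As an illustration of the best-performing temporal method in these two records, the sketch below fits an additive Holt-Winters smoother to synthetic daily counts with a weekly cycle and raises an alarm when observations exceed the forecast band. The two-sigma band, the series and the inject are assumptions, not the evaluation's actual configuration.

```python
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(3)
# Hypothetical daily syndromic counts with a weekly cycle, spiked near the end
days = 200
baseline = 50 + 10 * np.sin(2 * np.pi * np.arange(days) / 7)
counts = rng.poisson(baseline)
counts[-3:] += 40                                # simulated outbreak inject

train, test = counts[:-7], counts[-7:]
model = ExponentialSmoothing(train.astype(float), trend="add",
                             seasonal="add", seasonal_periods=7).fit()
forecast = model.forecast(7)
resid_sd = np.std(train - model.fittedvalues)
alarms = test > forecast + 2 * resid_sd          # flag days exceeding the prediction band
print("alarm flags for the last week:", alarms.astype(int))
```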

  17. A combined use of multispectral and SAR images for ship detection and characterization through object based image analysis

    NASA Astrophysics Data System (ADS)

    Aiello, Martina; Gianinetto, Marco

    2017-10-01

    Marine routes carry a huge share of commercial trade and human transport, so surveillance, security and environmental protection themes are gaining increasing importance. Able to overcome the limits imposed by terrestrial means of monitoring, ship detection from satellite has recently prompted renewed interest in the continuous monitoring of illegal activities. This paper describes an automatic Object Based Image Analysis (OBIA) approach to detect vessels made of different materials in various sea environments. The combined use of multispectral and SAR images allows regular observation unrestricted by lighting and atmospheric conditions, with complementarity in terms of geographic coverage and geometric detail. The method adopts a region growing algorithm to segment the image into homogeneous objects, which are then classified through a decision tree algorithm based on spectral and geometrical properties. A spatial analysis then retrieves each vessel's position, length and heading, and associates a speed range. The image processing chain is optimized by selecting image tiles through a statistical index. Vessel candidates are detected over amplitude SAR images using an adaptive-threshold Constant False Alarm Rate (CFAR) algorithm prior to the object-based analysis. Validation is carried out by comparing the retrieved parameters with the information provided by the Automatic Identification System (AIS), when available, or with manual measurements when AIS data are not available. Length estimation shows R2 = 0.85 and heading estimation R2 = 0.92, computed as the average of the R2 values obtained for the optical and radar images.
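
    A minimal cell-averaging CFAR detector of the kind mentioned above can be sketched as follows; the guard/training window sizes, the exponential clutter model and the injected target are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

def ca_cfar(power, guard=2, train=8, pfa=1e-3):
    """Cell-averaging CFAR: threshold each cell from the mean of its training cells."""
    n = len(power)
    num_train = 2 * train
    alpha = num_train * (pfa ** (-1 / num_train) - 1)   # scaling for exponential clutter
    detections = np.zeros(n, dtype=bool)
    for i in range(guard + train, n - guard - train):
        lead = power[i - guard - train:i - guard]
        lag = power[i + guard + 1:i + guard + train + 1]
        noise = np.concatenate([lead, lag]).mean()      # local clutter estimate
        detections[i] = power[i] > alpha * noise
    return detections

rng = np.random.default_rng(4)
clutter = rng.exponential(1.0, 500)      # stand-in for SAR amplitude-squared sea clutter
clutter[250] += 30                       # bright ship-like target
print("detections at indices:", np.flatnonzero(ca_cfar(clutter)))
```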

  18. Age-standardisation when target setting and auditing performance of Down syndrome screening programmes.

    PubMed

    Cuckle, Howard; Aitken, David; Goodburn, Sandra; Senior, Brian; Spencer, Kevin; Standing, Sue

    2004-11-01

    To describe and illustrate a method of setting Down syndrome screening targets and auditing performance that allows for differences in the maternal age distribution. A reference population was determined from a Gaussian model of maternal age. Target detection and false-positive rates were determined by standard statistical modelling techniques, except that the reference population rather than an observed population was used. Second-trimester marker parameters were obtained for Down syndrome from a large meta-analysis, and for unaffected pregnancies from the combined results of more than 600,000 screens in five centres. Audited detection and false-positive rates were the weighted average of the rates in five broad age groups corrected for viability bias. Weights were based on the age distributions in the reference population. Maternal age was found to approximate reasonably well to a Gaussian distribution with mean 27 years and standard deviation 5.5 years. Depending on marker combination, the target detection rates were 59 to 64% and false-positive rate 4.2 to 5.4% for a 1 in 250 term cut-off; 65 to 68% and 6.1 to 7.3% for 1 in 270 at mid-trimester. Among the five centres, the audited detection rate ranged from 7% below target to 10% above target, with audited false-positive rates better than the target by 0.3 to 1.5%. Age-standardisation should help to improve screening quality by allowing for intrinsic differences between programmes, so that valid comparisons can be made. Copyright 2004 John Wiley & Sons, Ltd.
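
    Age-standardisation of an audited detection rate reduces, in the simplest reading of the method above, to a weighted average of age-group rates with weights taken from the Gaussian reference population (mean 27 years, SD 5.5 years). The group boundaries and the observed rates in this sketch are assumptions.

```python
import numpy as np
from scipy.stats import norm

# Reference maternal age distribution: Gaussian, mean 27 y, SD 5.5 y (from the abstract)
edges = np.array([15, 25, 30, 35, 40, 50])            # five broad age groups (assumed cuts)
cdf = norm.cdf(edges, loc=27, scale=5.5)
weights = np.diff(cdf) / (cdf[-1] - cdf[0])           # reference weight for each age group

# Hypothetical audited detection rates observed in each age group at one centre
observed_dr = np.array([0.52, 0.58, 0.66, 0.78, 0.90])
standardised_dr = weights @ observed_dr               # age-standardised detection rate
print("age-standardised detection rate:", round(standardised_dr, 3))
```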

  19. New methods for the condition monitoring of level crossings

    NASA Astrophysics Data System (ADS)

    García Márquez, Fausto Pedro; Pedregal, Diego J.; Roberts, Clive

    2015-04-01

    Level crossings represent a high risk for railway systems. This paper demonstrates the potential to improve maintenance management through intelligent condition monitoring coupled with reliability centred maintenance (RCM). RCM combines advanced electronics, control, computing and communication technologies to address the multiple objectives of cost effectiveness and improved quality, reliability and service. RCM collects digital and analogue signals utilising distributed transducers connected to either point-to-point or digital bus communication links. Assets in many industries use data logging capable of providing post-failure diagnostic support, but to date little use has been made of combined qualitative and quantitative fault detection techniques. The research takes the hydraulic railway level crossing barrier (LCB) system as a case study and develops a generic strategy for failure analysis, data acquisition and incipient fault detection. For each barrier, the hydraulic characteristics, the motor's current and voltage, the hydraulic pressure and the barrier's position are acquired. In order to acquire the data efficiently at a central point, without errors, a distributed single-cable Fieldbus is utilised. This allows the connection of all sensors through the project's proprietary communication nodes to a high-speed bus. The condition monitoring system developed in this paper detects faults by comparing what can be considered a 'normal' or 'expected' shape of a signal with the actual shape observed as new data become available. ARIMA (autoregressive integrated moving average) models were employed for detecting faults, and the Jarque-Bera and Ljung-Box statistical tests were used to validate the model.
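
    A minimal version of the ARIMA-plus-diagnostics workflow described above might look like the sketch below: fit a reference model to a healthy signal, check the residuals with the Ljung-Box and Jarque-Bera tests, and compare new data against the model's forecasts. The model order and the synthetic signal are assumptions.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.diagnostic import acorr_ljungbox
from scipy.stats import jarque_bera

rng = np.random.default_rng(5)
# Hypothetical stand-in for a healthy barrier signal (e.g., a motor current trace)
healthy = np.cumsum(rng.normal(0, 0.1, 300)) + 5

model = ARIMA(healthy, order=(1, 1, 1)).fit()   # reference model of 'normal' behaviour
resid = model.resid[1:]                         # drop the differencing start-up residual

# Ljung-Box: residuals should be uncorrelated if the model captures the dynamics
print(acorr_ljungbox(resid, lags=[10]))
# Jarque-Bera: residuals should be close to Gaussian
print(jarque_bera(resid))

# Fault flagging would compare new observations against the one-step forecast band
print("forecast of next 5 samples:", np.round(model.forecast(5), 2))
```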

  20. Detecting drug-drug interactions using a database for spontaneous adverse drug reactions: an example with diuretics and non-steroidal anti-inflammatory drugs.

    PubMed

    van Puijenbroek, E P; Egberts, A C; Heerdink, E R; Leufkens, H G

    2000-12-01

    Drug-drug interactions are relatively rarely reported to spontaneous reporting systems (SRSs) for adverse drug reactions. For this reason, the traditional approach for analysing SRS has major limitations for the detection of drug-drug interactions. We developed a method that may enable signalling of these possible interactions, which are often not explicitly reported, utilising reports of adverse drug reactions in data sets of SRS. As an example, the influence of concomitant use of diuretics and non-steroidal anti-inflammatory drugs (NSAIDs) on symptoms indicating a decreased efficacy of diuretics was examined using reports received by the Netherlands Pharmacovigilance Foundation Lareb. Reports received between 1 January 1990 and 1 January 1999 of patients older than 50 years were included in the study. Cases were defined as reports with symptoms indicating a decreased efficacy of diuretics, non-cases as all other reports. Exposure categories were the use of NSAIDs or diuretics versus the use of neither of these drugs. The influence of the combined use of both drugs was examined using logistic regression analysis. The odds ratio of the statistical interaction term of the combined use of both drugs was increased [adjusted odds ratio 2.0, 95% confidence interval (CI) 1.1-3.7], which may indicate an enhanced effect of concomitant drug use. The findings illustrate that spontaneous reporting systems have a potential for signal detection and the analysis of possible drug-drug interactions. The method described may enable a more active approach in the detection of drug-drug interactions after marketing.
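
    The case/non-case logistic regression with a statistical interaction term can be sketched as below; the simulated reporting data, effect sizes and variable names are assumptions chosen only to reproduce the shape of the analysis, not the Lareb data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 5000
nsaid = rng.integers(0, 2, n)       # simulated exposure flags per report
diuretic = rng.integers(0, 2, n)
# 'Case' = report of reduced diuretic efficacy, made more likely under combined exposure
logit = -3 + 0.2 * nsaid + 0.3 * diuretic + 0.7 * nsaid * diuretic
case = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

X = sm.add_constant(np.column_stack([nsaid, diuretic, nsaid * diuretic]))
fit = sm.Logit(case, X).fit(disp=0)
or_int = np.exp(fit.params[3])                     # odds ratio of the interaction term
lo, hi = np.exp(fit.conf_int()[3])
print(f"interaction OR = {or_int:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```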

  1. Statistical Techniques For Real-time Anomaly Detection Using Spark Over Multi-source VMware Performance Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Solaimani, Mohiuddin; Iftekhar, Mohammed; Khan, Latifur

    Anomaly detection refers to the identification of an irregular or unusual pattern which deviates from what is standard, normal, or expected. Such deviated patterns typically correspond to samples of interest and are assigned different labels in different domains, such as outliers, anomalies, exceptions, or malware. Detecting anomalies in fast, voluminous streams of data is a formidable challenge. This paper presents a novel, generic, real-time distributed anomaly detection framework for heterogeneous streaming data where anomalies appear as a group. We have developed a distributed statistical approach to build a model and later use it to detect anomalies. As a case study, we investigate group anomaly detection for a VMware-based cloud data center, which maintains a large number of virtual machines (VMs). We have built our framework using Apache Spark to get higher throughput and lower data processing time on streaming data. We have developed a window-based statistical anomaly detection technique to detect anomalies that appear sporadically. We then relaxed this constraint, with higher accuracy, by implementing a cluster-based technique to detect sporadic and continuous anomalies. We conclude that our cluster-based technique outperforms other statistical techniques with higher accuracy and lower processing time.
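
    A window-based statistical detector of the kind described can be sketched as a trailing-window z-score test. This single-machine sketch ignores the Spark distribution layer, and the metric, window length and threshold are assumptions.

```python
import numpy as np

def window_anomalies(stream, window=50, z_thresh=3.0):
    """Flag points whose z-score against the trailing window exceeds a threshold."""
    flags = []
    for i in range(window, len(stream)):
        ref = stream[i - window:i]
        mu, sd = ref.mean(), ref.std()
        flags.append(sd > 0 and abs(stream[i] - mu) / sd > z_thresh)
    return np.flatnonzero(flags) + window

rng = np.random.default_rng(7)
cpu = rng.normal(40, 3, 1000)      # stand-in for a streaming VM metric (CPU %)
cpu[700:705] += 25                 # sporadic group anomaly
print("anomalous samples:", window_anomalies(cpu))
```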

  2. Observation of Communication by Physical Education Teachers: Detecting Patterns in Verbal Behavior

    PubMed Central

    García-Fariña, Abraham; Jiménez-Jiménez, F.; Anguera, M. Teresa

    2018-01-01

    The aim of this study was to analyze the verbal behavior of primary school physical education teachers in a natural classroom setting in order to investigate patterns in social constructivist communication strategies before and after participation in a training program designed to familiarize teachers with these strategies. The participants were three experienced physical education teachers interacting separately with 65 students over a series of classes. Written informed consent was obtained from all the students' parents or legal guardians. An indirect observation tool (ADDEF) was designed specifically for the study within the theoretical framework, and consisted of a combined field format, with three dimensions, and category systems. Each dimension formed the basis for building a subsequent system of exhaustive and mutually exclusive categories. Twenty-nine sessions, grouped into two separate modules, were coded using the Atlas.ti 7 program, and a total of 1991 units (messages containing constructivist discursive strategies) were recorded. Analysis of intraobserver reliability showed almost perfect agreement. Lag sequential analysis, which is a powerful statistical technique based on the calculation of conditional and unconditional probabilities in prospective and retrospective lags, was performed in GSEQ5 software to search for verbal behavior patterns before and after the training program. At both time points, we detected a pattern formed by requests for information combined with the incorporation of students' contributions into the teachers' discourse and re-elaborations of answers. In the post-training phase, we detected new and stronger patterns in certain sessions, indicating that programs combining theoretical and practical knowledge can effectively increase teachers' repertoire of discursive strategies and ultimately promote active engagement in learning. This has important implications for the evaluation and development of teacher effectiveness in practice and formal education programs. PMID:29615943

  3. Proteomic biomarkers apolipoprotein A1, truncated transthyretin and connective tissue activating protein III enhance the sensitivity of CA125 for detecting early stage epithelial ovarian cancer.

    PubMed

    Clarke, Charlotte H; Yip, Christine; Badgwell, Donna; Fung, Eric T; Coombes, Kevin R; Zhang, Zhen; Lu, Karen H; Bast, Robert C

    2011-09-01

    The low prevalence of ovarian cancer demands both high sensitivity (>75%) and specificity (99.6%) to achieve a positive predictive value of 10% for successful early detection. Utilizing a two-stage strategy, where serum marker(s) prompt the performance of transvaginal sonography (TVS) in a limited number (2%) of women, could reduce the requisite specificity for serum markers to 98%. We have attempted to improve sensitivity by combining CA125 with proteomic markers. Sera from 41 patients with early stage (I/II) and 51 with late stage (III/IV) epithelial ovarian cancer, 40 with benign disease and 99 healthy individuals were analyzed to measure 7 proteins [Apolipoprotein A1 (Apo-A1), truncated transthyretin (TT), transferrin, hepcidin, β-2-microglobulin (β2M), Connective Tissue Activating Protein III (CTAPIII), and Inter-alpha-trypsin inhibitor heavy chain 4 (ITIH4)]. Statistical models were fit by logistic regression, with the factors retained in the models selected by optimizing the Akaike Information Criterion. A validation set included 136 stage I ovarian cancers, 140 benign pelvic masses and 174 healthy controls. In a training set analysis, the 3 most effective biomarkers (Apo-A1, TT and CTAPIII) exhibited 54% sensitivity at 98% specificity, CA125 alone produced 68% sensitivity, and the combination increased sensitivity to 88%. In a validation set, the marker panel plus CA125 produced a sensitivity of 84% at 98% specificity (P=0.015, McNemar's test). Combining a panel of proteomic markers with CA125 could provide a first step in a sequential two-stage strategy with TVS for early detection of ovarian cancer. Copyright © 2011. Published by Elsevier Inc.

  4. Proteomic Biomarkers Apolipoprotein A1, Truncated Transthyretin and Connective Tissue Activating Protein III Enhance the Sensitivity of CA125 for Detecting Early Stage Epithelial Ovarian Cancer

    PubMed Central

    Clarke, Charlotte H.; Yip, Christine; Badgwell, Donna; Fung, Eric T.; Coombes, Kevin R.; Zhang, Zhen; Lu, Karen H.; Bast, Robert C.

    2011-01-01

    Objective The low prevalence of ovarian cancer demands both high sensitivity (>75%) and specificity (99.6%) to achieve a positive predictive value of 10% for successful early detection. Utilizing a two-stage strategy, where serum marker(s) prompt the performance of transvaginal sonography (TVS) in a limited number (2%) of women, could reduce the requisite specificity for serum markers to 98%. We have attempted to improve sensitivity by combining CA125 with proteomic markers. Methods Sera from 41 patients with early stage (I/II) and 51 with late stage (III/IV) epithelial ovarian cancer, 40 with benign disease and 99 healthy individuals were analyzed to measure 7 proteins [Apolipoprotein A1 (Apo-A1), truncated transthyretin (TT), transferrin, hepcidin, β-2-microglobulin (β2M), Connective Tissue Activating Protein III (CTAPIII), and Inter-alpha-trypsin inhibitor heavy chain 4 (ITIH4)]. Statistical models were fit by logistic regression, with the factors retained in the models selected by optimizing the Akaike Information Criterion. A validation set included 136 stage I ovarian cancers, 140 benign pelvic masses and 174 healthy controls. Results In a training set analysis, the 3 most effective biomarkers (Apo-A1, TT and CTAPIII) exhibited 54% sensitivity at 98% specificity, CA125 alone produced 68% sensitivity, and the combination increased sensitivity to 88%. In a validation set, the marker panel plus CA125 produced a sensitivity of 84% at 98% specificity (P = 0.015, McNemar's test). Conclusion Combining a panel of proteomic markers with CA125 could provide a first step in a sequential two-stage strategy with TVS for early detection of ovarian cancer. PMID:21708402
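
    The headline figure in both records, sensitivity at a fixed 98% specificity, can be computed directly from model scores by thresholding at the corresponding control quantile, as in this sketch on simulated scores (all data are hypothetical stand-ins for the panel-plus-CA125 model outputs):

```python
import numpy as np

def sensitivity_at_specificity(scores, labels, specificity=0.98):
    """Threshold the combined marker score at the given specificity on controls."""
    controls = scores[labels == 0]
    cutoff = np.quantile(controls, specificity)      # e.g., 98th percentile of controls
    return (scores[labels == 1] > cutoff).mean()

rng = np.random.default_rng(8)
# Simulated scores for controls (label 0) and cancers (label 1)
labels = np.r_[np.zeros(300, int), np.ones(100, int)]
scores = np.r_[rng.normal(0, 1, 300), rng.normal(2.5, 1, 100)]
print("sensitivity at 98% specificity:",
      round(sensitivity_at_specificity(scores, labels), 3))
```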

  5. Frequent detection of a human fecal indicator in the urban ocean: environmental drivers and covariation with enterococci.

    PubMed

    Jennings, Wiley C; Chern, Eunice C; O'Donohue, Diane; Kellogg, Michael G; Boehm, Alexandria B

    2018-03-01

    Fecal pollution of surface waters presents a global human health threat. New molecular indicators of fecal pollution have been developed to address shortcomings of traditional culturable fecal indicators. However, there is still little information on their fate and transport in the environment. The present study uses spatially and temporally extensive data on traditional (culturable enterococci, cENT) and molecular (qPCR-enterococci, qENT and human-associated marker, HF183/BacR287) indicator concentrations in marine water surrounding highly-urbanized San Francisco, California, USA to investigate environmental and anthropogenic processes that impact fecal pollution. We constructed multivariable regression models for fecal indicator bacteria at 14 sampling stations. The human marker was detected more frequently in our study than in many other published studies, with detection frequency at some stations as high as 97%. The odds of cENT, qENT, and HF183/BacR287 exceeding health-relevant thresholds were statistically elevated immediately following discharges of partially treated combined sewage, and cENT levels dissipated after approximately 1 day. However, combined sewer discharges were not important predictors of indicator levels typically measured in weekly monitoring samples. Instead, precipitation and solar insolation were important predictors of cENT in weekly samples, while precipitation and water temperature were important predictors of HF183/BacR287 and qENT. The importance of precipitation highlights the significance of untreated storm water as a source of fecal pollution to the urban ocean, even for a city served by a combined sewage system. Sunlight and water temperature likely control persistence of the indicators via photoinactivation and dark decay processes, respectively.

  6. Spatial analysis improves the detection of early corneal nerve fiber loss in patients with recently diagnosed type 2 diabetes

    PubMed Central

    Winter, Karsten; Strom, Alexander; Zhivov, Andrey; Allgeier, Stephan; Papanas, Nikolaos; Ziegler, Iris; Brüggemann, Jutta; Ringel, Bernd; Peschel, Sabine; Köhler, Bernd; Stachs, Oliver; Guthoff, Rudolf F.; Roden, Michael

    2017-01-01

    Corneal confocal microscopy (CCM) has revealed reduced corneal nerve fiber (CNF) length and density (CNFL, CNFD) in patients with diabetes, but the spatial pattern of CNF loss has not been studied. We aimed to determine whether spatial analysis of the distribution of corneal nerve branching points (CNBPs) may contribute to improving the detection of early CNF loss. We hypothesized that early CNF decline follows a clustered rather than random distribution pattern of CNBPs. CCM, nerve conduction studies (NCS), and quantitative sensory testing (QST) were performed in a cross-sectional study including 86 patients recently diagnosed with type 2 diabetes and 47 control subjects. In addition to CNFL, CNFD, and branch density (CNBD), CNBPs were analyzed using spatial point pattern analysis (SPPA), including 10 indices and functional statistics. Compared to controls, patients with diabetes showed lower CNBP density and higher nearest-neighbor distances, and all SPPA parameters indicated increased clustering of CNBPs (all P<0.05). SPPA parameters were abnormally increased (>97.5th percentile of controls) in up to 23.5% of patients. When combining an individual SPPA parameter with CNFL, ≥1 of the 2 indices was >99th or <1st percentile of controls in 28.6% of patients compared to 2.1% of controls, while for the conventional CNFL/CNFD/CNBD combination the corresponding rates were 16.3% vs 2.1%. SPPA parameters correlated with CNFL and several NCS and QST indices in the controls (all P<0.001), whereas in patients with diabetes these correlations were markedly weaker or lost. In conclusion, SPPA reveals increased clustering of early CNF loss and substantially improves its detection when combined with a conventional CCM measure in patients with recently diagnosed type 2 diabetes. PMID:28296936
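
    One standard spatial point pattern statistic consistent with the clustering analysis described above is the Clark-Evans nearest-neighbour ratio, sketched below on synthetic point sets. The index choice and the data are illustrative assumptions; the study used a larger battery of SPPA indices and functional statistics.

```python
import numpy as np
from scipy.spatial import cKDTree

def clark_evans(points, area):
    """Clark-Evans ratio R: mean nearest-neighbour distance over its CSR expectation.
    R < 1 indicates clustering, R = 1 complete spatial randomness, R > 1 regularity."""
    tree = cKDTree(points)
    d, _ = tree.query(points, k=2)          # k=2: nearest neighbour other than self
    observed = d[:, 1].mean()
    expected = 0.5 / np.sqrt(len(points) / area)
    return observed / expected

rng = np.random.default_rng(9)
random_pts = rng.random((200, 2))              # CSR-like pattern on the unit square
clustered = rng.normal(0.5, 0.05, (200, 2))    # tightly clustered pattern (same nominal area)
print("random:", round(clark_evans(random_pts, 1.0), 2))
print("clustered:", round(clark_evans(clustered, 1.0), 2))
```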

  7. Observation of Communication by Physical Education Teachers: Detecting Patterns in Verbal Behavior.

    PubMed

    García-Fariña, Abraham; Jiménez-Jiménez, F; Anguera, M Teresa

    2018-01-01

    The aim of this study was to analyze the verbal behavior of primary school physical education teachers in a natural classroom setting in order to investigate patterns in social constructivist communication strategies before and after participation in a training program designed to familiarize teachers with these strategies. The participants were three experienced physical education teachers interacting separately with 65 students over a series of classes. Written informed consent was obtained from all the students' parents or legal guardians. An indirect observation tool (ADDEF) was designed specifically for the study within the theoretical framework, and consisted of a combined field format, with three dimensions, and category systems. Each dimension formed the basis for building a subsequent system of exhaustive and mutually exclusive categories. Twenty-nine sessions, grouped into two separate modules, were coded using the Atlas.ti 7 program, and a total of 1991 units (messages containing constructivist discursive strategies) were recorded. Analysis of intraobserver reliability showed almost perfect agreement. Lag sequential analysis, which is a powerful statistical technique based on the calculation of conditional and unconditional probabilities in prospective and retrospective lags, was performed in GSEQ5 software to search for verbal behavior patterns before and after the training program. At both time points, we detected a pattern formed by requests for information combined with the incorporation of students' contributions into the teachers' discourse and re-elaborations of answers. In the post-training phase, we detected new and stronger patterns in certain sessions, indicating that programs combining theoretical and practical knowledge can effectively increase teachers' repertoire of discursive strategies and ultimately promote active engagement in learning. This has important implications for the evaluation and development of teacher effectiveness in practice and formal education programs.
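
    Lag sequential analysis, used in both of these records, boils down to comparing the conditional probability of a target behaviour at a given lag after a criterion behaviour with its unconditional probability. The sketch below shows the lag-1 computation on a hypothetical coded sequence; the codes and the helper are illustrative, not the ADDEF categories.

```python
def lag_probability(seq, given, target, lag=1):
    """P(target at t+lag | given at t) from a coded behaviour sequence."""
    idx = [i for i, b in enumerate(seq) if b == given and i + lag < len(seq)]
    if not idx:
        return 0.0
    return sum(seq[i + lag] == target for i in idx) / len(idx)

# Hypothetical codes: Q = request for information, I = incorporation of a student's
# contribution, R = re-elaboration of an answer
session = list("QIRQIQRIQIRRQI")
print("P(I | Q, lag 1):", round(lag_probability(session, "Q", "I"), 2))
print("unconditional P(I):", round(session.count("I") / len(session), 2))
```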

  8. Matching mammographic regions in mediolateral oblique and craniocaudal views: a probabilistic approach

    NASA Astrophysics Data System (ADS)

    Samulski, Maurice; Karssemeijer, Nico

    2008-03-01

    Most of the current CAD systems detect suspicious mass regions independently in single views. In this paper we present a method to match corresponding regions in mediolateral oblique (MLO) and craniocaudal (CC) mammographic views of the breast. For every possible combination of mass regions in the MLO view and CC view, a number of features are computed, such as the difference in distance of a region to the nipple, a texture similarity measure, the gray scale correlation and the likelihood of malignancy of both regions computed by single-view analysis. In previous research, Linear Discriminant Analysis was used to discriminate between correct and incorrect links. In this paper we investigate if the performance can be improved by employing a statistical method in which four classes are distinguished. These four classes are defined by the combinations of view (MLO/CC) and pathology (TP/FP) labels. We use distance-weighted k-Nearest Neighbor density estimation to estimate the likelihood of a region combination. Next, a correspondence score is calculated as the likelihood that the region combination is a TP-TP link. The method was tested on 412 cases with a malignant lesion visible in at least one of the views. In 82.4% of the cases a correct link could be established between the TP detections in both views. In future work, we will use the framework presented here to develop a context dependent region matching scheme, which takes the number and likelihood of possible alternatives into account. It is expected that more accurate determination of matching probabilities will lead to improved CAD performance.

  9. REANALYSIS OF F-STATISTIC GRAVITATIONAL-WAVE SEARCHES WITH THE HIGHER CRITICISM STATISTIC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, M. F.; Melatos, A.; Delaigle, A.

    2013-04-01

    We propose a new method of gravitational-wave detection using a modified form of higher criticism, a statistical technique introduced by Donoho and Jin. Higher criticism is designed to detect a group of sparse, weak sources, none of which are strong enough to be reliably estimated or detected individually. We apply higher criticism as a second-pass method to synthetic F-statistic and C-statistic data for a monochromatic periodic source in a binary system and quantify the improvement relative to the first-pass methods. We find that higher criticism on C-statistic data is approximately 6% more sensitive than the C-statistic alone under optimal conditions (i.e., binary orbit known exactly), and the relative advantage increases as the error in the orbital parameters increases. Higher criticism is robust even when the source is not monochromatic (e.g., phase-wandering in an accreting system). Applying higher criticism to a phase-wandering source over multiple time intervals gives an approximately 30% or greater increase in detectability with few assumptions about the frequency evolution. By contrast, in all-sky searches for unknown periodic sources, which are dominated by the brightest source, second-pass higher criticism does not provide any benefits over a first-pass search.
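
    The higher criticism statistic itself is straightforward to compute from ordered p-values. The sketch below follows the standard Donoho-Jin form; the search fraction alpha0 and the synthetic p-values are assumptions, and this is not the modified second-pass variant used in the paper.

```python
import numpy as np

def higher_criticism(pvalues, alpha0=0.1):
    """Donoho-Jin higher criticism: max standardized exceedance of ordered p-values."""
    p = np.sort(pvalues)
    n = len(p)
    i = np.arange(1, n + 1)
    hc = np.sqrt(n) * (i / n - p) / np.sqrt(p * (1 - p) + 1e-12)
    keep = i <= alpha0 * n                   # restrict the search to the smallest p-values
    return hc[keep].max()

rng = np.random.default_rng(10)
null_p = rng.random(10000)                   # pure noise
mixed = null_p.copy()
mixed[:50] = rng.random(50) * 1e-3           # sparse weak signals among many nulls
print("HC under null:", round(higher_criticism(null_p), 2))
print("HC with sparse signal:", round(higher_criticism(mixed), 2))
```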

  10. Machine Learning Methods for Attack Detection in the Smart Grid.

    PubMed

    Ozay, Mete; Esnaola, Inaki; Yarman Vural, Fatos Tunay; Kulkarni, Sanjeev R; Poor, H Vincent

    2016-08-01

    Attack detection problems in the smart grid are posed as statistical learning problems for different attack scenarios in which the measurements are observed in batch or online settings. In this approach, machine learning algorithms are used to classify measurements as being either secure or attacked. An attack detection framework is provided to exploit any available prior knowledge about the system and surmount constraints arising from the sparse structure of the problem in the proposed approach. Well-known batch and online learning algorithms (supervised and semisupervised) are employed with decision- and feature-level fusion to model the attack detection problem. The relationships between statistical and geometric properties of attack vectors employed in the attack scenarios and learning algorithms are analyzed to detect unobservable attacks using statistical learning methods. The proposed algorithms are examined on various IEEE test systems. Experimental analyses show that machine learning algorithms can detect attacks with performances higher than attack detection algorithms that employ state vector estimation methods in the proposed attack detection framework.

  11. Principal component directed partial least squares analysis for combining nuclear magnetic resonance and mass spectrometry data in metabolomics: application to the detection of breast cancer.

    PubMed

    Gu, Haiwei; Pan, Zhengzheng; Xi, Bowei; Asiago, Vincent; Musselman, Brian; Raftery, Daniel

    2011-02-07

    Nuclear magnetic resonance (NMR) spectroscopy and mass spectrometry (MS) are the two most commonly used analytical tools in metabolomics, and their complementary nature makes the combination particularly attractive. A combined analytical approach can improve the potential for providing reliable methods to detect metabolic profile alterations in biofluids or tissues caused by disease, toxicity, etc. In this paper, (1)H NMR spectroscopy and direct analysis in real time (DART)-MS were used for the metabolomics analysis of serum samples from breast cancer patients and healthy controls. Principal component analysis (PCA) of the NMR data showed that the first principal component (PC1) scores could be used to separate cancer from normal samples. However, no such obvious clustering could be observed in the PCA score plot of DART-MS data, even though DART-MS can provide a rich and informative metabolic profile. Using a modified multivariate statistical approach, the DART-MS data were then reevaluated by orthogonal signal correction (OSC) pretreated partial least squares (PLS), in which the Y matrix in the regression was set to the PC1 score values from the NMR data analysis. This approach, and a similar one using the first latent variable from PLS-DA of the NMR data resulted in a significant improvement of the separation between the disease samples and normals, and a metabolic profile related to breast cancer could be extracted from DART-MS. The new approach allows the disease classification to be expressed on a continuum as opposed to a binary scale and thus better represents the disease and healthy classifications. An improved metabolic profile obtained by combining MS and NMR by this approach may be useful to achieve more accurate disease detection and gain more insight regarding disease mechanisms and biology. Copyright © 2010 Elsevier B.V. All rights reserved.
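
    A stripped-down version of the guided analysis described above, with the orthogonal signal correction step omitted, is sketched below: PC1 scores from the NMR-like matrix serve as the response for a PLS regression on the MS-like matrix. All data and dimensions are simulated stand-ins, not the serum dataset.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(11)
n = 60
group = np.r_[np.zeros(30), np.ones(30)]               # 0 = control, 1 = disease
# Hypothetical stand-ins: NMR profiles separate the groups; MS profiles are noisier
nmr = rng.normal(0, 1, (n, 100)) + group[:, None] * np.linspace(0, 1, 100)
ms = rng.normal(0, 1, (n, 300)) + group[:, None] * 0.3

pc1 = PCA(n_components=1).fit_transform(nmr).ravel()   # NMR PC1 scores as the guide
pls = PLSRegression(n_components=2).fit(ms, pc1)       # MS data regressed onto NMR PC1
ms_scores = pls.transform(ms)[:, 0]
print("mean MS latent score, controls vs disease:",
      round(ms_scores[group == 0].mean(), 2), round(ms_scores[group == 1].mean(), 2))
```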

  12. COPS: Detecting Co-Occurrence and Spatial Arrangement of Transcription Factor Binding Motifs in Genome-Wide Datasets

    PubMed Central

    Lohmann, Ingrid

    2012-01-01

    In multi-cellular organisms, spatiotemporal activity of cis-regulatory DNA elements depends on their occupancy by different transcription factors (TFs). In recent years, genome-wide ChIP-on-Chip, ChIP-Seq and DamID assays have been extensively used to unravel the combinatorial interaction of TFs with cis-regulatory modules (CRMs) in the genome. Even though genome-wide binding profiles are increasingly becoming available for different TFs, single TF binding profiles are in most cases not sufficient for dissecting complex regulatory networks. Thus, potent computational tools detecting statistically significant and biologically relevant TF-motif co-occurrences in genome-wide datasets are essential for analyzing context-dependent transcriptional regulation. We have developed COPS (Co-Occurrence Pattern Search), a new bioinformatics tool based on a combination of association rules and Markov chain models, which detects co-occurring TF binding sites (BSs) on genomic regions of interest. COPS scans DNA sequences for frequent motif patterns using a Frequent-Pattern tree based data mining approach, which allows efficient performance of the software with respect to both data structure and implementation speed, in particular when mining large datasets. Since transcriptional gene regulation very often relies on the formation of regulatory protein complexes mediated by closely adjoining TF binding sites on CRMs, COPS additionally detects preferred short distance between co-occurring TF motifs. The performance of our software with respect to biological significance was evaluated using three published datasets containing genomic regions that are independently bound by several TFs involved in a defined biological process. In sum, COPS is a fast, efficient and user-friendly tool mining statistically and biologically significant TFBS co-occurrences and therefore allows the identification of TFs that combinatorially regulate gene expression. PMID:23272209

  13. A Novel Application for the Detection of an Irregular Pulse using an iPhone 4S in Patients with Atrial Fibrillation

    PubMed Central

    McManus, David D.; Lee, Jinseok; Maitas, Oscar; Esa, Nada; Pidikiti, Rahul; Carlucci, Alex; Harrington, Josephine; Mick, Eric; Chon, Ki H.

    2012-01-01

    Background Atrial fibrillation (AF) is common and associated with adverse health outcomes. Timely detection of AF can be challenging using traditional diagnostic tools. Smartphone use is increasing and may provide an inexpensive and user-friendly means to diagnose AF. Objective To test the hypothesis that a smartphone-based application could detect an irregular pulse from AF. Methods 76 adults with persistent AF were consented for participation in our study. We obtained pulsatile time series recordings before and after cardioversion using an iPhone 4S camera. A novel smartphone application conducted real-time pulse analysis using 2 statistical methods [Root Mean Square of Successive RR Differences (RMSSD/mean); Shannon Entropy (ShE)]. We examined the sensitivity, specificity, and predictive accuracy of both algorithms using the 12-lead electrocardiogram as the gold standard. Results RMSSD/mean and ShE were higher in participants in AF compared with sinus rhythm. The 2 methods were inversely related to AF in regression models adjusting for key factors including heart rate and blood pressure (β coefficients per SD-increment in RMSSD/mean and ShE were −0.20 and −0.35; p<0.001). An algorithm combining the 2 statistical methods demonstrated excellent sensitivity (0.962), specificity (0.975), and accuracy (0.968) for beat-to-beat discrimination of an irregular pulse during AF from sinus rhythm. Conclusions In a prospectively recruited cohort of 76 participants undergoing cardioversion for AF, we found that a novel algorithm analyzing signals recorded using an iPhone 4S accurately distinguished pulse recordings during AF from sinus rhythm. Data are needed to explore the performance and acceptability of smartphone-based applications for AF detection. PMID:23220686

  14. A novel application for the detection of an irregular pulse using an iPhone 4S in patients with atrial fibrillation.

    PubMed

    McManus, David D; Lee, Jinseok; Maitas, Oscar; Esa, Nada; Pidikiti, Rahul; Carlucci, Alex; Harrington, Josephine; Mick, Eric; Chon, Ki H

    2013-03-01

    Atrial fibrillation (AF) is common and associated with adverse health outcomes. Timely detection of AF can be challenging using traditional diagnostic tools. Smartphone use is increasing and may provide an inexpensive and user-friendly means to diagnose AF. To test the hypothesis that a smartphone-based application could detect an irregular pulse from AF. Seventy-six adults with persistent AF were consented for participation in our study. We obtained pulsatile time series recordings before and after cardioversion using an iPhone 4S camera. A novel smartphone application conducted real-time pulse analysis using 2 statistical methods: root mean square of successive RR differences (RMSSD/mean) and Shannon entropy (ShE). We examined the sensitivity, specificity, and predictive accuracy of both algorithms using the 12-lead electrocardiogram as the gold standard. RMSSD/mean and ShE were higher in participants in AF than in those with sinus rhythm. The 2 methods were inversely related to AF in regression models adjusting for key factors including heart rate and blood pressure (beta coefficients per SD increment in RMSSD/mean and ShE were -0.20 and -0.35; P < .001). An algorithm combining the 2 statistical methods demonstrated excellent sensitivity (0.962), specificity (0.975), and accuracy (0.968) for beat-to-beat discrimination of an irregular pulse during AF from sinus rhythm. In a prospectively recruited cohort of 76 participants undergoing cardioversion for AF, we found that a novel algorithm analyzing signals recorded using an iPhone 4S accurately distinguished pulse recordings during AF from sinus rhythm. Data are needed to explore the performance and acceptability of smartphone-based applications for AF detection. Copyright © 2013 Heart Rhythm Society. All rights reserved.
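
    The two pulse-irregularity statistics used in both records are easy to state concretely. The sketch below computes RMSSD normalized by the mean RR interval and a histogram-based Shannon entropy on synthetic RR series; the binning and the simulated rhythms are assumptions, not the application's exact implementation.

```python
import numpy as np

def rmssd_over_mean(rr):
    """Root mean square of successive RR differences, normalized by the mean RR."""
    rr = np.asarray(rr, float)
    return np.sqrt(np.mean(np.diff(rr) ** 2)) / rr.mean()

def shannon_entropy(rr, bins=16):
    """Shannon entropy of the RR-interval histogram (higher = more irregular)."""
    counts, _ = np.histogram(rr, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(12)
sinus = rng.normal(0.80, 0.02, 120)     # regular rhythm, ~75 bpm (simulated)
af = rng.uniform(0.4, 1.2, 120)         # highly irregular intervals (simulated)
for name, rr in [("sinus", sinus), ("AF", af)]:
    print(name, round(rmssd_over_mean(rr), 3), round(shannon_entropy(rr), 2))
```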

  15. Layer-by-Layer Polyelectrolyte Encapsulation of Mycoplasma pneumoniae for Enhanced Raman Detection

    PubMed Central

    Rivera-Betancourt, Omar E.; Sheppard, Edward S.; Krause, Duncan C.; Dluhy, Richard A.

    2014-01-01

    Mycoplasma pneumoniae is a major cause of respiratory disease in humans and accounts for as much as 20% of all community-acquired pneumonia. Existing mycoplasma diagnosis is primarily limited by the poor success rate at culturing the bacteria from clinical samples. There is a critical need to develop a new platform for mycoplasma detection that has high sensitivity, specificity, and expediency. Here we report the layer-by-layer (LBL) encapsulation of M. pneumoniae cells with Ag nanoparticles in a matrix of the polyelectrolytes poly(allylamine hydrochloride) (PAH) and poly(styrene sulfonate) (PSS). We evaluated nanoparticle encapsulated mycoplasma cells as a platform for the differentiation of M. pneumoniae strains using surface enhanced Raman scattering (SERS) combined with multivariate statistical analysis. Three separate M. pneumoniae strains (M129, FH and II-3) were studied. Scanning electron microscopy and fluorescence imaging showed that the Ag nanoparticles were incorporated between the oppositely charged polyelectrolyte layers. SERS spectra showed that LBL encapsulation provides excellent spectral reproducibility. Multivariate statistical analysis of the Raman spectra differentiated the three M. pneumoniae strains with 97 – 100% specificity and sensitivity, and low (0.1 – 0.4) root mean square error. These results indicated that nanoparticle and polyelectrolyte encapsulation of M. pneumoniae is a potentially powerful platform for rapid and sensitive SERS-based bacterial identification. PMID:25017005

  16. Improving regional influenza surveillance through a combination of automated outbreak detection methods: the 2015/16 season in France.

    PubMed

    Pelat, Camille; Bonmarin, Isabelle; Ruello, Marc; Fouillet, Anne; Caserio-Schönemann, Céline; Levy-Bruhl, Daniel; Le Strat, Yann

    2017-08-10

    The 2014/15 influenza epidemic caused a work overload for healthcare facilities in France. The French national public health agency announced the start of the epidemic, based on indicators aggregated at the national level, too late for many hospitals to prepare. It was therefore decided to improve the influenza alert procedure through (i) the introduction of a pre-epidemic alert level to better anticipate future outbreaks, (ii) the regionalisation of surveillance so that healthcare structures can be informed of the arrival of epidemics in their region, and (iii) the standardised use of data sources and statistical methods across regions. A web application was developed to deliver the statistical results of three outbreak detection methods applied to three surveillance data sources: emergency departments, emergency general practitioners and sentinel general practitioners. This application was used throughout the 2015/16 influenza season by the epidemiologists of the headquarters and regional units of the French national public health agency. It allowed them to signal the first influenza epidemic alert in week 2016-W03, in Brittany, with 11 other regions in pre-epidemic alert. The application received positive feedback from users and was pivotal for coordinating surveillance across the agency's regional units. This article is copyright of The Authors, 2017.

  17. Soil genotoxicity assessment: a new strategy based on biomolecular tools and plant bioindicators.

    PubMed

    Citterio, Sandra; Aina, Roberta; Labra, Massimo; Ghiani, Alessandra; Fumagalli, Pietro; Sgorbati, Sergio; Santagostino, Angela

    2002-06-15

    The setting up of efficient early warning systems is a challenge for research aimed at preventing environmental alteration and human disease. In this paper, we report the development and field application of a new biomonitoring methodology for assessing soil genotoxicity. In the first part, the use of amplified fragment length polymorphism and flow cytometry techniques to detect DNA damage induced by soils artificially contaminated with heavy metals as potentially genotoxic compounds is explained. Results show that the combination of the two techniques leads to efficient detection of the sublethal genotoxic effect induced in the plant bioindicator by contaminated soil. By contrast, the classic mortality, root, and shoot growth vegetative endpoints prove inappropriate for assessing soil genotoxicity because, although they cause genotoxic damage, some heavy metals do not affect sentinel plant development negatively. Statistical elaboration of the data obtained led to a predictive statistical model which differentiates four levels of soil genotoxic pollution and can be used everywhere. The second part deals with the application of the biomonitoring protocol to the genotoxic assessment of two areas surrounding a steelworks in northern Italy and demonstrates the effectiveness of this methodology. In this particular case, the predictive model reveals a pollution level strictly correlated with the heavy metal concentrations revealed by traditional chemical analysis.

  18. Nuclear patterns of human breast cancer cells during apoptosis: characterisation by fractal dimension and co-occurrence matrix statistics.

    PubMed

    Losa, Gabriele A; Castelli, Christian

    2005-11-01

    An analytical strategy combining fractal geometry and grey-level co-occurrence matrix (GLCM) statistics was devised to investigate ultrastructural changes in oestrogen-insensitive SK-BR3 human breast cancer cells undergoing apoptosis in vitro. Apoptosis was induced by 1 microM calcimycin (A23187 Ca(2+) ionophore) and assessed by measuring conventional cellular parameters during the culture period. SK-BR3 cells entered the early stage of apoptosis within 24 h of treatment with calcimycin, which induced detectable changes in nuclear components, as documented by increased values of most GLCM parameters and by the general reduction of the fractal dimensions. In these affected cells, morphonuclear traits were accompanied by the reduction of distinct gangliosides and the loss of as yet unidentified glycolipid molecules at the cell surface. All these changes were shown to be involved in apoptosis before the detection of conventional markers, which were only measurable during the active phases of apoptotic cell death. In overtly apoptotic cells treated with 1 microM calcimycin for 72 h, most nuclear components underwent dramatic ultrastructural changes, including marginalisation and condensation of chromatin, as reflected in a significant reduction of their fractal dimensions. Hence, both fractal and GLCM analyses confirm that the morphological reorganisation of nuclei, attributable to a loss of structural complexity, occurs early in apoptosis.
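
    The GLCM side of this combined strategy can be sketched with standard tooling. The snippet below computes a grey-level co-occurrence matrix and a few of its texture statistics for a synthetic image using scikit-image; the image and the offset/angle choices are assumptions, not the study's imaging setup.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(13)
# Synthetic 8-bit stand-in for a nuclear region: smooth gradient plus noise
base = np.linspace(0, 200, 64)
nucleus = (base[None, :] + rng.normal(0, 10, (64, 64))).clip(0, 255).astype(np.uint8)

# Grey-level co-occurrence matrix at distance 1, horizontal offset
glcm = graycomatrix(nucleus, distances=[1], angles=[0], levels=256,
                    symmetric=True, normed=True)
for prop in ("contrast", "homogeneity", "energy", "correlation"):
    print(prop, round(graycoprops(glcm, prop)[0, 0], 4))
```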

  19. An image adaptive, wavelet-based watermarking of digital images

    NASA Astrophysics Data System (ADS)

    Agreste, Santa; Andaloro, Guido; Prestipino, Daniela; Puccio, Luigia

    2007-12-01

    In digital management, multimedia content and data can easily be used illegally: copied, modified and redistributed. Copyright protection, the intellectual and material rights of authors, owners, buyers and distributors, and the authenticity of content are crucial factors in solving an urgent and real problem. In this scenario, digital watermarking techniques are emerging as a valid solution. In this paper, we describe an algorithm--called WM2.0--for an invisible watermark: private, strong, wavelet-based and developed for the protection and authentication of digital images. The use of the discrete wavelet transform (DWT) is motivated by its good time-frequency characteristics and its good match with the human visual system; these two elements combined are important for building an invisible and robust watermark. WM2.0 works on a dual scheme: watermark embedding and watermark detection. The watermark is embedded into the high-frequency DWT components of a specific sub-image and is calculated from the image's features and statistical properties. Watermark detection applies a re-synchronisation between the original and watermarked images. The correlation between the watermarked DWT coefficients and the watermark signal is calculated according to the Neyman-Pearson statistical criterion. Experimentation on a large set of different images has shown the algorithm to be resistant against geometric, filtering and StirMark attacks, with a low false-alarm rate.
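
    As a rough illustration of the embed/detect scheme (a sketch, not WM2.0 itself), the snippet below, assuming PyWavelets, adds a pseudo-random mark to the level-1 diagonal detail coefficients and detects it with a linear correlation statistic; in a Neyman-Pearson test the detection threshold would be fixed from the acceptable false-alarm rate.

        # Minimal additive wavelet-domain watermark; illustrative only.
        import numpy as np
        import pywt

        rng = np.random.default_rng(42)
        img = rng.integers(0, 256, (256, 256)).astype(float)  # stand-in image

        cA, (cH, cV, cD) = pywt.wavedec2(img, "db2", level=1)
        wm = rng.choice([-1.0, 1.0], size=cD.shape)           # the watermark
        alpha = 2.0                                           # embedding strength

        marked = pywt.waverec2([cA, (cH, cV, cD + alpha * wm)], "db2")

        def detect(image, wm, alpha=2.0):
            _, (_, _, d) = pywt.wavedec2(image, "db2", level=1)
            # ~1 when the mark is present, ~0 under H0; a Neyman-Pearson
            # threshold between the two fixes the false-alarm rate
            return float(np.mean(d * wm) / alpha)

        print("marked image:", round(detect(marked, wm), 3))
        print("clean image :", round(detect(img, wm), 3))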

  20. Shrinkage covariance matrix approach based on robust trimmed mean in gene sets detection

    NASA Astrophysics Data System (ADS)

    Karjanto, Suryaefiza; Ramli, Norazan Mohamed; Ghani, Nor Azura Md; Aripin, Rasimah; Yusop, Noorezatty Mohd

    2015-02-01

    Microarray technology involves placing an orderly arrangement of thousands of gene sequences in a grid on a suitable surface. The technology has enabled novel discoveries since its development and has attracted increasing attention among researchers. Its widespread adoption is largely due to its ability to analyse thousands of genes simultaneously, in a massively parallel manner, in one experiment, providing valuable knowledge on gene interaction and function. A microarray data set typically consists of tens of thousands of genes (variables) measured on just dozens of samples, owing to various constraints. As a result, the sample covariance matrix in Hotelling's T2 statistic is not positive definite and becomes singular, so it cannot be inverted. In this research, Hotelling's T2 statistic is combined with a shrinkage approach as an alternative estimator of the covariance matrix in order to detect significant gene sets. The shrinkage covariance matrix overcomes the singularity problem by converting an unbiased estimator into an improved, biased estimator of the covariance matrix. A robust trimmed mean is integrated into the shrinkage matrix to reduce the influence of outliers and consequently increase its efficiency. The performance of the proposed method is measured using several simulation designs, and the results are expected to outperform existing techniques in many of the tested conditions.
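
    A minimal numerical sketch of the idea follows, with an arbitrary fixed shrinkage intensity in place of the paper's data-driven estimate, and scipy's trimmed mean standing in for its robust centring; function names and parameter values are illustrative.

        # Sketch: Hotelling's T2 for a gene set with a shrinkage covariance
        # estimate built around trimmed means (p > n, so shrinkage is needed).
        import numpy as np
        from scipy import stats

        def shrinkage_cov(X, lam=0.2, trim=0.1):
            """Covariance shrunk toward its diagonal, centred on trimmed means."""
            mu = stats.trim_mean(X, proportiontocut=trim, axis=0)
            Xc = X - mu
            S = Xc.T @ Xc / (len(X) - 1)
            target = np.diag(np.diag(S))
            return lam * target + (1 - lam) * S, mu

        def hotelling_t2(X1, X2, lam=0.2):
            S1, m1 = shrinkage_cov(X1, lam)
            S2, m2 = shrinkage_cov(X2, lam)
            n1, n2 = len(X1), len(X2)
            Sp = ((n1 - 1) * S1 + (n2 - 1) * S2) / (n1 + n2 - 2)  # pooled
            d = m1 - m2
            return (n1 * n2 / (n1 + n2)) * d @ np.linalg.solve(Sp, d)

        rng = np.random.default_rng(1)
        genes = 50                                # more genes than samples
        X1 = rng.normal(0.0, 1.0, (15, genes))
        X2 = rng.normal(0.3, 1.0, (15, genes))    # shifted gene set
        print("T2 =", round(hotelling_t2(X1, X2), 2))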

  1. Information gains from cosmic microwave background experiments

    NASA Astrophysics Data System (ADS)

    Seehars, Sebastian; Amara, Adam; Refregier, Alexandre; Paranjape, Aseem; Akeret, Joël

    2014-07-01

    To shed light on the fundamental problems posed by dark energy and dark matter, a large number of experiments have been performed and combined to constrain cosmological models. We propose a novel way of quantifying the information gained by updates on the parameter constraints from a series of experiments which can either complement earlier measurements or replace them. For this purpose, we use the Kullback-Leibler divergence or relative entropy from information theory to measure differences in the posterior distributions in model parameter space from a pair of experiments. We apply this formalism to a historical series of cosmic microwave background experiments ranging from Boomerang to WMAP, SPT, and Planck. Considering different combinations of these experiments, we thus estimate the information gain in units of bits and distinguish contributions from the reduction of statistical errors and the "surprise" corresponding to a significant shift of the parameters' central values. For this experiment series, we find individual relative entropy gains ranging from about 1 to 30 bits. In some cases, e.g. when comparing WMAP and Planck results, we find that the gains are dominated by the surprise rather than by improvements in statistical precision. We discuss how this technique provides a useful tool for both quantifying the constraining power of data from cosmological probes and detecting the tensions between experiments.
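
    For Gaussian approximations to the posteriors, the relative entropy has a closed form, so the information gain in bits can be computed directly; the quadratic term in the parameter shift is what drives the "surprise" contribution, while the trace and log-determinant terms capture the change in statistical precision. A toy sketch with invented constraint numbers, not actual CMB posteriors:

        # Sketch: KL divergence between two Gaussian posteriors, in bits.
        import numpy as np

        def kl_gaussian_bits(mu0, cov0, mu1, cov1):
            """D_KL(N1 || N0): information gained updating posterior 0 to 1."""
            d = len(mu0)
            inv0 = np.linalg.inv(cov0)
            dmu = np.asarray(mu1) - np.asarray(mu0)
            nats = 0.5 * (np.trace(inv0 @ cov1) + dmu @ inv0 @ dmu - d
                          + np.log(np.linalg.det(cov0) / np.linalg.det(cov1)))
            return nats / np.log(2.0)  # convert nats to bits

        # earlier loose constraint vs later tighter, slightly shifted one
        mu_a, cov_a = np.array([0.95, 0.70]), np.diag([0.04**2, 0.05**2])
        mu_b, cov_b = np.array([0.96, 0.68]), np.diag([0.01**2, 0.02**2])
        print(f"information gain: {kl_gaussian_bits(mu_a, cov_a, mu_b, cov_b):.2f} bits")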

  2. Fault Diagnosis Strategies for SOFC-Based Power Generation Plants

    PubMed Central

    Costamagna, Paola; De Giorgi, Andrea; Gotelli, Alberto; Magistri, Loredana; Moser, Gabriele; Sciaccaluga, Emanuele; Trucco, Andrea

    2016-01-01

    The success of distributed power generation by plants based on solid oxide fuel cells (SOFCs) is hindered by reliability problems that can be mitigated through an effective fault detection and isolation (FDI) system. However, the wide range of conditions under which such plants can operate and the random size of the possible faults make it very difficult to identify damaged plant components from the physical variables measured in the plant. In this context, we assess two classical FDI strategies (model-based with a fault signature matrix, and data-driven with statistical classification) as well as their combination. For this assessment, a quantitative model of the SOFC-based plant, able to simulate both regular and faulty conditions, is used. Moreover, a hybrid approach based on the random forest (RF) classification method is introduced to discriminate between regular and faulty situations, owing to its practical advantages. Working with a common dataset, the FDI performances obtained using the aforementioned strategies, with different sets of monitored variables, are observed and compared. We conclude that the hybrid FDI strategy, realized by combining a model-based scheme with a statistical classifier, outperforms the other strategies. In addition, the inclusion of two physical variables that should be measured inside the SOFCs can significantly improve the FDI performance, despite the practical difficulty of performing such measurements. PMID:27556472
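
    The data-driven leg of such a scheme can be sketched with scikit-learn's random forest on synthetic monitored variables; the fault signatures and variable count below are invented stand-ins for the plant simulator's outputs.

        # Sketch: random-forest fault classification on synthetic plant data.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import accuracy_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(7)
        n, n_vars = 2000, 8                        # samples x monitored variables
        X = rng.normal(size=(n, n_vars))
        y = rng.integers(0, 3, size=n)             # 0 = regular, 1-2 = fault classes
        X[y == 1, 0] += 1.5                        # fault 1 shifts variable 0
        X[y == 2, 3] -= 2.0                        # fault 2 shifts variable 3

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
        print("accuracy:", round(accuracy_score(y_te, clf.predict(X_te)), 3))
        # importances hint at which monitored variables carry fault information
        print("variable importances:", clf.feature_importances_.round(3))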

  3. rSeqNP: a non-parametric approach for detecting differential expression and splicing from RNA-Seq data.

    PubMed

    Shi, Yang; Chinnaiyan, Arul M; Jiang, Hui

    2015-07-01

    High-throughput sequencing of transcriptomes (RNA-Seq) has become a powerful tool to study gene expression. Here we present an R package, rSeqNP, which implements a non-parametric approach to test for differential expression and splicing from RNA-Seq data. rSeqNP uses permutation tests to assess statistical significance and can be applied to a variety of experimental designs. By combining information across isoforms, rSeqNP is able to detect more differentially expressed or spliced genes from RNA-Seq data. The R package, with its source code and documentation, is freely available at http://www-personal.umich.edu/∼jianghui/rseqnp/. Contact: jianghui@umich.edu. Supplementary data are available at Bioinformatics online.
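
    The permutation idea can be illustrated in a few lines: combine a statistic across a gene's isoforms, then permute the sample labels to build the null distribution. The sketch below is a schematic of that approach on synthetic data, not code from the package; the statistic used here is a simple sum of absolute group-mean differences.

        # Sketch: permutation test for a gene-level statistic combined
        # across isoforms (rows = isoforms, columns = samples).
        import numpy as np

        rng = np.random.default_rng(3)

        def gene_stat(expr, groups):
            a, b = expr[:, groups == 0], expr[:, groups == 1]
            return np.abs(a.mean(axis=1) - b.mean(axis=1)).sum()

        isoforms, samples = 3, 12
        groups = np.repeat([0, 1], samples // 2)
        expr = rng.normal(0, 1, (isoforms, samples))
        expr[:, groups == 1] += 0.8                # differentially expressed gene

        obs = gene_stat(expr, groups)
        null = np.array([gene_stat(expr, rng.permutation(groups))
                         for _ in range(5000)])
        # add-one correction keeps the permutation p-value strictly positive
        print("p-value:", ((null >= obs).sum() + 1) / (len(null) + 1))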

  4. Development of Raman Spectroscopy as a Clinical Diagnostic Tool

    NASA Astrophysics Data System (ADS)

    Borel, Santa

    Raman spectroscopy is the collection of inelastically scattered light whose spectrum contains biochemical information about the probed cells or tissue. This work presents both targeted and untargeted ways in which the technique can be exploited in biological samples. First, surface-enhanced Raman scattering (SERS) gold nanoparticles conjugated to targeting antibodies were shown to be successful for multiplexed detection of overexpressed surface antigens in lung cancer cell lines. Further work will need to optimize the conjugation technique to preserve the strong binding affinity of the antibodies. Second, untargeted Raman microspectroscopy combined with multivariate statistical analysis successfully differentiated mouse ovarian surface epithelial (MOSE) cells from spontaneously transformed ovarian surface epithelial (STOSE) cells with high accuracy. The differences between the two groups were associated with increased nucleic acid content in the STOSE cells. This shows potential for single-cell detection of ovarian cancer.

  5. Detection of severe Midwest thunderstorms using geosynchronous satellite data

    NASA Technical Reports Server (NTRS)

    Adler, R. F.; Markus, M. J.; Fenn, D. D.

    1985-01-01

    In the present exploration of the effectiveness of severe thunderstorm detection in the Midwestern region of the U.S. by means of approximately 5-min interval geosynchronous satellite data, thunderstorms are defined in IR data as points of relative minimum in brightness temperature T(B) having good time continuity and exhibiting a period of rapid growth. The four parameters (rate of T(B) decrease in the upper troposphere and stratosphere, isotherm expansion, and storm-lifetime minimum T(B)) are shown to be statistically related to the occurrence of severe weather on four case-study days and are combined into a Thunderstorm Index taking values from 1 to 9. Storms rated higher than 6 have a much higher probability of severe weather reports, yielding a warning lead time of 15 min for hail and 30 min for the first tornado report.

  6. Single-particle detection of products from atomic and molecular reactions in a cryogenic ion storage ring

    NASA Astrophysics Data System (ADS)

    Krantz, C.; Novotný, O.; Becker, A.; George, S.; Grieser, M.; Hahn, R. von; Meyer, C.; Schippers, S.; Spruck, K.; Vogel, S.; Wolf, A.

    2017-04-01

    We have used a single-particle detector system, based on secondary electron emission, for counting low-energy (∼keV/u) massive products originating from atomic and molecular ion reactions in the electrostatic Cryogenic Storage Ring (CSR). The detector is movable within the cryogenic vacuum chamber of the CSR and was used to measure production rates of a variety of charged and neutral daughter particles. In operation at a temperature of ∼6 K, the detector is characterised by a high dynamic range, combining a low dark event rate with good high-rate particle counting capability. On-line measurement of the pulse-height distributions proved to be an important monitor of the detector response at low temperature. Statistical pulse-height analysis allows the particle detection efficiency to be inferred; it has been found to be close to unity even in cryogenic operation at 6 K.

  7. Mapping quantitative trait loci for traits defined as ratios.

    PubMed

    Yang, Runqing; Li, Jiahan; Xu, Shizhong

    2008-03-01

    Many traits are defined as ratios of two quantitative traits. Methods of QTL mapping for regular quantitative traits are not optimal when applied to ratios, owing to the lack of normality of traits defined as ratios. We develop a new method of QTL mapping for traits defined as ratios. The new method uses a special linear combination of the two component traits and thus takes advantage of the normality of the new variable. A simulation study shows that the new method can substantially increase the statistical power of QTL detection relative to the method that treats ratios as regular quantitative traits. The new method also outperforms the method that uses the Box-Cox transformed ratio as the phenotype. A real example of QTL mapping for relative growth rate in soybean demonstrates that the new method can detect more QTL than existing methods of QTL mapping for traits defined as ratios.

  8. Implementation of jump-diffusion algorithms for understanding FLIR scenes

    NASA Astrophysics Data System (ADS)

    Lanterman, Aaron D.; Miller, Michael I.; Snyder, Donald L.

    1995-07-01

    Our pattern-theoretic approach to the automated understanding of forward-looking infrared (FLIR) images brings the traditionally separate endeavors of detection, tracking, and recognition together into a unified jump-diffusion process. New objects are detected and object types are recognized through discrete jump moves. Between jumps, the location and orientation of objects are estimated via continuous diffusions. A hypothesized scene, simulated from the emissive characteristics of the hypothesized scene elements, is compared with the collected data through a likelihood function based on sensor statistics. This likelihood is combined with a prior distribution defined over the set of possible scenes to form a posterior distribution, from which the jump-diffusion process empirically samples. Both the diffusion and jump operations involve simulating the scene produced by a hypothesized configuration. Scene simulation is most effectively accomplished by pipelined rendering engines such as Silicon Graphics hardware. We demonstrate the execution of our algorithm on a Silicon Graphics Onyx/RealityEngine.

  9. Relationship between oral motor dysfunction and oral bacteria in bedridden elderly.

    PubMed

    Tada, Akio; Shiiba, Masashi; Yokoe, Hidetaka; Hanada, Nobuhiro; Tanzawa, Hideki

    2004-08-01

    The purpose of this study was to analyze the relationship between oral bacterial colonization and oral motor dysfunction. Oral motor dysfunction (swallowing and speech disorders) and the detection of oral bacterial species in dental plaque were investigated and statistically analyzed in 55 elderly persons who had been hospitalized for more than 3 months. The detection rates of methicillin-resistant Staphylococcus aureus (MRSA), Pseudomonas aeruginosa, Streptococcus agalactiae, and Stenotrophomonas maltophilia were significantly higher in subjects with a swallowing disorder than in those without. A similar result was found for the presence of a speech disorder. About half of the subjects who had both oral motor dysfunction and hypoalbuminemia showed colonization by MRSA and/or Pseudomonas aeruginosa. These results suggest that the combination of oral motor dysfunction and hypoalbuminemia elevates the risk of colonization by opportunistic microorganisms in the oral cavity of elderly patients hospitalized over the long term.
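
    A comparison of detection rates of this kind is typically run as an exact test on a 2x2 table. The sketch below shows the shape of such an analysis; the counts are invented for illustration and are not the study's data.

        # Sketch: detection rate of one species in subjects with vs without
        # a swallowing disorder, compared with Fisher's exact test.
        from scipy import stats

        #                   detected  not detected
        with_disorder    = [14,       11]
        without_disorder = [4,        26]
        odds, p = stats.fisher_exact([with_disorder, without_disorder])
        print(f"odds ratio = {odds:.2f}, p = {p:.4f}")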

  10. Event-Ready Bell Test Using Entangled Atoms Simultaneously Closing Detection and Locality Loopholes

    NASA Astrophysics Data System (ADS)

    Rosenfeld, Wenjamin; Burchardt, Daniel; Garthoff, Robert; Redeker, Kai; Ortegel, Norbert; Rau, Markus; Weinfurter, Harald

    2017-07-01

    An experimental test of Bell's inequality allows ruling out any local-realistic description of nature by measuring correlations between distant systems. While such tests are conceptually simple, there are strict requirements concerning the detection efficiency of the involved measurements, as well as the enforcement of spacelike separation between the measurement events. Only very recently could both loopholes be closed simultaneously. Here we present a statistically significant, event-ready Bell test based on combining heralded entanglement of atoms separated by 398 m with fast and efficient measurements of the atomic spin states, closing the essential loopholes. We obtain a violation with S = 2.221 ± 0.033 (compared to the maximal value of 2 achievable with models based on local hidden variables), which allows us to refute the hypothesis of local realism with a significance level P < 2.57 × 10^-9.
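
    For readers unfamiliar with the S statistic: it is the CHSH combination of four correlators, each estimated from coincidence counts at one pair of measurement settings. The toy evaluation below uses made-up counts; the experiment's actual analysis also handles timing cuts and the hypothesis test behind the quoted P value.

        # Sketch: estimating the CHSH combination S from coincidence counts.
        def correlator(counts):
            """E = (N++ + N-- - N+- - N-+) / N_total for one setting pair."""
            npp, nmm, npm, nmp = counts
            return (npp + nmm - npm - nmp) / (npp + nmm + npm + nmp)

        # counts (N++, N--, N+-, N-+) for setting pairs (a,b), (a,b'), (a',b), (a',b')
        settings = {
            ("a", "b"):   (430, 420, 80, 70),
            ("a", "b'"):  (90, 75, 410, 425),
            ("a'", "b"):  (440, 415, 70, 75),
            ("a'", "b'"): (420, 430, 85, 65),
        }
        E = {k: correlator(v) for k, v in settings.items()}
        S = E[("a", "b")] - E[("a", "b'")] + E[("a'", "b")] + E[("a'", "b'")]
        print(f"S = {S:.3f}  (local realism requires S <= 2)")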

  11. An XMM-Newton Study of the Bright Ultrasoft Narrow-Line Quasar NAB 0205+024

    NASA Technical Reports Server (NTRS)

    Brandt, Niel

    2004-01-01

    The broad-band X-ray continuum of NAB 0205424 is well constrained due to the excellent photon statistics obtained (about 97,700 counts), and its impressive soft X-ray excess is clearly apparent. The hard X-ray power law has become notably steeper than when NAB 0205424 was observed with ASCA, attesting to the presence of significant X-ray spectral variability. A strong and broad emission feature is detected from about 5 to 6.4 keV, and we have modeled this as a relativistic line emitted close to the black hole from a narrow annulus of the accretion disk. Furthermore, a strong X-ray flare is detected with a hard X-ray spectrum; this flare may be responsible for illuminating the inner line-emitting part of the accretion disk. The combined observational results can be broadly interpreted in terms of the "thundercloud model proposed by Merloni & Fabian (2001).

  12. Textile Pressure Mapping Sensor for Emotional Touch Detection in Human-Robot Interaction

    PubMed Central

    Cruz Zurian, Heber; Atefi, Seyed Reza; Seoane Martinez, Fernando; Lukowicz, Paul

    2017-01-01

    In this paper, we developed a fully textile sensing fabric for tactile touch sensing, used as a robot skin to detect human-robot interactions. The sensor covers a 20-by-20 cm2 area with 400 sensitive points and samples at 50 Hz per point. We defined seven gestures, inspired by the social and emotional interactions of typical people-to-people or people-to-pet scenarios. We conducted two groups of mutually blinded experiments, involving 29 participants in total. The data processing algorithm first reduces the spatial complexity to frame descriptors, and temporal features are then calculated through basic statistical representations and wavelet analysis. Various classifiers are evaluated, and the feature calculation algorithms are analyzed in detail to determine the contribution of each stage and segment. The best-performing feature-classifier combination can recognize the gestures with 93.3% accuracy for a known group of participants, and 89.1% for strangers. PMID:29120389

  13. Textile Pressure Mapping Sensor for Emotional Touch Detection in Human-Robot Interaction.

    PubMed

    Zhou, Bo; Altamirano, Carlos Andres Velez; Zurian, Heber Cruz; Atefi, Seyed Reza; Billing, Erik; Martinez, Fernando Seoane; Lukowicz, Paul

    2017-11-09

    In this paper, we developed a fully textile sensing fabric for tactile touch sensing, used as a robot skin to detect human-robot interactions. The sensor covers a 20-by-20 cm2 area with 400 sensitive points and samples at 50 Hz per point. We defined seven gestures, inspired by the social and emotional interactions of typical people-to-people or people-to-pet scenarios. We conducted two groups of mutually blinded experiments, involving 29 participants in total. The data processing algorithm first reduces the spatial complexity to frame descriptors, and temporal features are then calculated through basic statistical representations and wavelet analysis. Various classifiers are evaluated, and the feature calculation algorithms are analyzed in detail to determine the contribution of each stage and segment. The best-performing feature-classifier combination can recognize the gestures with 93.3% accuracy for a known group of participants, and 89.1% for strangers.
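
    The pipeline described above (frames to descriptors, descriptors to statistical and wavelet features, features to classifier) can be sketched as follows. The pressure sequences, descriptor choices, and two-gesture setup below are synthetic stand-ins for the paper's data and feature set; PyWavelets and scikit-learn are assumed.

        # Sketch: frame descriptors -> temporal/wavelet features -> classifier.
        import numpy as np
        import pywt
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(5)

        def frame_descriptors(frames):
            """Reduce each 20x20 frame to (mean, max, active-area)."""
            flat = frames.reshape(len(frames), -1)
            return np.stack([flat.mean(1), flat.max(1), (flat > 0.5).mean(1)], axis=1)

        def temporal_features(desc):
            feats = []
            for channel in desc.T:                  # one descriptor time series
                cA, cD = pywt.dwt(channel, "db1")   # wavelet approx/detail
                feats += [channel.mean(), channel.std(),
                          (cA ** 2).sum(), (cD ** 2).sum()]
            return feats

        def make_gesture(kind, T=100):
            base = rng.random((T, 20, 20)) * 0.3
            if kind == 1:                           # "stroke": a moving blob
                for t in range(T):
                    c = (t // 5) % 16
                    base[t, 5:9, c:c + 4] += 0.8
            return base

        X = np.array([temporal_features(frame_descriptors(make_gesture(k)))
                      for k in (0, 1) for _ in range(30)])
        y = np.repeat([0, 1], 30)
        clf = make_pipeline(StandardScaler(), SVC())
        print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())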

  14. Simultaneous use of multiplex ligation-dependent probe amplification assay and flow cytometric DNA ploidy analysis in patients with acute leukemia.

    PubMed

    Reyes-Núñez, Virginia; Galo-Hooker, Evelyn; Pérez-Romano, Beatriz; Duque, Ricardo E; Ruiz-Arguelles, Alejandro; Garcés-Eisele, Javier

    2018-01-01

    The aim of this work was to simultaneously use a multiplex ligation-dependent probe amplification (MLPA) assay and flow cytometric DNA ploidy analysis (FPA) to detect aneuploidy in patients with newly diagnosed acute leukemia. The MLPA assay and propidium iodide FPA were used to test samples from 53 consecutive patients with newly diagnosed acute leukemia referred to our laboratory for immunophenotyping. Results were compared by nonparametric statistics. The combined use of both methods significantly increased the rate of detection of aneuploidy compared to that obtained by each method alone; the limitations of one method are to some extent countervailed by the other, and vice versa. MLPA and FPA yield different yet complementary information concerning aneuploidy in acute leukemia, and the simultaneous use of both methods may be recommended in the clinical setting.

  15. Robust inference for group sequential trials.

    PubMed

    Ganju, Jitendra; Lin, Yunzhi; Zhou, Kefei

    2017-03-01

    For ethical reasons, group sequential trials were introduced to allow trials to stop early in the event of extreme results. Endpoints in such trials are usually mortality or irreversible morbidity. For a given endpoint, the norm is to use a single test statistic and to use that same statistic for each analysis. This approach is risky because the test statistic has to be specified before the study is unblinded, and there is a loss in power if the assumptions that ensure optimality at each analysis are not met. To minimize the risk of moderate to substantial loss in power due to a suboptimal choice of statistic, a robust method was developed for nonsequential trials. The concept is analogous to diversifying financial investments to minimize risk. The method is based on combining P values from multiple test statistics for formal inference while controlling the type I error rate at its designated value. This article evaluates the performance of two P-value-combining methods for group sequential trials. The emphasis is on time-to-event trials, although results from less complex trials are also included. The gain or loss in power with the combination method relative to a single statistic is asymmetric in its favor: depending on the power of each individual test, the combination method can give more power than any single test, or give power close to that of the most powerful test. The versatility of the method is that it can combine P values from different test statistics for analyses at different times. The robustness of the results suggests that inference from group sequential trials can be strengthened by the use of combined tests.
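
    The combination step itself is standard; scipy exposes Fisher's and Stouffer's methods directly, as sketched below. Note that the three tests share the same data, so an independence-based combination is only illustrative here; the paper's method controls the type I error for correlated statistics, which this sketch does not.

        # Sketch: combining P values from several tests of the same comparison.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(11)
        x = rng.exponential(1.0, 80)        # e.g. survival-like times, group A
        y = rng.exponential(1.4, 80)        # group B, modestly different

        p_values = [
            stats.mannwhitneyu(x, y).pvalue,
            stats.ttest_ind(x, y).pvalue,
            stats.ks_2samp(x, y).pvalue,
        ]
        for method in ("fisher", "stouffer"):
            stat, p = stats.combine_pvalues(p_values, method=method)
            print(f"{method:8s} combined p = {p:.4f}")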

  16. Immunoelectron microscopy of RNA combined with nucleic acid cytochemistry in plant nucleoli.

    PubMed

    Mena, C G; Testillano, P S; González-Melendi, P; Gorab, E; Risueño, M C

    1994-06-01

    The immunoelectron microscopy detection of RNA using anti-RNA monoclonal antibodies has been performed for the first time on different plant cells. The methylation-acetylation (MA) method permits a clear distinction between the nuclear and nucleolar compartments and can be combined with the immunogold approach. Cytochemical methods for nucleic acids were performed together with the immunoassays, providing additional data on the differing composition of the various nucleolar components. Anti-RNA antibodies strongly labeled the ribosome-rich areas of the cytoplasm and the nucleolus; the interchromatin region was also labeled. Within the nucleolus, the labeling was intense in the granular component, lower in the dense fibrillar component, and very scarce in the fibrillar centers. The MA method made possible a statistical evaluation of the labeling density in the various nuclear compartments by permitting the clear assignment of gold particles to precise nuclear structures.

  17. Detecting Answer Copying Using Alternate Test Forms and Seat Locations in Small-Scale Examinations

    ERIC Educational Resources Information Center

    van der Ark, L. Andries; Emons, Wilco H. M.; Sijtsma, Klaas

    2008-01-01

    Two types of answer-copying statistics for detecting copiers in small-scale examinations are proposed. One statistic identifies the "copier-source" pair; the other additionally suggests who is the copier and who is the source. Both types of statistics can be used when the examination has alternate test forms. A simulation study shows that the…

  18. Enhanced data validation strategy of air quality monitoring network.

    PubMed

    Harkat, Mohamed-Faouzi; Mansouri, Majdi; Nounou, Mohamed; Nounou, Hazem

    2018-01-01

    Quick validation and detection of faults in measured air quality data is a crucial step towards achieving the objectives of air quality networks. The objectives of this paper are therefore threefold: (i) to develop a modeling technique that can be used to predict the normal behavior of air quality variables and provide an accurate reference for monitoring purposes; (ii) to develop a fault detection method that can effectively and quickly detect any anomalies in measured air quality data - for this purpose, a new fault detection method based on the combination of the generalized likelihood ratio test (GLRT) and the exponentially weighted moving average (EWMA) is developed; GLRT is a well-known statistical fault detection method that relies on maximizing the detection probability for a given false alarm rate, and the proposed GLRT-based EWMA method is able to detect changes in the values of certain air quality variables; and (iii) to develop a fault isolation and identification method that locates the fault source(s) so that appropriate corrective actions can be applied. For this last objective, a reconstruction approach based on a Midpoint-Radii Principal Component Analysis (MRPCA) model is developed to handle the types of data and models associated with air quality monitoring networks. All air quality modeling, fault detection, fault isolation and reconstruction methods developed in this paper are validated using real air quality data (such as measurements of particulate matter, ozone, and nitrogen and carbon oxides).
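
    The EWMA half of such a detector is compact enough to sketch; the GLRT refinement would replace the fixed control-limit multiplier with a maximized likelihood ratio. The residuals and the injected fault below are synthetic, and the limits are the standard EWMA control limits under an N(0, sigma^2) no-fault model.

        # Sketch: EWMA chart on model residuals with standard control limits.
        import numpy as np

        def ewma_detect(residuals, lam=0.2, L=3.0, sigma=1.0):
            z, alarms = 0.0, []
            for t, r in enumerate(residuals):
                z = lam * r + (1 - lam) * z
                # time-varying standard deviation of the EWMA statistic
                limit = L * sigma * np.sqrt(
                    lam / (2 - lam) * (1 - (1 - lam) ** (2 * (t + 1))))
                if abs(z) > limit:
                    alarms.append(t)
            return alarms

        rng = np.random.default_rng(9)
        res = rng.normal(0, 1, 300)
        res[200:] += 1.5            # sustained sensor fault from t = 200
        alarms = ewma_detect(res)
        print("first alarm after fault:", next(a for a in alarms if a >= 200))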

  19. Detecting rater bias using a person-fit statistic: a Monte Carlo simulation study.

    PubMed

    Aubin, André-Sébastien; St-Onge, Christina; Renaud, Jean-Sébastien

    2018-04-01

    With the Standards voicing concern for the appropriateness of response processes, we need to explore strategies that would allow us to identify inappropriate rater response processes. Although certain statistics can be used to help detect rater bias, their use is complicated by either a lack of data about their actual power to detect rater bias or the difficulty of applying them in the context of health professions education. This exploratory study aimed to establish the worthiness of pursuing the use of lz to detect rater bias. We conducted a Monte Carlo simulation study to investigate the power of a specific detection statistic: the standardized log-likelihood person-fit statistic (PFS) lz. Our primary outcome was the detection rate of biased raters, namely raters whom we manipulated into being either stringent (giving lower scores) or lenient (giving higher scores), using the lz statistic while controlling for the number of biased raters in a sample (6 levels) and the rate of bias per rater (6 levels). Overall, stringent raters (M = 0.84, SD = 0.23) were easier to detect than lenient raters (M = 0.31, SD = 0.28), and more biased raters were easier to detect than less biased raters (60% bias: M = 0.62, SD = 0.37; 10% bias: M = 0.43, SD = 0.36). The PFS lz seems to offer interesting potential for identifying biased raters: we observed detection rates as high as 90% for stringent raters for whom we had manipulated more than half of their checklist. Although we observed very interesting results, we cannot generalize them to the use of PFS with estimated item/station parameters or real data; such studies should be conducted to assess the feasibility of using PFS to identify rater bias.
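
    For dichotomous scores with known item parameters, lz is the observed log-likelihood standardized by its expectation and variance; large negative values flag misfit. The sketch below computes lz under a Rasch model with illustrative parameters; the study's design, with rating scales and estimated station parameters, is not reproduced.

        # Sketch: the lz person-fit statistic for a dichotomous score vector.
        import numpy as np

        def lz(responses, theta, difficulties):
            p = 1.0 / (1.0 + np.exp(-(theta - difficulties)))  # Rasch P(score=1)
            l0 = np.sum(responses * np.log(p) + (1 - responses) * np.log(1 - p))
            mean = np.sum(p * np.log(p) + (1 - p) * np.log(1 - p))
            var = np.sum(p * (1 - p) * np.log(p / (1 - p)) ** 2)
            return (l0 - mean) / np.sqrt(var)

        rng = np.random.default_rng(13)
        b = rng.normal(0, 1, 30)                    # 30 checklist items
        theta = 0.5
        probs = 1 / (1 + np.exp(-(theta - b)))
        fitting = (rng.random(30) < probs).astype(int)
        stringent = fitting.copy()
        stringent[:15] = 0                          # rater scores low on half the items
        print("lz fitting  :", round(lz(fitting, theta, b), 2))
        print("lz stringent:", round(lz(stringent, theta, b), 2))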

  20. Quantum walks: The first detected passage time problem

    NASA Astrophysics Data System (ADS)

    Friedman, H.; Kessler, D. A.; Barkai, E.

    2017-03-01

    Even after decades of research, the problem of first passage time statistics for quantum dynamics remains a challenging topic of fundamental and practical importance. Using a projective measurement approach, with a sampling time τ, we obtain the statistics of first detection events for quantum dynamics on a lattice, with the detector located at the origin. A quantum renewal equation for a first detection wave function, in terms of which the first detection probability can be calculated, is derived. This formula gives the relation between first detection statistics and the solution of the corresponding Schrödinger equation in the absence of measurement. We illustrate our results with tight-binding quantum walk models. We examine a closed system, i.e., a ring, and reveal the intricate influence of the sampling time τ on the statistics of detection, discussing the quantum Zeno effect, half dark states, revivals, and optimal detection. The initial condition modifies the statistics of a quantum walk on a finite ring in surprising ways. In some cases, the average detection time is independent of the sampling time, while in others the average exhibits multiple divergences as the sampling time is modified. For an unbounded one-dimensional quantum walk, the probability of first detection decays like (time)^(-3) with superimposed oscillations, with exceptional behavior when the sampling period τ times the tunneling rate γ is a multiple of π/2. The amplitude of the power-law decay is suppressed as τ → 0 due to the Zeno effect. Our work, an extended version of our previously published paper, predicts rich physical behaviors compared with classical Brownian motion, for which the first passage probability density decays monotonically like (time)^(-3/2), as elucidated by Schrödinger in 1915.
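
    The stroboscopic protocol is straightforward to simulate: evolve under the tight-binding Hamiltonian for a time τ, read off the detection probability at the origin, and project out the detected component after each null measurement. A minimal sketch for a ring (ring size, τ, and the initial site are arbitrary choices for illustration):

        # Sketch: first detection probabilities F_n for a tight-binding
        # quantum walk on a ring with projective detection at site 0.
        import numpy as np
        from scipy.linalg import expm

        N, gamma, tau = 6, 1.0, 1.5          # ring size, tunneling rate, sampling time
        H = np.zeros((N, N), dtype=complex)
        for j in range(N):                   # nearest-neighbour hopping on a ring
            H[j, (j + 1) % N] = H[(j + 1) % N, j] = -gamma
        U = expm(-1j * H * tau)              # unitary evolution between measurements

        psi = np.zeros(N, dtype=complex)
        psi[3] = 1.0                         # start opposite the detector at site 0
        F = []
        for n in range(40):
            psi = U @ psi                    # evolve for one sampling interval
            F.append(abs(psi[0]) ** 2)       # probability of first detection at step n
            psi[0] = 0.0                     # null result: project out detected state
        print("total detection probability:", round(float(sum(F)), 4))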
