Sample records for "detect statistical differences"

  1. An experimental validation of a statistical-based damage detection approach.

    DOT National Transportation Integrated Search

    2011-01-01

    In this work, a previously developed, statistics-based damage-detection approach was validated for its ability to autonomously detect damage in bridges. The damage-detection approach uses statistical differences in the actual and predicted behavior…

  2. Decisions that Make a Difference in Detecting Differential Item Functioning

    ERIC Educational Resources Information Center

    Sireci, Stephen G.; Rios, Joseph A.

    2013-01-01

    There are numerous statistical procedures for detecting items that function differently across subgroups of examinees that take a test or survey. However, in endeavouring to detect items that may function differentially, selection of the statistical method is only one of many important decisions. In this article, we discuss the important decisions…

  3. Strategies for Testing Statistical and Practical Significance in Detecting DIF with Logistic Regression Models

    ERIC Educational Resources Information Center

    Fidalgo, Angel M.; Alavi, Seyed Mohammad; Amirian, Seyed Mohammad Reza

    2014-01-01

    This study examines three controversial aspects in differential item functioning (DIF) detection by logistic regression (LR) models: first, the relative effectiveness of different analytical strategies for detecting DIF; second, the suitability of the Wald statistic for determining the statistical significance of the parameters of interest; and…
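
    As a concrete illustration of the LR approach to DIF discussed in this record, here is a minimal Python sketch (statsmodels assumed available) of the standard nested-model comparison from the DIF literature; the function name and variables are mine, not code from the article.

    ```python
    import numpy as np
    import statsmodels.api as sm

    def lr_dif_test(item_correct, total_score, group):
        """2-df likelihood-ratio DIF test: do group and group-by-score terms
        improve on a logistic model that conditions on total score alone?
        Inputs are 1-D numpy arrays; group is coded 0/1."""
        base = sm.Logit(item_correct, sm.add_constant(total_score)).fit(disp=0)
        X = np.column_stack([total_score, group, total_score * group])
        full = sm.Logit(item_correct, sm.add_constant(X)).fit(disp=0)
        return 2 * (full.llf - base.llf)  # compare to chi-squared with 2 df
    ```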

  4. Got power? A systematic review of sample size adequacy in health professions education research.

    PubMed

    Cook, David A; Hatala, Rose

    2015-03-01

    Many education research studies employ small samples, which in turn lowers statistical power. We re-analyzed the results of a meta-analysis of simulation-based education to determine study power across a range of effect sizes, and the smallest effect that could be plausibly excluded. We systematically searched multiple databases through May 2011, and included all studies evaluating simulation-based education for health professionals in comparison with no intervention or another simulation intervention. Reviewers working in duplicate abstracted information to calculate standardized mean differences (SMDs). We included 897 original research studies. Among the 627 no-intervention-comparison studies, the median sample size was 25. Only two studies (0.3%) had ≥80% power to detect a small difference (SMD > 0.2 standard deviations) and 136 (22%) had power to detect a large difference (SMD > 0.8). Of the 110 no-intervention-comparison studies that failed to find a statistically significant difference, none excluded a small difference and only 47 (43%) excluded a large difference. Among the 297 studies comparing alternate simulation approaches, the median sample size was 30. Only one study (0.3%) had ≥80% power to detect a small difference and 79 (27%) had power to detect a large difference. Of the 128 studies that did not detect a statistically significant effect, 4 (3%) excluded a small difference and 91 (71%) excluded a large difference. In conclusion, most education research studies are powered only to detect effects of large magnitude. For most studies that do not reach statistical significance, the possibility of large and important differences still exists.
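
    To make these power figures concrete, the sketch below (my own illustration, not the authors' code) computes two-sample t-test power from the noncentral t distribution, using the abstract's reported median sample size of 25 per group:

    ```python
    import numpy as np
    from scipy import stats

    def ttest_power(smd, n_per_group, alpha=0.05):
        """Two-sided two-sample t-test power to detect a standardized mean
        difference (SMD), equal group sizes, via the noncentral t."""
        df = 2 * n_per_group - 2
        ncp = smd * np.sqrt(n_per_group / 2)       # noncentrality parameter
        tcrit = stats.t.ppf(1 - alpha / 2, df)
        return (1 - stats.nct.cdf(tcrit, df, ncp)) + stats.nct.cdf(-tcrit, df, ncp)

    print(ttest_power(0.2, 25))   # small effect: power far below 80%
    print(ttest_power(0.8, 25))   # large effect: power roughly 80%
    ```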

  5. A Method of Time-Series Change Detection Using Full PolSAR Images from Different Sensors

    NASA Astrophysics Data System (ADS)

    Liu, W.; Yang, J.; Zhao, J.; Shi, H.; Yang, L.

    2018-04-01

    Most of the existing change detection methods using full polarimetric synthetic aperture radar (PolSAR) are limited to detecting change between two points in time. In this paper, a novel method was proposed to detect change based on time-series data from different sensors. Firstly, the overall difference image of a time-series PolSAR was calculated by an omnibus statistic test. Secondly, difference images between any two images at different times were acquired by the Rj statistic test. A generalized Gaussian mixture model (GGMM) was used to obtain time-series change detection maps in the last step of the proposed method. To verify the effectiveness of the proposed method, we carried out a change detection experiment using the time-series PolSAR images acquired by Radarsat-2 and Gaofen-3 over the city of Wuhan, China. Results show that the proposed method can detect the time-series change from different sensors.

  6. A flexibly shaped space-time scan statistic for disease outbreak detection and monitoring.

    PubMed

    Takahashi, Kunihiko; Kulldorff, Martin; Tango, Toshiro; Yih, Katherine

    2008-04-11

    Early detection of disease outbreaks enables public health officials to implement disease control and prevention measures at the earliest possible time. A time periodic geographical disease surveillance system based on a cylindrical space-time scan statistic has been used extensively for disease surveillance along with the SaTScan software. In the purely spatial setting, many different methods have been proposed to detect spatial disease clusters. In particular, some spatial scan statistics are aimed at detecting irregularly shaped clusters which may not be detected by the circular spatial scan statistic. Based on the flexible purely spatial scan statistic, we propose a flexibly shaped space-time scan statistic for early detection of disease outbreaks. The performance of the proposed space-time scan statistic is compared with that of the cylindrical scan statistic using benchmark data. In order to compare their performances, we have developed a space-time power distribution by extending the purely spatial bivariate power distribution. Daily syndromic surveillance data in Massachusetts, USA, are used to illustrate the proposed test statistic. The flexible space-time scan statistic is well suited for detecting and monitoring disease outbreaks in irregularly shaped areas.

  7. An Unsupervised Change Detection Method Using Time-Series of PolSAR Images from Radarsat-2 and GaoFen-3.

    PubMed

    Liu, Wensong; Yang, Jie; Zhao, Jinqi; Shi, Hongtao; Yang, Le

    2018-02-12

    The traditional unsupervised change detection methods based on the pixel level can only detect the changes between two different times with the same sensor, and the results are easily affected by speckle noise. In this paper, a novel method is proposed to detect change based on time-series data from different sensors. Firstly, the overall difference image of the time-series PolSAR is calculated by omnibus test statistics, and difference images between any two images at different times are acquired by Rj test statistics. Secondly, the difference images are segmented with a Generalized Statistical Region Merging (GSRM) algorithm, which can suppress the effect of speckle noise. A Generalized Gaussian Mixture Model (GGMM) is then used to obtain the time-series change detection maps in the final step of the proposed method. To verify the effectiveness of the proposed method, we carried out a change detection experiment using time-series PolSAR images acquired by Radarsat-2 and Gaofen-3 over the city of Wuhan, China. Results show that the proposed method can not only detect the time-series change from different sensors, but can also better suppress the influence of speckle noise and improve the overall accuracy and Kappa coefficient.
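
    The "omnibus test statistics" step in records 5 and 7 refers to a test for equality of k complex-Wishart-distributed covariance matrices. The numpy sketch below is a hedged transcription of the ln Q statistic under the simplifying assumption of an equal number of looks n per image and full-pol dimension p = 3; it is illustrative, not the authors' code.

    ```python
    import numpy as np

    def omnibus_lnQ(covs, n):
        """ln Q omnibus statistic for equality of k sample covariance
        matrices (one pixel), each averaged over n looks."""
        k = len(covs)
        p = covs[0].shape[0]                  # e.g. p = 3 for full-pol data
        X = sum(covs)
        lndet = lambda M: np.log(abs(np.linalg.det(M)))
        return n * (p * k * np.log(k) + sum(lndet(C) for C in covs) - k * lndet(X))
    ```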

  8. Soil carbon inventories under a bioenergy crop (switchgrass): Measurement limitations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garten, C.T. Jr.; Wullschleger, S.D.

    Approximately 5 yr after planting, coarse root carbon (C) and soil organic C (SOC) inventories were compared under different types of plant cover at four switchgrass (Panicum virgatum L.) production field trials in the southeastern USA. There was significantly more coarse root C under switchgrass (Alamo variety) and forest cover than tall fescue (Festuca arundinacea Schreb.), corn (Zea mays L.), or native pastures of mixed grasses. Inventories of SOC under switchgrass were not significantly greater than SOC inventories under other plant covers. At some locations the statistical power associated with ANOVA of SOC inventories was low, which raised questions about whether differences in SOC could be detected statistically. A minimum detectable difference (MDD) for SOC inventories was calculated. The MDD is the smallest detectable difference between treatment means once the variation, significance level, statistical power, and sample size are specified. The analysis indicated that a difference of ≈50 mg SOC/cm² or 5 Mg SOC/ha, which is ≈10 to 15% of existing SOC, could be detected with reasonable sample sizes and good statistical power. The smallest difference in SOC inventories that can be detected, and only with exceedingly large sample sizes, is ≈2 to 3%. These measurement limitations have implications for monitoring and verification of proposals to ameliorate increasing global atmospheric CO₂ concentrations by sequestering C in soils.
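
    The MDD concept translates directly into a short calculation. The sketch below uses the generic two-mean formula, assuming the ANOVA error variance (MSE) is available; it is an illustration in that spirit, not the authors' exact procedure.

    ```python
    import numpy as np
    from scipy import stats

    def mdd_two_means(mse, n_per_group, alpha=0.05, power=0.80):
        """Minimum detectable difference between two treatment means given
        the ANOVA error variance (MSE), sample size, alpha, and power."""
        df = 2 * (n_per_group - 1)
        t_alpha = stats.t.ppf(1 - alpha / 2, df)
        t_beta = stats.t.ppf(power, df)
        return (t_alpha + t_beta) * np.sqrt(2 * mse / n_per_group)
    ```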

  9. Human papillomavirus detection with genotyping by the cobas and Aptima assays: Significant differences in HPV 16 detection?

    PubMed

    Chorny, Joseph A; Frye, Teresa C; Fisher, Beth L; Remmers, Carol L

    2018-03-23

    The primary high-risk human papillomavirus (hrHPV) assays in the United States are the cobas (Roche) and the Aptima (Hologic). The cobas assay detects hrHPV by DNA analysis while the Aptima detects messenger RNA (mRNA) oncogenic transcripts. As the Aptima assay identifies oncogenic expression, it should have a lower rate of hrHPV and genotype detection. The Kaiser Permanente Regional Reference Laboratory in Denver, Colorado changed its hrHPV assay from the cobas to the Aptima assay. The rates of hrHPV detection and genotyping were compared over successive six-month periods. The overall hrHPV detection rates by the two platforms were similar (9.5% versus 9.1%) and not statistically different. For genotyping, the HPV 16 rate by the cobas was 1.6% and by the Aptima it was 1.1%. This difference was statistically significant, with the Aptima detecting nearly one-third fewer HPV 16 infections. For HPV 18 and HPV 18/45, there was a slightly higher detection rate of HPV 18/45 by the Aptima platform (0.5% versus 0.9%), and this was statistically significant. While HPV 16 represents a low percentage of hrHPV infections, it was detected significantly less frequently by the Aptima assay than by the cobas assay. This has been previously reported, although not highlighted. Given the test methodologies, one would expect the Aptima to detect less HPV 16. This difference appears to be mainly due to a significantly higher number of non-oncogenic HPV 16 infections detected by the cobas test, as there were no differences in HPV 16 detection rates in high-grade squamous intraepithelial lesions, indicating that the two tests have similar sensitivities for oncogenic HPV 16. © 2018 Wiley Periodicals, Inc.

  10. Outliers in Questionnaire Data: Can They Be Detected and Should They Be Removed?

    ERIC Educational Resources Information Center

    Zijlstra, Wobbe P.; van der Ark, L. Andries; Sijtsma, Klaas

    2011-01-01

    Outliers in questionnaire data are unusual observations, which may bias statistical results, and outlier statistics may be used to detect such outliers. The authors investigated the effect outliers have on the specificity and the sensitivity of each of six different outlier statistics. The Mahalanobis distance and the item-pair based outlier…
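
    One of the outlier statistics named in this record, the Mahalanobis distance, is easy to sketch; the chi-squared cutoff below is the conventional rule, offered as an illustration rather than the authors' exact procedure.

    ```python
    import numpy as np
    from scipy import stats

    def mahalanobis_outliers(X, alpha=0.01):
        """Flag rows of X (n observations x p items) whose squared Mahalanobis
        distance exceeds the chi-squared quantile with p degrees of freedom."""
        mu = X.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
        d2 = np.einsum('ij,jk,ik->i', X - mu, cov_inv, X - mu)
        return d2 > stats.chi2.ppf(1 - alpha, df=X.shape[1])
    ```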

  11. Statistical Techniques For Real-time Anomaly Detection Using Spark Over Multi-source VMware Performance Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Solaimani, Mohiuddin; Iftekhar, Mohammed; Khan, Latifur

    Anomaly detection refers to the identification of an irregular or unusual pattern which deviates from what is standard, normal, or expected. Such deviated patterns typically correspond to samples of interest and are assigned different labels in different domains, such as outliers, anomalies, exceptions, or malware. Detecting anomalies in fast, voluminous streams of data is a formidable challenge. This paper presents a novel, generic, real-time distributed anomaly detection framework for heterogeneous streaming data where anomalies appear as a group. We have developed a distributed statistical approach to build a model and later use it to detect anomalies. As a case study, we investigate group anomaly detection for a VMware-based cloud data center, which maintains a large number of virtual machines (VMs). We have built our framework using Apache Spark to get higher throughput and lower data processing time on streaming data. We have developed a window-based statistical anomaly detection technique to detect anomalies that appear sporadically. We then relaxed this constraint with higher accuracy by implementing a cluster-based technique to detect sporadic and continuous anomalies. We conclude that our cluster-based technique outperforms other statistical techniques with higher accuracy and lower processing time.

  12. Mysid (Mysidopsis bahia) life-cycle test: Design comparisons and assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lussier, S.M.; Champlin, D.; Kuhn, A.

    1996-12-31

    This study examines ASTM Standard E1191-90, "Standard Guide for Conducting Life-cycle Toxicity Tests with Saltwater Mysids," 1990, using Mysidopsis bahia, by comparing several test designs to assess growth, reproduction, and survival. The primary objective was to determine the most labor-efficient and statistically powerful test design for the measurement of statistically detectable effects on biologically sensitive endpoints. Five different test designs were evaluated, varying compartment size, number of organisms per compartment, and sex ratio. Results showed that while paired organisms in the ASTM design had the highest rate of reproduction among designs tested, no individual design had greater statistical power to detect differences in reproductive effects. Reproduction was not statistically different between organisms paired in the ASTM design and those with randomized sex ratios using larger test compartments. These treatments had numerically higher reproductive success and lower within-tank replicate variance than treatments using smaller compartments where organisms were randomized or had a specific sex ratio. In this study, survival and growth were not statistically different among designs tested. Within-tank replicate variability can be reduced by using many exposure compartments with pairs, or few compartments with many organisms in each. While this improves variance within replicate chambers, it does not strengthen the power of detection among treatments in the test. An increase in the number of true replicates (exposure chambers) to eight will have the effect of reducing the percent detectable difference by a factor of two.

  13. Anomaly detection in hyperspectral imagery: statistics vs. graph-based algorithms

    NASA Astrophysics Data System (ADS)

    Berkson, Emily E.; Messinger, David W.

    2016-05-01

    Anomaly detection (AD) algorithms are frequently applied to hyperspectral imagery, but different algorithms produce different outlier results depending on the image scene content and the assumed background model. This work provides the first comparison of anomaly score distributions between common statistics-based anomaly detection algorithms (RX and subspace-RX) and the graph-based Topological Anomaly Detector (TAD). Anomaly scores in statistical AD algorithms should theoretically approximate a chi-squared distribution; however, this is rarely the case with real hyperspectral imagery. The expected distribution of scores found with graph-based methods remains unclear. We also look for general trends in algorithm performance with varied scene content. Three separate scenes were extracted from the hyperspectral MegaScene image taken over downtown Rochester, NY with the VIS-NIR-SWIR ProSpecTIR instrument. In order of most to least cluttered, we study an urban, suburban, and rural scene. The three AD algorithms were applied to each scene, and the distributions of the most anomalous 5% of pixels were compared. We find that subspace-RX performs better than RX, because the data becomes more normal when the highest variance principal components are removed. We also see that compared to statistical detectors, anomalies detected by TAD are easier to separate from the background. Due to their different underlying assumptions, the statistical and graph-based algorithms highlighted different anomalies within the urban scene. These results will lead to a deeper understanding of these algorithms and their applicability across different types of imagery.

  14. Enhancing the mathematical properties of new haplotype homozygosity statistics for the detection of selective sweeps.

    PubMed

    Garud, Nandita R; Rosenberg, Noah A

    2015-06-01

    Soft selective sweeps represent an important form of adaptation in which multiple haplotypes bearing adaptive alleles rise to high frequency. Most statistical methods for detecting selective sweeps from genetic polymorphism data, however, have focused on identifying hard selective sweeps in which a favored allele appears on a single haplotypic background; these methods might be underpowered to detect soft sweeps. Among the exceptions is the set of haplotype homozygosity statistics introduced for the detection of soft sweeps by Garud et al. (2015). These statistics, examining frequencies of multiple haplotypes in relation to each other, include H12, a statistic designed to identify both hard and soft selective sweeps, and H2/H1, a statistic that, conditional on high H12 values, seeks to distinguish between hard and soft sweeps. A challenge in the use of H2/H1 is that its range depends on the associated value of H12, so that equal H2/H1 values might provide different levels of support for a soft sweep model at different values of H12. Here, we enhance the H12 and H2/H1 haplotype homozygosity statistics for selective sweep detection by deriving the upper bound on H2/H1 as a function of H12, thereby generating a statistic that normalizes H2/H1 to lie between 0 and 1. Through a reanalysis of resequencing data from inbred lines of Drosophila, we show that the enhanced statistic both strengthens interpretations obtained with the unnormalized statistic and leads to empirical insights that are less readily apparent without the normalization. Copyright © 2015 Elsevier Inc. All rights reserved.
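
    For readers new to these statistics, the unnormalized quantities are simple functions of ranked haplotype frequencies; the sketch below computes H1, H12, and H2/H1 as defined by Garud et al. (2015). The paper's contribution, the derived upper bound on H2/H1 at the observed H12 used for normalization, is not reproduced here, and the function name is mine.

    ```python
    import numpy as np

    def h_statistics(hap_freqs):
        """H1, H12, and H2/H1 from a vector of haplotype frequencies."""
        p = np.sort(np.asarray(hap_freqs))[::-1]       # descending frequencies
        h1 = np.sum(p ** 2)
        h12 = (p[0] + p[1]) ** 2 + np.sum(p[2:] ** 2)  # pool two most common
        h2 = h1 - p[0] ** 2
        return h1, h12, h2 / h1
    ```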

  15. Detection of semi-volatile organic compounds in permeable ...

    EPA Pesticide Factsheets

    The Edison Environmental Center (EEC) has a research and demonstration permeable parking lot comprising three different permeable systems: permeable asphalt, porous concrete, and interlocking concrete permeable pavers. Water quality and quantity analysis has been ongoing since January 2010. This paper describes a subset of the water quality analysis, analysis of semivolatile organic compounds (SVOCs), to determine if hydrocarbons were in water infiltrated through the permeable surfaces. SVOCs were analyzed in samples collected on 11 dates over a 3-year period, from 2/8/2010 to 4/1/2013. Results are broadly divided into three categories: 42 chemicals were never detected; 12 chemicals (11 chemical tests) were detected at a rate of less than 10%; and 22 chemicals were detected at a frequency of 10% or greater (ranging from 10% to 66.5% of detections). Fundamental and exploratory statistical analyses were performed on these latter results by grouping results by surface type. The statistical analyses were limited due to the low frequency of detections and the dilution of samples, which impacted detection limits. The infiltrate data through three permeable surfaces were analyzed as non-parametric data by the Kaplan-Meier estimation method for fundamental statistics; there were some statistically observable differences in concentration between pavement types when using the Tarone-Ware comparison hypothesis test. Additionally, Spearman rank-order non-parame…

  16. Statistical significance of task related deep brain EEG dynamic changes in the time-frequency domain.

    PubMed

    Chládek, J; Brázdil, M; Halámek, J; Plešinger, F; Jurák, P

    2013-01-01

    We present an off-line analysis procedure for exploring brain activity recorded from intra-cerebral electroencephalographic data (SEEG). The objective is to determine the statistical differences between different types of stimulations in the time-frequency domain. The procedure is based on computing relative signal power change and subsequent statistical analysis. An example of characteristic statistically significant event-related de/synchronization (ERD/ERS) detected across different frequency bands following different oddball stimuli is presented. The method is used for off-line functional classification of different brain areas.

  17. A flexible spatial scan statistic with a restricted likelihood ratio for detecting disease clusters.

    PubMed

    Tango, Toshiro; Takahashi, Kunihiko

    2012-12-30

    Spatial scan statistics are widely used tools for detection of disease clusters. In particular, the circular spatial scan statistic proposed by Kulldorff (1997) has been utilized in a wide variety of epidemiological studies and disease surveillance. However, as it cannot detect noncircular, irregularly shaped clusters, many authors have proposed different spatial scan statistics, including the elliptic version of Kulldorff's scan statistic. The flexible spatial scan statistic proposed by Tango and Takahashi (2005) has also been used for detecting irregularly shaped clusters. However, this method sets a feasible limitation of a maximum of 30 nearest neighbors for searching candidate clusters because of its heavy computational load. In this paper, we show that a flexible spatial scan statistic implemented with the restricted likelihood ratio proposed by Tango (2008) can (1) eliminate the limitation of 30 nearest neighbors and (2) require far less computational time than the original flexible spatial scan statistic. As a side effect, it is shown via Monte Carlo simulation to detect clusters of any shape reasonably well as the relative risk of the cluster becomes large. We illustrate the proposed spatial scan statistic with data on mortality from cerebrovascular disease in the Tokyo Metropolitan area, Japan. Copyright © 2012 John Wiley & Sons, Ltd.
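
    All of the scan statistics discussed in this record build on Kulldorff's (1997) Poisson likelihood ratio for a candidate zone; the flexible and restricted variants differ in how candidate zones are generated, not in this core quantity. A minimal sketch, illustrative only:

    ```python
    import numpy as np

    def poisson_llr(c, e, C):
        """Log likelihood ratio for a zone with c observed cases and e expected
        cases, out of C total cases; zero when the zone is not elevated."""
        if c <= e:
            return 0.0
        return c * np.log(c / e) + (C - c) * np.log((C - c) / (C - e))
    ```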

  18. Finding the Root Causes of Statistical Inconsistency in Community Earth System Model Output

    NASA Astrophysics Data System (ADS)

    Milroy, D.; Hammerling, D.; Baker, A. H.

    2017-12-01

    Baker et al. (2015) developed the Community Earth System Model Ensemble Consistency Test (CESM-ECT) to provide a metric for software quality assurance by determining statistical consistency between an ensemble of CESM outputs and new test runs. The test has proved useful for detecting statistical differences caused by compiler bugs and errors in physical modules. However, detection is only the necessary first step in finding the causes of statistical difference. The CESM is a vastly complex model comprising millions of lines of code, developed and maintained by a large community of software engineers and scientists. Any root cause analysis is correspondingly challenging. We propose a new capability for CESM-ECT: identifying the sections of code that cause statistical distinguishability. The first step is to discover CESM variables that cause CESM-ECT to classify new runs as statistically distinct, which we achieve via Randomized Logistic Regression. Next we use a tool developed to identify CESM components that define or compute the variables found in the first step. Finally, we employ the Kernel GENerator (KGEN) application created in Kim et al. (2016) to detect fine-grained floating point differences. We demonstrate an example of the procedure and advance a plan to automate this process in our future work.

  19. Detecting temporal change in freshwater fisheries surveys: statistical power and the important linkages between management questions and monitoring objectives

    USGS Publications Warehouse

    Wagner, Tyler; Irwin, Brian J.; James R. Bence,; Daniel B. Hayes,

    2016-01-01

    Monitoring to detect temporal trends in biological and habitat indices is a critical component of fisheries management. Thus, it is important that management objectives are linked to monitoring objectives. This linkage requires a definition of what constitutes a management-relevant “temporal trend.” It is also important to develop expectations for the amount of time required to detect a trend (i.e., statistical power) and for choosing an appropriate statistical model for analysis. We provide an overview of temporal trends commonly encountered in fisheries management, review published studies that evaluated statistical power of long-term trend detection, and illustrate dynamic linear models in a Bayesian context, as an additional analytical approach focused on shorter term change. We show that monitoring programs generally have low statistical power for detecting linear temporal trends and argue that often management should be focused on different definitions of trends, some of which can be better addressed by alternative analytical approaches.
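
    The point about low power for linear trend detection can be reproduced with a few lines of simulation; the sketch below (effect sizes invented for illustration, not taken from the paper) estimates power by counting how often a regression slope reaches significance:

    ```python
    import numpy as np
    from scipy import stats

    def trend_power(slope, sigma, n_years, n_sim=2000, alpha=0.05, seed=1):
        """Monte Carlo power of a significance test on a linear trend."""
        rng = np.random.default_rng(seed)
        t = np.arange(n_years)
        hits = sum(stats.linregress(t, slope * t +
                                    rng.normal(0, sigma, n_years)).pvalue < alpha
                   for _ in range(n_sim))
        return hits / n_sim

    print(trend_power(slope=0.05, sigma=1.0, n_years=10))  # typically well below 0.8
    ```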

  20. Proper Image Subtraction—Optimal Transient Detection, Photometry, and Hypothesis Testing

    NASA Astrophysics Data System (ADS)

    Zackay, Barak; Ofek, Eran O.; Gal-Yam, Avishay

    2016-10-01

    Transient detection and flux measurement via image subtraction stand at the base of time domain astronomy. Due to the varying seeing conditions, the image subtraction process is non-trivial, and existing solutions suffer from a variety of problems. Starting from basic statistical principles, we develop the optimal statistic for transient detection, flux measurement, and any image-difference hypothesis testing. We derive a closed-form statistic that: (1) is mathematically proven to be the optimal transient detection statistic in the limit of background-dominated noise, (2) is numerically stable, (3) for accurately registered, adequately sampled images, does not leave subtraction or deconvolution artifacts, (4) allows automatic transient detection to the theoretical sensitivity limit by providing credible detection significance, (5) has uncorrelated white noise, (6) is a sufficient statistic for any further statistical test on the difference image, and, in particular, allows us to distinguish particle hits and other image artifacts from real transients, (7) is symmetric to the exchange of the new and reference images, (8) is at least an order of magnitude faster to compute than some popular methods, and (9) is straightforward to implement. Furthermore, we present extensions of this method that make it resilient to registration errors, color-refraction errors, and any noise source that can be modeled. In addition, we show that the optimal way to prepare a reference image is the proper image coaddition presented in Zackay & Ofek. We demonstrate this method on simulated data and real observations from the PTF data release 2. We provide an implementation of this algorithm in MATLAB and Python.
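
    The closed-form statistic has a compact frequency-domain expression. The numpy sketch below is my transcription of the proper difference image D in the background-dominated limit, assuming registered images sampled on the same grid; the symbols (N, R for new/reference images, Pn, Pr for PSFs, sn, sr for background noise, Fn, Fr for flux zero points) follow the paper's notation, but the code itself is illustrative, not the authors' released MATLAB/Python.

    ```python
    import numpy as np

    def proper_difference(N, R, Pn, Pr, sn, sr, Fn=1.0, Fr=1.0):
        """Proper image subtraction, background-dominated limit."""
        Nh, Rh = np.fft.fft2(N), np.fft.fft2(R)
        Pnh = np.fft.fft2(np.fft.ifftshift(Pn))   # PSFs centered at origin
        Prh = np.fft.fft2(np.fft.ifftshift(Pr))
        denom = np.sqrt(sn ** 2 * Fr ** 2 * np.abs(Prh) ** 2 +
                        sr ** 2 * Fn ** 2 * np.abs(Pnh) ** 2)
        Dh = (Fr * Prh * Nh - Fn * Pnh * Rh) / denom
        return np.real(np.fft.ifft2(Dh))
    ```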

  1. Assessing segmentation processes by click detection: online measure of statistical learning, or simple interference?

    PubMed

    Franco, Ana; Gaillard, Vinciane; Cleeremans, Axel; Destrebecqz, Arnaud

    2015-12-01

    Statistical learning can be used to extract the words from continuous speech. Gómez, Bion, and Mehler (Language and Cognitive Processes, 26, 212-223, 2011) proposed an online measure of statistical learning: They superimposed auditory clicks on a continuous artificial speech stream made up of a random succession of trisyllabic nonwords. Participants were instructed to detect these clicks, which could be located either within or between words. The results showed that, over the length of exposure, reaction times (RTs) increased more for within-word than for between-word clicks. This result has been accounted for by means of statistical learning of the between-word boundaries. However, even though statistical learning occurs without an intention to learn, it nevertheless requires attentional resources. Therefore, this process could be affected by a concurrent task such as click detection. In the present study, we evaluated the extent to which the click detection task indeed reflects successful statistical learning. Our results suggest that the emergence of RT differences between within- and between-word click detection is neither systematic nor related to the successful segmentation of the artificial language. Therefore, instead of being an online measure of learning, the click detection task seems to interfere with the extraction of statistical regularities.

  2. Practical steganalysis of digital images: state of the art

    NASA Astrophysics Data System (ADS)

    Fridrich, Jessica; Goljan, Miroslav

    2002-04-01

    Steganography is the art of hiding the very presence of communication by embedding secret messages into innocuous looking cover documents, such as digital images. Detection of steganography, estimation of message length, and its extraction belong to the field of steganalysis. Steganalysis has recently received a great deal of attention both from law enforcement and the media. In our paper, we classify and review current stego-detection algorithms that can be used to trace popular steganographic products. We recognize several qualitatively different approaches to practical steganalysis - visual detection, detection based on first order statistics (histogram analysis), dual statistics methods that use spatial correlations in images and higher-order statistics (RS steganalysis), universal blind detection schemes, and special cases, such as JPEG compatibility steganalysis. We also present some new results regarding our previously proposed detection of LSB embedding using sensitive dual statistics. The recent steganalytic methods indicate that the most common paradigm in image steganography - the bit-replacement or bit substitution - is inherently insecure with safe capacities far smaller than previously thought.

  3. Three-dimensional images contribute to the diagnosis of mucous retention cyst in maxillary sinus.

    PubMed

    Donizeth-Rodrigues, Cleomar; Fonseca-Da Silveira, Márcia; Gonçalves-De Alencar, Ana-Helena; Garcia-Santos-Silva, Maria-Alves; Francisco-De-Mendonça, Elismauro; Estrela, Carlos

    2013-01-01

    To evaluate the detection of mucous retention cyst of maxillary sinus (MRCMS) using panoramic radiography and cone beam computed tomography (CBCT). A digital database with 6,000 panoramic radiographs was reviewed for MRCMS. Suggestive images of MRCMS were detected on 185 radiographs, and patients were located and invited to return for follow-up. Thirty patients returned, and control panoramic radiographs were obtained 6 to 46 months after the initial radiograph. When MRCMS was found on control radiographs, CBCT scans were obtained. Cysts were measured and compared on radiographs and scans. The Wilcoxon, Spearman and Kolmogorov-Smirnov tests were used for statistical analysis. The level of significance was set at 5%. There were statistically significant differences between the two methods (p<0.05): 23 MRCMS detected on panoramic radiographs were confirmed by CBCT, but 5 MRCMS detected on CBCT images had not been identified by panoramic radiography. Eight MRCMS detected on control radiographs were not confirmed by CBCT. MRCMS size differences from initial to control panoramic radiographs and CBCT scans were not statistically significant (p= 0.617 and p= 0.626). The correlation between time and MRCMS size differences was not significant (r = -0.16, p = 0.381). CBCT scanning detects MRCMS more accurately than panoramic radiography.

  4. Autoregressive statistical pattern recognition algorithms for damage detection in civil structures

    NASA Astrophysics Data System (ADS)

    Yao, Ruigen; Pakzad, Shamim N.

    2012-08-01

    Statistical pattern recognition has recently emerged as a promising set of complementary methods to system identification for automatic structural damage assessment. Its essence is to use well-known concepts in statistics for boundary definition of different pattern classes, such as those for damaged and undamaged structures. In this paper, several statistical pattern recognition algorithms using autoregressive models, including statistical control charts and hypothesis testing, are reviewed as potentially competitive damage detection techniques. To enhance the performance of statistical methods, new feature extraction techniques using model spectra and residual autocorrelation, together with resampling-based threshold construction methods, are proposed. Subsequently, simulated acceleration data from a multi degree-of-freedom system is generated to test and compare the efficiency of the existing and proposed algorithms. Data from laboratory experiments conducted on a truss and a large-scale bridge slab model are then used to further validate the damage detection methods and demonstrate the superior performance of proposed algorithms.
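
    A bare-bones version of the AR-residual idea behind these methods can be sketched in a few lines (an illustration only, not the authors' algorithms): fit an AR(p) model to baseline data by least squares, then use the residual RMS of new data as a damage-sensitive feature to feed a control chart or hypothesis test.

    ```python
    import numpy as np

    def ar_design(x, p):
        """Lagged design matrix: row t holds x[t-1], ..., x[t-p]."""
        return np.column_stack([x[p - k - 1: len(x) - k - 1] for k in range(p)])

    def ar_fit(x, p):
        """Least-squares AR(p) coefficients from a baseline signal x."""
        coef, *_ = np.linalg.lstsq(ar_design(x, p), x[p:], rcond=None)
        return coef

    def residual_rms(x, coef):
        """Damage-sensitive feature: RMS of one-step AR prediction errors."""
        p = len(coef)
        return np.sqrt(np.mean((x[p:] - ar_design(x, p) @ coef) ** 2))
    ```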

  5. A new method for detecting small and dim targets in starry background

    NASA Astrophysics Data System (ADS)

    Yao, Rui; Zhang, Yanning; Jiang, Lei

    2011-08-01

    Small visible optical space target detection is one of the key issues in the research of long-range early warning and space debris surveillance. The SNR (signal-to-noise ratio) of the target is very low because of the influence of the imaging device itself. Random noise and background movement also increase the difficulty of target detection. In order to detect small visible optical space targets effectively and rapidly, we propose a novel detection method based on statistical theory. Firstly, we derive a reasonable statistical model of the visible optical space image. Secondly, we extract SIFT (Scale-Invariant Feature Transform) features of the image frames and calculate the transform relationship, then use the transform relationship to compensate for the whole visual field's movement. Thirdly, the influence of stars is removed using an interframe difference method. We find a segmentation threshold to differentiate candidate targets from noise by using the OTSU method. Finally, we calculate a statistical quantity to judge whether a target is present at every pixel position in the image. Theoretical analysis shows the relationship between false alarm probability and detection probability at different SNRs. The experimental results show that this method can detect targets efficiently, even targets passing through stars.

  6. Principles of Statistics: What the Sports Medicine Professional Needs to Know.

    PubMed

    Riemann, Bryan L; Lininger, Monica R

    2018-07-01

    Understanding the results and statistics reported in original research remains a large challenge for many sports medicine practitioners and, in turn, may be among the biggest barriers to integrating research into sports medicine practice. The purpose of this article is to provide the minimal essentials a sports medicine practitioner needs to know about interpreting statistics and research results to facilitate the incorporation of the latest evidence into practice. Topics covered include the difference between statistical significance and clinical meaningfulness; effect sizes and confidence intervals; reliability statistics, including the minimal detectable difference and minimal important difference; and statistical power. Copyright © 2018 Elsevier Inc. All rights reserved.

  7. The statistical average of optical properties for alumina particle cluster in aircraft plume

    NASA Astrophysics Data System (ADS)

    Li, Jingying; Bai, Lu; Wu, Zhensen; Guo, Lixin

    2018-04-01

    We establish a model for the lognormal distribution of monomer radius and number of alumina particle clusters in a plume. According to Multi-Sphere T-Matrix (MSTM) theory, we provide a method for finding the statistical average of optical properties for alumina particle clusters in a plume, analyze the effect of different distributions and different detection wavelengths on this statistical average, and compare the statistical average optical properties under the alumina particle cluster model established in this study with those under three simplified alumina particle models. The calculation results show that the monomer number of an alumina particle cluster and its size distribution have a considerable effect on its statistical average optical properties. The statistical averages of optical properties for alumina particle clusters at common detection wavelengths exhibit obvious differences, and these differences have a great effect on modeling the IR and UV radiation properties of a plume. Compared with the three simplified models, the alumina particle cluster model presented herein features both higher extinction and scattering efficiencies. Therefore, an accurate description of the scattering properties of alumina particles in an aircraft plume is of great significance in the study of plume radiation properties.

  8. Statistical power to detect violation of the proportional hazards assumption when using the Cox regression model.

    PubMed

    Austin, Peter C

    2018-01-01

    The use of the Cox proportional hazards regression model is widespread. A key assumption of the model is that of proportional hazards. Analysts frequently test the validity of this assumption using statistical significance testing. However, the statistical power of such assessments is frequently unknown. We used Monte Carlo simulations to estimate the statistical power of two different methods for detecting violations of this assumption. When the covariate was binary, we found that a model-based method had greater power than a method based on cumulative sums of martingale residuals. Furthermore, the parametric nature of the distribution of event times had an impact on power when the covariate was binary. Statistical power to detect a strong violation of the proportional hazards assumption was low to moderate even when the number of observed events was high. In many data sets, power to detect a violation of this assumption is likely to be low to modest.

  9. Statistical power to detect violation of the proportional hazards assumption when using the Cox regression model

    PubMed Central

    Austin, Peter C.

    2017-01-01

    The use of the Cox proportional hazards regression model is widespread. A key assumption of the model is that of proportional hazards. Analysts frequently test the validity of this assumption using statistical significance testing. However, the statistical power of such assessments is frequently unknown. We used Monte Carlo simulations to estimate the statistical power of two different methods for detecting violations of this assumption. When the covariate was binary, we found that a model-based method had greater power than a method based on cumulative sums of martingale residuals. Furthermore, the parametric nature of the distribution of event times had an impact on power when the covariate was binary. Statistical power to detect a strong violation of the proportional hazards assumption was low to moderate even when the number of observed events was high. In many data sets, power to detect a violation of this assumption is likely to be low to modest. PMID:29321694

  10. Distinguishing Positive Selection From Neutral Evolution: Boosting the Performance of Summary Statistics

    PubMed Central

    Lin, Kao; Li, Haipeng; Schlötterer, Christian; Futschik, Andreas

    2011-01-01

    Summary statistics are widely used in population genetics, but they suffer from the drawback that no simple sufficient summary statistic exists which captures all information required to distinguish different evolutionary hypotheses. Here, we apply boosting, a recent statistical method that combines simple classification rules to maximize their joint predictive performance. We show that our implementation of boosting has high power to detect selective sweeps. Demographic events, such as bottlenecks, do not result in a large excess of false positives. A comparison with other neutrality tests shows that our boosting implementation performs well. Furthermore, we evaluated the relative contribution of different summary statistics to the identification of selection and found that for recent sweeps integrated haplotype homozygosity is very informative, whereas older sweeps are better detected by Tajima's π. Overall, Watterson's θ was found to contribute the most information for distinguishing between bottlenecks and selection. PMID:21041556

  11. Low power and type II errors in recent ophthalmology research.

    PubMed

    Khan, Zainab; Milko, Jordan; Iqbal, Munir; Masri, Moness; Almeida, David R P

    2016-10-01

    To investigate the power of unpaired t tests in prospective, randomized controlled trials when these tests failed to detect a statistically significant difference, and to determine the frequency of type II errors. Systematic review and meta-analysis. We examined all prospective, randomized controlled trials published between 2010 and 2012 in 4 major ophthalmology journals (Archives of Ophthalmology, British Journal of Ophthalmology, Ophthalmology, and American Journal of Ophthalmology). Studies that used unpaired t tests were included. Power was calculated using the number of subjects in each group, standard deviations, and α = 0.05. The difference between control and experimental means was set to be (1) 20% and (2) 50% of the absolute value of the control's initial conditions. Power and Precision version 4.0 software was used to carry out calculations. Finally, the proportion of articles with type II errors was calculated. β = 0.3 was set as the largest acceptable value for the probability of a type II error. In total, 280 articles were screened. The final analysis included 50 prospective, randomized controlled trials using unpaired t tests. The median power of tests to detect a 50% difference between means was 0.9 and was the same for all 4 journals regardless of the statistical significance of the test. The median power of tests to detect a 20% difference between means ranged from 0.26 to 0.9 for the 4 journals. The median power of these tests to detect a 50% and a 20% difference between means was 0.9 and 0.5, respectively, for tests that did not achieve statistical significance. A total of 14% and 57% of articles with negative unpaired t tests contained results with β > 0.3 when power was calculated for differences between means of 50% and 20%, respectively. A large portion of studies demonstrate high probabilities of type II errors when detecting small differences between means. The power to detect small differences between means varies across journals. It is, therefore, worthwhile for authors to mention the minimum clinically important difference for individual studies. Journals can consider publishing statistical guidelines for authors to use. Day-to-day clinical decisions rely heavily on the evidence base formed by the plethora of studies available to clinicians. Prospective, randomized controlled clinical trials are highly regarded as robust studies and are used to make important clinical decisions that directly affect patient care. The quality of study designs and statistical methods in major clinical journals is improving over time, and researchers and journals are becoming more attentive to the statistical methodologies incorporated by studies. The results of well-designed ophthalmic studies with robust methodologies, therefore, have the ability to modify the ways in which diseases are managed. Copyright © 2016 Canadian Ophthalmological Society. Published by Elsevier Inc. All rights reserved.

  12. Statistical colour models: an automated digital image analysis method for quantification of histological biomarkers.

    PubMed

    Shu, Jie; Dolman, G E; Duan, Jiang; Qiu, Guoping; Ilyas, Mohammad

    2016-04-27

    Colour is the most important feature used in quantitative immunohistochemistry (IHC) image analysis; IHC is used to provide information relating to aetiology and to confirm malignancy. Statistical modelling is a technique widely used for colour detection in computer vision. We have developed a statistical model of colour detection applicable to detection of stain colour in digital IHC images. The model was first trained on a large set of colour pixels collected semi-automatically. To speed up the training and detection processes, we removed the luminance channel (the Y channel of the YCbCr colour space) and chose 128 histogram bins, which was found to be the optimal number. A maximum likelihood classifier is used to classify pixels in digital slides into positively or negatively stained pixels automatically. The model-based tool was developed within ImageJ to quantify targets identified using IHC and histochemistry. The purpose of the evaluation was to compare the computer model with human evaluation. Several large datasets were prepared and obtained from human oesophageal cancer, colon cancer and liver cirrhosis with different colour stains. Experimental results have demonstrated that the model-based tool achieves more accurate results than colour deconvolution and the CMYK model in the detection of brown colour, and is comparable to colour deconvolution in the detection of pink colour. We have also demonstrated that the proposed model shows little inter-dataset variation. A robust and effective statistical model is introduced in this paper. The model-based interactive tool in ImageJ, which can create a visual representation of the statistical model and detect a specified colour automatically, is easy to use and available freely at http://rsb.info.nih.gov/ij/plugins/ihc-toolbox/index.html . Testing of the tool by different users showed only minor inter-observer variations in results.
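
    The pipeline described here (drop Y, histogram Cb/Cr into 128 bins, classify by maximum likelihood) can be sketched compactly; the code below is my reconstruction under those stated assumptions, not the authors' ImageJ plugin.

    ```python
    import numpy as np

    BINS = 128  # the bin count the abstract reports as optimal

    def fit_colour_model(pixels_cbcr):
        """Laplace-smoothed likelihood P(Cb, Cr | class) from training
        pixels (N x 2 array of CbCr values in 0..255)."""
        hist, _, _ = np.histogram2d(pixels_cbcr[:, 0], pixels_cbcr[:, 1],
                                    bins=BINS, range=[[0, 256], [0, 256]])
        return (hist + 1) / (hist.sum() + BINS * BINS)

    def classify(pixels_cbcr, lik_pos, lik_neg):
        """Maximum likelihood: positive iff P(colour|pos) > P(colour|neg)."""
        idx = np.clip(pixels_cbcr * BINS // 256, 0, BINS - 1).astype(int)
        return lik_pos[idx[:, 0], idx[:, 1]] > lik_neg[idx[:, 0], idx[:, 1]]
    ```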

  13. Machine Learning Methods for Attack Detection in the Smart Grid.

    PubMed

    Ozay, Mete; Esnaola, Inaki; Yarman Vural, Fatos Tunay; Kulkarni, Sanjeev R; Poor, H Vincent

    2016-08-01

    Attack detection problems in the smart grid are posed as statistical learning problems for different attack scenarios in which the measurements are observed in batch or online settings. In this approach, machine learning algorithms are used to classify measurements as being either secure or attacked. An attack detection framework is provided to exploit any available prior knowledge about the system and surmount constraints arising from the sparse structure of the problem in the proposed approach. Well-known batch and online learning algorithms (supervised and semisupervised) are employed with decision- and feature-level fusion to model the attack detection problem. The relationships between statistical and geometric properties of attack vectors employed in the attack scenarios and learning algorithms are analyzed to detect unobservable attacks using statistical learning methods. The proposed algorithms are examined on various IEEE test systems. Experimental analyses show that machine learning algorithms can detect attacks with performances higher than attack detection algorithms that employ state vector estimation methods in the proposed attack detection framework.

  14. Statistical comparison of pooled nitrogen washout data of various altitude decompression response groups

    NASA Technical Reports Server (NTRS)

    Edwards, B. F.; Waligora, J. M.; Horrigan, D. J., Jr.

    1985-01-01

    This analysis was done to determine whether various decompression response groups could be characterized by the pooled nitrogen (N2) washout profiles of the group members; pooling individual washout profiles provided a smooth, time-dependent function of means representative of the decompression response group. No statistically significant differences were detected. The statistical comparisons of the profiles were performed by means of a univariate weighted t-test at each 5-minute profile point, with levels of significance of 5 and 10 percent. The estimated powers of the tests (i.e., probabilities) to detect the observed differences in the pooled profiles were of the order of 8 to 30 percent.

  15. Properties of different selection signature statistics and a new strategy for combining them.

    PubMed

    Ma, Y; Ding, X; Qanbari, S; Weigend, S; Zhang, Q; Simianer, H

    2015-11-01

    Identifying signatures of recent or ongoing selection is of high relevance in livestock population genomics. From a statistical perspective, determining a proper testing procedure and combining various test statistics is challenging. On the basis of extensive simulations in this study, we discuss the statistical properties of eight different established selection signature statistics. In the considered scenario, we show that a reasonable power to detect selection signatures is achieved with high marker density (>1 SNP/kb) as obtained from sequencing, while rather small sample sizes (~15 diploid individuals) appear to be sufficient. Most selection signature statistics, such as the composite likelihood ratio and cross-population extended haplotype homozygosity, have the highest power when fixation of the selected allele is reached, while the integrated haplotype score has the highest power when selection is ongoing. We suggest a novel strategy, called de-correlated composite of multiple signals (DCMS), to combine different statistics for detecting selection signatures while accounting for the correlation between the different selection signature statistics. When examined with simulated data, DCMS consistently has a higher power than most of the single statistics and shows a reliable positional resolution. We illustrate the new statistic on the established selective sweep around the lactase gene in human HapMap data, providing further evidence of the reliability of this new statistic. Then, we apply it to scan for selection signatures in two chicken samples with diverse skin color. Our analysis suggests that a set of well-known genes such as BCO2, MC1R, ASIP and TYR were involved in the divergent selection for this trait.
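
    The combination idea can be sketched as follows; the weighting reflects my reading of the DCMS construction (empirical p-values combined with weights that shrink for mutually correlated statistics) and should be treated as illustrative rather than the published formula.

    ```python
    import numpy as np
    from scipy import stats

    def dcms(stat_matrix):
        """stat_matrix: loci x statistics, larger values = more selection-like.
        Returns one composite score per locus."""
        n, _ = stat_matrix.shape
        p = 1 - stats.rankdata(stat_matrix, axis=0) / (n + 1)  # empirical p-values
        r = np.corrcoef(stat_matrix, rowvar=False)
        w = 1.0 / np.abs(r).sum(axis=0)        # de-correlation weights
        return (np.log10((1 - p) / p) * w).sum(axis=1)
    ```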

  16. Three-dimensional images contribute to the diagnosis of mucous retention cyst in maxillary sinus

    PubMed Central

    Donizeth-Rodrigues, Cleomar; Fonseca-Da Silveira, Márcia; Gonçalves-De Alencar, Ana H.; Garcia-Santos-Silva, Maria A.; Francisco-De-Mendonça, Elismauro

    2013-01-01

    Objective: To evaluate the detection of mucous retention cyst of maxillary sinus (MRCMS) using panoramic radiography and cone beam computed tomography (CBCT). Study Design: A digital database with 6,000 panoramic radiographs was reviewed for MRCMS. Suggestive images of MRCMS were detected on 185 radiographs, and patients were located and invited to return for follow-up. Thirty patients returned, and control panoramic radiographs were obtained 6 to 46 months after the initial radiograph. When MRCMS was found on control radiographs, CBCT scans were obtained. Cysts were measured and compared on radiographs and scans. The Wilcoxon, Spearman and Kolmogorov-Smirnov tests were used for statistical analysis. The level of significance was set at 5%. Results: There were statistically significant differences between the two methods (p<0.05): 23 MRCMS detected on panoramic radiographs were confirmed by CBCT, but 5 MRCMS detected on CBCT images had not been identified by panoramic radiography. Eight MRCMS detected on control radiographs were not confirmed by CBCT. MRCMS size differences from initial to control panoramic radiographs and CBCT scans were not statistically significant (p= 0.617 and p= 0.626). The correlation between time and MRCMS size differences was not significant (r = -0.16, p = 0.381). Conclusion: CBCT scanning detects MRCMS more accurately than panoramic radiography. Key words: Mucous cyst, maxillary sinus, panoramic radiograph, cone beam computed tomography. PMID:23229251

  17. Statistical detection of patterns in unidimensional distributions by continuous wavelet transforms

    NASA Astrophysics Data System (ADS)

    Baluev, R. V.

    2018-04-01

    Objective detection of specific patterns in statistical distributions, like groupings or gaps or abrupt transitions between different subsets, is a task with a rich range of applications in astronomy: Milky Way stellar population analysis, investigations of exoplanet diversity, Solar System minor bodies statistics, extragalactic studies, etc. We adapt the powerful technique of wavelet transforms to this generalized task, placing a strong emphasis on assessing the significance of the detected patterns. Among other things, our method also involves optimal minimum-noise wavelets and minimum-noise reconstruction of the distribution density function. Based on this development, we construct a self-closed algorithmic pipeline aimed at processing statistical samples. It is currently applicable to single-dimensional distributions only, but it is flexible enough to undergo further generalizations and development.
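
    As a toy version of the approach, the sketch below convolves a binned density estimate with Ricker (Mexican-hat) wavelets of several widths; the paper's minimum-noise wavelets and significance machinery are not reproduced, and the wavelet choice here is mine.

    ```python
    import numpy as np

    def ricker(n_points, a):
        """Ricker (Mexican-hat) wavelet of width a, sampled at n_points points."""
        t = np.arange(n_points) - (n_points - 1) / 2.0
        return (1 - (t / a) ** 2) * np.exp(-0.5 * (t / a) ** 2)

    def cwt_map(sample, widths, bins=512):
        """Continuous wavelet transform of a histogram density estimate;
        widths is an iterable of ints; rows correspond to widths."""
        density, edges = np.histogram(sample, bins=bins, density=True)
        rows = [np.convolve(density, ricker(min(10 * w, bins), w), mode='same')
                for w in widths]
        return np.vstack(rows), edges
    ```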

  18. Dispersal of potato cyst nematodes measured using historical and spatial statistical analyses.

    PubMed

    Banks, N C; Hodda, M; Singh, S K; Matveeva, E M

    2012-06-01

    Rates and modes of dispersal of potato cyst nematodes (PCNs) were investigated. Analysis of records from eight countries suggested that PCNs spread a mean distance of 5.3 km/year radially from the site of first detection, and spread 212 km over ≈40 years before detection. Data from four countries with more detailed histories of invasion were analyzed further, using distance from first detection, distance from previous detection, distance from nearest detection, straight line distance, and road distance. Linear distance from first detection was significantly related to the time since the first detection. Estimated rate of spread was 5.7 km/year, and did not differ statistically between countries. Time between the first detection and estimated introduction date varied between 0 and 20 years, and differed among countries. Road distances from nearest and first detection were statistically significantly related to time, and gave slightly higher estimates for rate of spread of 6.0 and 7.9 km/year, respectively. These results indicate that the original site of introduction of PCNs may act as a source for subsequent spread and that this may occur at a relatively constant rate over time regardless of whether this distance is measured by road or by a straight line. The implications of this constant radial rate of dispersal for biosecurity and pest management are discussed, along with the effects of control strategies.
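
    The rate-of-spread estimates quoted above come from regressing detection distance on time; a toy version (the numbers below are invented for illustration, not the study's data) looks like this:

    ```python
    from scipy import stats

    # Hypothetical records: years since first detection vs. distance (km).
    years   = [1, 3, 6, 10, 15, 22, 30]
    dist_km = [7, 18, 33, 58, 85, 130, 170]

    fit = stats.linregress(years, dist_km)
    print(f"estimated spread rate: {fit.slope:.1f} km/year (p = {fit.pvalue:.3g})")
    ```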

  19. Statistical sensor fusion analysis of near-IR polarimetric and thermal imagery for the detection of minelike targets

    NASA Astrophysics Data System (ADS)

    Weisenseel, Robert A.; Karl, William C.; Castanon, David A.; DiMarzio, Charles A.

    1999-02-01

    We present an analysis of statistical model based data-level fusion for near-IR polarimetric and thermal data, particularly for the detection of mines and mine-like targets. Typical detection-level data fusion methods, approaches that fuse detections from individual sensors rather than fusing at the level of the raw data, do not account rationally for the relative reliability of different sensors, nor the redundancy often inherent in multiple sensors. Representative examples of such detection-level techniques include logical AND/OR operations on detections from individual sensors and majority vote methods. In this work, we exploit a statistical data model for the detection of mines and mine-like targets to compare and fuse multiple sensor channels. Our purpose is to quantify the amount of knowledge that each polarimetric or thermal channel supplies to the detection process. With this information, we can make reasonable decisions about the usefulness of each channel. We can use this information to improve the detection process, or we can use it to reduce the number of required channels.

  20. Investigating the Investigative Task: Testing for Skewness--An Investigation of Different Test Statistics and Their Power to Detect Skewness

    ERIC Educational Resources Information Center

    Tabor, Josh

    2010-01-01

    On the 2009 AP Statistics Exam, students were asked to create a statistic to measure skewness in a distribution. This paper explores several of the most popular student responses and evaluates which statistic performs best when sampling from various skewed populations. (Contains 8 figures, 3 tables, and 4 footnotes.)
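    To make "power to detect skewness" concrete, here is a hedged Monte Carlo sketch: calibrate a critical value for the sample skewness under normal sampling, then estimate how often the test rejects when sampling from a right-skewed population. The statistic, populations, and sample sizes are illustrative choices, not the paper's.

```python
# Monte Carlo power estimate for a sample-skewness test of normality.
import numpy as np
from scipy.stats import skew, expon

rng = np.random.default_rng(1)
n, n_sim, alpha = 30, 5000, 0.05

# Null distribution of the statistic under normal sampling -> critical value.
null_stats = np.array([skew(rng.normal(size=n)) for _ in range(n_sim)])
crit = np.quantile(null_stats, 1 - alpha)   # one-sided test for right skew

# Power under an exponential (right-skewed) population.
alt_stats = np.array([skew(expon.rvs(size=n, random_state=rng)) for _ in range(n_sim)])
power = np.mean(alt_stats > crit)
print(f"critical value {crit:.3f}, estimated power {power:.2f}")
```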

  1. MO-DE-207A-01: Impact of Statistical Weights On Detection of Low-Contrast Details in Model-Based Iterative CT Reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Noo, F; Guo, Z

    2016-06-15

    Purpose: Penalized-weighted least-square reconstruction has become an important research topic in CT, to reduce dose without affecting image quality. Two components impact image quality in this reconstruction: the statistical weights and the use of an edge-preserving penalty term. We are interested in assessing the influence of statistical weights on their own, without the edge-preserving feature. Methods: The influence of statistical weights on image quality was assessed in terms of low-contrast detail detection using LROC analysis. The task amounted to detecting and localizing a 6-mm lesion with random contrast inside the FORBILD head phantom. A two-alternative forced-choice experiment was used, with two human observers performing the task. Reconstructions without and with statistical weights were compared, both using the same quadratic penalty term. The beam energy was set to 30 keV to amplify spatial differences in attenuation and thereby the role of statistical weights. A fan-beam data acquisition geometry was used. Results: Visual inspection of images clearly showed a difference in noise between the two reconstruction methods. As expected, the reconstruction without statistical weights exhibited noise streaks. The other reconstruction appeared better in this respect, but presented other disturbing noise patterns and artifacts induced by the weights. The LROC analysis yielded the following 95-percent confidence interval for the difference in reader-averaged AUC (reconstruction without weights minus reconstruction with weights): [0.0026, 0.0599]. The mean AUC value was 0.9094. Conclusion: We have investigated the impact of statistical weights without the use of an edge-preserving penalty in penalized weighted least-square reconstruction. A decrease rather than an increase in image quality was observed when using statistical weights. Thus, the observers were better able to cope with the noise streaks than with the noise patterns and artifacts induced by the statistical weights. It may be that different results would be obtained if the penalty term were used with a pixel-dependent weight. F Noo receives research support from Siemens Healthcare GmbH.

  2. A statistical approach to bioclimatic trend detection in the airborne pollen records of Catalonia (NE Spain)

    NASA Astrophysics Data System (ADS)

    Fernández-Llamazares, Álvaro; Belmonte, Jordina; Delgado, Rosario; De Linares, Concepción

    2014-04-01

    Airborne pollen records are a suitable indicator for the study of climate change. The present work focuses on the role of annual pollen indices for the detection of bioclimatic trends through the analysis of the aerobiological spectra of 11 taxa of great biogeographical relevance in Catalonia over an 18-year period (1994-2011), by means of different parametric and non-parametric statistical methods. Among others, two non-parametric rank-based statistical tests were performed for detecting monotonic trends in time series data of the selected airborne pollen types; we observed that they have similar power in detecting trends. Except for those cases in which the pollen data can be well modeled by a normal distribution, it is better to apply non-parametric statistical methods to aerobiological studies. Our results provide a reliable representation of the pollen trends in the region and suggest that greater pollen quantities are being released into the atmosphere in recent years, especially by Mediterranean taxa such as Pinus, Total Quercus and Evergreen Quercus, although the trends may differ geographically. Longer aerobiological monitoring periods are required to corroborate these results and survey the increasing levels of certain pollen types that could exert an impact in terms of public health.
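    For readers unfamiliar with rank-based monotonic-trend tests of this kind, below is a minimal sketch of one standard choice, the Mann-Kendall test (the abstract does not name its tests, so this choice is an assumption); the no-tie variance formula is used and the annual pollen indices are invented.

```python
# Minimal Mann-Kendall trend test for a yearly series.
import numpy as np
from scipy.stats import norm

def mann_kendall(y):
    """Return the MK S statistic and a two-sided p-value (no-tie variance)."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    s = sum(np.sign(y[j] - y[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)      # continuity correction
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    return s, 2 * (1 - norm.cdf(abs(z)))

annual_pollen_index = [310, 295, 400, 380, 420, 455, 430, 510, 490, 560]
s, p = mann_kendall(annual_pollen_index)
print(f"S = {s}, p = {p:.4f}  (positive S suggests an increasing trend)")
```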

  3. The Importance of Teaching Power in Statistical Hypothesis Testing

    ERIC Educational Resources Information Center

    Olinsky, Alan; Schumacher, Phyllis; Quinn, John

    2012-01-01

    In this paper, we discuss the importance of teaching power considerations in statistical hypothesis testing. Statistical power analysis determines the ability of a study to detect a meaningful effect size, where the effect size is the difference between the hypothesized value of the population parameter under the null hypothesis and the true value…
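    A compact example of the power calculation being advocated: for a two-sample comparison, the sketch below uses the normal approximation to compute the probability of detecting a given mean difference. The effect size and group size are illustrative.

```python
# Approximate power of a two-sided, two-sample test via the normal approximation.
from scipy.stats import norm

def two_sample_power(delta, sigma, n_per_group, alpha=0.05):
    """Power to detect a true mean difference `delta` between two groups."""
    se = sigma * (2.0 / n_per_group) ** 0.5
    z_crit = norm.ppf(1 - alpha / 2)
    ncp = delta / se                     # noncentrality under the alternative
    return norm.cdf(ncp - z_crit) + norm.cdf(-ncp - z_crit)

# Classic benchmark: about 80% power for a 0.5-SD effect with 64 per group.
print(f"power = {two_sample_power(delta=0.5, sigma=1.0, n_per_group=64):.2f}")
```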

  4. Insight From the Statistics of Nothing: Estimating Limits of Change Detection Using Inferred No-Change Areas in DEM Difference Maps and Application to Landslide Hazard Studies

    NASA Astrophysics Data System (ADS)

    Haneberg, W. C.

    2017-12-01

    Remote characterization of new landslides or areas of ongoing movement using differences in high resolution digital elevation models (DEMs) created through time, for example before and after major rains or earthquakes, is an attractive proposition. In the case of large catastrophic landslides, changes may be apparent enough that simple subtraction suffices. In other cases, statistical noise can obscure landslide signatures and place practical limits on detection. In ideal cases on land, GPS surveys of representative areas at the time of DEM creation can quantify the inherent errors. In less-than-ideal terrestrial cases and virtually all submarine cases, it may be impractical or impossible to independently estimate the DEM errors. Examining DEM difference statistics for areas reasonably inferred to have no change, however, can provide insight into the limits of detectability. Data from inferred no-change areas of airborne LiDAR DEM difference maps of the 2014 Oso, Washington landslide and landslide-prone colluvium slopes along the Ohio River valley in northern Kentucky, show that DEM difference maps can have non-zero mean and slope dependent error components consistent with published studies of DEM errors. Statistical thresholds derived from DEM difference error and slope data can help to distinguish between DEM differences that are likely real—and which may indicate landsliding—from those that are likely spurious or irrelevant. This presentation describes and compares two different approaches, one based upon a heuristic assumption about the proportion of the study area likely covered by new landslides and another based upon the amount of change necessary to ensure difference at a specified level of probability.
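    Below is a hedged sketch of how a slope-dependent detection threshold might be derived from no-change statistics, in the spirit of the approaches described above: fit the spread of DEM differences as a function of slope, then flag changes beyond the mean bias plus z times the local sigma. All arrays are synthetic and the linear sigma-versus-slope model is an assumption.

```python
# Slope-dependent minimum detectable change from inferred no-change pixels.
import numpy as np

rng = np.random.default_rng(2)
slope_deg = rng.uniform(0, 45, 5000)                  # no-change pixels
dz = rng.normal(0.05, 0.10 + 0.01 * slope_deg)        # non-zero mean, slope-dependent noise

# Bin by slope and fit a linear model for the standard deviation of dz.
bins = np.linspace(0, 45, 10)
idx = np.digitize(slope_deg, bins)
bin_mid = 0.5 * (bins[:-1] + bins[1:])
bin_std = np.array([dz[idx == i].std() for i in range(1, len(bins))])
a, b = np.polyfit(bin_mid, bin_std, 1)

def min_detectable_change(slope, z=1.96):
    """95% threshold: mean bias plus z times the slope-dependent sigma."""
    return dz.mean() + z * (a * slope + b)

print(f"threshold at 10 deg slope: {min_detectable_change(10.0):.2f} m")
```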

  5. Leads Detection Using Mixture Statistical Distribution Based CRF Algorithm from Sentinel-1 Dual Polarization SAR Imagery

    NASA Astrophysics Data System (ADS)

    Zhang, Yu; Li, Fei; Zhang, Shengkai; Zhu, Tingting

    2017-04-01

    Synthetic Aperture Radar (SAR) is significantly important for polar remote sensing since it can provide continuous observations on all days and in all weather. SAR can be used for extracting surface roughness information characterized by the variance of dielectric properties and different polarization channels, which makes it possible to observe different ice types and surface structure for deformation analysis. In November 2016, the 33rd cruise of the Chinese National Antarctic Research Expedition (CHINARE) set sail into the sea ice zone of Antarctica. Accurate mapping of the spatial distribution of leads in the sea ice zone is essential for routine ship navigation planning. In this study, the semantic relationship between leads and sea ice categories is described by a Conditional Random Fields (CRF) model, and leads characteristics are modeled by statistical distributions in SAR imagery. In the proposed algorithm, a mixture statistical distribution based CRF is developed by considering the contextual information and the statistical characteristics of sea ice to improve leads detection in Sentinel-1A dual polarization SAR imagery. The unary and pairwise potentials in the CRF model are constructed by integrating the posterior probabilities estimated from the statistical distributions. For mixture statistical distribution parameter estimation, the Method of Logarithmic Cumulants (MoLC) is exploited to estimate the parameters of each single statistical distribution. An iterative Expectation-Maximization (EM) algorithm is used to calculate the parameters in the mixture statistical distribution based CRF model. In the posterior probability inference, a graph-cut energy minimization method is adopted for the initial leads detection. Post-processing procedures, including an aspect ratio constraint and spatial smoothing, are applied to improve the visual result. The proposed method is validated on Sentinel-1A SAR C-band Extra Wide Swath (EW) Ground Range Detected (GRD) imagery with a pixel spacing of 40 meters near the Prydz Bay area, East Antarctica. The main contributions are as follows: 1) A mixture statistical distribution based CRF algorithm has been developed for leads detection from Sentinel-1A dual polarization images. 2) An assessment of the proposed mixture-distribution CRF method against a single-distribution CRF algorithm is presented. 3) Preferable parameter sets, including the statistical distributions, the aspect ratio threshold and the spatial smoothing window size, are provided. In the future, the proposed algorithm will be developed for operational processing of the Sentinel series data sets owing to its low computational cost and high accuracy in leads detection.
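    The EM iteration is the core of the parameter estimation described above. The paper pairs EM with SAR-specific distributions initialized via MoLC; the self-contained sketch below substitutes a two-component Gaussian mixture on synthetic backscatter values so the iteration itself stays visible. Everything here is an illustrative simplification, not the authors' implementation.

```python
# EM for a two-component mixture (Gaussian stand-in for the SAR distributions).
import numpy as np

rng = np.random.default_rng(3)
# Synthetic backscatter (dB): dark "leads" mixed with brighter sea ice.
x = np.concatenate([rng.normal(-18, 1.5, 2000), rng.normal(-8, 2.0, 8000)])

w = np.array([0.5, 0.5])                 # mixing weights
mu = np.array([x.min(), x.max()])        # crude initialization
sd = np.array([x.std(), x.std()])

for _ in range(100):
    # E-step: posterior probability of each component for every pixel.
    pdf = np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
    resp = w * pdf
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means and standard deviations.
    nk = resp.sum(axis=0)
    w = nk / len(x)
    mu = (resp * x[:, None]).sum(axis=0) / nk
    sd = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)

print("weights:", w.round(2), "means:", mu.round(1))
```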

  6. Comparison of four different methods for detection of biofilm formation by uropathogens.

    PubMed

    Panda, Pragyan Swagatika; Chaudhary, Uma; Dube, Surya K

    2016-01-01

    Urinary tract infection (UTI) is one of the most common infectious diseases encountered in clinical practice. Emerging resistance of uropathogens to antimicrobial agents due to biofilm formation is a matter of concern when treating symptomatic UTI. However, studies comparing different methods for detection of biofilm formation by uropathogens are scarce. To compare four different methods for detection of biofilm formation by uropathogens. Prospective observational study conducted in a tertiary care hospital. A total of 300 isolates from urinary samples were analyzed for biofilm formation by four methods, that is, the tissue culture plate (TCP) method, tube method (TM), Congo Red Agar (CRA) method and modified CRA (MCRA) method. The Chi-square test was applied when two or more sets of variables were compared; P < 0.05 was considered statistically significant. Considering TCP to be the gold standard method for our study, we calculated the other statistical parameters. The rate of biofilm detection was 45.6% by TCP, 39.3% by TM, and 11% each by CRA and MCRA. The difference between TCP and CRA/MCRA was significant, but not that between TCP and TM. There was no difference in the rate of biofilm detection between CRA and MCRA for non-staphylococcal isolates, but MCRA was superior to CRA for detection of staphylococcal biofilm formation. The TCP method is the ideal method for detection of bacterial biofilm formation by uropathogens. The MCRA method is superior to CRA only for detection of staphylococcal biofilm formation.

  7. A spatial scan statistic for nonisotropic two-level risk cluster.

    PubMed

    Li, Xiao-Zhou; Wang, Jin-Feng; Yang, Wei-Zhong; Li, Zhong-Jie; Lai, Sheng-Jie

    2012-01-30

    Spatial scan statistic methods are commonly used for geographical disease surveillance and cluster detection. The standard spatial scan statistic does not model any variability in the underlying risks of subregions belonging to a detected cluster. For a multilevel risk cluster, the isotonic spatial scan statistic could model a centralized high-risk kernel in the cluster. Because variations in disease risks are anisotropic owing to different social, economical, or transport factors, the real high-risk kernel will not necessarily take the central place in a whole cluster area. We propose a spatial scan statistic for a nonisotropic two-level risk cluster, which could be used to detect a whole cluster and a noncentralized high-risk kernel within the cluster simultaneously. The performance of the three methods was evaluated through an intensive simulation study. Our proposed nonisotropic two-level method showed better power and geographical precision with two-level risk cluster scenarios, especially for a noncentralized high-risk kernel. Our proposed method is illustrated using the hand-foot-mouth disease data in Pingdu City, Shandong, China in May 2009, compared with two other methods. In this practical study, the nonisotropic two-level method is the only way to precisely detect a high-risk area in a detected whole cluster. Copyright © 2011 John Wiley & Sons, Ltd.

  8. HYPOTHESIS SETTING AND ORDER STATISTIC FOR ROBUST GENOMIC META-ANALYSIS.

    PubMed

    Song, Chi; Tseng, George C

    2014-01-01

    Meta-analysis techniques have been widely developed and applied in genomic applications, especially for combining multiple transcriptomic studies. In this paper, we propose an order statistic of p-values (rth ordered p-value, rOP) across combined studies as the test statistic. We illustrate different hypothesis settings that detect gene markers differentially expressed (DE) "in all studies", "in the majority of studies", or "in one or more studies", and specify rOP as a suitable method for detecting DE genes "in the majority of studies". We develop methods to estimate the parameter r in rOP for real applications. Statistical properties such as its asymptotic behavior and a one-sided testing correction for detecting markers of concordant expression changes are explored. Power calculation and simulation show better performance of rOP compared to classical Fisher's method, Stouffer's method, minimum p-value method and maximum p-value method under the focused hypothesis setting. Theoretically, rOP is found connected to the naïve vote counting method and can be viewed as a generalized form of vote counting with better statistical properties. The method is applied to three microarray meta-analysis examples including major depressive disorder, brain cancer and diabetes. The results demonstrate rOP as a more generalizable, robust and sensitive statistical framework to detect disease-related markers.
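    The null distribution that makes rOP work is simple enough to show directly: if p-values are i.i.d. Uniform(0,1) under H0, the rth smallest of n follows a Beta(r, n-r+1) distribution. The sketch below turns that into a combined p-value; the study p-values and the choice r = 4 are invented for illustration.

```python
# rOP-style combination: evaluate the r-th ordered p-value under its Beta null.
import numpy as np
from scipy.stats import beta

def rop_pvalue(pvals, r):
    """Combined p-value from the r-th smallest p-value across n studies."""
    p_sorted = np.sort(pvals)
    n = len(pvals)
    return beta.cdf(p_sorted[r - 1], r, n - r + 1)

# A gene significant "in the majority of studies": e.g. r = 4 of n = 6.
study_pvals = [0.001, 0.004, 0.02, 0.03, 0.40, 0.75]
print(f"rOP combined p-value: {rop_pvalue(study_pvals, r=4):.4f}")
```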

  9. Generalized Linear Models of Home Activity for Automatic Detection of Mild Cognitive Impairment in Older Adults*

    PubMed Central

    Akl, Ahmad; Snoek, Jasper; Mihailidis, Alex

    2015-01-01

    With a globally aging population, the burden of care of cognitively impaired older adults is becoming increasingly concerning. Instances of Alzheimer’s disease and other forms of dementia are becoming ever more frequent. Earlier detection of cognitive impairment offers significant benefits, but remains difficult to do in practice. In this paper, we develop statistical models of the behavior of older adults within their homes using sensor data in order to detect the early onset of cognitive decline. Specifically, we use inhomogeneous Poisson processes to model the presence of subjects within different rooms throughout the day in the home using unobtrusive sensing technologies. We compare the distributions learned from cognitively intact and impaired subjects using information theoretic tools and observe statistical differences between the two populations which we believe can be used to help detect the onset of cognitive decline. PMID:25570050

  10. Generalized Linear Models of home activity for automatic detection of mild cognitive impairment in older adults.

    PubMed

    Akl, Ahmad; Snoek, Jasper; Mihailidis, Alex

    2014-01-01

    With a globally aging population, the burden of care of cognitively impaired older adults is becoming increasingly concerning. Instances of Alzheimer's disease and other forms of dementia are becoming ever more frequent. Earlier detection of cognitive impairment offers significant benefits, but remains difficult to do in practice. In this paper, we develop statistical models of the behavior of older adults within their homes using sensor data in order to detect the early onset of cognitive decline. Specifically, we use inhomogeneous Poisson processes to model the presence of subjects within different rooms throughout the day in the home using unobtrusive sensing technologies. We compare the distributions learned from cognitively intact and impaired subjects using information theoretic tools and observe statistical differences between the two populations which we believe can be used to help detect the onset of cognitive decline.
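    A hedged sketch of the modeling idea in the two records above: estimate an hourly (inhomogeneous) Poisson rate of room presence for each subject from event timestamps, then compare the two profiles with a symmetrized Kullback-Leibler divergence. The data, binning, and symmetrization are illustrative assumptions, not the authors' exact formulation.

```python
# Hourly Poisson rate profiles and a symmetrized KL comparison between them.
import numpy as np

def hourly_rates(event_hours, n_days, eps=1e-6):
    """Piecewise-constant Poisson rates (events/hour) over 24 hourly bins."""
    counts, _ = np.histogram(event_hours, bins=24, range=(0, 24))
    return counts / n_days + eps         # eps avoids log(0) in the KL term

def poisson_kl(lam1, lam2):
    """KL(Poisson(lam1) || Poisson(lam2)), summed over the hourly bins."""
    return np.sum(lam1 * np.log(lam1 / lam2) + lam2 - lam1)

rng = np.random.default_rng(4)
intact = rng.normal(13, 3, 300) % 24     # synthetic room-presence hours
impaired = rng.normal(16, 6, 220) % 24

l1, l2 = hourly_rates(intact, 30), hourly_rates(impaired, 30)
print(f"symmetrized KL: {poisson_kl(l1, l2) + poisson_kl(l2, l1):.3f}")
```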

  11. Incorporating signal-dependent noise for hyperspectral target detection

    NASA Astrophysics Data System (ADS)

    Morman, Christopher J.; Meola, Joseph

    2015-05-01

    The majority of hyperspectral target detection algorithms are developed from statistical data models employing stationary background statistics or white Gaussian noise models. Stationary background models are inaccurate as a result of two separate physical processes. First, varying background classes often exist in the imagery that possess different clutter statistics. Many algorithms can account for this variability through the use of subspaces or clustering techniques. The second physical process, which is often ignored, is a signal-dependent sensor noise term. For photon counting sensors that are often used in hyperspectral imaging systems, sensor noise increases as the measured signal level increases as a result of Poisson random processes. This work investigates the impact of this sensor noise on target detection performance. A linear noise model is developed describing sensor noise variance as a linear function of signal level. The linear noise model is then incorporated for detection of targets using data collected at Wright Patterson Air Force Base.
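    To make the linear noise model concrete, the sketch below estimates its two coefficients by regressing patch variance on patch mean across signal levels, which is one common way such models are fit; the ground-truth values and data are synthetic, and this is not the paper's calibration procedure.

```python
# Fit a linear signal-dependent noise model: variance = a * signal + b.
import numpy as np

rng = np.random.default_rng(5)
a_true, b_true = 0.5, 10.0              # Poisson-like term plus a floor (assumed)
signal_levels = np.linspace(50, 1000, 40)

means, variances = [], []
for s in signal_levels:
    # Synthetic uniform patch at this signal level with signal-dependent noise.
    patch = s + rng.normal(0.0, np.sqrt(a_true * s + b_true), 1000)
    means.append(patch.mean())
    variances.append(patch.var(ddof=1))

a_hat, b_hat = np.polyfit(means, variances, 1)
print(f"estimated noise model: var = {a_hat:.2f} * signal + {b_hat:.1f}")
```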

  12. Robust detection of multiple sclerosis lesions from intensity-normalized multi-channel MRI

    NASA Astrophysics Data System (ADS)

    Karpate, Yogesh; Commowick, Olivier; Barillot, Christian

    2015-03-01

    Multiple sclerosis (MS) is a disease with heterogeneous evolution among patients. Quantitative analysis of longitudinal Magnetic Resonance Images (MRI) provides a spatial analysis of the brain tissues which may lead to the discovery of biomarkers of disease evolution. A better understanding of the disease will aid the discovery of pathogenic mechanisms, allowing for patient-adapted therapeutic strategies. To characterize MS lesions, we propose a novel paradigm to detect white matter lesions based on a statistical framework. It aims at studying the benefits of using multi-channel MRI to detect statistically significant differences between each individual MS patient and a database of control subjects. This framework consists of two components. First, intensity standardization is conducted to minimize the inter-subject intensity differences arising from variability of the acquisition process and different scanners. The intensity normalization relies on parameters obtained with a robust Gaussian Mixture Model (GMM) estimation that is not affected by the presence of MS lesions. The second component compares the multi-channel MRI of MS patients with an atlas built from the control subjects, thereby allowing us to look for differences in normal-appearing white matter, in and around the lesions of each patient. Experimental results demonstrate that our technique accurately detects significant differences in lesions, consequently improving the results of MS lesion detection.

  13. Spatial scan statistics for detection of multiple clusters with arbitrary shapes.

    PubMed

    Lin, Pei-Sheng; Kung, Yi-Hung; Clayton, Murray

    2016-12-01

    In applying scan statistics for public health research, it would be valuable to develop a detection method for multiple clusters that accommodates spatial correlation and covariate effects in an integrated model. In this article, we connect the concepts of the likelihood ratio (LR) scan statistic and the quasi-likelihood (QL) scan statistic to provide a series of detection procedures sufficiently flexible to apply to clusters of arbitrary shape. First, we use an independent scan model for detection of clusters and then a variogram tool to examine the existence of spatial correlation and regional variation based on residuals of the independent scan model. When the estimate of regional variation is significantly different from zero, a mixed QL estimating equation is developed to estimate coefficients of geographic clusters and covariates. We use the Benjamini-Hochberg procedure (1995) to find a threshold for p-values to address the multiple testing problem. A quasi-deviance criterion is used to regroup the estimated clusters to find geographic clusters with arbitrary shapes. We conduct simulations to compare the performance of the proposed method with other scan statistics. For illustration, the method is applied to enterovirus data from Taiwan. © 2016, The International Biometric Society.
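    The Benjamini-Hochberg step mentioned above is compact enough to show in full; the sketch below returns which hypotheses are rejected at FDR level q. The p-values are invented.

```python
# Benjamini-Hochberg procedure: threshold p-values to control the FDR at q.
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Return a boolean mask of rejected hypotheses at FDR level q."""
    p = np.asarray(pvals)
    order = np.argsort(p)
    m = len(p)
    thresh = q * np.arange(1, m + 1) / m
    below = p[order] <= thresh
    # Reject everything up to the largest k with p_(k) <= q * k / m.
    k = np.max(np.nonzero(below)[0]) + 1 if below.any() else 0
    reject = np.zeros(m, dtype=bool)
    reject[order[:k]] = True
    return reject

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.31, 0.9]
print(benjamini_hochberg(pvals))   # only the two smallest survive here
```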

  14. Detection of nonauthorized genetically modified organisms using differential quantitative polymerase chain reaction: application to 35S in maize.

    PubMed

    Cankar, Katarina; Chauvensy-Ancel, Valérie; Fortabat, Marie-Noelle; Gruden, Kristina; Kobilinsky, André; Zel, Jana; Bertheau, Yves

    2008-05-15

    Detection of nonauthorized genetically modified organisms (GMOs) has always presented an analytical challenge because the complete sequence data needed to detect them are generally unavailable, although sequence similarity to known GMOs can be expected. A new approach for the detection of nonauthorized GMOs, differential quantitative polymerase chain reaction (PCR), is presented here. This method is based on the presence of several common elements (e.g., promoter, genes of interest) in different GMOs. A statistical model was developed to study the difference between the number of molecules of such a common sequence and the number of molecules identifying the approved GMO (as determined by border-fragment-based PCR) and the donor organism of the common sequence. When this difference is statistically different from zero, the presence of a nonauthorized GMO can be inferred. The interest and scope of such an approach were tested in a case study of different proportions of genetically modified maize events, with the P35S promoter as the Cauliflower Mosaic Virus common sequence. The presence of a nonauthorized GMO was successfully detected in the mixtures analyzed and in the presence of Cauliflower Mosaic Virus (the donor organism of the P35S promoter). This method could easily be transposed to other common GMO sequences and other species, and is applicable to other detection areas such as microbiology.

  15. Statistical methods for detecting and comparing periodic data and their application to the nycthemeral rhythm of bodily harm: A population based study

    PubMed Central

    2010-01-01

    Background: Animals, including humans, exhibit a variety of biological rhythms. This article describes a method for the detection and simultaneous comparison of multiple nycthemeral rhythms. Methods: A statistical method for detecting periodic patterns in time-related data via harmonic regression is described. The method is particularly suited to detecting nycthemeral rhythms in medical data. Additionally, a method for simultaneously comparing two or more periodic patterns is described, which derives from the analysis of variance (ANOVA). This method statistically confirms or rejects the equality of periodic patterns. Mathematical descriptions of the detection method and the comparison method are given. Results: Nycthemeral rhythms of incidents of bodily harm in Middle Franconia are analyzed in order to demonstrate both methods. Every day of the week showed a significant nycthemeral rhythm of bodily harm. These seven weekday patterns were compared to each other, revealing only two different nycthemeral rhythms, one for Friday and Saturday and one for the other weekdays. PMID:21059197
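    Below is a minimal sketch of harmonic regression for a 24-hour rhythm, in the spirit of the method described (not the authors' implementation): fit an intercept plus one cosine/sine pair and F-test the harmonic terms against an intercept-only model. The hourly counts are simulated.

```python
# Harmonic regression: y ~ a + b*cos(2*pi*t/24) + c*sin(2*pi*t/24), F-test.
import numpy as np
from scipy.stats import f as f_dist

t = np.arange(24.0)                                  # hour of day
y = 5 + 3 * np.cos(2 * np.pi * (t - 22) / 24) + np.random.default_rng(6).normal(0, 1, 24)

X = np.column_stack([np.ones_like(t),
                     np.cos(2 * np.pi * t / 24),
                     np.sin(2 * np.pi * t / 24)])
beta, rss1, *_ = np.linalg.lstsq(X, y, rcond=None)
rss1 = float(rss1[0])
rss0 = np.sum((y - y.mean()) ** 2)                   # intercept-only model

F = ((rss0 - rss1) / 2) / (rss1 / (len(t) - 3))      # 2 harmonic parameters
p = 1 - f_dist.cdf(F, 2, len(t) - 3)
print(f"F = {F:.1f}, p = {p:.4g}  (small p -> significant nycthemeral rhythm)")
```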

  16. MR Spectroscopy to Distinguish between Supratentorial Intraventricular Subependymoma and Central Neurocytoma.

    PubMed

    Ueda, Fumiaki; Aburano, Hiroyuki; Ryu, Yasuji; Yoshie, Yuichi; Nakada, Mitsutoshi; Hayashi, Yutaka; Matsui, Osamu; Gabata, Toshifumi

    2017-07-10

    The purpose of this study was to discriminate supratentorial intraventricular subependymoma (SIS) from central neurocytoma (CNC) using magnetic resonance spectroscopy (MRS). Single-voxel proton MRS acquired with a 1.5T or 3T MR scanner from five SISs, five CNCs, and normal controls was evaluated, using a point-resolved spectroscopy sequence. Automatically calculated ratios comparing choline (Cho), N-acetylaspartate (NAA), myoinositol (MI), and/or glycine (Gly) to creatine (Cr) were determined. The ratio of Cr to unsuppressed water (USW) was also evaluated. The Mann-Whitney U test was carried out to test the significance of differences in the metabolite ratios, and the detectability of lactate (Lac) and alanine (Ala) was evaluated. Although a statistically significant difference (P < 0.0001) was observed in Cho/Cr among SIS, control spectra, and CNC, no statistical difference was noted between SIS and control spectra (P = 0.11). Statistically significant differences were observed in NAA/Cr between SIS and CNC (P = 0.04) or control spectra (P < 0.0001). A statistically significant difference was observed in MI and/or Gly to Cr between SIS and control spectra (P = 0.03), and between CNC and control spectra (P < 0.0006). There was no statistical difference between SIS and CNC for MI and/or Gly to Cr (P = 0.32). Significant statistical differences were found between SIS and control spectra (P < 0.0053), control spectra and CNC (P < 0.0016), and SIS and CNC (P < 0.0083) for Cr to USW. Inverted Lac doublets were confirmed in two SISs. Triplets of Lac and Ala were detected in four spectra of CNC. The present study showed that MRS can be useful in discriminating SIS from CNC.

  17. Statistics of Dark Matter Halos from Gravitational Lensing.

    PubMed

    Jain, B; Van Waerbeke, L

    2000-02-10

    We present a new approach to measure the mass function of dark matter halos and to discriminate models with differing values of Omega through weak gravitational lensing. We measure the distribution of peaks from simulated lensing surveys and show that the lensing signal due to dark matter halos can be detected for a wide range of peak heights. Even when the signal-to-noise ratio is well below the limit for detection of individual halos, projected halo statistics can be constrained for halo masses spanning galactic to cluster halos. The use of peak statistics relies on an analytical model of the noise due to the intrinsic ellipticities of source galaxies. The noise model has been shown to accurately describe simulated data for a variety of input ellipticity distributions. We show that the measured peak distribution has distinct signatures of gravitational lensing, and its non-Gaussian shape can be used to distinguish models with different values of Omega. The use of peak statistics is complementary to the measurement of field statistics, such as the ellipticity correlation function, and is possibly not susceptible to the same systematic errors.

  18. Compression Algorithm Analysis of In-Situ (S)TEM Video: Towards Automatic Event Detection and Characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teuton, Jeremy R.; Griswold, Richard L.; Mehdi, Beata L.

    Precise analysis of both (S)TEM images and video is a time- and labor-intensive process. As an example, determining when crystal growth and shrinkage occur during the dynamic process of Li dendrite deposition and stripping involves manually scanning through each frame in the video to extract a specific set of frames/images. For large numbers of images, this process can be very time consuming, so a fast and accurate automated method is desirable. Given this need, we developed software that uses analysis of video compression statistics for detecting and characterizing events in large data sets. This software works by converting the data into a series of images, which it compresses into an MPEG-2 video using the open source “avconv” utility [1]. The software does not use the video itself, but rather analyzes the video statistics from the first pass of the video encoding that avconv records in the log file. This file contains statistics for each frame of the video, including the frame quality, intra-texture and predicted-texture bits, and forward and backward motion vector resolution, among others. In all, avconv records 15 statistics for each frame. By combining different statistics, we have been able to detect events in various types of data. We have developed an interactive tool for exploring the data and the statistics that aids the analyst in selecting useful statistics for each analysis. Going forward, an algorithm for detecting and possibly describing events automatically can be written based on the statistic(s) for each data type.

  19. Identification of Major Histocompatibility Complex-Regulated Body Odorants by Statistical Analysis of a Comparative Gas Chromatography/Mass Spectrometry Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willse, Alan R.; Belcher, Ann; Preti, George

    2005-04-15

    Gas chromatography (GC), combined with mass spectrometry (MS) detection, is a powerful analytical technique that can be used to separate, quantify, and identify volatile compounds in complex mixtures. This paper examines the application of GC-MS in a comparative experiment to identify volatiles that differ in concentration between two groups. A complex mixture might comprise several hundred or even thousands of volatile compounds. Because their number and location in a chromatogram generally are unknown, and because components overlap in populous chromatograms, the statistical problems offer significant challenges beyond traditional two-group screening procedures. We describe a statistical procedure to compare two-dimensional GC-MS profiles between groups, which entails (1) signal processing: baseline correction and peak detection in single ion chromatograms; (2) aligning chromatograms in time; (3) normalizing differences in overall signal intensities; and (4) detecting chromatographic regions that differ between groups. Compared to existing approaches, the proposed method is robust to errors made at earlier stages of analysis, such as missed peaks or slightly misaligned chromatograms. To illustrate the method, we identify differences in GC-MS chromatograms of ether-extracted urine collected from two nearly identical inbred groups of mice, to investigate the relationship between odor and genetics of the major histocompatibility complex.

  20. Surface defect detection in tiling Industries using digital image processing methods: analysis and evaluation.

    PubMed

    Karimi, Mohammad H; Asemani, Davud

    2014-05-01

    Ceramic and tile industries must include a grading stage to quantify the quality of products. In practice, human inspection systems are often used for grading; an automatic grading system is essential to enhance the quality control and marketing of the products. Since there generally exist six different types of defects, originating from various stages of tile manufacturing lines, with distinct textures and morphologies, many image processing techniques have been proposed for defect detection. In this paper, a survey is made of the pattern recognition and image processing algorithms that have been used to detect surface defects. Each method appears to be limited to detecting some subgroup of defects. The detection techniques may be divided into three main groups: statistical pattern recognition, feature vector extraction and texture/image classification. Methods such as the wavelet transform, filtering, morphology and contourlet transform are more effective for pre-processing tasks. Others, including statistical methods, neural networks and model-based algorithms, can be applied to extract the surface defects. Although statistical methods are often appropriate for identification of large defects such as Spots, techniques such as wavelet processing provide an acceptable response for detection of small defects such as Pinhole. A thorough survey is made in this paper of the existing algorithms in each subgroup. Also, the evaluation parameters are discussed, including supervised and unsupervised parameters. Using various performance parameters, different defect detection algorithms are compared and evaluated. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.

  1. Evaluating and implementing temporal, spatial, and spatio-temporal methods for outbreak detection in a local syndromic surveillance system

    PubMed Central

    Lall, Ramona; Levin-Rector, Alison; Sell, Jessica; Paladini, Marc; Konty, Kevin J.; Olson, Don; Weiss, Don

    2017-01-01

    The New York City Department of Health and Mental Hygiene has operated an emergency department syndromic surveillance system since 2001, using temporal and spatial scan statistics run on a daily basis for cluster detection. Since the system was originally implemented, a number of new methods have been proposed for use in cluster detection. We evaluated six temporal and four spatial/spatio-temporal detection methods using syndromic surveillance data spiked with simulated injections. The algorithms were compared on several metrics, including sensitivity, specificity, positive predictive value, coherence, and timeliness. We also evaluated each method’s implementation, programming time, run time, and the ease of use. Among the temporal methods, at a set specificity of 95%, a Holt-Winters exponential smoother performed the best, detecting 19% of the simulated injects across all shapes and sizes, followed by an autoregressive moving average model (16%), a generalized linear model (15%), a modified version of the Early Aberration Reporting System’s C2 algorithm (13%), a temporal scan statistic (11%), and a cumulative sum control chart (<2%). Of the spatial/spatio-temporal methods we tested, a spatial scan statistic detected 3% of all injects, a Bayes regression found 2%, and a generalized linear mixed model and a space-time permutation scan statistic detected none at a specificity of 95%. Positive predictive value was low (<7%) for all methods. Overall, the detection methods we tested did not perform well in identifying the temporal and spatial clusters of cases in the inject dataset. The spatial scan statistic, our current method for spatial cluster detection, performed slightly better than the other tested methods across different inject magnitudes and types. Furthermore, we found the scan statistics, as applied in the SaTScan software package, to be the easiest to program and implement for daily data analysis. PMID:28886112

  2. Evaluating and implementing temporal, spatial, and spatio-temporal methods for outbreak detection in a local syndromic surveillance system.

    PubMed

    Mathes, Robert W; Lall, Ramona; Levin-Rector, Alison; Sell, Jessica; Paladini, Marc; Konty, Kevin J; Olson, Don; Weiss, Don

    2017-01-01

    The New York City Department of Health and Mental Hygiene has operated an emergency department syndromic surveillance system since 2001, using temporal and spatial scan statistics run on a daily basis for cluster detection. Since the system was originally implemented, a number of new methods have been proposed for use in cluster detection. We evaluated six temporal and four spatial/spatio-temporal detection methods using syndromic surveillance data spiked with simulated injections. The algorithms were compared on several metrics, including sensitivity, specificity, positive predictive value, coherence, and timeliness. We also evaluated each method's implementation, programming time, run time, and the ease of use. Among the temporal methods, at a set specificity of 95%, a Holt-Winters exponential smoother performed the best, detecting 19% of the simulated injects across all shapes and sizes, followed by an autoregressive moving average model (16%), a generalized linear model (15%), a modified version of the Early Aberration Reporting System's C2 algorithm (13%), a temporal scan statistic (11%), and a cumulative sum control chart (<2%). Of the spatial/spatio-temporal methods we tested, a spatial scan statistic detected 3% of all injects, a Bayes regression found 2%, and a generalized linear mixed model and a space-time permutation scan statistic detected none at a specificity of 95%. Positive predictive value was low (<7%) for all methods. Overall, the detection methods we tested did not perform well in identifying the temporal and spatial clusters of cases in the inject dataset. The spatial scan statistic, our current method for spatial cluster detection, performed slightly better than the other tested methods across different inject magnitudes and types. Furthermore, we found the scan statistics, as applied in the SaTScan software package, to be the easiest to program and implement for daily data analysis.
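    For a feel of how one of these detectors works, below is a hedged sketch of a C2-style algorithm: each day's count is standardized against a 7-day baseline separated by a 2-day guard band, and a standardized exceedance above 3 raises an alarm. The baseline/lag/threshold values follow the common EARS convention, the small floor on the standard deviation is an assumption, and the counts are invented.

```python
# C2-style aberration detection on a daily syndromic count series.
import numpy as np

def c2_alarms(counts, baseline=7, lag=2, threshold=3.0):
    """Return indices of days whose count exceeds baseline mean + threshold*sd."""
    counts = np.asarray(counts, dtype=float)
    alarms = []
    for t in range(baseline + lag, len(counts)):
        window = counts[t - lag - baseline : t - lag]   # 7-day baseline, 2-day gap
        mu, sd = window.mean(), max(window.std(ddof=1), 0.2)  # sd floor (assumed)
        if (counts[t] - mu) / sd > threshold:
            alarms.append(t)
    return alarms

daily_visits = [12, 15, 11, 13, 14, 12, 16, 13, 12, 15, 14, 13, 30, 14]
print("alarm on days:", c2_alarms(daily_visits))   # flags the spike to 30
```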

  3. [The prognostic value of cerebral oxygen saturation measurement for assessing prognosis after cardiopulmonary resuscitation].

    PubMed

    Inal, Mehmet Turan; Memiş, Dilek; Yıldırım, Ilker; Uğur, Hüseyin; Erkaymaz, Aysegul; Turan, F Nesrin

    Despite new improvements in cardiopulmonary resuscitation (CPR), brain damage is very common after resuscitation. To assess the prognostic value of cerebral oxygen saturation (rSO2) measurement for assessing prognosis in patients after cardiopulmonary resuscitation. Retrospective analysis. We analyzed 25 post-CPR patients (12 female and 13 male). All patients were cooled to a target temperature of 33-34°C. The Glasgow Coma Scale (GCS), corneal reflexes (CR), pupillary reflexes (PR), arterial base excess (BE) and rSO2 measurements were taken on admission. The rewarming GCS, CR, PR, BE and rSO2 measurements were made after the patient's temperature reached 36°C. In survivors, the baseline rSO2 value was 67.5 (46-70) and the percent difference between baseline and rewarming rSO2 values was 0.03 (0.014-0.435). In non-survivors, the baseline rSO2 value was 30 (25-65) and the percent difference between baseline and rewarming rSO2 values was 0.031 (-0.08 to -20). No statistical difference was detected in percent changes between baseline and rewarming values of rSO2. A statistically significant difference was detected between baseline and rewarming GCS values (p=0.004). No statistical difference was detected between GCS, CR, PR, BE and rSO2 in determining the prognosis. Despite higher rSO2 values in survivors than in non-survivors, we found no statistically significant difference between groups in baseline and rewarming rSO2 values. Since the measurement is simple, and not affected by hypotension or hypothermia, rSO2 may be a useful predictor for determining the prognosis after CPR. Copyright © 2016 Sociedade Brasileira de Anestesiologia. Publicado por Elsevier Editora Ltda. All rights reserved.

  4. Data processing of qualitative results from an interlaboratory comparison for the detection of “Flavescence dorée” phytoplasma: How the use of statistics can improve the reliability of the method validation process in plant pathology

    PubMed Central

    Renaudin, Isabelle; Poliakoff, Françoise

    2017-01-01

    A working group established in the framework of the EUPHRESCO European collaborative project aimed to compare and validate diagnostic protocols for the detection of “Flavescence dorée” (FD) phytoplasma in grapevines. Seven molecular protocols were compared in an interlaboratory test performance study where each laboratory had to analyze the same panel of samples consisting of DNA extracts prepared by the organizing laboratory. The tested molecular methods consisted of universal and group-specific real-time and end-point nested PCR tests. Different statistical approaches were applied to this collaborative study. Firstly, there was the standard statistical approach consisting in analyzing samples which are known to be positive and samples which are known to be negative and reporting the proportion of false-positive and false-negative results to respectively calculate diagnostic specificity and sensitivity. This approach was supplemented by the calculation of repeatability and reproducibility for qualitative methods based on the notions of accordance and concordance. Other new approaches were also implemented, based, on the one hand, on the probability of detection model, and, on the other hand, on Bayes’ theorem. These various statistical approaches are complementary and give consistent results. Their combination, and in particular, the introduction of new statistical approaches give overall information on the performance and limitations of the different methods, and are particularly useful for selecting the most appropriate detection scheme with regards to the prevalence of the pathogen. Three real-time PCR protocols (methods M4, M5 and M6 respectively developed by Hren (2007), Pelletier (2009) and under patent oligonucleotides) achieved the highest levels of performance for FD phytoplasma detection. This paper also addresses the issue of indeterminate results and the identification of outlier results. The statistical tools presented in this paper and their combination can be applied to many other studies concerning plant pathogens and other disciplines that use qualitative detection methods. PMID:28384335

  5. Data processing of qualitative results from an interlaboratory comparison for the detection of "Flavescence dorée" phytoplasma: How the use of statistics can improve the reliability of the method validation process in plant pathology.

    PubMed

    Chabirand, Aude; Loiseau, Marianne; Renaudin, Isabelle; Poliakoff, Françoise

    2017-01-01

    A working group established in the framework of the EUPHRESCO European collaborative project aimed to compare and validate diagnostic protocols for the detection of "Flavescence dorée" (FD) phytoplasma in grapevines. Seven molecular protocols were compared in an interlaboratory test performance study where each laboratory had to analyze the same panel of samples consisting of DNA extracts prepared by the organizing laboratory. The tested molecular methods consisted of universal and group-specific real-time and end-point nested PCR tests. Different statistical approaches were applied to this collaborative study. Firstly, there was the standard statistical approach consisting in analyzing samples which are known to be positive and samples which are known to be negative and reporting the proportion of false-positive and false-negative results to respectively calculate diagnostic specificity and sensitivity. This approach was supplemented by the calculation of repeatability and reproducibility for qualitative methods based on the notions of accordance and concordance. Other new approaches were also implemented, based, on the one hand, on the probability of detection model, and, on the other hand, on Bayes' theorem. These various statistical approaches are complementary and give consistent results. Their combination, and in particular, the introduction of new statistical approaches give overall information on the performance and limitations of the different methods, and are particularly useful for selecting the most appropriate detection scheme with regards to the prevalence of the pathogen. Three real-time PCR protocols (methods M4, M5 and M6 respectively developed by Hren (2007), Pelletier (2009) and under patent oligonucleotides) achieved the highest levels of performance for FD phytoplasma detection. This paper also addresses the issue of indeterminate results and the identification of outlier results. The statistical tools presented in this paper and their combination can be applied to many other studies concerning plant pathogens and other disciplines that use qualitative detection methods.
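    One of the "new approaches" in the two records above rests on Bayes' theorem; the sketch below shows the basic computation of predictive values from a method's diagnostic sensitivity and specificity at a given prevalence, which is why prevalence matters when selecting a detection scheme. The numbers are illustrative only.

```python
# Bayes' theorem for diagnostics: predictive values from sens/spec/prevalence.
def predictive_values(sens, spec, prevalence):
    ppv = sens * prevalence / (sens * prevalence + (1 - spec) * (1 - prevalence))
    npv = spec * (1 - prevalence) / ((1 - sens) * prevalence + spec * (1 - prevalence))
    return ppv, npv

# Illustrative: a sensitive, specific test still yields a modest PPV at 2% prevalence.
ppv, npv = predictive_values(sens=0.98, spec=0.96, prevalence=0.02)
print(f"PPV = {ppv:.2f}, NPV = {npv:.3f}")
```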

  6. Reanalysis of Tyrannosaurus rex Mass Spectra.

    PubMed

    Bern, Marshall; Phinney, Brett S; Goldberg, David

    2009-09-01

    Asara et al. reported the detection of collagen peptides in a 68-million-year-old Tyrannosaurus rex bone by shotgun proteomics. This finding has been called into question as a possible statistical artifact. We reanalyze Asara et al.'s tandem mass spectra using a different search engine and different statistical tools. Our reanalysis shows a sample containing common laboratory contaminants, soil bacteria, and bird-like hemoglobin and collagen.

  7. Evidence for speckle effects on pulsed CO2 lidar signal returns from remote targets

    NASA Technical Reports Server (NTRS)

    Menzies, R. T.; Kavaya, M. J.; Flamant, P. H.

    1984-01-01

    A pulsed CO2 lidar was used to study statistical properties of signal returns from various rough surfaces at distances near 2 km. These included natural in situ topographic materials as well as man-made hard targets. Three lidar configurations were used: heterodyne detection with single temporal mode transmitter pulses, and direct detection with single and multiple temporal mode pulses. The significant differences in signal return statistics, due largely to speckle effects, are discussed.

  8. Detection of Tampering Inconsistencies on Mobile Photos

    NASA Astrophysics Data System (ADS)

    Cao, Hong; Kot, Alex C.

    Fast proliferation of mobile cameras and deteriorating trust in digital images have created a need for determining the integrity of photos captured by mobile devices. As tampering often creates some inconsistencies, we propose in this paper a novel framework to statistically detect image tampering inconsistencies using accurately detected demosaicing weights features. By first cropping four non-overlapping blocks, each from one of the four quadrants of the mobile photo, we extract a set of demosaicing weights features from each block based on a partial derivative correlation model. Through regularizing the eigenspectrum of the within-photo covariance matrix and performing eigenfeature transformation, we further derive a compact set of eigen demosaicing weights features, which are sensitive to image signal mixing from different photo sources. A metric is then proposed to quantify the inconsistency based on the eigen weights features among the blocks cropped from different regions of the mobile photo. Through comparison, we show that our eigen weights features perform better than eigen features extracted from several other conventional sets of statistical forensics features in detecting the presence of tampering. Experimentally, our method shows good confidence in tampering detection, especially when one of the four cropped blocks is from a different camera model or brand with a different demosaicing process.

  9. Sister chromatid exchanges and micronuclei analysis in lymphocytes of men exposed to simazine through drinking water.

    PubMed

    Suárez, Susanna; Rubio, Arantxa; Sueiro, Rosa Ana; Garrido, Joaquín

    2003-06-06

    In some cities of the autonomous community of Extremadura (south-west of Spain), levels of simazine from 10 to 30 ppm were detected in tap water. To analyse the possible effect of this herbicide, two biomarkers, sister chromatid exchanges (SCE) and micronuclei (MN), were used in peripheral blood lymphocytes from males exposed to simazine through drinking water. SCE and MN analysis failed to detect any statistically significant increase in the people exposed to simazine when compared with the controls. With respect to high frequency cells (HFC), a statistically significant difference was detected between exposed and control groups.

  10. Diagnostic Accuracy of Computer Tomography Angiography and Magnetic Resonance Angiography in the Stenosis Detection of Autologous Hemodialysis Access: A Meta-Analysis

    PubMed Central

    Liu, Shiyuan

    2013-01-01

    Purpose To compare the diagnostic performances of computer tomography angiography (CTA) and magnetic resonance angiography (MRA) for detection and assessment of stenosis in patients with autologous hemodialysis access. Materials and Methods Search of the PubMed, MEDLINE, EMBASE and Cochrane Library databases from January 1984 to May 2013 for studies comparing CTA or MRA with DSA or surgery for autologous hemodialysis access. Eligible studies were in the English language, aimed to detect more than 50% stenosis or occlusion of autologous vascular access in hemodialysis patients with CTA and MRA technology, and provided sufficient data about diagnostic performance. Methodological quality was assessed by the Quality Assessment of Diagnostic Studies (QUADAS) instrument. Sensitivities (SEN), specificities (SPE), positive likelihood ratios (PLR), negative likelihood ratios (NLR), diagnostic odds ratios (DOR) and areas under the receiver operator characteristic curve (AUC) were pooled statistically. Potential threshold effect, heterogeneity and publication bias were evaluated. The clinical utility of CTA and MRA in detection of stenosis was also investigated. Results Sixteen eligible studies were included, with a total of 500 patients. Both CTA and MRA were accurate modalities (sensitivity, 96.2% and 95.4%, respectively; specificity, 97.1% and 96.1%, respectively; DOR, 393.69 and 211.47, respectively) for hemodialysis vascular access. No significant difference was detected between the diagnostic performance of CTA (AUC, 0.988) and MRA (AUC, 0.982). Meta-regression analyses and subgroup analyses revealed no statistical difference. Deeks' funnel plots suggested a publication bias. Conclusion The diagnostic performance of CTA and MRA for detecting stenosis of hemodialysis vascular access showed no statistical difference. Both techniques may function as an alternative or an important complement to conventional digital subtraction angiography (DSA) and may be able to help guide medical management. PMID:24194928

  11. Reanalysis of Tyrannosaurus rex Mass Spectra

    PubMed Central

    Bern, Marshall; Phinney, Brett S.; Goldberg, David

    2009-01-01

    Asara et al. reported the detection of collagen peptides in a 68-million-year-old T. rex bone by shotgun proteomics. This finding has been called into question as a possible statistical artifact. We reanalyze Asara et al.'s tandem mass spectra using a different search engine and different statistical tools. Our reanalysis shows a sample containing common laboratory contaminants, soil bacteria, and bird-like hemoglobin and collagen. PMID:19603827

  12. Detection of circulating tumor cells using oHSV1-hTERT-GFP in lung cancer.

    PubMed

    Gao, Hongjun; Liu, Wenjing; Yang, Shaoxing; Zhang, Wen; Li, Xiaoyan; Qin, Haifeng; Wang, Weixia; Zhao, Changyun

    2018-01-01

    This study was conducted to evaluate the clinical utility of the oHSV1-hTERT-GFP circulating tumor cell (CTC) detection method in the peripheral blood of patients with lung cancer by comparing its sensitivity to the CellSearch CTC detection method. The oHSV1-hTERT-GFP and CellSearch CTC detection methods were compared using peripheral blood samples of patients pathologically diagnosed with lung cancer. A total of 240 patients with lung cancer were recruited, including 89 patients who were newly diagnosed and 151 patients who had previously received treatment. Sixty-six newly diagnosed patients were evaluated using both methods. The CTC detection rates were 71.2% and 33.3% using the oHSV1-hTERT-GFP and CellSearch methods, respectively; this difference was statistically significant (P = 0.000). Among the entire cohort (n = 240), the CTC detection rate using the oHSV1-hTERT-GFP method was 76.3%, with a CTC count of 0-81. The CTC detection rates were 76.7%, 68.9%, and 76.3% in patients with squamous cell carcinoma, adenocarcinoma, and small cell lung cancer, respectively. There was no statistically significant difference in the CTC detection rates between these different pathological subtypes (P = 0.738). The CTC detection rates of 79.8% and 74.4% in patients with stage I-III and IV lung cancer, respectively, were not significantly different (P = 0.427). The oHSV1-hTERT-GFP method is highly effective for detecting CTCs in patients with lung cancer, independent of pathological type and disease stage, and is ideal for large-scale clinical applications. © 2017 The Authors. Thoracic Cancer published by China Lung Oncology Group and John Wiley & Sons Australia, Ltd.

  13. Ship detection using STFT sea background statistical modeling for large-scale oceansat remote sensing image

    NASA Astrophysics Data System (ADS)

    Wang, Lixia; Pei, Jihong; Xie, Weixin; Liu, Jinyuan

    2018-03-01

    Large-scale oceansat remote sensing images cover a large area of sea surface, whose fluctuation can be considered a non-stationary process. The Short-Time Fourier Transform (STFT) is a suitable analysis tool for time-varying non-stationary signals. In this paper, a novel ship detection method using 2-D STFT sea background statistical modeling for large-scale oceansat remote sensing images is proposed. First, the paper divides the large-scale oceansat remote sensing image into small sub-blocks, and 2-D STFT is applied to each sub-block individually. Second, the 2-D STFT spectra of the sub-blocks are studied, and an obvious difference in characteristics between sea background and non-sea background is found. Finally, a statistical model for all valid frequency points in the STFT spectrum of the sea background is given, and a ship detection method based on 2-D STFT spectrum modeling is proposed. The experimental results show that the proposed algorithm can detect ship targets with a high recall rate and a low missing rate.

  14. Built-Up Area Detection from High-Resolution Satellite Images Using Multi-Scale Wavelet Transform and Local Spatial Statistics

    NASA Astrophysics Data System (ADS)

    Chen, Y.; Zhang, Y.; Gao, J.; Yuan, Y.; Lv, Z.

    2018-04-01

    Recently, built-up area detection from high-resolution satellite images (HRSI) has attracted increasing attention because HRSI can provide more detailed object information. In this paper, the multi-resolution wavelet transform and a local spatial autocorrelation statistic are introduced to model the spatial patterns of built-up areas. First, the input image is decomposed into high- and low-frequency subbands by the wavelet transform at three levels. Then the high-frequency detail information in three directions (horizontal, vertical and diagonal) is extracted, followed by a maximization operation to integrate the information from all directions. Afterward, a cross-scale operation is implemented to fuse different levels of information. Finally, a local spatial autocorrelation statistic is introduced to enhance the saliency of built-up features, and an adaptive threshold algorithm is used to achieve the detection of built-up areas. Experiments are conducted on ZY-3 and Quickbird panchromatic satellite images, and the results show that the proposed method is very effective for built-up area detection.

  15. A study of two unsupervised data driven statistical methodologies for detecting and classifying damages in structural health monitoring

    NASA Astrophysics Data System (ADS)

    Tibaduiza, D.-A.; Torres-Arredondo, M.-A.; Mujica, L. E.; Rodellar, J.; Fritzen, C.-P.

    2013-12-01

    This article is concerned with the practical use of Multiway Principal Component Analysis (MPCA), the Discrete Wavelet Transform (DWT), Squared Prediction Error (SPE) measures and Self-Organizing Maps (SOM) to detect and classify damages in mechanical structures. The formalism is based on a distributed piezoelectric active sensor network for the excitation and detection of structural dynamic responses. Statistical models are built using PCA when the structure is known to be healthy, either directly from the dynamic responses or from wavelet coefficients at different scales representing time-frequency information. Different damages on the tested structures are simulated by adding masses at different positions. The data from the structure in different states (damaged or not) are then projected into the different principal component models by each actuator in order to obtain the input feature vectors for a SOM from the scores and the SPE measures. An aircraft fuselage from an Airbus A320 and a multi-layered carbon fiber reinforced plastic (CFRP) plate are used as examples to test the approaches. Results are presented, compared and discussed in order to determine their potential in structural health monitoring. These results showed that all the simulated damages were detectable, and the selected features proved capable of separating all damage conditions from the undamaged state for both approaches.
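    The SPE measure at the heart of this pipeline is easy to demonstrate: train a PCA model on baseline responses, then score new feature vectors by their squared reconstruction error against the retained subspace. The sketch below is a generic illustration on synthetic low-rank data, not the authors' MPCA/DWT/SOM pipeline.

```python
# SPE (Q statistic) against a PCA model built from healthy-state data.
import numpy as np

rng = np.random.default_rng(7)
# Baseline ("healthy") responses: low-rank structure plus small sensor noise.
healthy = rng.normal(0, 1, (200, 3)) @ rng.normal(0, 1, (3, 12)) \
          + rng.normal(0, 0.1, (200, 12))

mean = healthy.mean(axis=0)
Xc = healthy - mean
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
P = Vt[:3].T                              # retain 3 principal components

def spe(x):
    """Squared prediction error of one feature vector against the PCA model."""
    xc = x - mean
    residual = xc - P @ (P.T @ xc)        # part not explained by the subspace
    return float(residual @ residual)

threshold = np.quantile([spe(row) for row in healthy], 0.99)
damaged = healthy[0] + rng.normal(0, 1.0, 12)   # simulated damage signature
print(f"SPE = {spe(damaged):.2f}, 99% baseline threshold = {threshold:.2f}")
```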

  16. Evaluation of the 3M™ Molecular Detection Assay (MDA) 2 - Salmonella for the Detection of Salmonella spp. in Select Foods and Environmental Surfaces: Collaborative Study, First Action 2016.01.

    PubMed

    Bird, Patrick; Flannery, Jonathan; Crowley, Erin; Agin, James R; Goins, David; Monteroso, Lisa

    2016-07-01

    The 3M™ Molecular Detection Assay (MDA) 2 - Salmonella uses real-time isothermal technology for the rapid and accurate detection of Salmonella spp. from enriched select food, feed, and food-process environmental samples. The 3M MDA 2 - Salmonella was evaluated in a multilaboratory collaborative study using an unpaired study design. The 3M MDA 2 - Salmonella was compared to the U.S. Food and Drug Administration Bacteriological Analytical Manual Chapter 5 reference method for the detection of Salmonella in creamy peanut butter, and to the U.S. Department of Agriculture, Food Safety and Inspection Service Microbiology Laboratory Guidebook Chapter 4.08 reference method "Isolation and Identification of Salmonella from Meat, Poultry, Pasteurized Egg and Catfish Products and Carcass and Environmental Samples" for the detection of Salmonella in raw ground beef (73% lean). Technicians from 16 laboratories located within the continental United States participated. Each matrix was evaluated at three levels of contamination: an uninoculated control level (0 CFU/test portion), a low inoculum level (0.2-2 CFU/test portion), and a high inoculum level (2-5 CFU/test portion). Statistical analysis was conducted according to the probability of detection (POD) statistical model. Results obtained for the low inoculum level test portions produced a difference in collaborator POD values of 0.03 (95% confidence interval, -0.10 to 0.16) for raw ground beef and 0.06 (95% confidence interval, -0.06 to 0.18) for creamy peanut butter, indicating no statistically significant difference between the candidate and reference methods.
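
    As a simplified illustration of a POD difference with an approximate confidence interval: the AOAC POD model uses its own interval formulas, so the Wald-style interval and the counts below are only a hypothetical stand-in.

```python
import math

def dpod(x1, n1, x2, n2, z=1.96):
    """Difference in probability of detection between two methods, with a
    Wald-style 95% confidence interval (normal approximation)."""
    p1, p2 = x1 / n1, x2 / n2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    d = p1 - p2
    return d, (d - z * se, d + z * se)

# Hypothetical low-level counts: candidate 52/96 positives vs reference 49/96.
print(dpod(52, 96, 49, 96))
```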

  17. Reproducible detection of disease-associated markers from gene expression data.

    PubMed

    Omae, Katsuhiro; Komori, Osamu; Eguchi, Shinto

    2016-08-18

    Detection of disease-associated markers plays a crucial role in gene screening for biological studies. Two-sample test statistics, such as the t-statistic, are widely used to rank genes based on gene expression data. However, the resultant gene ranking is often not reproducible among different data sets. Such irreproducibility may be caused by disease heterogeneity. When we divided data into two subsets, we found that the signs of the two t-statistics were often reversed. Focusing on such instability, we proposed a sign-sum statistic that counts the signs of the t-statistics over all possible subsets. The proposed method excludes genes affected by heterogeneity, thereby improving the reproducibility of gene ranking. We compared the sign-sum statistic with the t-statistic by a theoretical evaluation of the upper confidence limit. Through simulations and applications to real data sets, we show that the sign-sum statistic exhibits superior performance. We derive the sign-sum statistic to obtain a robust gene ranking, which is more reproducible than that given by the t-statistic. Using simulated data sets we show that the sign-sum statistic effectively excludes genes affected by heterogeneity; on the real data sets it also performs well in terms of ranking reproducibility.
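
    Enumerating all possible subsets is combinatorial, so the sketch below approximates the sign-counting idea with random half-splits; the split scheme and the simulated markers are assumptions, not the paper's exact procedure.

```python
import numpy as np
from scipy.stats import ttest_ind

def sign_sum(case, control, n_splits=200, rng=np.random.default_rng(0)):
    """Per gene, average the signs of t-statistics over random half-splits.
    Genes whose t-statistic flips sign across splits score near zero."""
    total = np.zeros(case.shape[0])
    for _ in range(n_splits):
        ca = rng.permutation(case.shape[1])[: case.shape[1] // 2]
        co = rng.permutation(control.shape[1])[: control.shape[1] // 2]
        t, _ = ttest_ind(case[:, ca], control[:, co], axis=1)
        total += np.sign(t)
    return total / n_splits

genes, n = 1000, 40
case, control = np.random.randn(genes, n), np.random.randn(genes, n)
case[0] += 1.0                  # stable marker: consistent sign across splits
case[1, : n // 2] += 4.0        # heterogeneous gene: one subgroup up,
case[1, n // 2:] -= 4.0         # the other down, so the sign is unstable
ranking = np.argsort(-np.abs(sign_sum(case, control)))
print(ranking[:5])              # the stable marker ranks high, gene 1 does not
```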

  18. Incorporation of operator knowledge for improved HMDS GPR classification

    NASA Astrophysics Data System (ADS)

    Kennedy, Levi; McClelland, Jessee R.; Walters, Joshua R.

    2012-06-01

    The Husky Mine Detection System (HMDS) detects and alerts operators to potential threats observed in ground-penetrating radar (GPR) data. In the current system architecture, the classifiers have been trained using available data from multiple training sites. Changes in target types, clutter types, and operational conditions may result in statistical differences between the training data and the testing data for the underlying features used by the classifier, potentially resulting in an increased false alarm rate or a lower probability of detection for the system. In the current mode of operation, the automated detection system alerts the human operator when a target-like object is detected. The operator then uses data visualization software, contextual information, and human intuition to decide whether the alarm presented is an actual target or a false alarm. When the statistics of the training data and the testing data are mismatched, the automated detection system can overwhelm the analyst with an excessive number of false alarms. This is evident in the performance of, and the data collected from, deployed systems. This work demonstrates that analyst feedback can be successfully used to re-train a classifier to account for variable testing data statistics not originally captured in the initial training data.

  19. A new statistical approach to climate change detection and attribution

    NASA Astrophysics Data System (ADS)

    Ribes, Aurélien; Zwiers, Francis W.; Azaïs, Jean-Marc; Naveau, Philippe

    2017-01-01

    We propose here a new statistical approach to climate change detection and attribution that is based on additive decomposition and simple hypothesis testing. Most current statistical methods for detection and attribution rely on linear regression models where the observations are regressed onto expected response patterns to different external forcings. These methods do not use physical information provided by climate models regarding the expected response magnitudes to constrain the estimated responses to the forcings. Climate modelling uncertainty is difficult to take into account with regression-based methods and is almost never treated explicitly. As an alternative to this approach, our statistical model is only based on the additivity assumption; the proposed method does not regress observations onto expected response patterns. We introduce estimation and testing procedures based on likelihood maximization, and show that climate modelling uncertainty can easily be accounted for. Some discussion is provided on how to practically estimate the climate modelling uncertainty based on an ensemble of opportunity. Our approach is based on the "models are statistically indistinguishable from the truth" paradigm, where the difference between any given model and the truth has the same distribution as the difference between any pair of models, but other choices might also be considered. The properties of this approach are illustrated and discussed based on synthetic data. Lastly, the method is applied to the linear trend in global mean temperature over the period 1951-2010. Consistent with the last IPCC assessment report, we find that most of the observed warming over this period (+0.65 K) is attributable to anthropogenic forcings (+0.67 ± 0.12 K, 90% confidence range), with a very limited contribution from natural forcings (-0.01 ± 0.02 K).

  20. [Detection rate analysis on neurological sign of workers exposed to different concentrations of carbon disulfide].

    PubMed

    Li, Kuirong; Zhou, Wenhui; Gu, Guizhen; Zhou, Shiyi; Zheng, Yuxin; Yu, Shanfa

    2014-10-01

    To study the effects of exposure to different concentrations of carbon disulfide on the neurological signs of workers. Information on workplace and individual carbon disulfide exposure concentrations in a chemical fiber plant was collected from 2004 to 2011, and a total of 3 537 exposed workers were examined for muscle strength and muscle tone, knee reflex, Achilles tendon reflex, limb tremor, sensory function, and tremor ("three tremors"). The chi-square test was used for statistical analysis of abnormal neurological signs. The 8-h time-weighted average exposure concentration in this plant ranged from 0.2 to 41.0 mg/m(3), with a geometric mean of 2.38 mg/m(3). Among all subjects, 1 771 workers were exposed to 0.2-2.5 mg/m(3) ( ≤ 2.5 mg/m(3)), 642 workers to 2.6-4.8 mg/m(3) (< 5.0 mg/m(3)), and the other 1 051 workers to 5.1-41.0 mg/m(3) (> 5.0 mg/m(3)). Across the cumulative exposure groups ( ≤ 10.0, 10.1-20.0, 20.1-30.0, 30.1-40.0, 40.1-50.0, > 50.0 mg/m(3) per year), the detection rates of abnormal knee reflex were 3.0% (31/1 045), 3.7% (21/574), 4.8% (16/331), 3.3% (10/305), 5.9% (11/187), and 6.7% (68/1 022); the detection rates of abnormal Achilles tendon reflex were 2.2% (23/1 045), 3.7% (21/574), 2.7% (9/331), 2.3% (7/305), 2.1% (4/187), and 5.6% (57/1 022); and the detection rates of sensory dysfunction were 0.4% (4/1 045), 0.5% (3/574), 0.6% (2/331), 0.0% (0/305), 2.1% (4/187), and 1.7% (17/1 022); these differences were statistically significant (χ(2) = 19.53, 21.27 and 15.89, all P < 0.01). After stratification by age and gender, only the ≤ 25 years group showed a statistically significant difference in the detection rate of abnormal Achilles tendon reflex across the concentration groups (detection rates at ≤ 2.5, 2.6-5.0, and ≥ 5.0 mg/m(3) were 0.4% (2/511), 1.0% (1/98), and 2.1% (7/327); χ(2) = 5.59, P = 0.045); in the remaining age and gender groups, no statistically significant differences in neurological signs were found across the concentration groups (P > 0.05). Within the exposure range actually encountered by the study subjects, different concentrations of carbon disulfide had no significant effect on most neurological signs for most age and gender groups, except for individual signs in the group below 25 years of age.

  1. The application of the statistical classifying models for signal evaluation of the gas sensors analyzing mold contamination of the building materials

    NASA Astrophysics Data System (ADS)

    Majerek, Dariusz; Guz, Łukasz; Suchorab, Zbigniew; Łagód, Grzegorz; Sobczuk, Henryk

    2017-07-01

    Mold that develops on moistened building barriers is a major cause of Sick Building Syndrome (SBS). Fungal contamination is normally evaluated using standard biological methods, which are time-consuming and require a lot of manual labor. Fungi emit Volatile Organic Compounds (VOCs) that can be detected in indoor air using several detection techniques, e.g., chromatography. VOCs can also be detected using gas sensor arrays. All the array's sensors generate voltage signals that must be analyzed using properly selected statistical methods of interpretation. This work focuses on applying statistical classification models to the evaluation of signals from a gas sensor array analyzing air sampled from the headspace of various building materials at different levels of contamination, as well as from clean reference materials.
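
    A minimal sketch of such a classification model follows, assuming per-sensor voltage features and contamination labels; the synthetic data and the random-forest choice are illustrative assumptions, not necessarily the classifiers the study compared.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_sensors = 8
clean = rng.normal(1.0, 0.1, (60, n_sensors))   # reference material headspace
moldy = rng.normal(1.4, 0.2, (60, n_sensors))   # contaminated material headspace
X = np.vstack([clean, moldy])
y = np.array([0] * 60 + [1] * 60)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())  # cross-validated accuracy
```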

  2. Evaluation of Two Statistical Methods Provides Insights into the Complex Patterns of Alternative Polyadenylation Site Switching

    PubMed Central

    Li, Jie; Li, Rui; You, Leiming; Xu, Anlong; Fu, Yonggui; Huang, Shengfeng

    2015-01-01

    Switching between different alternative polyadenylation (APA) sites plays an important role in the fine-tuning of gene expression. New technologies for the execution of 3’-end enriched RNA-seq allow genome-wide detection of the genes that exhibit significant APA site switching between different samples. Here, we show that the independence test gives better results than the linear trend test in detecting APA site-switching events. Further examination suggests that the discrepancy between these two statistical methods arises from complex APA site-switching events that cannot be represented by a simple change of average 3’-UTR length; in theory, the linear trend test is only effective in detecting such simple changes. We classify the switching events into four switching patterns: two simple patterns (3’-UTR shortening and lengthening) and two complex patterns. By comparing the results of the two statistical methods, we show that complex patterns account for 1/4 of all observed switching events between normal and cancerous human breast cell lines. Because simple and complex switching patterns may convey different biological meanings, they merit separate study. We therefore propose to combine both the independence test and the linear trend test in practice: first, the independence test should be used to detect APA site switching; second, the linear trend test should be invoked to identify simple switching events; and third, the complex switching events that pass independence testing but fail linear trend testing can be identified. PMID:25875641
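
    The contrast between the two tests can be reproduced on a toy count table in which usage shifts from a middle pA site to both flanks, so the average 3'-UTR length barely changes. The trend test here is the standard linear-by-linear association chi-square, M^2 = (N-1)r^2; the counts are invented for illustration.

```python
import numpy as np
from scipy.stats import chi2_contingency, chi2

def linear_trend_test(table):
    """Linear-by-linear association test on a 2 x k table with ordered columns."""
    table = np.asarray(table, float)
    rows = np.arange(table.shape[0])              # group scores
    cols = np.arange(table.shape[1])              # ordered pA-site scores
    n = table.sum()
    pr, pc = table.sum(1) / n, table.sum(0) / n
    mr, mc = (rows * pr).sum(), (cols * pc).sum()
    cov = (table / n * np.outer(rows - mr, cols - mc)).sum()
    sr = np.sqrt(((rows - mr) ** 2 * pr).sum())
    sc = np.sqrt(((cols - mc) ** 2 * pc).sum())
    m2 = (n - 1) * (cov / (sr * sc)) ** 2
    return m2, chi2.sf(m2, df=1)

# A "complex" switch: usage moves from the middle site to both flanks.
table = [[10, 80, 10],
         [40, 20, 40]]
print(chi2_contingency(table)[1])    # independence test: tiny p, switch detected
print(linear_trend_test(table)[1])   # trend test: p near 1, switch missed
```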

  3. Detection of Person Misfit in Computerized Adaptive Tests with Polytomous Items.

    ERIC Educational Resources Information Center

    van Krimpen-Stoop, Edith M. L. A.; Meijer, Rob R.

    2002-01-01

    Compared the nominal and empirical null distributions of the standardized log-likelihood statistic for polytomous items for paper-and-pencil (P&P) and computerized adaptive tests (CATs). Results show that the empirical distribution of the statistic differed from the assumed standard normal distribution for both P&P tests and CATs. Also…

  4. A spatial scan statistic for survival data based on Weibull distribution.

    PubMed

    Bhatt, Vijaya; Tiwari, Neeraj

    2014-05-20

    The spatial scan statistic has been developed as a geographical cluster detection analysis tool for different types of data sets such as Bernoulli, Poisson, ordinal, normal and exponential. We propose a scan statistic for survival data based on the Weibull distribution. It may also be used for other survival distributions, such as the exponential, gamma, and log-normal. The proposed method is applied to the survival data of tuberculosis patients for the years 2004-2005 in the Nainital district of Uttarakhand, India. Simulation studies reveal that the proposed method performs well for different survival distribution functions. Copyright © 2013 John Wiley & Sons, Ltd.

  5. Statistical fingerprinting for malware detection and classification

    DOEpatents

    Prowell, Stacy J.; Rathgeb, Christopher T.

    2015-09-15

    A system detects malware in a computing architecture with an unknown pedigree. The system includes a first computing device having a known pedigree and operating free of malware. The first computing device executes a series of instrumented functions that, when executed, provide a statistical baseline representative of the time it takes a known software application to run on a computing device of known pedigree. A second computing device executes a second series of instrumented functions that, when executed, provides an actual time representative of the time the known software application takes to run on the second computing device. The system detects malware when there is a difference in execution times between the first and the second computing devices.
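
    The timing-fingerprint comparison can be sketched as a two-sample test on measured execution times. The workload and the Kolmogorov-Smirnov test below are illustrative assumptions, not the patented system's exact procedure.

```python
import time
import numpy as np
from scipy.stats import ks_2samp

def timed_runs(fn, n=200):
    """Collect n wall-clock timings of an instrumented function."""
    out = []
    for _ in range(n):
        t0 = time.perf_counter()
        fn()
        out.append(time.perf_counter() - t0)
    return np.array(out)

def workload():
    sum(i * i for i in range(5000))   # stand-in for an instrumented function

baseline = timed_runs(workload)       # collected on the known-pedigree machine
observed = timed_runs(workload)       # collected on the machine under test
stat, p = ks_2samp(baseline, observed)
print("suspicious" if p < 0.01 else "consistent with baseline")
```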

  6. Electron microscopic quantification of collagen fibril diameters in the rabbit medial collateral ligament: a baseline for comparison.

    PubMed

    Frank, C; Bray, D; Rademaker, A; Chrusch, C; Sabiston, P; Bodie, D; Rangayyan, R

    1989-01-01

    To establish a normal baseline for comparison, thirty-one thousand collagen fibril diameters were measured in calibrated transmission electron (TEM) photomicrographs of normal rabbit medial collateral ligaments (MCLs). A new automated method of quantitation was used to statistically compare fibril minimum diameter distributions at one midsubstance location in both MCLs from six animals at 3 months of age (immature) and three animals at 10 months of age (mature). Pooled results demonstrate that rabbit MCLs have statistically different (p less than 0.001) mean minimum diameters at these two ages. Interanimal differences in mean fibril minimum diameters were also significant (p less than 0.001) and varied by 20% to 25% in both mature and immature animals. Finally, there were significant differences (p less than 0.001) in mean diameters and distributions from side to side in all animals. These mean left-to-right differences were less than 10% in all mature animals but as much as 62% in some immature animals. Statistical analysis of these data demonstrates that animal-to-animal comparisons using these protocols require a large number of animals, with appropriate numbers of fibrils measured, to detect small intergroup differences. In experiments that compare left to right ligaments, far fewer animals are required to detect similarly small differences. These results demonstrate the necessity for rigorous control of sampling, an extensive normal baseline and statistically confirmed experimental designs in any TEM comparisons of collagen fibril diameters.
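
    The sample-size point generalizes into a quick power calculation; the standardized effect sizes below are illustrative, not the paper's measured differences.

```python
from statsmodels.stats.power import TTestIndPower

power = TTestIndPower()
# Subjects per group needed at 80% power, alpha = 0.05, for a small (d = 0.2)
# versus a large (d = 0.8) standardized mean difference.
for d in (0.2, 0.8):
    n = power.solve_power(effect_size=d, alpha=0.05, power=0.8,
                          alternative="two-sided")
    print(f"d = {d}: about {n:.0f} subjects per group")
```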

  7. Cross-modality PET/CT and contrast-enhanced CT imaging for pancreatic cancer

    PubMed Central

    Zhang, Jian; Zuo, Chang-Jing; Jia, Ning-Yang; Wang, Jian-Hua; Hu, Sheng-Ping; Yu, Zhong-Fei; Zheng, Yuan; Zhang, An-Yu; Feng, Xiao-Yuan

    2015-01-01

    AIM: To explore the diagnostic value of the cross-modality fusion images provided by positron emission tomography/computed tomography (PET/CT) and contrast-enhanced CT (CECT) for pancreatic cancer (PC). METHODS: Data from 70 patients with pancreatic lesions who underwent CECT and PET/CT examinations at our hospital from August 2010 to October 2012 were analyzed. PET/CECT cross-modality image fusion was performed using TrueD software. The diagnostic efficiencies of PET/CT, CECT and PET/CECT were calculated and compared with each other using a χ2 test. P < 0.05 was considered to indicate statistical significance. RESULTS: Of the total 70 patients, 50 had PC and 20 had benign lesions. The differences in the sensitivity, negative predictive value (NPV), and accuracy between CECT and PET/CECT in detecting PC were statistically significant (P < 0.05 for each). In 15 of the 31 patients with PC who underwent a surgical operation, peripancreatic vessel invasion was verified. The differences in the sensitivity, positive predictive value, NPV, and accuracy of CECT vs PET/CT and PET/CECT vs PET/CT in diagnosing peripancreatic vessel invasion were statistically significant (P < 0.05 for each). In 19 of the 31 patients with PC who underwent a surgical operation, regional lymph node metastasis was verified by postsurgical histology. There was no statistically significant difference among the three methods in detecting regional lymph node metastasis (P > 0.05 for each). In 17 of the 50 patients with PC confirmed by histology or clinical follow-up, distant metastasis was confirmed. The differences in the sensitivity and NPV between CECT and PET/CECT in detecting distant metastasis were statistically significant (P < 0.05 for each). CONCLUSION: Cross-modality image fusion of PET/CT and CECT is a convenient and effective method that can be used to diagnose and stage PC, compensating for the deficiencies of PET/CT and CECT when they are used individually. PMID:25780297

  8. Turbulent/non-turbulent interfaces detected in DNS of incompressible turbulent boundary layers

    NASA Astrophysics Data System (ADS)

    Watanabe, T.; Zhang, X.; Nagata, K.

    2018-03-01

    The turbulent/non-turbulent interface (TNTI) detected in direct numerical simulations is studied for incompressible, temporally developing turbulent boundary layers at momentum thickness Reynolds number Reθ ≈ 2000. The outer edge of the TNTI layer is detected as an isosurface of the vorticity magnitude, with the threshold determined from the dependence of the turbulent volume on the threshold level. The spanwise vorticity magnitude and passive scalar are shown to be good markers of turbulent fluids, where the conditional statistics on the distance from the outer edge of the TNTI layer are almost identical to those obtained with the vorticity magnitude. Significant differences are observed in the conditional statistics between the TNTI detected by the kinetic energy and by the vorticity magnitude. A widely used grid setting determined solely from the wall unit results in an insufficient streamwise resolution in the outer region, the influence of which is observed in the geometry of the TNTI and in the vorticity jump across the TNTI layer. The present results suggest that the grid spacing should be similar in the streamwise and spanwise directions. Comparison of the TNTI layer among different flows requires appropriate normalization of the conditional statistics. Reference quantities of the turbulence near the TNTI layer are obtained by averaging over turbulent fluids in the intermittent region. The conditional statistics normalized by these reference turbulence characteristics show good quantitative agreement between the turbulent boundary layer and a planar jet when they are plotted against the distance from the outer edge of the TNTI layer divided by the Kolmogorov scale defined for turbulent fluids in the intermittent region.

  9. Damages detection in cylindrical metallic specimens by means of statistical baseline models and updated daily temperature profiles

    NASA Astrophysics Data System (ADS)

    Villamizar-Mejia, Rodolfo; Mujica-Delgado, Luis-Eduardo; Ruiz-Ordóñez, Magda-Liliana; Camacho-Navarro, Jhonatan; Moreno-Beltrán, Gustavo

    2017-05-01

    In previous works, damage detection in metallic specimens exposed to temperature changes has been achieved by using a statistical baseline model based on Principal Component Analysis (PCA) and the piezodiagnostics principle, taking the temperature effect into account by augmenting the baseline model or by using several baseline models according to the current temperature. In this paper a new approach is presented, where damage detection is based on a new index that combines the Q and T2 statistical indices with current temperature measurements. Experimental tests were performed on a carbon-steel pipe 1 m long and 1.5 inches in diameter, instrumented with piezodevices acting as actuators or sensors. A PCA baseline model was obtained at a temperature of 21 °C, and the T2 and Q statistical indices were then computed over a 24 h temperature profile. Masses added at different points of the pipe between sensor and actuator were used as damage. By using the combined index, the temperature contribution can be separated and a better graphical differentiation of damaged from undamaged cases can be obtained.
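
    A sketch of the T2 and Q computation from a PCA baseline follows; the way temperature is folded into a single combined index at the end is an assumed stand-in for the paper's exact rule, and all data are synthetic.

```python
import numpy as np
from sklearn.decomposition import PCA

def t2_q(model, X):
    """Hotelling T^2 and Q (SPE) indices of records X against a PCA baseline."""
    scores = model.transform(X)
    t2 = ((scores ** 2) / model.explained_variance_).sum(axis=1)
    resid = X - model.inverse_transform(scores)
    q = (resid ** 2).sum(axis=1)
    return t2, q

baseline = np.random.randn(200, 50)           # piezo records at reference temperature
model = PCA(n_components=4).fit(baseline)
new = np.random.randn(20, 50) + 0.5           # records from a (possibly damaged) state
t2, q = t2_q(model, new)
temp_dev = np.abs(np.full(20, 27.0) - 21.0) / 10.0    # current vs reference temperature
combined = q / q.mean() + t2 / t2.mean() - temp_dev   # assumed combination rule
print(combined)
```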

  10. Statistical methods for convergence detection of multi-objective evolutionary algorithms.

    PubMed

    Trautmann, H; Wagner, T; Naujoks, B; Preuss, M; Mehnen, J

    2009-01-01

    In this paper, two approaches for estimating the generation in which a multi-objective evolutionary algorithm (MOEA) shows statistically significant signs of convergence are introduced. A set-based perspective is taken where convergence is measured by performance indicators. The proposed techniques fulfill the requirements of proper statistical assessment on the one hand and efficient optimisation for real-world problems on the other hand. The first approach accounts for the stochastic nature of the MOEA by repeating the optimisation runs for increasing generation numbers and analysing the performance indicators using statistical tools. This technique results in a very robust offline procedure. Moreover, an online convergence detection method is introduced as well. This method automatically stops the MOEA when either the variance of the performance indicators falls below a specified threshold or a stagnation of their overall trend is detected. Both methods are analysed and compared for two MOEAs on different classes of benchmark functions. It is shown that the methods successfully operate on all stated problems, requiring fewer function evaluations while preserving good approximation quality.
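
    The online stopping rule can be sketched as follows, with synthetic indicator values standing in for, e.g., hypervolume measurements; the window size and tolerances are illustrative assumptions.

```python
import numpy as np

def converged(history, window=10, var_tol=1e-6, slope_tol=1e-4):
    """Stop when the indicator's moving-window variance is below a threshold
    or its linear trend over the window has stagnated."""
    if len(history) < window:
        return False
    recent = np.asarray(history[-window:])
    slope = np.polyfit(np.arange(window), recent, 1)[0]
    return recent.var() < var_tol or abs(slope) < slope_tol

history = []
for gen in range(200):
    # synthetic indicator: rises quickly, then plateaus with small noise
    indicator = 1.0 - np.exp(-gen / 20) + np.random.normal(0, 1e-4)
    history.append(indicator)
    if converged(history):
        print("stop at generation", gen)
        break
```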

  11. ON COMPUTING UPPER LIMITS TO SOURCE INTENSITIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kashyap, Vinay L.; Siemiginowska, Aneta; Van Dyk, David A.

    2010-08-10

    A common problem in astrophysics is determining how bright a source could be and still not be detected in an observation. Despite the simplicity with which the problem can be stated, the solution involves complicated statistical issues that require careful analysis. In contrast to the more familiar confidence bound, this concept has never been formally analyzed, leading to a great variety of often ad hoc solutions. Here we formulate and describe the problem in a self-consistent manner. Detection significance is usually defined by the acceptable proportion of false positives (background fluctuations that are claimed as detections, or Type I error), and we invoke the complementary concept of false negatives (real sources that go undetected, or Type II error), based on the statistical power of a test, to compute an upper limit to the detectable source intensity. To determine the minimum intensity that a source must have for it to be detected, we first define a detection threshold and then compute the probabilities of detecting sources of various intensities at the given threshold. The intensity that corresponds to the specified Type II error probability defines that minimum intensity and is identified as the upper limit. Thus, an upper limit is a characteristic of the detection procedure rather than the strength of any particular source. It should not be confused with confidence intervals or other estimates of source intensity. This is particularly important given the large number of catalogs that are being generated from increasingly sensitive surveys. We discuss, with examples, the differences between these upper limits and confidence bounds. Both measures are useful quantities that should be reported in order to extract the most science from catalogs, though they answer different statistical questions: an upper bound describes an inference range on the source intensity, while an upper limit calibrates the detection process. We provide a recipe for computing upper limits that applies to all detection algorithms.
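
    The recipe can be made concrete for Poisson counts with a known background b: choose the smallest detection threshold whose Type I error is at most alpha, then find the smallest source intensity that is detected with probability at least 1 - beta at that threshold. The values below are illustrative.

```python
from scipy.stats import poisson

def upper_limit(b, alpha=0.05, beta=0.5):
    # detection threshold: smallest c with P(N >= c | b) <= alpha
    c = 0
    while poisson.sf(c - 1, b) > alpha:   # sf(c-1, mu) = P(N >= c)
        c += 1
    # upper limit: smallest s with P(N >= c | s + b) >= 1 - beta
    s = 0.0
    while poisson.sf(c - 1, s + b) < 1 - beta:
        s += 0.01
    return c, s

print(upper_limit(b=3.0))   # threshold counts and the corresponding upper limit
```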

  12. Coliphages as indicators of enteroviruses.

    PubMed Central

    Stetler, R E

    1984-01-01

    Coliphages were monitored in conjunction with indicator bacteria and enteroviruses in a drinking-water plant modified to reduce trihalomethane production. Coliphages could be detected in the source water by direct inoculation, and sufficient coliphages were detected in enterovirus concentrates to permit following the coliphage levels through different water treatment processes. The recovery efficiency by different filter types ranged from 1 to 53%. Statistical analysis of the data indicated that enterovirus isolates were better correlated with coliphages than with total coliforms, fecal coliforms, fecal streptococci, or standard plate count organisms. Coliphages were not detected in finished water. PMID:6093694

  13. Cracking the Language Code: Neural Mechanisms Underlying Speech Parsing

    PubMed Central

    McNealy, Kristin; Mazziotta, John C.; Dapretto, Mirella

    2013-01-01

    Word segmentation, detecting word boundaries in continuous speech, is a critical aspect of language learning. Previous research in infants and adults demonstrated that a stream of speech can be readily segmented based solely on the statistical and speech cues afforded by the input. Using functional magnetic resonance imaging (fMRI), the neural substrate of word segmentation was examined on-line as participants listened to three streams of concatenated syllables, containing either statistical regularities alone, statistical regularities and speech cues, or no cues. Despite the participants’ inability to explicitly detect differences between the speech streams, neural activity differed significantly across conditions, with left-lateralized signal increases in temporal cortices observed only when participants listened to streams containing statistical regularities, particularly the stream containing speech cues. In a second fMRI study, designed to verify that word segmentation had implicitly taken place, participants listened to trisyllabic combinations that occurred with different frequencies in the streams of speech they just heard (“words,” 45 times; “partwords,” 15 times; “nonwords,” once). Reliably greater activity in left inferior and middle frontal gyri was observed when comparing words with partwords and, to a lesser extent, when comparing partwords with nonwords. Activity in these regions, taken to index the implicit detection of word boundaries, was positively correlated with participants’ rapid auditory processing skills. These findings provide a neural signature of on-line word segmentation in the mature brain and an initial model with which to study developmental changes in the neural architecture involved in processing speech cues during language learning. PMID:16855090

  14. diffHic: a Bioconductor package to detect differential genomic interactions in Hi-C data.

    PubMed

    Lun, Aaron T L; Smyth, Gordon K

    2015-08-19

    Chromatin conformation capture with high-throughput sequencing (Hi-C) is a technique that measures the in vivo intensity of interactions between all pairs of loci in the genome. Most conventional analyses of Hi-C data focus on the detection of statistically significant interactions. However, an alternative strategy involves identifying significant changes in the interaction intensity (i.e., differential interactions) between two or more biological conditions. This is more statistically rigorous and may provide more biologically relevant results. Here, we present the diffHic software package for the detection of differential interactions from Hi-C data. diffHic provides methods for read pair alignment and processing, counting into bin pairs, filtering out low-abundance events and normalization of trended or CNV-driven biases. It uses the statistical framework of the edgeR package to model biological variability and to test for significant differences between conditions. Several options for the visualization of results are also included. The use of diffHic is demonstrated with real Hi-C data sets. Performance against existing methods is also evaluated with simulated data. On real data, diffHic is able to successfully detect interactions with significant differences in intensity between biological conditions. It also compares favourably to existing software tools on simulated data sets. These results suggest that diffHic is a viable approach for differential analyses of Hi-C data.

  15. Adaptive Locally Optimum Processing for Interference Suppression from Communication and Undersea Surveillance Signals

    DTIC Science & Technology

    1994-07-01

    1993. "Analysis of the 1730-1732. Track - Before - Detect Approach to Target Detection using Pixel Statistics", to appear in IEEE Transactions Scholz, J...large surveillance arrays. One approach to combining energy in different spatial cells is track - before - detect . References to examples appear in the next... track - before - detect problem. The results obtained are not expected to depend strongly on model details. In particular, the structure of the tracking

  16. Radiologists' confidence in detecting abnormalities on chest images and their subjective judgments of image quality

    NASA Astrophysics Data System (ADS)

    King, Jill L.; Gur, David; Rockette, Howard E.; Curtin, Hugh D.; Obuchowski, Nancy A.; Thaete, F. Leland; Britton, Cynthia A.; Metz, Charles E.

    1991-07-01

    The relationship between subjective judgments of image quality for the performance of specific detection tasks and radiologists' confidence level in arriving at correct diagnoses was investigated in two studies in which 12 readers, using a total of three different display environments, interpreted a series of 300 PA chest images. The modalities used were conventional films, laser-printed films, and high-resolution CRT display of digitized images. For the detection of interstitial disease, nodules, and pneumothoraces, there was no statistically significant correlation (Spearman rho) between subjective ratings of quality and radiologists' confidence in detecting these abnormalities. However, in each study, for all modalities and all readers but one, a small but statistically significant correlation was found between the radiologists' ability to correctly and confidently rule out interstitial disease and their subjective ratings of image quality.

  17. Detecting response of Douglas-fir plantations to urea fertilizer at three locations in the Oregon Coast Range.

    Treesearch

    Richard E. Miller; Jim Smith; Harry Anderson

    2001-01-01

    Fertilizer trials in coast Douglas-fir (Pseudotsuga menziesii var. menziesii (Mirb.) Franco) in the Oregon Coast Range usually indicate small and statistically nonsignificant response to nitrogen (N) fertilizers. Inherently weak experimental designs of past trials could make them too insensitive to detect growth differences...

  18. Aberrant Learning Achievement Detection Based on Person-Fit Statistics in Personalized e-Learning Systems

    ERIC Educational Resources Information Center

    Liu, Ming-Tsung; Yu, Pao-Ta

    2011-01-01

    A personalized e-learning service provides learning content to fit learners' individual differences. Learning achievements are influenced by cognitive as well as non-cognitive factors such as mood, motivation, interest, and personal styles. This paper proposes the Learning Caution Indexes (LCI) to detect aberrant learning patterns. The philosophy…

  19. Brain fingerprinting field studies comparing P300-MERMER and P300 brainwave responses in the detection of concealed information.

    PubMed

    Farwell, Lawrence A; Richardson, Drew C; Richardson, Graham M

    2013-08-01

    Brain fingerprinting detects concealed information stored in the brain by measuring brainwave responses. We compared P300 and P300-MERMER event-related brain potentials for error rate/accuracy and statistical confidence in four field/real-life studies. 76 tests detected the presence or absence of information regarding (1) real-life events including felony crimes; (2) real crimes with substantial consequences (either a judicial outcome, i.e., evidence admitted in court, or a $100,000 reward for beating the test); (3) knowledge unique to FBI agents; and (4) knowledge unique to explosives (EOD/IED) experts. With both P300 and P300-MERMER, the error rate was 0%: determinations were 100% accurate, with no false negatives, no false positives, and no indeterminates. Countermeasures had no effect. Median statistical confidence for determinations was 99.9% with P300-MERMER and 99.6% with P300. Brain fingerprinting methods and scientific standards for laboratory and field applications are discussed. Major differences in methods that produce different results are identified. Markedly different methods in other studies have produced over 10 times higher error rates and markedly lower statistical confidences than those of the present studies, our previous studies, and independent replications. Data support the hypothesis that accuracy, reliability, and validity depend on following the brain fingerprinting scientific standards outlined herein.

  20. Temporal correlation measurements of pulsed dual CO2 lidar returns. [for atmospheric pollution detection

    NASA Technical Reports Server (NTRS)

    Menyuk, N.; Killinger, D. K.

    1981-01-01

    A pulsed dual-laser direct-detection differential-absorption lidar DIAL system, operating near 10.6 microns, is used to measure the temporal correlation and statistical properties of backscattered returns from specular and diffuse topographic targets. Results show that atmospheric-turbulence fluctuations can effectively be frozen for pulse separation times on the order of 1-3 msec or less. The diffuse target returns, however, yielded a much lower correlation than that obtained with the specular targets; this being due to uncorrelated system noise effects and different statistics for the two types of target returns.

  1. A comparison of change detection methods using multispectral scanner data

    USGS Publications Warehouse

    Seevers, Paul M.; Jones, Brenda K.; Qiu, Zhicheng; Liu, Yutong

    1994-01-01

    Change detection methods were investigated as a cooperative activity between the U.S. Geological Survey and the National Bureau of Surveying and Mapping, People's Republic of China. Subtraction of band 2, band 3, normalized difference vegetation index, and tasseled cap bands 1 and 2 data from two multispectral scanner images were tested using two sites in the United States and one in the People's Republic of China. A new statistical method also was tested. Band 2 subtraction gives the best results for detecting change from vegetative cover to urban development. The statistical method identifies areas that have changed and uses a fast classification algorithm to classify the original data of the changed areas by land cover type present for each image date.
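
    Band subtraction followed by simple statistical thresholding can be sketched in a few lines; the z-score rule and the synthetic arrays below are illustrative assumptions, not the study's exact procedure.

```python
import numpy as np

def band_change(band_t1, band_t2, k=2.0):
    """Flag pixels whose band difference deviates from the scene mean by
    more than k standard deviations."""
    diff = band_t2.astype(float) - band_t1.astype(float)
    z = (diff - diff.mean()) / diff.std()
    return np.abs(z) > k

t1 = np.random.normal(50, 5, (100, 100))
t2 = t1.copy()
t2[40:60, 40:60] += 30    # simulated vegetative-cover-to-urban change
print(band_change(t1, t2).sum(), "changed pixels")
```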

  2. Kisspeptin levels in idiopathic hypogonadotropic hypogonadism diagnosed male patients and its relation with glucose-insulin dynamic.

    PubMed

    Öztin, Hasan; Çağıltay, Eylem; Çağlayan, Sinan; Kaplan, Mustafa; Akpak, Yaşam Kemal; Karaca, Nilay; Tığlıoğlu, Mesut

    2016-12-01

    Male hypogonadism is defined as the deficiency of testosterone or of sperm production by the testicles, or of both. The cause of hypogonadism may be primary (testicular) or secondary (hypothalamo-hypophyseal). In hypogonadotropic hypogonadism (HH), gonadotropic hormone deficiency results from hypothalamic or hypophyseal causes. Gonadotropin-releasing hormone (GnRH) is an important stimulant of the release of follicle-stimulating hormone (FSH) and, mainly, luteinizing hormone (LH). GnRH secretion is under the influence of many hormonal and stimulating factors. Kisspeptin is present in many parts of the body, mostly in the hypothalamic anteroventral periventricular nucleus and arcuate nucleus. Kisspeptin has a suppressive effect on the metastasis of many tumors, such as breast cancer and malignant melanoma, and is called "metastin" for this reason. Kisspeptin is a strong stimulant of GnRH. In idiopathic hypogonadotropic hypogonadism (IHH), the etiology of the gonadotropic hormone release deficiency cannot be clearly identified. A total of 30 male patients over 30 years of age diagnosed with hypogonadotropic hypogonadism who presented to the Haydarpasa Education Hospital Endocrinology and Metabolic Diseases Service were included in the study. Compared with a control group, the effect of kisspeptin in male patients with hypogonadotropic hypogonadism, and on the insulin resistance that develops in hypogonadism patients, was investigated. A statistically significant difference was detected between the mean kisspeptin levels of the groups (p < 0.01): kisspeptin levels in the patient group were significantly higher. No statistically significant relation was detected between kisspeptin and LH/FSH levels. Although a weak positive relation was detected between the kisspeptin levels of the patient group and homeostasis model assessment of insulin resistance (HOMA-IR) values, this relation was statistically insignificant. When the patient and control groups were compared for HOMA-IR, no statistically significant difference was detected. The higher kisspeptin levels in the patient group compared with the control group suggest that there may be GPR54 resistance or a defect in the GnRH neuronal transfer pathway. It is considered that kisspeptin is one of the causes of hypogonadotropic hypogonadism and has little effect on insulin resistance.

  3. Shilling Attacks Detection in Recommender Systems Based on Target Item Analysis

    PubMed Central

    Zhou, Wei; Wen, Junhao; Koh, Yun Sing; Xiong, Qingyu; Gao, Min; Dobbie, Gillian; Alam, Shafiq

    2015-01-01

    Recommender systems are highly vulnerable to shilling attacks, both by individuals and groups. Attackers who introduce biased ratings in order to affect recommendations have been shown to negatively affect collaborative filtering (CF) algorithms. Previous research focuses only on the differences between genuine profiles and attack profiles, ignoring the group characteristics in attack profiles. In this paper, we study the use of statistical metrics to detect the rating patterns of attackers and the group characteristics in attack profiles. A further issue is that most existing detection methods are model-specific. Two metrics, Rating Deviation from Mean Agreement (RDMA) and Degree of Similarity with Top Neighbors (DegSim), are used for analyzing rating patterns between malicious profiles and genuine profiles in attack models. Building upon this, we also propose and evaluate a detection structure called RD-TIA for detecting shilling attacks in recommender systems using a statistical approach. In order to detect more complicated attack models, we propose a novel metric called DegSim’ based on DegSim. The experimental results show that our detection model based on target item analysis is an effective approach for detecting shilling attacks. PMID:26222882
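
    One common formulation of the two metrics can be sketched as follows, assuming a user-by-item ratings matrix with NaN for missing entries; exact definitions vary across papers, so treat this as illustrative rather than the paper's implementation.

```python
import numpy as np

def rdma(R):
    """Rating Deviation from Mean Agreement per user; deviations from each
    item's mean rating are damped by the item's number of ratings."""
    item_mean = np.nanmean(R, axis=0)
    item_count = np.sum(~np.isnan(R), axis=0)
    dev = np.abs(R - item_mean) / item_count
    return np.nanmean(dev, axis=1)

def degsim(R, k=3):
    """Average Pearson similarity with the top-k most similar users."""
    filled = np.where(np.isnan(R), np.nanmean(R), R)  # crude imputation for the sketch
    C = np.corrcoef(filled)
    np.fill_diagonal(C, -np.inf)                      # exclude self-similarity
    return np.sort(C, axis=1)[:, -k:].mean(axis=1)

rng = np.random.default_rng(0)
R = rng.uniform(1, 5, (20, 30))
R[rng.random((20, 30)) < 0.3] = np.nan
R[:3, :10] = 5.0    # crude attack profiles pushing the first items to the maximum
print(rdma(R)[:5])
print(degsim(R)[:5])
```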

  4. A prospective study: Is handheld micropower impulse radar technology (Pneumoscan) a promising method to detect pneumothorax?

    PubMed

    Hocagil, Hilal; Hocagil, Abdullah Cüneyt; Karacabey, Sinan; Akkaya, Tuğba; Şimşek, Gözde; Sanrı, Erkman

    2015-09-01

    This study aimed to evaluate the effectiveness of the Pneumoscan, which works with micropower impulse radar (MIR) technology, in diagnosing pneumothorax (PTX) in the emergency department. Patients with suspected PTX and an indication for thoracic computed tomography (CT) were included in the study. Thorax CT findings were compared with the results of the Pneumoscan. Chi-square and Fisher's exact tests were used for categorical variables. One hundred and fifteen patients were included in the study group; twelve patients presented with PTX diagnosed by CT, 10 of whom were detected by the Pneumoscan. Thirty-six true negative, sixty-seven false positive, and two false negative results were obtained, giving an overall sensitivity of 83.3% and specificity of 35.0% for the Pneumoscan. There was no statistically significant difference between the effectiveness of the Pneumoscan and CT in the detection of PTX (p=0.33). There was no difference between the size of PTX diagnosed by CT and PTX diagnosed by the Pneumoscan (p=0.47). There was no statistically significant difference between the Pneumoscan and CT in detecting the localisation of the PTX (p=1.00). For the 10 cases diagnosed by the Pneumoscan, mean chest wall thickness was 50.3 mm, while mean chest wall thickness for the two false negatives was 56.5 mm; however, no statistically significant difference was found between chest wall thickness and the effectiveness of the Pneumoscan in detecting PTX (p=0.77). Among the sixty-seven false positives, 46.3% had additional findings such as bronchiectasis, pulmonary consolidation, pulmonary edema or pulmonary tumor on the CT reading. The relationship between such additional CT findings and the effectiveness of the Pneumoscan in detecting PTX was investigated, and no significant difference was found (p=0.472). Using the Pneumoscan to detect PTX is controversial since the device has a high false positive ratio, and a false positive diagnosis can cause unjustified chest tube insertion. In addition, the device failed to show the size of the PTX and therefore did not aid in determining treatment and prognosis, in contrast to traditional diagnostic methods. The findings could not demonstrate that the device is efficient in emergency care. Further studies and increasing experience may change this conclusion in the coming years.

  5. Efficient strategy for detecting gene × gene joint action and its application in schizophrenia.

    PubMed

    Won, Sungho; Kwon, Min-Seok; Mattheisen, Manuel; Park, Suyeon; Park, Changsoon; Kihara, Daisuke; Cichon, Sven; Ophoff, Roel; Nöthen, Markus M; Rietschel, Marcella; Baur, Max; Uitterlinden, Andre G; Hofmann, A; Lange, Christoph

    2014-01-01

    We propose a new approach to detect gene × gene joint action in genome-wide association studies (GWASs) with case-control designs. This approach offers an exhaustive search for all two-way joint action (including, as a special case, single-gene action) that is computationally feasible at the genome-wide level and has reasonable statistical power under most genetic models. We found that the presence of any gene × gene joint action may imply differences in three types of genetic components: the minor allele frequencies and the amounts of Hardy-Weinberg disequilibrium may differ between cases and controls, and the degree of linkage disequilibrium between the two genetic loci may differ between cases and controls. Using Fisher's method, it is possible to combine these different sources of genetic information in an overall test for detecting gene × gene joint action. The proposed statistical analysis is efficient and its simplicity makes it applicable to GWASs. In the current study, we applied the proposed approach to a GWAS on schizophrenia and found several potential gene × gene interactions. Our application illustrates the practical advantage of the proposed method. © 2013 WILEY PERIODICALS, INC.
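
    Fisher's method itself is compact: with k component p-values, X^2 = -2 * sum(ln p_i) follows a chi-square distribution with 2k degrees of freedom under the null. The p-values below are placeholders for the MAF, HWD, and LD comparisons, not results from the paper.

```python
from scipy.stats import combine_pvalues

# Placeholder p-values for the three component comparisons (MAF, HWD, LD).
p_values = [0.04, 0.20, 0.01]
stat, p = combine_pvalues(p_values, method="fisher")
print(f"X^2 = {stat:.2f} on {2 * len(p_values)} df, combined p = {p:.4f}")
```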

  6. A Third-Generation Adaptive Statistical Iterative Reconstruction Technique: Phantom Study of Image Noise, Spatial Resolution, Lesion Detectability, and Dose Reduction Potential.

    PubMed

    Euler, André; Solomon, Justin; Marin, Daniele; Nelson, Rendon C; Samei, Ehsan

    2018-06-01

    The purpose of this study was to assess image noise, spatial resolution, lesion detectability, and the dose reduction potential of a proprietary third-generation adaptive statistical iterative reconstruction (ASIR-V) technique. A phantom representing five different body sizes (12-37 cm) and a contrast-detail phantom containing lesions of five low-contrast levels (5-20 HU) and three sizes (2-6 mm) were deployed. Both phantoms were scanned on a 256-MDCT scanner at six different radiation doses (1.25-10 mGy). Images were reconstructed with filtered back projection (FBP), ASIR-V with 50% blending with FBP (ASIR-V 50%), and ASIR-V without blending (ASIR-V 100%). In the first phantom, noise properties were assessed by noise power spectrum analysis. Spatial resolution properties were measured by use of task transfer functions for objects of different contrasts. Noise magnitude, noise texture, and resolution were compared between the three groups. In the second phantom, low-contrast detectability was assessed by nine human readers independently for each condition. The dose reduction potential of ASIR-V was estimated on the basis of a generalized linear statistical regression model. On average, image noise was reduced 37.3% with ASIR-V 50% and 71.5% with ASIR-V 100% compared with FBP. ASIR-V shifted the noise power spectrum toward lower frequencies compared with FBP. The spatial resolution of ASIR-V was equivalent or slightly superior to that of FBP, except for the low-contrast object, which had lower resolution. Lesion detection significantly increased with both ASIR-V levels (p = 0.001), with an estimated radiation dose reduction potential of 15% ± 5% (SD) for ASIR-V 50% and 31% ± 9% for ASIR-V 100%. ASIR-V reduced image noise and improved lesion detection compared with FBP and had potential for radiation dose reduction while preserving low-contrast detectability.

  7. Cluster detection methods applied to the Upper Cape Cod cancer data.

    PubMed

    Ozonoff, Al; Webster, Thomas; Vieira, Veronica; Weinberg, Janice; Ozonoff, David; Aschengrau, Ann

    2005-09-15

    A variety of statistical methods have been suggested to assess the degree and/or the location of spatial clustering of disease cases. However, there is relatively little in the literature devoted to comparison and critique of different methods. Most of the available comparative studies rely on simulated data rather than real data sets. We have chosen three methods currently used for examining spatial disease patterns: the M-statistic of Bonetti and Pagano; the Generalized Additive Model (GAM) method as applied by Webster; and Kulldorff's spatial scan statistic. We apply these statistics to analyze breast cancer data from the Upper Cape Cancer Incidence Study using three different latency assumptions. The three different latency assumptions produced three different spatial patterns of cases and controls. For 20 year latency, all three methods generally concur. However, for the 15 year latency and no-latency assumptions, the methods produce different results when testing for global clustering. Comparative analyses of real data sets by different statistical methods provide insight into directions for further research. We suggest a research program designed around examining real data sets to guide focused investigation of relevant features using simulated data, for the purpose of understanding how to interpret statistical methods applied to epidemiological data with a spatial component.

  8. [Optimized application of nested PCR method for detection of malaria].

    PubMed

    Yao-Guang, Z; Li, J; Zhen-Yu, W; Li, C

    2017-04-28

    Objective To optimize the application of the nested PCR method for the detection of malaria in routine practice, so as to improve the efficiency of malaria detection. Methods A PCR premix, internal primers for further amplification, and newly designed primers targeting the two Plasmodium ovale subspecies were employed to optimize the reaction system, reaction conditions, and P. ovale-specific primers of the routine nested PCR. The specificity and sensitivity of the optimized method were then analyzed. Positive malaria blood samples and clinical examination samples were tested in parallel by the routine nested PCR and the optimized method, and the detection results were compared. Results The optimized method showed good specificity, and its sensitivity reached the pg to fg level. When the two methods were applied simultaneously to the same positive malarial blood samples, the PCR products did not differ significantly, but with the optimized method non-specific amplification was clearly reduced, the detection rate of the P. ovale subspecies improved, and the overall specificity increased. In testing 111 malarial blood samples, the sensitivity and specificity of the routine nested PCR were 94.57% and 86.96%, respectively, while those of the optimized method were both 93.48%; the difference in sensitivity between the two methods was not statistically significant (P > 0.05), but the difference in specificity was (P < 0.05). Conclusion Relative to the routine nested PCR, the optimized PCR improves specificity without reducing sensitivity, and it also saves cost and increases the efficiency of malaria detection by requiring fewer experimental steps.

  9. Structural damage detection based on stochastic subspace identification and statistical pattern recognition: I. Theory

    NASA Astrophysics Data System (ADS)

    Ren, W. X.; Lin, Y. Q.; Fang, S. E.

    2011-11-01

    One of the key issues in vibration-based structural health monitoring is to extract the damage-sensitive but environment-insensitive features from sampled dynamic response measurements and to carry out the statistical analysis of these features for structural damage detection. A new damage feature is proposed in this paper by using the system matrices of the forward innovation model based on the covariance-driven stochastic subspace identification of a vibrating system. To overcome the variations of the system matrices, a non-singularity transposition matrix is introduced so that the system matrices are normalized to their standard forms. For reducing the effects of modeling errors, noise and environmental variations on measured structural responses, a statistical pattern recognition paradigm is incorporated into the proposed method. The Mahalanobis and Euclidean distance decision functions of the damage feature vector are adopted by defining a statistics-based damage index. The proposed structural damage detection method is verified against one numerical signal and two numerical beams. It is demonstrated that the proposed statistics-based damage index is sensitive to damage and shows some robustness to the noise and false estimation of the system ranks. The method is capable of locating damage of the beam structures under different types of excitations. The robustness of the proposed damage detection method to the variations in environmental temperature is further validated in a companion paper by a reinforced concrete beam tested in the laboratory and a full-scale arch bridge tested in the field.
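
    A minimal sketch of such a statistics-based damage index follows, using the Mahalanobis distance of a feature vector from its healthy-state distribution; the feature vectors here are synthetic stand-ins for the normalized system-matrix features described above.

```python
import numpy as np

def damage_index(healthy_feats, new_feat):
    """Mahalanobis distance of a new feature vector from the healthy-state
    feature distribution; large values indicate possible damage."""
    mu = healthy_feats.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(healthy_feats, rowvar=False))
    d = new_feat - mu
    return float(np.sqrt(d @ cov_inv @ d))

healthy = np.random.randn(300, 6)    # feature vectors from the undamaged state
# A threshold would in practice come from healthy-state quantiles.
print(damage_index(healthy, np.random.randn(6)))         # small: undamaged
print(damage_index(healthy, np.random.randn(6) + 3.0))   # large: flagged as damage
```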

  10. Detecting correlation changes in multivariate time series: A comparison of four non-parametric change point detection methods.

    PubMed

    Cabrieto, Jedelyn; Tuerlinckx, Francis; Kuppens, Peter; Grassmann, Mariel; Ceulemans, Eva

    2017-06-01

    Change point detection in multivariate time series is a complex task since, besides the mean, the correlation structure of the monitored variables may also alter when change occurs. DeCon was recently developed to detect such changes in mean and/or correlation by combining a moving-windows approach and robust PCA. However, several other methods have been proposed in the literature that employ other non-parametric tools: E-divisive, Multirank, and KCP. Since these methods use different statistical approaches, two issues need to be tackled. First, applied researchers may find it hard to appraise the differences between the methods. Second, a direct comparison of the relative performance of all these methods for capturing change points signaling correlation changes is still lacking. Therefore, we present the basic principles behind DeCon, E-divisive, Multirank, and KCP and the corresponding algorithms, to make them more accessible to readers. We further compared their performance through extensive simulations using the settings of Bulteel et al. (Biological Psychology, 98 (1), 29-42, 2014), implying changes in mean and in correlation structure, and those of Matteson and James (Journal of the American Statistical Association, 109 (505), 334-345, 2014), implying different numbers of (noise) variables. KCP emerged as the best method in almost all settings. However, in the case of more than two noise variables, only DeCon performed adequately in detecting correlation changes.
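
    In the spirit of the windowed approaches compared here, a bare-bones moving-window detector for correlation changes can be sketched as follows; this illustrates the task itself and is not an implementation of DeCon, E-divisive, Multirank, or KCP. Window length and jump tolerance are assumptions.

```python
import numpy as np

def correlation_change_points(X, window=50, jump_tol=0.4):
    """Flag time points where the mean pairwise correlation of the trailing
    window differs sharply from the window half a length earlier."""
    iu = np.triu_indices(X.shape[1], k=1)
    cors = np.array([np.corrcoef(X[t - window:t], rowvar=False)[iu].mean()
                     for t in range(window, len(X))])
    lag = window // 2
    jumps = np.abs(cors[lag:] - cors[:-lag])
    return np.where(jumps > jump_tol)[0] + window + lag

rng = np.random.default_rng(2)
a = rng.normal(size=(300, 1))
first = np.hstack([a, a + rng.normal(0, 0.3, (300, 1))])  # strongly correlated
second = rng.normal(size=(300, 2))                        # uncorrelated
X = np.vstack([first, second])
print(correlation_change_points(X))   # flags points around the true change at t = 300
```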

  11. Detecting subtle hydrochemical anomalies with multivariate statistics: an example from homogeneous groundwaters in the Great Artesian Basin, Australia

    NASA Astrophysics Data System (ADS)

    O'Shea, Bethany; Jankowski, Jerzy

    2006-12-01

    The major ion composition of Great Artesian Basin groundwater in the lower Namoi River valley is relatively homogeneous. Traditional graphical techniques have been combined with multivariate statistical methods to determine whether subtle differences in the chemical composition of these waters can be delineated. Hierarchical cluster analysis and principal components analysis were successful in delineating minor variations within the groundwaters of the study area that were not visually identified by the graphical techniques applied. Hydrochemical interpretation allowed geochemical processes to be identified in each statistically defined water type and illustrated how these groundwaters differ from one another. Three main geochemical processes were identified in the groundwaters: ion exchange, precipitation, and mixing between waters from different sources. Both statistical methods delineated an anomalous sample suspected of being influenced by magmatic CO2 input. The use of statistical methods to complement traditional graphical techniques is emphasized for all investigations of waters that appear homogeneous.
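
    The workflow (standardization, hierarchical clustering, PCA) can be sketched on a synthetic table of major-ion concentrations; the ion list, cluster count, and the injected anomaly are illustrative assumptions.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

ions = ["Na", "K", "Ca", "Mg", "Cl", "HCO3", "SO4"]
X = np.random.lognormal(mean=1.0, sigma=0.1, size=(50, len(ions)))
X[10] *= 1.5                             # one subtly anomalous sample

Z = StandardScaler().fit_transform(X)    # standardize before HCA and PCA
clusters = fcluster(linkage(Z, method="ward"), t=3, criterion="maxclust")
scores = PCA(n_components=2).fit_transform(Z)
print(clusters[10], scores[10])          # inspect the anomaly's cluster and PC scores
```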

  12. A Different Shade Of Blue: An Evaluation Of The Civilian Detective Concept And Its Impact On Police Capabilities

    DTIC Science & Technology

    2016-03-01

    statistics from the California State Controller’s website, the Register reported that while inflation rose 27 percent from years 2003 to 2013, spending on...Recruitment and Retention for the New Millennium, 7. 24 Brian Reaves, Hiring and Retention of State and Local Law Enforcement Officers, 2008—Statistical...Personnel, Policies, and Practices (NCJ 248677) (Washington, DC: Bureau of Justice Statistics, 2015), 3, http://www.bjs.gov/index.cfm?ty=pbdetail&iid

  13. Relationship between perceptual learning in speech and statistical learning in younger and older adults

    PubMed Central

    Neger, Thordis M.; Rietveld, Toni; Janse, Esther

    2014-01-01

    Within a few sentences, listeners learn to understand severely degraded speech such as noise-vocoded speech. However, individuals vary in the amount of such perceptual learning and it is unclear what underlies these differences. The present study investigates whether perceptual learning in speech relates to statistical learning, as sensitivity to probabilistic information may aid identification of relevant cues in novel speech input. If statistical learning and perceptual learning (partly) draw on the same general mechanisms, then statistical learning in a non-auditory modality using non-linguistic sequences should predict adaptation to degraded speech. In the present study, 73 older adults (aged over 60 years) and 60 younger adults (aged between 18 and 30 years) performed a visual artificial grammar learning task and were presented with 60 meaningful noise-vocoded sentences in an auditory recall task. Within age groups, sentence recognition performance over exposure was analyzed as a function of statistical learning performance, and other variables that may predict learning (i.e., hearing, vocabulary, attention switching control, working memory, and processing speed). Younger and older adults showed similar amounts of perceptual learning, but only younger adults showed significant statistical learning. In older adults, improvement in understanding noise-vocoded speech was constrained by age. In younger adults, amount of adaptation was associated with lexical knowledge and with statistical learning ability. Thus, individual differences in general cognitive abilities explain listeners' variability in adapting to noise-vocoded speech. Results suggest that perceptual and statistical learning share mechanisms of implicit regularity detection, but that the ability to detect statistical regularities is impaired in older adults if visual sequences are presented quickly. PMID:25225475

  14. Relationship between perceptual learning in speech and statistical learning in younger and older adults.

    PubMed

    Neger, Thordis M; Rietveld, Toni; Janse, Esther

    2014-01-01

    Within a few sentences, listeners learn to understand severely degraded speech such as noise-vocoded speech. However, individuals vary in the amount of such perceptual learning and it is unclear what underlies these differences. The present study investigates whether perceptual learning in speech relates to statistical learning, as sensitivity to probabilistic information may aid identification of relevant cues in novel speech input. If statistical learning and perceptual learning (partly) draw on the same general mechanisms, then statistical learning in a non-auditory modality using non-linguistic sequences should predict adaptation to degraded speech. In the present study, 73 older adults (aged over 60 years) and 60 younger adults (aged between 18 and 30 years) performed a visual artificial grammar learning task and were presented with 60 meaningful noise-vocoded sentences in an auditory recall task. Within age groups, sentence recognition performance over exposure was analyzed as a function of statistical learning performance, and other variables that may predict learning (i.e., hearing, vocabulary, attention switching control, working memory, and processing speed). Younger and older adults showed similar amounts of perceptual learning, but only younger adults showed significant statistical learning. In older adults, improvement in understanding noise-vocoded speech was constrained by age. In younger adults, amount of adaptation was associated with lexical knowledge and with statistical learning ability. Thus, individual differences in general cognitive abilities explain listeners' variability in adapting to noise-vocoded speech. Results suggest that perceptual and statistical learning share mechanisms of implicit regularity detection, but that the ability to detect statistical regularities is impaired in older adults if visual sequences are presented quickly.

  15. Contamination of different portions of raw and boiled specimens of Norway lobster by mercury and selenium.

    PubMed

    Perugini, Monia; Visciano, Pierina; Manera, Maurizio; Abete, Maria Cesarina; Gavinelli, Stefania; Amorena, Michele

    2013-11-01

    The aim of this study was to evaluate mercury and selenium distribution in different portions (exoskeleton, white meat and brown meat) of Norway lobster (Nephrops norvegicus). Some samples were also analysed as whole specimens. The same portions were also examined after boiling, in order to observe if this cooking practice could affect mercury and selenium concentrations. The highest mercury concentrations were detected in white meat, exceeding in all cases the maximum levels established by European legislation. The brown meat showed the highest selenium concentrations. In all boiled samples, mercury levels showed a statistically significant increase compared to raw portions. In contrast, selenium concentrations detected in boiled samples of white meat, brown meat and whole specimen showed a statistically significant decrease compared to the corresponding raw samples. These results indicate that boiling modifies mercury and selenium concentrations. The high mercury levels detected represent a possible risk for consumers, and the publication and diffusion of specific advisories concerning seafood consumption are recommended.

  16. Ozone Air Quality over North America: Part II-An Analysis of Trend Detection and Attribution Techniques.

    PubMed

    Porter, P Steven; Rao, S Trivikrama; Zurbenko, Igor G; Dunker, Alan M; Wolff, George T

    2001-02-01

    Assessment of regulatory programs aimed at improving ambient O3 air quality is of considerable interest to the scientific community and to policymakers. Trend detection, the identification of statistically significant long-term changes, and attribution, linking change to specific climatological and anthropogenic forcings, are instrumental to this assessment. Detection and attribution are difficult because changes in pollutant concentrations of interest to policymakers may be much smaller than natural variations due to weather and climate. In addition, there are considerable differences in reported trends seemingly based on similar statistical methods and databases. Differences arise from the variety of techniques used to reduce nontrend variation in time series, including mitigating the effects of meteorology and the variety of metrics used to track changes. In this paper, we review the trend assessment techniques being used in the air pollution field and discuss their strengths and limitations in discerning and attributing changes in O3 to emission control policies.

  17. Quantifying spatial and temporal trends in beach-dune volumetric changes using spatial statistics

    NASA Astrophysics Data System (ADS)

    Eamer, Jordan B. R.; Walker, Ian J.

    2013-06-01

    Spatial statistics are generally underutilized in coastal geomorphology, despite offering great potential for identifying and quantifying spatial-temporal trends in landscape morphodynamics. In particular, local Moran's Ii provides a statistical framework for detecting clusters of significant change in an attribute (e.g., surface erosion or deposition) and quantifying how this changes over space and time. This study analyzes and interprets spatial-temporal patterns in sediment volume changes in a beach-foredune-transgressive dune complex following removal of invasive marram grass (Ammophila spp.). Results are derived by detecting significant changes in post-removal repeat DEMs derived from topographic surveys and airborne LiDAR. The study site was separated into discrete, linked geomorphic units (beach, foredune, transgressive dune complex) to facilitate sub-landscape scale analysis of volumetric change and sediment budget responses. Difference surfaces derived from a pixel-subtraction algorithm between interval DEMs and the LiDAR baseline DEM were filtered using the local Moran's Ii method and two different spatial weights (1.5 and 5 m) to detect statistically significant change. Moran's Ii results were compared with those derived from a more spatially uniform statistical method that uses a simpler Student's t distribution threshold for change detection. Morphodynamic patterns and volumetric estimates were similar between the uniform geostatistical method and Moran's Ii at a spatial weight of 5 m while the smaller spatial weight (1.5 m) consistently indicated volumetric changes of less magnitude. The larger 5 m spatial weight was most representative of broader site morphodynamics and spatial patterns while the smaller spatial weight provided volumetric changes consistent with field observations. All methods showed foredune deflation immediately following removal with increased sediment volumes into the spring via deposition at the crest and on lobes in the lee, despite erosion on the stoss slope and dune toe. Generally, the foredune became wider by landward extension and the seaward slope recovered from erosion to a similar height and form to that of pre-restoration despite remaining essentially free of vegetation.
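
    The sketch below computes local Moran's Ii on a one-dimensional transect of synthetic elevation-change values with a simple distance-band weight, as a minimal stand-in for the DEM-based workflow; the transect, band width, and cluster location are illustrative assumptions, and the conditional-permutation significance test is omitted.

```python
import numpy as np

def local_morans_i(values, coords, band):
    """Local Moran's I_i with a binary distance-band, row-standardized weight."""
    z = (values - values.mean()) / values.std()
    n = len(z)
    I = np.zeros(n)
    for i in range(n):
        d = np.abs(coords - coords[i])
        w = ((d > 0) & (d <= band)).astype(float)
        if w.sum() > 0:
            w /= w.sum()
        I[i] = z[i] * (w @ z)
    return I

coords = np.arange(100, dtype=float)     # positions along the transect (m)
rng = np.random.default_rng(3)
dz = rng.normal(0, 0.05, size=100)       # elevation change between surveys (m)
dz[40:55] += 0.5                         # a deposition cluster
I = local_morans_i(dz, coords, band=5.0)
print("strongest clustering at position", int(np.argmax(I)))
```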

  18. Adaptive Statistical Iterative Reconstruction-Applied Ultra-Low-Dose CT with Radiography-Comparable Radiation Dose: Usefulness for Lung Nodule Detection.

    PubMed

    Yoon, Hyun Jung; Chung, Myung Jin; Hwang, Hye Sun; Moon, Jung Won; Lee, Kyung Soo

    2015-01-01

    To assess the performance of adaptive statistical iterative reconstruction (ASIR)-applied ultra-low-dose CT (ULDCT) in detecting small lung nodules. Thirty patients underwent both ULDCT and standard dose CT (SCT). After determining the reference standard nodules, five observers, blinded to the reference standard reading results, independently evaluated SCT and both subsets of ASIR- and filtered back projection (FBP)-driven ULDCT images. Data assessed by observers were compared statistically. Converted effective doses in SCT and ULDCT were 2.81 ± 0.92 and 0.17 ± 0.02 mSv, respectively. A total of 114 lung nodules were detected on SCT as a standard reference. There was no statistically significant difference in sensitivity between ASIR-driven ULDCT and SCT for three out of the five observers (p = 0.678, 0.735, < 0.01, 0.038, and < 0.868 for observers 1, 2, 3, 4, and 5, respectively). The sensitivity of FBP-driven ULDCT was significantly lower than that of ASIR-driven ULDCT in three out of the five observers (p < 0.01 for three observers, and p = 0.064 and 0.146 for two observers). In jackknife alternative free-response receiver operating characteristic analysis, the mean values of figure-of-merit (FOM) for FBP, ASIR-driven ULDCT, and SCT were 0.682, 0.772, and 0.821, respectively, and there were no significant differences in FOM values between ASIR-driven ULDCT and SCT (p = 0.11), but the FOM value of FBP-driven ULDCT was significantly lower than that of ASIR-driven ULDCT and SCT (p = 0.01 and 0.00). Adaptive statistical iterative reconstruction-driven ULDCT delivering a radiation dose of only 0.17 mSv offers acceptable sensitivity in nodule detection compared with SCT and has better performance than FBP-driven ULDCT.

  19. Adaptive Statistical Iterative Reconstruction-Applied Ultra-Low-Dose CT with Radiography-Comparable Radiation Dose: Usefulness for Lung Nodule Detection

    PubMed Central

    Yoon, Hyun Jung; Hwang, Hye Sun; Moon, Jung Won; Lee, Kyung Soo

    2015-01-01

    Objective To assess the performance of adaptive statistical iterative reconstruction (ASIR)-applied ultra-low-dose CT (ULDCT) in detecting small lung nodules. Materials and Methods Thirty patients underwent both ULDCT and standard dose CT (SCT). After determining the reference standard nodules, five observers, blinded to the reference standard reading results, independently evaluated SCT and both subsets of ASIR- and filtered back projection (FBP)-driven ULDCT images. Data assessed by observers were compared statistically. Results Converted effective doses in SCT and ULDCT were 2.81 ± 0.92 and 0.17 ± 0.02 mSv, respectively. A total of 114 lung nodules were detected on SCT as a standard reference. There was no statistically significant difference in sensitivity between ASIR-driven ULDCT and SCT for three out of the five observers (p = 0.678, 0.735, < 0.01, 0.038, and < 0.868 for observers 1, 2, 3, 4, and 5, respectively). The sensitivity of FBP-driven ULDCT was significantly lower than that of ASIR-driven ULDCT in three out of the five observers (p < 0.01 for three observers, and p = 0.064 and 0.146 for two observers). In jackknife alternative free-response receiver operating characteristic analysis, the mean values of figure-of-merit (FOM) for FBP, ASIR-driven ULDCT, and SCT were 0.682, 0.772, and 0.821, respectively, and there were no significant differences in FOM values between ASIR-driven ULDCT and SCT (p = 0.11), but the FOM value of FBP-driven ULDCT was significantly lower than that of ASIR-driven ULDCT and SCT (p = 0.01 and 0.00). Conclusion Adaptive statistical iterative reconstruction-driven ULDCT delivering a radiation dose of only 0.17 mSv offers acceptable sensitivity in nodule detection compared with SCT and has better performance than FBP-driven ULDCT. PMID:26357505

  20. Power analysis and trend detection for water quality monitoring data. An application for the Greater Yellowstone Inventory and Monitoring Network

    USGS Publications Warehouse

    Irvine, Kathryn M.; Manlove, Kezia; Hollimon, Cynthia

    2012-01-01

    An important consideration for long term monitoring programs is determining the required sampling effort to detect trends in specific ecological indicators of interest. To enhance the Greater Yellowstone Inventory and Monitoring Network’s water resources protocol(s) (O’Ney 2006 and O’Ney et al. 2009 [under review]), we developed a set of tools to: (1) determine the statistical power for detecting trends of varying magnitude in a specified water quality parameter over different lengths of sampling (years) and different within-year collection frequencies (monthly or seasonal sampling) at particular locations using historical data, and (2) perform periodic trend analyses for water quality parameters while addressing seasonality and flow weighting. A power analysis for trend detection is a statistical procedure used to estimate the probability of rejecting the hypothesis of no trend when in fact there is a trend, within a specific modeling framework. In this report, we base our power estimates on using the seasonal Kendall test (Helsel and Hirsch 2002) for detecting trend in water quality parameters measured at fixed locations over multiple years. We also present procedures (R-scripts) for conducting a periodic trend analysis using the seasonal Kendall test with and without flow adjustment. This report provides the R-scripts developed for power and trend analysis, tutorials, and the associated tables and graphs. The purpose of this report is to provide practical information for monitoring network staff on how to use these statistical tools for water quality monitoring data sets.
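
    A bare-bones version of the seasonal Kendall test described above: compute the Mann-Kendall S statistic within each month and combine across months with a normal approximation. This sketch assumes no ties and no serial correlation, and omits the tie corrections and flow adjustment handled by the report's R-scripts.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall_s(x):
    """Mann-Kendall S and its variance (no tie correction)."""
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n) for j in range(i + 1, n))
    var = n * (n - 1) * (2 * n + 5) / 18
    return s, var

def seasonal_kendall(values, seasons):
    """Sum S and its variance over seasons; two-sided p via normal approximation."""
    S, V = 0.0, 0.0
    for m in np.unique(seasons):
        s, v = mann_kendall_s(values[seasons == m])
        S, V = S + s, V + v
    z = (S - np.sign(S)) / np.sqrt(V)    # continuity-corrected Z
    return S, 2 * norm.sf(abs(z))

rng = np.random.default_rng(4)
months = np.tile(np.arange(12), 10)                            # 10 years, monthly
conc = 0.1 * np.repeat(np.arange(10), 12) + rng.normal(0, 0.3, 120)
S, p = seasonal_kendall(conc, months)
print(f"seasonal Kendall S = {S:.0f}, p = {p:.4f}")
```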

  1. A powerful score-based test statistic for detecting gene-gene co-association.

    PubMed

    Xu, Jing; Yuan, Zhongshang; Ji, Jiadong; Zhang, Xiaoshuai; Li, Hongkai; Wu, Xuesen; Xue, Fuzhong; Liu, Yanxun

    2016-01-29

    The genetic variants identified by Genome-wide association study (GWAS) can only account for a small proportion of the total heritability for complex disease. The existence of gene-gene joint effects, which contain the main effects and their co-association, is one of the possible explanations for the "missing heritability" problem. Gene-gene co-association refers to the extent to which the joint effects of two genes differ from the main effects, not only due to the traditional interaction under nearly independent conditions but also to the correlation between genes. Generally, genes tend to work collaboratively within a specific pathway or network contributing to the disease, and specific disease-associated loci will often be highly correlated (e.g. single nucleotide polymorphisms (SNPs) in linkage disequilibrium). Therefore, we proposed a novel score-based statistic (SBS) as a gene-based method for detecting gene-gene co-association. Various simulations illustrate that, under different sample sizes, marginal effects of causal SNPs and co-association levels, the proposed SBS performs better than other existing methods, including single SNP-based and principal component analysis (PCA)-based logistic regression models, the statistics based on canonical correlations (CCU), kernel canonical correlation analysis (KCCU), partial least squares path modeling (PLSPM) and the delta-square (δ²) statistic. The real data analysis of rheumatoid arthritis (RA) further confirmed its advantages in practice. SBS is a powerful and efficient gene-based method for detecting gene-gene co-association.

  2. On the Statistical Properties of Cospectra

    NASA Astrophysics Data System (ADS)

    Huppenkothen, D.; Bachetti, M.

    2018-05-01

    In recent years, the cross-spectrum has received considerable attention as a means of characterizing the variability of astronomical sources as a function of wavelength. The cospectrum has only recently been understood as a means of mitigating instrumental effects dependent on temporal frequency in astronomical detectors, as well as a method of characterizing the coherent variability in two wavelength ranges on different timescales. In this paper, we lay out the statistical foundations of the cospectrum, starting with the simplest case of detecting a periodic signal in the presence of white noise, under the assumption that the same source is observed simultaneously in independent detectors in the same energy range. This case is especially relevant for detecting faint X-ray pulsars in detectors heavily affected by instrumental effects, including NuSTAR, Astrosat, and IXPE, which allow for even sampling and where the cospectrum can act as an effective way to mitigate dead time. We show that the statistical distributions of both single and averaged cospectra differ considerably from those for standard periodograms. While a single cospectrum follows a Laplace distribution exactly, averaged cospectra are approximated by a Gaussian distribution only for more than ∼30 averaged segments, dependent on the number of trials. We provide an instructive example of a quasi-periodic oscillation in NuSTAR and show that applying standard periodogram statistics leads to underestimated tail probabilities for period detection. We also demonstrate the application of these distributions to a NuSTAR observation of the X-ray pulsar Hercules X-1.
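
    A minimal numerical illustration of the cospectrum for two detectors observing the same source with independent noise: the real part of the cross spectrum, in which uncorrelated noise averages toward zero while a shared periodic signal survives. The light curves and pulsation frequency are synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
n, dt = 4096, 0.01
signal = 0.5 * np.sin(2 * np.pi * 7.0 * np.arange(n) * dt)  # shared 7 Hz pulsation
x = signal + rng.normal(size=n)   # detector 1 (independent noise)
y = signal + rng.normal(size=n)   # detector 2 (independent noise)

X, Y = np.fft.rfft(x), np.fft.rfft(y)
freqs = np.fft.rfftfreq(n, dt)
cospectrum = (X * np.conj(Y)).real  # real part of the cross spectrum

# Skip the zero-frequency bin when locating the peak.
print("peak at %.2f Hz" % freqs[1:][np.argmax(cospectrum[1:])])
```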

  3. Which Statistic Should Be Used to Detect Item Preknowledge When the Set of Compromised Items Is Known?

    PubMed

    Sinharay, Sandip

    2017-09-01

    Benefiting from item preknowledge is a major type of fraudulent behavior during educational assessments. Belov suggested the posterior shift statistic for detection of item preknowledge and showed its performance to be better on average than that of seven other statistics for detection of item preknowledge for a known set of compromised items. Sinharay suggested a statistic based on the likelihood ratio test for detection of item preknowledge; the advantage of the statistic is that its null distribution is known. Results from simulated and real data and adaptive and nonadaptive tests are used to demonstrate that the Type I error rate and power of the statistic based on the likelihood ratio test are very similar to those of the posterior shift statistic. Thus, the statistic based on the likelihood ratio test appears promising in detecting item preknowledge when the set of compromised items is known.

  4. Statistical transformation and the interpretation of inpatient glucose control data from the intensive care unit.

    PubMed

    Saulnier, George E; Castro, Janna C; Cook, Curtiss B

    2014-05-01

    Glucose control can be problematic in critically ill patients. We evaluated the impact of statistical transformation on interpretation of intensive care unit inpatient glucose control data. Point-of-care blood glucose (POC-BG) data derived from patients in the intensive care unit for 2011 was obtained. Box-Cox transformation of POC-BG measurements was performed, and distribution of data was determined before and after transformation. Different data subsets were used to establish statistical upper and lower control limits. Exponentially weighted moving average (EWMA) control charts constructed from April, October, and November data determined whether out-of-control events could be identified differently in transformed versus nontransformed data. A total of 8679 POC-BG values were analyzed. POC-BG distributions in nontransformed data were skewed but approached normality after transformation. EWMA control charts revealed differences in projected detection of out-of-control events. In April, an out-of-control process resulting in the lower control limit being exceeded was identified at sample 116 in nontransformed data but not in transformed data. October transformed data detected an out-of-control process exceeding the upper control limit at sample 27 that was not detected in nontransformed data. Nontransformed November results remained in control, but transformation identified an out-of-control event less than 10 samples into the observation period. Using statistical methods to assess population-based glucose control in the intensive care unit could alter conclusions about the effectiveness of care processes for managing hyperglycemia. Further study is required to determine whether transformed versus nontransformed data change clinical decisions about the interpretation of care or intervention results. © 2014 Diabetes Technology Society.
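
    A hedged sketch of the two steps paired above: Box-Cox-transform skewed glucose values with scipy, then monitor the transformed series with an EWMA control chart. The smoothing constant (0.2), limit width (3 sigma), and synthetic data are conventional illustrative choices, not necessarily those of the study.

```python
import numpy as np
from scipy.stats import boxcox

rng = np.random.default_rng(6)
glucose = rng.lognormal(mean=5.0, sigma=0.25, size=500)  # skewed POC-BG values

transformed, lam_bc = boxcox(glucose)  # lambda estimated by maximum likelihood
print(f"Box-Cox lambda = {lam_bc:.2f}")

lam, L = 0.2, 3.0                      # EWMA smoothing constant and limit width
mu, sigma = transformed.mean(), transformed.std(ddof=1)
ewma = np.empty_like(transformed)
ewma[0] = lam * transformed[0] + (1 - lam) * mu
for t in range(1, len(transformed)):
    ewma[t] = lam * transformed[t] + (1 - lam) * ewma[t - 1]

# Time-varying control limits for the EWMA statistic.
i = np.arange(1, len(transformed) + 1)
width = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
out = np.flatnonzero(np.abs(ewma - mu) > width)
print("first out-of-control sample:", out[0] if out.size else "none")
```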

  5. Statistical Transformation and the Interpretation of Inpatient Glucose Control Data From the Intensive Care Unit

    PubMed Central

    Saulnier, George E.; Castro, Janna C.

    2014-01-01

    Glucose control can be problematic in critically ill patients. We evaluated the impact of statistical transformation on interpretation of intensive care unit inpatient glucose control data. Point-of-care blood glucose (POC-BG) data derived from patients in the intensive care unit for 2011 was obtained. Box–Cox transformation of POC-BG measurements was performed, and distribution of data was determined before and after transformation. Different data subsets were used to establish statistical upper and lower control limits. Exponentially weighted moving average (EWMA) control charts constructed from April, October, and November data determined whether out-of-control events could be identified differently in transformed versus nontransformed data. A total of 8679 POC-BG values were analyzed. POC-BG distributions in nontransformed data were skewed but approached normality after transformation. EWMA control charts revealed differences in projected detection of out-of-control events. In April, an out-of-control process resulting in the lower control limit being exceeded was identified at sample 116 in nontransformed data but not in transformed data. October transformed data detected an out-of-control process exceeding the upper control limit at sample 27 that was not detected in nontransformed data. Nontransformed November results remained in control, but transformation identified an out-of-control event less than 10 samples into the observation period. Using statistical methods to assess population-based glucose control in the intensive care unit could alter conclusions about the effectiveness of care processes for managing hyperglycemia. Further study is required to determine whether transformed versus nontransformed data change clinical decisions about the interpretation of care or intervention results. PMID:24876620

  6. Arsenic contamination of drinking water in Ireland: A spatial analysis of occurrence and potential risk.

    PubMed

    McGrory, Ellen R; Brown, Colin; Bargary, Norma; Williams, Natalya Hunter; Mannix, Anthony; Zhang, Chaosheng; Henry, Tiernan; Daly, Eve; Nicholas, Sarah; Petrunic, Barbara M; Lee, Monica; Morrison, Liam

    2017-02-01

    The presence of arsenic in groundwater has become a global concern due to the health risks from drinking water with elevated concentrations. The Water Framework Directive (WFD) of the European Union calls for drinking water risk assessment for member states. The present study amalgamates readily available national and sub-national scale datasets on arsenic in groundwater in the Republic of Ireland. However, due to the presence of high levels of left censoring (i.e. arsenic values below an analytical detection limit) and changes in detection limits over time, the application of conventional statistical methods would inhibit the generation of meaningful results. In order to handle these issues several arsenic databases were integrated and the data modelled using statistical methods appropriate for non-detect data. In addition, geostatistical methods were used to assess principal risk components of elevated arsenic related to lithology, aquifer type and groundwater vulnerability. Geographic statistical methods were used to overcome some of the geographical limitations of the Irish Environmental Protection Agency (EPA) sample database. Nearest-neighbour inverse distance weighting (IDW) and local indicator of spatial association (LISA) methods were used to estimate risk in non-sampled areas. Significant differences were also noted between different aquifer lithologies, indicating that Rhyolite, Sandstone and Shale (Greywackes), and Impure Limestone potentially presented a greater risk of elevated arsenic in groundwaters. Significant differences also occurred among aquifer types with poorly productive aquifers, locally important fractured bedrock aquifers and regionally important fissured bedrock aquifers presenting the highest potential risk of elevated arsenic. No significant differences were detected among different groundwater vulnerability groups as defined by the Geological Survey of Ireland. This research will assist the management and future policy direction of groundwater resources at EU level, and will guide future research on arsenic mobilisation processes to inform the future development, testing and treatment requirements of groundwater resources. Copyright © 2016 Elsevier B.V. All rights reserved.
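
    One standard way to model left-censored concentrations of the kind described above is a censored maximum-likelihood fit: detected values contribute the log-density and non-detects contribute the log-CDF at the detection limit. The lognormal choice, the single detection limit, and the data below are assumptions for illustration, not the study's exact procedure.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import lognorm

rng = np.random.default_rng(7)
true = rng.lognormal(mean=0.0, sigma=1.0, size=200)      # "true" As, ug/L
dl = 0.5                                                 # detection limit
detected = true[true >= dl]
n_censored = int((true < dl).sum())

def neg_loglik(params):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)                            # keep sigma positive
    # Detected values: log-density; non-detects: log-CDF at the limit.
    ll_det = lognorm.logpdf(detected, s=sigma, scale=np.exp(mu)).sum()
    ll_cen = n_censored * lognorm.logcdf(dl, s=sigma, scale=np.exp(mu))
    return -(ll_det + ll_cen)

fit = minimize(neg_loglik, x0=[0.0, 0.0], method="Nelder-Mead")
mu_hat, sigma_hat = fit.x[0], np.exp(fit.x[1])
print(f"mu = {mu_hat:.2f}, sigma = {sigma_hat:.2f} (true: 0.00, 1.00)")
```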

  7. Acoustic firearm discharge detection and classification in an enclosed environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luzi, Lorenzo; Gonzalez, Eric; Bruillard, Paul

    2016-05-01

    Two different signal processing algorithms are described for detection and classification of acoustic signals generated by firearm discharges in small enclosed spaces. The first is based on the logarithm of the signal energy. The second is a joint entropy. The current study indicates that a system using both signal energy and joint entropy would be able to both detect weapon discharges and classify weapon type, in small spaces, with high statistical certainty.
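
    The sketch below computes two per-frame features in the spirit of those named above: the logarithm of the frame energy, and an entropy measure (here spectral entropy, one plausible reading of the abstract's joint entropy). Frame sizes, the synthetic impulse, and the feature details are assumptions.

```python
import numpy as np

def frame_features(signal, frame_len=1024, hop=512):
    """Per-frame (log energy, spectral entropy) pairs."""
    feats = []
    for start in range(0, len(signal) - frame_len, hop):
        frame = signal[start:start + frame_len]
        log_energy = np.log10(np.sum(frame ** 2) + 1e-12)
        p = np.abs(np.fft.rfft(frame)) ** 2
        p /= p.sum() + 1e-12
        entropy = -np.sum(p * np.log2(p + 1e-12))
        feats.append((log_energy, entropy))
    return np.array(feats)

rng = np.random.default_rng(8)
audio = rng.normal(0, 0.01, 48000)                        # quiet room noise
audio[24000:24400] += rng.normal(0, 1.0, 400) * np.exp(-np.arange(400) / 80.0)
feats = frame_features(audio)
print("loudest frame:", int(np.argmax(feats[:, 0])),
      "| lowest-entropy frame:", int(np.argmin(feats[:, 1])))
```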

  8. Optimum Array Processing for Detecting Binary Signals Corrupted by Directional Interference.

    DTIC Science & Technology

    1972-12-01

    specific cases. Two different series representations of a vector random process are discussed in Van Trees [3]. These two methods both require the... spacing d, etc.) its detection error represents a lower bound for the performance that might be obtained with other types of array processing (such...Middleton, Introduction to Statistical Communication Theory, New York: McGraw-Hill, 1960. 3. H.L. Van Trees, Detection, Estimation, and Modulation Theory

  9. Systematic evaluation of serum and plasma collection on the endogenous metabolome.

    PubMed

    Zhou, Zhi; Chen, Yanhua; He, Jiuming; Xu, Jing; Zhang, Ruiping; Mao, Yan; Abliz, Zeper

    2017-02-01

    In metabolomics research, the use of different blood collection methods may influence endogenous metabolites. Ultra HPLC coupled with MS/MS was applied together with multivariate statistics to investigate metabolomic differences between serum and plasma samples prepared with different anticoagulants. A total of 135 known representative metabolites were assessed for comprehensive evaluation of the effects of anticoagulants. Exogenous factors, including separation gel ingredients from the serum collection tubes and the anticoagulants, affected mass spectrometer detection. Heparin plasma yielded the best detection of different functional groups and is therefore the optimal blood specimen for metabolomics research, followed by potassium oxalate plasma.

  10. Using lod scores to detect sex differences in male-female recombination fractions.

    PubMed

    Feenstra, B; Greenberg, D A; Hodge, S E

    2004-01-01

    Human recombination fraction (RF) can differ between males and females, but investigators do not always know which disease genes are located in genomic areas of large RF sex differences. Knowledge of RF sex differences contributes to our understanding of basic biology and can increase the power of a linkage study, improve gene localization, and provide clues to possible imprinting. One way to detect these differences is to use lod scores. In this study we focused on detecting RF sex differences and answered the following questions, in both phase-known and phase-unknown matings: (1) How large a sample size is needed to detect a RF sex difference? (2) What are "optimal" proportions of paternally vs. maternally informative matings? (3) Does ascertaining nonoptimal proportions of paternally or maternally informative matings lead to ascertainment bias? Our results were as follows: (1) We calculated expected lod scores (ELODs) under two different conditions: "unconstrained," allowing sex-specific RF parameters (θ_female, θ_male); and "constrained," requiring θ_female = θ_male. We then examined the ΔELOD (≡ the difference between maximized constrained and unconstrained ELODs) and calculated minimum sample sizes required to achieve statistically significant ΔELODs. For large RF sex differences, samples as small as 10 to 20 fully informative matings can achieve statistical significance. We give general sample size guidelines for detecting RF differences in informative phase-known and phase-unknown matings. (2) We defined p as the proportion of paternally informative matings in the dataset, and the optimal proportion p̂ as that value of p that maximizes ΔELOD. We determined that, surprisingly, p̂ does not necessarily equal 1/2, although it does fall between approximately 0.4 and 0.6 in most situations. (3) We showed that if p in a sample deviates from its optimal value, no bias is introduced (asymptotically) to the maximum likelihood estimates of θ_female and θ_male, even though ELOD is reduced (see point 2). This fact is important because often investigators cannot control the proportions of paternally and maternally informative families. In conclusion, it is possible to reliably detect sex differences in recombination fraction. Copyright 2004 S. Karger AG, Basel

  11. Camera-Model Identification Using Markovian Transition Probability Matrix

    NASA Astrophysics Data System (ADS)

    Xu, Guanshuo; Gao, Shang; Shi, Yun Qing; Hu, Ruimin; Su, Wei

    Detecting the (brands and) models of digital cameras from given digital images has become a popular research topic in the field of digital forensics. As most images are JPEG compressed before they are output from cameras, we propose to use an effective image statistical model to characterize the difference JPEG 2-D arrays of Y and Cb components from the JPEG images taken by various camera models. Specifically, the transition probability matrices derived from four different directional Markov processes applied to the image difference JPEG 2-D arrays are used to identify statistical differences caused by the image formation pipelines inside different camera models. All elements of the transition probability matrices, after a thresholding technique, are directly used as features for classification purposes. Multi-class support vector machines (SVM) are used as the classification tool. The effectiveness of our proposed statistical model is demonstrated by large-scale experimental results.
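
    The sketch below shows the horizontal piece of such a feature set: form the horizontal difference array of a 2-D array, clip it to [-T, T], and estimate the Markov transition probability matrix over the clipped values. The paper uses four directions on the Y and Cb difference JPEG 2-D arrays and feeds all elements to an SVM; one direction on a synthetic array is shown here as an assumption-laden stand-in.

```python
import numpy as np

def transition_matrix(arr2d, T=3):
    """Markov transition probabilities of the thresholded horizontal differences."""
    diff = arr2d[:, :-1] - arr2d[:, 1:]            # horizontal difference array
    diff = np.clip(diff, -T, T)                    # threshold to [-T, T]
    pairs = np.stack([diff[:, :-1].ravel(), diff[:, 1:].ravel()], axis=1)
    size = 2 * T + 1
    tpm = np.zeros((size, size))
    for a, b in pairs + T:                         # shift values to 0..2T indices
        tpm[a, b] += 1
    row_sums = tpm.sum(axis=1, keepdims=True)
    return np.divide(tpm, row_sums, out=np.zeros_like(tpm), where=row_sums > 0)

rng = np.random.default_rng(9)
block = rng.integers(-20, 20, size=(64, 64))       # stand-in for a JPEG 2-D array
features = transition_matrix(block).ravel()        # (2T+1)^2 = 49 features
print(features.shape)
```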

  12. Kidney function endpoints in kidney transplant trials: a struggle for power.

    PubMed

    Ibrahim, A; Garg, A X; Knoll, G A; Akbari, A; White, C A

    2013-03-01

    Kidney function endpoints are commonly used in randomized controlled trials (RCTs) in kidney transplantation (KTx). We conducted this study to estimate the proportion of ongoing RCTs with kidney function endpoints in KTx where the proposed sample size is large enough to detect meaningful differences in glomerular filtration rate (GFR) with adequate statistical power. RCTs were retrieved using the key word "kidney transplantation" from the National Institute of Health online clinical trial registry. Included trials had at least one measure of kidney function tracked for at least 1 month after transplant. We determined the proportion of two-arm parallel trials that had sufficient sample sizes to detect a minimum 5, 7.5 and 10 mL/min difference in GFR between arms. Fifty RCTs met inclusion criteria. Only 7% of the trials were above a sample size of 562, the number needed to detect a minimum 5 mL/min difference between the groups should one exist (assumptions: α = 0.05; power = 80%, 10% loss to follow-up, common standard deviation of 20 mL/min). The result increased modestly to 36% of trials when a minimum 10 mL/min difference was considered. Only a minority of ongoing trials have adequate statistical power to detect between-group differences in kidney function using conventional sample size estimating parameters. For this reason, some potentially effective interventions which ultimately could benefit patients may be abandoned from future assessment. © Copyright 2013 The American Society of Transplantation and the American Society of Transplant Surgeons.
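
    The abstract's headline figure can be checked directly: a 5 mL/min difference against a common SD of 20 mL/min is a standardized effect of 0.25, and at alpha 0.05, 80% power, and 10% loss to follow-up the required total comes out within rounding of the 562 quoted. A sketch using statsmodels:

```python
from math import ceil
from statsmodels.stats.power import TTestIndPower

# Standardized effect: 5 mL/min difference / 20 mL/min common SD = 0.25.
n_per_arm = TTestIndPower().solve_power(effect_size=5 / 20, alpha=0.05,
                                        power=0.80, alternative="two-sided")
total = ceil(2 * n_per_arm / 0.9)   # two arms, inflated for 10% loss to follow-up
print(f"n per arm = {n_per_arm:.1f}, total required = {total}")
```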

  13. Infant Statistical Learning

    PubMed Central

    Saffran, Jenny R.; Kirkham, Natasha Z.

    2017-01-01

    Perception involves making sense of a dynamic, multimodal environment. In the absence of mechanisms capable of exploiting the statistical patterns in the natural world, infants would face an insurmountable computational problem. Infant statistical learning mechanisms facilitate the detection of structure. These abilities allow the infant to compute across elements in their environmental input, extracting patterns for further processing and subsequent learning. In this selective review, we summarize findings that show that statistical learning is both a broad and flexible mechanism (supporting learning from different modalities across many different content areas) and input specific (shifting computations depending on the type of input and goal of learning). We suggest that statistical learning not only provides a framework for studying language development and object knowledge in constrained laboratory settings, but also allows researchers to tackle real-world problems, such as multilingualism, the role of ever-changing learning environments, and differential developmental trajectories. PMID:28793812

  14. Precision, Reliability, and Effect Size of Slope Variance in Latent Growth Curve Models: Implications for Statistical Power Analysis

    PubMed Central

    Brandmaier, Andreas M.; von Oertzen, Timo; Ghisletta, Paolo; Lindenberger, Ulman; Hertzog, Christopher

    2018-01-01

    Latent Growth Curve Models (LGCM) have become a standard technique to model change over time. Prediction and explanation of inter-individual differences in change are major goals in lifespan research. The major determinants of statistical power to detect individual differences in change are the magnitude of true inter-individual differences in linear change (LGCM slope variance), design precision, alpha level, and sample size. Here, we show that design precision can be expressed as the inverse of effective error. Effective error is determined by instrument reliability and the temporal arrangement of measurement occasions. However, it also depends on another central LGCM component, the variance of the latent intercept and its covariance with the latent slope. We derive a new reliability index for LGCM slope variance—effective curve reliability (ECR)—by scaling slope variance against effective error. ECR is interpretable as a standardized effect size index. We demonstrate how effective error, ECR, and statistical power for a likelihood ratio test of zero slope variance formally relate to each other and how they function as indices of statistical power. We also provide a computational approach to derive ECR for arbitrary intercept-slope covariance. With practical use cases, we argue for the complementary utility of the proposed indices of a study's sensitivity to detect slope variance when making a priori longitudinal design decisions or communicating study designs. PMID:29755377
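
    The abstract does not print the ECR formula; read as a classical reliability coefficient, scaling slope variance against effective error suggests the following assumed reconstruction, with the sigma-squared terms denoting slope variance and effective error respectively:

```latex
% Assumed reconstruction, not quoted from the paper:
\mathrm{ECR} = \frac{\sigma^{2}_{\mathrm{slope}}}{\sigma^{2}_{\mathrm{slope}} + \sigma^{2}_{\mathrm{eff}}}
```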

  15. New methods in iris recognition.

    PubMed

    Daugman, John

    2007-10-01

    This paper presents the following four advances in iris recognition: 1) more disciplined methods for detecting and faithfully modeling the iris inner and outer boundaries with active contours, leading to more flexible embedded coordinate systems; 2) Fourier-based methods for solving problems in iris trigonometry and projective geometry, allowing off-axis gaze to be handled by detecting it and "rotating" the eye into orthographic perspective; 3) statistical inference methods for detecting and excluding eyelashes; and 4) exploration of score normalizations, depending on the amount of iris data that is available in images and the required scale of database search. Statistical results are presented based on 200 billion iris cross-comparisons that were generated from 632500 irises in the United Arab Emirates database to analyze the normalization issues raised in different regions of receiver operating characteristic curves.

  16. Intraoperative detection of 18F-FDG-avid tissue sites using the increased probe counting efficiency of the K-alpha probe design and variance-based statistical analysis with the three-sigma criteria

    PubMed Central

    2013-01-01

    Background Intraoperative detection of 18F-FDG-avid tissue sites during 18F-FDG-directed surgery can be very challenging when utilizing gamma detection probes that rely on a fixed target-to-background (T/B) ratio (ratiometric threshold) for determination of probe positivity. The purpose of our study was to evaluate the counting efficiency and the success rate of in situ intraoperative detection of 18F-FDG-avid tissue sites (using the three-sigma statistical threshold criteria method and the ratiometric threshold criteria method) for three different gamma detection probe systems. Methods Of 58 patients undergoing 18F-FDG-directed surgery for known or suspected malignancy using gamma detection probes, we identified nine 18F-FDG-avid tissue sites (from amongst seven patients) that were seen on same-day preoperative diagnostic PET/CT imaging, and for which each 18F-FDG-avid tissue site underwent attempted in situ intraoperative detection concurrently using three gamma detection probe systems (K-alpha probe, and two commercially-available PET-probe systems), and then were subsequently surgical excised. Results The mean relative probe counting efficiency ratio was 6.9 (± 4.4, range 2.2–15.4) for the K-alpha probe, as compared to 1.5 (± 0.3, range 1.0–2.1) and 1.0 (± 0, range 1.0–1.0), respectively, for two commercially-available PET-probe systems (P < 0.001). Successful in situ intraoperative detection of 18F-FDG-avid tissue sites was more frequently accomplished with each of the three gamma detection probes tested by using the three-sigma statistical threshold criteria method than by using the ratiometric threshold criteria method, specifically with the three-sigma statistical threshold criteria method being significantly better than the ratiometric threshold criteria method for determining probe positivity for the K-alpha probe (P = 0.05). Conclusions Our results suggest that the improved probe counting efficiency of the K-alpha probe design used in conjunction with the three-sigma statistical threshold criteria method can allow for improved detection of 18F-FDG-avid tissue sites when a low in situ T/B ratio is encountered. PMID:23496877
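
    A toy numerical contrast between the two positivity rules compared above: the three-sigma rule flags a site whose count exceeds the background mean by three standard deviations, while a fixed ratiometric threshold (e.g. T/B >= 1.5) can miss the same site at a low target-to-background ratio. Counts and the 1.5 cutoff are illustrative assumptions.

```python
import numpy as np

background = np.array([52, 48, 55, 50, 47, 51, 49, 53])  # counts per interval
target = 71                                              # counts at suspect site

mu, sd = background.mean(), background.std(ddof=1)
three_sigma_positive = target > mu + 3 * sd              # three-sigma criterion

# The fixed-ratio alternative, here with an assumed T/B cutoff of 1.5.
ratiometric_positive = target / mu >= 1.5

# With these numbers: three-sigma flags the site, the ratio rule does not.
print(f"three-sigma: {three_sigma_positive}, ratiometric: {ratiometric_positive}")
```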

  17. Study on chemotherapeutic sensitizing effect of nimotuzumab on different human esophageal squamous carcinoma cells.

    PubMed

    Yang, Xiaoyu; Ji, Yinghua; Kang, Xiaochun; Chen, Meiling; Kou, Weizheng; Jin, Cailing; Lu, Ping

    2016-02-01

    Esophageal cancer is one of the leading causes of mortality worldwide. Although surgery, radio- and chemotherapy are used to treat the disease, the identification of new drugs is crucial to increase the curative effect. The aim of the present study was to examine the chemotherapeutic sensitizing effect of nimotuzumab (h-R3) combined with the cytotoxic drugs cisplatin (DDP) and 5-fluorouracil (5-FU) on esophageal carcinoma cells with two different epidermal growth factor receptor (EGFR) expression levels. The expression of EGFR was detected in the human EC1 and EC9706 esophageal squamous cell carcinoma cell lines using immunohistochemistry. The inhibitory effect of DDP and 5-FU alone or combined with h-R3 on EC1 or EC9706 cell proliferation was detected using an MTT assay. Flow cytometry and the TUNEL assay were used to determine the effect of single or combined drug treatment on cell apoptosis. The results showed that the expression of EGFR was low in EC1 cells but high in EC9706 cells. The inhibitory effect of h-R3 alone on EC1 or EC9706 cell proliferation was weak. The difference in inhibitory effect between h-R3 alone and h-R3 combined with the chemotherapy drugs was not statistically significant for the EC1 cell growth rate (P>0.05), but was statistically significant for the EC9706 cell growth rate (α=0.05). The results of flow cytometry and the TUNEL assay showed that the difference in apoptosis rate between h-R3 alone and the control group was statistically significant for EC1 cells (P<0.05) but not for EC9706 cells (P>0.05). However, statistically significant differences were identified in the apoptotic rate of EC9706 cells between the h-R3 combined chemotherapy group and the single chemotherapy group (P<0.05), but not in EC1 cells (P>0.05). In conclusion, the sensitization effect of h-R3 on chemotherapy drugs is associated with the expression level of EGFR in EC1 or EC9706 cells. The cell-killing effect of the combined use of h-R3 with DDP and 5-FU showed no obvious synergistic effect compared to the single-drug group, but only an additive effect.

  18. ASCS online fault detection and isolation based on an improved MPCA

    NASA Astrophysics Data System (ADS)

    Peng, Jianxin; Liu, Haiou; Hu, Yuhui; Xi, Junqiang; Chen, Huiyan

    2014-09-01

    Multi-way principal component analysis (MPCA) has received considerable attention and been widely used in process monitoring. A traditional MPCA algorithm unfolds multiple batches of historical data into a two-dimensional matrix and cuts the matrix along the time axis to form subspaces. However, low efficiency of subspaces and difficult fault isolation are common disadvantages of the principal component model. This paper presents a new subspace construction method based on a kernel density estimation function that can effectively reduce the storage required for the subspace information. The MPCA model and the knowledge base are built on the new subspace. Then, fault detection and isolation with the squared prediction error (SPE) statistic and the Hotelling T² statistic are realized in process monitoring. When a fault occurs, fault isolation based on the SPE statistic is achieved by residual contribution analysis of the different variables. For fault isolation based on the T² statistic in a subspace, the relationship between the statistical indicator and the state variables is constructed, and constraint conditions are presented to check the validity of the fault isolation. Then, to improve the robustness of fault isolation against unexpected disturbances, a statistical method is adopted to relate single and multiple subspaces and increase the rate of correct fault isolation. Finally, fault detection and isolation based on the improved MPCA is used to monitor the automatic shift control system (ASCS) to prove the correctness and effectiveness of the algorithm. The research proposes a new subspace construction method that reduces the required storage capacity and improves the robustness of the principal component model, and establishes the relationship between the state variables and the fault detection indicators for fault isolation.
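
    A minimal version of the two monitoring statistics the method builds on: Hotelling's T² over the retained principal-component scores and the SPE (Q) over the residual subspace, fitted on normal operating data. The kernel-density subspace construction and the isolation logic of the paper are not reproduced; the data and dimensions are assumptions.

```python
import numpy as np

rng = np.random.default_rng(10)
train = rng.normal(size=(500, 8))            # normal operating data
k = 3                                        # retained principal components

mu = train.mean(axis=0)
Xc = train - mu
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
P = Vt[:k].T                                 # loadings (8 x k)
var = (S[:k] ** 2) / (len(train) - 1)        # variance of each retained score

def t2_and_spe(x):
    xc = x - mu
    scores = xc @ P
    t2 = float(np.sum(scores ** 2 / var))    # Hotelling's T^2
    resid = xc - scores @ P.T
    # For isolation, per-variable SPE contributions would be resid ** 2.
    spe = float(resid @ resid)               # squared prediction error (Q)
    return t2, spe

fault = mu + np.array([0, 0, 0, 0, 4, 0, 0, 0])   # step fault on variable 5
print("normal:", t2_and_spe(train[0]))
print("fault: ", t2_and_spe(fault))
```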

  19. [Hydrologic variability and sensitivity based on Hurst coefficient and Bartels statistic].

    PubMed

    Lei, Xu; Xie, Ping; Wu, Zi Yi; Sang, Yan Fang; Zhao, Jiang Yan; Li, Bin Bin

    2018-04-01

    Due to global climate change and frequent human activities in recent years, the purely stochastic component of a hydrological sequence is often mixed with one or several variation components, including jump, trend, period and dependency. Which indices should be used to quantify the degree of this variability urgently needs clarification. In this study, we defined hydrological variability based on the Hurst coefficient and the Bartels statistic, and used Monte Carlo statistical tests to analyze their sensitivity to different variants. When the hydrological sequence had a jump or trend variation, both the Hurst coefficient and the Bartels statistic could reflect the variation, with the Hurst coefficient being more sensitive to weak jump or trend variation. When the sequence had a period, only the Bartels statistic could detect the mutation of the sequence. When the sequence had a dependency, both the Hurst coefficient and the Bartels statistic could reflect the variation, with the latter able to detect weaker dependent variations. For all four variation types, both the Hurst variability and the Bartels variability increased as the variation range increased. Thus, they can be used to measure the variation intensity of a hydrological sequence. We analyzed the temperature series of different weather stations in the Lancang River basin. Results showed that the temperature at all stations showed an upward trend or jump, indicating that the entire basin has experienced warming in recent years, and that the temperature variability in the upper and lower reaches was much higher. This case study demonstrates the practicability of the proposed method.
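
    A common way to compute the first of the two indices above is the rescaled-range (R/S) estimate of the Hurst coefficient, sketched below on synthetic series; the estimator details are a standard textbook choice, not necessarily the paper's.

```python
import numpy as np

def hurst_rs(x, min_chunk=8):
    """Rescaled-range (R/S) estimate of the Hurst coefficient."""
    n = len(x)
    sizes = np.unique(np.logspace(np.log10(min_chunk), np.log10(n // 2), 10).astype(int))
    log_size, log_rs = [], []
    for size in sizes:
        rs = []
        for start in range(0, n - size + 1, size):
            chunk = x[start:start + size]
            dev = np.cumsum(chunk - chunk.mean())
            r, s = dev.max() - dev.min(), chunk.std(ddof=1)
            if s > 0:
                rs.append(r / s)
        if rs:
            log_size.append(np.log(size))
            log_rs.append(np.log(np.mean(rs)))
    return np.polyfit(log_size, log_rs, 1)[0]   # slope = Hurst estimate

rng = np.random.default_rng(11)
white = rng.normal(size=2000)                   # H near 0.5 expected
trended = white + 0.002 * np.arange(2000)       # a trend inflates H
print(f"H(white) = {hurst_rs(white):.2f}, H(trended) = {hurst_rs(trended):.2f}")
```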

  20. Increasing the statistical significance of entanglement detection in experiments.

    PubMed

    Jungnitsch, Bastian; Niekamp, Sönke; Kleinmann, Matthias; Gühne, Otfried; Lu, He; Gao, Wei-Bo; Chen, Yu-Ao; Chen, Zeng-Bing; Pan, Jian-Wei

    2010-05-28

    Entanglement is often verified by a violation of an inequality like a Bell inequality or an entanglement witness. Considerable effort has been devoted to the optimization of such inequalities in order to obtain a high violation. We demonstrate theoretically and experimentally that such an optimization does not necessarily lead to a better entanglement test, if the statistical error is taken into account. Theoretically, we show for different error models that reducing the violation of an inequality can improve the significance. Experimentally, we observe this phenomenon in a four-photon experiment, testing the Mermin and Ardehali inequality for different levels of noise. Furthermore, we provide a way to develop entanglement tests with high statistical significance.

  1. Using non-specialist observers in 4AFC human observer studies

    NASA Astrophysics Data System (ADS)

    Elangovan, Premkumar; Mackenzie, Alistair; Dance, David R.; Young, Kenneth C.; Wells, Kevin

    2017-03-01

    Virtual clinical trials (VCTs) are an emergent approach for rapid evaluation and comparison of various breast imaging technologies and techniques using computer-based modeling tools. Increasingly, 4AFC (four alternative forced choice) virtual clinical trials are used to compare the detection performance of different breast imaging modalities. Most prior studies have used radiologists and/or medical physicists, often interchangeably. However, large-scale use of statistically significant 4AFC observer studies is challenged by the individual time commitment and cost of such observers, often drawn from a limited local pool of specialists. This work aims to investigate whether non-specialist observers can be used to supplement such studies. A team of five specialist observers (medical physicists) and five non-specialists participated in a 4AFC study containing simulated 2D-mammography and DBT (digital breast tomosynthesis) images, produced using the OPTIMAM toolbox for VCTs. The images contained 4mm irregular solid masses and 4mm spherical targets at a range of contrast levels embedded in a realistic breast phantom background. There was no statistically significant difference between the detection performance of medical physicists and non-specialists (p>0.05). However, non-specialists took longer to complete the study than their physicist counterparts, which was statistically significant (p<0.05). Overall, the results from both observer groups indicate that DBT has a lower detectable threshold contrast than 2D-mammography for both masses and spheres, and both groups found spheres easier to detect than irregular solid masses.

  2. Statistical Analyses of Brain Surfaces Using Gaussian Random Fields on 2-D Manifolds

    PubMed Central

    Staib, Lawrence H.; Xu, Dongrong; Zhu, Hongtu; Peterson, Bradley S.

    2008-01-01

    Interest in the morphometric analysis of the brain and its subregions has recently intensified because growth or degeneration of the brain in health or illness affects not only the volume but also the shape of cortical and subcortical brain regions, and new image processing techniques permit detection of small and highly localized perturbations in shape or localized volume, with remarkable precision. An appropriate statistical representation of the shape of a brain region is essential, however, for detecting, localizing, and interpreting variability in its surface contour and for identifying differences in volume of the underlying tissue that produce that variability across individuals and groups of individuals. Our statistical representation of the shape of a brain region is defined by a reference region for that region and by a Gaussian random field (GRF) that is defined across the entire surface of the region. We first select a reference region from a set of segmented brain images of healthy individuals. The GRF is then estimated as the signed Euclidean distances between points on the surface of the reference region and the corresponding points on the corresponding region in images of brains that have been coregistered to the reference. Correspondences between points on these surfaces are defined through deformations of each region of a brain into the coordinate space of the reference region using the principles of fluid dynamics. The warped, coregistered region of each subject is then unwarped into its native space, simultaneously bringing into that space the map of corresponding points that was established when the surfaces of the subject and reference regions were tightly coregistered. The proposed statistical description of the shape of surface contours makes no assumptions, other than smoothness, about the shape of the region or its GRF. The description also allows for the detection and localization of statistically significant differences in the shapes of the surfaces across groups of subjects at both a fine and coarse scale. We demonstrate the effectiveness of these statistical methods by applying them to study differences in shape of the amygdala and hippocampus in a large sample of normal subjects and in subjects with attention deficit/hyperactivity disorder (ADHD). PMID:17243583

  3. Reliability of the Watch-PAT 200 in Detecting Sleep Apnea in Highway Bus Drivers

    PubMed Central

    Yuceege, Melike; Firat, Hikmet; Demir, Ahmet; Ardic, Sadik

    2013-01-01

    Objective: To assess the validity of the Watch-PAT (WP) device for sleep disordered breathing (SDB) among highway bus drivers. Method: A total of 90 highway bus drivers underwent polysomnography (PSG) and Watch-PAT testing simultaneously. Routine blood tests and routine ear-nose-throat (ENT) examinations were also performed. Results: The sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV) were 89.1%, 76.9%, 82% and 85.7% for RDI > 15, respectively. WRDI, WODI, W < 90% duration and Wmean SaO2 results were well correlated with the PSG results. In the sensitivity and specificity analysis, when diagnosis of sleep apnea was defined for different cut-off values of RDI of 5, 10 and 15, AUC (95%CI) were found to be 0.84 (0.74-0.93), 0.87 (95%CI: 0.79-0.94) and 0.91 (95%CI: 0.85-0.97), respectively. There were no statistically significant differences between Stage1+2/Wlight and Stage REM/WREM. The percentage of Stage 3 sleep differed significantly from the percentage of Wdeep. Total sleep times in PSG and WP showed no statistically significant difference. Total NREM duration and total WNREM duration did not differ either. Conclusion: The Watch-PAT device is helpful in detecting SDB with RDI > 15 in highway bus drivers, especially in drivers older than 45 years, but has limited value in drivers younger than 45 years old who have less risk for OSA. Therefore, WP can be used in the former group when PSG is not easily available. Citation: Yuceege M; Firat F; Demir A; Ardic S. Reliability of the Watch-PAT 200 in detecting sleep apnea in highway bus drivers. J Clin Sleep Med 2013;9(4):339-344.

  4. Data analysis of gravitational-wave signals from spinning neutron stars. III. Detection statistics and computational requirements

    NASA Astrophysics Data System (ADS)

    Jaranowski, Piotr; Królak, Andrzej

    2000-03-01

    We develop the analytic and numerical tools for data analysis of the continuous gravitational-wave signals from spinning neutron stars for ground-based laser interferometric detectors. The statistical data analysis method that we investigate is maximum likelihood detection which for the case of Gaussian noise reduces to matched filtering. We study in detail the statistical properties of the optimum functional that needs to be calculated in order to detect the gravitational-wave signal and estimate its parameters. We find it particularly useful to divide the parameter space into elementary cells such that the values of the optimal functional are statistically independent in different cells. We derive formulas for false alarm and detection probabilities both for the optimal and the suboptimal filters. We assess the computational requirements needed to do the signal search. We compare a number of criteria to build sufficiently accurate templates for our data analysis scheme. We verify the validity of our concepts and formulas by means of the Monte Carlo simulations. We present algorithms by which one can estimate the parameters of the continuous signals accurately. We find, confirming earlier work of other authors, that given a 100 Gflops computational power an all-sky search for observation time of 7 days and directed search for observation time of 120 days are possible whereas an all-sky search for 120 days of observation time is computationally prohibitive.

  5. The effect of a graphical interpretation of a statistic trend indicator (Trigg's Tracking Variable) on the detection of simulated changes.

    PubMed

    Kennedy, R R; Merry, A F

    2011-09-01

    Anaesthesia involves processing large amounts of information over time. One task of the anaesthetist is to detect substantive changes in physiological variables promptly and reliably. It has been previously demonstrated that a graphical trend display of historical data leads to more rapid detection of such changes. We examined the effect of a graphical indication of the magnitude of Trigg's Tracking Variable, a simple statistically based trend detection algorithm, on the accuracy and latency of the detection of changes in a micro-simulation. Ten anaesthetists each viewed 20 simulations with four variables displayed as the current value with a simple graphical trend display. Values for these variables were generated by a computer model, and updated every second; after a period of stability a change occurred to a new random value at least 10 units from baseline. In 50% of the simulations an indication of the rate of change was given by a five-level graphical representation of the value of Trigg's Tracking Variable. Participants were asked to indicate when they thought a change was occurring. Changes were detected 10.9% faster with the trend indicator present (mean 13.1 [SD 3.1] cycles vs 14.6 [SD 3.4] cycles; 95% confidence interval 0.4 to 2.5 cycles; P = 0.013). There was no difference in accuracy of detection (median with trend detection 97% [interquartile range 95 to 100%] vs without trend detection 100% [98 to 100%]; P = 0.8). We conclude that simple statistical trend detection may speed detection of changes during routine anaesthesia, even when a graphical trend display is present.
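
    Trigg's Tracking Variable is simple enough to state in a few lines: it is the ratio of the exponentially smoothed forecast error to the exponentially smoothed mean absolute deviation, so it lies in [-1, 1] and approaches +/-1 when errors become systematically one-sided. A sketch (the smoothing constant, step change and burn-in are illustrative, not the study's settings):

    ```python
    import numpy as np

    # Trigg's Tracking Variable: smoothed forecast error divided by the
    # smoothed mean absolute deviation; |T| stays near 0 for random
    # variation and approaches 1 when errors become systematically one-sided.
    def triggs_tracking_variable(x, alpha=0.2, burn_in=10):
        forecast = float(x[0])
        e_s = 0.0
        mad = float(np.std(x[:burn_in])) or 1e-9   # seed MAD from early noise
        out = []
        for value in x:
            err = value - forecast
            e_s = alpha * err + (1 - alpha) * e_s       # smoothed error
            mad = alpha * abs(err) + (1 - alpha) * mad  # smoothed |error|
            out.append(e_s / mad)
            forecast += alpha * err                     # exponential forecast
        return np.array(out)

    rng = np.random.default_rng(1)
    series = np.concatenate([np.full(60, 100.0), np.full(40, 112.0)])
    series += rng.normal(0, 2, series.size)             # noisy step at cycle 60
    T = triggs_tracking_variable(series)
    print("max |T| before change:", round(float(np.abs(T[:60]).max()), 2))
    print("max |T| after change: ", round(float(np.abs(T[60:]).max()), 2))
    ```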

  6. Implication of relationship between natural impacts and land use/land cover (LULC) changes of urban area in Mongolia

    NASA Astrophysics Data System (ADS)

    Gantumur, Byambakhuu; Wu, Falin; Zhao, Yan; Vandansambuu, Battsengel; Dalaibaatar, Enkhjargal; Itiritiphan, Fareda; Shaimurat, Dauryenbyek

    2017-10-01

    Urban growth can profoundly alter the urban landscape structure, ecosystem processes, and local climates. Timely and accurate information on the status and trends of urban ecosystems is critical to develop strategies for sustainable development and to improve the urban residential environment and living quality. Ulaanbaatar has urbanized very rapidly; herders and farmers, many of them migrating from rural areas, have played a large role in this urban expansion (sprawl). Today about 1.3 million residents, roughly 40% of the national population, live in the Ulaanbaatar region, and these human activities have put strong pressure on green environments. The aim of this study is therefore to detect changes in land use/land cover (LULC) and to estimate the trends of their areas using remote sensing and statistical methods. The analysis combines LULC change detection methods with remote sensing spectral indices, including the normalized difference vegetation index (NDVI), normalized difference water index (NDWI) and normalized difference built-up index (NDBI). In addition, it relates to the urban heat island (UHI) effect through land surface temperature (LST) and local climate issues. Statistical image-processing methods were used to define relations between the spectral indices and the change detection images, and regression analysis was used to project the time series trend into the future. Landsat (TM/ETM+/OLI) satellite images were used at five-year intervals over the period between 1990 and 2016. The experimental results show that LULC changes can be mapped for the present and projected a few years ahead, and that relations between LULC change and environmental conditions can be determined; the combination of remote sensing approaches with statistical analysis proved very useful for detecting LULC changes.
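
    The three spectral indices named here are simple band ratios computed per pixel. A sketch with numpy (the band arrays are random placeholders standing in for Landsat surface reflectance):

    ```python
    import numpy as np

    # Band-ratio indices named in the study, computed per pixel. Band arrays
    # are random placeholders for Landsat reflectance (Red, Green, NIR, SWIR1).
    def normalized_difference(a, b):
        s = a + b
        return (a - b) / np.where(s == 0, np.nan, s)

    rng = np.random.default_rng(42)
    red, green, nir, swir1 = rng.uniform(0.0, 0.6, (4, 256, 256))

    ndvi = normalized_difference(nir, red)     # vegetation: (NIR-Red)/(NIR+Red)
    ndwi = normalized_difference(green, nir)   # water (McFeeters): (G-NIR)/(G+NIR)
    ndbi = normalized_difference(swir1, nir)   # built-up: (SWIR1-NIR)/(SWIR1+NIR)
    print(f"mean NDVI={ndvi.mean():.3f}  NDWI={ndwi.mean():.3f}  "
          f"NDBI={ndbi.mean():.3f}")
    ```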

  7. Identification of Differentially Methylated Sites with Weak Methylation Effects

    PubMed Central

    Tran, Hong; Zhu, Hongxiao; Wu, Xiaowei; Kim, Gunjune; Clarke, Christopher R.; Larose, Hailey; Haak, David C.; Westwood, James H.; Zhang, Liqing

    2018-01-01

    Deoxyribonucleic acid (DNA) methylation is an epigenetic alteration crucial for regulating stress responses. Identifying large-scale DNA methylation at single nucleotide resolution is made possible by whole genome bisulfite sequencing. An essential task following the generation of bisulfite sequencing data is to detect differentially methylated cytosines (DMCs) among treatments. Most statistical methods for DMC detection do not consider the dependency of methylation patterns across the genome, thus possibly inflating type I error. Furthermore, small sample sizes and weak methylation effects among different phenotype categories make it difficult for these statistical methods to accurately detect DMCs. To address these issues, the wavelet-based functional mixed model (WFMM) was introduced to detect DMCs. To further examine the performance of WFMM in detecting weak differential methylation events, we used both simulated and empirical data and compare WFMM performance to a popular DMC detection tool methylKit. Analyses of simulated data that replicated the effects of the herbicide glyphosate on DNA methylation in Arabidopsis thaliana show that WFMM results in higher sensitivity and specificity in detecting DMCs compared to methylKit, especially when the methylation differences among phenotype groups are small. Moreover, the performance of WFMM is robust with respect to small sample sizes, making it particularly attractive considering the current high costs of bisulfite sequencing. Analysis of empirical Arabidopsis thaliana data under varying glyphosate dosages, and the analysis of monozygotic (MZ) twins who have different pain sensitivities—both datasets have weak methylation effects of <1%—show that WFMM can identify more relevant DMCs related to the phenotype of interest than methylKit. Differentially methylated regions (DMRs) are genomic regions with different DNA methylation status across biological samples. DMRs and DMCs are essentially the same concepts, with the only difference being how methylation information across the genome is summarized. If methylation levels are determined by grouping neighboring cytosine sites, then they are DMRs; if methylation levels are calculated based on single cytosines, they are DMCs. PMID:29419727
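
    Neither WFMM nor methylKit is reproduced here, but the elementary unit of DMC detection, comparing methylated versus unmethylated read counts at a single cytosine between two conditions, can be sketched as a per-site Fisher's exact test. This deliberately simple baseline ignores the across-genome dependency and weak-effect issues the authors highlight (all counts are hypothetical):

    ```python
    from scipy.stats import fisher_exact

    # Per-cytosine 2x2 test: methylated vs unmethylated read counts in two
    # conditions. A real analysis must also model dependency between
    # neighbouring sites and correct for multiple testing, which is the
    # gap WFMM is designed to address.
    sites = {
        "chr1:1042": ((38, 12), (21, 29)),   # (meth, unmeth): treated, control
        "chr1:1057": ((25, 25), (24, 26)),
    }
    for site, (treated, control) in sites.items():
        odds, p = fisher_exact([list(treated), list(control)])
        print(f"{site}: OR={odds:.2f}, p={p:.3g}")
    ```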

  8. [Efficacy and safety of ultrasound-guided or neurostimulator-guided bilateral axillary brachial plexus block].

    PubMed

    Xu, C S; Zhao, X L; Zhou, H B; Qu, Z J; Yang, Q G; Wang, H J; Wang, G

    2017-10-17

    Objective: To explore the efficacy and safety of bilateral axillary brachial plexus block under the guidance of ultrasound or a neurostimulator. Methods: From February 2012 to April 2014, 120 patients undergoing bilateral hand/forearm surgery in Beijing Jishuitan Hospital were enrolled and anaesthetized with bilateral axillary brachial plexus block. All patients were randomly divided into two groups using a random number table: the ultrasound-guided group (group U, n = 60) and the neurostimulator-guided group (group N, n = 60). The block was performed with 0.5% ropivacaine. Patients' age, sex and operation duration were recorded. Moreover, success rate, performance time, onset of sensory and motor block, performance pain, patient satisfaction and the incidence of related complications were also documented. Venous samples were collected at selected time points and the total plasma concentrations of ropivacaine were analyzed with HPLC. Results: The performance time, the onset of sensory block and the onset of motor block in group U were (8.2 ± 1.5), (14.2 ± 2.2) and (24.0 ± 3.5) min respectively, which were markedly shorter than those in group N ((14.6 ± 3.9), (19.9 ± 3.8) and (28.8 ± 4.2) min, respectively), and the differences were statistically significant (t = 11.74, 10.09 and 6.73, respectively; all P < 0.01). The performance pain scores differed significantly between the groups (25.5 ± 13.2 in group N vs 31.7 ± 11.2 in group U; t = 2.856, P < 0.05). The patient satisfaction rate in group U was 95.0%, significantly higher than in group N (83.3%; χ² = 4.227, P < 0.05). Fifty minutes after the block, the total plasma concentration of ropivacaine in group U was (1.76 ± 0.48) mg/L, significantly lower than in group N ((1.88 ± 0.53) mg/L; t = 2.43, P < 0.05), while no significant differences were detected at the other time points between the two groups (P > 0.05). No supplementary analgesic was required and no other anaesthesia methods were applied. No complications were detected perioperatively. Conclusions: Bilateral axillary brachial plexus blocks under the guidance of ultrasound or a neurostimulator are both effective and safe for bilateral hand/forearm surgery. However, the ultrasound-guided block may be more clinically beneficial because of its shorter performance time, faster onset and higher patient satisfaction.
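
    The between-group comparisons above can be re-derived from the published summary statistics alone. A sketch for the performance-time comparison, assuming a two-sample t-test with n = 60 per group (Welch's variant is used here, so the t value differs slightly from the reported Student's t):

    ```python
    from scipy.stats import ttest_ind_from_stats

    # Performance time recomputed from reported summary statistics
    # (group U: 8.2 +/- 1.5 min, group N: 14.6 +/- 3.9 min, n = 60 each).
    # The sign of t reflects the group order passed in.
    t, p = ttest_ind_from_stats(mean1=8.2, std1=1.5, nobs1=60,
                                mean2=14.6, std2=3.9, nobs2=60,
                                equal_var=False)   # Welch's t-test
    print(f"t = {t:.2f}, p = {p:.2e}")
    ```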

  9. Determination of quality parameters from statistical analysis of routine TLD dosimetry data.

    PubMed

    German, U; Weinstein, M; Pelled, O

    2006-01-01

    Following the as low as reasonably achievable (ALARA) practice, there is a need to measure very low doses, of the same order of magnitude as the natural background, and the limits of detection of the dosimetry systems. The different contributions of the background signals to the total zero dose reading of thermoluminescence dosemeter (TLD) cards were analysed by using the common basic definitions of statistical indicators: the critical level (L(C)), the detection limit (L(D)) and the determination limit (L(Q)). These key statistical parameters for the system operated at NRC-Negev were quantified, based on the history of readings of the calibration cards in use. The electronic noise seems to play a minor role, but the reading of the Teflon coating (without the presence of a TLD crystal) gave a significant contribution.
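
    Currie's characteristic limits used here have simple closed forms when estimated from the spread of repeated zero-dose (blank) readings: L(C) = k·σ0 as the decision threshold, L(D) ≈ 2k·σ0 as the smallest reliably detectable net signal (equal-variance case), and L(Q) = 10·σ0 as the quantification limit. A sketch with synthetic blank readings (the distribution parameters are placeholders):

    ```python
    import numpy as np

    # Currie-style characteristic limits from repeated zero-dose (blank)
    # TLD card readings; the blank distribution below is synthetic.
    rng = np.random.default_rng(2)
    blank = rng.normal(5.0, 0.8, 200)     # background signal, arbitrary units

    sigma0 = blank.std(ddof=1)
    k = 1.645                              # alpha = beta = 0.05, one-sided
    L_C = k * sigma0                       # critical level: decision threshold
    L_D = 2 * k * sigma0                   # detection limit (equal variances)
    L_Q = 10 * sigma0                      # determination limit (10% rel. sigma)
    print(f"L_C={L_C:.2f}  L_D={L_D:.2f}  L_Q={L_Q:.2f}  (net signal units)")
    ```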

  10. [Accuracy of computer aided measurement for detecting dental proximal caries lesions in images of cone-beam computed tomography].

    PubMed

    Zhang, Z L; Li, J P; Li, G; Ma, X C

    2017-02-09

    Objective: To establish and validate a computer program to aid the detection of dental proximal caries in cone-beam computed tomography (CBCT) images. Methods: According to the characteristics of caries lesions in X-ray images, a computer-aided detection program for proximal caries was developed with Matlab and Visual C++. The process for caries lesion detection comprised image import and preprocessing, measurement of the average gray value of the air area, selection of a region of interest and calculation of its gray value, and delineation of the caries areas. The program was used to examine 90 proximal surfaces from 45 extracted human teeth collected from Peking University School and Hospital of Stomatology. The teeth were scanned with a CBCT scanner (Promax 3D). The proximal surfaces of the teeth were assessed both by the caries detection program and by a human observer, who scored the extent of lesions on a 6-level scale. With histologic examination serving as the reference standard, the performances of the caries detection program and the human observer were assessed with receiver operating characteristic (ROC) curves. Student's t-test was used to analyze the difference in the areas under the ROC curves (AUC) between the caries detection program and the human observer. Spearman correlation coefficients were used to analyze the detection accuracy for caries depth. Results: For the diagnosis of proximal caries in CBCT images, the AUC values of the human observer and the caries detection program were 0.632 and 0.703, respectively, a statistically significant difference (P = 0.023). The correlation between program performance and the gold standard (rs = 0.525) was higher than that between observer performance and the gold standard (rs = 0.457), and the difference between the correlation coefficients was statistically significant (P < 0.001). Conclusions: The program that automatically detects dental proximal caries lesions could improve the diagnostic value of CBCT images.
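
    Both summary statistics used here are one-liners given per-surface data. A sketch with hypothetical readings (an ordinal 6-level score against a binary histology gold standard, plus rank correlation with lesion depth):

    ```python
    import numpy as np
    from scipy.stats import spearmanr
    from sklearn.metrics import roc_auc_score

    # AUC of an ordinal 6-level score against binary histology, plus
    # Spearman rank correlation with lesion depth. All readings below
    # are hypothetical stand-ins for the study's data.
    truth = np.array([0, 0, 1, 1, 1, 0, 1, 0, 1, 1])  # histology: lesion?
    score = np.array([1, 2, 4, 3, 5, 2, 6, 1, 3, 5])  # 6-level confidence
    depth = np.array([0, 0, 2, 1, 3, 0, 3, 0, 1, 2])  # histologic depth

    print("AUC:", roc_auc_score(truth, score))
    rho, p = spearmanr(score, depth)
    print(f"Spearman r_s = {rho:.3f}, p = {p:.3g}")
    ```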

  11. Coronal Holes and Solar f -Mode Wave Scattering Off Linear Boundaries

    NASA Astrophysics Data System (ADS)

    Hess Webber, Shea A.

    2016-11-01

    Coronal holes (CHs) are solar atmospheric features that have reduced emission in the extreme ultraviolet (EUV) spectrum due to decreased plasma density along open magnetic field lines. CHs are the source of the fast solar wind, can influence other solar activity, and track the solar cycle. Our interest in them deals with boundary detection near the solar surface. Detecting CH boundaries is important for estimating their size and tracking their evolution through time, as well as for comparing the physical properties within and outside of the feature. In this thesis, we (1) investigate CHs using statistical properties and image processing techniques on EUV images to detect CH boundaries in the low corona and chromosphere. SOHO/EIT data is used to locate polar CH boundaries on the solar limb, which are then tracked through two solar cycles. Additionally, we develop an edge-detection algorithm that we use on SDO/AIA data of a polar hole extension with an approximately linear boundary. These locations are used later to inform part of the helioseismic investigation; (2) develop a local time-distance (TD) helioseismology technique that can be used to detect CH boundary signatures at the photospheric level. We employ a new averaging scheme that makes use of the quasi-linear topology of elongated scattering regions, and create simulated data to test the new technique and compare results of some associated assumptions. This method enhances the wave propagation signal in the direction perpendicular to the linear feature and reduces the computational time of the TD analysis. We also apply a new statistical analysis of the significance of differences between the TD results; and (3) apply the TD techniques to solar CH data from SDO/HMI. The data correspond to the AIA data used in the edge-detection algorithm on EUV images. We look for statistically significant differences between the TD results inside and outside the CH region. In investigation (1), we found that the polar CH areas did not change significantly between minima, even though the magnetic field strength weakened. The results of (2) indicate that TD helioseismology techniques can be extended to make use of feature symmetry in the domain. The linear technique used here produces results that differ between a linear scattering region and a circular scattering region, shown using the simulated data algorithm. This suggests that using usual TD methods on scattering regions that are radially asymmetric may produce results with signatures of the anisotropy. The results of (1) and (3) indicate that the TD signal within our CH is statistically significantly different compared to unrelated quiet sun results. Surprisingly, the TD results in the quiet sun near the CH boundary also show significant differences compared to the separate quiet sun.

  12. [Investigation of norovirus infection incidence among 0-5 years old children with acute gastroenteritis admitted to two different hospitals in ankara, Turkey].

    PubMed

    Altay, Aylin; Bozdayı, Gülendam; Meral, Melda; Dallar Bilge, Yıldız; Dalgıç, Buket; Ozkan, Seçil; Ahmed, Kamruddin

    2013-01-01

    Norovirus causes severe gastroenteritis requiring hospitalization, especially in children less than five years of age, in both developed and developing countries. We therefore aimed to investigate the incidence of norovirus (NoV) in 0-5 years old children with acute gastroenteritis in two large hospitals in Ankara, Turkey. Stool samples were obtained from 1000 children (413 female, 587 male) between 0-5 years old with acute gastroenteritis who presented to the Department of Paediatrics, Ministry of Health Ankara Training and Education Hospital and the affiliated hospital of Gazi University Faculty of Medicine between October 2004 and June 2011. Antigens of the norovirus GI and GII genogroups in the stool specimens were detected by ELISA (RIDASCREEN® Norovirus (C1401) 3rd Generation, R-Biopharm, Germany). Norovirus GI and GII antigens were detected in a total of 141 (14.1%) samples, of which 62 (15%) were from females and 79 (13.5%) from males, with no statistically significant difference (p > 0.05). The highest NoV positivity was detected in children between 12-23 months of age (17.1%), but there was no statistically significant association between ELISA positivity and age (p > 0.05). The NoV detection rate was highest in 2007 (18.4%) and 2009 (18%), and the difference in ELISA positivity among the study years was not statistically significant (p > 0.05). The prevalences of norovirus infection in spring, summer, autumn and winter were 13.8%, 17.7%, 14.7% and 11.2%, respectively, so no seasonal variation was found in the incidence of norovirus infection. However, when the monthly prevalence was analyzed, a statistically significant difference was found (p < 0.05) between the rates of norovirus infection in July (24.2%) and December (4.1%). Regarding clinical symptoms, all 141 patients (100%) had diarrhoea, while 72 (51.1%) had vomiting. Stool samples were also evaluated for the presence of parasitic and bacterial agents. Coinfection with parasites was detected in 3.3% (4/122; norovirus + Entamoeba histolytica in three cases, norovirus + Enterobius vermicularis in one case), while no pathogenic bacteria were isolated from norovirus-positive stool samples. The prevalence rate of 14.1% for NoV GI/GII infection found in this retrospective study of 0-5 years old children in Ankara over the 2004-2011 period was thought to reflect the regional situation and to contribute to national epidemiological data. We anticipate that the incidence of norovirus will increase in 0-5 year old children as a result of the increasing use of rotavirus vaccine in Turkish children. It was concluded that NoV antigen detection tests should be used in routine laboratories for the appropriate diagnosis of sporadic and/or epidemic norovirus infections.

  13. Optimization of Statistical Methods Impact on Quantitative Proteomics Data.

    PubMed

    Pursiheimo, Anna; Vehmas, Anni P; Afzal, Saira; Suomi, Tomi; Chand, Thaman; Strauss, Leena; Poutanen, Matti; Rokka, Anne; Corthals, Garry L; Elo, Laura L

    2015-10-02

    As tools for quantitative label-free mass spectrometry (MS) rapidly develop, a consensus about best practices is not apparent. In the work described here we compared popular statistical methods for detecting differential protein expression from quantitative MS data, using both controlled experiments, with known quantitative differences for specific proteins used as standards, and "real" experiments, where differences in protein abundance are not known a priori. Our results suggest that data-driven reproducibility-optimization can consistently produce reliable differential expression rankings for label-free proteome tools and is straightforward in its application.

  14. [Skin cell response after jellyfish sting].

    PubMed

    Adamicová, Katarína; Výbohová, Desanka; Fetisovová, Želmíra; Nováková, Elena; Mellová, Yvetta

    2016-01-01

    Jellyfish stings are not commonly encountered in central European health care laboratories, but holiday seaside tourism brings different and unusual disease presentations to our workplaces. Seawater sports and leisure activities are commonly associated with jellyfish stings and with changes in the skin that have not been precisely described. The authors focused their research on the detection of morphological and quantitative changes of selected inflammatory cells in skin biopsies of a 59-year-old woman taken ten days after a jellyfish sting. To allow a comparison of findings, biopsies were performed on both lesional and nonlesional skin. Both skin excisions were tested by immunohistochemical methods for CD68, CD163, CD30, CD4, CD3, CD8, CD20 and CD1a, to detect histiocytes as well as several clones of lymphocytes and Langerhans cells (the antigen-presenting cells of the skin), and with CD117, toluidine blue and chloroacetate esterase to detect mastocytes and neutrophils. The material was also tested by immunofluorescent methods for IgA, IgM, IgG, C3, C4, albumin and fibrinogen. Representative view-fields were documented with a Leica DFC 420 C microscope camera. Registered photos from both skin samples were processed by morphometric analysis with the Vision Assistant software. A Student's t-test was used for statistical analysis of the results. Mean counts of the individual cell types in the lesional versus nonlesional samples were as follows: CD117: 2.64/0.37; CD68: 6.86/1.63; CD163: 3.13/2.23; CD30: 1.36/0.02; CD4: 3.51/0.32; CD8: 8.22/0.50; CD3: 10.69/0.66; CD20: 0.56/0.66; CD1a: 7.97/0.47. A generally mild elevation of eosinophils in lesional skin was also detected. The increased values of the tested cells seen in the excision from lesional skin compared with nonlesional skin were statistically significant in eight cases, at levels from p = 0.033 to p = 0.001; the only difference that was not statistically significant was in the group of CD163+ histiocytes. The authors quantified inflammatory cells in lesional skin after a jellyfish sting and compared them with the counts in the nonlesional skin of the same patient. Statistically significant differences were seen in the levels of the selected inflammatory cells, and the numerically documented changes of cellularity in the inflammatory focus were caused by a hypersensitivity reaction in the ten days after the jellyfish injury.

  15. Exploiting the full power of temporal gene expression profiling through a new statistical test: application to the analysis of muscular dystrophy data.

    PubMed

    Vinciotti, Veronica; Liu, Xiaohui; Turk, Rolf; de Meijer, Emile J; 't Hoen, Peter A C

    2006-04-03

    The identification of biologically interesting genes in a temporal expression profiling dataset is challenging and complicated by high levels of experimental noise. Most statistical methods used in the literature do not fully exploit the temporal ordering in the dataset and are not suited to the case where temporal profiles are measured for a number of different biological conditions. We present a statistical test that makes explicit use of the temporal order in the data by fitting polynomial functions to the temporal profile of each gene and for each biological condition. A Hotelling T2-statistic is derived to detect the genes for which the parameters of these polynomials are significantly different from each other. We validate the temporal Hotelling T2-test on muscular gene expression data from four mouse strains which were profiled at different ages: dystrophin-, beta-sarcoglycan and gamma-sarcoglycan deficient mice, and wild-type mice. The first three are animal models for different muscular dystrophies. Extensive biological validation shows that the method is capable of finding genes with temporal profiles significantly different across the four strains, as well as identifying potential biomarkers for each form of the disease. The added value of the temporal test compared to an identical test which does not make use of temporal ordering is demonstrated via a simulation study, and through confirmation of the expression profiles from selected genes by quantitative PCR experiments. The proposed method maximises the detection of the biologically interesting genes, whilst minimising false detections. The temporal Hotelling T2-test is capable of finding relatively small and robust sets of genes that display different temporal profiles between the conditions of interest. The test is simple, it can be used on gene expression data generated from any experimental design and for any number of conditions, and it allows fast interpretation of the temporal behaviour of genes. The R code is available from V.V. The microarray data have been submitted to GEO under series GSE1574 and GSE3523.
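
    The core of the test is standard multivariate machinery: fit a low-order polynomial to each replicate's temporal profile, then compare the coefficient vectors between conditions with Hotelling's T2. A two-condition sketch on simulated profiles (the published test generalises this to several conditions and gene-wise application):

    ```python
    import numpy as np
    from scipy.stats import f as f_dist

    # Fit a quadratic to each replicate's temporal profile, then test
    # whether coefficient vectors differ via two-sample Hotelling's T^2.
    rng = np.random.default_rng(3)
    t = np.linspace(0, 1, 8)                      # 8 time points

    def coef_matrix(offset, n_rep=6):
        # one row of polynomial coefficients per simulated replicate
        profiles = offset * t**2 + rng.normal(0, 0.3, (n_rep, t.size))
        return np.array([np.polyfit(t, y, 2) for y in profiles])

    A, B = coef_matrix(0.0), coef_matrix(1.5)     # condition A vs B
    n1, n2, k = len(A), len(B), A.shape[1]
    d = A.mean(0) - B.mean(0)
    S = ((n1 - 1) * np.cov(A.T) + (n2 - 1) * np.cov(B.T)) / (n1 + n2 - 2)
    T2 = (n1 * n2) / (n1 + n2) * d @ np.linalg.solve(S, d)
    F = T2 * (n1 + n2 - k - 1) / ((n1 + n2 - 2) * k)
    p_val = f_dist.sf(F, k, n1 + n2 - k - 1)
    print(f"T^2 = {T2:.2f}, F = {F:.2f}, p = {p_val:.3g}")
    ```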

  17. Prospective multi-center study of an automatic online seizure detection system for epilepsy monitoring units.

    PubMed

    Fürbass, F; Ossenblok, P; Hartmann, M; Perko, H; Skupch, A M; Lindinger, G; Elezi, L; Pataraia, E; Colon, A J; Baumgartner, C; Kluge, T

    2015-06-01

    A method for automatic detection of epileptic seizures in long-term scalp-EEG recordings called EpiScan is presented. EpiScan is used as an alarm device to notify the medical staff of epilepsy monitoring units (EMUs) in case of a seizure. A prospective multi-center study was performed in three EMUs including 205 patients. A comparison between EpiScan and the Persyst seizure detector on the prospective data is presented, along with the detection results of EpiScan on retrospective EEG data of 310 patients and on the publicly available CHB-MIT dataset. A detection sensitivity of 81% was reached for unequivocal electrographic seizures, with a false alarm rate of only 7 per day. No statistically significant differences in the detection sensitivities could be found between the centers. The comparison with the Persyst seizure detector showed a lower false alarm rate for EpiScan, but the difference was not statistically significant. The automatic seizure detection method EpiScan thus showed high sensitivity and a low false alarm rate in a prospective multi-center study on a large number of patients. Its application as a seizure alarm device in EMUs becomes feasible and will raise the efficiency of video-EEG monitoring and the safety of patients. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  18. Detecting altered connectivity patterns in HIV associated neurocognitive impairment using mutual connectivity analysis

    NASA Astrophysics Data System (ADS)

    Abidin, Anas Zainul; D'Souza, Adora M.; Nagarajan, Mahesh B.; Wismüller, Axel

    2016-03-01

    The use of functional magnetic resonance imaging (fMRI) has provided interesting insights into our understanding of the brain. In clinical setups these scans have been used to detect and study changes in brain network properties in various neurological disorders. A large percentage of subjects infected with HIV present cognitive deficits, known as HIV-associated neurocognitive disorder (HAND). In this study we propose to use our novel technique, named Mutual Connectivity Analysis (MCA), to detect differences in brain networks between subjects with and without HIV infection. Resting state functional MRI scans acquired from 10 subjects (5 HIV+ and 5 HIV-) were subjected to standard preprocessing routines. Subsequently, the average time series for each brain region of the Automated Anatomical Labeling (AAL) atlas was extracted and used with the MCA framework to obtain a graph characterizing the interactions between regions. The network graphs obtained for different subjects were then compared using Network-Based Statistics (NBS), an approach that detects differences between graph edges while controlling the family-wise error rate under mass univariate testing. Applying this approach to the graphs obtained yields a single network encompassing 42 nodes and 65 edges that is significantly different between the two subject groups. Specifically, connections to the regions in and around the basal ganglia are significantly decreased, and some nodes corresponding to the posterior cingulate cortex are also affected. These results are in line with our current understanding of the pathophysiological mechanisms of HAND and with other HIV-related fMRI connectivity studies. Hence, we illustrate the applicability of our novel approach with network-based statistics in a clinical case-control study to detect differences in connectivity patterns.

  19. An incremental knowledge assimilation system (IKAS) for mine detection

    NASA Astrophysics Data System (ADS)

    Porway, Jake; Raju, Chaitanya; Varadarajan, Karthik Mahesh; Nguyen, Hieu; Yadegar, Joseph

    2010-04-01

    In this paper we present an adaptive incremental learning system for underwater mine detection and classification that utilizes statistical models of seabed texture and an adaptive nearest-neighbor classifier to identify varied underwater targets in many different environments. The first stage of processing uses our Background Adaptive ANomaly detector (BAAN), which identifies statistically likely target regions using Gabor filter responses over the image. Using this information, BAAN classifies the background type and updates its detection using background-specific parameters. To perform classification, a Fully Adaptive Nearest Neighbor (FAAN) determines the best label for each detection. FAAN uses an extremely fast version of Nearest Neighbor to find the most likely label for the target. The classifier perpetually assimilates new and relevant information into its existing knowledge database in an incremental fashion, allowing improved classification accuracy and capturing concept drift in the target classes. Experiments show that the system achieves >90% classification accuracy on underwater mine detection tasks performed on synthesized datasets provided by the Office of Naval Research. We have also demonstrated that the system can incrementally improve its detection accuracy by constantly learning from new samples.

  20. Detecting Mixtures from Structural Model Differences Using Latent Variable Mixture Modeling: A Comparison of Relative Model Fit Statistics

    ERIC Educational Resources Information Center

    Henson, James M.; Reise, Steven P.; Kim, Kevin H.

    2007-01-01

    The accuracy of structural model parameter estimates in latent variable mixture modeling was explored with a 3 (sample size) × 3 (exogenous latent mean difference) × 3 (endogenous latent mean difference) × 3 (correlation between factors) × 3 (mixture proportions) factorial design. In addition, the efficacy of several…

  1. Evaluation of 3M™ Molecular Detection Assay (MDA) Listeria for the Detection of Listeria species in Selected Foods and Environmental Surfaces: Collaborative Study, First Action 2014.06.

    PubMed

    Bird, Patrick; Flannery, Jonathan; Crowley, Erin; Agin, James; Goins, David; Monteroso, Lisa; Benesh, DeAnn

    2015-01-01

    The 3M™ Molecular Detection Assay (MDA) Listeria is used with the 3M Molecular Detection System for the detection of Listeria species in food, food-related, and environmental samples after enrichment. The assay utilizes loop-mediated isothermal amplification to rapidly amplify Listeria target DNA with high specificity and sensitivity, combined with bioluminescence to detect the amplification. The 3M MDA Listeria method was evaluated using an unpaired study design in a multilaboratory collaborative study and compared to the AOAC Official Method of Analysis (OMA) 993.12, Listeria monocytogenes in Milk and Dairy Products, reference method for the detection of Listeria species in full-fat (4% milk fat) cottage cheese (25 g test portions). A total of 15 laboratories located in the continental United States and Canada participated. Each matrix had three inoculation levels: an uninoculated control level (0 CFU/test portion), and two levels artificially contaminated with Listeria monocytogenes, a low inoculum level (0.2-2 CFU/test portion) and a high inoculum level (2-5 CFU/test portion) using nonheat-stressed cells. In total, 792 unpaired replicate portions were analyzed. Statistical analysis was conducted according to the probability of detection (POD) model. Results obtained for the low inoculum level test portions produced a difference in cross-laboratory POD value of -0.07 with a 95% confidence interval of (-0.19, 0.06). No statistically significant differences were observed in the number of positive samples detected by the 3M MDA Listeria method versus the AOAC OMA method.

  2. [Comparison between rapid detection method of enzyme substrate technique and multiple-tube fermentation technique in water coliform bacteria detection].

    PubMed

    Sun, Zong-ke; Wu, Rong; Ding, Pei; Xue, Jin-Rong

    2006-07-01

    To compare the rapid enzyme substrate technique with the multiple-tube fermentation technique for the detection of coliform bacteria in water. Inoculated and real water samples were used to compare the equivalence and false-positive rates of the two methods. The results demonstrate that the enzyme substrate technique is equivalent to the multiple-tube fermentation technique (P = 0.059) and that the false-positive rates of the two methods show no statistically significant difference. It is suggested that the enzyme substrate technique can be used as a standard method for evaluating the microbiological safety of water.

  3. Detection of calcification clusters in digital breast tomosynthesis slices at different dose levels utilizing a SRSAR reconstruction and JAFROC

    NASA Astrophysics Data System (ADS)

    Timberg, P.; Dustler, M.; Petersson, H.; Tingberg, A.; Zackrisson, S.

    2015-03-01

    Purpose: To investigate detection performance for calcification clusters in reconstructed digital breast tomosynthesis (DBT) slices at different dose levels using a Super Resolution and Statistical Artifact Reduction (SRSAR) reconstruction method. Method: Simulated calcifications with an irregular profile (0.2 mm diameter) were combined to form clusters that were added to projection images (1-3 per abnormal image) acquired on a DBT system (Mammomat Inspiration, Siemens). The projection images were dose-reduced by software to form 35 abnormal cases and 25 normal cases as if acquired at the 100%, 75% and 50% dose levels (AGD of approximately 1.6 mGy for a 53 mm standard breast, measured according to EUREF v0.15). A standard FBP and an SRSAR reconstruction method (utilizing IRIS iterative reconstruction filters, and outlier detection using maximum-intensity and average-intensity projections) were used to reconstruct single central slices for a free-response task (60 images per observer and dose level). Six observers participated; their task was to detect the clusters and assign confidence ratings in randomly presented images from the whole image set (balanced by dose level). Reading sessions were separated by one week to reduce possible memory bias. The outcome was analyzed for statistical differences using jackknifed alternative free-response receiver operating characteristics (JAFROC). Results: The results indicate that it is possible to reduce the dose by 50% with SRSAR without jeopardizing cluster detection. Conclusions: Detection performance for calcification clusters can be maintained at a lower dose level by using SRSAR reconstruction.

  4. Detecting transitions in protein dynamics using a recurrence quantification analysis based bootstrap method.

    PubMed

    Karain, Wael I

    2017-11-28

    Proteins undergo conformational transitions over different time scales. These transitions are closely intertwined with the protein's function. Numerous standard techniques such as principal component analysis are used to detect these transitions in molecular dynamics simulations. In this work, we add a new method that detects transitions in dynamics based on the recurrences in the dynamical system, combining bootstrapping and recurrence quantification analysis. We start from the assumption that a protein has a "baseline" recurrence structure over a given period of time. Any statistically significant deviation from this recurrence structure, as inferred from complexity measures provided by recurrence quantification analysis, is considered a transition in the dynamics of the protein. We apply this technique to a 132 ns long molecular dynamics simulation of the β-Lactamase Inhibitory Protein BLIP. We are able to detect conformational transitions in the nanosecond range in the recurrence dynamics of the BLIP protein during the simulation. The results compare favorably to those extracted using the principal component analysis technique. The recurrence quantification analysis based bootstrap technique is able to detect transitions between different dynamics states for a protein over different time scales. It is not limited to linear dynamics regimes and can be generalized to any time scale. It also has the potential to be used to cluster frames in molecular dynamics trajectories according to the nature of their recurrence dynamics. One shortcoming of this method is the need for time windows large enough to ensure good statistical quality of the recurrence complexity measures needed to detect the transitions.
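
    A minimal recurrence-rate version of the idea can be sketched as follows: compute a recurrence measure per time window, then flag windows whose value falls outside a bootstrap interval built from a baseline period. A real RQA analysis would use richer measures (determinism, laminarity) on embedded trajectories rather than this 1-D toy signal:

    ```python
    import numpy as np

    # Toy signal: a dynamical change (variance shift) after sample 600.
    rng = np.random.default_rng(4)
    x = np.concatenate([rng.normal(0, 1, 600), rng.normal(0, 2.5, 200)])

    def recurrence_rate(window, eps=0.5):
        dist = np.abs(window[:, None] - window[None, :])  # pairwise distances
        return (dist < eps).mean()                        # fraction recurrent

    win = 100
    rates = np.array([recurrence_rate(x[i:i + win])
                      for i in range(0, x.size - win + 1, win)])

    # Bootstrap the baseline (first 6 windows assumed transition-free).
    boot = [recurrence_rate(rng.choice(x[:600], win)) for _ in range(500)]
    lo, hi = np.percentile(boot, [0.5, 99.5])
    print("window rates:", rates.round(3))
    print(f"baseline 99% band: ({lo:.3f}, {hi:.3f}); "
          "windows outside the band are flagged as transitions")
    ```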

  5. Development of a methodology for the detection of hospital financial outliers using information systems.

    PubMed

    Okada, Sachiko; Nagase, Keisuke; Ito, Ayako; Ando, Fumihiko; Nakagawa, Yoshiaki; Okamoto, Kazuya; Kume, Naoto; Takemura, Tadamasa; Kuroda, Tomohiro; Yoshihara, Hiroyuki

    2014-01-01

    Comparison of financial indices helps to illustrate differences in operations and efficiency among similar hospitals. Outlier data tend to influence statistical indices, and so detection of outliers is desirable. Development of a methodology for financial outlier detection using information systems will help to reduce the time and effort required, eliminate the subjective elements in detection of outlier data, and improve the efficiency and quality of analysis. The purpose of this research was to develop such a methodology. Financial outliers were defined based on a case model. An outlier-detection method using the distances between cases in multi-dimensional space is proposed. Experiments using three diagnosis groups indicated successful detection of cases for which the profitability and income structure differed from other cases. Therefore, the method proposed here can be used to detect outliers. Copyright © 2013 John Wiley & Sons, Ltd.
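
    The paper's core device, distances between cases in multi-dimensional index space, can be sketched with the squared Mahalanobis distance, one natural choice of distance that accounts for correlated indices. The index values below are synthetic, and a robust covariance estimate would be preferable in practice:

    ```python
    import numpy as np
    from scipy.stats import chi2

    # Distance-based outlier flagging over multi-dimensional financial indices.
    rng = np.random.default_rng(5)
    cases = rng.normal(0, 1, (120, 4))              # 120 cases x 4 indices
    cases[7] += np.array([4.0, -3.5, 0.0, 5.0])     # implant one outlier

    mu = cases.mean(0)
    S_inv = np.linalg.inv(np.cov(cases.T))
    diff = cases - mu
    d2 = np.einsum('ij,jk,ik->i', diff, S_inv, diff)  # squared Mahalanobis

    threshold = chi2.ppf(0.999, df=4)               # 99.9% chi-square quantile
    print("flagged cases:", np.where(d2 > threshold)[0])
    ```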

  6. Detection of Clostridium difficile infection clusters, using the temporal scan statistic, in a community hospital in southern Ontario, Canada, 2006-2011.

    PubMed

    Faires, Meredith C; Pearl, David L; Ciccotelli, William A; Berke, Olaf; Reid-Smith, Richard J; Weese, J Scott

    2014-05-12

    In hospitals, Clostridium difficile infection (CDI) surveillance relies on unvalidated guidelines or threshold criteria to identify outbreaks. This can result in false-positive and -negative cluster alarms. The application of statistical methods to identify and understand CDI clusters may be a useful alternative or complement to standard surveillance techniques. The objectives of this study were to investigate the utility of the temporal scan statistic for detecting CDI clusters and determine if there are significant differences in the rate of CDI cases by month, season, and year in a community hospital. Bacteriology reports of patients identified with a CDI from August 2006 to February 2011 were collected. For patients detected with CDI from March 2010 to February 2011, stool specimens were obtained. Clostridium difficile isolates were characterized by ribotyping and investigated for the presence of toxin genes by PCR. CDI clusters were investigated using a retrospective temporal scan test statistic. Statistically significant clusters were compared to known CDI outbreaks within the hospital. A negative binomial regression model was used to identify associations between year, season, month and the rate of CDI cases. Overall, 86 CDI cases were identified. Eighteen specimens were analyzed and nine ribotypes were classified with ribotype 027 (n = 6) the most prevalent. The temporal scan statistic identified significant CDI clusters at the hospital (n = 5), service (n = 6), and ward (n = 4) levels (P ≤ 0.05). Three clusters were concordant with the one C. difficile outbreak identified by hospital personnel. Two clusters were identified as potential outbreaks. The negative binomial model indicated years 2007-2010 (P ≤ 0.05) had decreased CDI rates compared to 2006 and spring had an increased CDI rate compared to the fall (P = 0.023). Application of the temporal scan statistic identified several clusters, including potential outbreaks not detected by hospital personnel. The identification of time periods with decreased or increased CDI rates may have been a result of specific hospital events. Understanding the clustering of CDIs can aid in the interpretation of surveillance data and lead to the development of better early detection systems.
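
    The regression component of the analysis is straightforward to set up with statsmodels. A sketch with simulated monthly counts over the study period (the counts are placeholders, and the real model would typically include an exposure offset such as patient-days):

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    # Negative binomial regression of monthly CDI counts on year and season,
    # with fall as the reference season, mirroring the paper's comparisons.
    rng = np.random.default_rng(6)
    months = pd.date_range("2006-08", "2011-02", freq="MS")
    season = {12: "winter", 1: "winter", 2: "winter",
              3: "spring", 4: "spring", 5: "spring",
              6: "summer", 7: "summer", 8: "summer",
              9: "fall", 10: "fall", 11: "fall"}
    df = pd.DataFrame({
        "cases": rng.poisson(1.6, len(months)),   # simulated monthly counts
        "year": months.year.astype(str),
        "season": months.month.map(season),
    })
    fit = smf.glm("cases ~ C(year) + C(season, Treatment('fall'))", data=df,
                  family=sm.families.NegativeBinomial(alpha=1.0)).fit()
    print(fit.summary().tables[1])
    ```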

  7. Incremental yield of dysplasia detection in Barrett's esophagus using volumetric laser endomicroscopy with and without laser marking compared with a standardized random biopsy protocol.

    PubMed

    Alshelleh, Mohammad; Inamdar, Sumant; McKinley, Matthew; Stewart, Molly; Novak, Jeffrey S; Greenberg, Ronald E; Sultan, Keith; Devito, Bethany; Cheung, Mary; Cerulli, Maurice A; Miller, Larry S; Sejpal, Divyesh V; Vegesna, Anil K; Trindade, Arvind J

    2018-02-02

    Volumetric laser endomicroscopy (VLE) is a new wide-field advanced imaging technology for Barrett's esophagus (BE). No data exist on incremental yield of dysplasia detection. Our aim is to report the incremental yield of dysplasia detection in BE using VLE. This is a retrospective study from a prospectively maintained database from 2011 to 2017 comparing the dysplasia yield of 4 different surveillance strategies in an academic BE tertiary care referral center. The groups were (1) random biopsies (RB), (2) Seattle protocol random biopsies (SP), (3) VLE without laser marking (VLE), and (4) VLE with laser marking (VLEL). A total of 448 consecutive patients (79 RB, 95 SP, 168 VLE, and 106 VLEL) met the inclusion criteria. After adjusting for visible lesions, the total dysplasia yield was 5.7%, 19.6%, 24.8%, and 33.7%, respectively. When compared with just the SP group, the VLEL group had statistically higher rates of overall dysplasia yield (19.6% vs 33.7%, P = .03; odds ratio, 2.1, P = .03). Both the VLEL and VLE groups had statistically significant differences in neoplasia (high-grade dysplasia and intramucosal cancer) detection compared with the SP group (14% vs 1%, P = .001 and 11% vs 1%, P = .003). A surveillance strategy involving VLEL led to a statistically significant higher yield of dysplasia and neoplasia detection compared with a standard random biopsy protocol. These results support the use of VLEL for surveillance in BE in academic centers. Copyright © 2018 American Society for Gastrointestinal Endoscopy. Published by Elsevier Inc. All rights reserved.

  8. 18F-choline PET/MRI in suspected recurrence of prostate carcinoma.

    PubMed

    Riola-Parada, C; Carreras-Delgado, J L; Pérez-Dueñas, V; Garcerant-Tafur, M; García-Cañamaque, L

    2018-05-21

    To evaluate the usefulness of simultaneous 18F-choline PET/MRI when recurrence of prostate cancer is suspected, and to relate the 18F-choline PET/MRI detection rate to analytical and pathological variables. Twenty-seven patients with prostate cancer who received local therapy as primary treatment underwent 18F-choline PET/MRI because of suspected recurrence (persistently rising serum PSA level). 18F-choline PET/MRI findings were validated by anatomopathological analysis, by other imaging tests or by the biochemical response to oncological treatment. 18F-choline PET/MRI detected disease in 15 of 27 patients (detection rate 55.56%): 4 (15%) presented exclusively local recurrence, 5 (18%) lymph node metastases and 7 (26%) bone metastases. Mean PSA at the time of the study was 2.94 ng/mL (range 0.18-10 ng/mL). Mean PSA in patients with positive PET/MRI was 3.70 ng/mL (range 0.24-10 ng/mL), higher than in patients with negative PET/MRI (mean PSA 1.97 ng/mL, range 0.18-4.38 ng/mL), although without statistically significant differences. The Gleason score at diagnosis was 7.33 (range 6-9) in patients with a positive study and 7 (range 6-9) in patients with a negative study, without statistically significant differences. The 18F-choline PET/MRI detection rate was considerable despite the relatively low PSA values in our sample. The influence of Gleason score and PSA level on the 18F-choline PET/MRI detection rate was not statistically significant. Copyright © 2018 Sociedad Española de Medicina Nuclear e Imagen Molecular. Publicado por Elsevier España, S.L.U. All rights reserved.

  9. Is maternal serum triple screening a better predictor of Down syndrome in female than in male fetuses?

    PubMed

    Ghidini, A; Spong, C Y; Grier, R E; Walker, C N; Pezzullo, J C

    1998-02-01

    Among euploid gestations, female fetuses have been reported to have significantly lower maternal serum alpha-fetoprotein (MSAFP) and higher human chorionic gonadotropin (hCG) levels than male fetuses. Since in maternal serum triple screening, low MSAFP and high hCG MOM independently confer greater risk of a Down syndrome fetus, we investigated the hypothesis that maternal serum triple screening is more efficacious at detecting female than male Down syndrome fetuses. A database containing all karyotypes from amniocentesis performed between August 1994 and August 1996 was accessed. All trisomy 21 cases were identified. The male-to-female ratio among trisomy 21 fetuses detected at amniocentesis after abnormal maternal serum triple screening was compared with that among trisomy 21 fetuses detected at amniocentesis for advanced maternal age (AMA), which served as the control group. Statistical analysis utilized chi-square, Fisher's exact test, and Student's t-test. A P value of less than 0.05 was considered statistically significant. Forty-nine trisomy 21 fetuses were detected in the women who underwent amniocentesis because of abnormal triple screening and 311 were detected in the control group. The proportion of male fetuses among the triple screening group was not significantly different from that of the AMA group (55 per cent vs. 57 per cent; P=0.9). Our study had a power of 80 per cent to detect a difference of 25 per cent in the male-to-female ratio (alpha=0.05, beta=0.20). The reported differences in MSAFP and hCG levels between male and female euploid fetuses do not appear to affect the sex ratio among Down syndrome fetuses detected because of an abnormal maternal serum triple screening.
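
    The quoted 80% power figure can be checked with a standard two-proportion power calculation. A sketch with statsmodels, interpreting, purely as an assumption, the "25 per cent difference" as 57% vs 32% males with the study's group sizes:

    ```python
    from statsmodels.stats.power import NormalIndPower
    from statsmodels.stats.proportion import proportion_effectsize

    # Post-hoc power for comparing the proportion of males between the
    # triple screening group (n=49) and the AMA control group (n=311).
    # The 0.57 vs 0.32 split is an assumed reading of the abstract's
    # "25 per cent difference", not a figure from the paper.
    es = proportion_effectsize(0.57, 0.32)          # Cohen's h
    power = NormalIndPower().power(effect_size=es, nobs1=49,
                                   ratio=311 / 49, alpha=0.05,
                                   alternative="two-sided")
    print(f"effect size h = {es:.2f}, power = {power:.2f}")
    ```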

  10. Signal Statistics and Maximum Likelihood Sequence Estimation in Intensity Modulated Fiber Optic Links Containing a Single Optical Pre-amplifier.

    PubMed

    Alić, Nikola; Papen, George; Saperstein, Robert; Milstein, Laurence; Fainman, Yeshaiahu

    2005-06-13

    Exact signal statistics for fiber-optic links containing a single optical pre-amplifier are calculated and applied to sequence estimation for electronic dispersion compensation. The performance is evaluated and compared with results based on the approximate chi-square statistics. We show that detection in existing systems based on exact statistics can be improved relative to using a chi-square distribution for realistic filter shapes. In contrast, for high-spectral efficiency systems the difference between the two approaches diminishes, and performance tends to be less dependent on the exact shape of the filter used.

  11. Evaluation of detection methods for screening meat and poultry products for the presence of foodborne pathogens.

    PubMed

    Bohaychuk, Valerie M; Gensler, Gary E; King, Robin K; Wu, John T; McMullen, Lynn M

    2005-12-01

    Rapid and molecular technologies such as enzyme-linked immunosorbent assay (ELISA), PCR, and lateral flow immunoprecipitation can reduce the time and labor involved in screening food products for the presence of pathogens. These technologies were compared with conventional culture methodology for the detection of Salmonella, Campylobacter, Listeria, and Escherichia coli O157:H7 inoculated in raw and processed meat and poultry products. Recommended protocols were modified so that the same enrichment broths used in the culture methods were also used in the ELISA, PCR, and lateral flow immunoprecipitation assays. The percent agreement between the rapid technologies and culture methods ranged from 80 to 100% depending on the pathogen detected and the method used. ELISA, PCR, and lateral flow immunoprecipitation all performed well, with no statistical difference, compared with the culture method for the detection of E. coli O157:H7. ELISA performed better for the detection of Salmonella, with sensitivity and specificity rates of 100%. PCR performed better for the detection of Campylobacter jejuni, with 100% agreement to the culture method. PCR was highly sensitive for the detection of all the foodborne pathogens tested except Listeria monocytogenes. Although the lateral flow immunoprecipitation tests were statistically different from the culture methods for Salmonella and Listeria because of false-positive results, the tests did not produce any false negatives, indicating that this method would be suitable for screening meat and poultry products for these pathogens.

  12. Replication Unreliability in Psychology: Elusive Phenomena or “Elusive” Statistical Power?

    PubMed Central

    Tressoldi, Patrizio E.

    2012-01-01

    The focus of this paper is to analyze whether the unreliability of results related to certain controversial psychological phenomena may be a consequence of their low statistical power. Under Null Hypothesis Statistical Testing (NHST), still the most widely used statistical approach, unreliability derives from the failure to reject the null hypothesis, in particular when exact or quasi-exact replications of experiments are carried out. Taking as examples the results of meta-analyses related to four different controversial phenomena (subliminal semantic priming, the incubation effect for problem solving, unconscious thought theory, and non-local perception), it was found that, except for semantic priming on categorization, the statistical power to detect the expected effect size (ES) of the typical study is low or very low. The low power in most studies undermines the use of NHST to study phenomena with moderate or low ESs. We conclude by providing some suggestions on how to increase statistical power or use different statistical approaches to help discriminate whether the results obtained may or may not be used to support or to refute the reality of a phenomenon with a small ES. PMID:22783215
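
    The paper's argument is easy to reproduce numerically: at typical sample sizes, power is adequate only for large effects. A sketch with statsmodels (n = 25 per group is an illustrative size, not a figure from the paper):

    ```python
    from statsmodels.stats.power import TTestIndPower

    # Power of a two-group t-test to detect small/medium/large effects
    # (Cohen's d = 0.2 / 0.5 / 0.8) at alpha = .05 with n = 25 per group.
    analysis = TTestIndPower()
    for d in (0.2, 0.5, 0.8):
        pw = analysis.power(effect_size=d, nobs1=25, alpha=0.05)
        print(f"d = {d}: power = {pw:.2f}")

    # Inverse problem: n per group needed for 80% power at d = 0.2.
    n = analysis.solve_power(effect_size=0.2, power=0.8, alpha=0.05)
    print(f"n per group for 80% power at d = 0.2: {n:.0f}")
    ```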

  13. In Response: Biological arguments for selecting effect sizes in ecotoxicological testing—A governmental perspective

    USGS Publications Warehouse

    Mebane, Christopher A.

    2015-01-01

    Criticisms of the uses of the no-observed-effect concentration (NOEC) and the lowest-observed-effect concentration (LOEC) and more generally the entire null hypothesis statistical testing scheme are hardly new or unique to the field of ecotoxicology [1-4]. Among the criticisms of NOECs and LOECs is that statistically similar LOECs (in terms of p value) can represent drastically different levels of effect. For instance, my colleagues and I found that a battery of chronic toxicity tests with different species and endpoints yielded LOECs with minimum detectable differences ranging from 3% to 48% reductions from controls [5].

  14. Detecting differential DNA methylation from sequencing of bisulfite converted DNA of diverse species.

    PubMed

    Huh, Iksoo; Wu, Xin; Park, Taesung; Yi, Soojin V

    2017-07-21

    DNA methylation is one of the most extensively studied epigenetic modifications of genomic DNA. In recent years, sequencing of bisulfite-converted DNA, particularly via next-generation sequencing technologies, has become a widely popular method to study DNA methylation. This method can be readily applied to a variety of species, dramatically expanding the scope of DNA methylation studies beyond the traditionally studied human and mouse systems. In parallel to the increasing wealth of genomic methylation profiles, many statistical tools have been developed to detect differentially methylated loci (DMLs) or differentially methylated regions (DMRs) between biological conditions. We discuss and summarize several key properties of currently available tools to detect DMLs and DMRs from sequencing of bisulfite-converted DNA. However, the majority of the statistical tools developed for DML/DMR analyses have been validated using only mammalian data sets, and less priority has been placed on the analyses of invertebrate or plant DNA methylation data. We demonstrate that genomic methylation profiles of non-mammalian species are often highly distinct from those of mammalian species using examples of honey bees and humans. We then discuss how such differences in data properties may affect statistical analyses. Based on these differences, we provide three specific recommendations to improve the power and accuracy of DML and DMR analyses of invertebrate data when using currently available statistical tools. These considerations should facilitate systematic and robust analyses of DNA methylation from diverse species, thus advancing our understanding of DNA methylation. © The Author 2017. Published by Oxford University Press.

  15. A likelihood ratio test for evolutionary rate shifts and functional divergence among proteins

    PubMed Central

    Knudsen, Bjarne; Miyamoto, Michael M.

    2001-01-01

    Changes in protein function can lead to changes in the selection acting on specific residues. This can often be detected as evolutionary rate changes at the sites in question. A maximum-likelihood method for detecting evolutionary rate shifts at specific protein positions is presented. The method determines significance values of the rate differences to give a sound statistical foundation for the conclusions drawn from the analyses. A statistical test for detecting slowly evolving sites is also described. The methods are applied to a set of Myc proteins for the identification of both conserved sites and those with changing evolutionary rates. Those positions with conserved and changing rates are related to the structures and functions of their proteins. The results are compared with an earlier Bayesian method, thereby highlighting the advantages of the new likelihood ratio tests. PMID:11734650
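
    The generic shape of such a likelihood ratio test is compact: maximize the likelihood under a one-rate null and a two-rate alternative, then refer twice the log-likelihood difference to a chi-square distribution on the difference in free parameters. A toy Poisson-count sketch (synthetic counts; the published method works on substitution models for protein sites rather than raw counts):

    ```python
    import numpy as np
    from scipy.stats import chi2, poisson

    # LRT for a rate shift: one shared rate (H0) vs separate rates (H1).
    counts_a = np.array([3, 4, 2, 5, 3])     # e.g. changes at a site, clade A
    counts_b = np.array([9, 7, 10, 8, 11])   # same site, clade B

    def loglik(data, rate):
        # Poisson log-likelihood; the MLE of the rate is the sample mean.
        return poisson.logpmf(data, rate).sum()

    pooled = np.r_[counts_a, counts_b]
    ll0 = loglik(pooled, pooled.mean())
    ll1 = loglik(counts_a, counts_a.mean()) + loglik(counts_b, counts_b.mean())
    lr = 2 * (ll1 - ll0)
    p = chi2.sf(lr, df=1)                    # one extra rate parameter
    print(f"LR = {lr:.2f}, p = {p:.3g}")
    ```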

  16. A critique of Rasch residual fit statistics.

    PubMed

    Karabatsos, G

    2000-01-01

    In test analysis involving the Rasch model, a large degree of importance is placed on the "objective" measurement of individual abilities and item difficulties. The degree to which the objectivity properties are attained, of course, depends on the degree to which the data fit the Rasch model. It is therefore important to utilize fit statistics that accurately and reliably detect the person-item response inconsistencies that threaten the measurement objectivity of persons and items. Given this argument, it is somewhat surprising that far more emphasis is placed on the objective measurement of persons and items than on the measurement quality of Rasch fit statistics. This paper provides a critical analysis of the residual fit statistics of the Rasch model, arguably the most often used fit statistics, in an effort to illustrate that the task of Rasch fit analysis is not as simple and straightforward as it appears to be. The faulty statistical properties of the residual fit statistics do not allow either a convenient or a straightforward approach to Rasch fit analysis. For instance, given a residual fit statistic, the use of a single minimum critical value for misfit diagnosis across different testing situations, where the situations vary in sample and test properties, leads to both the overdetection and underdetection of misfit. To improve this situation, it is argued that psychometricians need to implement residual-free Rasch fit statistics that are based on the number of Guttman response errors, or use indices that are statistically optimal in detecting measurement disturbances.
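
    For reference, the residual fit statistics being critiqued are conventionally computed as below: standardized residuals from dichotomous Rasch probabilities, aggregated into outfit (unweighted) and infit (information-weighted) mean squares. A sketch on simulated data, assuming abilities and difficulties are already estimated:

        # Classic Rasch residual fit statistics (sketch).
        import numpy as np

        def rasch_fit(responses, theta, beta):
            """responses: persons x items 0/1 matrix; theta: abilities; beta: difficulties."""
            p = 1.0 / (1.0 + np.exp(-(theta[:, None] - beta[None, :])))  # P(X=1)
            w = p * (1.0 - p)                       # binomial variance per response
            z2 = (responses - p) ** 2 / w           # squared standardized residuals
            outfit = z2.mean(axis=0)                # unweighted mean square per item
            infit = ((responses - p) ** 2).sum(axis=0) / w.sum(axis=0)
            return outfit, infit

        rng = np.random.default_rng(0)
        theta = rng.normal(size=200)
        beta = np.linspace(-2, 2, 10)
        prob = 1.0 / (1.0 + np.exp(-(theta[:, None] - beta[None, :])))
        x = (rng.random((200, 10)) < prob).astype(float)
        outfit, infit = rasch_fit(x, theta, beta)
        print(np.round(outfit, 2), np.round(infit, 2))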

  17. Regression equations for sex and population detection using the lip print pattern among Egyptian and Malaysian adults.

    PubMed

    Abdel Aziz, Manal H; Badr El Dine, Fatma M M; Saeed, Nourhan M M

    2016-11-01

    Identification of sex and ethnicity has always been a challenge in the fields of forensic medicine and criminal investigations. Fingerprinting and DNA comparisons are probably the most common techniques used in this context. However, since they cannot always be used, it is necessary to apply different and less well-known techniques such as lip prints. The aim was to study the lip print pattern in Egyptian and Malaysian populations and its relation to sex and population differences, and to develop equations for sex and population detection using the lip print pattern in the two populations (Egyptian and Malaysian). The sample comprised 120 adult volunteers divided into two ethnic groups: sixty adult Egyptians (30 males and 30 females) and sixty adult Malaysians (30 males and 30 females). The lip prints were collected on white paper. Each lip print was divided into four compartments and was classified and scored according to the Suzuki and Tsuchihashi classification. Data were statistically analyzed. The results showed that the type III lip print pattern (intersected grooves) was the predominant type in both the Egyptian and Malaysian populations. Types II and III were the most frequent in Egyptian males (28.3% each), while in Egyptian females the type III pattern was predominant (46.7%). As regards Malaysian males, the type III lip print pattern was the predominant one (41.7%), while the type II lip print pattern was predominant (30.8%) in Malaysian females. Statistical analysis of the different quadrants showed significant differences between males and females in the Egyptian population in the third and fourth quadrants. On the other hand, significant differences were detected only in the second quadrant between Malaysian males and females. Also, a statistically significant difference was present in the second quadrant between Egyptian and Malaysian males. Using regression analysis, four regression equations were obtained.
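
    A regression equation of this general kind can be sketched as a logistic regression on quadrant pattern scores. Everything below is illustrative and synthetic (random scores and labels), not the study's data or its published equations:

        # Illustrative sketch: logistic regression predicting sex from lip-print
        # quadrant pattern scores (synthetic data, hypothetical coding).
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(1)
        # Pattern class per quadrant (e.g., Suzuki & Tsuchihashi types coded 1..5)
        X = rng.integers(1, 6, size=(120, 4)).astype(float)
        y = rng.integers(0, 2, size=120)  # 0 = female, 1 = male (synthetic labels)

        model = LogisticRegression().fit(X, y)
        print("coefficients per quadrant:", model.coef_.round(2))
        print("P(male) for a new print:", model.predict_proba([[3, 2, 3, 4]])[0, 1].round(2))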

  18. A novel chi-square statistic for detecting group differences between pathways in systems epidemiology.

    PubMed

    Yuan, Zhongshang; Ji, Jiadong; Zhang, Tao; Liu, Yi; Zhang, Xiaoshuai; Chen, Wei; Xue, Fuzhong

    2016-12-20

    Traditional epidemiology often pays more attention to the identification of a single factor than to the pathway related to a disease, and it is therefore difficult to explore the disease mechanism. Systems epidemiology aims to integrate putative lifestyle exposures and biomarkers extracted from multiple omics platforms to offer new insights into the pathway mechanisms that underlie disease at the human population level. One key but inadequately addressed question is how to develop powerful statistics to identify whether one candidate pathway is associated with a disease. Bearing in mind that a pathway difference can result not only from changes in the nodes but also from changes in the edges, we propose a novel statistic for detecting group differences between pathways, which in principle captures both node changes and edge changes while simultaneously accounting for the pathway structure. The proposed test has been proven to follow the chi-square distribution, and various simulations have shown that it performs better than other existing methods. Integrating genome-wide DNA methylation data, we analyzed one real data set from the Bogalusa cohort study and identified a significant potential pathway, Smoking → SOCS3 → PIK3R1, which was strongly associated with abdominal obesity. The proposed test is powerful and efficient at identifying pathway differences between two groups, and it can be extended to other disciplines that involve statistical comparisons between pathways. The source code in R is available on our website.

  19. Correlation between the different therapeutic properties of Chinese medicinal herbs and delayed luminescence.

    PubMed

    Pang, Jingxiang; Fu, Jialei; Yang, Meina; Zhao, Xiaolei; van Wijk, Eduard; Wang, Mei; Fan, Hua; Han, Jinxiang

    2016-03-01

    In the practice and principles of Chinese medicine, herbal materials are classified according to their therapeutic properties. 'Cold' and 'heat' are the most important classes of Chinese medicinal herbs according to the theory of traditional Chinese medicine (TCM). In this work, delayed luminescence (DL) was measured for different samples of Chinese medicinal herbs using a sensitive photomultiplier detection system. A comparison of DL parameters, including mean intensity and statistical entropy, was undertaken to discriminate between the 'cold' and 'heat' properties of Chinese medicinal herbs. The results suggest that there are significant differences in mean intensity and statistical entropy, and that this method combined with statistical analysis may provide novel parameters for the characterization of Chinese medicinal herbs in relation to their energetic properties.

  20. An Assessment of Phylogenetic Tools for Analyzing the Interplay Between Interspecific Interactions and Phenotypic Evolution.

    PubMed

    Drury, J P; Grether, G F; Garland, T; Morlon, H

    2018-05-01

    Much ecological and evolutionary theory predicts that interspecific interactions often drive phenotypic diversification and that species phenotypes in turn influence species interactions. Several phylogenetic comparative methods have been developed to assess the importance of such processes in nature; however, the statistical properties of these methods have gone largely untested. Focusing mainly on scenarios of competition between closely-related species, we assess the performance of available comparative approaches for analyzing the interplay between interspecific interactions and species phenotypes. We find that many currently used statistical methods often fail to detect the impact of interspecific interactions on trait evolution, that sister-taxa analyses are particularly unreliable in general, and that recently developed process-based models have more satisfactory statistical properties. Methods for detecting predictors of species interactions are generally more reliable than methods for detecting character displacement. In weighing the strengths and weaknesses of different approaches, we hope to provide a clear guide for empiricists testing hypotheses about the reciprocal effect of interspecific interactions and species phenotypes and to inspire further development of process-based models.

  1. Otolith Trace Element Chemistry of Juvenile Black Rockfish

    NASA Astrophysics Data System (ADS)

    Hardin, W.; Bobko, S. J.; Jones, C. M.

    2002-12-01

    In the summer of 1997 we collected young-of-the-year (YOY) black rockfish, Sebastes melanops, from floating docks and seagrass beds in Newport and Coos Bay, Oregon. Otoliths were extracted from randomly selected fish, sectioned and polished under general laboratory conditions, and cleaned in a class 100 clean room. We used Laser Ablation Inductively Coupled Plasma Mass Spectrometry (LA-ICPMS) to analyze the elemental composition of the estuarine phase of the otoliths. While we observed differences in Mn/Ca ratios between the two estuaries, there was no statistical difference in otolith trace element chemistry ratios between estuaries using MANOVA. To determine whether laboratory processing of otoliths might have impeded us from detecting differences in otolith chemistry, we conducted a second experiment. Right and left otoliths from 10 additional Coos Bay fish were randomly allocated to two processing methods. The first method was identical to our initial otolith processing: sectioning and polishing under normal laboratory conditions. In the second method, polishing was done in the clean room. For both methods, otoliths went through a final cleaning in the clean room and were analyzed with LA-ICPMS. While we did not detect statistical differences in element ratios between the two methods, otoliths polished outside the clean room had much higher variances. This increased variance might have lowered our ability to detect differences in otolith chemistry between estuaries. Based on our results, we recommend polishing otoliths under clean room conditions to reduce contamination.
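
    A MANOVA of the kind described can be run with statsmodels; the sketch below uses synthetic element/Ca ratios and assumed column names, not the authors' data:

        # MANOVA comparing multivariate element/Ca ratios between two groups (sketch).
        import numpy as np
        import pandas as pd
        from statsmodels.multivariate.manova import MANOVA

        rng = np.random.default_rng(2)
        df = pd.DataFrame({
            "estuary": ["Newport"] * 15 + ["CoosBay"] * 15,
            "Mn_Ca": rng.normal(1.0, 0.2, 30),
            "Sr_Ca": rng.normal(2.0, 0.3, 30),
            "Ba_Ca": rng.normal(0.5, 0.1, 30),
        })
        fit = MANOVA.from_formula("Mn_Ca + Sr_Ca + Ba_Ca ~ estuary", data=df)
        print(fit.mv_test())  # Wilks' lambda, Pillai's trace, etc.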

  2. The use of infrared thermography to detect the stages of estrus cycle and ovulation time in anatolian shepherd dogs.

    PubMed

    Olğaç, Kemal Tuna; Akçay, Ergun; Çil, Beste; Uçar, Burak Mehmet; Daşkın, Ali

    2017-01-01

    The aim of the study was to evaluate the effectiveness of thermographic monitoring, using the temperature changes of the perianal and perivulvar areas, for the determination of estrus in Anatolian Shepherd bitches. Fifteen bitches were used in the study. Blood and vaginal smear samples were collected, and thermographic monitoring of the perianal and perivulvar areas was carried out from proestrus to early diestrus. External signs of estrus were also investigated. Smear samples were evaluated by light microscopy after Diff-Quik staining, and superficial and keratinized superficial cells were determined as percentages (S + KS%). Progesterone and luteinizing hormone measurements were made by radioimmunoassay. The difference in temperature between the perianal and perivulvar areas was evaluated from the thermographic images using FLIR ResearchIR software. According to the results, changes in progesterone and S + KS% were statistically significant (P < 0.05). Although temperature increased and decreased along with progesterone and S + KS%, the temperature differences were not statistically significant (P > 0.05). Serum luteinizing hormone levels did not show any significant difference (P > 0.05). In conclusion, thermographic monitoring alone is not sufficient for estrus detection in Anatolian Shepherd bitches. However, it can be used to assist the standard estrus detection techniques by providing some foreknowledge through evaluation of the temperature differences.

  3. Care dependency of hospitalized children: testing the Care Dependency Scale for Paediatrics in a cross-cultural comparison.

    PubMed

    Tork, Hanan; Dassen, Theo; Lohrmann, Christa

    2009-02-01

    This paper is a report of a study to examine the psychometric properties of the Care Dependency Scale for Paediatrics in Germany and Egypt and to compare the care dependency of school-age children in both countries. Cross-cultural differences in care dependency of older adults have been documented in the literature, but little is known about the differences and similarities with regard to children's care dependency in different cultures. A convenience sample of 258 school-aged children from Germany and Egypt participated in the study in 2005. The reliability of the Care Dependency Scale for Paediatrics was assessed in terms of internal consistency and interrater reliability. Factor analysis (principal component analysis) was employed to verify the construct validity. A Visual Analogue Scale was used to investigate the criterion-related validity. Good internal consistency was detected both for the Arabic and German versions. Factor analysis revealed one factor for both versions. A Pearson's correlation between the Care Dependency Scale for Paediatrics and Visual Analogue Scale was statistically significant for both versions indicating criterion-related validity. Statistically significant differences between the participants were detected regarding the mean sum score on the Care Dependency Scale for Paediatrics. The Care Dependency Scale for Paediatrics is a reliable and valid tool for assessing the care dependency of children and is recommended for assessing the care dependency of children from different ethnic origins. Differences in care dependency between German and Egyptian children were detected, which might be due to cultural differences.

  4. Frequency of Testing to Detect Visual Field Progression Derived Using a Longitudinal Cohort of Glaucoma Patients.

    PubMed

    Wu, Zhichao; Saunders, Luke J; Daga, Fábio B; Diniz-Filho, Alberto; Medeiros, Felipe A

    2017-06-01

    To determine the time required to detect statistically significant progression for different rates of visual field loss using standard automated perimetry (SAP) when considering different frequencies of testing using a follow-up scheme that resembles clinical practice. Observational cohort study. One thousand seventy-two eyes of 665 patients with glaucoma followed up over an average of 4.3±0.9 years. Participants with 5 or more visual field tests over a 2- to 5-year period were included to derive the longitudinal measurement variability of SAP mean deviation (MD) using linear regressions. Estimates of variability were then used to reconstruct real-world visual field data by computer simulation to evaluate the time required to detect progression for various rates of visual field loss and different frequencies of testing. The evaluation was performed using a follow-up scheme that resembled clinical practice by requiring a set of 2 baseline tests and a confirmatory test to identify progression. Time (in years) required to detect progression. The time required to detect a statistically significant negative MD slope decreased as the frequency of testing increased, albeit not proportionally. For example, 80% of eyes with an MD loss of -2 dB/year would be detected after 3.3, 2.4, and 2.1 years when testing is performed once, twice, and thrice per year, respectively. For eyes with an MD loss of -0.5 dB/year, progression can be detected with 80% power after 7.3, 5.7, and 5.0 years, respectively. This study provides information on the time required to detect progression using MD trend analysis in glaucoma eyes when different testing frequencies are used. The smaller gains in the time to detect progression when testing is increased from twice to thrice per year suggest that obtaining 2 reliable tests at baseline followed by semiannual testing and confirmation of progression through repeat testing in the initial years of follow-up may provide a good compromise for detecting progression, while minimizing the burden on health care resources in clinical practice.
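
    The simulation logic described above can be sketched compactly: generate noisy MD series for a true slope and testing frequency, regress MD on time as tests accumulate, and record when the negative slope first reaches significance. Parameters here are illustrative, not the study's variability estimates, and the confirmatory-test step is omitted:

        # Time-to-detect-progression simulation for MD trend analysis (sketch).
        import numpy as np
        from scipy import stats

        def years_to_80pct_detection(slope, tests_per_year, sd=1.5, alpha=0.05,
                                     max_years=12.0, n_sim=500, seed=0):
            rng = np.random.default_rng(seed)
            times = np.arange(0.0, max_years, 1.0 / tests_per_year)
            detected = np.full(n_sim, np.inf)
            for s in range(n_sim):
                md = slope * times + rng.normal(0.0, sd, times.size)
                for k in range(4, times.size + 1):       # need a few tests first
                    fit = stats.linregress(times[:k], md[:k])
                    if fit.slope < 0 and fit.pvalue < alpha:
                        detected[s] = times[k - 1]
                        break
            return np.quantile(detected, 0.80)           # inf if <80% ever detected

        for freq in (1, 2, 3):
            print(freq, "tests/year ->",
                  round(years_to_80pct_detection(-2.0, freq), 1), "years")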

  5. Statistical Traffic Anomaly Detection in Time-Varying Communication Networks

    DTIC Science & Technology

    2015-02-01

    The proposed statistical anomaly detection methods for time-varying communication networks perform better than their vanilla counterparts, which assume that normal traffic is stationary, and are useful not only for anomaly detection but also for understanding the normal traffic in time-varying networks.

  6. Statistical Traffic Anomaly Detection in Time Varying Communication Networks

    DTIC Science & Technology

    2015-02-01

    The proposed statistical anomaly detection methods for time-varying communication networks perform better than their vanilla counterparts, which assume that normal traffic is stationary, and are useful not only for anomaly detection but also for understanding the normal traffic in time-varying networks.

  7. Observational difference between gamma and X-ray properties of optically dark and bright GRBs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balazs, L. G.; Horvath, I.; Bagoly, Zs.

    2008-05-22

    Using discriminant analysis, a multivariate statistical technique, we compared the distributions of the physical quantities of optically dark and bright GRBs detected by the BAT and XRT instruments on board the Swift satellite. We found that GRBs with detected optical transients (OT) have systematically higher peak fluxes and lower HI column densities than those without OT.

  8. Evaluation of Next-Generation Vision Testers for Aeromedical Certification of Aviation Personnel

    DTIC Science & Technology

    2009-07-01

    The vision testers' slides measure distant, intermediate, and near acuity and are essentially abbreviated versions of the Early Treatment for Diabetic Retinopathy Study charts. Some subjects required intermediate vision testing, and 12 were color deficient. Test scores from each of the vision testers, including vertical phoria (right and left hyperphoria), were collated and analyzed, with the analysis designed to detect statistically significant differences between the testers.

  9. Scale Comparability between Nonaccommodated and Accommodated Forms of a Statewide High School Assessment: Assessment Using "l[subscript z]" Person-Fit

    ERIC Educational Resources Information Center

    Seo, Dong Gi; Hao, Shiqi

    2016-01-01

    Differential item/test functioning (DIF/DTF) analyses are routine procedures for detecting item/test unfairness as an explanation for group performance differences. However, unequal and small sample sizes reduce the statistical power of DIF/DTF detection procedures. Furthermore, DIF/DTF cannot be used for two test forms without…

  10. A PLSPM-Based Test Statistic for Detecting Gene-Gene Co-Association in Genome-Wide Association Study with Case-Control Design

    PubMed Central

    Zhang, Xiaoshuai; Yang, Xiaowei; Yuan, Zhongshang; Liu, Yanxun; Li, Fangyu; Peng, Bin; Zhu, Dianwen; Zhao, Jinghua; Xue, Fuzhong

    2013-01-01

    For genome-wide association data analysis, two genes in any pathway, or two SNPs in two linked gene regions or in two linked exons within one gene, are often correlated with each other. We therefore proposed the concept of gene-gene co-association, which refers to effects due not only to the traditional interaction under a nearly independent condition but also to the correlation between two genes. Furthermore, we constructed a novel statistic for detecting gene-gene co-association based on Partial Least Squares Path Modeling (PLSPM). Through simulation, the relationship between traditional interaction and co-association was highlighted under three different types of co-association. Both simulation and real data analysis demonstrated that the proposed PLSPM-based statistic has better performance than the single SNP-based logistic model, the PCA-based logistic model, and other gene-based methods. PMID:23620809

  11. A PLSPM-based test statistic for detecting gene-gene co-association in genome-wide association study with case-control design.

    PubMed

    Zhang, Xiaoshuai; Yang, Xiaowei; Yuan, Zhongshang; Liu, Yanxun; Li, Fangyu; Peng, Bin; Zhu, Dianwen; Zhao, Jinghua; Xue, Fuzhong

    2013-01-01

    For genome-wide association data analysis, two genes in any pathway, or two SNPs in two linked gene regions or in two linked exons within one gene, are often correlated with each other. We therefore proposed the concept of gene-gene co-association, which refers to effects due not only to the traditional interaction under a nearly independent condition but also to the correlation between two genes. Furthermore, we constructed a novel statistic for detecting gene-gene co-association based on Partial Least Squares Path Modeling (PLSPM). Through simulation, the relationship between traditional interaction and co-association was highlighted under three different types of co-association. Both simulation and real data analysis demonstrated that the proposed PLSPM-based statistic has better performance than the single SNP-based logistic model, the PCA-based logistic model, and other gene-based methods.

  12. Variance analysis of x-ray CT sinograms in the presence of electronic noise background.

    PubMed

    Ma, Jianhua; Liang, Zhengrong; Fan, Yi; Liu, Yan; Huang, Jing; Chen, Wufan; Lu, Hongbing

    2012-07-01

    Low-dose x-ray computed tomography (CT) is clinically desired. Accurate noise modeling is a fundamental issue for low-dose CT image reconstruction via statistics-based sinogram restoration or statistical iterative image reconstruction. In this paper, the authors analyzed the statistical moments of low-dose CT data in the presence of an electronic noise background. The authors first studied the statistical moment properties of detected signals in the CT transmission domain, where the noise of detected signals is considered as quanta fluctuation upon an electronic noise background. Then the authors derived, via the Taylor expansion, a new formula for the mean-variance relationship of the detected signals in the CT sinogram domain, wherein image formation becomes a linear operation between the sinogram data and the unknown image, rather than a nonlinear operation in the CT transmission domain. To gain experimental insight into the derived formula, an anthropomorphic torso phantom was scanned repeatedly by a commercial CT scanner at five mAs levels from 100 down to 17. The results demonstrated that the electronic noise background is significant when low-mAs (or low-dose) scans are performed. The influence of the electronic noise background should therefore be considered in low-dose CT imaging.

  13. Variance analysis of x-ray CT sinograms in the presence of electronic noise background

    PubMed Central

    Ma, Jianhua; Liang, Zhengrong; Fan, Yi; Liu, Yan; Huang, Jing; Chen, Wufan; Lu, Hongbing

    2012-01-01

    Purpose: Low-dose x-ray computed tomography (CT) is clinically desired. Accurate noise modeling is a fundamental issue for low-dose CT image reconstruction via statistics-based sinogram restoration or statistical iterative image reconstruction. In this paper, the authors analyzed the statistical moments of low-dose CT data in the presence of an electronic noise background. Methods: The authors first studied the statistical moment properties of detected signals in the CT transmission domain, where the noise of detected signals is considered as quanta fluctuation upon an electronic noise background. Then the authors derived, via the Taylor expansion, a new formula for the mean–variance relationship of the detected signals in the CT sinogram domain, wherein image formation becomes a linear operation between the sinogram data and the unknown image, rather than a nonlinear operation in the CT transmission domain. To gain experimental insight into the derived formula, an anthropomorphic torso phantom was scanned repeatedly by a commercial CT scanner at five mAs levels from 100 down to 17. Results: The results demonstrated that the electronic noise background is significant when low-mAs (or low-dose) scans are performed. Conclusions: The influence of the electronic noise background should be considered in low-dose CT imaging. PMID:22830738

  14. Robust multivariate nonparametric tests for detection of two-sample location shift in clinical trials

    PubMed Central

    Jiang, Xuejun; Guo, Xu; Zhang, Ning; Wang, Bo

    2018-01-01

    This article presents and investigates performance of a series of robust multivariate nonparametric tests for detection of location shift between two multivariate samples in randomized controlled trials. The tests are built upon robust estimators of distribution locations (medians, Hodges-Lehmann estimators, and an extended U statistic) with both unscaled and scaled versions. The nonparametric tests are robust to outliers and do not assume that the two samples are drawn from multivariate normal distributions. Bootstrap and permutation approaches are introduced for determining the p-values of the proposed test statistics. Simulation studies are conducted and numerical results are reported to examine performance of the proposed statistical tests. The numerical results demonstrate that the robust multivariate nonparametric tests constructed from the Hodges-Lehmann estimators are more efficient than those based on medians and the extended U statistic. The permutation approach can provide a more stringent control of Type I error and is generally more powerful than the bootstrap procedure. The proposed robust nonparametric tests are applied to detect multivariate distributional difference between the intervention and control groups in the Thai Healthy Choices study and examine the intervention effect of a four-session motivational interviewing-based intervention developed in the study to reduce risk behaviors among youth living with HIV. PMID:29672555
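
    One of the simplest variants described above, a coordinatewise-median location test with a permutation p-value, can be sketched as follows (synthetic data; the paper's Hodges-Lehmann and extended-U variants would substitute a different location estimator):

        # Two-sample multivariate location-shift test: robust statistic (norm of
        # the difference of coordinatewise medians) calibrated by permutation.
        import numpy as np

        def perm_median_test(x, y, n_perm=5000, seed=0):
            rng = np.random.default_rng(seed)
            stat = np.linalg.norm(np.median(x, axis=0) - np.median(y, axis=0))
            pooled = np.vstack([x, y])
            n = len(x)
            count = 0
            for _ in range(n_perm):
                idx = rng.permutation(len(pooled))
                xs, ys = pooled[idx[:n]], pooled[idx[n:]]
                if np.linalg.norm(np.median(xs, 0) - np.median(ys, 0)) >= stat:
                    count += 1
            return (count + 1) / (n_perm + 1)

        rng = np.random.default_rng(4)
        x = rng.normal(0.0, 1.0, (40, 3))
        y = rng.normal(0.6, 1.0, (50, 3))
        print("permutation p-value:", perm_median_test(x, y))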

  15. [The application of the prospective space-time statistic in early warning of infectious disease].

    PubMed

    Yin, Fei; Li, Xiao-Song; Feng, Zi-Jian; Ma, Jia-Qi

    2007-06-01

    To investigate the application of the prospective space-time scan statistic in the early detection of infectious disease outbreaks, the statistic was tested by mimicking daily prospective analyses of bacillary dysentery data from Chengdu city in 2005 (3212 cases in 102 towns and villages), and the results were compared with those of the purely temporal scan statistic. The prospective space-time scan statistic could give specific early-warning messages in both space and time. The results for June indicated that the prospective space-time scan statistic promptly detected outbreaks that started from a local site, with a strong early-warning signal (P = 0.007), whereas the purely temporal scan statistic detected the outbreak two days later with a weaker signal (P = 0.039). The prospective space-time scan statistic makes full use of the spatial and temporal information in infectious disease data and can promptly and effectively detect outbreaks that start from local sites. It could be an important tool for local and national CDCs setting up early-detection surveillance systems.
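
    To fix ideas, the purely temporal comparison method can be sketched with a simplified scan: slide a window over daily counts, take the maximum window count as the statistic (the real scan statistic uses a likelihood-ratio form), and calibrate it by Monte Carlo simulation under a homogeneous Poisson null. Toy counts only:

        # Simplified purely temporal scan with Monte Carlo calibration (sketch).
        import numpy as np

        def temporal_scan_pvalue(counts, window=7, n_sim=999, seed=0):
            rng = np.random.default_rng(seed)
            counts = np.asarray(counts)
            kernel = np.ones(window)
            observed = np.convolve(counts, kernel, mode='valid').max()
            lam = counts.mean()                  # homogeneous Poisson baseline
            exceed = 0
            for _ in range(n_sim):
                sim = rng.poisson(lam, counts.size)
                if np.convolve(sim, kernel, mode='valid').max() >= observed:
                    exceed += 1
            return observed, (exceed + 1) / (n_sim + 1)

        daily = [3, 2, 4, 3, 2, 5, 3, 4, 9, 11, 8, 10, 4, 3, 2]  # toy outbreak
        print(temporal_scan_pvalue(daily, window=5))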

  16. Avoid lost discoveries, because of violations of standard assumptions, by using modern robust statistical methods.

    PubMed

    Wilcox, Rand; Carlson, Mike; Azen, Stan; Clark, Florence

    2013-03-01

    Recently, there have been major advances in statistical techniques for assessing central tendency and measures of association. The practical utility of modern methods has been documented extensively in the statistics literature, but they remain underused and relatively unknown in clinical trials. Our objective was to address this issue. STUDY DESIGN AND PURPOSE: The first purpose was to review common problems associated with standard methodologies (low power, lack of control over type I errors, and incorrect assessments of the strength of the association). The second purpose was to summarize some modern methods that can be used to circumvent such problems. The third purpose was to illustrate the practical utility of modern robust methods using data from the Well Elderly 2 randomized controlled trial. In multiple instances, robust methods uncovered differences among groups and associations among variables that were not detected by classic techniques. In particular, the results demonstrated that details of the nature and strength of the association were sometimes overlooked when using ordinary least squares regression and Pearson correlation. Modern robust methods can make a practical difference in detecting and describing differences between groups and associations between variables. Such procedures should be applied more frequently when analyzing trial-based data.
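
    A representative modern robust method of the kind reviewed is Yuen's test comparing trimmed means with winsorized variances; a self-contained sketch on synthetic heavy-tailed data:

        # Yuen's test for two 20% trimmed means (sketch).
        import numpy as np
        from scipy import stats

        def yuen_test(x, y, trim=0.2):
            def trimmed(a):
                a = np.sort(np.asarray(a, dtype=float))
                g = int(np.floor(trim * a.size))
                h = a.size - 2 * g                       # effective sample size
                wins = np.clip(a, a[g], a[-g - 1])       # winsorized sample
                d = (a.size - 1) * wins.var(ddof=1) / (h * (h - 1))
                return a[g:a.size - g].mean(), d, h
            mx, dx, hx = trimmed(x)
            my, dy, hy = trimmed(y)
            t = (mx - my) / np.sqrt(dx + dy)
            df = (dx + dy) ** 2 / (dx ** 2 / (hx - 1) + dy ** 2 / (hy - 1))
            return t, 2 * stats.t.sf(abs(t), df)

        rng = np.random.default_rng(5)
        x = rng.standard_t(3, 40) + 0.8                  # heavy tails, shifted
        y = rng.standard_t(3, 45)
        print("Yuen t, p:", yuen_test(x, y))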

  17. Bayesian methods for outliers detection in GNSS time series

    NASA Astrophysics Data System (ADS)

    Qianqian, Zhang; Qingming, Gui

    2013-07-01

    This article is concerned with the problem of detecting outliers in GNSS time series based on Bayesian statistical theory. Firstly, a new model is proposed to simultaneously detect different types of outliers by introducing classification variables corresponding to each outlier type; the problem of outlier detection is thereby converted into the computation of the corresponding posterior probabilities, and an algorithm for computing these posterior probabilities based on a standard Gibbs sampler is designed. Secondly, we analyze in depth the causes of masking and swamping when detecting patches of additive outliers, and propose an unmasking Bayesian method for detecting additive outlier patches based on an adaptive Gibbs sampler. Thirdly, the correctness of the proposed theory and methods is illustrated with simulated data and then with real GNSS observations, such as cycle slip detection in carrier phase data. The examples show that the proposed Bayesian methods are capable not only of detecting isolated outliers but also of detecting additive outlier patches, and that they can be successfully used to process cycle slips in phase data, including small cycle slips.

  18. Scan statistics with local vote for target detection in distributed system

    NASA Astrophysics Data System (ADS)

    Luo, Junhai; Wu, Qi

    2017-12-01

    Target detection occupies a pivotal position in distributed systems. Scan statistics, one of the most efficient detection methods, has been applied to a variety of anomaly detection problems and significantly improves the probability of detection. However, scan statistics cannot achieve the expected performance when the noise intensity is strong or the signal emitted by the target is weak. The local vote algorithm can also achieve a high target detection rate; after the local vote, the counting rule is usually adopted for decision fusion. However, the counting rule does not use information about the contiguity of sensors but takes all sensors' data into consideration indiscriminately, which makes the result undesirable. In this paper, we propose a scan statistics with local vote (SSLV) method, which combines scan statistics with local vote decisions. Before the scan statistic is computed, each sensor makes a local vote decision according to its own data and that of its neighbors. By combining the advantages of both, our method obtains a higher detection rate in low signal-to-noise-ratio environments than scan statistics alone. After the local vote decision, the distribution of sensors that have detected the target becomes more concentrated. To make full use of the local vote decision, we introduce a variable step parameter for the SSLV, which significantly shortens the scan period, especially when the target is absent. Analysis and simulations are presented to demonstrate the performance of our method.
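
    The core idea can be sketched on a 1-D sensor line (a simplification of the paper's setting, without the variable step parameter): each sensor first replaces its binary decision by a majority vote over its neighborhood, then a sliding window counts positive decisions.

        # Local vote followed by a sliding-window scan statistic (sketch).
        import numpy as np

        def local_vote(decisions, radius=1):
            """Majority vote of each sensor with its neighbors within `radius`."""
            n = decisions.size
            voted = np.empty(n, dtype=int)
            for i in range(n):
                nbr = decisions[max(0, i - radius): i + radius + 1]
                voted[i] = int(nbr.sum() * 2 > nbr.size)
            return voted

        def scan_statistic(decisions, window=5):
            """Maximum number of positive decisions in any window of given length."""
            kernel = np.ones(window)
            return int(np.convolve(decisions, kernel, mode='valid').max())

        rng = np.random.default_rng(6)
        decisions = (rng.random(60) < 0.15).astype(int)       # false alarms
        decisions[25:33] = (rng.random(8) < 0.7).astype(int)  # weak target region
        print("scan only:      ", scan_statistic(decisions))
        print("local vote+scan:", scan_statistic(local_vote(decisions)))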

  19. Effect of censoring trace-level water-quality data on trend-detection capability

    USGS Publications Warehouse

    Gilliom, R.J.; Hirsch, R.M.; Gilroy, E.J.

    1984-01-01

    Monte Carlo experiments were used to evaluate whether trace-level water-quality data that are routinely censored (not reported) contain valuable information for trend detection. Measurements are commonly censored if they fall below a level associated with some minimum acceptable level of reliability (detection limit). Trace-level organic data were simulated with best- and worst-case estimates of measurement uncertainty, various concentrations and degrees of linear trend, and different censoring rules. The resulting classes of data were subjected to a nonparametric statistical test for trend. For all classes of data evaluated, trends were most effectively detected in uncensored data as compared to censored data even when the data censored were highly unreliable. Thus, censoring data at any concentration level may eliminate valuable information. Whether or not valuable information for trend analysis is, in fact, eliminated by censoring of actual rather than simulated data depends on whether the analytical process is in statistical control and bias is predictable for a particular type of chemical analyses.
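
    The experiment's logic can be sketched with a simple Mann-Kendall trend test (a common nonparametric choice; the paper does not specify this exact implementation): simulate a series with a linear trend, then compare the test on the full data versus data with values below a detection limit replaced by a substitute.

        # Mann-Kendall trend test on uncensored vs. censored data (sketch;
        # normal approximation without tie correction).
        import numpy as np
        from scipy import stats

        def mann_kendall_pvalue(x):
            x = np.asarray(x, dtype=float)
            n = x.size
            s = sum(np.sign(x[j] - x[i])
                    for i in range(n - 1) for j in range(i + 1, n))
            var = n * (n - 1) * (2 * n + 5) / 18.0
            z = (s - np.sign(s)) / np.sqrt(var)      # continuity correction
            return 2 * stats.norm.sf(abs(z))

        rng = np.random.default_rng(7)
        t = np.arange(40)
        conc = 0.3 + 0.02 * t + rng.normal(0, 0.15, t.size)
        dl = 0.5
        censored = np.where(conc < dl, dl / 2, conc)  # substitute DL/2 below limit
        print("uncensored p:", mann_kendall_pvalue(conc))
        print("censored   p:", mann_kendall_pvalue(censored))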

  20. Damage detection of engine bladed-disks using multivariate statistical analysis

    NASA Astrophysics Data System (ADS)

    Fang, X.; Tang, J.

    2006-03-01

    The timely detection of damage in aero-engine bladed-disks is an extremely important and challenging research topic. Bladed-disks have high modal density and, in particular, their vibration responses are subject to significant uncertainties due to manufacturing tolerance (blade-to-blade differences, or mistuning), operating condition changes and sensor noise. In this study, we present a new methodology for the on-line damage detection of engine bladed-disks using their vibratory responses during spin-up or spin-down operations, which can be measured by the blade-tip-timing sensing technique. We apply a principal component analysis (PCA)-based approach for data compression, feature extraction, and denoising. Non-model-based damage detection is achieved by analyzing the change between the response features of the healthy structure and those of the damaged one. We facilitate this comparison by incorporating Hotelling's T2 statistic analysis, which yields damage declaration with a given confidence level. The effectiveness of the method is demonstrated by case studies.
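
    The PCA-plus-T2 scheme can be sketched generically (synthetic responses, not the paper's blade-tip-timing data): fit principal components on healthy responses, then flag new responses whose Hotelling T2 in the retained subspace exceeds an F-distribution control limit.

        # PCA feature space + Hotelling's T2 damage indicator (sketch).
        import numpy as np
        from scipy import stats

        def fit_pca(healthy, k=3):
            mu = healthy.mean(axis=0)
            _, svals, vt = np.linalg.svd(healthy - mu, full_matrices=False)
            var = svals[:k] ** 2 / (healthy.shape[0] - 1)  # component variances
            return mu, vt[:k], var

        def t2_score(x, mu, components, var):
            scores = components @ (x - mu)
            return float(np.sum(scores ** 2 / var))

        rng = np.random.default_rng(8)
        n, k = 100, 3
        healthy = rng.normal(0.0, 1.0, (n, 12))
        mu, comps, var = fit_pca(healthy, k)
        # 99% control limit for a new observation
        limit = k * (n - 1) * (n + 1) / (n * (n - k)) * stats.f.ppf(0.99, k, n - k)
        damaged = healthy[0] + 2.5                         # toy shifted response
        print(round(t2_score(damaged, mu, comps, var), 2), ">", round(limit, 2))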

  1. Evidence Integration in Natural Acoustic Textures during Active and Passive Listening

    PubMed Central

    Rupp, Andre; Celikel, Tansu

    2018-01-01

    Abstract Many natural sounds can be well described on a statistical level, for example, wind, rain, or applause. Even though the spectro-temporal profile of these acoustic textures is highly dynamic, changes in their statistics are indicative of relevant changes in the environment. Here, we investigated the neural representation of change detection in natural textures in humans, and specifically addressed whether active task engagement is required for the neural representation of this change in statistics. Subjects listened to natural textures whose spectro-temporal statistics were modified at variable times by a variable amount. Subjects were instructed to either report the detection of changes (active) or to passively listen to the stimuli. A subset of passive subjects had performed the active task before (passive-aware vs passive-naive). Psychophysically, longer exposure to pre-change statistics was correlated with faster reaction times and better discrimination performance. EEG recordings revealed that the build-up rate and size of parieto-occipital (PO) potentials reflected change size and change time. Reduced effects were observed in the passive conditions. While P2 responses were comparable across conditions, slope and height of PO potentials scaled with task involvement. Neural source localization identified a parietal source as the main contributor of change-specific potentials, in addition to more limited contributions from auditory and frontal sources. In summary, the detection of statistical changes in natural acoustic textures is predominantly reflected in parietal locations both on the skull and source level. The scaling in magnitude across different levels of task involvement suggests a context-dependent degree of evidence integration. PMID:29662943

  2. Evidence Integration in Natural Acoustic Textures during Active and Passive Listening.

    PubMed

    Górska, Urszula; Rupp, Andre; Boubenec, Yves; Celikel, Tansu; Englitz, Bernhard

    2018-01-01

    Many natural sounds can be well described on a statistical level, for example, wind, rain, or applause. Even though the spectro-temporal profile of these acoustic textures is highly dynamic, changes in their statistics are indicative of relevant changes in the environment. Here, we investigated the neural representation of change detection in natural textures in humans, and specifically addressed whether active task engagement is required for the neural representation of this change in statistics. Subjects listened to natural textures whose spectro-temporal statistics were modified at variable times by a variable amount. Subjects were instructed to either report the detection of changes (active) or to passively listen to the stimuli. A subset of passive subjects had performed the active task before (passive-aware vs passive-naive). Psychophysically, longer exposure to pre-change statistics was correlated with faster reaction times and better discrimination performance. EEG recordings revealed that the build-up rate and size of parieto-occipital (PO) potentials reflected change size and change time. Reduced effects were observed in the passive conditions. While P2 responses were comparable across conditions, slope and height of PO potentials scaled with task involvement. Neural source localization identified a parietal source as the main contributor of change-specific potentials, in addition to more limited contributions from auditory and frontal sources. In summary, the detection of statistical changes in natural acoustic textures is predominantly reflected in parietal locations both on the skull and source level. The scaling in magnitude across different levels of task involvement suggests a context-dependent degree of evidence integration.

  3. Voxel-based morphometry and automated lobar volumetry: The trade-off between spatial scale and statistical correction

    PubMed Central

    Voormolen, Eduard H.J.; Wei, Corie; Chow, Eva W.C.; Bassett, Anne S.; Mikulis, David J.; Crawley, Adrian P.

    2011-01-01

    Voxel-based morphometry (VBM) and automated lobar region of interest (ROI) volumetry are comprehensive and fast methods to detect differences in overall brain anatomy on magnetic resonance images. However, VBM and automated lobar ROI volumetry have detected dissimilar gray matter differences within identical image sets in our own experience and in previous reports. To gain more insight into how diverging results arise and to attempt to establish whether one method is superior to the other, we investigated how differences in spatial scale and in the need to statistically correct for multiple spatial comparisons influence the relative sensitivity of either technique to group differences in gray matter volumes. We assessed the performance of both techniques on a small dataset containing simulated gray matter deficits and additionally on a dataset of 22q11-deletion syndrome patients with schizophrenia (22q11DS-SZ) vs. matched controls. VBM was more sensitive to simulated focal deficits compared to automated ROI volumetry, and could detect global cortical deficits equally well. Moreover, theoretical calculations of VBM and ROI detection sensitivities to focal deficits showed that at increasing ROI size, ROI volumetry suffers more from loss in sensitivity than VBM. Furthermore, VBM and automated ROI found corresponding GM deficits in 22q11DS-SZ patients, except in the parietal lobe. Here, automated lobar ROI volumetry found a significant deficit only after a smaller subregion of interest was employed. Thus, sensitivity to focal differences is impaired relatively more by averaging over larger volumes in automated ROI methods than by the correction for multiple comparisons in VBM. These findings indicate that VBM is to be preferred over automated lobar-scale ROI volumetry for assessing gray matter volume differences between groups. PMID:19619660

  4. Probing dark energy using convergence power spectrum and bi-spectrum

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dinda, Bikash R., E-mail: bikash@ctp-jamia.res.in

    Weak lensing convergence statistics is a powerful tool to probe dark energy. Dark energy plays an important role in structure formation, and its effects can be detected through the convergence power spectrum, bi-spectrum, etc. One of the simplest and most promising dark energy models is ΛCDM. However, it is worth investigating dark energy models with an evolving equation of state. In this work, the detectability of departures of different dark energy models from the ΛCDM model has been explored through the convergence power spectrum and bi-spectrum.

  5. Accounting for immunoprecipitation efficiencies in the statistical analysis of ChIP-seq data.

    PubMed

    Bao, Yanchun; Vinciotti, Veronica; Wit, Ernst; 't Hoen, Peter A C

    2013-05-30

    ImmunoPrecipitation (IP) efficiencies may vary largely between different antibodies and between repeated experiments with the same antibody. These differences have a large impact on the quality of ChIP-seq data: a more efficient experiment will necessarily lead to a higher signal-to-background ratio, and therefore to an apparently larger number of enriched regions, compared to a less efficient experiment. In this paper, we show how IP efficiencies can be explicitly accounted for in the joint statistical modelling of ChIP-seq data. We fit a latent mixture model to eight experiments on two proteins, from two laboratories where different antibodies are used for the two proteins. We use the model parameters to estimate the efficiencies of individual experiments, and find that these are clearly different for the different laboratories, and amongst technical replicates from the same lab. When we account for ChIP efficiency, we find more regions bound in the more efficient experiments than in the less efficient ones, at the same false discovery rate. A priori knowledge of the same number of binding sites across experiments can also be included in the model for a more robust detection of differentially bound regions among two different proteins. We propose a statistical model for the detection of enriched and differentially bound regions from multiple ChIP-seq data sets. The framework that we present accounts explicitly for IP efficiencies in ChIP-seq data, and allows replicates and experiments from different proteins to be modelled jointly rather than individually, leading to more robust biological conclusions.

  6. Planetary mass function and planetary systems

    NASA Astrophysics Data System (ADS)

    Dominik, M.

    2011-02-01

    With planets orbiting stars, a planetary mass function should not be seen as a low-mass extension of the stellar mass function, but a proper formalism needs to take care of the fact that the statistical properties of planet populations are linked to the properties of their respective host stars. This can be accounted for by describing planet populations by means of a differential planetary mass-radius-orbit function, which together with the fraction of stars with given properties that are orbited by planets and the stellar mass function allows the derivation of all statistics for any considered sample. These fundamental functions provide a framework for comparing statistics that result from different observing techniques and campaigns which all have their very specific selection procedures and detection efficiencies. Moreover, recent results both from gravitational microlensing campaigns and radial-velocity surveys of stars indicate that planets tend to cluster in systems rather than being the lonely child of their respective parent star. While planetary multiplicity in an observed system becomes obvious with the detection of several planets, its quantitative assessment however comes with the challenge to exclude the presence of further planets. Current exoplanet samples begin to give us first hints at the population statistics, whereas pictures of planet parameter space in its full complexity call for samples that are 2-4 orders of magnitude larger. In order to derive meaningful statistics, however, planet detection campaigns need to be designed in such a way that well-defined fully deterministic target selection, monitoring and detection criteria are applied. The probabilistic nature of gravitational microlensing makes this technique an illustrative example of all the encountered challenges and uncertainties.

  7. Detection of proximal caries using quantitative light-induced fluorescence-digital and laser fluorescence: a comparative study.

    PubMed

    Yoon, Hyung-In; Yoo, Min-Jeong; Park, Eun-Jin

    2017-12-01

    The purpose of this study was to evaluate the in vitro validity of quantitative light-induced fluorescence-digital (QLF-D) and laser fluorescence (DIAGNOdent) for assessing proximal caries in extracted premolars, using digital radiography as the reference method. A total of 102 extracted premolars with similar lengths and shapes were used. A single operator conducted all the examinations using three different detection methods (bitewing radiography, QLF-D, and DIAGNOdent). The bitewing x-ray scale, QLF-D fluorescence loss (ΔF), and DIAGNOdent peak readings were compared and statistically analyzed. Each method showed excellent reliability. The correlation coefficients between bitewing radiography and QLF-D and between bitewing radiography and DIAGNOdent were -0.644 and 0.448, respectively, while the value between QLF-D and DIAGNOdent was -0.382. The kappa statistics showed higher diagnostic agreement between bitewing radiography and QLF-D than between bitewing radiography and DIAGNOdent. QLF-D was moderately to highly accurate (AUC = 0.753 - 0.908), while DIAGNOdent was moderately to less accurate (AUC = 0.622 - 0.784). All detection methods showed statistically significant correlations, with the highest correlation between bitewing radiography and QLF-D. QLF-D was found to be a valid and reliable alternative diagnostic method to digital bitewing radiography for the in vitro detection of proximal caries.

  8. A Patch-Based Method for Repetitive and Transient Event Detection in Fluorescence Imaging

    PubMed Central

    Boulanger, Jérôme; Gidon, Alexandre; Kervran, Charles; Salamero, Jean

    2010-01-01

    Automatic detection and characterization of molecular behavior in large data sets obtained by fast imaging in advanced light microscopy become key issues to decipher the dynamic architectures and their coordination in the living cell. Automatic quantification of the number of sudden and transient events observed in fluorescence microscopy is discussed in this paper. We propose a calibrated method based on the comparison of image patches expected to distinguish sudden appearing/vanishing fluorescent spots from other motion behaviors such as lateral movements. We analyze the performances of two statistical control procedures and compare the proposed approach to a frame difference approach using the same controls on a benchmark of synthetic image sequences. We have then selected a molecular model related to membrane trafficking and considered real image sequences obtained in cells stably expressing an endocytic-recycling trans-membrane protein, the Langerin-YFP, for validation. With this model, we targeted the efficient detection of fast and transient local fluorescence concentration arising in image sequences from a data base provided by two different microscopy modalities, wide field (WF) video microscopy using maximum intensity projection along the axial direction and total internal reflection fluorescence microscopy. Finally, the proposed detection method is briefly used to statistically explore the effect of several perturbations on the rate of transient events detected on the pilot biological model. PMID:20976222

  9. Performance of statistical process control methods for regional surgical site infection surveillance: a 10-year multicentre pilot study.

    PubMed

    Baker, Arthur W; Haridy, Salah; Salem, Joseph; Ilieş, Iulian; Ergai, Awatef O; Samareh, Aven; Andrianas, Nicholas; Benneyan, James C; Sexton, Daniel J; Anderson, Deverick J

    2017-11-24

    Traditional strategies for surveillance of surgical site infections (SSI) have multiple limitations, including delayed and incomplete outbreak detection. Statistical process control (SPC) methods address these deficiencies by combining longitudinal analysis with graphical presentation of data. We performed a pilot study within a large network of community hospitals to evaluate performance of SPC methods for detecting SSI outbreaks. We applied conventional Shewhart and exponentially weighted moving average (EWMA) SPC charts to 10 previously investigated SSI outbreaks that occurred from 2003 to 2013. We compared the results of SPC surveillance to the results of traditional SSI surveillance methods. Then, we analysed the performance of modified SPC charts constructed with different outbreak detection rules, EWMA smoothing factors and baseline SSI rate calculations. Conventional Shewhart and EWMA SPC charts both detected 8 of the 10 SSI outbreaks analysed, in each case prior to the date of traditional detection. Among detected outbreaks, conventional Shewhart chart detection occurred a median of 12 months prior to outbreak onset and 22 months prior to traditional detection. Conventional EWMA chart detection occurred a median of 7 months prior to outbreak onset and 14 months prior to traditional detection. Modified Shewhart and EWMA charts additionally detected several outbreaks earlier than conventional SPC charts. Both Shewhart and EWMA charts had low false-positive rates when used to analyse separate control hospital SSI data. Our findings illustrate the potential usefulness and feasibility of real-time SPC surveillance of SSI to rapidly identify outbreaks and improve patient safety. Further study is needed to optimise SPC chart selection and calculation, statistical outbreak detection rules and the process for reacting to signals of potential outbreaks.
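
    The EWMA chart used above follows the standard recursion z_t = λx_t + (1-λ)z_{t-1}, signalling when z_t exceeds the control limit around the baseline rate. A sketch on invented monthly SSI rates (the study's smoothing factors and baselines are not reproduced here):

        # One-sided (upper) EWMA control chart (sketch).
        import numpy as np

        def ewma_signals(x, mu0, sigma0, lam=0.2, L=3.0):
            z = mu0
            signals = []
            for t, xt in enumerate(x):
                z = lam * xt + (1 - lam) * z
                # time-varying EWMA standard deviation
                width = L * sigma0 * np.sqrt(
                    lam / (2 - lam) * (1 - (1 - lam) ** (2 * (t + 1))))
                signals.append(z > mu0 + width)
            return signals

        rates = [1.1, 0.9, 1.3, 1.0, 1.2, 1.8, 2.1, 2.4, 2.2]  # SSIs per 100 procedures
        print(ewma_signals(rates, mu0=1.0, sigma0=0.3))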

  10. Recovering incomplete data using Statistical Multiple Imputations (SMI): a case study in environmental chemistry.

    PubMed

    Mercer, Theresa G; Frostick, Lynne E; Walmsley, Anthony D

    2011-10-15

    This paper presents a statistical technique that can be applied to environmental chemistry data where missing values and limit-of-detection levels prevent the application of statistics. A working example is taken from an environmental leaching study that was set up to determine whether there were significant differences in the levels of leached arsenic (As), chromium (Cr) and copper (Cu) between lysimeters containing preservative-treated wood waste and those containing untreated wood. Fourteen lysimeters were set up and left in natural conditions for 21 weeks. The resultant leachate was analysed by ICP-OES to determine the As, Cr and Cu concentrations. However, due to the variation inherent in each lysimeter combined with the limits of detection offered by ICP-OES, the collected quantitative data were somewhat incomplete. Initial data analysis was hampered by the number of 'missing values' in the data. To recover the dataset, the statistical tool of Statistical Multiple Imputation (SMI) was applied, and the data were re-analysed successfully. It was demonstrated that using SMI did not affect the variance in the data, but facilitated analysis of the complete dataset.
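
    A generic multiple-imputation workflow of this flavour (not necessarily the paper's exact SMI procedure) replaces each below-detection-limit value with a random draw below the limit, repeats the imputation m times, and pools the estimates with Rubin's rules:

        # Multiple imputation of below-DL values with Rubin's-rules pooling (sketch).
        import numpy as np

        rng = np.random.default_rng(9)
        dl = 0.5                                            # detection limit
        observed = np.array([0.8, 1.2, np.nan, 0.7, np.nan, 1.5])  # NaN = below DL

        m = 20
        means, within_vars = [], []
        for _ in range(m):
            imputed = observed.copy()
            miss = np.isnan(imputed)
            imputed[miss] = rng.uniform(0.0, dl, miss.sum())  # draw below the DL
            means.append(imputed.mean())
            within_vars.append(imputed.var(ddof=1) / imputed.size)

        qbar = np.mean(means)                 # pooled point estimate
        w = np.mean(within_vars)              # within-imputation variance
        b = np.var(means, ddof=1)             # between-imputation variance
        total_var = w + (1 + 1 / m) * b       # Rubin's rules
        print(f"pooled mean = {qbar:.3f} +/- {np.sqrt(total_var):.3f}")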

  11. Real-time detection of deoxyribonucleic acid bases via their negative differential conductance signature.

    PubMed

    Dragoman, D; Dragoman, M

    2009-08-01

    In this Brief Report, we present a method for the real-time detection of the bases of the deoxyribonucleic acid using their signatures in negative differential conductance measurements. The present methods of electronic detection of deoxyribonucleic acid bases are based on a statistical analysis because the electrical currents of the four bases are weak and do not differ significantly from one base to another. In contrast, we analyze a device that combines the accumulated knowledge in nanopore and scanning tunneling detection and which is able to provide very distinctive electronic signatures for the four bases.

  12. Construction and comparative evaluation of different activity detection methods in brain FDG-PET.

    PubMed

    Buchholz, Hans-Georg; Wenzel, Fabian; Gartenschläger, Martin; Thiele, Frank; Young, Stewart; Reuss, Stefan; Schreckenberger, Mathias

    2015-08-18

    We constructed and evaluated reference brain FDG-PET databases for use by three software programs (Computer-aided diagnosis for dementia (CAD4D), Statistical Parametric Mapping (SPM) and NEUROSTAT), which allow a user-independent detection of dementia-related hypometabolism in patients' brain FDG-PET. Thirty-seven healthy volunteers were scanned in order to construct brain FDG reference databases, which reflect the normal, age-dependent glucose consumption in the human brain, for each software package. The databases were compared to each other to assess the impact of the different stereotactic normalization algorithms used by the software packages. In addition, the performance of the new reference databases in detecting altered glucose consumption in the brains of patients was evaluated by calculating statistical maps of regional hypometabolism in the FDG-PET of 20 patients with confirmed Alzheimer's dementia (AD) and of 10 non-AD patients. The extent (hypometabolic volume, referred to as cluster size) and magnitude (peak z-score) of the detected hypometabolism were statistically analyzed. Differences between the reference databases built by CAD4D, SPM and NEUROSTAT were observed. Due to the different normalization methods, altered spatial FDG patterns were found. When analyzing patient data with the reference databases created using CAD4D, SPM or NEUROSTAT, similar characteristic clusters of hypometabolism were found in the same brain regions in the AD group with each software package. However, larger z-scores were observed with CAD4D and NEUROSTAT than those reported by SPM. Better concordance with CAD4D and NEUROSTAT was achieved using the spatially normalized images of SPM and an independent z-score calculation. The three software packages identified the peak z-scores in the same brain region in 11 of 20 AD cases, and there was concordance between CAD4D and SPM in 16 AD subjects. The clinical evaluation of the brain FDG-PET of 20 AD patients with the CAD4D-, SPM- or NEUROSTAT-generated databases from an identical reference dataset showed similar patterns of hypometabolism in the brain regions known to be involved in AD. The extent of hypometabolism and the peak z-score appeared to be influenced by the calculation method used in each software package rather than by the different spatial normalization parameters.
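
    The z-score computation common to all three packages can be sketched generically (toy arrays; spatial and intensity normalization are assumed to have been done already):

        # Voxelwise z-score map of a patient scan against a reference database (sketch).
        import numpy as np

        def z_map(patient, reference_stack, eps=1e-6):
            """patient: normalized 3-D image; reference_stack: controls x voxel grid."""
            mu = reference_stack.mean(axis=0)
            sd = reference_stack.std(axis=0, ddof=1)
            return (patient - mu) / (sd + eps)    # negative z = hypometabolism

        rng = np.random.default_rng(10)
        controls = rng.normal(1.0, 0.1, (37, 8, 8, 8))  # 37 reference scans (toy)
        patient = rng.normal(1.0, 0.1, (8, 8, 8))
        patient[2:5, 2:5, 2:5] -= 0.4                   # simulated hypometabolic region
        z = z_map(patient, controls)
        print("peak |z|:", np.abs(z).max().round(2))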

  13. CHOBS: Color Histogram of Block Statistics for Automatic Bleeding Detection in Wireless Capsule Endoscopy Video.

    PubMed

    Ghosh, Tonmoy; Fattah, Shaikh Anowarul; Wahid, Khan A

    2018-01-01

    Wireless capsule endoscopy (WCE) is the most advanced technology for visualizing the whole gastrointestinal (GI) tract in a non-invasive way. Its major disadvantage, however, is the long reviewing time, which is laborious because continuous manual intervention is necessary. In order to reduce the burden on the clinician, in this paper an automatic bleeding detection method for WCE video is proposed based on the color histogram of block statistics, namely CHOBS. A single pixel in a WCE image may be distorted due to capsule motion in the GI tract. Instead of considering individual pixel values, a block surrounding each pixel is chosen for extracting local statistical features. By combining local block features of the three color planes of the RGB color space, an index value is defined. A color histogram, extracted from those index values, provides a distinguishable color-texture feature. A feature reduction technique utilizing the color histogram pattern and principal component analysis is proposed, which can drastically reduce the feature dimension. For bleeding zone detection, blocks are classified using the already-extracted local features, which adds no extra computational burden for feature extraction. From extensive experimentation on several WCE videos and 2300 images collected from a publicly available database, very satisfactory bleeding frame and zone detection performance is achieved in comparison to that obtained by some existing methods. For bleeding frame detection, the accuracy, sensitivity, and specificity obtained with the proposed method are 97.85%, 99.47%, and 99.15%, respectively, and for bleeding zone detection, 95.75% precision is achieved. The proposed method offers not only a low feature dimension but also highly satisfactory bleeding detection performance, and it can effectively detect bleeding frames and zones in continuous WCE video data.
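
    The block-statistics idea can be sketched in a few lines. The following toy Python version computes a per-pixel block mean in each RGB plane, quantizes the three values into a single index, and histograms the indices; the block size, quantization depth, and index definition are illustrative choices, not the paper's exact CHOBS parameters.

        import numpy as np
        from scipy.ndimage import uniform_filter

        def block_color_histogram(img, block=3, bins_per_plane=8):
            """CHOBS-like feature: histogram of quantized local block means.

            For each pixel, the mean of the surrounding block is taken in each
            RGB plane, quantized, and the three quantized values are combined
            into one index; the feature vector is the normalized histogram of
            these indices over the frame.
            """
            means = uniform_filter(img.astype(float), size=(block, block, 1))
            q = np.minimum((means / 256.0 * bins_per_plane).astype(int),
                           bins_per_plane - 1)
            index = (q[..., 0] * bins_per_plane + q[..., 1]) * bins_per_plane + q[..., 2]
            hist = np.bincount(index.ravel(), minlength=bins_per_plane ** 3)
            return hist / hist.sum()

        rng = np.random.default_rng(1)
        frame = rng.integers(0, 256, size=(64, 64, 3))   # mock WCE frame
        print(block_color_histogram(frame).shape)        # (512,) feature vector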

  14. Detecting signals of drug-drug interactions in a spontaneous reports database.

    PubMed

    Thakrar, Bharat T; Grundschober, Sabine Borel; Doessegger, Lucette

    2007-10-01

    The spontaneous reports database is widely used for detecting signals of ADRs. We have extended the methodology to include the detection of signals of ADRs that are associated with drug-drug interactions (DDI). In particular, we have investigated two different statistical assumptions for detecting signals of DDI. Using the FDA's spontaneous reports database, we investigated two models, a multiplicative and an additive model, to detect signals of DDI. We applied the models to four known DDIs (methotrexate-diclofenac and bone marrow depression, simvastatin-ciclosporin and myopathy, ketoconazole-terfenadine and torsades de pointes, and cisapride-erythromycin and torsades de pointes) and to four drug-event combinations where there is currently no evidence of a DDI (fexofenadine-ketoconazole and torsades de pointes, methotrexate-rofecoxib and bone marrow depression, fluvastatin-ciclosporin and myopathy, and cisapride-azithromycin and torsades de pointes), and estimated the measure of interaction on the two scales. The additive model correctly identified all four known DDIs by giving a statistically significant (P < 0.05) positive measure of interaction. The multiplicative model identified the first two of the known DDIs as having a statistically significant or borderline significant (P < 0.1) positive interaction term, gave a nonsignificant positive trend for the third interaction (P = 0.27), and a negative trend for the last interaction. Both models correctly identified the four known non-interactions by estimating a negative measure of interaction. The spontaneous reports database is a valuable resource for detecting signals of DDIs. In particular, the additive model is more sensitive in detecting such signals. The multiplicative model may further help qualify the strength of the signal detected by the additive model.
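
    The two scales can be illustrated with reporting rates computed from report counts. This is a minimal sketch under assumed count data; the exact parameterization and the significance testing of the published models are not reproduced here.

        import numpy as np

        def interaction_measures(n11, n10, n01, n00, N11, N10, N01, N00):
            """Interaction of a drug pair for one event on two scales.

            n.. are event report counts and N.. total report counts for
            (both drugs, drug A only, drug B only, neither).  The additive
            measure asks whether the joint reporting rate exceeds the sum
            of the single-drug excesses; the multiplicative measure
            compares rate ratios.  Illustrative only.
            """
            r11, r10, r01, r00 = n11 / N11, n10 / N10, n01 / N01, n00 / N00
            additive = r11 - (r10 + r01 - r00)                  # > 0 suggests a DDI
            multiplicative = (r11 / r00) / ((r10 / r00) * (r01 / r00))  # > 1 suggests a DDI
            return additive, multiplicative

        add, mult = interaction_measures(30, 40, 50, 600, 200, 2000, 2500, 90000)
        print(f"additive excess = {add:.4f}, multiplicative ratio = {mult:.2f}")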

  15. Detecting signals of drug–drug interactions in a spontaneous reports database

    PubMed Central

    Thakrar, Bharat T; Grundschober, Sabine Borel; Doessegger, Lucette

    2007-01-01

    Aims The spontaneous reports database is widely used for detecting signals of ADRs. We have extended the methodology to include the detection of signals of ADRs that are associated with drug–drug interactions (DDI). In particular, we have investigated two different statistical assumptions for detecting signals of DDI. Methods Using the FDA's spontaneous reports database, we investigated two models, a multiplicative and an additive model, to detect signals of DDI. We applied the models to four known DDIs (methotrexate-diclofenac and bone marrow depression, simvastatin-ciclosporin and myopathy, ketoconazole-terfenadine and torsades de pointes, and cisapride-erythromycin and torsades de pointes) and to four drug-event combinations where there is currently no evidence of a DDI (fexofenadine-ketoconazole and torsades de pointes, methotrexate-rofecoxib and bone marrow depression, fluvastatin-ciclosporin and myopathy, and cisapride-azithromycin and torsades de pointes), and estimated the measure of interaction on the two scales. Results The additive model correctly identified all four known DDIs by giving a statistically significant (P < 0.05) positive measure of interaction. The multiplicative model identified the first two of the known DDIs as having a statistically significant or borderline significant (P < 0.1) positive interaction term, gave a nonsignificant positive trend for the third interaction (P = 0.27), and a negative trend for the last interaction. Both models correctly identified the four known non-interactions by estimating a negative measure of interaction. Conclusions The spontaneous reports database is a valuable resource for detecting signals of DDIs. In particular, the additive model is more sensitive in detecting such signals. The multiplicative model may further help qualify the strength of the signal detected by the additive model. PMID:17506784

  16. Statistical Analysis of Protein Ensembles

    NASA Astrophysics Data System (ADS)

    Máté, Gabriell; Heermann, Dieter

    2014-04-01

    As 3D protein-configuration data piles up, there is an ever-increasing need for well-defined, mathematically rigorous analysis approaches, especially since the vast majority of currently available methods rely heavily on heuristics. We propose an analysis framework that stems from topology, the field of mathematics which studies properties preserved under continuous deformations. First, we calculate a barcode representation of the molecules employing computational topology algorithms. Bars in this barcode represent different topological features. Molecules are compared through their barcodes by statistically determining the difference in the sets of their topological features. As a proof-of-principle application, we analyze a dataset compiled from ensembles of different proteins, obtained from the Ensemble Protein Database. We demonstrate that our approach correctly detects the different protein groupings.
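
    A minimal stand-in for the barcode computation is sketched below in Python, restricted to 0-dimensional features: connected components are born at scale 0 and die at the single-linkage merge heights, so their bar lengths can be read off a hierarchical-clustering linkage matrix. The paper's full pipeline uses general computational-topology algorithms and a dedicated statistical comparison; the KS test and mock point clouds here are illustrative assumptions.

        import numpy as np
        from scipy.cluster.hierarchy import linkage
        from scipy.stats import ks_2samp

        def h0_barcode(points):
            """0-dimensional persistence barcode of a point cloud.

            The death scales of connected components equal the
            single-linkage merge heights; higher-dimensional features
            require a full computational-topology library.
            """
            Z = linkage(points, method='single')
            return Z[:, 2]                       # bar lengths (death scales)

        rng = np.random.default_rng(2)
        conf_a = rng.normal(size=(60, 3))        # mock 3D conformation A
        conf_b = rng.normal(size=(60, 3)) * 1.5  # mock conformation B, looser
        stat, p = ks_2samp(h0_barcode(conf_a), h0_barcode(conf_b))
        print(f"KS statistic = {stat:.3f}, p = {p:.3g}")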

  17. Uncovering beat deafness: detecting rhythm disorders with synchronized finger tapping and perceptual timing tasks.

    PubMed

    Dalla Bella, Simone; Sowiński, Jakub

    2015-03-16

    A set of behavioral tasks for assessing perceptual and sensorimotor timing abilities in the general population (i.e., non-musicians) is presented here with the goal of uncovering rhythm disorders, such as beat deafness. Beat deafness is characterized by poor performance in perceiving durations in auditory rhythmic patterns or poor synchronization of movement with auditory rhythms (e.g., with musical beats). These tasks include the synchronization of finger tapping to the beat of simple and complex auditory stimuli and the detection of rhythmic irregularities (anisochrony detection task) embedded in the same stimuli. These tests, which are easy to administer, include an assessment of both perceptual and sensorimotor timing abilities under different conditions (e.g., beat rates and types of auditory material) and are based on the same auditory stimuli, ranging from a simple metronome to a complex musical excerpt. The analysis of synchronized tapping data is performed with circular statistics, which provide reliable measures of synchronization accuracy (e.g., the difference between the timing of the taps and the timing of the pacing stimuli) and consistency. Circular statistics on tapping data are particularly well-suited for detecting individual differences in the general population. Synchronized tapping and anisochrony detection are sensitive measures for identifying profiles of rhythm disorders and have been used with success to uncover cases of poor synchronization with spared perceptual timing. This systematic assessment of perceptual and sensorimotor timing can be extended to populations of patients with brain damage, neurodegenerative diseases (e.g., Parkinson's disease), and developmental disorders (e.g., Attention Deficit Hyperactivity Disorder).
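
    The circular statistics used for the tapping data reduce to two quantities: the angle of the mean resultant vector (synchronization accuracy) and its length R (consistency). A minimal Python sketch follows; the beat rate and tap data are hypothetical.

        import numpy as np

        def circular_tapping_stats(tap_times, beat_period):
            """Synchronization accuracy and consistency via circular statistics.

            Each tap is mapped to a phase angle within the beat cycle; the
            angle of the mean resultant vector measures accuracy (signed
            asynchrony) and its length R in [0, 1] measures consistency.
            """
            phases = 2 * np.pi * (np.asarray(tap_times) % beat_period) / beat_period
            vec = np.exp(1j * phases).mean()
            accuracy_ms = np.angle(vec) / (2 * np.pi) * beat_period * 1000
            consistency = np.abs(vec)
            return accuracy_ms, consistency

        rng = np.random.default_rng(3)
        period = 0.6                              # 100-bpm metronome, in seconds
        taps = np.arange(30) * period + rng.normal(0.02, 0.015, 30)  # slightly late taps
        acc, R = circular_tapping_stats(taps, period)
        print(f"mean asynchrony = {acc:.1f} ms, consistency R = {R:.2f}")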

  18. A systematic Chandra study of Sgr A⋆: II. X-ray flare statistics

    NASA Astrophysics Data System (ADS)

    Yuan, Qiang; Wang, Q. Daniel; Liu, Siming; Wu, Kinwah

    2018-01-01

    The routinely flaring events from Sgr A⋆ trace dynamic, high-energy processes in the immediate vicinity of the supermassive black hole. We statistically study the temporal and spectral properties, as well as the fluence and duration distributions, of the flares detected by the Chandra X-ray Observatory from 1999 to 2012. The detection incompleteness and bias are carefully accounted for in determining these distributions. We find that the fluence distribution can be well characterized by a power law with a slope of 1.73^{+0.20}_{-0.19}, while the durations (τ, in seconds) are well characterized by a lognormal function with mean log(τ) = 3.39^{+0.27}_{-0.24} and intrinsic dispersion σ = 0.28^{+0.08}_{-0.06}. No significant correlation between fluence and duration is detected. The apparent positive correlation, as reported previously, is mainly due to detection bias (i.e. weak flares can be detected only when their durations are short). These results indicate that the simple self-organized criticality model has difficulties in explaining these flares. We further find that bright flares usually have asymmetric light curves with no statistically evident difference/preference between the rising and decaying phases in terms of their spectral/timing properties. Our spectral analysis shows that although a power-law model with a photon index of 2.0 ± 0.4 gives a satisfactory fit to the joint spectra of strong and weak flares, there is weak evidence for a softer spectrum of weaker flares. This work demonstrates the potential to use statistical properties of X-ray flares to probe their trigger and emission mechanisms, as well as the radiation propagation around the black hole.
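
    The power-law slope of a fluence distribution is typically obtained by maximum likelihood. The sketch below shows the standard continuous MLE with an approximate standard error; the paper's correction for detection incompleteness is omitted, and the synthetic fluences are only a consistency check.

        import numpy as np

        def powerlaw_mle(x, xmin):
            """Maximum-likelihood slope of a power-law tail p(x) ~ x**(-alpha)."""
            tail = np.asarray(x)[np.asarray(x) >= xmin]
            n = tail.size
            alpha = 1.0 + n / np.sum(np.log(tail / xmin))
            return alpha, (alpha - 1.0) / np.sqrt(n)   # estimate, approx. std. error

        rng = np.random.default_rng(4)
        # Inverse-CDF sampling of a power law with alpha = 1.73, xmin = 1:
        fluences = (1 - rng.random(500)) ** (-1 / 0.73)
        a, se = powerlaw_mle(fluences, xmin=1.0)
        print(f"alpha = {a:.2f} +/- {se:.2f}")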

  19. Detecting fatigue thresholds from electromyographic signals: A systematic review on approaches and methodologies.

    PubMed

    Ertl, Peter; Kruse, Annika; Tilp, Markus

    2016-10-01

    The aim of the current paper was to systematically review the relevant existing electromyographic threshold concepts within the literature. The electronic databases MEDLINE and SCOPUS were screened for papers published between January 1980 and April 2015 including the keywords: neuromuscular fatigue threshold, anaerobic threshold, electromyographic threshold, muscular fatigue, aerobic-anaerobic transition, ventilatory threshold, exercise testing, and cycle-ergometer. 32 articles were assessed with regard to their electromyographic methodologies, description of results, statistical analysis and test protocols. Only one article was of very good quality, 21 were of good quality, and two articles were of very low quality. The review process revealed that: (i) there is consistent evidence of one or two non-linear increases of EMG that might reflect the additional recruitment of motor units (MU) or different fiber types during fatiguing cycle-ergometer exercise, (ii) most studies reported no statistically significant difference between electromyographic and metabolic thresholds, (iii) one-minute protocols with increments between 10 and 25 W appear most appropriate for detecting muscular thresholds, (iv) threshold detection from the vastus medialis, vastus lateralis, and rectus femoris is recommended, and (v) there is great variety in study protocols, measurement techniques, and data processing. Therefore, we recommend further research and standardization in the detection of EMGTs. Copyright © 2016 Elsevier Ltd. All rights reserved.
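
    The non-linear EMG increase these concepts look for is often located with a breakpoint fit. Below is a simple two-segment linear regression sketch in Python, a generic stand-in for the threshold-detection schemes the review compares; the protocol timing and synthetic EMG values are hypothetical.

        import numpy as np

        def emg_threshold(time, rms_emg):
            """Breakpoint of a two-segment linear fit, chosen by scanning
            candidate breakpoints for minimal total squared residual."""
            best = (np.inf, None)
            for k in range(3, len(time) - 3):           # keep 3 points per segment
                sse = 0.0
                for idx in (slice(0, k), slice(k, None)):
                    coef = np.polyfit(time[idx], rms_emg[idx], 1)
                    sse += ((rms_emg[idx] - np.polyval(coef, time[idx])) ** 2).sum()
                if sse < best[0]:
                    best = (sse, time[k])
            return best[1]

        t = np.arange(0, 20, 0.5)                       # minutes into the protocol
        emg = np.where(t < 12, 0.2 + 0.01 * t,
                       0.2 + 0.01 * t + 0.08 * (t - 12))  # slope change at 12 min
        emg += np.random.default_rng(14).normal(0, 0.01, t.size)
        print(f"estimated EMG threshold at {emg_threshold(t, emg):.1f} min")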

  20. Detection of statistical asymmetries in non-stationary sign time series: Analysis of foreign exchange data

    PubMed Central

    Takayasu, Hideki; Takayasu, Misako

    2017-01-01

    We extend the concept of statistical symmetry as the invariance of a probability distribution under transformation to analyze binary sign time series data of price differences from the foreign exchange market. We model segments of the sign time series as Markov sequences and apply a local hypothesis test to evaluate the symmetries of independence and time reversal in different periods of the market. For the test, we derive the probability that a binary Markov process generates a given set of symbol-pair counts. Using this analysis, we can not only segment the time series according to the different behaviors but also characterize the segments in terms of statistical symmetries. As a particular result, we find that the foreign exchange market is essentially time reversible but that this symmetry is broken when there is a strong external influence. PMID:28542208
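
    A simplified version of the two symmetry checks can be written directly on symbol counts. The Python sketch below uses a contingency test for independence and a binomial test on mirror-image triplets for time reversal; it is a stand-in for the paper's exact Markov pair-count probabilities, and the i.i.d. test series is hypothetical.

        import numpy as np
        from scipy.stats import chi2_contingency, binomtest

        def symmetry_tests(signs):
            """Local symmetry tests for a binary sign series (values 0/1).

            Independence: does the next symbol depend on the current one?
            Time reversal: a word and its mirror image should be equally
            frequent, checked here on the triplets 110 vs 011.
            """
            s = np.asarray(signs)
            table = np.zeros((2, 2), dtype=int)
            for a, b in zip(s[:-1], s[1:]):
                table[a, b] += 1
            _, p_indep, _, _ = chi2_contingency(table)

            words = [tuple(w) for w in zip(s[:-2], s[1:-1], s[2:])]
            n110, n011 = words.count((1, 1, 0)), words.count((0, 1, 1))
            p_rev = binomtest(n110, n110 + n011, 0.5).pvalue
            return p_indep, p_rev

        rng = np.random.default_rng(5)
        series = (rng.random(2000) < 0.5).astype(int)    # i.i.d. fair signs
        print(symmetry_tests(series))                    # both p-values large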

  1. LROC Investigation of Three Strategies for Reducing the Impact of Respiratory Motion on the Detection of Solitary Pulmonary Nodules in SPECT

    NASA Astrophysics Data System (ADS)

    Smyczynski, Mark S.; Gifford, Howard C.; Dey, Joyoni; Lehovich, Andre; McNamara, Joseph E.; Segars, W. Paul; King, Michael A.

    2016-02-01

    The objective of this investigation was to determine the effectiveness of three motion-reducing strategies in diminishing the degrading impact of respiratory motion on the detection of small solitary pulmonary nodules (SPNs) in single-photon emission computed tomographic (SPECT) imaging, in comparison to a standard clinical acquisition and the ideal case of imaging in the absence of respiratory motion. To do this, nonuniform rational B-spline cardiac-torso (NCAT) phantoms based on human-volunteer CT studies were generated spanning the respiratory cycle for a normal background distribution of Tc-99m NeoTect. Similarly, spherical phantoms of 1.0-cm diameter were generated to model small SPNs for each of 150 uniquely located sites within the lungs, whose respiratory motion was based on the motion of normal structures in the volunteer CT studies. The SIMIND Monte Carlo program was used to produce SPECT projection data from these. Normal and single-lesion-containing SPECT projection sets with a clinically realistic Poisson noise level were created for the cases of 1) the end-expiration (EE) frame with all counts, 2) respiration-averaged motion with all counts, 3) one fourth of the 32 frames centered around EE (Quarter Binning), 4) one half of the 32 frames centered around EE (Half Binning), and 5) eight temporally binned frames spanning the respiratory cycle. Each set of combined projection data was reconstructed with RBI-EM with system spatial-resolution compensation (RC). Based on the known motion for each of the 150 different lesions, the reconstructed volumes of the respiratory bins were shifted so as to superimpose the location of the SPN onto that in the first bin (Reconstruct and Shift). Five human observers performed localization receiver operating characteristic (LROC) studies of SPN detection. The observer results were analyzed for statistically significant differences in SPN detection accuracy among the three correction strategies, the standard acquisition, and the ideal case of the absence of respiratory motion. Our human-observer LROC study determined that the Quarter Binning and Half Binning strategies resulted in SPN detection accuracy statistically significantly below that of the standard clinical acquisition, whereas the Reconstruct and Shift strategy resulted in a detection accuracy not statistically significantly different from that of the ideal case. This investigation demonstrates that basing tumor detection on acquisitions that use fewer than all of the potentially available counts may result in poorer detection despite limiting the motion of the lesion. The Reconstruct and Shift method results in tumor detection that is equivalent to ideal motion correction.

  2. Contextual Interactions in Grating Plaid Configurations Are Explained by Natural Image Statistics and Neural Modeling

    PubMed Central

    Ernst, Udo A.; Schiffer, Alina; Persike, Malte; Meinhardt, Günter

    2016-01-01

    Processing natural scenes requires the visual system to integrate local features into global object descriptions. To achieve coherent representations, the human brain uses statistical dependencies to guide weighting of local feature conjunctions. Pairwise interactions among feature detectors in early visual areas may form the early substrate of these local feature bindings. To investigate local interaction structures in visual cortex, we combined psychophysical experiments with computational modeling and natural scene analysis. We first measured contrast thresholds for 2 × 2 grating patch arrangements (plaids), which differed in spatial frequency composition (low, high, or mixed), number of grating patch co-alignments (0, 1, or 2), and inter-patch distances (1° and 2° of visual angle). Contrast thresholds for the different configurations were compared to the prediction of probability summation (PS) among detector families tuned to the four retinal positions. For 1° distance the thresholds for all configurations were larger than predicted by PS, indicating inhibitory interactions. For 2° distance, thresholds were significantly lower compared to PS when the plaids were homogeneous in spatial frequency and orientation, but not when spatial frequencies were mixed or there was at least one misalignment. Next, we constructed a neural population model with horizontal laminar structure, which reproduced the detection thresholds after adaptation of connection weights. Consistent with prior work, contextual interactions were medium-range inhibition and long-range, orientation-specific excitation. However, inclusion of orientation-specific, inhibitory interactions between populations with different spatial frequency preferences were crucial for explaining detection thresholds. Finally, for all plaid configurations we computed their likelihood of occurrence in natural images. The likelihoods turned out to be inversely related to the detection thresholds obtained at larger inter-patch distances. However, likelihoods were almost independent of inter-patch distance, implying that natural image statistics could not explain the crowding-like results at short distances. This failure of natural image statistics to resolve the patch distance modulation of plaid visibility remains a challenge to the approach. PMID:27757076
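
    The probability-summation benchmark referred to here has a compact standard form: detection probabilities of independent detectors combine as one minus the product of their miss probabilities. The Python sketch below uses Weibull psychometric functions with illustrative parameter values; the study's actual detector families, tuned to four retinal positions, are not reproduced.

        import numpy as np

        def probability_summation(contrast, alpha, beta=3.5, n_patches=4):
            """Detection probability of a plaid under probability summation
            across independent detectors, each with a Weibull psychometric
            function p = 1 - 2**(-(c/alpha)**beta)."""
            p_single = 1 - 2.0 ** (-(contrast / alpha) ** beta)
            return 1 - (1 - p_single) ** n_patches

        c = np.linspace(0.001, 0.05, 200)
        p = probability_summation(c, alpha=0.02)
        threshold = c[np.searchsorted(p, 0.5)]          # contrast at p = 0.5
        print(f"PS threshold (p = 0.5): {threshold:.4f}")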

  3. Exploiting Complexity Information for Brain Activation Detection

    PubMed Central

    Zhang, Yan; Liang, Jiali; Lin, Qiang; Hu, Zhenghui

    2016-01-01

    We present a complexity-based approach for the analysis of fMRI time series, in which sample entropy (SampEn) is introduced as a quantification of voxel complexity. Under this hypothesis, voxel complexity is modulated by pertinent cognitive tasks and changes across experimental paradigms. We calculate the complexity of the sequential fMRI data for each voxel in two distinct experimental paradigms and use a nonparametric statistical strategy, the Wilcoxon signed rank test, to evaluate the difference in complexity between them. The results are compared with the well-known general-linear-model-based Statistical Parametric Mapping package (SPM12), relative to which a decided difference is observed. This is because the SampEn method detects brain complexity changes between the two experimental conditions, and as a data-driven method it evaluates just the complexity of the specific sequential fMRI data. Also, larger and smaller SampEn values carry different meanings, and the neutral-blank design produces higher predictability than threat-neutral. Complexity information can be considered a complementary method to the existing fMRI analysis strategies, and it may help improve the understanding of human brain functions from a different perspective. PMID:27045838
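
    Sample entropy has a short standard definition: count template-vector matches of length m and m+1 within tolerance r, then take the negative log of their ratio. A compact Python implementation follows, using conventional defaults (m = 2, r = 0.2 x SD); the test signals are illustrative.

        import numpy as np

        def sample_entropy(x, m=2, r_factor=0.2):
            """SampEn(m, r) of a 1D series: -log(A/B), where B and A count
            template pairs of length m and m+1 matching within
            r = r_factor * std(x) (self-matches excluded).  Higher values
            mean a less predictable, more complex signal."""
            x = np.asarray(x, dtype=float)
            r = r_factor * x.std()

            def match_count(m):
                templ = np.array([x[i:i + m] for i in range(len(x) - m)])
                count = 0
                for i in range(len(templ)):
                    d = np.max(np.abs(templ[i + 1:] - templ[i]), axis=1)
                    count += int((d <= r).sum())
                return count

            B, A = match_count(m), match_count(m + 1)
            return -np.log(A / B)

        rng = np.random.default_rng(6)
        noise = rng.normal(size=500)                    # white noise: high SampEn
        sine = np.sin(np.linspace(0, 20 * np.pi, 500))  # regular: low SampEn
        print(sample_entropy(noise), sample_entropy(sine))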

  4. Statistical Assessment of a Paired-site Approach for Verification of Carbon and Nitrogen Sequestration on CRP Land

    NASA Astrophysics Data System (ADS)

    Kucharik, C.; Roth, J.

    2002-12-01

    The threat of global climate change has provoked policy-makers to consider plausible strategies to slow the accumulation of greenhouse gases, especially carbon dioxide, in the atmosphere. One such idea involves the sequestration of atmospheric carbon (C) in degraded agricultural soils as part of the Conservation Reserve Program (CRP). While the potential for significant C sequestration in CRP grassland ecosystems has been demonstrated, the paired-site sampling approach traditionally used to quantify soil C changes has not been evaluated with robust statistical analysis. In this study, 14 paired CRP (> 8 years old) and cropland sites in Dane County, Wisconsin (WI) were used to assess whether a paired-site sampling design could detect statistically significant differences (ANOVA) in mean soil organic C and total nitrogen (N) storage. We compared surface (0 to 10 cm) bulk density, and sampled soils (0 to 5, 5 to 10, and 10 to 25 cm) for textural differences and chemical analysis of organic matter (OM), soil organic C (SOC), total N, and pH. The CRP contributed to lowering soil bulk density by 13% (p < 0.0001) and increased SOC and OM storage (kg m-2) by 13 to 17% in the 0 to 5 cm layer (p = 0.1). We tested the statistical power associated with ANOVA for the measured soil properties and calculated minimum detectable differences (MDD). We concluded that 40 to 65 paired sites and soil sampling in 5 cm increments near the surface were needed to achieve an 80% confidence level (α = 0.05; β = 0.20) in soil C and N sequestration rates. Because soil C and total N storage was highly variable among these sites (CVs > 20%), only a 23 to 29% change in existing total organic C and N pools could be reliably detected. While C and N sequestration (247 kg C ha-1 yr-1 and 17 kg N ha-1 yr-1) may be occurring and confined to the surface 5 cm as part of the WI CRP, our sampling design did not statistically support the desired 80% confidence level. We conclude that the use of statistical power analysis is essential to ensure a high level of confidence in soil C and N sequestration rates quantified using paired plots.
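
    The two power questions posed here, how many pairs are needed, and what effect is detectable with the pairs at hand, map directly onto a paired t-test power calculation. A minimal Python sketch with statsmodels follows; the effect size and variability numbers are illustrative, not the study's data.

        import numpy as np
        from statsmodels.stats.power import TTestPower

        analysis = TTestPower()   # one-sample/paired t-test power

        # Pairs needed to detect a 15% change when the SD of paired
        # differences is 25% (effect size in SD units):
        effect_size = 0.15 / 0.25
        n = analysis.solve_power(effect_size=effect_size, alpha=0.05, power=0.80)
        print(f"paired sites needed: {np.ceil(n):.0f}")

        # Minimum detectable effect with 14 pairs at the same alpha and power:
        mdd = analysis.solve_power(nobs=14, alpha=0.05, power=0.80)
        print(f"detectable effect size with 14 pairs: {mdd:.2f} SD units")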

  5. Causality in Statistical Power: Isomorphic Properties of Measurement, Research Design, Effect Size, and Sample Size.

    PubMed

    Heidel, R Eric

    2016-01-01

    Statistical power is the ability to detect a significant effect, given that the effect actually exists in a population. Like most statistical concepts, statistical power tends to induce cognitive dissonance in hepatology researchers. However, planning for statistical power by an a priori sample size calculation is of paramount importance when designing a research study. There are five specific empirical components that make up an a priori sample size calculation: the scale of measurement of the outcome, the research design, the magnitude of the effect size, the variance of the effect size, and the sample size. A framework grounded in the phenomenon of isomorphism, or interdependencies amongst different constructs with similar forms, will be presented to understand the isomorphic effects of decisions made on each of the five aforementioned components of statistical power.

  6. A signal detection method for temporal variation of adverse effect with vaccine adverse event reporting system data.

    PubMed

    Cai, Yi; Du, Jingcheng; Huang, Jing; Ellenberg, Susan S; Hennessy, Sean; Tao, Cui; Chen, Yong

    2017-07-05

    Identifying safety signals by manual review of individual reports in large surveillance databases is time consuming, and such an approach is very unlikely to reveal complex relationships between medications and adverse events. Since the late 1990s, efforts have been made to develop data mining tools to systematically and automatically search for safety signals in surveillance databases. Influenza vaccines present special challenges to safety surveillance because the vaccine changes every year in response to the influenza strains predicted to be prevalent that year. Therefore, it may be expected that reporting rates of adverse events following flu vaccines (number of reports for a specific vaccine-event combination/number of reports for all vaccine-event combinations) may vary substantially across reporting years. Current surveillance methods seldom consider these variations in signal detection, and reports from different years are typically collapsed together to conduct safety analyses. However, merging reports from different years ignores the potential heterogeneity of reporting rates across years and may miss important safety signals. Reports of adverse events between 1990 and 2013 were extracted from the Vaccine Adverse Event Reporting System (VAERS) database and formatted into a three-dimensional data array with type of vaccine, group of adverse events and reporting time as the three dimensions. We propose a random effects model to test the heterogeneity of reporting rates for a given vaccine-event combination across reporting years. The proposed method provides a rigorous statistical procedure to detect differences in reporting rates among years. We also introduce a new visualization tool to summarize the results of the proposed method when applied to multiple vaccine-adverse event combinations. We applied the proposed method to detect safety signals of FLU3, an influenza vaccine containing three flu strains, in the VAERS database. We showed that it had high statistical power to detect variation in reporting rates across years. The identified vaccine-event combinations with significantly different reporting rates over years suggest potential safety issues due to changes in vaccines that require further investigation. We developed a statistical model to detect safety signals arising from heterogeneity of reporting rates of a given vaccine-event combination across reporting years. This method detects variation in reporting rates over years with high power. The temporal trend of reporting rates across years may reveal the impact of vaccine updates on the occurrence of adverse events and provide evidence for further investigation.
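
    The heterogeneity question can be illustrated with a simple fixed-effects test: are the yearly reporting rates for one vaccine-event combination consistent with a common rate? The Python sketch below uses a chi-square test of homogeneity as a stand-in for the paper's random effects model; the counts are hypothetical.

        import numpy as np
        from scipy.stats import chi2_contingency

        def reporting_rate_heterogeneity(event_counts, total_counts):
            """Chi-square test of whether a vaccine-event reporting rate
            varies across years.

            event_counts[i] = reports of the event for the vaccine in year i;
            total_counts[i] = all reports for the vaccine in year i.
            """
            event = np.asarray(event_counts)
            other = np.asarray(total_counts) - event
            stat, p, dof, _ = chi2_contingency(np.column_stack([event, other]))
            return stat, p

        years_event = [12, 15, 11, 14, 55, 13]          # spike in year 5
        years_total = [1000, 1100, 950, 1050, 1200, 980]
        stat, p = reporting_rate_heterogeneity(years_event, years_total)
        print(f"chi2 = {stat:.1f}, p = {p:.3g}")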

  7. The evolution of supernova remnants in different galactic environments, and its effects on supernova statistics

    NASA Technical Reports Server (NTRS)

    Kafatos, M.; Sofia, S.; Bruhweiler, F.; Gull, T. R.

    1980-01-01

    Examination of the interaction between supernova (SN) ejecta and the various environments in which the explosive event might occur shows that only a small fraction of the many SNs produce observable supernova remnants (SNRs). This fraction, which is found to depend weakly upon the lower mass limit of the SN progenitors, and more strongly on the specific characteristics of the associated interstellar medium, decreases from approximately 15 percent near the galactic center to 10 percent at Rgal approximately 10 kpc, and drops nearly to zero for Rgal greater than 15 kpc. Generally, whether a SNR is detectable is determined by the density of the ambient interstellar medium in which it is embedded. The presence of large, low-density cavities around stellar associations, due to the combined effects of stellar winds and supernova shells, strongly suggests that a large portion of the detectable SNRs have runaway stars as their progenitors. These results explain the differences between the substantially larger SN rates in the galaxy derived both from pulsar statistics and from observations of SN events in external galaxies, when compared to the substantially smaller SN rates derived from galactic SNR statistics.

  8. Texture analysis with statistical methods for wheat ear extraction

    NASA Astrophysics Data System (ADS)

    Bakhouche, M.; Cointault, F.; Gouton, P.

    2007-01-01

    In the agronomic domain, the simplification of crop counting, necessary for yield prediction and agronomic studies, is an important project for technical institutes such as Arvalis. Although the main objective of our global project is to design a mobile robot for natural image acquisition directly in the field, Arvalis first asked us to detect the number of wheat ears in images by image processing before counting them, which will allow us to obtain the first component of the yield. In this paper we compare different texture-based image segmentation techniques relying on feature extraction by first- and higher-order statistical methods, applied to our images. The extracted features are used for unsupervised pixel classification to obtain the different classes in the image. The K-means algorithm is implemented before the choice of a threshold to highlight the ears. Three methods have been tested in this feasibility study, with an average error of 6%. Although the quality of the detection is currently evaluated visually, automatic evaluation algorithms are being implemented. Moreover, other higher-order statistical methods will be implemented in the future, jointly with methods based on spatio-frequential transforms and specific filtering.
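
    The first-order-statistics pipeline described here, per-pixel texture features followed by K-means, fits in a few lines. The Python sketch below uses local mean and variance as the features; window size, class count, and the random test image are illustrative assumptions.

        import numpy as np
        from scipy.ndimage import uniform_filter
        from sklearn.cluster import KMeans

        def texture_segment(gray, win=9, n_classes=3):
            """First-order statistical texture segmentation: local mean and
            variance per pixel, clustered with K-means."""
            g = gray.astype(float)
            mean = uniform_filter(g, win)
            var = uniform_filter(g ** 2, win) - mean ** 2
            feats = np.column_stack([mean.ravel(), var.ravel()])
            labels = KMeans(n_clusters=n_classes, n_init=10,
                            random_state=0).fit_predict(feats)
            return labels.reshape(gray.shape)

        rng = np.random.default_rng(15)
        img = rng.integers(0, 256, (120, 120))          # mock field image
        print(np.bincount(texture_segment(img).ravel()))  # pixels per class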

  9. Statistical analysis of water-quality data containing multiple detection limits: S-language software for regression on order statistics

    USGS Publications Warehouse

    Lee, L.; Helsel, D.

    2005-01-01

    Trace contaminants in water, including metals and organics, often are measured at sufficiently low concentrations to be reported only as values below the instrument detection limit. Interpretation of these "less thans" is complicated when multiple detection limits occur. Statistical methods for multiply censored, or multiple-detection limit, datasets have been developed for medical and industrial statistics, and can be employed to estimate summary statistics or model the distributions of trace-level environmental data. We describe S-language-based software tools that perform robust linear regression on order statistics (ROS). The ROS method has been evaluated as one of the most reliable procedures for developing summary statistics of multiply censored data. It is applicable to any dataset that has 0 to 80% of its values censored. These tools are a part of a software library, or add-on package, for the R environment for statistical computing. This library can be used to generate ROS models and associated summary statistics, plot modeled distributions, and predict exceedance probabilities of water-quality standards. © 2005 Elsevier Ltd. All rights reserved.
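
    The ROS idea can be shown in a simplified Python sketch for a single detection limit: regress the logs of the detected values on their normal scores, then use the fitted line to impute the censored observations before summarizing. The full method in the described S/R library handles multiple limits via survival-function plotting positions; the data below are hypothetical.

        import numpy as np
        from scipy import stats

        def ros_summary(values, censored):
            """Simplified regression on order statistics (ROS), one limit."""
            v = np.asarray(values, dtype=float)
            cen = np.asarray(censored, dtype=bool)
            order = np.argsort(v)
            ranks = np.empty(v.size)
            ranks[order] = np.arange(1, v.size + 1)
            pp = (ranks - 0.375) / (v.size + 0.25)       # Blom plotting positions
            q = stats.norm.ppf(pp)                       # normal scores
            slope, intercept, *_ = stats.linregress(q[~cen], np.log(v[~cen]))
            filled = np.where(cen, np.exp(intercept + slope * q), v)
            return filled.mean(), filled.std(ddof=1)

        conc = np.array([0.5, 0.5, 0.5, 0.8, 1.2, 1.9, 2.4, 3.6, 5.1, 8.0])
        below_dl = np.array([1, 1, 1, 0, 0, 0, 0, 0, 0, 0], dtype=bool)  # <0.5
        print(ros_summary(conc, below_dl))               # mean, SD with imputation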

  10. Quantile regression for the statistical analysis of immunological data with many non-detects.

    PubMed

    Eilers, Paul H C; Röder, Esther; Savelkoul, Huub F J; van Wijk, Roy Gerth

    2012-07-07

    Immunological parameters are hard to measure. A well-known problem is the occurrence of values below the detection limit, the non-detects. Non-detects are a nuisance, because classical statistical analyses, like ANOVA and regression, cannot be applied. The more advanced statistical techniques currently available for the analysis of datasets with non-detects can only be used if a small percentage of the data are non-detects. Quantile regression, a generalization of percentiles to regression models, models the median or higher percentiles and tolerates very high numbers of non-detects. We present a non-technical introduction and illustrate it with an application to real data from a clinical trial. We show that by using quantile regression, groups can be compared and meaningful linear trends can be computed, even if more than half of the data consists of non-detects. Quantile regression is a valuable addition to the statistical methods that can be used for the analysis of immunological datasets with non-detects.
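
    A minimal Python sketch of the approach with statsmodels follows: median regression of a marker on group membership, where a large share of values piles up at the detection limit. The data are hypothetical; the point is that the median comparison is unaffected as long as the quantile being modeled lies above the censored fraction.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(7)
        n = 120
        group = np.repeat([0, 1], n // 2)
        marker = np.exp(rng.normal(0.2 * group, 1.0, n))
        marker[marker < 0.8] = 0.8        # ~40% pile-up at the detection limit

        df = pd.DataFrame({"marker": marker, "group": group})
        fit = smf.quantreg("marker ~ group", df).fit(q=0.5)   # median regression
        print(fit.params)                 # median difference between groups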

  11. Partial Least Squares Regression Can Aid in Detecting Differential Abundance of Multiple Features in Sets of Metagenomic Samples

    PubMed Central

    Libiger, Ondrej; Schork, Nicholas J.

    2015-01-01

    It is now feasible to examine the composition and diversity of microbial communities (i.e., “microbiomes”) that populate different human organs and orifices using DNA sequencing and related technologies. To explore the potential links between changes in microbial communities and various diseases in the human body, it is essential to test associations involving different species within and across microbiomes, environmental settings and disease states. Although a number of statistical techniques exist for carrying out relevant analyses, it is unclear which of these techniques exhibit the greatest statistical power to detect associations given the complexity of most microbiome datasets. We compared the statistical power of principal component regression, partial least squares regression, regularized regression, distance-based regression, Hill's diversity measures, and a modified test implemented in the popular and widely used microbiome analysis methodology “Metastats” across a wide range of simulated scenarios involving changes in feature abundance between two sets of metagenomic samples. For this purpose, simulation studies were used to change the abundance of microbial species in a real dataset from a published study examining human hands. Each technique was applied to the same data, and its ability to detect the simulated change in abundance was assessed. We hypothesized that a small subset of methods would outperform the rest in terms of the statistical power. Indeed, we found that the Metastats technique modified to accommodate multivariate analysis and partial least squares regression yielded high power under the models and data sets we studied. The statistical power of diversity measure-based tests, distance-based regression and regularized regression was significantly lower. Our results provide insight into powerful analysis strategies that utilize information on species counts from large microbiome data sets exhibiting skewed frequency distributions obtained on a small to moderate number of samples. PMID:26734061
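
    A minimal Python sketch of the PLS approach on count data follows, using scikit-learn: fit a two-component PLS regression of species counts on the sample label and rank species by the magnitude of their coefficients. The simulated data, component count, and ranking rule are illustrative assumptions, not the paper's protocol.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(8)
        n, p = 40, 200                      # 40 samples, 200 species
        X = rng.poisson(2.0, size=(n, p)).astype(float)
        y = np.repeat([0, 1], n // 2)       # two sets of metagenomic samples
        X[y == 1, :5] += rng.poisson(4.0, size=(20, 5))  # 5 truly shifted species

        pls = PLSRegression(n_components=2).fit(X, y)
        weights = np.abs(pls.coef_).ravel() # coefficient magnitude per species
        print("top candidate species:", np.argsort(weights)[::-1][:5])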

  12. Statistics for characterizing data on the periphery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Theiler, James P; Hush, Donald R

    2010-01-01

    We introduce a class of statistics for characterizing the periphery of a distribution, and show that these statistics are particularly valuable for problems in target detection. Because so many detection algorithms are rooted in Gaussian statistics, we concentrate on ellipsoidal models of high-dimensional data distributions (that is to say: covariance matrices), but we recommend several alternatives to the sample covariance matrix that more efficiently model the periphery of a distribution, and can more effectively detect anomalous data samples.
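
    The core comparison can be illustrated with Mahalanobis anomaly scores: the ordinary sample covariance versus a robust estimator that is less dominated by the bulk of the distribution (one option in the spirit of the alternatives the report recommends). The 2D data and target point below are hypothetical.

        import numpy as np
        from sklearn.covariance import MinCovDet

        rng = np.random.default_rng(9)
        background = rng.multivariate_normal([0, 0], [[2.0, 0.8], [0.8, 1.0]], 500)
        target = np.array([[4.0, -3.0]])                # sample on the periphery

        # Squared Mahalanobis distance under the sample covariance:
        mu = background.mean(axis=0)
        diff = target - mu
        d_sample = (diff @ np.linalg.solve(np.cov(background.T), diff.T)).item()

        # Same score under a robust (minimum covariance determinant) fit:
        robust = MinCovDet(random_state=0).fit(background)
        d_robust = robust.mahalanobis(target).item()
        print(f"sample-cov distance^2 = {d_sample:.1f}, robust = {d_robust:.1f}")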

  13. Initial Image Quality and Clinical Experience with New CR Digital Mammography System: A Phantom and Clinical Study

    NASA Astrophysics Data System (ADS)

    Gaona, Enrique; Alfonso, Beatriz Y. Álvarez; Castellanos, Gustavo Casian; Enríquez, Jesús Gabriel Franco

    2008-08-01

    The goal of the study was to evaluate the first CR digital mammography system (Konica-Minolta®) in Mexico in routine clinical use for cancer detection in a screening population, and to determine whether high-resolution CR digital imaging is equivalent to state-of-the-art screen-film imaging. The mammograms were evaluated by two observers, with cytological or histological confirmation for BIRADS 3, 4 and 5. Contrast, exposure and artifacts of the images were evaluated. Different details such as skin, retromamillary space and parenchymal structures were judged. The detectability of microcalcifications and lesions was compared and correlated to histology. The difference in sensitivity between CR Mammography (CRM) and Screen-Film Mammography (SFM) was not statistically significant. However, CRM had a significantly lower recall rate, and lesion detection was equal or superior to that of conventional images. There was no significant difference in the number of microcalcifications, and highly suspicious calcifications were equally detected on both film-screen and digital images. Different anatomical regions were better detectable in digital than in conventional mammography.

  14. Initial Image Quality and Clinical Experience with New CR Digital Mammography System: A Phantom and Clinical Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaona, Enrique; Enriquez, Jesus Gabriel Franco; Alfonso, Beatriz Y. Alvarez

    2008-08-11

    The goal of the study was to evaluate the first CR digital mammography system (Konica-Minolta®) in Mexico in routine clinical use for cancer detection in a screening population, and to determine whether high-resolution CR digital imaging is equivalent to state-of-the-art screen-film imaging. The mammograms were evaluated by two observers, with cytological or histological confirmation for BIRADS 3, 4 and 5. Contrast, exposure and artifacts of the images were evaluated. Different details such as skin, retromamillary space and parenchymal structures were judged. The detectability of microcalcifications and lesions was compared and correlated to histology. The difference in sensitivity between CR Mammography (CRM) and Screen-Film Mammography (SFM) was not statistically significant. However, CRM had a significantly lower recall rate, and lesion detection was equal or superior to that of conventional images. There was no significant difference in the number of microcalcifications, and highly suspicious calcifications were equally detected on both film-screen and digital images. Different anatomical regions were better detectable in digital than in conventional mammography.

  15. Statistical learning and auditory processing in children with music training: An ERP study.

    PubMed

    Mandikal Vasuki, Pragati Rao; Sharma, Mridula; Ibrahim, Ronny; Arciuli, Joanne

    2017-07-01

    The question of whether musical training is associated with enhanced auditory and cognitive abilities in children is of considerable interest. In the present study, we compared children with music training versus those without music training across a range of auditory and cognitive measures, including the ability to implicitly detect statistical regularities in input (statistical learning). Statistical learning of regularities embedded in auditory and visual stimuli was measured in musically trained and age-matched untrained children between the ages of 9 and 11 years. In addition to collecting behavioural measures, we recorded electrophysiological measures to obtain an online measure of segmentation during the statistical learning tasks. Musically trained children showed better performance on melody discrimination, rhythm discrimination, frequency discrimination, and auditory statistical learning. Furthermore, grand-averaged ERPs showed that triplet onset (initial stimulus) elicited larger responses in the musically trained children during both auditory and visual statistical learning tasks. In addition, children's music skills were associated with performance on auditory and visual behavioural statistical learning tasks. Our data suggest that individual differences in musical skills are associated with children's ability to detect regularities. The ERP data suggest that musical training is associated with better encoding of both auditory and visual stimuli. Although causality must be explored in further research, these results may have implications for developing music-based remediation strategies for children with learning impairments. Copyright © 2017 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.

  16. Identifying reprioritization response shift in a stroke caregiver population: a comparison of missing data methods.

    PubMed

    Sajobi, Tolulope T; Lix, Lisa M; Singh, Gurbakhshash; Lowerison, Mark; Engbers, Jordan; Mayo, Nancy E

    2015-03-01

    Response shift (RS) is an important phenomenon that influences the assessment of longitudinal changes in health-related quality of life (HRQOL) studies. Given that RS effects are often small, missing data due to attrition or item non-response can contribute to failure to detect RS effects. Since missing data are often encountered in longitudinal HRQOL data, effective strategies to deal with missing data are important to consider. This study aims to compare different imputation methods for the detection of reprioritization RS in the HRQOL of caregivers of stroke survivors. Data were from a Canadian multi-center longitudinal study of caregivers of stroke survivors over a one-year period. The Stroke Impact Scale physical function score at baseline, with a cutoff of 75, was used to measure patient stroke severity for the reprioritization RS analysis. Mean imputation, likelihood-based expectation-maximization (EM) imputation, and multiple imputation methods were compared in test procedures based on changes in relative importance weights to detect RS in SF-36 domains over a 6-month period. Monte Carlo simulation methods were used to compare the statistical power of relative importance test procedures for detecting RS in incomplete longitudinal data under different missing data mechanisms and imputation methods. Of the 409 caregivers, 15.9% and 31.3% had missing data at baseline and 6 months, respectively. There were no statistically significant changes in relative importance weights on any of the domains when complete-case analysis was adopted, but statistically significant changes were detected on the physical functioning and/or vitality domains when mean imputation or EM imputation was adopted. There were also statistically significant changes in relative importance weights for the physical functioning, mental health, and vitality domains when the multiple imputation method was adopted. Our simulations revealed that relative importance test procedures were least powerful under the complete-case analysis method and most powerful when a mean imputation or multiple imputation method was adopted for missing data, regardless of the missing data mechanism and proportion of missing data. Test procedures based on relative importance measures are sensitive to the type and amount of missing data and to the imputation method. Relative importance test procedures based on mean imputation and multiple imputation are recommended for detecting RS in incomplete data.
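
    Two of the compared strategies can be sketched with scikit-learn: simple mean imputation versus an iterative, regression-based imputer (an EM-flavored stand-in; the study's likelihood-based EM and multiple imputation procedures are not reproduced exactly). The HRQOL-like matrix below is hypothetical.

        import numpy as np
        from sklearn.experimental import enable_iterative_imputer  # noqa: F401
        from sklearn.impute import SimpleImputer, IterativeImputer

        rng = np.random.default_rng(10)
        scores = rng.normal(50, 10, size=(200, 4))       # rows: caregivers, cols: domains
        scores[:, 3] += 0.5 * scores[:, 0]               # correlated domains
        mask = rng.random(scores.shape) < 0.2            # ~20% missing at random
        incomplete = np.where(mask, np.nan, scores)

        mean_filled = SimpleImputer(strategy="mean").fit_transform(incomplete)
        em_like = IterativeImputer(random_state=0).fit_transform(incomplete)
        print(mean_filled[:1])                           # column-mean fill
        print(em_like[:1])                               # regression-based fill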

  17. Temperature and Voltage Offsets in High-ZT Thermoelectrics

    NASA Astrophysics Data System (ADS)

    Levy, George S.

    2018-06-01

    Thermodynamic temperature can take on different meanings. Kinetic temperature is an expectation value and a function of the kinetic energy distribution. Statistical temperature is a parameter of the distribution. Kinetic temperature and statistical temperature, identical in Maxwell-Boltzmann statistics, can differ in other statistics such as those of Fermi-Dirac or Bose-Einstein when a field is present. Thermal equilibrium corresponds to zero statistical temperature gradient, not zero kinetic temperature gradient. Since heat carriers in thermoelectrics are fermions, the difference between these two temperatures may explain voltage and temperature offsets observed during meticulous Seebeck measurements in which the temperature-voltage curve does not go through the origin. In conventional semiconductors, temperature offsets produced by fermionic electrical carriers are not observable because they are shorted by heat phonons in the lattice. In high-ZT materials, however, these offsets have been detected but attributed to faulty laboratory procedures. Additional supporting evidence for spontaneous voltages and temperature gradients includes data collected in epistatic experiments and in the plasma Q-machine. Device fabrication guidelines for testing the hypothesis are suggested, including using unipolar junctions stacked in a superlattice, alternating n/n+ and p/p+ junctions, selecting appropriate dimensions, doping, and loading.

  18. Temperature and Voltage Offsets in High-ZT Thermoelectrics

    NASA Astrophysics Data System (ADS)

    Levy, George S.

    2017-10-01

    Thermodynamic temperature can take on different meanings. Kinetic temperature is an expectation value and a function of the kinetic energy distribution. Statistical temperature is a parameter of the distribution. Kinetic temperature and statistical temperature, identical in Maxwell-Boltzmann statistics, can differ in other statistics such as those of Fermi-Dirac or Bose-Einstein when a field is present. Thermal equilibrium corresponds to zero statistical temperature gradient, not zero kinetic temperature gradient. Since heat carriers in thermoelectrics are fermions, the difference between these two temperatures may explain voltage and temperature offsets observed during meticulous Seebeck measurements in which the temperature-voltage curve does not go through the origin. In conventional semiconductors, temperature offsets produced by fermionic electrical carriers are not observable because they are shorted by heat phonons in the lattice. In high-ZT materials, however, these offsets have been detected but attributed to faulty laboratory procedures. Additional supporting evidence for spontaneous voltages and temperature gradients includes data collected in epistatic experiments and in the plasma Q-machine. Device fabrication guidelines for testing the hypothesis are suggested, including using unipolar junctions stacked in a superlattice, alternating n/n+ and p/p+ junctions, selecting appropriate dimensions, doping, and loading.

  19. Meta-analysis of Clinical and Radiographic Outcomes After Arthroscopic Single-Row Versus Double-Row Rotator Cuff Repair

    PubMed Central

    Perser, Karen; Godfrey, David; Bisson, Leslie

    2011-01-01

    Context: Double-row rotator cuff repair methods have improved biomechanical performance when compared with single-row repairs. Objective: To review clinical outcomes of single-row versus double-row rotator cuff repair with the hypothesis that double-row rotator cuff repair will result in better clinical and radiographic outcomes. Data Sources: Published literature from January 1980 to April 2010. Key terms included rotator cuff, prospective studies, outcomes, and suture techniques. Study Selection: The literature was systematically searched, and 5 level I and II studies were found comparing clinical outcomes of single-row and double-row rotator cuff repair. Coleman methodology scores were calculated for each article. Data Extraction: Meta-analysis was performed, with treatment effect between single row and double row for clinical outcomes and with odds ratios for radiographic results. The sample size necessary to detect a given difference in clinical outcome between the 2 methods was calculated. Results: Three level I studies had Coleman scores of 80, 74, and 81, and two level II studies had scores of 78 and 73. There were 156 patients with single-row repairs and 147 patients with double-row repairs, both with an average follow-up of 23 months (range, 12-40 months). Double-row repairs resulted in a greater treatment effect for each validated outcome measure in 4 studies, but the differences were not clinically or statistically significant (range, 0.4-2.2 points; 95% confidence interval, –0.19, 4.68 points). Double-row repairs had better radiographic results, but the differences were also not statistically significant (P = 0.13). Two studies had adequate power to detect a 10-point difference between repair methods using the Constant score, and 1 study had power to detect a 5-point difference using the UCLA (University of California, Los Angeles) score. Conclusions: Double-row rotator cuff repair does not show a statistically significant improvement in clinical outcome or radiographic healing with short-term follow-up. PMID:23016017
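
    The pooling step behind such a comparison is inverse-variance weighting of per-study effects. A minimal Python sketch follows with a fixed-effect model; the numbers are made up for illustration and are not the meta-analysis's data.

        import numpy as np

        def fixed_effect_pool(effects, variances):
            """Inverse-variance fixed-effect pooling with a 95% CI."""
            w = 1.0 / np.asarray(variances)
            est = np.sum(w * effects) / w.sum()
            se = 1.0 / np.sqrt(w.sum())
            return est, (est - 1.96 * se, est + 1.96 * se)

        effects = [0.4, 1.1, 2.2, 0.9, 1.5]      # per-study score differences
        variances = [1.2, 0.9, 2.0, 1.1, 1.6]    # per-study variances
        est, ci = fixed_effect_pool(effects, variances)
        print(f"pooled difference = {est:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")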

  20. Meta-analysis of Clinical and Radiographic Outcomes After Arthroscopic Single-Row Versus Double-Row Rotator Cuff Repair.

    PubMed

    Perser, Karen; Godfrey, David; Bisson, Leslie

    2011-05-01

    Double-row rotator cuff repair methods have improved biomechanical performance when compared with single-row repairs. To review clinical outcomes of single-row versus double-row rotator cuff repair with the hypothesis that double-row rotator cuff repair will result in better clinical and radiographic outcomes. Published literature from January 1980 to April 2010. Key terms included rotator cuff, prospective studies, outcomes, and suture techniques. The literature was systematically searched, and 5 level I and II studies were found comparing clinical outcomes of single-row and double-row rotator cuff repair. Coleman methodology scores were calculated for each article. Meta-analysis was performed, with treatment effect between single row and double row for clinical outcomes and with odds ratios for radiographic results. The sample size necessary to detect a given difference in clinical outcome between the 2 methods was calculated. Three level I studies had Coleman scores of 80, 74, and 81, and two level II studies had scores of 78 and 73. There were 156 patients with single-row repairs and 147 patients with double-row repairs, both with an average follow-up of 23 months (range, 12-40 months). Double-row repairs resulted in a greater treatment effect for each validated outcome measure in 4 studies, but the differences were not clinically or statistically significant (range, 0.4-2.2 points; 95% confidence interval, -0.19, 4.68 points). Double-row repairs had better radiographic results, but the differences were also not statistically significant (P = 0.13). Two studies had adequate power to detect a 10-point difference between repair methods using the Constant score, and 1 study had power to detect a 5-point difference using the UCLA (University of California, Los Angeles) score. Double-row rotator cuff repair does not show a statistically significant improvement in clinical outcome or radiographic healing with short-term follow-up.

  1. An automated multi-scale network-based scheme for detection and location of seismic sources

    NASA Astrophysics Data System (ADS)

    Poiata, N.; Aden-Antoniow, F.; Satriano, C.; Bernard, P.; Vilotte, J. P.; Obara, K.

    2017-12-01

    We present a recently developed method - BackTrackBB (Poiata et al. 2016) - that images energy radiation from different seismic sources (e.g., earthquakes, LFEs, tremors) in different tectonic environments using continuous seismic records. The method exploits multi-scale, frequency-selective coherence in the wave field recorded by regional seismic networks or local arrays. The detection and location scheme is based on space-time reconstruction of the seismic sources through an imaging function built from the sum of station-pair time-delay likelihood functions, projected onto theoretical 3D time-delay grids. This imaging function is interpreted as the location likelihood of the seismic source. A signal pre-processing step constructs a multi-band statistical representation of the non-stationary signal (time series) by means of higher-order statistics or energy-envelope characteristic functions. Such signal processing is designed to detect signal transients in time - of different scales and a priori unknown predominant frequency - potentially associated with a variety of sources (e.g., earthquakes, LFEs, tremors), and to improve the performance and robustness of the detection-and-location step. The initial detection-location, based on a single-phase analysis with the P- or S-phase only, can then be improved recursively in a station selection scheme. This scheme - exploiting the 3-component records - makes use of P- and S-phase characteristic functions, extracted after a polarization analysis of the event waveforms, and combines the single-phase imaging functions with the S-P differential imaging functions. The performance of the method is demonstrated here in different tectonic environments: (1) analysis of the year-long precursory phase of the 2014 Iquique earthquake in Chile; (2) detection and location of tectonic tremor sources and low-frequency earthquakes during multiple episodes of tectonic tremor activity in southwestern Japan.
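
    One of the higher-order-statistics characteristic functions mentioned here can be sketched as sliding-window kurtosis: impulsive arrivals make the local amplitude distribution heavy-tailed, so kurtosis peaks flag candidate onsets. The Python sketch below is a generic illustration of this pre-processing idea, not BackTrackBB's implementation; window length and the synthetic trace are assumptions.

        import numpy as np
        from scipy.stats import kurtosis

        def kurtosis_cf(trace, win):
            """Sliding-window kurtosis characteristic function of a trace."""
            cf = np.zeros(trace.size)
            for i in range(win, trace.size):
                cf[i] = kurtosis(trace[i - win:i])
            return cf

        rng = np.random.default_rng(11)
        noise = rng.normal(size=3000)
        noise[1500:1550] += 8 * rng.normal(size=50)      # buried transient
        cf = kurtosis_cf(noise, win=200)
        print("peak characteristic function at sample", int(cf.argmax()))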

  2. Thermographic techniques and adapted algorithms for automatic detection of foreign bodies in food

    NASA Astrophysics Data System (ADS)

    Meinlschmidt, Peter; Maergner, Volker

    2003-04-01

    At the moment, foreign substances in food are detected mainly by mechanical and optical methods as well as ultrasonic techniques, and are then removed from the further process. These techniques detect a large portion of the foreign substances thanks to their different mass (mechanical sieving), their different colour (optical methods) and their different surface density (ultrasonic detection). Despite these numerous methods, a considerable portion of the foreign substances remains undetected. In order to recognise the materials still undetected, a complementary detection method would be desirable that removes from the production process the foreign substances not registered by the aforementioned methods. In a project with 13 partners from the food industry, the Fraunhofer-Institut für Holzforschung (WKI) and the Technische Universität are trying to adapt thermography to the detection of foreign bodies in the food industry. After the initial tests turned out to be very promising for the differentiation of foodstuffs and foreign substances, more detailed investigations were carried out to develop suitable algorithms for the automatic detection of foreign bodies. In order to achieve - besides the mere visual detection of foreign substances - an automatic detection under production conditions, extensive experience in image processing and pattern recognition is exploited. Results for the detection of foreign bodies will be presented at the conference, showing the different advantages and disadvantages of using grey-level, statistical and morphological image processing techniques.

  3. Understanding photon sideband statistics and correlation for determining phonon coherence

    NASA Astrophysics Data System (ADS)

    Ding, Ding; Yin, Xiaobo; Li, Baowen

    2018-01-01

    Generating and detecting coherent high-frequency heat-carrying phonons have been topics of great interest in recent years. Although there have been successful attempts at generating and observing coherent phonons, rigorous techniques to characterize and detect phonon coherence in a crystalline material have lagged behind what has been achieved for photons. One main challenge is a lack of detailed understanding of how detection signals for phonons can be related to coherence. The quantum theory of photoelectric detection has greatly advanced the ability to characterize photon coherence in the past century, and a similar theory for phonon detection is necessary. Here, we reexamine the optical sideband fluorescence technique that has been used to detect high-frequency phonons in materials with optically active defects. We propose a quantum theory of phonon detection using the sideband technique and find that there are distinct differences in sideband counting statistics between thermal and coherent phonons. We further propose a second-order correlation function unique to sideband signals that allows for a rigorous distinction between thermal and coherent phonons. Our theory is relevant to correlation measurements with nontrivial response functions at the quantum level and can potentially bring the experimental determination of phonon coherence on par with that of photons.
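
    The thermal/coherent contrast at the heart of such correlation measurements is the second-order correlation at zero delay, g2(0) = <n(n-1)>/<n>^2, which tends to 2 for thermal statistics and 1 for coherent statistics. A minimal Python sketch of estimating it from detection counts follows; the synthetic count streams are illustrative.

        import numpy as np

        def g2_zero(counts):
            """Estimate g2(0) = <n(n-1)> / <n>**2 from counts per window."""
            n = np.asarray(counts, dtype=float)
            return (n * (n - 1)).mean() / n.mean() ** 2

        rng = np.random.default_rng(12)
        coherent = rng.poisson(3.0, 100000)                  # Poissonian counts
        thermal = rng.poisson(rng.exponential(3.0, 100000))  # Bose-Einstein mixture
        print(f"g2 coherent ~ {g2_zero(coherent):.2f}, "
              f"thermal ~ {g2_zero(thermal):.2f}")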

  4. A statistical, task-based evaluation method for three-dimensional x-ray breast imaging systems using variable-background phantoms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Subok; Jennings, Robert; Liu, Haimo

    Purpose: For the last few years, the development and optimization of three-dimensional (3D) x-ray breast imaging systems, such as digital breast tomosynthesis (DBT) and computed tomography, have drawn much attention from the medical imaging community, in both academia and industry. However, there is still much room for understanding how best to optimize and evaluate these devices over a large space of system parameters and geometries. Current evaluation methods, which work well for 2D systems, do not incorporate the depth information provided by 3D imaging systems. It is therefore critical to develop a statistically sound evaluation method to investigate the usefulness of including depth and background-variability information in the assessment and optimization of 3D systems. Methods: In this paper, we present a mathematical framework for the statistical assessment of planar and 3D x-ray breast imaging systems. Our method is based on statistical decision theory, in particular the ideal linear observer known as the Hotelling observer. We also present a physical phantom consisting of spheres of different sizes and materials for producing an ensemble of randomly varying backgrounds to be imaged for a given patient class. Lastly, we demonstrate our evaluation method by comparing laboratory mammography and three-angle DBT systems on signal detection tasks using the phantom's projection data. We compare the variable-background phantom case to a phantom of the same dimensions filled with water, which we call the uniform phantom, based on the performance of the Hotelling observer as a function of signal size and intensity. Results: Detectability trends calculated using the variable and uniform phantom methods differ from each other for both the mammography and DBT systems. Conclusions: Our results indicate that measuring a system's detection performance with consideration of background variability may lead to differences in system performance estimates and comparisons. For the assessment of 3D systems, to accurately determine trade-offs between image quality and radiation dose, it is critical to incorporate randomness arising from the imaging chain, including background variability, into system performance calculations.
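
    A minimal sketch of the Hotelling observer computation named above, assuming ROI data as (images x pixels) arrays and more images than pixels so the sample covariance is invertible; it is not the authors' implementation.

    ```python
    import numpy as np

    def hotelling_snr(signal_absent, signal_present):
        """signal_absent / signal_present: (n_images, n_pixels) ROI data arrays.
        Returns the Hotelling template w = S^-1 (s1 - s0) and the observer SNR."""
        dmean = signal_present.mean(0) - signal_absent.mean(0)
        pooled = np.vstack([signal_absent - signal_absent.mean(0),
                            signal_present - signal_present.mean(0)])
        cov = pooled.T @ pooled / (pooled.shape[0] - 2)   # pooled sample covariance
        template = np.linalg.solve(cov, dmean)
        snr2 = dmean @ template                           # Hotelling detectability
        return template, np.sqrt(snr2)

    rng = np.random.default_rng(8)
    n, npix = 500, 64                        # needs n >> npix for an invertible covariance
    bg = rng.normal(0, 1, (n, npix))         # signal-absent backgrounds (toy data)
    sp = rng.normal(0, 1, (n, npix)) + 0.3   # signal-present images (uniform toy signal)
    w, snr = hotelling_snr(bg, sp)
    print("Hotelling SNR:", round(snr, 2))
    ```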

  5. [Evaluation of the Performance of Two Kinds of Anti-TP Enzyme-Linked Immunosorbent Assay].

    PubMed

    Gao, Nan; Huang, Li-Qin; Wang, Rui; Jia, Jun-Jie; Wu, Shuo; Zhang, Jing; Ge, Hong-Wei

    2018-06-01

    To evaluate the accuracy and precision of two kinds of anti-Treponema pallidum (anti-TP) ELISA reagents used in our laboratory for detecting anti-TP in voluntary blood donors, so as to provide data to support the use of ELISA reagents after the introduction of chemiluminescence immunoassay (CLIA). Routine detection of anti-TP was performed using the two kinds of ELISA reagents; 546 reactive samples detected by anti-TP ELISA were collected, and the infection status of the samples was confirmed by the Treponema pallidum particle agglutination (TPPA) test. The confirmed results of reactive samples detected by the two anti-TP ELISA reagents were compared; the accuracy of the two reagents was analyzed by plotting ROC curves and comparing the areas under the curves (AUC), and their precision was compared by statistical analysis of quality-control data from July 1, 2016 to June 30, 2017. There was no statistically significant difference in the confirmed positive rate of reactive samples and weakly positive samples between the two reagents. Samples reactive with both reagents accounted for 85.53% (467/546) of all reactive samples; the positive rate confirmed by the TPPA test was 82.87%. Forty-four reactive samples detected by reagent A and 35 reactive samples detected by reagent B were confirmed negative by the TPPA test. Comparison of the AUCs showed that the accuracy of both reagents was high, and the difference between the two reagents was not statistically significant. The coefficients of variation (CV) of anti-TP ELISA reagents A and B were 14.98% and 18.04%, respectively, which met the precision requirements of the ELISA test. The accuracy and precision of the two anti-TP ELISA reagents used in our laboratory are similar, and either reagent can satisfy the requirements of blood screening.
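
    The ROC/AUC comparison step can be sketched generically. The AUC below is computed as the Mann-Whitney statistic from assay signal values; the scores are simulated stand-ins, not the study's measurements.

    ```python
    # AUC as the Mann-Whitney statistic: P(score_pos > score_neg), ties half.
    import numpy as np

    def auc_mann_whitney(neg_scores, pos_scores):
        neg, pos = np.asarray(neg_scores), np.asarray(pos_scores)
        greater = (pos[:, None] > neg[None, :]).sum()
        ties = (pos[:, None] == neg[None, :]).sum()
        return (greater + 0.5 * ties) / (len(pos) * len(neg))

    rng = np.random.default_rng(1)
    neg = rng.normal(0.6, 0.3, 79)    # TPPA-negative reactive samples (simulated)
    pos = rng.normal(2.5, 0.8, 467)   # TPPA-confirmed positives (simulated)
    print(auc_mann_whitney(neg, pos))
    ```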

  6. OSSOS: X. How to use a Survey Simulator: Statistical Testing of Dynamical Models Against the Real Kuiper Belt

    NASA Astrophysics Data System (ADS)

    Lawler, Samantha M.; Kavelaars, J. J.; Alexandersen, Mike; Bannister, Michele T.; Gladman, Brett; Petit, Jean-Marc; Shankman, Cory

    2018-05-01

    All surveys include observational biases, which makes it impossible to directly compare properties of discovered trans-Neptunian Objects (TNOs) with dynamical models. However, by carefully keeping track of survey pointings on the sky, detection limits, tracking fractions, and rate cuts, the biases from a survey can be modelled in Survey Simulator software. A Survey Simulator takes an intrinsic orbital model (from, for example, the output of a dynamical Kuiper belt emplacement simulation) and applies the survey biases, so that the biased simulated objects can be directly compared with real discoveries. This methodology has been used with great success in the Outer Solar System Origins Survey (OSSOS) and its predecessor surveys. In this chapter, we give four examples of ways to use the OSSOS Survey Simulator to gain knowledge about the true structure of the Kuiper Belt. We demonstrate how to statistically compare different dynamical model outputs with real TNO discoveries, how to quantify detection biases within a TNO population, how to measure intrinsic population sizes, and how to use upper limits from non-detections. We hope this will provide a framework for dynamical modellers to statistically test the validity of their models.
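
    A toy illustration of the survey-simulator idea, not the OSSOS software: draw objects from an intrinsic model, apply an assumed magnitude-dependent detection efficiency, and statistically compare the biased sample with discoveries.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)

    def efficiency(m, m50=24.5, width=0.4):
        """Detection probability vs apparent magnitude (assumed functional form)."""
        return 1.0 / (1.0 + np.exp((m - m50) / width))

    # Intrinsic model: distant objects are fainter, so the survey under-detects them
    model_a = rng.uniform(39, 48, 100_000)                        # semimajor axis (AU)
    model_m = 23.0 + 0.2 * (model_a - 39) + rng.normal(0, 0.5, model_a.size)

    detected = rng.random(model_a.size) < efficiency(model_m)
    biased_a = model_a[detected]                                  # survey-biased simulated sample

    observed_a = rng.uniform(39, 48, 300)                         # stand-in for real discoveries
    print(stats.ks_2samp(biased_a, observed_a))                   # compare biased model vs "data"
    ```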

  7. Drug Adverse Event Detection in Health Plan Data Using the Gamma Poisson Shrinker and Comparison to the Tree-based Scan Statistic

    PubMed Central

    Brown, Jeffrey S.; Petronis, Kenneth R.; Bate, Andrew; Zhang, Fang; Dashevsky, Inna; Kulldorff, Martin; Avery, Taliser R.; Davis, Robert L.; Chan, K. Arnold; Andrade, Susan E.; Boudreau, Denise; Gunter, Margaret J.; Herrinton, Lisa; Pawloski, Pamala A.; Raebel, Marsha A.; Roblin, Douglas; Smith, David; Reynolds, Robert

    2013-01-01

    Background: Drug adverse event (AE) signal detection using the Gamma Poisson Shrinker (GPS) is commonly applied in spontaneous reporting. AE signal detection using large observational health plan databases can expand medication safety surveillance. Methods: Using data from nine health plans, we conducted a pilot study to evaluate the implementation and findings of the GPS approach for two antifungal drugs, terbinafine and itraconazole, and two diabetes drugs, pioglitazone and rosiglitazone. We evaluated 1676 diagnosis codes grouped into 183 different clinical concepts and four levels of granularity. Several signaling thresholds were assessed. GPS results were compared to findings from a companion study using the identical analytic dataset but an alternative statistical method—the tree-based scan statistic (TreeScan). Results: We identified 71 statistical signals across two signaling thresholds and two methods, including closely-related signals of overlapping diagnosis definitions. Initial review found that most signals represented known adverse drug reactions or confounding. About 31% of signals met the highest signaling threshold. Conclusions: The GPS method was successfully applied to observational health plan data in a distributed data environment as a drug safety data mining method. There was substantial concordance between the GPS and TreeScan approaches. Key method implementation decisions relate to defining exposures and outcomes and informed choice of signaling thresholds. PMID:24300404

  8. Differences in Looking at Own- and Other-Race Faces Are Subtle and Analysis-Dependent: An Account of Discrepant Reports.

    PubMed

    Arizpe, Joseph; Kravitz, Dwight J; Walsh, Vincent; Yovel, Galit; Baker, Chris I

    2016-01-01

    The Other-Race Effect (ORE) is the robust and well-established finding that people are generally poorer at facial recognition of individuals of another race than of their own race. Over the past four decades, much research has focused on the ORE because understanding this phenomenon is expected to elucidate fundamental face processing mechanisms and the influence of experience on such mechanisms. Several recent studies of the ORE in which the eye-movements of participants viewing own- and other-race faces were tracked have, however, reported highly conflicting results regarding the presence or absence of differential patterns of eye-movements to own- versus other-race faces. This discrepancy, of course, leads to conflicting theoretical interpretations of the perceptual basis for the ORE. Here we investigate fixation patterns to own- versus other-race (African and Chinese) faces for Caucasian participants using different analysis methods. While we detect statistically significant, though subtle, differences in fixation pattern using an Area of Interest (AOI) approach, we fail to detect significant differences when applying a spatial density map approach. Though there were no significant differences in the spatial density maps, the qualitative patterns matched the results from the AOI analyses, reflecting how, in certain contexts, AOI analyses can be more sensitive than spatial density analyses in detecting differential fixation patterns, owing to the spatial pooling of data within AOIs. AOI analyses, however, also come with the limitation of requiring a priori specification. These findings provide evidence that the conflicting reports in the prior literature may be at least partially accounted for by differences in the statistical sensitivity of the analysis methods employed across studies. Overall, our results suggest that the detection of differences in eye-movement patterns can be analysis-dependent and rests on the assumptions inherent in the given analysis.

  9. Differences in Looking at Own- and Other-Race Faces Are Subtle and Analysis-Dependent: An Account of Discrepant Reports

    PubMed Central

    Arizpe, Joseph; Kravitz, Dwight J.; Walsh, Vincent; Yovel, Galit; Baker, Chris I.

    2016-01-01

    The Other-Race Effect (ORE) is the robust and well-established finding that people are generally poorer at facial recognition of individuals of another race than of their own race. Over the past four decades, much research has focused on the ORE because understanding this phenomenon is expected to elucidate fundamental face processing mechanisms and the influence of experience on such mechanisms. Several recent studies of the ORE in which the eye-movements of participants viewing own- and other-race faces were tracked have, however, reported highly conflicting results regarding the presence or absence of differential patterns of eye-movements to own- versus other-race faces. This discrepancy, of course, leads to conflicting theoretical interpretations of the perceptual basis for the ORE. Here we investigate fixation patterns to own- versus other-race (African and Chinese) faces for Caucasian participants using different analysis methods. While we detect statistically significant, though subtle, differences in fixation pattern using an Area of Interest (AOI) approach, we fail to detect significant differences when applying a spatial density map approach. Though there were no significant differences in the spatial density maps, the qualitative patterns matched the results from the AOI analyses, reflecting how, in certain contexts, AOI analyses can be more sensitive than spatial density analyses in detecting differential fixation patterns, owing to the spatial pooling of data within AOIs. AOI analyses, however, also come with the limitation of requiring a priori specification. These findings provide evidence that the conflicting reports in the prior literature may be at least partially accounted for by differences in the statistical sensitivity of the analysis methods employed across studies. Overall, our results suggest that the detection of differences in eye-movement patterns can be analysis-dependent and rests on the assumptions inherent in the given analysis. PMID:26849447

  10. Detection of Fatty Acids from Intact Microorganisms by Molecular Beam Static Secondary Ion Mass Spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ingram, Jani Cheri; Lehman, Richard Michael; Bauer, William Francis

    We report the use of a surface analysis approach, static secondary ion mass spectrometry (SIMS) equipped with a molecular (ReO4-) primary ion beam, to analyze the surface of intact microbial cells. SIMS spectra of 28 microorganisms were compared to fatty acid profiles determined by gas chromatographic analysis of transesterified fatty acids extracted from the same organisms. The results indicate that surface bombardment using the molecular primary beam cleaved the ester linkage characteristic of bacteria at the glycerophosphate backbone of the phospholipid components of the cell membrane. This cleavage enables direct detection of the fatty acid conjugate base of intact microorganisms by static SIMS. The limit of detection for this approach is approximately 10^7 bacterial cells/cm^2. Multivariate statistical methods were applied in a graded approach to the SIMS microbial data. The results showed that the full data set could initially be statistically grouped based upon major differences in the biochemical composition of the cell wall. The gram-positive bacteria were further statistically analyzed, followed by a final analysis of a specific bacterial genus, which was successfully grouped by species. Additionally, the use of SIMS to detect microbes on mineral surfaces is demonstrated by an analysis of Shewanella oneidensis on crushed hematite. The results of this study provide evidence for the potential of static SIMS to rapidly detect bacterial species based on ion fragments originating from cell membrane lipids directly from sample surfaces.
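
    The graded multivariate grouping could look like the following sketch: PCA via SVD followed by hierarchical clustering. The spectra are simulated placeholders, and the component and cluster counts are assumptions.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(7)
    spectra = rng.random((28, 200))                   # 28 organisms x 200 m/z bins (placeholder)
    spectra /= spectra.sum(axis=1, keepdims=True)     # normalize total ion counts

    # PCA via SVD of the mean-centered matrix
    X = spectra - spectra.mean(0)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    scores = U[:, :3] * s[:3]                         # first three principal components

    Z = linkage(scores, method="ward")                # hierarchical clustering
    groups = fcluster(Z, t=4, criterion="maxclust")   # e.g. four coarse groups
    print(groups)
    ```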

  11. Detection of crossover time scales in multifractal detrended fluctuation analysis

    NASA Astrophysics Data System (ADS)

    Ge, Erjia; Leung, Yee

    2013-04-01

    Fractal analysis is employed in this paper as a scale-based method for identifying the scaling behavior of time series. Many spatial and temporal processes exhibiting complex multi(mono)-scaling behaviors are fractals. One of the important concepts in fractals is the crossover time scale(s) that separates distinct regimes having different fractal scaling behaviors. A common method is multifractal detrended fluctuation analysis (MF-DFA). The detection of crossover time scale(s), however, is relatively subjective, since it has been made without rigorous statistical procedures and has generally been determined by eyeballing or subjective observation. Crossover time scales so determined may be spurious and problematic and may not reflect the genuine underlying scaling behavior of a time series. The purpose of this paper is to propose a statistical procedure to model complex fractal scaling behaviors and reliably identify the crossover time scales under MF-DFA. The scaling-identification regression model, grounded on a solid statistical foundation, is first proposed to describe the multi-scaling behaviors of fractals. Through regression analysis and statistical inference, we can (1) identify crossover time scales that cannot be detected by eyeballing, (2) determine the number and locations of the genuine crossover time scales, (3) give confidence intervals for the crossover time scales, and (4) establish a statistically significant regression model depicting the underlying scaling behavior of a time series. To substantiate our argument, the regression model is applied to analyze the multi-scaling behaviors of avian-influenza outbreaks, water consumption, daily mean temperature, and rainfall in Hong Kong. Through the proposed model, we can gain a deeper understanding of fractals in general and a statistical approach to identifying multi-scaling behavior under MF-DFA in particular.
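
    One simple way to locate a crossover statistically, in the spirit of (though much simpler than) the proposed regression model, is a two-segment fit to the log-log fluctuation function with the breakpoint chosen by minimum residual sum of squares; the data below are simulated.

    ```python
    import numpy as np

    log_s = np.linspace(1, 3, 40)              # log10 of time scales
    h1, h2, cross = 1.2, 0.5, 2.0              # assumed scaling exponents and breakpoint
    log_F = np.where(log_s < cross,
                     h1 * log_s,
                     h1 * cross + h2 * (log_s - cross))
    log_F = log_F + np.random.default_rng(3).normal(0, 0.02, log_s.size)

    def two_segment_rss(b):
        """RSS of a continuous two-segment linear fit with break at b."""
        X = np.column_stack([np.ones_like(log_s), log_s, np.clip(log_s - b, 0, None)])
        beta, *_ = np.linalg.lstsq(X, log_F, rcond=None)
        return ((X @ beta - log_F) ** 2).sum()

    candidates = log_s[5:-5]                   # keep a few points in each segment
    best = candidates[np.argmin([two_segment_rss(b) for b in candidates])]
    print(f"estimated crossover at log10(s) = {best:.2f}")
    ```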

  12. Using permutations to detect dependence between time series

    NASA Astrophysics Data System (ADS)

    Cánovas, Jose S.; Guillamón, Antonio; Ruíz, María del Carmen

    2011-07-01

    In this paper, we propose an independence test between two time series based on permutations. The proposed test can be carried out by means of different common statistics, such as Pearson's chi-square or the likelihood ratio. We also point out why an exact test is necessary. Simulated and real data (exchange-rate returns between several currencies) reveal the capacity of this test to detect linear and nonlinear dependences.
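
    A hedged sketch of a permutation-style independence test on ordinal patterns; it illustrates the general idea rather than the authors' exact procedure, and the series are simulated.

    ```python
    import numpy as np
    from itertools import permutations

    def ordinal_patterns(x, order=3):
        """Encode each consecutive window of length `order` by its rank pattern."""
        idx = {p: i for i, p in enumerate(permutations(range(order)))}
        windows = np.lib.stride_tricks.sliding_window_view(x, order)
        return np.array([idx[tuple(np.argsort(w))] for w in windows])

    def chi2_stat(a, b, k=6):
        """Pearson chi-square of the joint pattern contingency table."""
        table = np.zeros((k, k))
        np.add.at(table, (a, b), 1)
        exp = table.sum(1, keepdims=True) * table.sum(0, keepdims=True) / table.sum()
        mask = exp > 0
        return ((table - exp)[mask] ** 2 / exp[mask]).sum()

    rng = np.random.default_rng(5)
    x = rng.normal(size=500)
    y = 0.7 * x + 0.3 * rng.normal(size=500)     # dependent series (toy example)
    a, b = ordinal_patterns(x), ordinal_patterns(y)
    obs = chi2_stat(a, b)
    perm = [chi2_stat(a, rng.permutation(b)) for _ in range(999)]
    print("p =", (1 + sum(p >= obs for p in perm)) / 1000)
    ```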

  13. Tables of square-law signal detection statistics for Hann spectra with 50 percent overlap

    NASA Technical Reports Server (NTRS)

    Deans, Stanley R.; Cullers, D. Kent

    1991-01-01

    The Search for Extraterrestrial Intelligence, currently being planned by NASA, will require that an enormous amount of data be analyzed in real time by special-purpose hardware. It is expected that overlapped Hann data windows will play an important role in this analysis. In order to understand the statistical implications of this approach, it has been necessary to compute detection statistics for overlapped Hann spectra. Tables of signal detection statistics are given for false alarm rates from 10^-14 to 10^-1 and signal detection probabilities from 0.50 to 0.99; the number of computed spectra ranges from 4 to 2000.
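
    The kind of square-law detection statistic tabulated in this report can be sketched with standard chi-square machinery, assuming independent spectra (which overlapped Hann windows only approximate) and two degrees of freedom per spectrum.

    ```python
    # Find the H0 threshold for a target false alarm rate, then the
    # noncentrality (signal energy) giving the desired detection probability.
    import numpy as np
    from scipy import stats, optimize

    def required_snr(n_spectra, pfa, pd):
        df = 2 * n_spectra                                   # 2 dof per complex bin
        thresh = stats.chi2.isf(pfa, df)                     # noise-only threshold
        f = lambda nc: stats.ncx2.sf(thresh, df, nc) - pd    # detection prob vs noncentrality
        nc = optimize.brentq(f, 1e-6, 1e4)
        return thresh, nc

    print(required_snr(n_spectra=4, pfa=1e-14, pd=0.99))
    ```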

  14. A space-time scan statistic for detecting emerging outbreaks.

    PubMed

    Tango, Toshiro; Takahashi, Kunihiko; Kohriyama, Kazuaki

    2011-03-01

    As a major analytical method for outbreak detection, Kulldorff's space-time scan statistic (2001, Journal of the Royal Statistical Society, Series A 164, 61-72) has been implemented in many syndromic surveillance systems. Since it is based on circular windows in space, however, it has difficulty correctly detecting actual noncircular clusters. Takahashi et al. (2008, International Journal of Health Geographics 7, 14) proposed a flexible space-time scan statistic capable of detecting noncircular areas. It seems to us, however, that the detection of the most likely cluster defined in these space-time scan statistics is not the same as the detection of localized emerging disease outbreaks, because the former compares the observed number of cases with the conditional expected number of cases. In this article, we propose a new space-time scan statistic that compares the observed number of cases with the unconditional expected number of cases, takes time-to-time variation of the Poisson mean into account, and implements an outbreak model to capture localized emerging disease outbreaks in a more timely and accurate manner. The proposed models are illustrated with data from weekly surveillance of the number of absentees in primary schools in Kitakyushu-shi, Japan, 2006. © 2010, The International Biometric Society.
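
    The core scan-statistic ingredient, a Poisson log-likelihood ratio of observed versus expected counts over candidate time windows, can be sketched as follows; the counts and baseline are toy values, and this is not the proposed conditional/unconditional machinery.

    ```python
    import numpy as np

    def scan_llr(obs, exp):
        """Log-likelihood ratio for one candidate window (elevated Poisson rate)."""
        if obs <= exp:
            return 0.0
        return obs * np.log(obs / exp) - (obs - exp)

    counts = np.array([3, 2, 4, 3, 2, 3, 9, 11])   # weekly case counts (toy data)
    baseline = 3.0                                 # expected cases per week (assumed)
    for w in (1, 2, 3):                            # trailing windows ending at "now"
        obs, exp = counts[-w:].sum(), w * baseline
        print(f"window={w}w  LLR={scan_llr(obs, exp):.2f}")
    # Significance would be judged against a Monte Carlo null distribution.
    ```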

  15. ANESTHETIC INDUCTION AND RECOVERY PARAMETERS IN BEARDED DRAGONS (POGONA VITTICEPS): COMPARISON OF ISOFLURANE DELIVERED IN 100% OXYGEN VERSUS 21% OXYGEN.

    PubMed

    O, Odette; Churgin, Sarah M; Sladky, Kurt K; Smith, Lesley J

    2015-09-01

    Inland bearded dragons (Pogona vitticeps, n=6) were anesthetized for 1 hr using isoflurane in either 100% oxygen or 21% oxygen (FI 21; medical-grade room air). Parameters of anesthetic depth were recorded throughout both induction and recovery by an observer blinded to the fraction of inspired oxygen (FiO2), including the loss and return of withdrawal and righting reflexes, muscle tone, ability to intubate or extubate, and return of spontaneous respiration. Physiologic data were recorded every 5 min throughout the anesthetic procedures, including heart rate, body temperature, end-tidal CO2, hemoglobin oxygen saturation (SpO2), and percent expired isoflurane. Lizards were subjected to application of a noxious stimulus (needle stick) at 0, 30, and 60 min, and responses were recorded. Following a minimum 7-day washout period, the experiment was repeated with each lizard subjected to the other protocol in a randomized, complete crossover design. The only statistically significant difference was a lower mean SpO2 in the group inspiring 21% oxygen (P < 0.002). No statistically significant differences were detected in any parameters during induction or recovery; however, all values were uniformly shorter for the FI 21 group, indicating a possible clinically relevant difference. A larger sample size might have detected statistically significant differences. Further studies are needed to evaluate these effects in other reptile species and with the concurrent use of injectable anesthetic and analgesic drugs.

  16. Dental enamel defect diagnosis through different technology-based devices.

    PubMed

    Kobayashi, Tatiana Yuriko; Vitor, Luciana Lourenço Ribeiro; Carrara, Cleide Felício Carvalho; Silva, Thiago Cruvinel; Rios, Daniela; Machado, Maria Aparecida Andrade Moreira; Oliveira, Thais Marchini

    2018-06-01

    Dental enamel defects (DEDs) are faulty or deficient enamel formations of primary and permanent teeth. Changes during tooth development result in hypoplasia (a quantitative defect) and/or hypomineralisation (a qualitative defect). To compare technology-based diagnostic methods for detecting DEDs. Two-hundred and nine dental surfaces of anterior permanent teeth were selected in patients, 6-11 years of age, with cleft lip with/without cleft palate. First, a conventional clinical examination was conducted according to the modified Developmental Defects of Enamel Index (DDE Index). Dental surfaces were evaluated using an operating microscope and a fluorescence-based device. Interexaminer reproducibility was determined using the kappa test. To compare groups, McNemar's test was used. Cramer's V test was used for comparing the distribution of index codes obtained after classification of all dental surfaces. Cramer's V test revealed statistically significant differences (P < .0001) in the distribution of index codes obtained using the different methods; the coefficients were 0.365 for conventional clinical examination versus fluorescence, 0.961 for conventional clinical examination versus operating microscope and 0.358 for operating microscope versus fluorescence. The sensitivity of the operating microscope and fluorescence method was statistically significant (P = .008 and P < .0001, respectively). Otherwise, the results did not show statistically significant differences in accuracy and specificity for either the operating microscope or the fluorescence methods. This study suggests that the operating microscope performed better than the fluorescence-based device and could be an auxiliary method for the detection of DEDs. © 2017 FDI World Dental Federation.

  17. Seroprevalence and risk factors of Neospora spp. and Toxoplasma gondii infections among horses and donkeys in Nigeria, West Africa.

    PubMed

    Bártová, Eva; Sedlák, Kamil; Kobédová, Kateřina; Budíková, Marie; Joel Atuman, Yakubu; Kamani, Joshua

    2017-09-26

    Neospora spp. and Toxoplasma gondii are considered globally distributed parasites affecting a wide range of warm-blooded animals. Neosporosis has caused clinical illness in horses, and consumption of horse meat has been epidemiologically linked to clinical toxoplasmosis in humans. This study was conducted to determine Neospora spp. and T. gondii antibodies and risk factors for infection in horses and donkeys from three states of Nigeria. A total of 144 samples were collected from clinically healthy animals (120 horses and 24 donkeys). The sera were tested for antibodies to Neospora spp. and T. gondii by the indirect fluorescence antibody test; a titer ≥ 50 was considered positive. Seroprevalence data were statistically analyzed, considering the variables of gender, age, use, state, origin of breed, and type of management. Antibodies to Neospora spp. and T. gondii were detected in 8% of horses with titers of 50 and in 24% of horses with titers of 50-800, respectively. Co-infection with both parasites was demonstrated in three horses (3%). Statistical differences were found only for T. gondii seroprevalence in horses of different use, locality, origin, and management (p-value ≤ 0.05). Antibodies to T. gondii were detected in four (17%) of 24 donkeys, with a statistical difference (p-value ≤ 0.05) among animals of different use; antibodies to Neospora spp. were not detected in any of the donkeys. This is the first seroprevalence study of Neospora spp. and T. gondii in equids from Nigeria.

  18. Designing image segmentation studies: Statistical power, sample size and reference standard quality.

    PubMed

    Gibson, Eli; Hu, Yipeng; Huisman, Henkjan J; Barratt, Dean C

    2017-12-01

    Segmentation algorithms are typically evaluated by comparison to an accepted reference standard. The cost of generating accurate reference standards for medical image segmentation can be substantial. Since the study cost and the likelihood of detecting a clinically meaningful difference in accuracy both depend on the size and on the quality of the study reference standard, balancing these trade-offs supports the efficient use of research resources. In this work, we derive a statistical power calculation that enables researchers to estimate the appropriate sample size to detect clinically meaningful differences in segmentation accuracy (i.e. the proportion of voxels matching the reference standard) between two algorithms. Furthermore, we derive a formula to relate reference standard errors to their effect on the sample sizes of studies using lower-quality (but potentially more affordable and practically available) reference standards. The accuracy of the derived sample size formula was estimated through Monte Carlo simulation, demonstrating, with 95% confidence, a predicted statistical power within 4% of simulated values across a range of model parameters. This corresponds to sample size errors of less than 4 subjects and errors in the detectable accuracy difference less than 0.6%. The applicability of the formula to real-world data was assessed using bootstrap resampling simulations for pairs of algorithms from the PROMISE12 prostate MR segmentation challenge data set. The model predicted the simulated power for the majority of algorithm pairs within 4% for simulated experiments using a high-quality reference standard and within 6% for simulated experiments using a low-quality reference standard. A case study, also based on the PROMISE12 data, illustrates using the formulae to evaluate whether to use a lower-quality reference standard in a prostate segmentation study. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
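
    For orientation, the textbook two-proportion sample-size calculation below captures the flavor of such a power analysis; the paper's own formula additionally handles reference-standard quality and within-image correlations, which this sketch ignores.

    ```python
    import math
    from scipy.stats import norm

    def n_per_group(p1, p2, alpha=0.05, power=0.8):
        """Subjects per arm to detect accuracy p1 vs p2 (normal approximation)."""
        za = norm.isf(alpha / 2)      # two-sided alpha
        zb = norm.isf(1 - power)      # z corresponding to the desired power
        var = p1 * (1 - p1) + p2 * (1 - p2)
        return math.ceil((za + zb) ** 2 * var / (p1 - p2) ** 2)

    print(n_per_group(0.85, 0.88))    # detect a 3-point accuracy difference
    ```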

  19. The impact of registration accuracy on imaging validation study design: A novel statistical power calculation.

    PubMed

    Gibson, Eli; Fenster, Aaron; Ward, Aaron D

    2013-10-01

    Novel imaging modalities are pushing the boundaries of what is possible in medical imaging, but their signal properties are not always well understood. The evaluation of these novel imaging modalities is critical to achieving their research and clinical potential. Image registration of novel modalities to accepted reference standard modalities is an important part of characterizing the modalities and elucidating the effect of underlying focal disease on the imaging signal. The strengths of the conclusions drawn from these analyses are limited by statistical power. Based on the observation that in this context, statistical power depends in part on uncertainty arising from registration error, we derive a power calculation formula relating registration error, number of subjects, and the minimum detectable difference between normal and pathologic regions on imaging, for an imaging validation study design that accommodates signal correlations within image regions. Monte Carlo simulations were used to evaluate the derived models and test the strength of their assumptions, showing that the model yielded predictions of the power, the number of subjects, and the minimum detectable difference of simulated experiments accurate to within a maximum error of 1% when the assumptions of the derivation were met, and characterizing sensitivities of the model to violations of the assumptions. The use of these formulae is illustrated through a calculation of the number of subjects required for a case study, modeled closely after a prostate cancer imaging validation study currently taking place at our institution. The power calculation formulae address three central questions in the design of imaging validation studies: (1) What is the maximum acceptable registration error? (2) How many subjects are needed? (3) What is the minimum detectable difference between normal and pathologic image regions? Copyright © 2013 Elsevier B.V. All rights reserved.

  20. Sensitivity of direct immunofluorescence in oral diseases. Study of 125 cases.

    PubMed

    Sano, Susana Mariela; Quarracino, María Cecilia; Aguas, Silvia Cristina; González, Ernestina Jesús; Harada, Laura; Krupitzki, Hugo; Mordoh, Ana

    2008-05-01

    Direct immunofluorescence (DIF) is widely used for the diagnosis of bullous diseases and other autoimmune pathologies such as oral lichen planus (OLP). There is no evidence in the literature on how the following variables influence the detection rate of DIF: the intraoral site chosen for the biopsy, a perilesional versus distant site from the clinical lesion, the number of biopsies, and the instrument used. The aim was to determine whether these variables influenced the sensitivity (detection rate): intraoral site chosen for the biopsy, perilesional or distant site from the clinical lesion, number of biopsies, and instrument used (punch or scalpel). A retrospective study was done at the Cátedra de Patología y Clínica Bucodental II at the Facultad de Odontología, Universidad de Buenos Aires; 136 clinical histories were reviewed for the period March 2000 - March 2005, corresponding to patients with clinical diagnoses of OLP and bullous diseases (pemphigus vulgaris, bullous pemphigoid and cicatricial pemphigoid). The DIF detection rate was 65.8% in patients with OLP, 66.7% in cicatricial pemphigoid, 55.6% in bullous pemphigoid, and 100% in pemphigus vulgaris; in cases in which a definitive diagnosis could not be obtained, the DIF positivity rate was 45.5% (Pearson χ²(4) = 21.54, Pr = 0.000). There was no statistically significant difference between the different sites of biopsy (Fisher exact test: 0.825). The DIF detection rate in perilesional biopsies was 66.1% and in those distant from the clinical lesion 64.7% (Pearson χ²(1) = 0.0073, Pr = 0.932). When the number of biopsies was increased, the DIF detection rate also increased (Pearson χ² = 8.72, Pr = 0.003). Biopsies taken with a punch had a higher detection rate than those taken with a scalpel (71.7% versus 39.1%) (Pearson χ² = 49.05, Pr = 0.000). While not statistically significant, the tendency outlined in this study indicates that there are intraoral regions in which the detection rate of the DIF technique is higher than in others: floor of the mouth, hard palate, upper labial mucosa, and ventral tongue. This finding could allow the choice of accessible locations that are easy for the operator to manipulate, even at sites distant from the clinical lesion. Perilesional biopsies have a detection rate similar to those taken distant from the clinical lesion, whereas the increase with the number of biopsies and the advantage of punch over scalpel were both statistically significant.

  1. Comparison of three different detectors applied to synthetic aperture radar data

    NASA Astrophysics Data System (ADS)

    Ranney, Kenneth I.; Khatri, Hiralal; Nguyen, Lam H.

    2002-08-01

    The U.S. Army Research Laboratory has investigated the relative performance of three different target detection paradigms applied to foliage penetration (FOPEN) synthetic aperture radar (SAR) data. The three detectors - a quadratic polynomial discriminator (QPD), Bayesian neural network (BNN) and a support vector machine (SVM) - utilize a common collection of statistics (feature values) calculated from the fully polarimetric FOPEN data. We describe the parametric variations required as part of the algorithm optimizations, and we present the relative performance of the detectors in terms of probability of false alarm (Pfa) and probability of detection (Pd).

  2. Performance of RVGui sensor and Kodak Ektaspeed Plus film for proximal caries detection.

    PubMed

    Abreu, M; Mol, A; Ludlow, J B

    2001-03-01

    Three acquisition modes of Trophy's new high-resolution RVGui charge-coupled device sensor were compared with Kodak Ektaspeed Plus film with respect to proximal caries detection. Images of the proximal surfaces of 40 extracted posterior teeth were evaluated by 6 observers. The presence or absence of caries was scored by means of a 5-point confidence scale. The actual caries status of each surface was determined through ground-section histology. Responses were evaluated by means of receiver operating characteristic (ROC) analysis. Areas under the ROC curves (Az) were assessed through analysis of variance. The mean Az scores were 0.85 for film, 0.84 for the high-resolution caries mode, and 0.82 for both the low-resolution caries mode and the high-resolution periodontal mode. These differences were not statistically significant (P = .70). The differences among observers also were not statistically significant (P = .23). The performance of the RVGui sensor in high- and low-resolution modes for proximal caries detection is comparable to that of Ektaspeed Plus film.

  3. The Performance Analysis Based on SAR Sample Covariance Matrix

    PubMed Central

    Erten, Esra

    2012-01-01

    Multi-channel systems appear in several fields of application in science. In the Synthetic Aperture Radar (SAR) context, multi-channel systems may refer to different domains, such as multi-polarization, multi-interferometric or multi-temporal data, or even a combination of them. Due to the inherent speckle phenomenon present in SAR images, a statistical description of the data is almost mandatory for its utilization. Complex images acquired over natural media present, in general, zero-mean circular Gaussian characteristics. In this case, second-order statistics such as the multi-channel covariance matrix fully describe the data. In practical situations, however, the covariance matrix has to be estimated from a limited number of samples, and this sample covariance matrix follows the complex Wishart distribution. In this context, the eigendecomposition of the multi-channel covariance matrix has been shown to be of high relevance in different areas regarding the physical properties of the imaged scene. Specifically, the maximum eigenvalue of the covariance matrix has been frequently used in applications such as target or change detection, estimation of the dominant scattering mechanism in polarimetric data, moving target indication, etc. In this paper, the statistical behavior of the maximum eigenvalue derived from the eigendecomposition of the sample multi-channel covariance matrix of SAR images is presented in simplified form for the SAR community. Validation is performed against simulated data, and examples of estimation and detection problems using the analytical expressions are given as well. PMID:22736976
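
    The sample-covariance behavior described above is easy to explore by Monte Carlo; the sketch simulates multi-channel circular complex Gaussian data and tabulates the maximum eigenvalue, with the channel count and number of looks as assumed values.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    p, n, trials = 3, 16, 5000             # channels, looks (samples), Monte Carlo runs

    def sample_cov_max_eig():
        z = (rng.normal(size=(p, n)) + 1j * rng.normal(size=(p, n))) / np.sqrt(2)
        C = z @ z.conj().T / n             # sample covariance (complex Wishart / n)
        return np.linalg.eigvalsh(C)[-1]   # maximum eigenvalue (Hermitian solver)

    max_eigs = np.array([sample_cov_max_eig() for _ in range(trials)])
    print(max_eigs.mean(), np.quantile(max_eigs, 0.95))  # e.g. a 5% detection threshold
    ```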

  4. Electrocardiography in two subspecies of manatee (Trichechus manatus latirostris and Trichechus manatus manatus)

    USGS Publications Warehouse

    Siegal-Willott, J.; Estrada, A.; Bonde, R.K.; Wong, A.; Estrada, D.J.; Harr, K.

    2006-01-01

    Electrocardiographic (ECG) measurements were recorded in two subspecies of awake, apparently healthy, wild manatees (Trichechus manatus latirostris and T. m. manatus) undergoing routine field examinations in Florida and Belize. Six unsedated juveniles (dependent and independent calves) and 6 adults were restrained in ventral recumbency for ECG measurements. Six-lead ECGs were recorded for all manatees and the following parameters were determined: heart rate and rhythm; P, QRS, and T wave morphology, amplitude, and duration; and mean electrical axis (MEA). Statistical differences were determined using a t-test for equality of means. No statistical difference was seen based on sex or subspecies of manatee in the above measured criteria. Statistical differences existed in heart rate (P = 0.047), P wave duration (P = 0.019), PR interval (P = 0.025), and MEA (P = 0.021) between adult manatees and calves. Our findings revealed normal sinus rhythms, no detectable arrhythmias, prolonged PR and QT intervals, prolonged P wave duration, and small R wave amplitude compared with cetaceans and other marine mammals. This paper documents the techniques for and baseline recordings of ECGs in juvenile and adult free-living manatees. It also demonstrates that continual assessment of cardiac electrical activity in the awake manatee can be accomplished and can be used to aid veterinarians and biologists in routine health assessment, during procedures, and in detecting the presence of cardiac disease or dysfunction.

  5. Electrocardiography in two subspecies of manatee (Trichechus manatus latirostris and T. m. manatus).

    PubMed

    Siegal-Willott, Jessica; Estrada, Amara; Bonde, Robert; Wong, Arthur; Estrada, Daniel J; Harr, Kendal

    2006-12-01

    Electrocardiographic (ECG) measurements were recorded in two subspecies of awake, apparently healthy, wild manatees (Trichechus manatus latirostris and T. m. manatus) undergoing routine field examinations in Florida and Belize. Six unsedated juveniles (dependent and independent calves) and 6 adults were restrained in ventral recumbency for ECG measurements. Six-lead ECGs were recorded for all manatees and the following parameters were determined: heart rate and rhythm; P, QRS, and T wave morphology, amplitude, and duration; and mean electrical axis (MEA). Statistical differences were determined using a t-test for equality of means. No statistical difference was seen based on sex or subspecies of manatee in the above measured criteria. Statistical differences existed in heart rate (P = 0.047), P wave duration (P = 0.019), PR interval (P = 0.025), and MEA (P = 0.021) between adult manatees and calves. Our findings revealed normal sinus rhythms, no detectable arrhythmias, prolonged PR and QT intervals, prolonged P wave duration, and small R wave amplitude compared with cetaceans and other marine mammals. This paper documents the techniques for and baseline recordings of ECGs in juvenile and adult free-living manatees. It also demonstrates that continual assessment of cardiac electrical activity in the awake manatee can be accomplished and can be used to aid veterinarians and biologists in routine health assessment, during procedures, and in detecting the presence of cardiac disease or dysfunction.

  6. Detection efficiency of auditory steady state evoked by modulated noise.

    PubMed

    Santos, T S; Silva, J J; Lins, O G; Melges, D B; Tierra-Criollo, C J

    2016-09-01

    This study aimed to investigate the efficiency of Magnitude Squared Coherence (MSC) and the Spectral F-Test (SFT) for the detection of auditory steady-state responses (ASSR) evoked by amplitude-modulated noises. Twenty individuals (12 women) without any history of neurological or audiological diseases, aged from 18 to 59 years (mean ± standard deviation = 26.45 ± 3.9 years), who provided written informed consent, participated in the study. The Audiostim system was used for stimulation and ASSR recording. The tested stimuli were amplitude-modulated wide-band noise (WBN), low-band noise (LBN), high-band noise (HBN), and two-band noise (TBN), modulated between 77 and 110 Hz and applied at intensity levels of 55, 45, and 25 dB sound pressure level (SPL). MSC and SFT, two statistically based detection techniques, were applied with a significance level of 5%. Detection times and rates were compared using the Friedman test with Tukey-Kramer post hoc analysis. Based on the stimulation parameters (stimulus types and intensity levels) and detection techniques (MSC or SFT), 16 different pass/fail protocols were also defined, for which the true negatives (TN) were calculated. The median detection times ranged from 68 to 157 s for 55 dB SPL, 68 to 99 s for 45 dB SPL, and 84 to 118 s for 25 dB SPL. No statistical difference was found between MSC and SFT in median detection times (p > 0.05). The detection rates ranged from 100% to 55.6% at 55 dB SPL, 97.2% to 38.9% at 45 dB SPL, and 66.7% to 8.3% at 25 dB SPL. For detection rates as well, no statistical difference was observed between MSC and SFT (p > 0.05). True negatives above 90% were found for protocols that employed WBN or HBN at 55 dB SPL or at 45 dB SPL. For protocols employing TBN at 55 dB SPL or 45 dB SPL, TN below 60% were found, owing to the low detection rates of stimuli that included low-band frequencies. Stimuli that include high-frequency content showed higher detection rates (>90%) and lower detection times (<3 min). The noise composed of two bands applied separately (TBN) is not feasible for clinical applications, since it prolongs the exam duration and also led to a reduced percentage of true negatives. On the other hand, WBN and HBN achieved high detection performance and high TN and should be investigated for implementing a pass/fail protocol for hearing screening with a clinical population. Finally, both WBN and HBN seemed to be indifferent to the employed technique (SFT or MSC), which can be seen as another advantage for ASSR employment. Copyright © 2016 Elsevier B.V. All rights reserved.
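
    A sketch of MSC-based detection at the modulation frequency, using the standard critical value 1 - alpha^(1/(M-1)) for M epochs; the epoch data and sampling parameters are simulated assumptions, not the Audiostim recordings.

    ```python
    import numpy as np

    fs, f_mod, M, n = 1000, 82.0, 48, 1000       # Hz, modulation freq, epochs, samples/epoch
    rng = np.random.default_rng(2)
    t = np.arange(n) / fs
    epochs = 0.2 * np.sin(2 * np.pi * f_mod * t) + rng.normal(0, 1, (M, n))

    k = int(round(f_mod * n / fs))               # FFT bin of the modulation frequency
    Y = np.fft.rfft(epochs, axis=1)[:, k]        # complex spectral value per epoch
    msc = np.abs(Y.mean()) ** 2 / np.mean(np.abs(Y) ** 2)

    alpha = 0.05
    msc_crit = 1 - alpha ** (1 / (M - 1))        # critical value under H0
    print(f"MSC={msc:.3f}  crit={msc_crit:.3f}  detected={msc > msc_crit}")
    ```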

  7. Teaching statistics in biology: using inquiry-based learning to strengthen understanding of statistical analysis in biology laboratory courses.

    PubMed

    Metz, Anneke M

    2008-01-01

    There is an increasing need for students in the biological sciences to build a strong foundation in quantitative approaches to data analysis. Although most science, engineering, and math majors are required to take at least one statistics course, statistical analysis is poorly integrated into undergraduate biology course work, particularly at the lower-division level. Elements of statistics were incorporated into an introductory biology course, including a review of statistics concepts and an opportunity for students to perform statistical analysis in a biological context. Learning gains were measured with an 11-item statistics learning survey instrument developed for the course. Students showed a statistically significant 25% increase in statistics knowledge after completing introductory biology (p < 0.005). Students improved their scores on the survey after completing introductory biology even if they had previously completed an introductory statistics course (9% improvement, p < 0.005). Students retested 1 yr after completing introductory biology showed no loss of their statistics knowledge as measured by this instrument, suggesting that the use of statistics in biology course work may aid long-term retention of statistics knowledge. No statistically significant differences in learning were detected between male and female students in the study.

  8. 2D versus 3D in the kinematic analysis of the horse at the trot.

    PubMed

    Miró, F; Santos, R; Garrido-Castro, J L; Galisteo, A M; Medina-Carnicer, R

    2009-08-01

    The handled trot of three Lusitano Purebred stallions was analyzed using 2D and 3D kinematic analysis methods. Using the same capture and analysis system, 2D and 3D data were obtained for several linear (stride length, maximal height of the hoof trajectories) and angular (angular range of motion, inclination of bone segments) variables. A paired Student t-test was performed to detect statistically significant differences between the data resulting from the two methodologies. With respect to the angular variables, there were significant differences in scapula inclination, shoulder angle, cannon inclination and protraction-retraction angle in the forelimb, but none were statistically different in the hind limb. Differences between the two methods were found in most of the linear variables analyzed.

  9. Reliability of the Watch-PAT 200 in detecting sleep apnea in highway bus drivers.

    PubMed

    Yuceege, Melike; Firat, Hikmet; Demir, Ahmet; Ardic, Sadik

    2013-04-15

    To assess the validity of the Watch-PAT (WP) device for sleep disordered breathing (SDB) among highway bus drivers. A total of 90 highway bus drivers underwent polysomnography (PSG) and a Watch-PAT test simultaneously. Routine blood tests and routine ear-nose-throat (ENT) examinations were performed as well. The sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV) were 89.1%, 76.9%, 82% and 85.7% for RDI > 15, respectively. WRDI, WODI, W < 90% duration and Wmean SaO2 results were well correlated with the PSG results. In the sensitivity and specificity analysis, when the diagnosis of sleep apnea was defined for different cut-off values of RDI of 5, 10 and 15, the AUC (95% CI) were found to be 0.84 (0.74-0.93), 0.87 (0.79-0.94) and 0.91 (0.85-0.97), respectively. There were no statistically significant differences between Stage 1+2/Wlight and Stage REM/WREM. The percentage of Stage 3 sleep differed statistically significantly from the percentage of Wdeep. Total sleep times in PSG and WP showed no statistically significant difference, nor did total NREM duration and total WNREM duration. The Watch-PAT device is helpful in detecting SDB with RDI > 15 in highway bus drivers, especially in drivers older than 45 years, but has limited value in drivers younger than 45 years, who have less risk for OSA. Therefore, WP can be used in the former group when PSG is not easily available.
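
    The reported accuracy figures come from a standard 2x2 confusion matrix; a minimal sketch with placeholder counts, not the study's data:

    ```python
    def diagnostic_metrics(tp, fp, fn, tn):
        """Sensitivity, specificity, PPV and NPV from confusion-matrix counts."""
        return {
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "PPV": tp / (tp + fp),
            "NPV": tn / (tn + fn),
        }

    print(diagnostic_metrics(tp=41, fp=9, fn=5, tn=30))  # illustrative counts
    ```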

  10. Advanced imaging technologies increase detection of dysplasia and neoplasia in patients with Barrett's esophagus: a meta-analysis and systematic review.

    PubMed

    Qumseya, Bashar J; Wang, Haibo; Badie, Nicole; Uzomba, Rosemary N; Parasa, Sravanthi; White, Donna L; Wolfsen, Herbert; Sharma, Prateek; Wallace, Michael B

    2013-12-01

    US guidelines recommend surveillance of patients with Barrett's esophagus (BE) to detect dysplasia. BE conventionally is monitored via white-light endoscopy (WLE) and a collection of random biopsy specimens. However, this approach does not definitively or consistently detect areas of dysplasia. Advanced imaging technologies can increase the detection of dysplasia and cancer. We investigated whether these imaging technologies can increase the diagnostic yield for the detection of neoplasia in patients with BE, compared with WLE and analysis of random biopsy specimens. We performed a systematic review, using Medline and Embase, to identify relevant peer-reviewed studies. Fourteen studies were included in the final analysis, with a total of 843 patients. Our metameter (estimate) of interest was the paired risk difference (RD), defined as the difference in yield of the detection of dysplasia or cancer using advanced imaging vs WLE. The estimated paired RD and 95% confidence interval (CI) were obtained using random-effects models. Heterogeneity was assessed by means of the Q statistic and the I² statistic. An exploratory meta-regression was performed to look for associations between the metameter and potential confounders or modifiers. Overall, advanced imaging techniques increased the diagnostic yield for detection of dysplasia or cancer by 34% (95% CI, 20%-56%; P < .0001). A subgroup analysis showed that virtual chromoendoscopy significantly increased the diagnostic yield (RD, 0.34; 95% CI, 0.14-0.56; P < .0001). The RD for chromoendoscopy was 0.35 (95% CI, 0.13-0.56; P = .0001). There was no significant difference between virtual chromoendoscopy and chromoendoscopy, based on a Student t-test (P = .45). Based on this meta-analysis, advanced imaging techniques such as chromoendoscopy or virtual chromoendoscopy significantly increase the diagnostic yield for identification of dysplasia or cancer in patients with BE. Copyright © 2013 AGA Institute. Published by Elsevier Inc. All rights reserved.
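
    The random-effects pooling and heterogeneity statistics mentioned above can be sketched with the DerSimonian-Laird estimator; the per-study risk differences and variances here are invented placeholders, not the review's data.

    ```python
    import numpy as np

    rd = np.array([0.30, 0.42, 0.25, 0.38])       # per-study risk differences (placeholders)
    var = np.array([0.010, 0.015, 0.008, 0.020])  # per-study variances (placeholders)

    w = 1 / var                                   # fixed-effect weights
    mu_fe = (w * rd).sum() / w.sum()
    Q = (w * (rd - mu_fe) ** 2).sum()             # Cochran's Q
    df = len(rd) - 1
    I2 = max(0.0, (Q - df) / Q)                   # I^2 heterogeneity fraction
    tau2 = max(0.0, (Q - df) / (w.sum() - (w ** 2).sum() / w.sum()))  # DL tau^2

    w_re = 1 / (var + tau2)                       # random-effects weights
    mu_re = (w_re * rd).sum() / w_re.sum()
    se = np.sqrt(1 / w_re.sum())
    print(f"pooled RD={mu_re:.2f} "
          f"(95% CI {mu_re - 1.96 * se:.2f} to {mu_re + 1.96 * se:.2f}), I2={I2:.0%}")
    ```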

  11. Resource-constrained Data Collection and Fusion for Identifying Weak Distributed Patterns in Networks

    DTIC Science & Technology

    2013-10-15


  12. Parameters optimization defined by statistical analysis for cysteine-dextran radiolabeling with technetium tricarbonyl core.

    PubMed

    Núñez, Eutimio Gustavo Fernández; Faintuch, Bluma Linkowski; Teodoro, Rodrigo; Wiecek, Danielle Pereira; da Silva, Natanael Gomes; Papadopoulos, Minas; Pelecanou, Maria; Pirmettis, Ioannis; de Oliveira Filho, Renato Santos; Duatti, Adriano; Pasqualini, Roberto

    2011-04-01

    The objective of this study was the development of a statistical approach for optimizing the radiolabeling of cysteine-dextran conjugates with the Tc-99m tricarbonyl core. This strategy was applied to the labeling of 2-propylene-S-cysteine-dextran in an attempt to prepare a new class of tracers for sentinel lymph node detection, and it can be extended to other radiopharmaceuticals for different targets. The statistical routine was based on a three-level factorial design. The best labeling conditions were thereby identified, and the specific activity reached 5 MBq/μg. Crown Copyright © 2011. Published by Elsevier Ltd. All rights reserved.
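
    A three-level full factorial design of the kind mentioned can be generated directly; the factors and levels below are invented for illustration, not the study's actual labeling variables.

    ```python
    from itertools import product

    factors = {
        "ligand_ug": [5, 15, 45],     # conjugate amount (hypothetical levels)
        "pH":        [6.0, 7.4, 8.8],
        "time_min":  [10, 30, 60],
    }
    design = [dict(zip(factors, combo)) for combo in product(*factors.values())]
    print(len(design), "runs")        # 3^3 = 27 experimental runs
    for run in design[:3]:
        print(run)
    # The labeling yield measured at each run would then be fit with a
    # quadratic response-surface model to locate the optimum.
    ```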

  13. Optimal Weights Mixed Filter for removing mixture of Gaussian and impulse noises

    PubMed Central

    Grama, Ion; Liu, Quansheng

    2017-01-01

    In this paper we consider the problem of restoring an image contaminated by a mixture of Gaussian and impulse noises. We propose a new statistic called ROADGI, which improves the well-known Rank-Ordered Absolute Differences (ROAD) statistic for detecting points contaminated with impulse noise in this context. Combining the ROADGI statistic with the method of weights optimization, we obtain a new algorithm called the Optimal Weights Mixed Filter (OWMF) to deal with the mixed noise. Our simulation results show that the proposed filter is effective for mixed noises, as well as for single impulse noise and for single Gaussian noise. PMID:28692667

  14. Optimal Weights Mixed Filter for removing mixture of Gaussian and impulse noises.

    PubMed

    Jin, Qiyu; Grama, Ion; Liu, Quansheng

    2017-01-01

    In this paper we consider the problem of restoring an image contaminated by a mixture of Gaussian and impulse noises. We propose a new statistic called ROADGI, which improves the well-known Rank-Ordered Absolute Differences (ROAD) statistic for detecting points contaminated with impulse noise in this context. Combining the ROADGI statistic with the method of weights optimization, we obtain a new algorithm called the Optimal Weights Mixed Filter (OWMF) to deal with the mixed noise. Our simulation results show that the proposed filter is effective for mixed noises, as well as for single impulse noise and for single Gaussian noise.
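
    For context, the baseline ROAD statistic that ROADGI improves upon can be sketched directly: for each pixel, sum the m smallest absolute differences to its eight neighbours, so impulse-corrupted pixels score high. Periodic boundaries via np.roll keep the sketch short.

    ```python
    import numpy as np

    def road(img, m=4):
        """ROAD map of a 2D grayscale image (periodic boundaries for brevity)."""
        img = img.astype(float)
        diffs = []
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                if dy == dx == 0:
                    continue
                shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
                diffs.append(np.abs(img - shifted))
        diffs = np.sort(np.stack(diffs), axis=0)   # sort the 8 neighbour differences
        return diffs[:m].sum(axis=0)               # sum of the m smallest

    rng = np.random.default_rng(9)
    clean = np.tile(np.linspace(0, 255, 64), (64, 1))
    noisy = clean.copy()
    mask = rng.random(clean.shape) < 0.05          # 5% salt-and-pepper impulses
    noisy[mask] = rng.choice([0, 255], mask.sum())
    print(road(noisy)[mask].mean(), road(noisy)[~mask].mean())  # impulses score higher
    ```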

  15. Evaluation of hemifield sector analysis protocol in multifocal visual evoked potential objective perimetry for the diagnosis and early detection of glaucomatous field defects.

    PubMed

    Mousa, Mohammad F; Cubbidge, Robert P; Al-Mansouri, Fatima; Bener, Abdulbari

    2014-02-01

    Multifocal visual evoked potential (mfVEP) is a newly introduced method for objective visual field assessment. Several analysis protocols have been tested to identify early visual field losses in glaucoma patients using the mfVEP technique; some were successful in detecting field defects comparable to standard automated perimetry (SAP) visual field assessment, while others were not very informative and needed further adjustment and research. In this study we implemented a novel analysis approach and evaluated its validity and whether it could be used effectively for early detection of visual field defects in glaucoma. Three groups were tested in this study: normal controls (38 eyes), glaucoma patients (36 eyes) and glaucoma suspect patients (38 eyes). All subjects had two standard Humphrey field analyzer (HFA) 24-2 tests and a single mfVEP test undertaken in one session. Analysis of the mfVEP results was done using the new analysis protocol: the hemifield sector analysis (HSA) protocol. Analysis of the HFA was done using the standard grading system. Analysis of mfVEP results showed that there was a statistically significant difference between the three groups in the mean signal-to-noise ratio (ANOVA test, p < 0.001 with a 95% confidence interval). The differences between superior and inferior hemispheres were statistically significant in the glaucoma patient group in all 11 sectors (t-test, p < 0.001), partially significant in the glaucoma suspect group in 5 of 11 sectors (t-test, p < 0.01), and showed no statistical difference in most sectors of the normal group (1 of 11 sectors was significant, t-test, p < 0.9). Sensitivity and specificity of the HSA protocol in detecting glaucoma were 97% and 86%, respectively, and for glaucoma suspect patients the values were 89% and 79%, respectively. The new HSA protocol used in mfVEP testing can be applied to detect glaucomatous visual field defects in both glaucoma and glaucoma suspect patients. Using this protocol can provide information about focal visual field differences across the horizontal midline, which can be utilized to differentiate between glaucoma and normal subjects. Sensitivity and specificity of the mfVEP test showed very promising results and correlated with other anatomical changes in glaucoma field loss.

  16. Evaluation of Hemifield Sector Analysis Protocol in Multifocal Visual Evoked Potential Objective Perimetry for the Diagnosis and Early Detection of Glaucomatous Field Defects

    PubMed Central

    Mousa, Mohammad F.; Cubbidge, Robert P.; Al-Mansouri, Fatima

    2014-01-01

    Purpose Multifocal visual evoked potential (mfVEP) is a newly introduced method for objective visual field assessment. Several analysis protocols have been tested to identify early visual field losses in glaucoma patients using the mfVEP technique; some were successful in detecting field defects comparable to standard automated perimetry (SAP) visual field assessment, while others were not very informative and needed further adjustment and research. In this study we implemented a novel analysis approach and evaluated its validity and whether it could be used effectively for early detection of visual field defects in glaucoma. Methods Three groups were tested in this study: normal controls (38 eyes), glaucoma patients (36 eyes) and glaucoma suspect patients (38 eyes). All subjects had two standard Humphrey field analyzer (HFA) 24-2 tests and a single mfVEP test undertaken in one session. Analysis of the mfVEP results was done using the new analysis protocol: the hemifield sector analysis (HSA) protocol. Analysis of the HFA was done using the standard grading system. Results Analysis of mfVEP results showed that there was a statistically significant difference between the three groups in the mean signal-to-noise ratio (ANOVA test, p < 0.001 with a 95% confidence interval). The differences between superior and inferior hemispheres were statistically significant in the glaucoma patient group in all 11 sectors (t-test, p < 0.001), partially significant in the glaucoma suspect group in 5 of 11 sectors (t-test, p < 0.01), and showed no statistical difference in most sectors of the normal group (1 of 11 sectors was significant, t-test, p < 0.9). Sensitivity and specificity of the HSA protocol in detecting glaucoma were 97% and 86%, respectively, and for glaucoma suspect patients the values were 89% and 79%, respectively. Conclusions The new HSA protocol used in mfVEP testing can be applied to detect glaucomatous visual field defects in both glaucoma and glaucoma suspect patients. Using this protocol can provide information about focal visual field differences across the horizontal midline, which can be utilized to differentiate between glaucoma and normal subjects. Sensitivity and specificity of the mfVEP test showed very promising results and correlated with other anatomical changes in glaucoma field loss. PMID:24511212

  17. A self-adaptive algorithm for traffic sign detection in motion image based on color and shape features

    NASA Astrophysics Data System (ADS)

    Zhang, Ka; Sheng, Yehua; Gong, Zhijun; Ye, Chun; Li, Yongqiang; Liang, Cheng

    2007-06-01

    As an important sub-system of intelligent transportation systems (ITS), the detection and recognition of traffic signs from mobile images has become one of the hot spots in international ITS research. To address the problem of automatic traffic sign detection in motion images, a new self-adaptive algorithm for traffic sign detection based on color and shape features is proposed in this paper. Firstly, global statistical color features of different images are computed based on statistics theory. Secondly, self-adaptive thresholds and special segmentation rules for image segmentation are designed according to these global color features. Then, for red, yellow, and blue traffic signs, the color image is segmented into three binary images using these thresholds and rules. Thirdly, if the number of white pixels in a segmented binary image exceeds the filtering threshold, the binary image is further filtered. Fourthly, gray-value projection is used to confirm the top, bottom, left, and right boundaries of candidate traffic sign regions in the segmented binary image. Lastly, if the shape features of a candidate region match those of a real traffic sign, the region is confirmed as a detected traffic sign. The new algorithm was applied to motion images of natural scenes taken by a CCD camera of the mobile photogrammetry system in Nanjing at different times. The experimental results show that the algorithm is simple, robust, and well adapted to natural scene images, and that it detects real traffic signs reliably and quickly.
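
    The core of the self-adaptive thresholding step can be illustrated with a short sketch; the redness measure and the factor k below are assumptions for illustration, not the paper's exact segmentation rules.

```python
import numpy as np

def segment_red_signs(img, k=1.5):
    """Binary mask of candidate red-sign pixels via a self-adaptive threshold.

    img: float array (H, W, 3), RGB in [0, 1].
    k:   hypothetical sensitivity factor on the global color statistics.
    """
    redness = img[..., 0] - 0.5 * (img[..., 1] + img[..., 2])
    thresh = redness.mean() + k * redness.std()   # derived from image statistics
    return redness > thresh

# Toy usage: a gray scene with one red patch.
img = np.full((100, 100, 3), 0.4)
img[40:60, 40:60] = [0.8, 0.1, 0.1]
mask = segment_red_signs(img)
print("candidate pixels:", int(mask.sum()))      # roughly the 20x20 patch
# Candidate regions would then be filtered by pixel count and checked
# against shape constraints (bounding boxes from gray-value projections).
```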

  18. Improved Statistical Fault Detection Technique and Application to Biological Phenomena Modeled by S-Systems.

    PubMed

    Mansouri, Majdi; Nounou, Mohamed N; Nounou, Hazem N

    2017-09-01

    In our previous work, we demonstrated the effectiveness of the linear multiscale principal component analysis (PCA)-based moving window (MW)-generalized likelihood ratio test (GLRT) technique over the classical PCA and multiscale principal component analysis (MSPCA)-based GLRT methods. The developed fault detection algorithm provided optimal properties by maximizing the detection probability for a particular false alarm rate (FAR) across different window sizes. However, most real systems are nonlinear, and the linear PCA method cannot adequately handle this nonlinearity. Thus, in this paper, first, we apply a nonlinear PCA to obtain an accurate principal component of a set of data and handle a wide range of nonlinearities using the kernel principal component analysis (KPCA) model, which is among the most popular nonlinear statistical methods. Second, we extend the MW-GLRT technique to one that applies exponential weights to the residuals in the moving window (instead of equal weighting), since an exponentially weighted moving average (EWMA) can further improve fault detection performance by reducing the FAR. The developed detection method, called EWMA-GLRT, provides improved properties, such as smaller missed detection rates, smaller FARs, and a shorter average run length. The idea behind the developed EWMA-GLRT is to compute a new GLRT statistic that integrates current and previous data information in a decreasing exponential fashion, giving more weight to the more recent data. This provides a more accurate estimate of the GLRT statistic and a stronger memory that enables better decision making with respect to fault detection. Therefore, in this paper, a KPCA-based EWMA-GLRT method is developed and applied to improve fault detection in biological phenomena modeled by S-systems and to enhance monitoring of the process mean. The idea behind the KPCA-based EWMA-GLRT fault detection algorithm is to combine the advantages of the proposed EWMA-GLRT fault detection chart with those of the KPCA model. It is used to enhance fault detection in the Cad System in E. coli model by monitoring some of the key variables involved in this model, such as enzymes, transport proteins, regulatory proteins, lysine, and cadaverine. The results demonstrate the effectiveness of the proposed KPCA-based EWMA-GLRT method over the Q, GLRT, EWMA, Shewhart, and moving window-GLRT methods. The detection performance is assessed and evaluated in terms of FAR, missed detection rates, and average run length (ARL1) values.
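
    A minimal sketch of the exponential weighting idea behind EWMA-GLRT, assuming the residual sequence (e.g., from a KPCA model) is already available; the smoothing constant, injected fault, and control limit below are illustrative.

```python
import numpy as np

def ewma_statistic(residuals, lam=0.2):
    """Return the EWMA sequence z_t = lam * r_t + (1 - lam) * z_{t-1}."""
    z = np.zeros_like(residuals, dtype=float)
    z[0] = residuals[0]
    for t in range(1, len(residuals)):
        z[t] = lam * residuals[t] + (1 - lam) * z[t - 1]
    return z

rng = np.random.default_rng(1)
r = rng.normal(0, 1, 500)
r[300:] += 2.0                      # injected mean shift (a "fault")
z = ewma_statistic(r ** 2)          # monitor squared residuals
limit = np.mean(z[:200]) + 3 * np.std(z[:200])   # limit from fault-free data
print("first alarm at sample:", int(np.argmax(z > limit)))
```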

  19. Detection of Nonverbal Synchronization through Phase Difference in Human Communication

    PubMed Central

    Kwon, Jinhwan; Ogawa, Ken-ichiro; Ono, Eisuke; Miyake, Yoshihiro

    2015-01-01

    Nonverbal communication is an important factor in human communication, and body movement synchronization in particular is an important part of nonverbal communication. Some researchers have analyzed body movement synchronization by focusing on changes in the amplitude of body movements. However, the definition of “body movement synchronization” is still unclear. From a theoretical viewpoint, phase difference is the most important factor in synchronization analysis. Therefore, there is a need to measure the synchronization of body movements using phase difference. The purpose of this study was to provide a quantitative definition of the phase difference distribution for detecting body movement synchronization in human communication. The phase difference distribution was characterized using four statistical measurements: density, mean phase difference, standard deviation (SD) and kurtosis. To confirm the effectiveness of our definition, we applied it to human communication in which the roles of speaker and listener were defined. Specifically, we examined the difference in the phase difference distribution between two different communication situations: face-to-face communication with visual interaction and remote communication with unidirectional visual perception. Participant pairs performed a lecture-style task in the face-to-face communication condition and in the remote communication condition via television. Throughout the lecture task, we extracted a set of phase differences from the time-series data of the acceleration norm of head nodding motions between two participants. Statistical analyses of the phase difference distribution revealed the characteristics of head nodding synchronization. Although the mean phase differences in synchronized head nods did not differ significantly between the conditions, there were significant differences in the densities, the SDs and the kurtoses of the phase difference distributions of synchronized head nods. These results show the difference in nonverbal synchronization between different communication types. Our study indicates that the phase difference distribution is useful in detecting nonverbal synchronization in various human communication situations. PMID:26208100

  20. Detection of Nonverbal Synchronization through Phase Difference in Human Communication.

    PubMed

    Kwon, Jinhwan; Ogawa, Ken-ichiro; Ono, Eisuke; Miyake, Yoshihiro

    2015-01-01

    Nonverbal communication is an important factor in human communication, and body movement synchronization in particular is an important part of nonverbal communication. Some researchers have analyzed body movement synchronization by focusing on changes in the amplitude of body movements. However, the definition of "body movement synchronization" is still unclear. From a theoretical viewpoint, phase difference is the most important factor in synchronization analysis. Therefore, there is a need to measure the synchronization of body movements using phase difference. The purpose of this study was to provide a quantitative definition of the phase difference distribution for detecting body movement synchronization in human communication. The phase difference distribution was characterized using four statistical measurements: density, mean phase difference, standard deviation (SD) and kurtosis. To confirm the effectiveness of our definition, we applied it to human communication in which the roles of speaker and listener were defined. Specifically, we examined the difference in the phase difference distribution between two different communication situations: face-to-face communication with visual interaction and remote communication with unidirectional visual perception. Participant pairs performed a lecture-style task in the face-to-face communication condition and in the remote communication condition via television. Throughout the lecture task, we extracted a set of phase differences from the time-series data of the acceleration norm of head nodding motions between two participants. Statistical analyses of the phase difference distribution revealed the characteristics of head nodding synchronization. Although the mean phase differences in synchronized head nods did not differ significantly between the conditions, there were significant differences in the densities, the SDs and the kurtoses of the phase difference distributions of synchronized head nods. These results show the difference in nonverbal synchronization between different communication types. Our study indicates that the phase difference distribution is useful in detecting nonverbal synchronization in various human communication situations.
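
    The four summary measurements of the phase difference distribution can be computed along these lines; the von Mises sample and the "near zero" window used for the density are assumptions for illustration, not the study's definitions.

```python
import numpy as np
from scipy.stats import kurtosis

# Hypothetical phase differences (radians) between two head-motion series.
phase_diff = np.random.default_rng(2).vonmises(mu=0.0, kappa=4.0, size=1000)

density = np.mean(np.abs(phase_diff) < np.pi / 6)        # share near zero lag (assumed window)
mean_phase = np.angle(np.mean(np.exp(1j * phase_diff)))  # circular mean
sd = np.std(phase_diff)
kurt = kurtosis(phase_diff)                              # excess kurtosis
print(f"density = {density:.2f}, mean = {mean_phase:.3f} rad, "
      f"SD = {sd:.3f}, kurtosis = {kurt:.2f}")
```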

  1. Comparison between nasopharyngeal swab and nasal wash, using culture and PCR, in the detection of potential respiratory pathogens.

    PubMed

    Gritzfeld, Jenna F; Roberts, Paul; Roche, Lorna; El Batrawy, Sherouk; Gordon, Stephen B

    2011-04-13

    Nasopharyngeal carriage of potential pathogens is important as it is both the major source of transmission and the prerequisite of invasive disease. New methods for detecting carriage could improve comfort, accuracy and laboratory utility. The aims of this study were to compare the sensitivities of a nasopharyngeal swab (NPS) and a nasal wash (NW) in detecting potential respiratory pathogens in healthy adults using microbiological culture and PCR. Healthy volunteers attended for nasal washing and brushing of the posterior nasopharynx. Conventional and real-time PCR were used to detect pneumococcus and meningococcus. Statistical differences between the two nasal sampling methods were determined using a nonparametric Mann-Whitney U test; differences between culture and PCR methods were determined using the McNemar test. Nasal washing was more comfortable for volunteers than swabbing (n = 24). In detection by culture, the NW was significantly more likely to detect pathogens than the NPS (p < 0.00001). Overall, there was a low carriage rate of pathogens in this sample; no significant difference was seen in the detection of bacteria between culture and PCR methods. Nasal washing and PCR may provide effective alternatives to nasopharyngeal swabbing and classical microbiology, respectively.
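
    A minimal sketch of the two tests named above, using scipy and statsmodels on hypothetical colony counts and paired detection outcomes (the counts are illustrative, not the study's data):

```python
import numpy as np
from scipy.stats import mannwhitneyu
from statsmodels.stats.contingency_tables import mcnemar

rng = np.random.default_rng(3)
nw_counts = rng.poisson(5, 24)       # nasal wash, n = 24 volunteers
nps_counts = rng.poisson(2, 24)      # nasopharyngeal swab

u, p = mannwhitneyu(nw_counts, nps_counts, alternative="two-sided")
print(f"Mann-Whitney U = {u}, p = {p:.4f}")

# Paired 2x2 table: rows = culture +/-, columns = PCR +/- (illustrative).
table = np.array([[10, 2],
                  [5, 7]])
res = mcnemar(table, exact=True)
print(f"McNemar p = {res.pvalue:.4f}")
```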

  2. DETECTING BENTHIC COMMUNITY DIFFERENCES: INFLUENCE OF STATISTICAL INDEX AND SEASON

    EPA Science Inventory

    An accurate assessment of estuarine condition is critical to determining whether there has been a change from baseline or 'natural' conditions; benthic communities are routinely used as an ecological endpoint to make this assessment. We addressed two issues which arise when attem...

  3. Detecting Latent Heterogeneity

    ERIC Educational Resources Information Center

    Pearl, Judea

    2017-01-01

    We address the task of determining, from statistical averages alone, whether a population under study consists of several subpopulations, unknown to the investigator, each responding to a given treatment markedly differently. We show that such determination is feasible in three cases: (1) randomized trials with binary treatments, (2) models where…

  4. Towards Enhanced Underwater Lidar Detection via Source Separation

    NASA Astrophysics Data System (ADS)

    Illig, David W.

    Interest in underwater optical sensors has grown as technologies enabling autonomous underwater vehicles have been developed. Propagation of light through water is complicated by the dual challenges of absorption and scattering. While absorption can be reduced by operating in the blue-green region of the visible spectrum, reducing scattering is a more significant challenge. Collection of scattered light negatively impacts underwater optical ranging, imaging, and communications applications. This thesis concentrates on the ranging application, where scattering reduces operating range as well as range accuracy. The focus of this thesis is on the problem of backscatter, which can create a "clutter" return that may obscure submerged target(s) of interest. The main contributions of this thesis are explorations of signal processing approaches to increase the separation between the target and backscatter returns. Increasing this separation allows detection of weak targets in the presence of strong scatter, increasing both operating range and range accuracy. Simulation and experimental results will be presented for a variety of approaches as functions of water clarity and target position. This work provides several novel contributions to the underwater lidar field: 1. Quantification of temporal separation approaches: While temporal separation has been studied extensively, this work provides a quantitative assessment of the extent to which both high frequency modulation and spatial filter approaches improve the separation between target and backscatter. 2. Development and assessment of frequency separation: This work includes the first frequency-based separation approach for underwater lidar, in which the channel frequency response is measured with a wideband waveform. Transforming to the time-domain gives a channel impulse response, in which target and backscatter returns may appear in unique range bins and thus be separated. 3. Development and assessment of statistical separation: The first investigations of statistical separation approaches for underwater lidar are presented. By demonstrating that target and backscatter returns have different statistical properties, a new separation axis is opened. This work investigates and quantifies performance of three statistical separation approaches. 4. Application of detection theory to underwater lidar: While many similar applications use detection theory to assess performance, less development has occurred in the underwater lidar field. This work applies these concepts to statistical separation approaches, providing another perspective in which to assess performance. In addition, by using detection theory approaches, statistical metrics can be used to associate a level of confidence in each ranging measurement. 5. Preliminary investigation of forward scatter suppression: If backscatter is sufficiently suppressed, forward scattering becomes a performance-limiting factor. This work presents a proof-of-concept demonstration of the potential for statistical separation approaches to suppress both forward and backward scatter. These results provide a demonstration of the capability that signal processing has to improve separation between target and backscatter. Separation capability improves in the transition from temporal to frequency to statistical separation approaches, with the statistical separation approaches improving target detection sensitivity by as much as 30 dB. 
Ranging and detection results demonstrate the enhanced performance this would allow in ranging applications. This increased performance is an important step in moving underwater lidar capability towards the requirements of the next generation of sensors.

  5. Fusion of Local Statistical Parameters for Buried Underwater Mine Detection in Sonar Imaging

    NASA Astrophysics Data System (ADS)

    Maussang, F.; Rombaut, M.; Chanussot, J.; Hétet, A.; Amate, M.

    2008-12-01

    Detection of buried underwater objects, and especially mines, is a crucial strategic task. Images provided by sonar systems able to penetrate the sea floor, such as synthetic aperture sonars (SASs), are of great interest for the detection and classification of such objects. However, the signal-to-noise ratio is fairly low, and advanced information processing is required for correct and reliable detection of the echoes generated by the objects. The detection method proposed in this paper is based on a data-fusion architecture using belief theory. The input data of this architecture are local statistical characteristics extracted from SAS data, corresponding to the first-, second-, third-, and fourth-order statistical properties of the sonar images, respectively. The relevance of these parameters is derived from a statistical model of the sonar data. Numerical criteria are also proposed to estimate the detection performance and to validate the method.
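
    The local statistical characteristics that feed such a fusion architecture can be sketched as sliding-window moments; the simulated speckle image and window size below are assumptions, and the belief-theory fusion stage itself is not implemented here.

```python
import numpy as np
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(10)
img = rng.rayleigh(1.0, (128, 128))        # speckle-like sonar background
img[60:68, 60:68] += 3.0                   # bright "echo" patch

w = 8                                      # assumed window size
nb = 128 // w
feats = np.zeros((4, nb, nb))              # mean, variance, skewness, kurtosis
for bi in range(nb):
    for bj in range(nb):
        patch = img[bi*w:(bi+1)*w, bj*w:(bj+1)*w].ravel()
        feats[:, bi, bj] = (patch.mean(), patch.var(), skew(patch), kurtosis(patch))

# Each feature map would feed the belief-theory fusion stage rather than
# being thresholded on its own.
print("block with maximal local mean:",
      np.unravel_index(feats[0].argmax(), feats[0].shape))
```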

  6. A Fast Framework for Abrupt Change Detection Based on Binary Search Trees and Kolmogorov Statistic

    PubMed Central

    Qi, Jin-Peng; Qi, Jie; Zhang, Qing

    2016-01-01

    Change-point (CP) detection has attracted considerable attention in the fields of data mining and statistics, and there is great value in methods that quickly and efficiently detect abrupt changes in large-scale bioelectric signals. Currently, most existing methods, such as the Kolmogorov-Smirnov (KS) statistic, are time-consuming, especially for large-scale datasets. In this paper, we propose a fast framework for abrupt change detection based on binary search trees (BSTs) and a modified KS statistic, named BSTKS (binary search trees and Kolmogorov statistic). In this method, first, two binary search trees, termed BSTcA and BSTcD, are constructed by multilevel Haar wavelet transform (HWT); second, three search criteria are introduced in terms of the statistic and variance fluctuations in the diagnosed time series; finally, an optimal search path is detected from the root to the leaf nodes of the two BSTs. Studies on both synthetic time series samples and real electroencephalograph (EEG) recordings indicate that the proposed BSTKS can detect abrupt change more quickly and efficiently than the KS, t-statistic (t), and singular-spectrum analysis (SSA) methods, with the shortest computation time, the highest hit rate, the smallest error, and the highest accuracy of the four methods. This study suggests that the proposed BSTKS is very helpful for inspecting all kinds of bioelectric time series signals for useful information. PMID:27413364

  7. A Fast Framework for Abrupt Change Detection Based on Binary Search Trees and Kolmogorov Statistic.

    PubMed

    Qi, Jin-Peng; Qi, Jie; Zhang, Qing

    2016-01-01

    Change-point (CP) detection has attracted considerable attention in the fields of data mining and statistics, and there is great value in methods that quickly and efficiently detect abrupt changes in large-scale bioelectric signals. Currently, most existing methods, such as the Kolmogorov-Smirnov (KS) statistic, are time-consuming, especially for large-scale datasets. In this paper, we propose a fast framework for abrupt change detection based on binary search trees (BSTs) and a modified KS statistic, named BSTKS (binary search trees and Kolmogorov statistic). In this method, first, two binary search trees, termed BSTcA and BSTcD, are constructed by multilevel Haar wavelet transform (HWT); second, three search criteria are introduced in terms of the statistic and variance fluctuations in the diagnosed time series; finally, an optimal search path is detected from the root to the leaf nodes of the two BSTs. Studies on both synthetic time series samples and real electroencephalograph (EEG) recordings indicate that the proposed BSTKS can detect abrupt change more quickly and efficiently than the KS, t-statistic (t), and singular-spectrum analysis (SSA) methods, with the shortest computation time, the highest hit rate, the smallest error, and the highest accuracy of the four methods. This study suggests that the proposed BSTKS is very helpful for inspecting all kinds of bioelectric time series signals for useful information.
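
    A simplified sketch of KS-based change-point detection follows: it scans candidate split points and keeps the one maximizing the two-sample KS statistic. This is the brute-force baseline, deliberately omitting the BST and Haar-wavelet speedups that make BSTKS fast.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(4)
x = np.concatenate([rng.normal(0, 1, 400), rng.normal(1.5, 1, 400)])

best_t, best_d = None, -1.0
for t in range(50, len(x) - 50):          # keep both segments non-trivial
    d = ks_2samp(x[:t], x[t:]).statistic
    if d > best_d:
        best_t, best_d = t, d
print(f"estimated change point: {best_t} (KS statistic {best_d:.3f})")
```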

  8. Evaluation of statistical treatments of left-censored environmental data using coincident uncensored data sets: I. Summary statistics

    USGS Publications Warehouse

    Antweiler, Ronald C.; Taylor, Howard E.

    2008-01-01

    The main classes of statistical treatment of below-detection limit (left-censored) environmental data for the determination of basic statistics that have been used in the literature are substitution methods, maximum likelihood, regression on order statistics (ROS), and nonparametric techniques. These treatments, along with using all instrument-generated data (even those below detection), were evaluated by examining data sets in which the true values of the censored data were known. It was found that for data sets with less than 70% censored data, the best technique overall for determination of summary statistics was the nonparametric Kaplan-Meier technique. ROS and the two substitution methods of assigning one-half the detection limit value to censored data or assigning a random number between zero and the detection limit to censored data were adequate alternatives. The use of these two substitution methods, however, requires a thorough understanding of how the laboratory censored the data. The technique of employing all instrument-generated data - including numbers below the detection limit - was found to be less adequate than the above techniques. At high degrees of censoring (greater than 70% censored data), no technique provided good estimates of summary statistics. Maximum likelihood techniques were found to be far inferior to all other treatments except substituting zero or the detection limit value to censored data.
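
    A small simulation in the spirit of this comparison: draw known "true" concentrations, censor values below a detection limit, and recover the mean with the two substitution methods discussed above (the lognormal data and the 40% censoring level are assumptions).

```python
import numpy as np

rng = np.random.default_rng(5)
true = rng.lognormal(mean=0.0, sigma=1.0, size=1000)
dl = np.quantile(true, 0.4)                 # detection limit -> ~40% censored
censored = true < dl

half_dl = np.where(censored, dl / 2, true)                          # DL/2 substitution
rand_sub = np.where(censored, rng.uniform(0, dl, true.size), true)  # uniform(0, DL)

print(f"true mean            : {true.mean():.3f}")
print(f"DL/2 substitution    : {half_dl.mean():.3f}")
print(f"uniform substitution : {rand_sub.mean():.3f}")
# The Kaplan-Meier route favored above would instead estimate the
# distribution nonparametrically, e.g., after flipping the left-censored
# data into right-censored form.
```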

  9. Supervised target detection in hyperspectral images using one-class Fukunaga-Koontz Transform

    NASA Astrophysics Data System (ADS)

    Binol, Hamidullah; Bal, Abdullah

    2016-05-01

    A novel hyperspectral target detection technique based on the Fukunaga-Koontz transform (FKT) is presented. The FKT offers significant properties for feature selection and ordering, but it can only be used to solve multi-pattern classification problems. Target detection may be considered a two-class classification problem, i.e., target versus background clutter. Nevertheless, background clutter typically contains different types of materials, which is why target detection techniques differ from classification methods in how they model clutter. To avoid modeling the background clutter, we have improved the one-class FKT (OC-FKT) for target detection. The statistical properties of target training samples are used to define a tunnel-like boundary of the target class. Non-target samples are then created synthetically so as to lie outside this boundary; thus, a limited number of target samples is adequate for training the FKT. The hyperspectral image experiments confirm that the proposed OC-FKT technique provides an effective means for target detection.

  10. It's all relative: ranking the diversity of aquatic bacterial communities.

    PubMed

    Shaw, Allison K; Halpern, Aaron L; Beeson, Karen; Tran, Bao; Venter, J Craig; Martiny, Jennifer B H

    2008-09-01

    The study of microbial diversity patterns is hampered by the enormous diversity of microbial communities and the lack of resources to sample them exhaustively. For many questions about richness and evenness, however, one only needs to know the relative order of diversity among samples rather than total diversity. We used 16S libraries from the Global Ocean Survey to investigate the ability of 10 diversity statistics (including rarefaction, non-parametric, parametric, curve extrapolation and diversity indices) to assess the relative diversity of six aquatic bacterial communities. Overall, we found that the statistics yielded remarkably similar rankings of the samples for a given sequence similarity cut-off. This correspondence, despite the different underlying assumptions of the statistics, suggests that diversity statistics are a useful tool for ranking samples of microbial diversity. In addition, sequence similarity cut-off influenced the diversity ranking of the samples, demonstrating that diversity statistics can also be used to detect differences in phylogenetic structure among microbial communities. Finally, a subsampling analysis suggests that further sequencing from these particular clone libraries would not have substantially changed the richness rankings of the samples.

  11. Statistical approaches to account for false-positive errors in environmental DNA samples.

    PubMed

    Lahoz-Monfort, José J; Guillera-Arroita, Gurutzeta; Tingley, Reid

    2016-05-01

    Environmental DNA (eDNA) sampling is prone to both false-positive and false-negative errors. We review statistical methods to account for such errors in the analysis of eDNA data and use simulations to compare the performance of different modelling approaches. Our simulations illustrate that even low false-positive rates can produce biased estimates of occupancy and detectability. We further show that removing or classifying single PCR detections in an ad hoc manner under the suspicion that such records represent false positives, as sometimes advocated in the eDNA literature, also results in biased estimation of occupancy, detectability and false-positive rates. We advocate alternative approaches to account for false-positive errors that rely on prior information, or the collection of ancillary detection data at a subset of sites using a sampling method that is not prone to false-positive errors. We illustrate the advantages of these approaches over ad hoc classifications of detections and provide practical advice and code for fitting these models in maximum likelihood and Bayesian frameworks. Given the severe bias induced by false-negative and false-positive errors, the methods presented here should be more routinely adopted in eDNA studies. © 2015 John Wiley & Sons Ltd.
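
    A toy simulation of the bias described above: with even a small per-visit false-positive probability, the naive occupancy estimate (any detection implies occupied) overshoots the truth. All rates below are illustrative, and the corrected occupancy models the authors advocate are not implemented here.

```python
import numpy as np

rng = np.random.default_rng(6)
n_sites, n_visits = 200, 5
psi, p_det, p_fp = 0.3, 0.5, 0.05      # occupancy, detection, false-positive rates

occupied = rng.random(n_sites) < psi
p_visit = np.where(occupied, p_det, p_fp)          # per-visit detection probability
detections = rng.random((n_sites, n_visits)) < p_visit[:, None]

naive_psi = detections.any(axis=1).mean()
print(f"true occupancy {psi:.2f} vs naive estimate {naive_psi:.2f}")
```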

  12. Separate modal analysis for tumor detection with a digital image elasto tomography (DIET) breast cancer screening system.

    PubMed

    Kashif, Amer S; Lotz, Thomas F; Heeren, Adrianus M W; Chase, James G

    2013-11-01

    It is estimated that every year 1 × 10^6 women are diagnosed with breast cancer, and more than 410,000 die annually worldwide. Digital Image Elasto Tomography (DIET) is a new noninvasive breast cancer screening modality that induces mechanical vibrations in the breast and images its surface motion with digital cameras to detect changes in stiffness. This research develops a new automated approach for diagnosing breast cancer using DIET based on a modal analysis model. The first and second natural frequencies of silicone phantom breasts are analyzed. Separate modal analysis is performed for each region of the phantom to estimate the modal parameters using imaged motion data over several input frequencies. Statistical methods are used to assess the likelihood of a frequency shift, which can indicate tumor location. Phantoms with 5, 10, and 20 mm stiff inclusions are tested, as well as a homogeneous (healthy) phantom. Inclusions are located at four locations with different depths. The second natural frequency proves to be a reliable metric with the potential to clearly distinguish lesion-like inclusions of different stiffness, as well as providing an approximate location for the tumor-like inclusions. The 10 and 20 mm inclusions are always detected regardless of depth; the 5 mm inclusions are only detected near the surface. The homogeneous phantom always yields a negative result, as expected. Detection is based on a statistical likelihood analysis to determine the presence of a significantly different frequency response over the phantom, which is a novel approach to this problem. The overall results show promise and justify proof-of-concept trials with human subjects.

  13. Model-based iterative reconstruction and adaptive statistical iterative reconstruction: dose-reduced CT for detecting pancreatic calcification.

    PubMed

    Yasaka, Koichiro; Katsura, Masaki; Akahane, Masaaki; Sato, Jiro; Matsuda, Izuru; Ohtomo, Kuni

    2016-01-01

    Iterative reconstruction methods have attracted attention for reducing radiation doses in computed tomography (CT). This study investigated the detectability of pancreatic calcification using dose-reduced CT reconstructed with model-based iterative reconstruction (MBIR) and adaptive statistical iterative reconstruction (ASIR). This prospective study, approved by the Institutional Review Board, included 85 patients (57 men, 28 women; mean age, 69.9 years; mean body weight, 61.2 kg). Unenhanced CT was performed three times with different radiation doses (reference-dose CT [RDCT], low-dose CT [LDCT], and ultralow-dose CT [ULDCT]). From RDCT, LDCT, and ULDCT, images were reconstructed with filtered-back projection (R-FBP, used for establishing the reference standard), ASIR (L-ASIR), and MBIR and ASIR (UL-MBIR and UL-ASIR), respectively. A lesion (pancreatic calcification) detection test was performed by two blinded radiologists using a five-point certainty scale. Dose-length products of RDCT, LDCT, and ULDCT were 410, 97, and 36 mGy-cm, respectively. Nine patients had pancreatic calcification. The sensitivity for detecting pancreatic calcification was high with UL-MBIR (0.67-0.89) compared to L-ASIR or UL-ASIR (0.11-0.44), and a significant difference was seen between UL-MBIR and UL-ASIR for one reader (P = 0.014). The area under the receiver-operating characteristic curve for UL-MBIR (0.818-0.860) was comparable to that for L-ASIR (0.696-0.844). The specificity was lower with UL-MBIR (0.79-0.92) than with L-ASIR or UL-ASIR (0.96-0.99), and a significant difference was seen for one reader (P < 0.01). In conclusion, pancreatic calcification can be detected with high sensitivity in UL-MBIR, although attention should be paid to the slightly lower specificity.

  14. A Practical Guide to Check the Consistency of Item Response Patterns in Clinical Research Through Person-Fit Statistics: Examples and a Computer Program.

    PubMed

    Meijer, Rob R; Niessen, A Susan M; Tendeiro, Jorge N

    2016-02-01

    Although there are many studies devoted to person-fit statistics to detect inconsistent item score patterns, most studies are difficult to understand for nonspecialists. The aim of this tutorial is to explain the principles of these statistics for researchers and clinicians who are interested in applying these statistics. In particular, we first explain how invalid test scores can be detected using person-fit statistics; second, we provide the reader practical examples of existing studies that used person-fit statistics to detect and to interpret inconsistent item score patterns; and third, we discuss a new R-package that can be used to identify and interpret inconsistent score patterns. © The Author(s) 2015.

  15. Initial study of Schroedinger eigenmaps for spectral target detection

    NASA Astrophysics Data System (ADS)

    Dorado-Munoz, Leidy P.; Messinger, David W.

    2016-08-01

    Spectral target detection refers to the process of searching for a specific material with a known spectrum over a large area containing materials with different spectral signatures. Traditional target detection methods in hyperspectral imagery (HSI) assume the data fit some statistical or geometric model and, based on that model, estimate parameters that define a hypothesis test in which one class (the target class) is chosen over the other (the background class). Nonlinear manifold learning methods such as Laplacian eigenmaps (LE) have extensively shown their potential in HSI processing, specifically in classification and segmentation. Recently, Schroedinger eigenmaps (SE), which builds upon LE, has been introduced as a semisupervised classification method. In SE, the Laplacian operator is replaced by the Schroedinger operator, which includes, by definition, a potential term V that steers the transformation in certain directions, improving the separability between classes. In this regard, we propose a methodology for target detection that is not based on the traditional schemes and that does not need the estimation of statistical or geometric parameters. This method is based on SE, where the potential term V encodes prior knowledge about the target class and is used to steer the transformation in directions where the target location in the new space is known and the separability between target and background is augmented. An initial study of how SE can be used in a target detection scheme for HSI is shown here. In-scene pixel and spectral signature detection approaches are presented. The HSI data used comprise various target panels for testing simultaneous detection of multiple objects with different complexities.

  16. Thermal detection thresholds in 5-year-old preterm born children; IQ does matter.

    PubMed

    de Graaf, Joke; Valkenburg, Abraham J; Tibboel, Dick; van Dijk, Monique

    2012-07-01

    Experiencing pain in the newborn period may affect somatosensory perception later in life. Children's perception of cold and warm stimuli may be determined with the Thermal Sensory Analyzer (TSA) device by two different methods. This pilot study in 5-year-old children born preterm aimed to establish whether the TSA method of limits, which is dependent on reaction time, and the method of levels, which is independent of reaction time, would yield different cold and warm detection thresholds. The second aim was to establish possible associations between intellectual ability and the detection thresholds obtained with either method. A convenience sample was drawn from the participants in an ongoing 5-year follow-up study of a randomized controlled trial on the effects of morphine during mechanical ventilation. Thresholds were assessed using both methods and statistically compared, and possible associations between the child's intelligence quotient (IQ) and threshold levels were analyzed. The method of levels yielded more sensitive thresholds than the method of limits, i.e., mean (SD) cold detection thresholds of 30.3 (1.4) versus 28.4 (1.7) (Cohen's d = 1.2, P = 0.001) and warm detection thresholds of 33.9 (1.9) versus 35.6 (2.1) (Cohen's d = 0.8, P = 0.04). IQ was statistically significantly associated only with the detection thresholds obtained with the method of limits (cold: r = 0.64; warm: r = -0.52). The TSA method of levels is to be preferred over the method of limits in 5-year-old preterm born children, as it establishes more sensitive detection thresholds and is independent of IQ. Copyright © 2011 Elsevier Ltd. All rights reserved.
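
    For reference, the reported effect sizes can be reproduced in form with a short Cohen's d computation; the samples below are simulated from the reported means and SDs, not the study's raw thresholds.

```python
import numpy as np

def cohens_d(a, b):
    """Cohen's d using the pooled standard deviation of two equal-size samples."""
    pooled_sd = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
    return (a.mean() - b.mean()) / pooled_sd

rng = np.random.default_rng(7)
levels = rng.normal(30.3, 1.4, 20)    # cold thresholds, method of levels
limits = rng.normal(28.4, 1.7, 20)    # cold thresholds, method of limits
print(f"Cohen's d = {cohens_d(levels, limits):.2f}")
```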

  17. Change detection using landsat time series: A review of frequencies, preprocessing, algorithms, and applications

    NASA Astrophysics Data System (ADS)

    Zhu, Zhe

    2017-08-01

    The free and open access to all archived Landsat images, begun in 2008, has completely changed the way Landsat data are used, and many novel change detection algorithms based on Landsat time series have been developed. We present a comprehensive review of four important aspects of change detection studies based on Landsat time series: frequencies, preprocessing, algorithms, and applications. We observed the trend that the more recent the study, the higher the frequency of the Landsat time series used. We reviewed a series of image preprocessing steps, including atmospheric correction, cloud and cloud shadow detection, and composite/fusion/metrics techniques. We divided all change detection algorithms into six categories: thresholding, differencing, segmentation, trajectory classification, statistical boundary, and regression. Within each category, six major characteristics of the different algorithms were analyzed, such as frequency, change index, univariate/multivariate, online/offline, abrupt/gradual change, and sub-pixel/pixel/spatial. Moreover, some of the widely used change detection algorithms are discussed. Finally, we reviewed different change detection applications by dividing them into two categories: change target and change agent detection.

  18. Climate driven variability and detectability of temporal trends in low flow indicators for Ireland

    NASA Astrophysics Data System (ADS)

    Hall, Julia; Murphy, Conor; Harrigan, Shaun

    2013-04-01

    Observational data from hydrological monitoring programs play an important role in informing decision makers of changes in key hydrological variables. To analyse how changes in climate influence stream flow, undisturbed river basins with near-natural conditions, largely free from human influence, are needed. This study analyses low flow indicators derived from observations from the Irish Reference Network. Within the trend analysis approach, the influence of individual years or sub-periods on the detected trend is analysed using sequential trend tests on all possible periods (of at least 10 years in length), varying the start and end dates of the records for various indicators. Results from this study highlight that the current standard approach of using fixed periods to determine long-term trends is not appropriate, as the statistical significance and direction of trends from short-term records do not persist continuously over the entire record and can be heavily influenced by extremes within the record. This underlines the importance of longer records in contextualising short-term trends derived from fixed periods influenced by natural annual, inter-annual, and multi-decadal variability. Due to the low signal (trend) to noise (variability) ratio, the apparent trends derived from the low flow indicators cannot be used as confident guides to inform future water resources planning and decision making on climate change. In fact, some derived trends contradict expected climate change impacts, and even small changes in study design can change the outcomes to a high degree. It is therefore important not only to evaluate the magnitude of trends derived from monitoring data, but also to ask when a trend of a certain magnitude in a given indicator will become detectable, and what changes would be required for detection at a given significance level. In this study, the influence of the observed variance in the monitoring records on the expected detection times for trends of a fixed magnitude is presented. Depending on the indicator selected, the sample variance, and the trend magnitude, very different detection time estimates are obtained, in most cases beyond the time required for anticipatory adaptation in the water resources sector. Additionally, the minimum changes in low flow indicators required to be detectable are large, and changes are unlikely to be statistically detectable for many years. This means that water management and planning for anticipated future climatic changes will have to take place without these changes being formally statistically detectable; waiting for these trends to become detectable with traditional statistical methods may not be an option for water resources management. Within the monitoring network, a considerable difference is apparent between stations in terms of detection times and the changes required for detection. The existence of flow monitoring stations showing short detection times for specific indicators confirms the potential for identifying stations that may be first responders to climate-induced changes. Identifying such sentinel stations can increase the ability to more effectively optimise the deployment of resources for monitoring the influence of climatic change in a hydrometric reference network.
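
    The sequential trend testing idea (re-running a trend test over every sufficiently long sub-period) can be sketched with a compact Mann-Kendall test; the synthetic flow series, the window grid, and the absence of a tie correction are assumptions made for brevity.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall_z(x):
    """Mann-Kendall z-score with continuity correction (no tie correction)."""
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        return (s - 1) / np.sqrt(var_s)
    if s < 0:
        return (s + 1) / np.sqrt(var_s)
    return 0.0

rng = np.random.default_rng(8)
flow = 100 + 0.3 * np.arange(40) + rng.normal(0, 5, 40)   # weak trend + noise

# Significance flips depending on the sub-period tested.
for start in range(0, 25, 5):
    for end in range(start + 10, 41, 10):
        z = mann_kendall_z(flow[start:end])
        p = 2 * norm.sf(abs(z))
        print(f"years {start}-{end}: z = {z:+.2f}, p = {p:.3f}")
```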

  19. Electrophysiological evidence of heterogeneity in visual statistical learning in young children with ASD.

    PubMed

    Jeste, Shafali S; Kirkham, Natasha; Senturk, Damla; Hasenstab, Kyle; Sugar, Catherine; Kupelian, Chloe; Baker, Elizabeth; Sanders, Andrew J; Shimizu, Christina; Norona, Amanda; Paparella, Tanya; Freeman, Stephanny F N; Johnson, Scott P

    2015-01-01

    Statistical learning is characterized by detection of regularities in one's environment without an awareness or intention to learn, and it may play a critical role in language and social behavior. Accordingly, in this study we investigated the electrophysiological correlates of visual statistical learning in young children with autism spectrum disorder (ASD) using an event-related potential shape learning paradigm, and we examined the relation between visual statistical learning and cognitive function. Compared to typically developing (TD) controls, the ASD group as a whole showed reduced evidence of learning as defined by N1 (early visual discrimination) and P300 (attention to novelty) components. Upon further analysis, in the ASD group there was a positive correlation between N1 amplitude difference and non-verbal IQ, and a positive correlation between P300 amplitude difference and adaptive social function. Children with ASD and a high non-verbal IQ and high adaptive social function demonstrated a distinctive pattern of learning. This is the first study to identify electrophysiological markers of visual statistical learning in children with ASD. Through this work we have demonstrated heterogeneity in statistical learning in ASD that maps onto non-verbal cognition and adaptive social function. © 2014 John Wiley & Sons Ltd.

  20. An ANOVA approach for statistical comparisons of brain networks.

    PubMed

    Fraiman, Daniel; Fraiman, Ricardo

    2018-03-16

    The study of brain networks has developed extensively over the last couple of decades. By contrast, techniques for the statistical analysis of these networks are less developed. In this paper, we focus on the statistical comparison of brain networks in a nonparametric framework and discuss the associated detection and identification problems. We tested network differences between groups with an analysis of variance (ANOVA) test we developed specifically for networks. We also propose and analyse the behaviour of a new statistical procedure designed to identify different subnetworks. As an example, we show the application of this tool in resting-state fMRI data obtained from the Human Connectome Project. We identify, among other variables, that the amount of sleep in the days before the scan is a relevant variable that must be controlled for. Finally, we discuss the potential bias in neuroimaging findings that is generated by some behavioural and brain structure variables. Our method can also be applied to other kinds of networks, such as protein interaction networks, gene networks, or social networks.

  1. Defect detection of castings in radiography images using a robust statistical feature.

    PubMed

    Zhao, Xinyue; He, Zaixing; Zhang, Shuyou

    2014-01-01

    One of the most commonly used optical methods for defect detection is radiographic inspection. Compared with methods that extract defects directly from the radiography image, model-based methods deal with the case of an object with complex structure well. However, detection of small low-contrast defects in nonuniformly illuminated images is still a major challenge for them. In this paper, we present a new method based on the grayscale arranging pairs (GAP) feature to detect casting defects in radiography images automatically. First, a model is built using pixel pairs with a stable intensity relationship based on the GAP feature from previously acquired images. Second, defects can be extracted by comparing the difference of intensity-difference signs between the input image and the model statistically. The robustness of the proposed method to noise and illumination variations has been verified on casting radioscopic images with defects. The experimental results showed that the average computation time of the proposed method in the testing stage is 28 ms per image on a computer with a Pentium Core 2 Duo 3.00 GHz processor. For the comparison, we also evaluated the performance of the proposed method as well as that of the mixture-of-Gaussian-based and crossing line profile methods. The proposed method achieved 2.7% and 2.0% false negative rates in the noise and illumination variation experiments, respectively.
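
    A bare-bones sketch of the grayscale-arranging-pairs idea: learn pixel pairs whose intensity ordering is stable across defect-free training images, then flag orderings that flip in a test image. The random pair sampling and synthetic images below are simplifications of the paper's procedure.

```python
import numpy as np

rng = np.random.default_rng(9)
# 20 clean training images with a fixed illumination gradient plus noise.
train = rng.normal(0.5, 0.05, (20, 64, 64)) + np.linspace(0, 0.2, 64)
flat = train.reshape(20, -1)

# Sample candidate pixel pairs and keep those with a stable ordering.
i_idx = rng.integers(0, flat.shape[1], 5000)
j_idx = rng.integers(0, flat.shape[1], 5000)
signs = np.sign(flat[:, i_idx] - flat[:, j_idx])
stable = np.abs(signs.mean(axis=0)) == 1.0        # same ordering in every image
i_idx, j_idx = i_idx[stable], j_idx[stable]
model_sign = signs[0, stable]                     # the learned orderings

# Test image with an injected "defect" that flips some orderings.
test = train[0].copy().reshape(-1)
test[1000:1010] += 0.4
violations = np.sign(test[i_idx] - test[j_idx]) != model_sign
print(f"{violations.sum()} of {stable.sum()} stable pairs violated")
```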

  2. The effect of exercise on venous gas emboli and decompression sickness in human subjects at 4.3 psia

    NASA Technical Reports Server (NTRS)

    Conkin, Johnny; Waligora, James M.; Horrigan, David J., Jr.; Hadley, Arthur T., III

    1987-01-01

    The contribution of upper body exercise to altitude decompression sickness while at 4.3 psia after 3.5 or 4.0 hours of 100% oxygen prebreathing at 14.7 psia was determined by comparing the incidence and patterns of venous gas emboli (VGE), and the incidence of Type 1 decompression sickness (DCS), in 43 exercising male subjects and 9 less active male Doppler technicians (DTs). Each subject exercised for 4 minutes at each of 3 exercise stations while at 4.3 psia. An additional 4 minutes were spent monitoring for VGE by the DT while the subject was supine on an examination cot. In the combined 3.5 and 4.0 hour oxygen prebreathe data, 13 subjects complained of Type 1 DCS compared to 9 complaints from DTs, and VGE were detected in 28 subjects compared to 14 detections from DTs. A chi-square analysis of proportions showed no statistically significant difference in the incidence of Type 1 DCS or VGE between the two groups; however, the average times to detect VGE and to report Type 1 DCS symptoms were statistically different. It was concluded that 4 to 6 hours of upper body exercise at metabolic rates simulating EVA metabolic rates hastens the initial detection of VGE and the time to report Type 1 DCS symptoms compared to DTs.
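
    The chi-square comparison of proportions can be sketched as follows; the subjects' 13-of-43 split follows the abstract, while the technicians' 2x2 entries are purely illustrative, since the abstract reports complaint counts rather than a per-person breakdown.

```python
import numpy as np
from scipy.stats import chi2_contingency

#                  DCS   no DCS
table = np.array([[13, 30],     # exercising subjects (13/43, from the abstract)
                  [ 4,  5]])    # Doppler technicians (illustrative split)
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```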

  3. Sequential analysis as a tool for detection of amikacin ototoxicity in the treatment of multidrug-resistant tuberculosis.

    PubMed

    Vasconcelos, Karla Anacleto de; Frota, Silvana Maria Monte Coelho; Ruffino-Netto, Antonio; Kritski, Afrânio Lineu

    2018-04-01

    To investigate early detection of amikacin-induced ototoxicity in a population treated for multidrug-resistant tuberculosis (MDR-TB), by means of three different tests: pure-tone audiometry (PTA); high-frequency audiometry (HFA); and distortion-product otoacoustic emission (DPOAE) testing. This was a longitudinal prospective cohort study involving patients aged 18-69 years with a diagnosis of MDR-TB who had to receive amikacin for six months as part of their antituberculosis drug regimen for the first time. Hearing was assessed before treatment initiation and at two and six months after treatment initiation. Sequential statistics were used to analyze the results. We included 61 patients, but the final population consisted of 10 patients (7 men and 3 women) because of sequential analysis. Comparison of the test results obtained at two and six months after treatment initiation with those obtained at baseline revealed that HFA at two months and PTA at six months detected hearing threshold shifts consistent with ototoxicity. However, DPOAE testing did not detect such shifts. The statistical method used in this study makes it possible to conclude that, over the six-month period, amikacin-associated hearing threshold shifts were detected by HFA and PTA, and that DPOAE testing was not efficient in detecting such shifts.

  4. Phenobarbital reduces EEG amplitude and propagation of neonatal seizures but does not alter performance of automated seizure detection.

    PubMed

    Mathieson, Sean R; Livingstone, Vicki; Low, Evonne; Pressler, Ronit; Rennie, Janet M; Boylan, Geraldine B

    2016-10-01

    Phenobarbital increases electroclinical uncoupling, and our preliminary observations suggest it may also affect electrographic seizure morphology. This may alter the performance of a novel seizure detection algorithm (SDA) developed by our group. The objectives of this study were to compare the morphology of seizures before and after phenobarbital administration in neonates and to determine the effect of any changes on automated seizure detection rates. The EEGs of 18 term neonates with seizures both pre- and post-phenobarbital administration (524 seizures) were studied. Ten seizure features were manually quantified, and summary measures for each neonate were statistically compared between pre- and post-phenobarbital seizures. SDA seizure detection rates were also compared. Post-phenobarbital seizures showed significantly lower amplitude (p < 0.001) and involved fewer EEG channels at the peak of seizure (p < 0.05). No other features or SDA detection rates showed a statistical difference. These findings show that phenobarbital reduces both the amplitude and propagation of seizures, which may help to explain electroclinical uncoupling. The seizure detection rate of the algorithm was unaffected by these changes, suggesting that users should not need to adjust the SDA sensitivity threshold after phenobarbital administration. Copyright © 2016 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  5. Detectability of radiological images: the influence of anatomical noise

    NASA Astrophysics Data System (ADS)

    Bochud, Francois O.; Verdun, Francis R.; Hessler, Christian; Valley, Jean-Francois

    1995-04-01

    Radiological image quality can be objectively quantified using statistical decision theory. This theory is commonly applied with the noise of the imaging system alone (quantum, screen, and film noise), whereas the actual noise present in the image is the 'anatomical noise' (the sum of the system noise and the anatomical texture). This anatomical texture should play a role in the detection task. This paper compares these two kinds of noise by performing 2AFC experiments and computing the area under the ROC curve. It is shown that the 'anatomical noise' cannot be considered a noise in the sense of the Wiener spectrum approach, and that the detectability performance is the same as that obtained with the system noise alone in the case of a small object to be detected. Furthermore, statistical decision theory with the non-prewhitening observer does not match the experimental results. This is especially the case at low contrast values, for which the theory predicts an increase in detectability as soon as the contrast differs from zero, whereas the experimental results demonstrate a contrast offset below which detectability is purely random. The theory therefore needs to be improved in order to take this result into account.

  6. [The Battelle developmental inventory screening test for early detection of developmental disorders in cerebral palsy].

    PubMed

    Moraleda-Barreno, E; Romero-López, M; Cayetano-Menéndez, M J

    2011-12-01

    Cerebral palsy is usually associated with motor, cognitive, and language deficits, and with other disorders that cause disability in daily living skills, personal independence, social interaction, and academic activities. Early detection of these deficits in the clinical setting is essential to anticipate and provide the child with the necessary support for adapting to the environment in all possible areas. The main objective of this study is to demonstrate that these deficits can be detected at an early age, and comprehensively, through the use of a brief developmental scale. We studied 100 children between 4 and 70 months old, half of them with cerebral palsy and the other half without any disorder. All subjects were evaluated using the Battelle Developmental Inventory screening test. We compared the developmental quotients in both groups and between the subjects with different motor impairments, using a simple prospective ex post facto design. The test detected statistically significant differences between the clinical group and the control group at all age levels. Statistically significant differences were also found between tetraplegia and the other motor disorders. There were no differences by gender. The developmental deficit associated with cerebral palsy can be quantified at early ages through the use of a brief developmental scale; we therefore propose that the systematic implementation of protocols with this screening tool would be helpful for treatment and early intervention. This would also help in anticipating and establishing the means for the multidisciplinary actions required, and could provide guidance to other health professionals in providing adequate school, social, and family support. Copyright © 2011 Asociación Española de Pediatría. Published by Elsevier Espana. All rights reserved.

  7. Diagnostic accuracy of hepatorenal index in the detection and grading of hepatic steatosis.

    PubMed

    Chauhan, Anil; Sultan, Laith R; Furth, Emma E; Jones, Lisa P; Khungar, Vandana; Sehgal, Chandra M

    2016-11-12

    The objectives of our study were to assess the accuracy of the hepatorenal index (HRI) in the detection and grading of hepatic steatosis and to evaluate various factors that can affect the HRI measurement. Forty-five patients who had undergone an abdominal sonographic examination within 30 days of liver biopsy were enrolled. The HRI was calculated as the ratio of the mean brightness levels of the liver and renal parenchyma. The effect of measurement technique on the HRI was evaluated by using various sizes, depths, and locations of the regions of interest (ROIs) in the liver, with measurements obtained by two observers. The HRI was compared with subjective grading of steatosis. The optimal HRI cutoff to detect steatosis was 2.01, yielding a sensitivity of 62.5% and a specificity of 95.2%; subjective grading had a sensitivity of 87.5% and a specificity of 62.5%. HRIs of the hepatic steatosis group were statistically different from those of the no-steatosis group (p < 0.05). However, there was no statistically significant difference between the mild steatosis and no-steatosis groups (p = 0.72). There was a strong correlation between HRIs based on different placements of the ROIs, except when the ROIs were positioned randomly. The interclass correlation coefficient for measurements performed by the two observers was 0.74 (confidence interval: 0.58-0.86). The HRI is an effective tool for detecting hepatic steatosis; it provides similar accuracy for different methods of ROI placement (except for random placement) and has good interobserver agreement. It is, however, unable to effectively differentiate between absent and mild steatosis. © 2016 Wiley Periodicals, Inc. J Clin Ultrasound 44:580-586, 2016.
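
    The index itself is a simple ratio, as a short sketch shows; the ROI brightness values below are hypothetical, while 2.01 is the study's reported optimal cutoff.

```python
import numpy as np

def hepatorenal_index(liver_roi, kidney_roi):
    """HRI = mean liver brightness / mean renal-parenchyma brightness."""
    return np.mean(liver_roi) / np.mean(kidney_roi)

rng = np.random.default_rng(11)
liver = rng.normal(140, 10, (32, 32))    # hypothetical grayscale ROI values
kidney = rng.normal(60, 8, (32, 32))
hri = hepatorenal_index(liver, kidney)
print(f"HRI = {hri:.2f} -> steatosis suspected: {hri > 2.01}")
```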

  8. The spectral changes of deforestation in the Brazilian tropical savanna.

    PubMed

    Trancoso, Ralph; Sano, Edson E; Meneses, Paulo R

    2015-01-01

    The Cerrado is the biome in Brazil that is experiencing the most rapid loss of natural vegetation. The objective of this study was to analyze the changes in spectral response in the red, near infrared (NIR), and middle infrared (MIR) bands and the normalized difference vegetation index (NDVI) when native vegetation in the Cerrado is deforested. The test sites were regions of the Cerrado located in the states of Bahia, Minas Gerais, and Mato Grosso. For each region, a pair of Landsat Thematic Mapper (TM) scenes from 2008 (before deforestation) and 2009 (after deforestation) was compared. A set of 1,380 samples of deforested polygons and an equal number of samples of native vegetation had their spectral properties statistically analyzed. The accuracy of deforestation detections was also evaluated using high spatial resolution imagery. Results showed that the spectral data of deforested areas and their corresponding native vegetation were statistically different. The red band showed the highest difference between the reflectance data from deforested areas and native vegetation, while the NIR band showed the lowest difference. A consistent pattern of spectral change when native vegetation in the Cerrado is deforested was identified regardless of location in the biome. The overall accuracy of deforestation detections was 97.75%. Considering both the marked pattern of spectral changes and the high deforestation detection accuracy, this study suggests that deforestation in the Cerrado can be accurately monitored, although strong seasonal and spatial variability of the spectral changes should be expected.

  9. Detection of Test Collusion via Kullback-Leibler Divergence

    ERIC Educational Resources Information Center

    Belov, Dmitry I.

    2013-01-01

    The development of statistical methods for detecting test collusion is a new research direction in the area of test security. Test collusion may be described as large-scale sharing of test materials, including answers to test items. Current methods of detecting test collusion are based on statistics also used in answer-copying detection.…
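
    As a hedged illustration of Kullback-Leibler divergence used as a screening statistic, in the spirit of the method named above (the record is truncated, so the exact statistic is an assumption), the sketch below compares a group's answer-option distribution on one item with the population distribution; all counts are fabricated.

```python
# KL divergence between an examinee group's option counts and the
# population's option counts for one item; large values flag groups whose
# answer distribution departs strongly from the population.
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    p = np.asarray(p, float) + eps
    q = np.asarray(q, float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

population = [120, 300, 60, 20]   # option counts for one item, all examinees
test_group = [5, 6, 40, 2]        # suspicious group favoring option C
print(f"KL(group || population) = {kl_divergence(test_group, population):.3f}")
```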

  10. Detection of ochratoxin A contamination in stored wheat using near-infrared hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Senthilkumar, T.; Jayas, D. S.; White, N. D. G.; Fields, P. G.; Gräfenhan, T.

    2017-03-01

    A near-infrared (NIR) hyperspectral imaging system was used to detect five concentration levels of ochratoxin A (OTA) in contaminated wheat kernels. Wheat kernels artificially inoculated with two different OTA-producing Penicillium verrucosum strains, two different non-toxigenic P. verrucosum strains, and sterile control wheat kernels were subjected to NIR hyperspectral imaging. The acquired three-dimensional data were reshaped into two-dimensional data. Principal component analysis (PCA) was applied to the two-dimensional data to identify the key wavelengths of greatest significance for detecting OTA contamination in wheat. Statistical and histogram features extracted at the key wavelengths were used in linear, quadratic, and Mahalanobis statistical discriminant models to differentiate between the sterile controls, the five concentration levels of OTA contamination, and the five infection levels of non-OTA-producing P. verrucosum inoculation. The classification models differentiated sterile control samples from OTA-contaminated wheat kernels and from non-OTA-producing P. verrucosum-inoculated wheat kernels with 100% accuracy. The classification models also differentiated between the five concentration levels of OTA contamination and between the five infection levels of non-OTA-producing P. verrucosum inoculation with correct classification of more than 98%. The non-OTA-producing P. verrucosum-inoculated and OTA-contaminated wheat kernels produced different spectral patterns under hyperspectral imaging.
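
    A minimal sketch of this analysis chain, assuming a hyperspectral cube flattened to a two-dimensional matrix, PCA loadings used to pick key wavelengths, and a linear discriminant model fitted on those bands; shapes, labels, and the band-selection rule are illustrative assumptions rather than the study's exact procedure.

```python
# Flatten a (rows, cols, wavelengths) cube to 2-D, select "key" bands by
# PCA loading magnitude, and fit a linear discriminant classifier.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)
cube = rng.normal(size=(60, 60, 120))            # (rows, cols, wavelengths)
X = cube.reshape(-1, cube.shape[2])              # 3-D cube -> 2-D matrix
y = rng.integers(0, 3, size=X.shape[0])          # control / OTA / non-OTA (toy)

pca = PCA(n_components=3).fit(X)
# wavelengths with the largest absolute loading on PC1 as "key" bands
key_bands = np.argsort(np.abs(pca.components_[0]))[-10:]
clf = LinearDiscriminantAnalysis().fit(X[:, key_bands], y)
print("selected band indices:", sorted(key_bands.tolist()))
print("training accuracy:", clf.score(X[:, key_bands], y))
```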

  11. Model-based iterative reconstruction in low-dose CT colonography-feasibility study in 65 patients for symptomatic investigation.

    PubMed

    Vardhanabhuti, Varut; James, Julia; Nensey, Rehaan; Hyde, Christopher; Roobottom, Carl

    2015-05-01

    To compare image quality on computed tomographic colonography (CTC) acquired at standard dose (STD) and low dose (LD) using filtered back projection (FBP), adaptive statistical iterative reconstruction (ASIR), and model-based iterative reconstruction (MBIR) techniques. A total of 65 symptomatic patients were prospectively enrolled and underwent STD and LD CTC with FBP, ASIR, and MBIR to allow direct per-patient comparison. Objective image noise, subjective image quality, and polyp detection were assessed. Objective image noise analysis demonstrated significant noise reduction with the MBIR technique (P < .05) despite acquisition at lower doses. Subjective image analyses were superior for LD MBIR in all parameters except visibility of extracolonic lesions (two-dimensional) and visibility of the colonic wall (three-dimensional), where there were no significant differences. There was no significant difference in polyp detection rates (P > .05). Doses: LD (dose-length product, 257.7), STD (dose-length product, 483.6). LD MBIR CTC objectively showed improved image noise with the parameters used in our study. Subjectively, image quality was maintained. Polyp detection showed no significant difference but, because of the small numbers, needs further validation. An average dose reduction of 47% can be achieved. This study confirms the feasibility of using MBIR for CTC in a symptomatic population. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.

  12. Predictive Fusion of Geophysical Waveforms using Fisher's Method, under the Alternative Hypothesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carmichael, Joshua Daniel; Nemzek, Robert James; Webster, Jeremy David

    2017-05-05

    This presentation addresses how to combine different signatures from an event or source in a defensible way. The objective was to build a digital detector that continuously combines detection statistics from sensors recording explosions in order to screen sources of interest from null sources.
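
    One standard way to combine per-signature detection statistics, consistent with the objective described above, is Fisher's method on the individual p-values; the sketch below uses SciPy's implementation with illustrative p-values (the fusion rule actually used in the presentation may differ).

```python
# Fisher's method: fuse per-signature detector p-values into one chi-square
# statistic and a single fused p-value.
from scipy.stats import combine_pvalues

p_values = [0.04, 0.20, 0.01]           # per-signature detector p-values
stat, p_fused = combine_pvalues(p_values, method="fisher")
print(f"Fisher chi-square = {stat:.2f}, fused p = {p_fused:.4f}")
# Declare a detection when the fused p-value beats a chosen false-alarm rate.
print("detect:", p_fused < 0.05)
```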

  13. Statistical Lamb wave localization based on extreme value theory

    NASA Astrophysics Data System (ADS)

    Harley, Joel B.

    2018-04-01

    Guided wave localization methods based on delay-and-sum imaging, matched field processing, and other techniques have been designed and researched to create images that locate and describe structural damage. The maximum value of such an image typically represents an estimated damage location. Yet it is often unclear whether this maximum value, or any other value in the image, is a statistically significant indicator of damage. Furthermore, there are currently few, if any, approaches to assess the statistical significance of guided wave localization images. We therefore present statistical delay-and-sum and statistical matched field processing localization methods to create statistically significant images of damage. Our framework uses constant false alarm rate statistics and extreme value theory to detect damage with little prior information. We demonstrate our methods with in situ guided wave data from an aluminum plate to detect two 0.75 cm diameter holes. Our results show the expected improvement in statistical significance as the number of sensors increases. With seventeen sensors, both methods successfully detect damage with statistical significance.
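
    A minimal sketch of the extreme-value idea, under the assumption that the maximum pixel of noise-only localization images is modeled with a generalized extreme value (GEV) distribution and new images are thresholded at a fitted high quantile; all data here are synthetic.

```python
# Fit a GEV to maxima of noise-only images, then flag a new image whose
# maximum exceeds the fitted 99th percentile.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(3)
noise_maxima = [rng.normal(size=(64, 64)).max() for _ in range(200)]
shape, loc, scale = genextreme.fit(noise_maxima)
threshold = genextreme.ppf(0.99, shape, loc=loc, scale=scale)

new_image = rng.normal(size=(64, 64))
new_image[30, 30] += 6.0                 # injected "damage" response
print(f"threshold={threshold:.2f}, image max={new_image.max():.2f}, "
      f"damage detected: {new_image.max() > threshold}")
```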

  14. A scan statistic for binary outcome based on hypergeometric probability model, with an application to detecting spatial clusters of Japanese encephalitis.

    PubMed

    Zhao, Xing; Zhou, Xiao-Hua; Feng, Zijian; Guo, Pengfei; He, Hongyan; Zhang, Tao; Duan, Lei; Li, Xiaosong

    2013-01-01

    As a useful tool for geographical cluster detection of events, the spatial scan statistic is widely applied in many fields and plays an increasingly important role. The classic version of the spatial scan statistic for binary outcomes was developed by Kulldorff, based on the Bernoulli or the Poisson probability model. In this paper, we apply the hypergeometric probability model to construct the likelihood function under the null hypothesis. Compared with existing methods, this likelihood function provides an alternative and indirect way to identify the potential cluster, and the test statistic is the extreme value of the likelihood function. As in Kulldorff's methods, we adopt a Monte Carlo test for significance. Both methods were applied to detect spatial clusters of Japanese encephalitis in Sichuan province, China, in 2009, and the detected clusters were identical. A simulation with independent benchmark data indicates that the test statistic based on the hypergeometric model outperforms Kulldorff's statistics for clusters of high population density or large size; otherwise, Kulldorff's statistics are superior.
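
    A one-dimensional toy version of such a scan statistic with Monte Carlo inference is sketched below: windows of contiguous regions are scored by a hypergeometric tail probability, and the most extreme score is calibrated by randomly redistributing cases. Region populations and counts are fabricated, and the real method scans two-dimensional zones.

```python
# 1-D scan statistic: score each window by the hypergeometric surprise of
# its case count, take the maximum, and calibrate it by Monte Carlo.
import numpy as np
from scipy.stats import hypergeom

def scan_stat(cases, pops):
    C, N = cases.sum(), pops.sum()
    best = 0.0
    for i in range(len(cases)):
        for j in range(i, len(cases)):
            n, c = pops[i:j + 1].sum(), cases[i:j + 1].sum()
            # surprise of seeing >= c cases in a window of population n
            best = max(best, -hypergeom.logsf(c - 1, N, C, n))
    return best

rng = np.random.default_rng(4)
pops = rng.integers(500, 2000, size=12)
cases = rng.binomial(pops, 0.01)
cases[5] += 30                                    # injected cluster
obs = scan_stat(cases, pops)
null = [scan_stat(rng.multinomial(cases.sum(), pops / pops.sum()), pops)
        for _ in range(199)]
p = (1 + sum(s >= obs for s in null)) / 200
print(f"scan statistic = {obs:.1f}, Monte Carlo p = {p:.3f}")
```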

  15. Impact of Immediate Interpretation of Screening Tomosynthesis Mammography on Performance Metrics.

    PubMed

    Winkler, Nicole S; Freer, Phoebe; Anzai, Yoshimi; Hu, Nan; Stein, Matthew

    2018-05-07

    This study aimed to compare performance metrics for immediate and delayed batch interpretation of screening tomosynthesis mammograms. This HIPAA-compliant study was approved by the institutional review board with a waiver of consent. A retrospective analysis compared screening performance metrics for tomosynthesis mammograms interpreted in 2015, when mammograms were read immediately, to historical controls from 2013 to 2014, when mammograms were batch interpreted after the patient had departed. A total of 5518 screening tomosynthesis mammograms (n = 1212 for batch interpretation and n = 4306 for immediate interpretation) were evaluated. The larger sample size for the latter group reflects a group practice shift toward performing tomosynthesis for the majority of patients. Age, breast density, comparison examinations, and high-risk status were compared. An asymptotic proportion test and multivariable analysis were used to compare performance metrics. There was no statistically significant difference in recall or cancer detection rates for the batch interpretation group compared to the immediate interpretation group, with recall rates of 6.5% vs 5.3%, a difference of +1.2% (95% confidence interval -0.3 to 2.7%; P = .101), and cancer detection rates of 6.6 vs 7.2 per thousand, a difference of -0.6 (95% confidence interval -5.9 to 4.6; P = .825). There was no statistically significant difference in positive predictive values (PPVs), including PPV1 (screening recall), PPV2 (biopsy recommendation), or PPV3 (biopsy performed), between batch interpretation (10.1%, 42.1%, and 40.0%, respectively) and immediate interpretation (13.6%, 39.2%, and 39.7%, respectively). After adjusting for age, breast density, high-risk status, and comparison mammogram, there was no difference in the odds of recall or cancer detection between the two groups. There is no statistically significant difference in interpretation performance metrics for screening tomosynthesis mammograms interpreted immediately compared to those interpreted in a delayed fashion. Copyright © 2018. Published by Elsevier Inc.
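
    The recall-rate comparison above amounts to an asymptotic two-proportion test; the sketch below reconstructs approximate counts from the reported rates (6.5% of 1212 vs 5.3% of 4306, rounded), so the p-value only approximates the published one. It assumes a recent statsmodels release.

```python
# Two-proportion z-test on recalls per screening examination, with a
# confidence interval for the rate difference (counts are approximate).
from statsmodels.stats.proportion import (confint_proportions_2indep,
                                          proportions_ztest)

recalls = [79, 228]           # ~6.5% of 1212, ~5.3% of 4306
exams = [1212, 4306]
z, p = proportions_ztest(recalls, exams)
low, high = confint_proportions_2indep(recalls[0], exams[0],
                                       recalls[1], exams[1])
print(f"z = {z:.2f}, p = {p:.3f}, 95% CI for difference = ({low:.3f}, {high:.3f})")
```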

  16. Electroencephalogram Signal Classification for Automated Epileptic Seizure Detection Using Genetic Algorithm

    PubMed Central

    Nanthini, B. Suguna; Santhi, B.

    2017-01-01

    Background: Epilepsy is a disorder in which seizures occur repeatedly in the brain. The electroencephalogram (EEG) test provides valuable information about brain function and can be useful for detecting brain disorders, especially epilepsy. In this study, an automated seizure detection model is introduced. Materials and Methods: The EEG signals are decomposed into sub-bands by the discrete wavelet transform using the db2 (Daubechies) wavelet. Sixteen features are extracted from the raw EEG and its sub-bands: eight statistical features, four gray-level co-occurrence matrix features, and Renyi entropy estimates with four different orders. A genetic algorithm (GA) is used to select eight relevant features from the 16-dimensional feature set. The model has been trained and tested on EEG signals using a support vector machine (SVM) classifier. The performance of the SVM classifier is evaluated on two different databases. Results: The study was run through two different analyses and achieved satisfactory performance for automated seizure detection using the relevant features as input to the SVM classifier. Conclusion: Relevant features selected by the GA give better accuracy for seizure detection. PMID:28781480
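
    A minimal sketch of the core pipeline, assuming a db2 wavelet decomposition, simple sub-band statistics, and an SVM; the GA feature selection, GLCM, and Renyi entropy features are omitted for brevity, and the signals are synthetic.

```python
# Decompose each EEG epoch with a db2 DWT, summarize each sub-band with
# mean and standard deviation, and classify epochs with an RBF SVM.
import numpy as np
import pywt
from sklearn.svm import SVC

def features(epoch):
    bands = pywt.wavedec(epoch, "db2", level=4)   # A4, D4, D3, D2, D1
    return np.array([f(b) for b in bands for f in (np.mean, np.std)])

rng = np.random.default_rng(5)
normal = [rng.normal(size=512) for _ in range(40)]
seizure = [rng.normal(scale=3.0, size=512) for _ in range(40)]  # toy "seizures"
X = np.array([features(e) for e in normal + seizure])
y = np.array([0] * 40 + [1] * 40)
clf = SVC(kernel="rbf").fit(X, y)
print("training accuracy:", clf.score(X, y))
```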

  17. Mixed Effects Models for Resampled Network Statistics Improves Statistical Power to Find Differences in Multi-Subject Functional Connectivity

    PubMed Central

    Narayan, Manjari; Allen, Genevera I.

    2016-01-01

    Many complex brain disorders, such as autism spectrum disorders, exhibit a wide range of symptoms and disability. To understand how brain communication is impaired in such conditions, functional connectivity studies seek to understand individual differences in brain network structure in terms of covariates that measure symptom severity. In practice, however, functional connectivity is not observed but estimated from complex and noisy neural activity measurements. Imperfect subject network estimates can compromise subsequent efforts to detect covariate effects on network structure. We address this problem in the case of Gaussian graphical models of functional connectivity, by proposing novel two-level models that treat both subject level networks and population level covariate effects as unknown parameters. To account for imperfectly estimated subject level networks when fitting these models, we propose two related approaches—R2 based on resampling and random effects test statistics, and R3 that additionally employs random adaptive penalization. Simulation studies using realistic graph structures reveal that R2 and R3 have superior statistical power to detect covariate effects compared to existing approaches, particularly when the number of within subject observations is comparable to the size of subject networks. Using our novel models and methods to study parts of the ABIDE dataset, we find evidence of hypoconnectivity associated with symptom severity in autism spectrum disorders, in frontoparietal and limbic systems as well as in anterior and posterior cingulate cortices. PMID:27147940

  18. Proximal caries detection: Sirona Sidexis versus Kodak Ektaspeed Plus.

    PubMed

    Khan, Emad A; Tyndall, Donald A; Ludlow, John B; Caplan, Daniel

    2005-01-01

    This study compared the accuracy of intraoral film and a charge-coupled device (CCD) receptor for proximal caries detection. Four observers evaluated images of the proximal surfaces of 40 extracted posterior teeth. The presence or absence of caries was scored using a five-point confidence scale. The actual status of each surface was determined from ground-section histology. Responses were evaluated by means of receiver operating characteristic (ROC) analysis. Areas under the ROC curves (Az) were compared with a paired t-test. The performance of the CCD-based intraoral sensor was not statistically different from that of Ektaspeed Plus film in detecting proximal caries.

  19. Interpretation of the rainbow color scale for quantitative medical imaging: perceptually linear color calibration (CSDF) versus DICOM GSDF

    NASA Astrophysics Data System (ADS)

    Chesterman, Frédérique; Manssens, Hannah; Morel, Céline; Serrell, Guillaume; Piepers, Bastian; Kimpe, Tom

    2017-03-01

    Medical displays for primary diagnosis are calibrated to the DICOM GSDF, but there is no accepted standard today that describes how display systems for medical modalities involving color should be calibrated. Recently the Color Standard Display Function (CSDF), a calibration that uses the CIEDE2000 color difference metric to make a display as perceptually linear as possible, has been proposed. In this work we present the results of a first observer study set up to investigate the interpretation accuracy of a rainbow color scale when a medical display is calibrated to CSDF versus DICOM GSDF, and of a second observer study set up to investigate the detectability of color differences when a medical display is calibrated to CSDF, DICOM GSDF, and sRGB. The results of the first study indicate that the error when interpreting a rainbow color scale is lower for CSDF than for DICOM GSDF, with a statistically significant difference (Mann-Whitney U test) for eight out of twelve observers. The results correspond to what is expected based on the CIEDE2000 color differences between consecutive colors along the rainbow color scale for both calibrations. The results of the second study indicate a statistically significant improvement in detecting color differences when a display is calibrated to CSDF compared to DICOM GSDF, and a (non-significant) trend indicating improved detection for CSDF compared to sRGB. To our knowledge this is the first work that shows the added value of a perceptual color calibration method (CSDF) for interpreting medical color images using the rainbow color scale. Improved interpretation of the rainbow color scale may be beneficial in quantitative medical imaging (e.g., PET SUV, quantitative MRI and CT, and Doppler US), where a medical specialist needs to interpret quantitative medical data based on a color scale and/or detect subtle color differences, and where improved interpretation accuracy and improved detection of color differences may contribute to a better diagnosis. Our results indicate that for diagnostic applications involving both grayscale and color images, CSDF should be chosen over DICOM GSDF and sRGB, as it assures excellent detection for color images while maintaining DICOM GSDF for grayscale images.
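
    The color-difference computation underlying such a calibration can be sketched as follows: CIEDE2000 distances between consecutive entries of a rainbow color scale, which a perceptually linear calibration would aim to make uniform. Colormap and conversion routines are from matplotlib and scikit-image; the "jet" colormap is a stand-in for the study's rainbow scale.

```python
# CIEDE2000 distances between consecutive colors of a rainbow scale; a
# perceptually linear mapping would make these distances nearly constant.
import numpy as np
import matplotlib.pyplot as plt
from skimage.color import rgb2lab, deltaE_ciede2000

rainbow = plt.get_cmap("jet")(np.linspace(0, 1, 64))[:, :3]  # RGB in [0, 1]
lab = rgb2lab(rainbow.reshape(1, -1, 3)).reshape(-1, 3)
de = deltaE_ciede2000(lab[:-1], lab[1:])
print(f"dE2000 between consecutive colors: min={de.min():.2f}, "
      f"max={de.max():.2f}, spread={de.std():.2f}")
```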

  20. A comparison of Probability Of Detection (POD) data determined using different statistical methods

    NASA Astrophysics Data System (ADS)

    Fahr, A.; Forsyth, D.; Bullock, M.

    1993-12-01

    Different statistical methods have been suggested for determining probability of detection (POD) data for nondestructive inspection (NDI) techniques. A comparative assessment of various methods of determining POD was conducted using the results of three NDI methods obtained by inspecting actual aircraft engine compressor disks containing service-induced cracks. The study found that the POD and 95 percent confidence curves as a function of crack size, as well as the 90/95 percent crack length, vary depending on the statistical method used and the type of data. The distribution function and the parameter estimation procedure used for determining the POD and the confidence bound must therefore be included when referencing information such as the 90/95 percent crack length. The POD curves and confidence bounds determined using the range interval method depend heavily on information that does not come from the inspection data. The maximum likelihood estimation (MLE) method does not require such information, and its POD results are more reasonable. The log-logistic function appears to model the POD of hit/miss data relatively well and is easy to implement. The log-normal distribution using MLE provides more realistic POD results and is the preferred method. Although it is more complicated and slower to calculate, it can be implemented in a common spreadsheet program.
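
    A minimal sketch of the log-logistic hit/miss POD model favored above: a logistic regression on log crack length is exactly the log-logistic POD curve, fitted here by (near-unpenalized) maximum likelihood on simulated hit/miss data. A real analysis would add the 95 percent confidence bounds discussed in the study.

```python
# Log-logistic POD via logistic regression on log crack length, with the
# POD = 0.90 crack size (a90) extracted from the fitted coefficients.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(6)
a = rng.uniform(0.2, 5.0, size=200)                 # crack lengths (mm)
true_pod = 1 / (1 + np.exp(-(np.log(a) - np.log(1.0)) / 0.25))
hits = rng.random(200) < true_pod                   # hit/miss inspection data

X = np.log(a).reshape(-1, 1)
model = LogisticRegression(C=1e6).fit(X, hits)      # near-unpenalized MLE
b1, b0 = model.coef_[0, 0], model.intercept_[0]
a90 = np.exp((np.log(9.0) - b0) / b1)               # size with POD = 0.90
print(f"estimated a90 = {a90:.2f} mm")
```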

  1. Probability of detection of internal voids in structural ceramics using microfocus radiography

    NASA Technical Reports Server (NTRS)

    Baaklini, G. Y.; Roth, D. J.

    1986-01-01

    The reliability of microfocus X-radiography for detecting subsurface voids in structural ceramic test specimens was statistically evaluated. The microfocus system was operated in the projection mode using low X-ray photon energies (20 keV) and a 10 µm focal spot. The statistics were developed for implanted subsurface voids in green and sintered silicon carbide and silicon nitride test specimens. These statistics were compared with previously obtained statistics for implanted surface voids in similar specimens. Problems associated with void implantation are discussed. Statistical results are given as probability-of-detection curves at a 95 percent confidence level for voids ranging in size from 20 to 528 µm in diameter.

  2. Probability of detection of internal voids in structural ceramics using microfocus radiography

    NASA Technical Reports Server (NTRS)

    Baaklini, G. Y.; Roth, D. J.

    1985-01-01

    The reliability of microfocus x-radiography for detecting subsurface voids in structural ceramic test specimens was statistically evaluated. The microfocus system was operated in the projection mode using low X-ray photon energies (20 keV) and a 10 µm focal spot. The statistics were developed for implanted subsurface voids in green and sintered silicon carbide and silicon nitride test specimens. These statistics were compared with previously obtained statistics for implanted surface voids in similar specimens. Problems associated with void implantation are discussed. Statistical results are given as probability-of-detection curves at a 95 percent confidence level for voids ranging in size from 20 to 528 µm in diameter.

  3. Evaluation of different time domain peak models using extreme learning machine-based peak detection for EEG signal.

    PubMed

    Adam, Asrul; Ibrahim, Zuwairie; Mokhtar, Norrima; Shapiai, Mohd Ibrahim; Cumming, Paul; Mubin, Marizan

    2016-01-01

    Various peak models have been introduced to detect and analyze peaks in the time-domain analysis of electroencephalogram (EEG) signals. In general, a peak model in the time-domain analysis consists of a set of signal parameters, such as amplitude, width, and slope. Models including those proposed by Dumpala, Acir, Liu, and Dingle are routinely used to detect peaks in EEG signals acquired in clinical studies of epilepsy or eye blink. The optimal peak model is the one that gives the most reliable peak detection performance in a particular application. A fair measure of the performance of different models requires a common and unbiased platform. In this study, we evaluate the performance of the four different peak models using the extreme learning machine (ELM)-based peak detection algorithm. We found that the Dingle model gave the best performance, with 72% accuracy in the analysis of real EEG data. Statistical analysis confirmed that the Dingle model afforded significantly better mean testing accuracy than the Acir and Liu models, which were in the range 37-52%. Meanwhile, the Dingle model showed no significant difference compared to the Dumpala model.

  4. A statistical study over Europe of the relative locations of lightning and associated energetic burst of electrons from the radiation belt

    NASA Astrophysics Data System (ADS)

    Bourriez, F.; Sauvaud, J.-A.; Pinçon, J.-L.; Berthelier, J.-J.; Parrot, M.

    2016-02-01

    The DEMETER (Detection of Electro-Magnetic Emissions Transmitted from Earthquake Regions) spacecraft detects short bursts of lightning-induced electron precipitation (LEP) simultaneously with newly injected upgoing whistlers. The LEP occurs within <1 s of the causative lightning discharge. The first in situ observations of the size and location of the region affected by LEP are presented, on the basis of a statistical study made over Europe using the DEMETER energetic particle detector, the wave electric field experiment, and lightning detection networks (Météorage, the UK Met Office Arrival Time Difference network (ATDnet), and the World Wide Lightning Location Network (WWLLN)). The LEP is shown to occur significantly north of the initiating lightning and extends over some 1000 km on each side of the longitude of the lightning. In agreement with models of electron interaction with obliquely propagating lightning-generated whistlers, the distance from the LEP to the lightning decreases as the lightning proceeds to higher latitudes.

  5. A comparator-hypothesis account of biased contingency detection.

    PubMed

    Vadillo, Miguel A; Barberia, Itxaso

    2018-02-12

    Our ability to detect statistical dependencies between different events in the environment is strongly biased by the number of coincidences between them. Even when there is no true covariation between a cue and an outcome, if the marginal probability of either of them is high, people tend to perceive some degree of statistical contingency between both events. The present paper explores the ability of the Comparator Hypothesis to explain the general pattern of results observed in this literature. Our simulations show that this model can account for the biasing effects of the marginal probabilities of cues and outcomes. Furthermore, the overall fit of the Comparator Hypothesis to a sample of experimental conditions from previous studies is comparable to that of the popular Rescorla-Wagner model. These results should encourage researchers to further explore and put to the test the predictions of the Comparator Hypothesis in the domain of biased contingency detection. Copyright © 2018 Elsevier B.V. All rights reserved.

  6. Detection of Undocumented Changepoints Using Multiple Test Statistics and Composite Reference Series.

    NASA Astrophysics Data System (ADS)

    Menne, Matthew J.; Williams, Claude N., Jr.

    2005-10-01

    An evaluation of three hypothesis test statistics that are commonly used in the detection of undocumented changepoints is described. The goal of the evaluation was to determine whether the use of multiple tests could improve undocumented, artificial changepoint detection skill in climate series. The use of successive hypothesis testing is compared to optimal approaches, both of which are designed for situations in which multiple undocumented changepoints may be present. In addition, the importance of the form of the composite climate reference series is evaluated, particularly with regard to the impact of undocumented changepoints in the various component series that are used to calculate the composite.

    In a comparison of single test changepoint detection skill, the composite reference series formulation is shown to be less important than the choice of the hypothesis test statistic, provided that the composite is calculated from the serially complete and homogeneous component series. However, each of the evaluated composite series is not equally susceptible to the presence of changepoints in its components, which may be erroneously attributed to the target series. Moreover, a reference formulation that is based on the averaging of the first-difference component series is susceptible to random walks when the composition of the component series changes through time (e.g., values are missing), and its use is, therefore, not recommended. When more than one test is required to reject the null hypothesis of no changepoint, the number of detected changepoints is reduced proportionately less than the number of false alarms in a wide variety of Monte Carlo simulations. Consequently, a consensus of hypothesis tests appears to improve undocumented changepoint detection skill, especially when reference series homogeneity is violated. A consensus of successive hypothesis tests using a semihierarchic splitting algorithm also compares favorably to optimal solutions, even when changepoints are not hierarchic.
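
    A toy version of undocumented changepoint testing in this spirit: scan all split points of a target-minus-reference difference series with a two-sample t statistic and calibrate the maximum by Monte Carlo simulation of homogeneous series. The series and shift are synthetic, and the paper's actual test statistics differ.

```python
# Maximum two-sample |t| over candidate split points, calibrated against
# simulated homogeneous (no-changepoint) series.
import numpy as np
from scipy import stats

def max_t(series, min_seg=5):
    return max(abs(stats.ttest_ind(series[:k], series[k:]).statistic)
               for k in range(min_seg, len(series) - min_seg))

rng = np.random.default_rng(7)
diff = rng.normal(size=60)
diff[35:] += 1.2                          # artificial shift (e.g., station move)
obs = max_t(diff)
null = [max_t(rng.normal(size=60)) for _ in range(499)]
p = (1 + sum(t >= obs for t in null)) / 500
print(f"max |t| = {obs:.2f}, Monte Carlo p = {p:.3f}")
```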

  7. Hyperspectral target detection using heavy-tailed distributions

    NASA Astrophysics Data System (ADS)

    Willis, Chris J.

    2009-09-01

    One promising approach to target detection in hyperspectral imagery exploits a statistical mixture model to represent scene content at a pixel level. The process then goes on to look for pixels which are rare, when judged against the model, and marks them as anomalies. It is assumed that military targets will themselves be rare and therefore likely to be detected amongst these anomalies. For the typical assumption of multivariate Gaussianity for the mixture components, the presence of the anomalous pixels within the training data will have a deleterious effect on the quality of the model. In particular, the derivation process itself is adversely affected by the attempt to accommodate the anomalies within the mixture components. This will bias the statistics of at least some of the components away from their true values and towards the anomalies. In many cases this will result in a reduction in the detection performance and an increased false alarm rate. This paper considers the use of heavy-tailed statistical distributions within the mixture model. Such distributions are better able to account for anomalies in the training data within the tails of their distributions, and the balance of the pixels within their central masses. This means that an improved model of the majority of the pixels in the scene may be produced, ultimately leading to a better anomaly detection result. The anomaly detection techniques are examined using both synthetic data and hyperspectral imagery with injected anomalous pixels. A range of results is presented for the baseline Gaussian mixture model and for models accommodating heavy-tailed distributions, for different parameterizations of the algorithms. These include scene understanding results, anomalous pixel maps at given significance levels and Receiver Operating Characteristic curves.
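
    The robustness argument can be illustrated in one dimension: anomalies in the training data inflate the scale of a fitted Gaussian, whereas a fitted heavy-tailed Student-t absorbs them in its tails and keeps a tighter core. The sketch below uses synthetic data; the paper's setting is multivariate mixture models.

```python
# Contaminated training data: compare the scale estimated by a Gaussian fit
# with that of a heavy-tailed Student-t fit.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
background = rng.normal(0.0, 1.0, 980)
anomalies = rng.normal(8.0, 1.0, 20)              # rare target-like pixels
train = np.concatenate([background, anomalies])

mu_g, sd_g = stats.norm.fit(train)
df_t, mu_t, sd_t = stats.t.fit(train)
print(f"Gaussian scale = {sd_g:.2f} (inflated by anomalies)")
print(f"Student-t scale = {sd_t:.2f} with df = {df_t:.1f} (closer to the true 1.0)")
```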

  8. Speckle noise reduction in SAR images ship detection

    NASA Astrophysics Data System (ADS)

    Yuan, Ji; Wu, Bin; Yuan, Yuan; Huang, Qingqing; Chen, Jingbo; Ren, Lin

    2012-09-01

    At present, there are two types of methods for detecting ships in SAR images. One is direct detection, which detects the ships themselves. The other is indirect detection: it first detects ship wakes and then searches for ships around the wakes. Both types are affected by speckle noise. In order to improve the accuracy of ship detection and to obtain accurate ship and wake parameters from SAR images, such as ship length, width, area, wake angle, and ship outline, it is necessary to remove speckle noise from SAR images before the data are used for ship detection. The choice of speckle noise reduction filter depends on the specifications of the particular application. Some common filters are widely used in speckle noise reduction, such as the mean filter, the median filter, the Lee filter, the enhanced Lee filter, the Kuan filter, the Frost filter, the enhanced Frost filter, and the Gamma filter, but these filters show some disadvantages in SAR image ship detection because of the variety of ship types. Therefore, the wavelet transform and multi-resolution analysis were used to decompose an SAR ocean image into different frequency components or useful subbands, and to effectively reduce the speckle in the subbands according to the local statistics within the bands. Finally, an analysis of the statistical results is presented, which demonstrates the advantages and disadvantages of wavelet shrinkage techniques compared to standard speckle filters.
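
    A minimal wavelet-shrinkage despeckling sketch, assuming the usual log transform to make speckle approximately additive, soft thresholding of detail subbands with a median-absolute-deviation noise estimate, and reconstruction; the image and threshold constant are illustrative, not the paper's exact scheme.

```python
# Log-transform, soft-threshold the detail subbands of a 2-D DWT, and invert.
import numpy as np
import pywt

rng = np.random.default_rng(12)
clean = np.ones((128, 128))
clean[40:60, 40:90] = 4.0                                    # bright "ship"
speckled = clean * rng.gamma(shape=4.0, scale=0.25, size=clean.shape)

coeffs = pywt.wavedec2(np.log(speckled), "db2", level=3)
out = [coeffs[0]]
for detail in coeffs[1:]:
    # robust per-subband noise estimate from the median absolute deviation
    out.append(tuple(pywt.threshold(d, 2 * np.median(np.abs(d)) / 0.6745, "soft")
                     for d in detail))
denoised = np.exp(pywt.waverec2(out, "db2"))
print("background std before/after:",
      round(speckled[clean == 1].std(), 3), round(denoised[:30, :30].std(), 3))
```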

  9. A Statistical Analysis of Brain Morphology Using Wild Bootstrapping

    PubMed Central

    Ibrahim, Joseph G.; Tang, Niansheng; Rowe, Daniel B.; Hao, Xuejun; Bansal, Ravi; Peterson, Bradley S.

    2008-01-01

    Methods for the analysis of brain morphology, including voxel-based morphometry and surface-based morphometries, have been used to detect associations between brain structure and covariates of interest, such as diagnosis, severity of disease, age, IQ, and genotype. The statistical analysis of morphometric measures usually involves two procedures: 1) invoking a statistical model at each voxel (or point) on the surface of the brain or brain subregion, followed by mapping test statistics (e.g., t test) or their associated p values at each of those voxels; 2) correcting for the multiple statistical tests conducted across all voxels on the surface of the brain region under investigation. We propose new statistical methods for each of these procedures. We first use a heteroscedastic linear model to test the associations between the morphological measures at each voxel on the surface of the specified subregion (e.g., cortical or subcortical surfaces) and the covariates of interest. Moreover, we develop a robust test procedure that is based on a resampling method called wild bootstrapping. This procedure assesses the statistical significance of the associations between a measure of given brain structure and the covariates of interest. The value of this robust test procedure lies in its computational simplicity and in its applicability to a wide range of imaging data, including data from both anatomical and functional magnetic resonance imaging (fMRI). Simulation studies demonstrate that this robust test procedure can accurately control the family-wise error rate. We demonstrate the application of this robust test procedure to the detection of statistically significant differences in the morphology of the hippocampus over time across gender groups in a large sample of healthy subjects. PMID:17649909
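
    A minimal sketch of the wild bootstrap for a heteroscedastic linear model: residuals under the null are resampled with random sign flips (Rademacher weights), which preserves heteroscedasticity, and the null distribution of the slope t statistic is rebuilt. The "morphometric measure" and covariate here are simulated stand-ins.

```python
# Wild bootstrap p-value for the slope of a simple linear model with
# heteroscedastic errors.
import numpy as np

rng = np.random.default_rng(9)
n = 120
x = rng.normal(size=n)                      # covariate (e.g., age or group)
y = 0.0 * x + rng.normal(size=n) * (1 + x**2) ** 0.5   # heteroscedastic null

def t_stat(x, y):
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ beta
    cov = np.linalg.inv(X.T @ X) * (e @ e) / (len(y) - 2)
    return beta[1] / np.sqrt(cov[1, 1])

obs = t_stat(x, y)
e0 = y - y.mean()                            # residuals under the null (no slope)
null = [t_stat(x, y.mean() + e0 * rng.choice([-1, 1], size=n))
        for _ in range(999)]
p = (1 + sum(abs(t) >= abs(obs) for t in null)) / 1000
print(f"t = {obs:.2f}, wild bootstrap p = {p:.3f}")
```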

  10. Body Weight Reducing Effect of Oral Boric Acid Intake

    PubMed Central

    Aysan, Erhan; Sahin, Fikrettin; Telci, Dilek; Yalvac, Mehmet Emir; Emre, Sinem Hocaoglu; Karaca, Cetin; Muslumanoglu, Mahmut

    2011-01-01

    Background: Boric acid is widely used in biology, but its body-weight-reducing effect has not been researched. Methods: Twenty mice were divided into two equal groups. Control group mice drank standard tap water, while study group mice drank tap water with 0.28 mg/250 ml boric acid added, over five days. Total body weight changes, major organ histopathology, blood biochemistry, and urine and feces analyses were compared. Results: Study group mice lost a mean of 28.1% of their body weight, whereas control group mice showed no weight loss and instead gained a mean of 0.09% (p<0.001). Total drinking water and urine outputs were not statistically different. Cholesterol, LDL, AST, ALT, LDH, amylase, and urobilinogen levels were statistically significantly higher in the study group. Other variables were not statistically different. No histopathologic differences were detected on evaluation of all resected major organs. Conclusion: Low-dose oral boric acid intake causes substantial body weight reduction. Blood and urine analyses point to high glucose, lipid, and moderate protein catabolism, but the mechanism is unclear. PMID:22135611

  11. Body weight reducing effect of oral boric acid intake.

    PubMed

    Aysan, Erhan; Sahin, Fikrettin; Telci, Dilek; Yalvac, Mehmet Emir; Emre, Sinem Hocaoglu; Karaca, Cetin; Muslumanoglu, Mahmut

    2011-01-01

    Boric acid is widely used in biology, but its body-weight-reducing effect has not been researched. Twenty mice were divided into two equal groups. Control group mice drank standard tap water, while study group mice drank tap water with 0.28 mg/250 ml boric acid added, over five days. Total body weight changes, major organ histopathology, blood biochemistry, and urine and feces analyses were compared. Study group mice lost a mean of 28.1% of their body weight, whereas control group mice showed no weight loss and instead gained a mean of 0.09% (p<0.001). Total drinking water and urine outputs were not statistically different. Cholesterol, LDL, AST, ALT, LDH, amylase, and urobilinogen levels were statistically significantly higher in the study group. Other variables were not statistically different. No histopathologic differences were detected on evaluation of all resected major organs. Low-dose oral boric acid intake causes substantial body weight reduction. Blood and urine analyses point to high glucose, lipid, and moderate protein catabolism, but the mechanism is unclear.

  12. Evaluation of Oxidative Stress in Bipolar Disorder in terms of Total Oxidant Status, Total Antioxidant Status, and Oxidative Stress Index

    PubMed Central

    CİNGİ YİRÜN, Merve; ÜNAL, Kübranur; ALTUNSOY ŞEN, Neslihan; YİRÜN, Onur; AYDEMİR, Çiğdem; GÖKA, Erol

    2016-01-01

    Introduction Bipolar disorder is one of the most debilitating psychiatric disorders characterized by disruptive episodes of mania/hypomania and depression. Considering the complex role of biological and environmental factors in the etiology of affective disorders, recent studies have focused on oxidative stress, which may damage nerve cell components and take part in pathophysiology. The aim of the present study was to contribute to the data about oxidative stress in bipolar disorder by detecting the total antioxidant status (TAS), total oxidant status (TOS), and oxidative stress index (OSI) levels of manic episode (ME) and euthymic (EU) patients and by comparing these results with those of healthy controls (HCs). Methods The study population consisted of 28 EU outpatients meeting the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5) criteria for bipolar disorder I and 23 inpatients who were currently hospitalized in a psychiatry ward with the diagnosis of the bipolar disorder ME according to the DSM-5 criteria. Forty-three healthy subjects were included in the study as the control group (HC). Serum TAS, TOS, and OSI levels of all the participants were determined. Results Statistical analysis of serum TAS, TOS, and OSI levels did not show any significant differences between the ME patients, EU patients, and HCs. Comparison between the bipolar disorder patients (ME+EU) and HC also did not reveal any statistically significant difference between these two groups in terms of serum TAS, TOS, and OSI levels. Conclusion To date, studies on oxidative stress in bipolar disorder have led to controversial results. In the present study, no statistically significant difference was detected between the oxidative parameters of bipolar disorder patients and HCs. In order to comprehensively evaluate oxidative stress in bipolar disorder, further studies are needed. PMID:28373794

  13. Evaluation of Oxidative Stress in Bipolar Disorder in terms of Total Oxidant Status, Total Antioxidant Status, and Oxidative Stress Index.

    PubMed

    Cingi Yirün, Merve; Ünal, Kübranur; Altunsoy Şen, Neslihan; Yirün, Onur; Aydemir, Çiğdem; Göka, Erol

    2016-09-01

    Bipolar disorder is one of the most debilitating psychiatric disorders characterized by disruptive episodes of mania/hypomania and depression. Considering the complex role of biological and environmental factors in the etiology of affective disorders, recent studies have focused on oxidative stress, which may damage nerve cell components and take part in pathophysiology. The aim of the present study was to contribute to the data about oxidative stress in bipolar disorder by detecting the total antioxidant status (TAS), total oxidant status (TOS), and oxidative stress index (OSI) levels of manic episode (ME) and euthymic (EU) patients and by comparing these results with those of healthy controls (HCs). The study population consisted of 28 EU outpatients meeting the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5) criteria for bipolar disorder I and 23 inpatients who were currently hospitalized in a psychiatry ward with the diagnosis of the bipolar disorder ME according to the DSM-5 criteria. Forty-three healthy subjects were included in the study as the control group (HC). Serum TAS, TOS, and OSI levels of all the participants were determined. Statistical analysis of serum TAS, TOS, and OSI levels did not show any significant differences between the ME patients, EU patients, and HCs. Comparison between the bipolar disorder patients (ME+EU) and HC also did not reveal any statistically significant difference between these two groups in terms of serum TAS, TOS, and OSI levels. To date, studies on oxidative stress in bipolar disorder have led to controversial results. In the present study, no statistically significant difference was detected between the oxidative parameters of bipolar disorder patients and HCs. In order to comprehensively evaluate oxidative stress in bipolar disorder, further studies are needed.

  14. Hyperspectral Imaging in Tandem with R Statistics and Image Processing for Detection and Visualization of pH in Japanese Big Sausages Under Different Storage Conditions.

    PubMed

    Feng, Chao-Hui; Makino, Yoshio; Yoshimura, Masatoshi; Thuyet, Dang Quoc; García-Martín, Juan Francisco

    2018-02-01

    The potential of hyperspectral imaging with wavelengths of 380 to 1000 nm was used to determine the pH of cooked sausages after different storage conditions (4 °C for 1 d; 35 °C for 1, 3, and 5 d). The mean spectra of the sausages were extracted from the hyperspectral images, and a partial least squares regression (PLSR) model was developed to relate the spectral profiles to the pH of the cooked sausages. Eleven important wavelengths were selected based on the regression coefficient values. The PLSR model established using the optimal wavelengths showed good precision, with a prediction coefficient of determination (R_p^2) of 0.909 and a root mean square error of prediction (RMSEP) of 0.035. A prediction map illustrating pH indices in the sausages was for the first time developed in R. The overall results suggest that hyperspectral imaging combined with PLSR and R statistics can quantify and visualize the pH evolution of sausages under different storage conditions. In this paper, hyperspectral imaging is for the first time used to detect pH in cooked sausages using R statistics, which provides additional useful information for researchers who do not have access to MATLAB. Eleven optimal wavelengths were successfully selected and used to simplify the PLSR model established on the full wavelength range. This simplified model achieved a high R_p^2 (0.909) and a low RMSEP (0.035), which can be useful for the design of multispectral imaging systems. © 2017 Institute of Food Technologists®.
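
    A minimal PLSR calibration sketch in the spirit of the above, using scikit-learn rather than R: relate mean spectra to measured pH and report R2 and RMSEP on held-out samples. The spectra, pH values, and number of components are synthetic assumptions.

```python
# PLSR calibration of pH from mean spectra, evaluated on a held-out split.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(10)
wavelengths = np.linspace(380, 1000, 150)
pH = rng.uniform(5.5, 6.8, size=80)
spectra = (pH[:, None] * np.exp(-((wavelengths - 550) / 120) ** 2)[None, :]
           + rng.normal(scale=0.05, size=(80, 150)))

Xtr, Xte, ytr, yte = train_test_split(spectra, pH, random_state=0)
pls = PLSRegression(n_components=5).fit(Xtr, ytr)
pred = pls.predict(Xte).ravel()
print(f"R2p = {r2_score(yte, pred):.3f}, "
      f"RMSEP = {mean_squared_error(yte, pred) ** 0.5:.3f}")
```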

  15. Counterfeit Electronics Detection Using Image Processing and Machine Learning

    NASA Astrophysics Data System (ADS)

    Asadizanjani, Navid; Tehranipoor, Mark; Forte, Domenic

    2017-01-01

    Counterfeiting is an increasing concern for businesses and governments as greater numbers of counterfeit integrated circuits (ICs) infiltrate the global market. There is an ongoing effort in experimental and national labs in the United States to detect and prevent such counterfeits as efficiently as possible. However, there is still a missing piece: automatically detecting counterfeit ICs and properly keeping records of them. Here, we introduce a web application database that allows users to share previous examples of counterfeits through an online database and to obtain statistics regarding the prevalence of known defects. We also investigate automated techniques based on image processing and machine learning to detect different physical defects and to determine whether or not an IC is counterfeit.

  16. Statistical models for the analysis and design of digital polymerase chain (dPCR) experiments

    USGS Publications Warehouse

    Dorazio, Robert; Hunter, Margaret

    2015-01-01

    Statistical methods for the analysis and design of experiments using digital PCR (dPCR) have received only limited attention and have been misused in many instances. To address this issue and to provide a more general approach to the analysis of dPCR data, we describe a class of statistical models for the analysis and design of experiments that require quantification of nucleic acids. These models are mathematically equivalent to generalized linear models of binomial responses that include a complementary, log–log link function and an offset that is dependent on the dPCR partition volume. These models are both versatile and easy to fit using conventional statistical software. Covariates can be used to specify different sources of variation in nucleic acid concentration, and a model’s parameters can be used to quantify the effects of these covariates. For purposes of illustration, we analyzed dPCR data from different types of experiments, including serial dilution, evaluation of copy number variation, and quantification of gene expression. We also showed how these models can be used to help design dPCR experiments, as in selection of sample sizes needed to achieve desired levels of precision in estimates of nucleic acid concentration or to detect differences in concentration among treatments with prescribed levels of statistical power.

  17. Statistical Models for the Analysis and Design of Digital Polymerase Chain Reaction (dPCR) Experiments.

    PubMed

    Dorazio, Robert M; Hunter, Margaret E

    2015-11-03

    Statistical methods for the analysis and design of experiments using digital PCR (dPCR) have received only limited attention and have been misused in many instances. To address this issue and to provide a more general approach to the analysis of dPCR data, we describe a class of statistical models for the analysis and design of experiments that require quantification of nucleic acids. These models are mathematically equivalent to generalized linear models of binomial responses that include a complementary, log-log link function and an offset that is dependent on the dPCR partition volume. These models are both versatile and easy to fit using conventional statistical software. Covariates can be used to specify different sources of variation in nucleic acid concentration, and a model's parameters can be used to quantify the effects of these covariates. For purposes of illustration, we analyzed dPCR data from different types of experiments, including serial dilution, evaluation of copy number variation, and quantification of gene expression. We also showed how these models can be used to help design dPCR experiments, as in selection of sample sizes needed to achieve desired levels of precision in estimates of nucleic acid concentration or to detect differences in concentration among treatments with prescribed levels of statistical power.
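
    For the simplest case of a single sample, the model family described above reduces to a closed-form estimate: with partition volume v entering as an offset on the complementary log-log scale, the maximum likelihood estimate of concentration from k positive partitions out of n is -log(1 - k/n)/v. The sketch below adds a delta-method confidence interval; the counts and volume are illustrative.

```python
# Single-sample dPCR concentration estimate with a delta-method ~95% CI.
import numpy as np

def dpcr_concentration(k, n, v, z=1.96):
    """MLE and approximate 95% CI for copies per unit volume from dPCR counts."""
    p = k / n
    lam = -np.log1p(-p) / v
    se_lam = np.sqrt(p / (n * (1 - p))) / v      # delta method on -log(1 - p)
    return lam, (lam - z * se_lam, lam + z * se_lam)

# 4,500 positive partitions of 20,000, with ~0.85 nL partitions (in uL)
lam, ci = dpcr_concentration(k=4_500, n=20_000, v=0.85e-3)
print(f"lambda = {lam:.1f} copies/uL, 95% CI = ({ci[0]:.1f}, {ci[1]:.1f})")
```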

  18. The Detection of Focal Liver Lesions Using Abdominal CT: A Comparison of Image Quality Between Adaptive Statistical Iterative Reconstruction V and Adaptive Statistical Iterative Reconstruction.

    PubMed

    Lee, Sangyun; Kwon, Heejin; Cho, Jihan

    2016-12-01

    To investigate the image quality characteristics of abdominal computed tomography (CT) scans reconstructed with adaptive statistical iterative reconstruction V (ASIR-V) versus the currently applied adaptive statistical iterative reconstruction (ASIR). This institutional review board-approved study included 35 consecutive patients who underwent CT of the abdomen. Among these 35 patients, 27 with focal liver lesions underwent abdominal CT with a 128-slice multidetector unit using the following parameters: fixed noise index of 30, 1.25 mm slice thickness, 120 kVp, and a gantry rotation time of 0.5 seconds. CT images were analyzed by reconstruction method: ASIR (30%, 50%, and 70%) vs ASIR-V (30%, 50%, and 70%). Three radiologists independently assessed randomized images in a blinded manner. Imaging sets were compared for focal lesion detection numbers, overall image quality, and objective noise with a paired-sample t test. Interobserver agreement was assessed with the intraclass correlation coefficient. The detection of small focal liver lesions (<10 mm) was significantly higher when ASIR-V was used compared to ASIR (P < .001). Subjective image noise, artifact, and objective image noise in the liver were generally significantly better for ASIR-V than for ASIR, especially at 50% ASIR-V. Image sharpness and diagnostic acceptability were significantly worse at 70% ASIR-V compared to the various levels of ASIR. Images reconstructed with 50% ASIR-V were significantly better than the three ASIR series or the other ASIR-V conditions at providing diagnostically acceptable CT scans without compromising image quality and in the detection of focal liver lesions. Copyright © 2016 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.

  19. Progress in the detection of neoplastic progress and cancer by Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Bakker Schut, Tom C.; Stone, Nicholas; Kendall, Catherine A.; Barr, Hugh; Bruining, Hajo A.; Puppels, Gerwin J.

    2000-05-01

    Early detection of cancer is important because of the improved survival rates when the cancer is treated early. We study the application of NIR Raman spectroscopy for the detection of dysplasia because this technique is sensitive to small changes in molecular composition and allows minimally invasive in vivo detection using fiber-optic probes. The results of an in vitro study to detect neoplastic progression in Barrett's esophagus tissue are presented. Using multivariate statistics, we developed three different linear discriminant analysis classification models to predict tissue type on the basis of the measured spectrum. Spectra of normal, metaplastic, and dysplastic tissue could be discriminated with an accuracy of up to 88 percent. Raman spectroscopy therefore seems to be a very suitable technique for detecting dysplasia in Barrett's esophagus tissue.

  20. An integrated logit model for contamination event detection in water distribution systems.

    PubMed

    Housh, Mashor; Ostfeld, Avi

    2015-05-15

    The problem of contamination event detection in water distribution systems has become one of the most challenging research topics in water distribution systems analysis. Current attempts at event detection utilize a variety of approaches including statistical, heuristic, machine learning, and optimization methods. Several existing event detection systems share a common feature in which alarms are obtained separately for each of the water quality indicators. Unifying those single alarms from different indicators is usually performed by means of simple heuristics. A salient feature of the approach developed here is the use of a statistically oriented model for discrete choice prediction, estimated by the maximum likelihood method, to integrate the single alarms. The discrete choice model is jointly calibrated with the other components of the event detection framework on a training data set using genetic algorithms. The fusion of the individual indicator probabilities, which is left out of focus in many existing event detection models, is confirmed to be a crucial part of the system that can be modelled with a discrete choice model to improve performance. The developed methodology is tested on real water quality data, showing improved performance in decreasing the number of false positive alarms and in its ability to detect events with higher probabilities, compared to previous studies. Copyright © 2015 Elsevier Ltd. All rights reserved.
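
    A minimal sketch of the fusion idea, assuming the single-indicator outputs are available as probabilities: a logit model maps the per-indicator probabilities to one event probability instead of OR-ing individual alarms. The indicator data are simulated, and the model here is ordinary logistic regression, not the paper's jointly calibrated system.

```python
# Logit fusion of per-indicator outlier probabilities into one event
# probability (simulated water quality indicator outputs).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(13)
n = 1000
event = rng.random(n) < 0.05                     # true contamination events
# per-indicator "single alarm" probabilities (chlorine, pH, turbidity, ...)
probs = np.clip(rng.normal(0.2, 0.1, (n, 4)) + 0.5 * event[:, None], 0, 1)

fusion = LogisticRegression().fit(probs, event)
p_event = fusion.predict_proba(probs)[:, 1]
print("fused alarm rate at 0.5 threshold:", (p_event > 0.5).mean().round(3))
```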

  1. Ultrasound detection of simulated intra-ocular foreign bodies by minimally trained personnel.

    PubMed

    Sargsyan, Ashot E; Dulchavsky, Alexandria G; Adams, James; Melton, Shannon; Hamilton, Douglas R; Dulchavsky, Scott A

    2008-01-01

    To test the ability of non-expert ultrasound operators of divergent backgrounds to detect the presence, size, location, and composition of foreign bodies in an ocular model. High school students (N = 10) and NASA astronauts (N = 4) completed a brief ultrasound training session focused on basic ultrasound principles and the detection of foreign bodies. The operators used portable ultrasound devices to detect foreign objects of varying location, size (0.5-2 mm), and material (glass, plastic, metal) in a gelatinous ocular model. Operator findings were compared to the known foreign object parameters and to the findings of ultrasound experts (N = 2) to determine accuracy across and between groups. Ultrasound had high sensitivity (astronauts 85%, students 87%, experts 100%) and specificity (astronauts 81%, students 83%, experts 95%) for the detection of foreign bodies. All user groups were able to accurately detect the presence of foreign bodies in this model (astronauts 84%, students 81%, experts 97%). Astronaut and student sensitivity results for material (64% vs. 48%), size (60% vs. 46%), and position (77% vs. 64%) were not statistically different. Experts' results for material (85%), size (90%), and position (98%) were higher; however, the small sample size precluded statistical conclusions. Ultrasound can be used by operators with varying training to detect the presence, location, and composition of intraocular foreign bodies with high sensitivity, specificity, and accuracy.

  2. Application of Scan Statistics to Detect Suicide Clusters in Australia

    PubMed Central

    Cheung, Yee Tak Derek; Spittal, Matthew J.; Williamson, Michelle Kate; Tung, Sui Jay; Pirkis, Jane

    2013-01-01

    Background: Suicide clustering occurs when multiple suicide incidents take place in a small area and/or within a short period of time. In spite of multi-national research attention and particular efforts in preparing guidelines for tackling suicide clusters, the broader epidemiology of suicide clustering remains unclear. This study aimed to develop techniques for using scan statistics to detect clusters, with the detection of suicide clusters in Australia as an example. Methods and Findings: Scan statistics were applied to detect clusters among suicides occurring between 2004 and 2008. Manipulation of the parameter settings and changes of the scan area were performed to remedy shortcomings in existing methods. In total, 243 suicides out of 10,176 (2.4%) were identified as belonging to 15 suicide clusters. These clusters were mainly located in the Northern Territory, the northern part of Western Australia, and the northern part of Queensland. Among the 15 clusters, 4 (26.7%) were detected by both national and state cluster detections, 8 (53.3%) were detected only by the state cluster detection, and 3 (20%) were detected only by the national cluster detection. Conclusions: These findings illustrate that the majority of spatial-temporal clusters of suicide were located in the inland northern areas, with socio-economic deprivation and higher proportions of indigenous people. Discrepancies between national and state/territory cluster detection by scan statistics were due to the contrast of the underlying suicide rates across states/territories. Performing both small-area and large-area analyses, and applying multiple parameter settings, may yield the maximum benefit for exploring clusters. PMID:23342098

  3. Detecting Statistically Significant Communities of Triangle Motifs in Undirected Networks

    DTIC Science & Technology

    2016-04-26

    Final report, covering 15 Oct 2014 to 14 Jan 2015. The work extends that of Perry et al. by developing a statistical framework that supports the detection of triangle motif-based clusters in complex networks, and develops an algorithm for clustering undirected networks based on the triangle configuration.

  4. Bounds on the minimum number of recombination events in a sample history.

    PubMed Central

    Myers, Simon R; Griffiths, Robert C

    2003-01-01

    Recombination is an important evolutionary factor in many organisms, including humans, and understanding its effects is an important task facing geneticists. Detecting past recombination events is thus important; this article introduces statistics that give a lower bound on the number of recombination events in the history of a sample, on the basis of the patterns of variation in the sample DNA. Such lower bounds are appropriate, since many recombination events in the history are typically undetectable, so the true number of historical recombinations is unobtainable. The statistics can be calculated quickly by computer and improve upon the earlier bound of Hudson and Kaplan 1985. A method is developed to combine bounds on local regions in the data to produce more powerful improved bounds. The method is flexible to different models of recombination occurrence. The approach gives recombination event bounds between all pairs of sites, to help identify regions with more detectable recombinations, and these bounds can be viewed graphically. Under coalescent simulations, there is a substantial improvement over the earlier method (of up to a factor of 2) in the expected number of recombination events detected by one of the new minima, across a wide range of parameter values. The method is applied to data from a region within the lipoprotein lipase gene and the amount of detected recombination is substantially increased. Further, there is strong clustering of detected recombination events in an area near the center of the region. A program implementing these statistics, which was used for this article, is available from http://www.stats.ox.ac.uk/mathgen/programs.html. PMID:12586723
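
    For reference, the Hudson-Kaplan bound (Rm) that these statistics improve upon can be sketched compactly: the four-gamete test marks site pairs that require at least one recombination between them, and a greedy interval scan counts the minimum number of non-overlapping such intervals. The haplotype matrix below is a toy example with rows as sequences and columns as segregating sites.

```python
# Hudson-Kaplan lower bound Rm on the number of recombination events,
# computed from the four-gamete test plus a greedy disjoint-interval scan.
import numpy as np

def hudson_kaplan_rm(haps):
    n_sites = haps.shape[1]
    # pairs of sites exhibiting all four gametes need a recombination between them
    intervals = [(i, j) for i in range(n_sites) for j in range(i + 1, n_sites)
                 if len({(a, b) for a, b in zip(haps[:, i], haps[:, j])}) == 4]
    rm, last_end = 0, -1
    for i, j in sorted(intervals, key=lambda ij: ij[1]):
        if i >= last_end:          # interval lies entirely right of the last one
            rm, last_end = rm + 1, j
    return rm

haps = np.array([[0, 0, 0, 1],
                 [0, 1, 1, 0],
                 [1, 0, 1, 0],
                 [1, 1, 0, 1]])
print("Rm lower bound =", hudson_kaplan_rm(haps))
```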

  5. Steady State Fluorescence Spectroscopy for Medical Diagnosis

    NASA Astrophysics Data System (ADS)

    Mahadevan-Jansen, Anita; Gebhart, Steven C.

    Light can react with tissue in different ways and provide information for identifying the physiological state of tissue or detecting the presence of disease. The light used to probe tissue does so in a non-intrusive manner and typically uses very low levels of light far below the requirements for therapeutic applications. The use of fiber optics simplifies the delivery and collection of this light in a minimally invasive manner. Since tissue response is virtually instantaneous, the results are obtained in real-time and the use of data processing techniques and multi-variate statistical analysis allows for automated detection and therefore provides an objective estimation of the tissue state. These then form the fundamental basis for the application of optical techniques for the detection of tissue physiology as well as pathology. These distinct advantages have encouraged many researchers to pursue the development of the different optical interactions for biological and medical detection.

  6. Needs assessment for next generation computer-aided mammography reference image databases and evaluation studies.

    PubMed

    Horsch, Alexander; Hapfelmeier, Alexander; Elter, Matthias

    2011-11-01

    Breast cancer is globally a major threat to women's health. Screening and adequate follow-up can significantly reduce the mortality from breast cancer. Human second reading of screening mammograms can increase breast cancer detection rates, whereas this has not been proven for current computer-aided detection systems acting as a "second reader". Critical factors include the detection accuracy of the systems and the screening experience and training of the radiologist with the system. When assessing the performance of systems and system components, the choice of evaluation methods is particularly critical. Core assets herein are reference image databases and statistical methods. We have analyzed characteristics and usage of the currently largest publicly available mammography database, the Digital Database for Screening Mammography (DDSM) from the University of South Florida, in literature indexed in Medline, IEEE Xplore, SpringerLink, and SPIE, with respect to type of computer-aided diagnosis (CAD) (detection, CADe, or diagnostics, CADx), selection of database subsets, choice of evaluation method, and quality of descriptions. 59 publications presenting 106 evaluation studies met our selection criteria. In 54 studies (50.9%), the selection of test items (cases, images, regions of interest) extracted from the DDSM was not reproducible. Only 2 CADx studies, and no CADe studies, used the entire DDSM. The number of test items varies from 100 to 6000. Different statistical evaluation methods are chosen. Most common are train/test (34.9% of the studies), leave-one-out (23.6%), and N-fold cross-validation (18.9%). Database-related terminology tends to be imprecise or ambiguous, especially regarding the term "case". Overall, both the use of the DDSM as a data source for evaluation of mammography CAD systems and the application of statistical evaluation methods were found to be highly diverse. Results reported from different studies are therefore hardly comparable. Drawbacks of the DDSM (e.g. varying quality of lesion annotations) may contribute to this, but larger bias seems to be caused by authors' own study-design decisions. RECOMMENDATIONS/CONCLUSION: For future evaluation studies, we derive a set of 13 recommendations concerning the construction and usage of a test database, as well as the application of statistical evaluation methods.

  7. CHOBS: Color Histogram of Block Statistics for Automatic Bleeding Detection in Wireless Capsule Endoscopy Video

    PubMed Central

    Ghosh, Tonmoy; Wahid, Khan A.

    2018-01-01

    Wireless capsule endoscopy (WCE) is the most advanced technology for visualizing the whole gastrointestinal (GI) tract in a non-invasive way. Its major disadvantage, however, is the long reviewing time, which is laborious because continuous manual intervention is necessary. In order to reduce the burden on the clinician, this paper proposes an automatic bleeding detection method for WCE video based on the color histogram of block statistics, namely CHOBS. A single pixel in a WCE image may be distorted due to capsule motion in the GI tract. Instead of considering individual pixel values, a block surrounding each pixel is chosen for extracting local statistical features. By combining local block features of the three color planes of the RGB color space, an index value is defined. A color histogram, extracted from those index values, provides a distinguishable color texture feature. A feature reduction technique utilizing the color histogram pattern and principal component analysis is proposed, which can drastically reduce the feature dimension. For bleeding zone detection, blocks are classified using the already-extracted local features, incurring no additional computational burden for feature extraction. Extensive experimentation on several WCE videos and 2300 images collected from a publicly available database shows very satisfactory bleeding frame and zone detection performance in comparison with some of the existing methods. In the case of bleeding frame detection, the accuracy, sensitivity, and specificity obtained from the proposed method are 97.85%, 99.47%, and 99.15%, respectively, and in the case of bleeding zone detection, 95.75% precision is achieved. The proposed method offers not only a low feature dimension but also highly satisfactory bleeding detection performance, and can effectively detect bleeding frames and zones in continuous WCE video data.
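    The exact CHOBS pipeline is not spelled out in this record; the following is a hypothetical, simplified sketch of the block-statistics idea only: represent each pixel by the mean of its surrounding block in each RGB plane, fuse the quantized means into one index, and histogram the indices as the frame feature.

```python
import numpy as np

def block_statistics_index(img, block=3, bins=8):
    """Illustrative block-statistics feature (a simplification, not the
    published CHOBS): each pixel is represented by the mean of its
    surrounding block per RGB plane; the three quantized means are fused
    into one index whose normalized histogram is the frame feature."""
    pad = block // 2
    h, w, _ = img.shape
    padded = np.pad(img.astype(float), ((pad, pad), (pad, pad), (0, 0)), "edge")
    means = np.empty((h, w, 3))
    for c in range(3):                       # per color plane
        for i in range(h):
            for j in range(w):
                means[i, j, c] = padded[i:i + block, j:j + block, c].mean()
    q = np.clip((means / 256.0 * bins).astype(int), 0, bins - 1)
    index = q[..., 0] * bins * bins + q[..., 1] * bins + q[..., 2]
    hist, _ = np.histogram(index, bins=bins ** 3, range=(0, bins ** 3))
    return hist / hist.sum()

frame = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
feature = block_statistics_index(frame)
print(feature.shape)                         # (512,) for bins=8
```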

  8. Comparison of Measures of Predictive Power.

    ERIC Educational Resources Information Center

    Tarling, Roger

    1982-01-01

    The Mean Cost Rating, P(A) from Signal Detection Theory, Kendall's rank correlation coefficient tau, and Goodman and Kruskal's gamma measures of predictive power are compared and shown to be different transformations of the statistic S. Gamma is generally preferred for hypothesis testing. Measures of association for ordered contingency tables are…

  9. Effective Analysis of Reaction Time Data

    ERIC Educational Resources Information Center

    Whelan, Robert

    2008-01-01

    Most analyses of reaction time (RT) data are conducted by using the statistical techniques with which psychologists are most familiar, such as analysis of variance on the sample mean. Unfortunately, these methods are usually inappropriate for RT data, because they have little power to detect genuine differences in RT between conditions. In…

  10. Bioassessment Tools for Stony Corals: Monitoring Approaches and Proposed Sampling Plan for the U.S. Virgin Islands

    EPA Science Inventory

    This document describes three general approaches to the design of a sampling plan for biological monitoring of coral reefs. Status assessment, trend detection and targeted monitoring each require a different approach to site selection and statistical analysis. For status assessm...

  11. Thirty Years of Vegetation Change in the Coastal Santa Cruz Mountains of Northern California Detected Using Landsat Satellite Image Analysis

    NASA Technical Reports Server (NTRS)

    Potter, Christopher

    2015-01-01

    Results from Landsat satellite image time series analysis of this study area since 1983 showed gradual, statistically significant increases in the normalized difference vegetation index (NDVI) in more than 90% of the (predominantly second-growth) evergreen forest locations sampled.
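    The study's exact trend test is not given in this record; as an illustration, a gradual NDVI increase of this kind can be checked per pixel with an ordinary least-squares trend (all numbers synthetic and hypothetical):

```python
import numpy as np
from scipy import stats

def ndvi(nir, red):
    """Normalized difference vegetation index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

# Hypothetical annual reflectances for one forest pixel, 1983-2014.
years = np.arange(1983, 2015)
rng = np.random.default_rng(0)
nir = 0.50 + 0.002 * (years - 1983) + rng.normal(0, 0.01, years.size)
red = 0.08 + rng.normal(0, 0.003, years.size)
series = ndvi(nir, red)

# OLS trend: a significant positive slope indicates the kind of gradual
# NDVI increase reported in the study.
fit = stats.linregress(years, series)
print(f"slope={fit.slope:.4f} NDVI/yr, p={fit.pvalue:.3g}")
```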

  12. Spatial-temporal event detection in climate parameter imagery.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKenna, Sean Andrew; Gutierrez, Karen A.

    Previously developed techniques that comprise statistical parametric mapping, with applications focused on human brain imaging, are examined and tested here for new applications in anomaly detection within remotely-sensed imagery. Two approaches to analysis are developed: online, regression-based anomaly detection and conditional differences. These approaches are applied to two example spatial-temporal data sets: data simulated with a Gaussian field deformation approach and weekly NDVI images derived from global satellite coverage. Results indicate that anomalies can be identified in spatial-temporal data with the regression-based approach. Additionally, La Niña and El Niño climatic conditions are used as different stimuli applied to the earth, and this comparison shows that El Niño conditions lead to significant decreases in NDVI in both the Amazon Basin and southern India.
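    A minimal sketch of the regression-based branch of this approach, under assumed design choices (linear trend plus one annual harmonic, robust 3-sigma residual threshold) that the record does not specify:

```python
import numpy as np

def regression_anomalies(cube, period=52, k=3.0):
    """Flag spatial-temporal anomalies: fit a linear + annual-harmonic
    regression to every pixel's time series and mark residuals exceeding
    k robust standard deviations. cube: (time, rows, cols) array."""
    t = np.arange(cube.shape[0])
    X = np.column_stack([np.ones_like(t), t,
                         np.sin(2 * np.pi * t / period),
                         np.cos(2 * np.pi * t / period)]).astype(float)
    flat = cube.reshape(cube.shape[0], -1)
    beta, *_ = np.linalg.lstsq(X, flat, rcond=None)
    resid = flat - X @ beta
    # robust residual scale per pixel (1.4826 * median absolute deviation)
    scale = 1.4826 * np.median(np.abs(resid - np.median(resid, 0)), 0)
    return (np.abs(resid) > k * scale).reshape(cube.shape)

weekly_ndvi = np.random.rand(104, 20, 20)    # two years of weekly toy images
mask = regression_anomalies(weekly_ndvi)
print(mask.mean())                           # fraction of flagged samples
```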

  13. Intravitreal invading cells contribute to vitreal cytokine milieu in proliferative vitreoretinopathy

    PubMed Central

    El-Ghrably, I; Dua, H.; Orr, G.; Fischer, D.; Tighe, P.

    2001-01-01

    AIM—To examine the contribution of infiltrating cells to the local production of cytokines within the vitreous of patients with proliferative vitreoretinopathy (PVR).
METHODS—The presence of mRNA coding for IL-6, IL-8, IL-1β, IL-1α, TNFα, IFNγ, IL-12, and HPRT was investigated in 25 vitreous samples from patients with PVR, 11 vitreous samples from patients with retinal detachment (RD) not complicated by PVR, and 10 vitreous samples from patients with macular hole (MH). A quantitative reverse transcriptase polymerase chain reaction (RT-PCR) using an internal competitor was used to investigate these samples. From these samples, 15 PVR, 8 RD, and 8 MH were analysed for the protein levels of the same cytokines using enzyme linked immunosorbent assay (ELISA). Spearman correlation was used to test any association between mRNA and cytokine protein levels, as an indicator of the contribution these cells make to the intravitreal cytokine milieu.
RESULTS—A strong correlation was found between mRNA levels and their respective cytokine (protein product) levels for IL-6, IL-8, IL-1β, IL-1α, TNFα, and IFNγ (Spearman r = 0.83, 0.73, 0.67, 0.91, 0.73, and 0.73 respectively), but not for IL-12. The median levels of IL-6, IL-8, IL-1β, and IFNγ mRNA and their respective cytokines were significantly higher (p <0.05) in patients with PVR than in those with macular hole. There was no statistically significant difference in the median levels of IL-1α mRNA between PVR and MH, but the cytokine IL-1α was detected at a significantly higher level in PVR compared with MH patients. Between PVR and RD patients, there was no statistically significant difference in mRNA levels for any of the investigated cytokines (p >0.05) except for IL-6, which reached statistical significance (p = 0.038). In contrast, the median levels of IL-6, IL-8, and IL-1β cytokines were significantly higher (p <0.05) in patients with PVR than in those with RD, whereas for IL-1α and IFNγ no statistically significant difference was detected between PVR and RD patients (p >0.05). When results of RD and MH patients were compared, a statistical difference was only detected in mRNA levels of IFNγ (p = 0.008). However, no difference was detected for IFNγ (protein product) or for any of the other cytokines between RD and MH patients.
CONCLUSION—Levels of both protein and mRNA encoding IL-6, IL-8, IL-1β, and IFNγ are significantly increased in vitreous samples from patients with PVR. The strong correlation between ELISA-detectable cytokines (protein products) and their respective mRNA levels suggests that intravitreal, invading cells are the major source of these cytokines, with the exception of IL-12. Cells invading the vitreous do not appear to produce IL-12 mRNA locally. This would appear to implicate cells peripheral to the vitreal mass as the major source of this cytokine.

 PMID:11264138

  14. Comparison of culture, single and multiplex real-time PCR for detection of Sabin poliovirus shedding in recently vaccinated Indian children.

    PubMed

    Giri, Sidhartha; Rajan, Anand K; Kumar, Nirmal; Dhanapal, Pavithra; Venkatesan, Jayalakshmi; Iturriza-Gomara, Miren; Taniuchi, Mami; John, Jacob; Abraham, Asha Mary; Kang, Gagandeep

    2017-08-01

    Although culture is considered the gold standard for poliovirus detection from stool samples, real-time PCR has emerged as a faster and more sensitive alternative. Detection of poliovirus in the stool of recently vaccinated children by culture, single and multiplex real-time PCR was compared. Of the 80 samples tested, 55 (68.75%) were positive by culture, compared to 61 (76.25%) and 60 (75%) samples by the singleplex and one-step multiplex real-time PCR assays, respectively. Real-time PCR (singleplex and multiplex) is more sensitive than culture for poliovirus detection in stool, although the difference was not statistically significant. © 2017 Wiley Periodicals, Inc.

  15. A robust and efficient statistical method for genetic association studies using case and control samples from multiple cohorts

    PubMed Central

    2013-01-01

    Background The theoretical basis of genome-wide association studies (GWAS) is statistical inference of linkage disequilibrium (LD) between any polymorphic marker and a putative disease locus. Most methods widely implemented for such analyses are vulnerable to several key demographic factors and deliver poor statistical power for detecting genuine associations as well as a high false positive rate. Here, we present a likelihood-based statistical approach that accounts properly for the non-random nature of case–control samples with regard to the genotypic distribution at the loci in the populations under study, and confers flexibility to test for genetic association in the presence of different confounding factors such as population structure and non-randomness of samples. Results We implemented this novel method, together with several popular methods in the GWAS literature, to re-analyze recently published Parkinson's disease (PD) case–control samples. The real data analysis and computer simulation show that the new method confers not only significantly improved statistical power for detecting the associations but also robustness to the difficulties stemming from non-random sampling and genetic structure when compared to its rivals. In particular, the new method detected 44 significant SNPs within 25 chromosomal regions of size < 1 Mb, but only 6 SNPs in two of these regions were previously detected by the trend test based methods. It discovered two SNPs located 1.18 Mb and 0.18 Mb from the PD candidates, FGF20 and PARK8, without invoking false positive risk. Conclusions We developed a novel likelihood-based method which provides adequate estimation of LD and other population model parameters using case and control samples, eases the integration of samples from multiple genetically divergent populations, and thus confers statistically robust and powerful GWAS analyses. On the basis of simulation studies and analysis of real datasets, we demonstrated significant improvement of the new method over the non-parametric trend test, the most widely used in the GWAS literature. PMID:23394771
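    The trend test that serves as this paper's baseline can be stated compactly. A sketch of the Cochran-Armitage trend test, using the standard identity T = N r^2, with r the Pearson correlation between allele dosage and case status (the genotype counts below are hypothetical):

```python
import numpy as np
from scipy.stats import chi2

def cochran_armitage(case_counts, control_counts):
    """Cochran-Armitage trend test for genotype counts (AA, Aa, aa).
    T = N * r^2, r = Pearson correlation between allele dosage (0,1,2)
    and case status; T ~ chi2(1) under the null of no trend."""
    totals = [c + s for c, s in zip(case_counts, control_counts)]
    dosage = np.repeat([0, 1, 2], totals)
    status = np.concatenate([np.repeat([1, 0], [c, s])
                             for c, s in zip(case_counts, control_counts)])
    r = np.corrcoef(dosage, status)[0, 1]
    T = len(dosage) * r ** 2
    return T, chi2.sf(T, df=1)

# Hypothetical SNP: genotype counts (AA, Aa, aa) in cases and controls.
T, p = cochran_armitage(case_counts=[120, 230, 150],
                        control_counts=[180, 240, 80])
print(f"trend statistic={T:.2f}, p={p:.3g}")
```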

  16. Statistics of Shared Components in Complex Component Systems

    NASA Astrophysics Data System (ADS)

    Mazzolini, Andrea; Gherardi, Marco; Caselle, Michele; Cosentino Lagomarsino, Marco; Osella, Matteo

    2018-04-01

    Many complex systems are modular. Such systems can be represented as "component systems," i.e., sets of elementary components, such as LEGO bricks in LEGO sets. The bricks found in a LEGO set reflect a target architecture, which can be built following a set-specific list of instructions. In other component systems, instead, the underlying functional design and constraints are not obvious a priori, and their detection is often a challenge of both scientific and practical importance, requiring a clear understanding of component statistics. Importantly, some quantitative invariants appear to be common to many component systems, most notably a common broad distribution of component abundances, which often resembles the well-known Zipf's law. Such "laws" affect in a general and nontrivial way the component statistics, potentially hindering the identification of system-specific functional constraints or generative processes. Here, we specifically focus on the statistics of shared components, i.e., the distribution of the number of components shared by different system realizations, such as the common bricks found in different LEGO sets. To account for the effects of component heterogeneity, we consider a simple null model, which builds system realizations by random draws from a universe of possible components. Under general assumptions on abundance heterogeneity, we provide analytical estimates of component occurrence, which quantify exhaustively the statistics of shared components. Surprisingly, this simple null model can positively explain important features of empirical component-occurrence distributions obtained from large-scale data on bacterial genomes, LEGO sets, and book chapters. Specific architectural features and functional constraints can be detected from occurrence patterns as deviations from these null predictions, as we show for the illustrative case of the "core" genome in bacteria.
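    A toy version of the null model described, under assumed parameters: build fixed-size realizations by random draws from a universe of components with Zipf-like abundances, then tabulate how many realizations each component occurs in.

```python
import numpy as np

rng = np.random.default_rng(1)
n_universe, n_sets, set_size = 5000, 200, 100   # hypothetical sizes

# Zipf-like component abundances define heterogeneous draw probabilities.
p = 1.0 / np.arange(1, n_universe + 1)
p /= p.sum()

# Each realization ("LEGO set") is a random draw without replacement.
occurrence = np.zeros(n_universe)
for _ in range(n_sets):
    chosen = rng.choice(n_universe, size=set_size, replace=False, p=p)
    occurrence[chosen] += 1

# Occurrence distribution: in how many realizations each component appears.
counts = np.bincount(occurrence.astype(int), minlength=n_sets + 1)
print(counts[:10])   # many rare components, a few widely shared "core" ones
```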

  17. Consideraciones para la estimacion de abundancia de poblaciones de mamiferos. [Considerations for the estimation of abundance of mammal populations.

    USGS Publications Warehouse

    Walker, R.S.; Novare, A.J.; Nichols, J.D.

    2000-01-01

    Estimation of abundance of mammal populations is essential for monitoring programs and for many ecological investigations. The first step for any study of variation in mammal abundance over space or time is to define the objectives of the study and how and why abundance data are to be used. The data used to estimate abundance are count statistics in the form of counts of animals or their signs. There are two major sources of uncertainty that must be considered in the design of the study: spatial variation and the relationship between abundance and the count statistic. Spatial variation in the distribution of animals or signs may be taken into account with appropriate spatial sampling. Count statistics may be viewed as random variables, with the expected value of the count statistic equal to the true abundance of the population multiplied by a coefficient p. With direct counts, p represents the probability of detection or capture of individuals, and with indirect counts it represents the rate of production of the signs as well as their probability of detection. Comparisons of abundance using count statistics from different times or places assume that the p_i are the same for all times or places being compared (p_i = p). In spite of considerable evidence that this assumption rarely holds true, it is commonly made in studies of mammal abundance, as when the minimum number alive or indices based on sign counts are used to compare abundance in different habitats or times. Alternatives to relying on this assumption are to calibrate the index used by testing the assumption p_i = p, or to incorporate the estimation of p into the study design.
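    A worked numerical illustration of the point about the coefficient p (all numbers hypothetical): since the expected count statistic is E[C] = N p, equal counts can mask unequal abundances whenever the detection probabilities differ.

```python
# Expected count statistic: E[C] = N * p, with N the true abundance and
# p the detection probability. Equal counts can hide unequal abundances.
N_forest, p_forest = 200, 0.30          # hypothetical habitat A
N_grassland, p_grassland = 100, 0.60    # hypothetical habitat B

print(N_forest * p_forest)              # 60.0 expected detections
print(N_grassland * p_grassland)        # 60.0 expected detections
# Identical indices, although true abundance differs by a factor of 2:
# comparing raw counts implicitly assumes p_forest == p_grassland.
```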

  18. Teaching Statistics in Biology: Using Inquiry-based Learning to Strengthen Understanding of Statistical Analysis in Biology Laboratory Courses

    PubMed Central

    2008-01-01

    There is an increasing need for students in the biological sciences to build a strong foundation in quantitative approaches to data analysis. Although most science, engineering, and math field majors are required to take at least one statistics course, statistical analysis is poorly integrated into undergraduate biology course work, particularly at the lower-division level. Elements of statistics were incorporated into an introductory biology course, including a review of statistics concepts and an opportunity for students to perform statistical analysis in a biological context. Learning gains were measured with an 11-item statistics learning survey instrument developed for the course. Students showed a statistically significant 25% (p < 0.005) increase in statistics knowledge after completing introductory biology. Students improved their scores on the survey after completing introductory biology, even if they had previously completed an introductory statistics course (9% improvement, p < 0.005). Students retested 1 yr after completing introductory biology showed no loss of their statistics knowledge as measured by this instrument, suggesting that the use of statistics in biology course work may aid long-term retention of statistics knowledge. No statistically significant differences in learning were detected between male and female students in the study. PMID:18765754

  19. Significance of noisy signals in periodograms

    NASA Astrophysics Data System (ADS)

    Süveges, Maria

    2015-08-01

    The detection of tiny periodic signals in noisy and irregularly sampled time series is a challenging task. Once a small peak is found in the periodogram, the next step is to see how probable it is that pure noise produced a peak so extreme - that is to say, to compute its False Alarm Probability (FAP). This useful measure quantifies the statistical plausibility of the found signal among the noise. However, its derivation from statistical principles is very hard due to the specificities of astronomical periodograms, such as oversampling and the ensuing strong correlation among periodogram values at different frequencies. I will present a method to compute the FAP based on extreme-value statistics (Süveges 2014), and compare it to two other methods, proposed by Baluev (2008) and by Paltani (2004) and Schwarzenberg-Czerny (2012), on signals with various signal shapes and at different signal-to-noise ratios.
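    The extreme-value and analytic FAP methods compared in this talk are involved; the brute-force Monte Carlo baseline they approximate is easy to state. A sketch with synthetic, irregularly sampled data (shuffling the magnitudes as the noise model is an assumption here):

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0, 100, 300))               # irregular sampling times
y = 0.05 * np.sin(2 * np.pi * 0.2 * t) + rng.normal(0, 1, t.size)
freqs = np.linspace(0.01, 0.5, 2000) * 2 * np.pi    # angular frequencies

observed = lombscargle(t, y - y.mean(), freqs).max()

# Brute-force FAP: how often does pure noise (shuffled magnitudes on the
# same time grid) produce a peak at least this extreme anywhere?
n_sim = 200
exceed = sum(lombscargle(t, rng.permutation(y - y.mean()), freqs).max()
             >= observed for _ in range(n_sim))
print(f"Monte Carlo FAP ~ {exceed / n_sim:.3f}")
```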

  20. Threshold-free high-power methods for the ontological analysis of genome-wide gene-expression studies

    PubMed Central

    Nilsson, Björn; Håkansson, Petra; Johansson, Mikael; Nelander, Sven; Fioretos, Thoas

    2007-01-01

    Ontological analysis facilitates the interpretation of microarray data. Here we describe new ontological analysis methods which, unlike existing approaches, are threshold-free and statistically powerful. We perform extensive evaluations and introduce a new concept, detection spectra, to characterize methods. We show that different ontological analysis methods exhibit distinct detection spectra, and that it is critical to account for this diversity. Our results argue strongly against the continued use of existing methods, and provide directions towards an enhanced approach. PMID:17488501

  1. Mutual Information in Frequency and Its Application to Measure Cross-Frequency Coupling in Epilepsy

    NASA Astrophysics Data System (ADS)

    Malladi, Rakesh; Johnson, Don H.; Kalamangalam, Giridhar P.; Tandon, Nitin; Aazhang, Behnaam

    2018-06-01

    We define a metric, mutual information in frequency (MI-in-frequency), to detect and quantify the statistical dependence between different frequency components in the data, referred to as cross-frequency coupling, and apply it to electrophysiological recordings from the brain to infer cross-frequency coupling. The current metrics used to quantify cross-frequency coupling in neuroscience cannot detect whether two frequency components in non-Gaussian brain recordings are statistically independent or not. Our MI-in-frequency metric, based on Shannon's mutual information between the Cramér representations of stochastic processes, overcomes this shortcoming and can detect statistical dependence in frequency between non-Gaussian signals. We then describe two data-driven estimators of MI-in-frequency, one based on kernel density estimation and the other based on the nearest neighbor algorithm, and validate their performance on simulated data. We then use MI-in-frequency to estimate mutual information between two data streams that are dependent across time, without making any parametric model assumptions. Finally, we use the MI-in-frequency metric to investigate the cross-frequency coupling in the seizure onset zone from electrocorticographic recordings during seizures. The inferred cross-frequency coupling characteristics are essential to optimize the spatial and spectral parameters of electrical-stimulation-based treatments of epilepsy.

  2. Statistical Methods for Generalized Linear Models with Covariates Subject to Detection Limits.

    PubMed

    Bernhardt, Paul W; Wang, Huixia J; Zhang, Daowen

    2015-05-01

    Censored observations are a common occurrence in biomedical data sets. Although a large amount of research has been devoted to estimation and inference for data with censored responses, very little research has focused on proper statistical procedures when predictors are censored. In this paper, we consider statistical methods for dealing with multiple predictors subject to detection limits within the context of generalized linear models. We investigate and adapt several conventional methods and develop a new multiple imputation approach for analyzing data sets with predictors censored due to detection limits. We establish the consistency and asymptotic normality of the proposed multiple imputation estimator and suggest a computationally simple and consistent variance estimator. We also demonstrate that the conditional mean imputation method often leads to inconsistent estimates in generalized linear models, while several other methods are either computationally intensive or lead to parameter estimates that are biased or more variable compared to the proposed multiple imputation estimator. In an extensive simulation study, we assess the bias and variability of different approaches within the context of a logistic regression model and compare variance estimation methods for the proposed multiple imputation estimator. Lastly, we apply several methods to analyze the data set from a recently-conducted GenIMS study.
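    A minimal sketch of the multiple-imputation idea for a predictor censored below a detection limit, under a stand-in model (standard-normal predictor, logistic outcome) and pooling only the point estimates; this is not the paper's estimator, whose imputation model and variance formulas are more careful.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)

# Toy data: predictor x ~ N(0,1), left-censored at a detection limit.
n, dl = 500, -0.5
x = rng.normal(0, 1, n)
y = rng.binomial(1, 1 / (1 + np.exp(-(0.5 + 1.0 * x)))).astype(float)
censored = x < dl

def fit_logit(x, y):
    """Plain Newton-Raphson logistic regression: returns (intercept, slope)."""
    X = np.column_stack([np.ones_like(x), x])
    b = np.zeros(2)
    for _ in range(25):
        p = 1 / (1 + np.exp(-X @ b))
        W = p * (1 - p)
        b += np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (y - p))
    return b

# Multiple imputation: draw censored values from the left tail of the
# assumed N(0,1) predictor distribution, refit, and average the estimates
# (Rubin's rule for the point estimates only).
a = (dl - 0.0) / 1.0          # standardized upper bound of the left tail
M, fits = 20, []
for _ in range(M):
    x_imp = np.where(censored, 0.0, x)
    x_imp[censored] = stats.truncnorm.rvs(-np.inf, a, loc=0, scale=1,
                                          size=censored.sum(), random_state=rng)
    fits.append(fit_logit(x_imp, y))
print(np.mean(fits, axis=0))   # pooled (intercept, slope)
```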

  3. Applying a statistical PTB detection procedure to complement the gold standard.

    PubMed

    Noor, Norliza Mohd; Yunus, Ashari; Bakar, S A R Abu; Hussin, Amran; Rijal, Omar Mohd

    2011-04-01

    This paper investigates a novel statistical discrimination procedure to detect PTB when the gold standard requirement is taken into consideration. Archived data were used to establish two groups of patients, a control group and a test group. The control group was used to develop the statistical discrimination procedure, using four vectors of wavelet coefficients as feature vectors for the detection of pulmonary tuberculosis (PTB), lung cancer (LC), and normal lung (NL). The procedure was then investigated using the test group, where the number of sputum-positive and sputum-negative cases correctly classified as PTB was noted. The proposed statistical discrimination method is able to detect PTB and LC patients with a high true positive fraction. The method is also able to detect PTB patients who are sputum negative and may therefore be used as a complement to the gold standard. Copyright © 2010 Elsevier Ltd. All rights reserved.

  4. Inverse statistics and information content

    NASA Astrophysics Data System (ADS)

    Ebadi, H.; Bolgorian, Meysam; Jafari, G. R.

    2010-12-01

    Inverse statistics analysis studies the distribution of investment horizons needed to achieve a predefined level of return. The maximum of this distribution determines the most likely horizon for gaining a specific return. There is a significant difference between the inverse statistics of financial market data and those of a fractional Brownian motion (fBm) as an uncorrelated time series, which makes this difference a suitable criterion for measuring the information content of financial data. In this paper we perform this analysis for the DJIA and S&P500 as two developed markets and the Tehran price index (TEPIX) as an emerging market. We also compare these probability distributions with the fBm probability, to detect when the behavior of the stocks is the same as that of an fBm.
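    A short sketch of the inverse-statistics computation itself, on a toy geometric random walk (the target return level and drift are arbitrary choices, not the paper's):

```python
import numpy as np

def investment_horizons(prices, target=0.05):
    """Inverse statistics: for each starting day, the waiting time (in steps)
    until the log-return first reaches the target level."""
    logp = np.log(prices)
    horizons = []
    for i in range(len(logp) - 1):
        hit = np.nonzero(logp[i + 1:] - logp[i] >= target)[0]
        if hit.size:
            horizons.append(hit[0] + 1)
    return np.array(horizons)

rng = np.random.default_rng(3)
prices = np.exp(np.cumsum(rng.normal(0.0002, 0.01, 5000)))   # toy price path
tau = investment_horizons(prices, target=0.05)

# The mode of this distribution is the most likely ("optimal") horizon.
values, counts = np.unique(tau, return_counts=True)
print("most likely horizon:", values[np.argmax(counts)])
```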

  5. Statistical Analysis of speckle noise reduction techniques for echocardiographic Images

    NASA Astrophysics Data System (ADS)

    Saini, Kalpana; Dewal, M. L.; Rohit, Manojkumar

    2011-12-01

    Echocardiography is a safe, easy and fast technology for diagnosing cardiac diseases. As with other ultrasound images, these images contain speckle noise. In some cases this speckle noise is useful, such as in motion detection. In general, however, noise removal is required for better analysis of the image and proper diagnosis. Different adaptive and anisotropic filters are included in the statistical analysis. Statistical parameters such as Signal-to-Noise Ratio (SNR), Peak Signal-to-Noise Ratio (PSNR), and Root Mean Square Error (RMSE) are calculated for performance measurement. One more important aspect is that blurring may occur during speckle noise removal, so the filter should preferably be able to preserve and enhance edges while removing noise.
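    The three figures of merit named here have standard definitions; a small sketch using those conventions (the toy image and multiplicative speckle model are assumptions, not the paper's data):

```python
import numpy as np

def rmse(ref, img):
    """Root mean square error between reference and processed image."""
    return np.sqrt(np.mean((ref.astype(float) - img.astype(float)) ** 2))

def psnr(ref, img, peak=255.0):
    """Peak signal-to-noise ratio in dB for a given peak intensity."""
    return 20 * np.log10(peak / rmse(ref, img))

def snr(ref, img):
    """Signal-to-noise ratio in dB, treating (ref - img) as the noise."""
    noise = ref.astype(float) - img.astype(float)
    return 10 * np.log10(np.sum(ref.astype(float) ** 2) / np.sum(noise ** 2))

clean = np.random.randint(0, 256, (128, 128)).astype(float)
speckled = clean * np.random.gamma(10, 1 / 10, clean.shape)  # multiplicative noise
print(f"RMSE={rmse(clean, speckled):.2f}  PSNR={psnr(clean, speckled):.2f} dB"
      f"  SNR={snr(clean, speckled):.2f} dB")
```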

  6. Examples of sex/gender sensitivity in epidemiological research: results of an evaluation of original articles published in JECH 2006-2014.

    PubMed

    Jahn, Ingeborg; Börnhorst, Claudia; Günther, Frauke; Brand, Tilman

    2017-02-15

    During the last decades, sex and gender biases have been identified in various areas of biomedical and public health research, leading to compromised validity of research findings. As a response, methodological requirements were developed, but these are rarely translated into research practice. The aim of this study is to provide good practice examples of sex/gender-sensitive health research. We conducted a systematic search of research articles published in JECH between 2006 and 2014. An instrument was constructed to evaluate sex/gender sensitivity in four stages of the research process (background, study design, statistical analysis, discussion). In total, 37 articles covering diverse topics were included. Of these, 22 were evaluated as good practice examples in at least one stage; two articles achieved the highest ratings across all stages. Good examples of the background referred to available knowledge on sex/gender differences and sex/gender-informed theoretical frameworks. Related to the study design, good examples calculated sample sizes to be able to detect sex/gender differences, selected sex/gender-sensitive outcome/exposure indicators, or chose different cut-off values for male and female participants. Good examples of statistical analyses used interaction terms with sex/gender or different shapes of the estimated relationship for men and women. Examples of good discussions interpreted their findings in relation to social and biological explanatory models or questioned the statistical methods used to detect sex/gender differences. The identified good practice examples may inspire researchers to critically reflect on the relevance of sex/gender issues in their studies and help them to translate methodological recommendations of sex/gender sensitivity into research practice.
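    One of the analysis practices highlighted, testing whether an effect differs by sex/gender via an interaction term, looks like this in a regression sketch (data and effect sizes hypothetical):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical cohort: does the exposure-outcome slope differ by sex?
rng = np.random.default_rng(11)
n = 400
df = pd.DataFrame({
    "sex": rng.choice(["female", "male"], n),
    "exposure": rng.normal(size=n),
})
true_slope = np.where(df["sex"] == "female", 0.8, 0.3)  # sex-specific effect
df["outcome"] = true_slope * df["exposure"] + rng.normal(size=n)

# The exposure:sex interaction term directly tests whether the slopes
# differ, rather than fitting the sexes separately and eyeballing the gap.
model = smf.ols("outcome ~ exposure * sex", data=df).fit()
print(model.summary().tables[1])
```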

  7. Study on the relationship between the methylation of the MMP-9 gene promoter region and diabetic nephropathy.

    PubMed

    Yang, Xiao-Hui; Feng, Shi-Ya; Yu, Yang; Liang, Zhou

    2018-01-01

    This study aims to explore the relationship between methylation of the matrix metalloproteinase (MMP)-9 gene promoter region and diabetic nephropathy (DN) by detecting the methylation level of the MMP-9 gene promoter region in the peripheral blood of patients with DN at different stages, together with the serum MMP-9 concentration. The methylation level of the MMP-9 gene promoter region was detected by methylation-specific polymerase chain reaction (MSP), and the content of MMP-9 in serum was determined by enzyme-linked immunosorbent assay (ELISA). The statistical analysis revealed that serum MMP-9 protein expression levels gradually increased in the simple diabetic group, early diabetic nephropathy group and clinical diabetic nephropathy group compared with the control group, and the difference was statistically significant (P < 0.05). Compared with the control group, the methylation levels of the MMP-9 gene promoter region gradually decreased in the simple diabetic group, early diabetic nephropathy group, and clinical diabetic nephropathy group, and the difference was statistically significant (P < 0.05). Furthermore, correlation analysis indicated that the demethylation level of the MMP-9 gene promoter region was positively correlated with serum protein levels, urinary albumin to creatinine ratio (UACR), urea and creatinine, and negatively correlated with GFR. The demethylation of the MMP-9 gene promoter region may be involved in the occurrence and development of diabetic nephropathy by regulating the expression of MMP-9 protein in serum.

  8. Transportable data from non-target arthropod field studies for the environmental risk assessment of genetically modified maize expressing an insecticidal double-stranded RNA.

    PubMed

    Ahmad, Aqeel; Negri, Ignacio; Oliveira, Wladecir; Brown, Christopher; Asiimwe, Peter; Sammons, Bernard; Horak, Michael; Jiang, Changjian; Carson, David

    2016-02-01

    As part of an environmental risk assessment, the potential impact of genetically modified (GM) maize MON 87411 on non-target arthropods (NTAs) was evaluated in the field. MON 87411 confers resistance to corn rootworm (CRW; Diabrotica spp.) by expressing an insecticidal double-stranded RNA (dsRNA) transcript and the Cry3Bb1 protein and tolerance to the herbicide glyphosate by producing the CP4 EPSPS protein. Field trials were conducted at 14 sites providing high geographic and environmental diversity within maize production areas from three geographic regions including the U.S., Argentina, and Brazil. MON 87411, the conventional control, and four commercial conventional reference hybrids were evaluated for NTA abundance and damage. Twenty arthropod taxa met minimum abundance criteria for valid statistical analysis. Nine of these taxa occurred in at least two of the three regions and in at least four sites across regions. These nine taxa included: aphid, predatory earwig, lacewing, ladybird beetle, leafhopper, minute pirate bug, parasitic wasp, sap beetle, and spider. In addition to wide regional distribution, these taxa encompass the ecological functions of herbivores, predators and parasitoids in maize agro-ecosystems. Thus, the nine arthropods may serve as representative taxa of maize agro-ecosystems, thereby supporting the proposition that analysis of relevant data generated in one region can be transported to the risk assessment of the same or similar GM crop products in another region. Across the 20 taxa analyzed, no statistically significant differences in abundance were detected between MON 87411 and the conventional control for 123 of the 128 individual-site comparisons (96.1%). For the nine widely distributed taxa, no statistically significant differences in abundance were detected between MON 87411 and the conventional control. Furthermore, no statistically significant differences were detected between MON 87411 and the conventional control for 53 out of 56 individual-site comparisons (94.6%) of NTA pest damage to the crop. In each case where a significant difference was observed in arthropod abundance or damage, the mean value for MON 87411 was within the reference range and/or the difference was not consistently observed across collection methods and/or sites. Thus, the differences were not representative of an adverse effect unfamiliar to maize and/or were not indicative of a consistent plant response associated with the GM traits. Results from this study support a conclusion of no adverse environmental impact of MON 87411 on NTAs compared to conventional maize and demonstrate the utility of relevant transportable data across regions for the ERA of GM crops.

  9. Statistical properties of trading activity in Chinese stock market

    NASA Astrophysics Data System (ADS)

    Sun, Xiaoqian; Cheng, Xueqi; Shen, Huawei; Wang, Zhaoyang

    2010-08-01

    We investigate the statistical properties of traders' trading behavior using the cumulative distribution function (CDF). We analyze exchange data for 52 stocks over a one-year period, comprising non-manipulated stocks and manipulated stocks published by the China Securities Regulatory Commission (CSRC). By analyzing the total number of transactions and the trading volume of each trader over a year, we find that the cumulative distributions have power-law tails and that the distributions differ between non-manipulated and manipulated stocks. These findings can help us to detect manipulated stocks.
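    A sketch of the kind of tail analysis described: the empirical complementary CDF plus a Hill estimate of the power-law tail exponent (toy heavy-tailed data; the authors' estimator is not specified in this record):

```python
import numpy as np

def empirical_ccdf(x):
    """Empirical complementary CDF P(X >= x) evaluated on the sorted sample."""
    xs = np.sort(x)
    return xs, 1.0 - np.arange(1, xs.size + 1) / xs.size

def hill_tail_exponent(x, k=100):
    """Hill estimator of the power-law tail index from the k largest values."""
    xs = np.sort(x)
    threshold = xs[-(k + 1)]
    return 1.0 / np.mean(np.log(xs[-k:] / threshold))

rng = np.random.default_rng(4)
volumes = rng.pareto(2.5, 20000) + 1        # toy heavy-tailed trading volumes
xs, ccdf = empirical_ccdf(volumes)          # plot (xs, ccdf) on log-log axes
print(f"Hill tail exponent ~ {hill_tail_exponent(volumes):.2f}")  # near 2.5
```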

  10. Four modes of optical parametric operation for squeezed state generation

    NASA Astrophysics Data System (ADS)

    Andersen, U. L.; Buchler, B. C.; Lam, P. K.; Wu, J. W.; Gao, J. R.; Bachor, H.-A.

    2003-11-01

    We report a versatile instrument, based on a monolithic optical parametric amplifier, which reliably generates four different types of squeezed light. We obtained vacuum squeezing, low power amplitude squeezing, phase squeezing and bright amplitude squeezing. We show a complete analysis of this light, including a full quantum state tomography. In addition we demonstrate the direct detection of the squeezed state statistics without the aid of a spectrum analyser. This technique makes the nonclassical properties directly visible and allows complete measurement of the statistical moments of the squeezed quadrature.

  11. Change detection in a time series of polarimetric SAR data by an omnibus test statistic and its factorization (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Nielsen, Allan A.; Conradsen, Knut; Skriver, Henning

    2016-10-01

    Test statistics for comparison of real (as opposed to complex) variance-covariance matrices exist in the statistics literature [1]. In earlier publications we have described a test statistic for the equality of two variance-covariance matrices following the complex Wishart distribution with an associated p-value [2]. We showed their application to bitemporal change detection and to edge detection [3] in multilook, polarimetric synthetic aperture radar (SAR) data in the covariance matrix representation [4]. The test statistic and the associated p-value are also described in [5]. In [6] we focussed on the block-diagonal case, we elaborated on some computer implementation issues, and we gave examples of the application to change detection in both full and dual polarization bitemporal, bifrequency, multilook SAR data. In [7] we described an omnibus test statistic Q for the equality of k variance-covariance matrices following the complex Wishart distribution. We also described a factorization of Q = R2 R3 … Rk, where Q and the Rj determine if and when a difference occurs. Additionally, we gave p-values for Q and Rj. Finally, we demonstrated the use of Q and Rj and the p-values for change detection in truly multitemporal, full polarization SAR data. Here we illustrate the methods by means of airborne L-band SAR data (EMISAR) [8,9]. The methods may also be applied to other polarimetric SAR data, such as data from Sentinel-1, COSMO-SkyMed, TerraSAR-X, ALOS, and RadarSat-2, and to single-pol data. The account given here closely follows that given in our recent IEEE TGRS paper [7]. Selected References [1] Anderson, T. W., An Introduction to Multivariate Statistical Analysis, John Wiley, New York, third ed. (2003). [2] Conradsen, K., Nielsen, A. A., Schou, J., and Skriver, H., "A test statistic in the complex Wishart distribution and its application to change detection in polarimetric SAR data," IEEE Transactions on Geoscience and Remote Sensing 41(1): 4-19, 2003. [3] Schou, J., Skriver, H., Nielsen, A. A., and Conradsen, K., "CFAR edge detector for polarimetric SAR images," IEEE Transactions on Geoscience and Remote Sensing 41(1): 20-32, 2003. [4] van Zyl, J. J. and Ulaby, F. T., "Scattering matrix representation for simple targets," in Radar Polarimetry for Geoscience Applications, Ulaby, F. T. and Elachi, C., eds., Artech, Norwood, MA (1990). [5] Canty, M. J., Image Analysis, Classification and Change Detection in Remote Sensing, with Algorithms for ENVI/IDL and Python, Taylor & Francis, CRC Press, third revised ed. (2014). [6] Nielsen, A. A., Conradsen, K., and Skriver, H., "Change detection in full and dual polarization, single- and multi-frequency SAR data," IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing 8(8): 4041-4048, 2015. [7] Conradsen, K., Nielsen, A. A., and Skriver, H., "Determining the points of change in time series of polarimetric SAR data," IEEE Transactions on Geoscience and Remote Sensing 54(5): 3007-3024, 2016. [9] Christensen, E. L., Skou, N., Dall, J., Woelders, K., Jørgensen, J. H. J., Granholm, J., and Madsen, S. N., "EMISAR: An absolutely calibrated polarimetric L- and C-band SAR," IEEE Transactions on Geoscience and Remote Sensing 36: 1852-1865 (1998).
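    A sketch of the omnibus statistic from [7] for k multilook covariance matrices X_i following the complex Wishart distribution: ln Q = n(pk ln k + Σ ln|X_i| − k ln|Σ X_i|), with −2 ln Q asymptotically χ² on (k−1)p² degrees of freedom. The paper's small-sample correction factors are omitted here, and the toy Wishart simulation is only illustrative:

```python
import numpy as np
from scipy.stats import chi2

def omnibus_test(matrices, n_looks):
    """Omnibus equality test for k complex Wishart matrices:
    lnQ = n[pk ln k + sum ln|X_i| - k ln|sum X_i|]; -2 lnQ ~ chi2((k-1)p^2)
    asymptotically (small-sample correction factor omitted in this sketch)."""
    k, p = len(matrices), matrices[0].shape[0]
    X = sum(matrices)
    logdets = [np.log(np.linalg.det(m).real) for m in matrices]
    lnQ = n_looks * (p * k * np.log(k) + sum(logdets)
                     - k * np.log(np.linalg.det(X).real))
    stat = -2 * lnQ
    return stat, chi2.sf(stat, df=(k - 1) * p ** 2)

# Toy example: three 2x2 dual-pol covariance matrices, the third one changed.
rng = np.random.default_rng(5)

def wishart(sigma, n):
    """Draw a p x p complex Wishart-distributed sum of n outer products."""
    L = np.linalg.cholesky(sigma)
    z = (rng.normal(size=(n, 2)) + 1j * rng.normal(size=(n, 2))) @ L.conj().T
    return z.conj().T @ z

sigma = np.array([[2.0, 0.3], [0.3, 1.0]], dtype=complex)
mats = [wishart(sigma, 32), wishart(sigma, 32), wishart(3 * sigma, 32)]
print(omnibus_test(mats, n_looks=32))   # large statistic, tiny p-value
```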

  12. Detection of foreign substances in food using thermography

    NASA Astrophysics Data System (ADS)

    Meinlschmidt, Peter; Maergner, Volker

    2002-03-01

    This paper gives a short introduction to the possibility of detecting foreign bodies in food using IR thermography. The first results, shown for combinations of cherries and chocolate and for berries contaminated with leaves, stalks, pedicels and thorns, could easily be evaluated manually. Differentiation exploits the differing emissivity coefficients or the different heat conductivities and/or capacities of the materials. Applying pulse thermography, first heat-conductivity measurements of different food materials were performed. Calculating the contrast of possible food/contaminant combinations shows the difficulty of differentiating certain materials. A possible automatic evaluation was demonstrated for raisins contaminated with wooden sticks and for almonds blended with stones. The power of specially adapted algorithms using statistical or morphological analysis in distinguishing the foreign bodies from the foodstuff is shown.

  13. Expression of Caspase-1 in breast cancer tissues and its effects on cell proliferation, apoptosis and invasion.

    PubMed

    Sun, Yanxia; Guo, Yingzhen

    2018-05-01

    The present study aimed to detect the expression of Caspase-1 in the tumor tissues and tumor-adjacent tissues of patients with breast cancer, and to investigate the effects of Caspase-1 on the proliferation, apoptosis and invasion of breast cancer cells. Reverse transcription-quantitative polymerase chain reaction was used to detect Caspase-1 mRNA expression in breast cancer tissues and tumor-adjacent tissues from patients. Additionally, the human breast cancer MDA-MB-231 cell line was treated with the Caspase-1 small-molecule inhibitor Ac-YVAD-CMK, following which the changes in Caspase-1 protein expression were detected via western blotting. The MTT assay was used to detect changes in cell proliferation, flow cytometry to measure the rate of apoptosis, and a Transwell assay to assess invasion. Caspase-1 mRNA expression was decreased in the breast cancer tissues of patients compared with the tumor-adjacent tissues, a difference that was statistically significant (P<0.05). Treatment with Ac-YVAD-CMK markedly decreased the protein expression of Caspase-1 in MDA-MB-231 cells, and the difference was statistically significant (P<0.05). Following Ac-YVAD-CMK treatment, the proliferation and invasion abilities of the cells markedly increased, while apoptosis significantly decreased (P<0.05). In conclusion, the expression of Caspase-1 is low in breast cancer tissues, which may promote the proliferation and invasion of breast cancer cells and could be closely associated with the occurrence and development of breast cancer.

  14. A spatial scan statistic for multiple clusters.

    PubMed

    Li, Xiao-Zhou; Wang, Jin-Feng; Yang, Wei-Zhong; Li, Zhong-Jie; Lai, Sheng-Jie

    2011-10-01

    Spatial scan statistics are commonly used for geographical disease surveillance and cluster detection. When multiple clusters coexist in the study area, they become difficult to detect because of the clusters' shadowing effect on each other. The recently proposed sequential method showed better power for detecting the second, weaker cluster, but did not improve the ability to detect the first, stronger cluster, which is the more important of the two. We propose a new extension of the spatial scan statistic for detecting multiple clusters. By constructing two or more clusters in the alternative hypothesis, our proposed method accounts for other coexisting clusters in the detection and evaluation process. The performance of the proposed method is compared to the sequential method through an intensive simulation study, in which our proposed method shows better power in terms of both rejecting the null hypothesis and accurately detecting the coexisting clusters. In a real study of hand-foot-mouth disease data in Pingdu city, a true cluster town is successfully detected by our proposed method; it could not be evaluated as statistically significant by the standard method because of another cluster's shadowing effect. Copyright © 2011 Elsevier Inc. All rights reserved.
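    For context, the standard single-cluster scan that this proposal extends can be sketched in one dimension: maximize the Kulldorff Poisson log-likelihood ratio over contiguous windows and assess it by Monte Carlo randomization (the expected counts are assumed scaled to the case total; all data toy):

```python
import numpy as np

rng = np.random.default_rng(6)

def scan_1d(cases, expected):
    """Most likely cluster under a Kulldorff-style Poisson scan over
    contiguous 1-D windows. Assumes expected.sum() == cases.sum()."""
    C = cases.sum()
    best_llr, best_win = 0.0, None
    for i in range(len(cases)):
        for j in range(i + 1, len(cases) + 1):
            c, e = cases[i:j].sum(), expected[i:j].sum()
            if 0 < c < C and c > e:          # elevated-risk windows only
                llr = c * np.log(c / e) + (C - c) * np.log((C - c) / (C - e))
                if llr > best_llr:
                    best_llr, best_win = llr, (i, j)
    return best_win, best_llr

def monte_carlo_p(cases, expected, n_sim=499):
    """Significance of the most likely cluster by multinomial randomization."""
    _, observed = scan_1d(cases, expected)
    p = expected / expected.sum()
    null = [scan_1d(rng.multinomial(cases.sum(), p), expected)[1]
            for _ in range(n_sim)]
    return (1 + sum(s >= observed for s in null)) / (n_sim + 1)

cases = np.array([3, 2, 4, 12, 15, 11, 3, 2, 4, 4])   # toy counts per town
expected = np.full(10, cases.sum() / 10.0)
print(scan_1d(cases, expected), monte_carlo_p(cases, expected))
```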

  15. What do results from coordinate-based meta-analyses tell us?

    PubMed

    Albajes-Eizagirre, Anton; Radua, Joaquim

    2018-08-01

    Coordinate-based meta-analyses (CBMA) methods, such as Activation Likelihood Estimation (ALE) and Seed-based d Mapping (SDM), have become an invaluable tool for summarizing the findings of voxel-based neuroimaging studies. However, the progressive sophistication of these methods may have concealed two particularities of their statistical tests. Common univariate voxelwise tests (such as the t/z-tests used in SPM and FSL) detect voxels that activate, or voxels that show differences between groups. Conversely, the tests conducted in CBMA test for "spatial convergence" of findings, i.e., they detect regions where studies report "more peaks than in most regions", regions that activate "more than most regions do", or regions that show "larger differences between groups than most regions do". The first particularity is that these tests rely on two spatial assumptions (voxels are independent and have the same probability to have a "false" peak), whose violation may make their results either conservative or liberal, though fortunately current versions of ALE, SDM and some other methods consider these assumptions. The second particularity is that the use of these tests involves an important paradox: the statistical power to detect a given effect is higher if there are no other effects in the brain, whereas lower in presence of multiple effects. Copyright © 2018 Elsevier Inc. All rights reserved.

  16. Detecting Seismic Activity with a Covariance Matrix Analysis of Data Recorded on Seismic Arrays

    NASA Astrophysics Data System (ADS)

    Seydoux, L.; Shapiro, N.; de Rosny, J.; Brenguier, F.

    2014-12-01

    Modern seismic networks record the ground motion continuously all around the world, with very broadband and high-sensitivity sensors. The aim of our study is to apply statistical array-based approaches to the processing of these records. We use methods drawn mainly from random matrix theory to give a statistical description of seismic wavefields recorded at the Earth's surface. We estimate the array covariance matrix and explore the distribution of its eigenvalues, which contains information about the coherency of the sources that generated the studied wavefields. With this approach, we can distinguish between signals generated by isolated deterministic sources and "random" ambient noise. We design an algorithm that uses the distribution of the array covariance matrix eigenvalues to detect signals corresponding to coherent seismic events. We investigate the detection capacity of our method at different scales and in different frequency ranges by applying it to the records of two networks: (1) the seismic monitoring network operating on the Piton de la Fournaise volcano at La Réunion island, composed of 21 receivers with an aperture of ~15 km, and (2) the transportable component of the USArray, composed of ~400 receivers with ~70 km inter-station spacing.
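    A minimal sketch of the covariance-eigenvalue idea: when a coherent source dominates, the covariance spectrum concentrates in a few eigenvalues, which an entropy-based "spectral width" makes visible (synthetic 21-channel data; the detector's actual thresholds and preprocessing are not given in this record):

```python
import numpy as np

def spectral_width(records):
    """Coherence measure from the array covariance matrix: the effective
    number of significant eigenvalues (exp of the eigenvalue entropy).
    Low width suggests a coherent source; high width suggests diffuse
    ambient noise. records: (n_stations, n_samples) detrended traces."""
    cov = records @ records.conj().T / records.shape[1]
    lam = np.clip(np.linalg.eigvalsh(cov), 0, None)
    lam = lam / lam.sum()
    lam = lam[lam > 0]
    return np.exp(-np.sum(lam * np.log(lam)))   # in [1, n_stations]

rng = np.random.default_rng(7)
noise = rng.normal(size=(21, 4096))                          # incoherent field
common = rng.normal(size=4096)
event = 0.2 * noise + np.outer(rng.normal(size=21), common)  # one coherent source
print(spectral_width(noise), spectral_width(event))          # high vs low width
```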

  17. Unicorns or Tiger Woods: are lie detection experts myths or rarities? A response to on lie detection "wizards" by Bond and Uysal.

    PubMed

    O'Sullivan, Maureen

    2007-02-01

    Bond and Uysal (this issue) complain that expert lie detectors identified by O'Sullivan and Ekman (2004) are statistical flukes. They ignore one class of experts we have identified and misrepresent the procedures we use to identify the others. They also question the psychometric validity of the measures and protocol used. Many of their points are addressed in the chapter they criticize. The fruitfulness of the O'Sullivan-Ekman protocol is illustrated with respect to improved identification of expert lie detectors, as well as a replicated pattern of errors made by experts from different professional groups. The statistical arguments offered confuse the theoretical use of the binomial with the empirical use of the normal distribution. Data are provided that may clarify this distinction.

  18. Anomaly detection of turbopump vibration in Space Shuttle Main Engine using statistics and neural networks

    NASA Technical Reports Server (NTRS)

    Lo, C. F.; Wu, K.; Whitehead, B. A.

    1993-01-01

    Statistical and neural network methods have been applied to investigate the feasibility of detecting anomalies in turbopump vibration of the SSME. The anomalies are detected based on the amplitude of peaks of the fundamental and harmonic frequencies in the power spectral density. These data are reduced to the proper format from sensor data measured by strain gauges and accelerometers. Both methods proved feasible for detecting the vibration anomalies. The statistical method requires sufficient data points to establish a reasonable statistical distribution data bank; it is applicable for on-line operation. The neural network method likewise needs a sufficient data basis to train the networks. The testing procedure can be utilized at any time, so long as the characteristics of the components remain unchanged.
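    A sketch of the statistical branch as described: track PSD peak amplitudes at the fundamental and harmonic frequencies, and flag firings whose peaks fall outside a historical mean ± 3σ band (frequencies, thresholds and data are all hypothetical):

```python
import numpy as np
from scipy.signal import welch

def peak_amplitudes(signal, fs, fundamental, n_harmonics=3):
    """PSD values at the fundamental frequency and its first harmonics."""
    f, psd = welch(signal, fs=fs, nperseg=4096)
    return np.array([psd[np.argmin(np.abs(f - fundamental * k))]
                     for k in range(1, n_harmonics + 1)])

def is_anomalous(peaks, baseline_mean, baseline_std, k=3.0):
    """Flag a firing if any tracked peak departs from the historical
    distribution by more than k standard deviations."""
    return np.any(np.abs(peaks - baseline_mean) > k * baseline_std)

fs, f0 = 10_240.0, 570.0          # hypothetical sampling and sync frequency
rng = np.random.default_rng(8)
t = np.arange(0, 4.0, 1 / fs)
signal = np.sin(2 * np.pi * f0 * t) + 0.1 * rng.normal(size=t.size)
peaks = peak_amplitudes(signal, fs, f0)

baseline_mean, baseline_std = peaks, 0.1 * peaks   # hypothetical history
print(is_anomalous(peaks * 1.5, baseline_mean, baseline_std))  # -> True
```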

  19. Transport of nitrogen oxides, carbon monoxide and ozone to the Alpine Global Atmosphere Watch stations Jungfraujoch (Switzerland), Zugspitze and Hohenpeissenberg (Germany), Sonnblick (Austria) and Mt. Krvavec (Slovenia)

    NASA Astrophysics Data System (ADS)

    Kaiser, August; Scheifinger, Helfried; Spangl, Wolfgang; Weiss, Andrea; Gilge, Stefan; Fricke, Wolfgang; Ries, Ludwig; Cemas, Danijel; Jesenovec, Brigita

    The Alpine stations Zugspitze, Hohenpeissenberg, Sonnblick, Jungfraujoch and Mt. Krvavec contribute to the Global Atmosphere Watch Programme (GAW) of the World Meteorological Organization (WMO). The aim of GAW is the surveillance of the large-scale chemical composition of the atmosphere. Thus, the detection of air pollutant transport from regional sources is of particular interest. In this paper, the origin of NOx (measured with a photo-converter), CO and O3 at the Alpine GAW stations is studied by trajectory residence time statistics. Although these methods originated during the early 1980s, no comprehensive study of different atmospheric trace gases measured simultaneously at several background observatories in the Alps had been conducted up to the present. The main NOx source regions detected by the trajectory statistics are the northwest of Europe and the region covering East Germany, the Czech Republic and southeast Poland, whereas the main CO source areas are the central, northeastern and eastern parts of Europe, with some gradient from low to high latitudes. Subsiding air masses from the west and southwest are relatively poor in NOx and CO. The statistics for ozone show strong seasonal effects. Near-ground air masses are poor in ozone in winter but rich in ozone in summer. The main source of high ozone concentrations in winter is air masses that subside from higher elevations, often enhanced by foehn effects at Hohenpeissenberg. During summer, the Mediterranean constitutes an important additional source of high ozone concentrations. Especially during winter, large differences between Hohenpeissenberg and the higher elevated stations are found: Hohenpeissenberg is frequently within the inversion, whereas the higher elevated stations are above it. Jungfraujoch is the only station where the statistics detect an influence of air rich in CO and NOx from the Po Basin.

  20. Cardiorespiratory dynamics measured from continuous ECG monitoring improves detection of deterioration in acute care patients: A retrospective cohort study

    PubMed Central

    Clark, Matthew T.; Calland, James Forrest; Enfield, Kyle B.; Voss, John D.; Lake, Douglas E.; Moorman, J. Randall

    2017-01-01

    Background Charted vital signs and laboratory results represent intermittent samples of a patient’s dynamic physiologic state and have been used to calculate early warning scores to identify patients at risk of clinical deterioration. We hypothesized that the addition of cardiorespiratory dynamics measured from continuous electrocardiography (ECG) monitoring to intermittently sampled data improves the predictive validity of models trained to detect clinical deterioration prior to intensive care unit (ICU) transfer or unanticipated death. Methods and findings We analyzed 63 patient-years of ECG data from 8,105 acute care patient admissions at a tertiary care academic medical center. We developed models to predict deterioration resulting in ICU transfer or unanticipated death within the next 24 hours using either vital signs, laboratory results, or cardiorespiratory dynamics from continuous ECG monitoring and also evaluated models using all available data sources. We calculated the predictive validity (C-statistic), the net reclassification improvement, and the probability of achieving the difference in likelihood ratio χ2 for the additional degrees of freedom. The primary outcome occurred 755 times in 586 admissions (7%). We analyzed 395 clinical deteriorations with continuous ECG data in the 24 hours prior to an event. Using only continuous ECG measures resulted in a C-statistic of 0.65, similar to models using only laboratory results and vital signs (0.63 and 0.69 respectively). Addition of continuous ECG measures to models using conventional measurements improved the C-statistic by 0.01 and 0.07; a model integrating all data sources had a C-statistic of 0.73 with categorical net reclassification improvement of 0.09 for a change of 1 decile in risk. The difference in likelihood ratio χ2 between integrated models with and without cardiorespiratory dynamics was 2158 (p value: <0.001). Conclusions Cardiorespiratory dynamics from continuous ECG monitoring detect clinical deterioration in acute care patients and improve performance of conventional models that use only laboratory results and vital signs. PMID:28771487
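    The C-statistic reported throughout is the area under the ROC curve; it can be computed directly from ranks (equivalently, from the Mann-Whitney U statistic). A small sketch with hypothetical risk scores:

```python
import numpy as np

def c_statistic(scores, outcomes):
    """C-statistic (AUC): the probability that a randomly chosen event
    receives a higher risk score than a randomly chosen non-event,
    computed via the Mann-Whitney U statistic (no tie handling)."""
    scores = np.asarray(scores, float)
    outcomes = np.asarray(outcomes, bool)
    ranks = np.argsort(np.argsort(scores)) + 1.0     # 1-based ranks
    n_pos, n_neg = outcomes.sum(), (~outcomes).sum()
    u = ranks[outcomes].sum() - n_pos * (n_pos + 1) / 2
    return u / (n_pos * n_neg)

rng = np.random.default_rng(9)
risk = np.concatenate([rng.normal(0.6, 0.2, 70), rng.normal(0.4, 0.2, 930)])
event = np.concatenate([np.ones(70, bool), np.zeros(930, bool)])
print(f"C-statistic = {c_statistic(risk, event):.2f}")
```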

  2. Technology-based counseling in the management of weight and lifestyles of obese or overweight children and adolescents: A descriptive systematic literature review.

    PubMed

    Kaakinen, Pirjo; Kyngäs, Helvi; Kääriäinen, Maria

    2018-03-01

The number of overweight and obese children and adolescents has increased worldwide. Obese children and adolescents need counseling interventions, including technology-based methods, to help them manage their weight by changing their lifestyles. The aim was to describe technology-based counseling interventions that support obese or overweight children and adolescents in changing their weight or lifestyle. The design was a descriptive systematic literature review. A literature search was conducted using the Cinahl, Medline, PsycINFO, and Medic databases in September 2010 and updated in January 2015. Predefined inclusion criteria were used for the search. After a quality assessment, 28 studies were included in the data extraction. No statistically significant difference in BMI was detected between the intervention and control groups. However, some studies reported decreases in BMI and statistically significant differences in fruit and vegetable consumption. In two studies, differences in physical activity were detected between the intervention and control groups, but in eight studies, the difference was not significant. Goal setting and feedback on progress supported physical activity and changes in diet. This study identifies available technology interventions for obese or overweight children and adolescents. It seems that technology-based counseling interventions may encourage obese and overweight children and adolescents to pursue a healthier lifestyle.

  3. Impact of molecular mechanisms, including deletion size, on Prader-Willi syndrome phenotype: study of 75 patients.

    PubMed

    Varela, M C; Kok, F; Setian, N; Kim, C A; Koiffmann, C P

    2005-01-01

Prader-Willi syndrome (PWS) can result from a 15q11-q13 paternal deletion, maternal uniparental disomy (UPD), or imprinting mutations. We describe here the phenotypic variability detected in 51 patients with different types of deletions and 24 patients with UPD. Although no statistically significant differences could be demonstrated between the two main types of PWS deletion patients, type I (BP1-BP3) patients acquired speech later than type II (BP2-BP3) patients. Comparing the clinical pictures of our patients with UPD with those with deletions, we found that UPD children presented with lower birth length and started walking earlier, and that deletion patients presented with a much higher incidence of seizures than UPD patients. In addition, the mean maternal age in the UPD group was higher than in the deletion group. No statistically significant differences could be demonstrated between the deletion and the UPD groups with respect to any of the major features of PWS. In conclusion, our study did not detect significant phenotypic differences between type I and type II PWS deletion patients, but it did demonstrate that seizures were six times more common in patients with a deletion than in those with UPD.

  4. Psoriasis and wound healing outcomes: A retrospective cohort study examining wound complications and antibiotic use.

    PubMed

    Young, Paulina M; Parsi, Kory K; Schupp, Clayton W; Armstrong, April W

    2017-11-15

Little is known about wound healing in psoriasis. We performed a cohort study examining differences in wound healing complications between patients with and without psoriasis. Psoriasis patients with traumatic wounds were matched 1:3 to non-psoriasis patients with traumatic wounds based on age, gender, and body mass index (BMI). We examined the incidence of wound complications, including infection, necrosis, and hematoma, as well as incident antibiotic use within three months following diagnosis of a traumatic wound. The study included 164 patients with traumatic wounds, comprising 41 patients with psoriasis matched to 123 patients without psoriasis. No statistically significant differences were detected in the incidence of overall wound complications between wound patients with psoriasis and wound patients without psoriasis (14.6% versus 13.0%, HR 1.18, CI 0.39-3.56). After adjustment for diabetes, peripheral vascular disease, and smoking, no statistically significant differences were detected in the incidence of overall wound complications between patients with and without psoriasis (HR 1.11, CI 0.34-3.58). Specifically, the adjusted rates of antibiotic use were not significantly different between those with and without psoriasis (HR 0.65, CI 0.29-1.46). The incidence of wound complications following traumatic wounds of the skin was found to be similar between patients with and without psoriasis.

  5. Sequential detection of influenza epidemics by the Kolmogorov-Smirnov test

    PubMed Central

    2012-01-01

Background Influenza is a well known and common human respiratory infection, causing significant morbidity and mortality every year. Despite influenza variability, fast and reliable outbreak detection is required for health resource planning. Clinical health records, as published by the Diagnosticat database in Catalonia, host useful data for probabilistic detection of influenza outbreaks. Methods This paper proposes a statistical method to detect influenza epidemic activity. Non-epidemic incidence rates are modeled against the exponential distribution, and the maximum likelihood estimate for the decaying factor λ is calculated. The sequential detection algorithm updates the parameter as new data become available. Binary epidemic detection of weekly incidence rates is assessed by a Kolmogorov-Smirnov test on the absolute difference between the empirical and the cumulative distribution function of the estimated exponential distribution with significance level 0 ≤ α ≤ 1. Results The main advantage with respect to other approaches is the adoption of a statistically meaningful test, which provides an indicator of epidemic activity with an associated probability. The detection algorithm was initiated with parameter λ0 = 3.8617 estimated from the training sequence (corresponding to non-epidemic incidence rates of the 2008-2009 influenza season) and sequentially updated. The Kolmogorov-Smirnov test detected the following weeks as epidemic for each influenza season: weeks 50-10 (2008-2009 season), weeks 38-50 (2009-2010 season), weeks 50-9 (2010-2011 season), and weeks 3-12 of the then-current 2011-2012 season. Conclusions Real medical data were used to assess the validity of the approach, as well as to construct a realistic statistical model of weekly influenza incidence rates in non-epidemic periods. For the tested data, the results confirmed the ability of the algorithm to detect the start and the end of epidemic periods. In general, the proposed test could be applied to other data sets to quickly detect influenza outbreaks. The sequential structure of the test makes it suitable for implementation in many platforms at a low computational cost, without requiring large data sets to be stored. PMID:23031321
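
    The detection scheme can be sketched in a few lines, assuming weekly non-epidemic incidence rates follow an exponential distribution whose scale (the inverse of λ) is re-estimated by maximum likelihood, which for the exponential is simply the sample mean. The simulated epidemic burst and the α level below are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
scale0 = 1 / 3.8617                            # analogous to the trained lambda_0
history = list(rng.exponential(scale0, 52))    # non-epidemic training weeks

def epidemic_flag(window, scale, alpha=0.05):
    """KS test of recent weekly rates against the fitted exponential CDF."""
    stat, p = stats.kstest(window, "expon", args=(0, scale))
    return p < alpha, p

# Sequential update reduces to appending new non-epidemic weeks and
# re-taking the mean, since the exponential MLE of the scale is the mean.
burst = rng.exponential(10 * scale0, 6)        # simulated epidemic weeks
flag, p = epidemic_flag(burst, np.mean(history))
print("epidemic:", flag, "p =", round(p, 4))
```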

  6. [Gender-sensitive epidemiological data analysis: methodological aspects and empirical outcomes. Illustrated by a health reporting example].

    PubMed

    Jahn, I; Foraita, R

    2008-01-01

In Germany, gender-sensitive approaches are part of guidelines for good epidemiological practice as well as health reporting, and they are increasingly demanded to implement the gender mainstreaming strategy in research funding by the federal and state governments. This paper focuses on methodological aspects of data analysis; the empirical example is the health report of Bremen, a population-based cross-sectional study. Health reporting requires analysis and reporting methods that can uncover sex/gender aspects of a question, on the one hand, and consider how results can be communicated adequately, on the other. The core question is: what consequences does the way the category sex/gender is included in different statistical analyses have on the results, when the aim is to identify potential target groups? As evaluation methods, logistic regressions and a two-stage procedure were conducted exploratively. The two-stage procedure combines graphical models with CHAID decision trees and allows complex results to be visualized. Both methods were applied stratified by sex/gender as well as adjusted for sex/gender, and the results were compared. As a result, only stratified analyses are able to detect differences between the sexes and within the sex/gender groups when no prior knowledge is available. Adjusted analyses can detect sex/gender differences only if interaction terms have been included in the model. Results are discussed from a statistical-epidemiological perspective as well as in the context of health reporting. In conclusion, whether a statistical method is gender-sensitive can only be judged for a concrete research question under known conditions. Often, an appropriate statistical procedure can be chosen after conducting separate analyses for women and men. Future gender studies require innovative study designs as well as conceptual clarity with regard to the biological and sociocultural elements of the category sex/gender.
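
    The central methodological point, that a sex-adjusted model misses group-specific effects unless an interaction term is included, can be illustrated on simulated data; the variable names below are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({"sex": rng.integers(0, 2, n), "risk": rng.normal(size=n)})
beta = np.where(df["sex"] == 0, 1.0, 0.2)      # true effect differs by sex
p = 1 / (1 + np.exp(-(-1 + beta * df["risk"])))
df["outcome"] = rng.binomial(1, p)

adjusted = smf.logit("outcome ~ risk + sex", data=df).fit(disp=0)
interact = smf.logit("outcome ~ risk * sex", data=df).fit(disp=0)
print(adjusted.params)   # the pooled risk effect hides the sex difference
print(interact.params)   # the risk:sex term recovers it
```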

  7. [Expression of mRNA and protein of p38, Osx, PI3K and Akt1 in rat bone with chronic fluorosis].

    PubMed

    Yu, Yan-ni; Yang, Dan; Zhu, Hai-zhen; Deng, Chao-nan; Guan, Zhi-zhong

    2012-09-01

To investigate the expression of mRNA and protein of p38, Osx, PI3K, and Akt1 in rat bone with chronic fluorosis. Dental fluorosis was assessed, and the fluoride contents in urine and bone were detected with a fluoride ion-selective electrode. The morphologic changes and ultrastructure of rat bone were observed by light and electron microscopy. The protein and mRNA expressions of p38, Osx, PI3K, and Akt1 were detected by immunohistochemistry and real-time PCR, respectively. The contents of BALP and BGP in serum were detected by ELISA. The rate of dental fluorosis in the fluorosis rats was increased, and the fluoride contents in bone and urine of the fluorosis rats were increased compared with the control group; the differences were statistically significant (P < 0.05). The bone trabecular thickness and density and the thickness of the bone cortex in fluorosis rats were remarkably increased and the space between bone trabeculae was reduced, in accordance with the matching morphometric indices; the differences were statistically significant (P < 0.05) as compared with the control rats. The contents of BALP [(54.61 ± 2.27) U/L] and BGP [(2.38 ± 0.16) µg/L] in the fluoride groups were higher than those in the control group; the difference was statistically significant (P < 0.05). Ultrastructurally, broadening of the osseous lacunae was observed, along with reduced protuberances of the osteocytes, unclear organelle structure, pyknosis, thickening of the karyotheca, and margination of chromatin. Compared with the control rats, the protein and mRNA expressions of p38, Osx, PI3K, and Akt1 were higher in the fluorosis rats, and the differences were statistically significant (P < 0.05). No expression of p38, Osx, PI3K, or Akt1 was detected in the osteocytes of the fluorosis rats. The over-expression of p38, Osx, PI3K, and Akt1 in the bone tissue of fluorosis rats may be related to the accumulation of fluorine in the body. The bone injury occurs mainly during the stages of differentiation and proliferation. Upregulation of the p38 MAPK and PI3K/Akt1 signaling pathways may be involved in the pathogenesis of bone injury caused by fluoride.

  8. Detection of nonlinear transfer functions by the use of Gaussian statistics

    NASA Technical Reports Server (NTRS)

    Sheppard, J. G.

    1972-01-01

    The possibility of using on-line signal statistics to detect electronic equipment nonlinearities is discussed. The results of an investigation using Gaussian statistics are presented, and a nonlinearity test that uses ratios of the moments of a Gaussian random variable is developed and discussed. An outline for further investigation is presented.
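
    A hedged sketch of the moment-ratio idea: for a zero-mean Gaussian input, E[x⁴]/E[x²]² = 3, so departures of this ratio at a device's output hint at a nonlinear transfer function. The exact test in the report may differ; the nonlinearities below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=200_000)             # Gaussian input signal

def moment_ratio(sig):
    """E[s^4] / E[s^2]^2; equals 3 for a zero-mean Gaussian."""
    return np.mean(sig**4) / np.mean(sig**2) ** 2

print("linear gain:    ", moment_ratio(2.0 * x))     # stays near 3.0
print("soft clipping:  ", moment_ratio(np.tanh(x)))  # below 3: flattened tails
print("squaring device:", moment_ratio(x**2))        # far above 3
```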

  9. The use of the temporal scan statistic to detect methicillin-resistant Staphylococcus aureus clusters in a community hospital.

    PubMed

    Faires, Meredith C; Pearl, David L; Ciccotelli, William A; Berke, Olaf; Reid-Smith, Richard J; Weese, J Scott

    2014-07-08

In healthcare facilities, conventional surveillance techniques using rule-based guidelines may result in under- or over-reporting of methicillin-resistant Staphylococcus aureus (MRSA) outbreaks, as these guidelines are generally unvalidated. The objectives of this study were to investigate the utility of the temporal scan statistic for detecting MRSA clusters, validate clusters using molecular techniques and hospital records, and determine significant differences in the rate of MRSA cases using regression models. Patients admitted to a community hospital between August 2006 and February 2011, and identified with MRSA more than 48 hours following hospital admission, were included in this study. Between March 2010 and February 2011, MRSA specimens were obtained for spa typing. MRSA clusters were investigated using a retrospective temporal scan statistic. Tests were conducted on a monthly scale, and significant clusters were compared to MRSA outbreaks identified by hospital personnel. Associations between the rate of MRSA cases and the variables year, month, and season were investigated using a negative binomial regression model. During the study period, 735 MRSA cases were identified and 167 MRSA isolates were spa typed. Nine different spa types were identified, with spa type 2/t002 being the most prevalent (88.6%). The temporal scan statistic identified significant MRSA clusters at the hospital (n=2), service (n=16), and ward (n=10) levels (P ≤ 0.05). Seven clusters were concordant with nine MRSA outbreaks identified by hospital staff. Of the remaining clusters, seven events may have been equivalent to true outbreaks and six clusters demonstrated possible transmission events. The regression analysis indicated that the years 2009-2011, compared to 2006, and the months March and April, compared to January, were associated with an increase in the rate of MRSA cases (P ≤ 0.05). The application of the temporal scan statistic identified several MRSA clusters that were not detected by hospital personnel. The identification of specific years and months with increased MRSA rates may be attributable to several hospital-level factors, including the presence of other pathogens. Within hospitals, adding the temporal scan statistic to standard surveillance techniques gives healthcare workers a valuable tool to evaluate surveillance strategies and aid in the identification of MRSA clusters.
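
    The regression component can be sketched as a negative binomial GLM of monthly case counts on year and month factors; the simulated counts below are illustrative stand-ins, not the hospital's data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
months = pd.date_range("2006-08-01", "2011-02-01", freq="MS")
df = pd.DataFrame({
    "year": months.year.astype(str),
    "month": months.strftime("%b"),
    "cases": rng.negative_binomial(5, 0.4, len(months)),  # simulated counts
})

model = smf.glm("cases ~ C(year) + C(month)", data=df,
                family=sm.families.NegativeBinomial()).fit()
print(np.exp(model.params))   # rate ratios vs. the reference year/month
```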

  10. Generating survival times to simulate Cox proportional hazards models with time-varying covariates.

    PubMed

    Austin, Peter C

    2012-12-20

    Simulations and Monte Carlo methods serve an important role in modern statistical research. They allow for an examination of the performance of statistical procedures in settings in which analytic and mathematical derivations may not be feasible. A key element in any statistical simulation is the existence of an appropriate data-generating process: one must be able to simulate data from a specified statistical model. We describe data-generating processes for the Cox proportional hazards model with time-varying covariates when event times follow an exponential, Weibull, or Gompertz distribution. We consider three types of time-varying covariates: first, a dichotomous time-varying covariate that can change at most once from untreated to treated (e.g., organ transplant); second, a continuous time-varying covariate such as cumulative exposure at a constant dose to radiation or to a pharmaceutical agent used for a chronic condition; third, a dichotomous time-varying covariate with a subject being able to move repeatedly between treatment states (e.g., current compliance or use of a medication). In each setting, we derive closed-form expressions that allow one to simulate survival times so that survival times are related to a vector of fixed or time-invariant covariates and to a single time-varying covariate. We illustrate the utility of our closed-form expressions for simulating event times by using Monte Carlo simulations to estimate the statistical power to detect as statistically significant the effect of different types of binary time-varying covariates. This is compared with the statistical power to detect as statistically significant a binary time-invariant covariate. Copyright © 2012 John Wiley & Sons, Ltd.
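
    For the first covariate type, a binary covariate that switches once from untreated to treated at time t0 under an exponential baseline hazard, the inversion has a simple closed form: a sketch, with illustrative parameter values (`lam` is the baseline rate, `beta` the log hazard ratio after the switch).

```python
import numpy as np

rng = np.random.default_rng(5)

def sim_time(lam, beta, t0):
    """Invert the cumulative hazard for an exponential baseline (rate lam)
    and a binary covariate switching on at t0 with log hazard ratio beta."""
    e = rng.exponential(1.0)              # unit-rate exponential draw
    if e < lam * t0:                      # event occurs before the switch
        return e / lam
    return t0 + (e - lam * t0) / (lam * np.exp(beta))

switch_times = rng.uniform(0, 5, 10_000)
times = np.array([sim_time(0.1, np.log(2.0), t0) for t0 in switch_times])
print("median simulated event time:", round(float(np.median(times)), 2))
```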

  11. [Establishment of animal model for Pneumocystis carinii and study on etiological and molecular biological detection technology].

    PubMed

    Tian, Li-guang; Ai, Lin; Chu, Yan-hong; Wu, Xiu-ping; Cai, Yu-chun; Chen, Zhuo; Chen, Shao-hong; Chen, Jia-xu

    2015-04-01

To establish an animal model of Pneumocystis pneumonia (PCP) and to study etiological and molecular biological techniques for PCP detection. SD and Wistar rats were randomly divided into experimental and control groups. The animals in the experimental group were immunosuppressed by subcutaneous injection of dexamethasone, 2 mg per rat twice a week, while those in the control group were injected with physiological saline in the same way. After 8 weeks of induction, all the rats were killed and their bronchoalveolar lavage fluid (BALF) and lung tissues were collected for smear preparation and microscopic examination. Meanwhile, the BALF samples were tested by PCR, and the products were sequenced and compared with rat-derived PCP sequences in GenBank. A total of 34 samples of lung tissue and BALF were examined. Etiological detection showed infection rates of 29.2% (7/24) in the experimental group and 0 in the control group. In the experimental group, the infection rates of SD and Wistar rats were 25.0% (3/12) and 33.3% (4/12), respectively; the difference between them was not statistically significant (P = 0.31). The positive detection rates of the lung smears and BALF from SD rats in the experimental group were 25.0% (3/12) and 16.7% (2/12), respectively, while those of Wistar rats in the experimental group were 33.3% (4/12) and 16.7% (2/12), respectively; the differences were not statistically significant (P = 0.34, 0.24). A total of 28 BALF samples were tested by PCR; the positive detection rates in the experimental and control groups were 91.7% (26/28) and 0, respectively. Sequence analysis of the PCR products showed 100% homology with the genes of rat-derived PCP in GenBank (JX499145, GU133622 and EF646865). An animal model of PCP can be established by subcutaneous injection of dexamethasone, and SD and Wistar rats perform equivalently as models. The PCR method is suitable for PCP detection at the early stage of infection, whereas etiological detection has a high miss rate and is not a suitable option.

  12. Bayesian analyses of time-interval data for environmental radiation monitoring.

    PubMed

    Luo, Peng; Sharp, Julia L; DeVol, Timothy A

    2013-01-01

    Time-interval (time difference between two consecutive pulses) analysis based on the principles of Bayesian inference was investigated for online radiation monitoring. Using experimental and simulated data, Bayesian analysis of time-interval data [Bayesian (ti)] was compared with Bayesian and a conventional frequentist analysis of counts in a fixed count time [Bayesian (cnt) and single interval test (SIT), respectively]. The performances of the three methods were compared in terms of average run length (ARL) and detection probability for several simulated detection scenarios. Experimental data were acquired with a DGF-4C system in list mode. Simulated data were obtained using Monte Carlo techniques to obtain a random sampling of the Poisson distribution. All statistical algorithms were developed using the R Project for statistical computing. Bayesian analysis of time-interval information provided a similar detection probability as Bayesian analysis of count information, but the authors were able to make a decision with fewer pulses at relatively higher radiation levels. In addition, for the cases with very short presence of the source (< count time), time-interval information is more sensitive to detect a change than count information since the source data is averaged by the background data over the entire count time. The relationships of the source time, change points, and modifications to the Bayesian approach for increasing detection probability are presented.
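
    A sketch of the conjugate Bayesian update underlying time-interval analysis, assuming exponential inter-pulse intervals and a Gamma prior on the count rate; the prior and the decision quantity below are illustrative choices, not necessarily the authors' exact formulation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
background = 5.0                            # known background rate (cps)
intervals = rng.exponential(1 / 12.0, 20)   # 20 pulses from an elevated rate

a0, b0 = 1.0, 1.0 / background   # Gamma(shape, rate) prior, mean = background
a_post = a0 + len(intervals)     # conjugate update: shape += number of pulses
b_post = b0 + intervals.sum()    # rate parameter += total elapsed time

posterior = stats.gamma(a=a_post, scale=1 / b_post)
print("P(rate > background):", 1 - posterior.cdf(background))
```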

  13. Mite fauna and fungal flora in house dust from homes of asthmatic children.

    PubMed

    Ishii, A; Takaoka, M; Ichinoe, M; Kabasawa, Y; Ouchi, T

    1979-12-01

Mite fauna and fungal flora in house dust from the homes of asthmatic children with positive and negative skin tests to house dust allergen and from non-asthmatic controls were examined. There was no conspicuous difference in mite species distribution among the three groups. Pyroglyphid mites dominated the mite fauna in house dust, more than half of them being Dermatophagoides: D. pteronyssinus and D. farinae. There was no statistically significant difference in numbers between the two species, and either species could dominate depending on the conditions of the individual houses. The average number of acarina in 0.5 g of fine dust did not differ statistically among the three groups; however, mite number per square meter of floor differed between patients with positive and negative skin tests. The results suggest that house-cleaning might influence the possible sensitization of children. The distribution of mould fungal genera in house dust was largely similar to that of airborne fungi. The average number of fungal colonies detected in 0.5 g of dust did not differ statistically among the three groups. Wallemia, with its minute spores, may cause sensitization but has so far been insufficiently investigated.

  14. ROC evaluation of SPECT myocardial lesion detectability with and without single iteration non-uniform Chang attenuation compensation using an anthropomorphic female phantom

    NASA Astrophysics Data System (ADS)

    Jang, Sunyoung; Jaszczak, R. J.; Tsui, B. M. W.; Metz, C. E.; Gilland, D. R.; Turkington, T. G.; Coleman, R. E.

    1998-08-01

The purpose of this work was to evaluate lesion detectability with and without nonuniform attenuation compensation (AC) in myocardial perfusion SPECT imaging in women using an anthropomorphic phantom and receiver operating characteristic (ROC) methodology. Breast attenuation causes artifacts in reconstructed images and may increase the difficulty of diagnosis in myocardial perfusion imaging in women. The null hypothesis tested in the ROC study was that nonuniform AC does not change lesion detectability in myocardial perfusion SPECT imaging in women. The authors used a filtered backprojection (FBP) reconstruction algorithm and Chang's (1978) single-iteration method for AC. In conclusion, with the authors' proposed myocardial defect model, nuclear medicine physicians demonstrated no significant difference for the detection of the anterior wall defect; however, a greater accuracy for the detection of the inferior wall defect was observed without nonuniform AC than with it (P-value = 0.0034). Medical physicists did not demonstrate any statistically significant difference in defect detection accuracy with or without nonuniform AC in the female phantom.

  15. A Multiscale pipeline for the search of string-induced CMB anisotropies

    NASA Astrophysics Data System (ADS)

    Vafaei Sadr, A.; Movahed, S. M. S.; Farhang, M.; Ringeval, C.; Bouchet, F. R.

    2018-03-01

We propose a multiscale edge-detection algorithm to search for the Gott-Kaiser-Stebbins imprints of a cosmic string (CS) network on the cosmic microwave background (CMB) anisotropies. Curvelet decomposition and an extended Canny algorithm are used to enhance string detectability. Various statistical tools are then applied to quantify the deviation of CMB maps having a CS contribution with respect to pure Gaussian anisotropies of inflationary origin. These statistical measures include the one-point probability density function, the weighted two-point correlation function (TPCF) of the anisotropies, the unweighted TPCF of the peaks and of the up-crossing map, as well as their cross-correlation. We use this algorithm on a hundred simulated Nambu-Goto CMB flat-sky maps, covering approximately 10 per cent of the sky, and for different string tensions Gμ. On noiseless sky maps with an angular resolution of 0.9 arcmin, we show that our pipeline detects CSs with tensions as low as Gμ ≳ 4.3 × 10⁻¹⁰. At the same resolution, but with a noise level typical of a CMB-S4 phase II experiment, the detection threshold would rise to Gμ ≳ 1.2 × 10⁻⁷.

  16. A new 2D segmentation method based on dynamic programming applied to computer aided detection in mammography.

    PubMed

    Timp, Sheila; Karssemeijer, Nico

    2004-05-01

Mass segmentation plays a crucial role in computer-aided diagnosis (CAD) systems for classification of suspicious regions as normal, benign, or malignant. In this article we present a robust and automated segmentation technique, based on dynamic programming, to segment mass lesions from surrounding tissue. In addition, we propose an efficient algorithm to guarantee that resulting contours are closed. The segmentation method based on dynamic programming was quantitatively compared with two other automated segmentation methods (region growing and the discrete contour model) on a dataset of 1210 masses. For each mass, an overlap criterion was calculated to determine the similarity with manual segmentation. The mean overlap percentage for dynamic programming was 0.69; for the other two methods it was 0.60 and 0.59, respectively. The difference in overlap percentage was statistically significant. To study the influence of the segmentation method on the performance of a CAD system, two additional experiments were carried out. The first experiment studied the detection performance of the CAD system for the different segmentation methods. Free-response receiver operating characteristic analysis showed that the detection performance was nearly identical for the three segmentation methods. In the second experiment, the ability of the classifier to discriminate between malignant and benign lesions was studied. For region-based evaluation, the area Az under the receiver operating characteristic curve was 0.74 for dynamic programming, 0.72 for the discrete contour model, and 0.67 for region growing. The difference in Az values obtained by the dynamic programming method and region growing was statistically significant. The differences between the other methods were not significant.
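
    An overlap criterion of this kind can be sketched with the Dice coefficient on boolean masks; the paper's exact definition may differ, and the masks below are illustrative.

```python
import numpy as np

def dice(a, b):
    """Dice overlap between two boolean masks (1.0 = identical)."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

auto = np.zeros((64, 64), bool)
auto[16:40, 16:40] = True          # automated contour, filled
manual = np.zeros((64, 64), bool)
manual[20:44, 20:44] = True        # manual contour, filled
print("overlap:", round(dice(auto, manual), 3))
```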

  17. Statistical evaluation of variables affecting occurrence of hydrocarbons in aquifers used for public supply, California

    USGS Publications Warehouse

    Landon, Matthew K.; Burton, Carmen A.; Davis, Tracy A.; Belitz, Kenneth; Johnson, Tyler D.

    2014-01-01

    The variables affecting the occurrence of hydrocarbons in aquifers used for public supply in California were assessed based on statistical evaluation of three large statewide datasets; gasoline oxygenates also were analyzed for comparison with hydrocarbons. Benzene is the most frequently detected (1.7%) compound among 17 hydrocarbons analyzed at generally low concentrations (median detected concentration 0.024 μg/l) in groundwater used for public supply in California; methyl tert-butyl ether (MTBE) is the most frequently detected (5.8%) compound among seven oxygenates analyzed (median detected concentration 0.1 μg/l). At aquifer depths used for public supply, hydrocarbons and MTBE rarely co-occur and are generally related to different variables; in shallower groundwater, co-occurrence is more frequent and there are similar relations to the density or proximity of potential sources. Benzene concentrations are most strongly correlated with reducing conditions, regardless of groundwater age and depth. Multiple lines of evidence indicate that benzene and other hydrocarbons detected in old, deep, and/or brackish groundwater result from geogenic sources of oil and gas. However, in recently recharged (since ~1950), generally shallower groundwater, higher concentrations and detection frequencies of benzene and hydrocarbons were associated with a greater proportion of commercial land use surrounding the well, likely reflecting effects of anthropogenic sources, particularly in combination with reducing conditions.

  18. Maintenance energy requirements of odor detection, explosive detection and human detection working dogs.

    PubMed

    Mullis, Rebecca A; Witzel, Angela L; Price, Joshua

    2015-01-01

Despite their important role in security, little is known about the energy requirements of working dogs such as odor, explosive, and human detection dogs. Previous researchers have evaluated the energy requirements of individual canine breeds as well as dogs in exercise roles such as sprint racing. This study is the first to evaluate the energy requirements of working dogs trained in odor, explosive, and human detection. This retrospective study evaluated twenty adult dogs that maintained consistent body weights over a six-month period. During this time, the average energy consumption was [Formula: see text], or two times the calculated resting energy requirement ([Formula: see text]). No statistical differences were found by breed, age, or sex, but a statistically significant association (p = 0.0033, R-square = 0.0854) was seen between the number of searches a dog performs and its energy requirement. Based on this study's population, it appears that working dogs have maintenance energy requirements similar to the 1974 National Research Council's (NRC) maintenance energy requirement of [Formula: see text] (National Research Council (NRC), 1974) and the [Formula: see text] reported for young laboratory beagles (Rainbird & Kienzle, 1990). Additional research is needed to determine whether these data can be applied to all odor, explosive, and human detection dogs and to determine whether other types of working dogs (tracking, search and rescue, etc.) have similar energy requirements.

  19. Nonparametric rank regression for analyzing water quality concentration data with multiple detection limits.

    PubMed

    Fu, Liya; Wang, You-Gan

    2011-02-15

Environmental data usually include measurements, such as water quality data, which fall below detection limits because of limitations of the instruments or of certain analytical methods used. The fact that some responses are not detected needs to be properly taken into account in the statistical analysis of such data. However, it is well known that analyzing a data set with detection limits is challenging, and we often have to rely on traditional parametric methods or simple imputation methods. Distributional assumptions can lead to biased inference, and justification of distributions is often not possible when the data are correlated and there is a large proportion of data below detection limits. The extent of bias is usually unknown. To draw valid conclusions and hence provide useful advice for environmental management authorities, it is essential to develop and apply an appropriate statistical methodology. This paper proposes rank-based procedures for analyzing non-normally distributed data collected at different sites over a period of time in the presence of multiple detection limits. To take account of temporal correlations within each site, we propose an optimal linear combination of estimating functions and apply the induced smoothing method to reduce the computational burden. Finally, we apply the proposed method to water quality data collected in the Susquehanna River Basin in the United States of America, which clearly demonstrates the advantages of the rank regression models.

  20. Relative risk estimates from spatial and space-time scan statistics: Are they biased?

    PubMed Central

    Prates, Marcos O.; Kulldorff, Martin; Assunção, Renato M.

    2014-01-01

The purely spatial and space-time scan statistics have been successfully used by many scientists to detect and evaluate geographical disease clusters. Although the scan statistic has high power in correctly identifying a cluster, no study has considered the estimates of the cluster relative risk in the detected cluster. In this paper we evaluate whether there is any bias in these estimated relative risks. Intuitively, one may expect the estimated relative risks to have an upward bias, since the scan statistic cherry-picks high-rate areas to include in the cluster. We show that this intuition is correct for clusters with low statistical power, but with medium to high power the bias becomes negligible. The same behaviour is not observed for the prospective space-time scan statistic, where there is an increasingly conservative downward bias of the relative risk as the power to detect the cluster increases. PMID:24639031
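
    The cherry-picking intuition can be checked with a small Monte Carlo: with no true cluster, estimating the relative risk in whichever region happens to have the highest observed rate inflates the estimate. This is a simplified stand-in for the scan statistic, not the authors' simulation.

```python
import numpy as np

rng = np.random.default_rng(7)
n_regions, expected, reps = 50, 20.0, 2000   # equal risk everywhere (RR = 1)

estimates = []
for _ in range(reps):
    counts = rng.poisson(expected, n_regions)
    k = counts.argmax()                      # region picked by the "scan"
    inside = counts[k] / expected
    outside = (counts.sum() - counts[k]) / (expected * (n_regions - 1))
    estimates.append(inside / outside)

print("true RR = 1; mean estimated RR in the detected region:",
      round(float(np.mean(estimates)), 3))   # noticeably above 1
```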

  1. A hint of Poincaré dodecahedral topology in the WMAP first year sky map

    NASA Astrophysics Data System (ADS)

    Roukema, B. F.; Lew, B.; Cechowska, M.; Marecki, A.; Bajtlik, S.

    2004-09-01

It has recently been suggested by Luminet et al. (2003) that the WMAP data are better matched by a geometry in which the topology is that of a Poincaré dodecahedral model and the curvature is "slightly" spherical, rather than by an (effectively) infinite flat model. A general back-to-back matched circles analysis by Cornish et al. (2003) for angular radii in the range 25-90°, using a correlation statistic for signal detection, failed to support this. In this paper, a matched circles analysis specifically designed to detect dodecahedral patterns of matched circles is performed over angular radii in the range 1-40° on the one-year WMAP data. Signal detection is attempted via a correlation statistic and an rms difference statistic. Extreme value distributions of these statistics are calculated for one orientation of the 36° "screw motion" (Clifford translation) when matching circles, for the opposite screw motion, and for a zero (unphysical) rotation. The most correlated circles appear for circle radii of α = 11 ± 1°, for the left-handed screw motion, but not for the right-handed one, nor for the zero rotation. The favoured six dodecahedral face centres in galactic coordinates are (l, b) ≈ (252°, +65°), (51°, +51°), (144°, +38°), (207°, +10°), (271°, +3°), (332°, +25°) and their opposites. The six pairs of circles independently each favour a circle angular radius of 11 ± 1°. The temperature fluctuations along the matched circles are plotted and are clearly highly correlated. Whether these six circle pairs centred on dodecahedral faces match via a 36° rotation only because of unexpected statistical properties of the WMAP ILC map, or whether they match because of global geometry, it is clear that the WMAP ILC map has some unusual statistical properties which mimic a potentially interesting cosmological signal.

  2. Detecting Abrupt Changes in a Piecewise Locally Stationary Time Series

    PubMed Central

    Last, Michael; Shumway, Robert

    2007-01-01

    Non-stationary time series arise in many settings, such as seismology, speech-processing, and finance. In many of these settings we are interested in points where a model of local stationarity is violated. We consider the problem of how to detect these change-points, which we identify by finding sharp changes in the time-varying power spectrum. Several different methods are considered, and we find that the symmetrized Kullback-Leibler information discrimination performs best in simulation studies. We derive asymptotic normality of our test statistic, and consistency of estimated change-point locations. We then demonstrate the technique on the problem of detecting arrival phases in earthquakes. PMID:19190715
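
    A sketch of the spectral-discrimination idea: estimate power spectra in adjacent windows and measure a symmetrized Kullback-Leibler-type disparity, which spikes across a change-point. The windowing and the exact disparity form below are illustrative, not the paper's derivation.

```python
import numpy as np
from scipy.signal import welch

def sym_kl(f, g, eps=1e-12):
    """Symmetrized KL-type disparity between two power spectra."""
    f, g = f + eps, g + eps
    return 0.5 * np.mean(f / g + g / f - 2.0)

rng = np.random.default_rng(8)
fs = 100.0
quiet = rng.normal(size=2048)                                # stationary noise
t = np.arange(2048) / fs
arrival = np.sin(2 * np.pi * 5 * t) + rng.normal(size=2048)  # new phase onset

_, fq = welch(quiet, fs=fs)
_, fa = welch(arrival, fs=fs)
q1, q2 = np.split(quiet, 2)
_, g1 = welch(q1, fs=fs)
_, g2 = welch(q2, fs=fs)
print("across the change:", round(sym_kl(fq, fa), 3))   # large disparity
print("within the noise: ", round(sym_kl(g1, g2), 3))   # near zero
```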

  3. Seven ways to increase power without increasing N.

    PubMed

    Hansen, W B; Collins, L M

    1994-01-01

    Many readers of this monograph may wonder why a chapter on statistical power was included. After all, by now the issue of statistical power is in many respects mundane. Everyone knows that statistical power is a central research consideration, and certainly most National Institute on Drug Abuse grantees or prospective grantees understand the importance of including a power analysis in research proposals. However, there is ample evidence that, in practice, prevention researchers are not paying sufficient attention to statistical power. If they were, the findings observed by Hansen (1992) in a recent review of the prevention literature would not have emerged. Hansen (1992) examined statistical power based on 46 cohorts followed longitudinally, using nonparametric assumptions given the subjects' age at posttest and the numbers of subjects. Results of this analysis indicated that, in order for a study to attain 80-percent power for detecting differences between treatment and control groups, the difference between groups at posttest would need to be at least 8 percent (in the best studies) and as much as 16 percent (in the weakest studies). In order for a study to attain 80-percent power for detecting group differences in pre-post change, 22 of the 46 cohorts would have needed relative pre-post reductions of greater than 100 percent. Thirty-three of the 46 cohorts had less than 50-percent power to detect a 50-percent relative reduction in substance use. These results are consistent with other review findings (e.g., Lipsey 1990) that have shown a similar lack of power in a broad range of research topics. Thus, it seems that, although researchers are aware of the importance of statistical power (particularly of the necessity for calculating it when proposing research), they somehow are failing to end up with adequate power in their completed studies. This chapter argues that the failure of many prevention studies to maintain adequate statistical power is due to an overemphasis on sample size (N) as the only, or even the best, way to increase statistical power. It is easy to see how this overemphasis has come about. Sample size is easy to manipulate, has the advantage of being related to power in a straight-forward way, and usually is under the direct control of the researcher, except for limitations imposed by finances or subject availability. Another option for increasing power is to increase the alpha used for hypothesis-testing but, as very few researchers seriously consider significance levels much larger than the traditional .05, this strategy seldom is used. Of course, sample size is important, and the authors of this chapter are not recommending that researchers cease choosing sample sizes carefully. Rather, they argue that researchers should not confine themselves to increasing N to enhance power. It is important to take additional measures to maintain and improve power over and above making sure the initial sample size is sufficient. The authors recommend two general strategies. One strategy involves attempting to maintain the effective initial sample size so that power is not lost needlessly. The other strategy is to take measures to maximize the third factor that determines statistical power: effect size.
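
    The trade-off the chapter describes can be made concrete with a standard power calculation for a two-sample t-test (here via statsmodels, with Cohen's conventional effect-size benchmarks): anything that raises the standardized effect, such as reducing error variance, substitutes directly for adding subjects.

```python
from statsmodels.stats.power import TTestIndPower

power = TTestIndPower()
for d in (0.2, 0.5, 0.8):            # small / medium / large effect sizes
    n = power.solve_power(effect_size=d, power=0.8, alpha=0.05)
    print(f"d = {d}: about {n:.0f} subjects per group for 80% power")
# Doubling the standardized effect cuts the required N roughly fourfold.
```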

  4. Evaluation of the Gini Coefficient in Spatial Scan Statistics for Detecting Irregularly Shaped Clusters

    PubMed Central

    Kim, Jiyu; Jung, Inkyung

    2017-01-01

    Spatial scan statistics with circular or elliptic scanning windows are commonly used for cluster detection in various applications, such as the identification of geographical disease clusters from epidemiological data. It has been pointed out that the method may have difficulty in correctly identifying non-compact, arbitrarily shaped clusters. In this paper, we evaluated the Gini coefficient for detecting irregularly shaped clusters through a simulation study. The Gini coefficient, the use of which in spatial scan statistics was recently proposed, is a criterion measure for optimizing the maximum reported cluster size. Our simulation study results showed that using the Gini coefficient works better than the original spatial scan statistic for identifying irregularly shaped clusters, by reporting an optimized and refined collection of clusters rather than a single larger cluster. We have provided a real data example that seems to support the simulation results. We think that using the Gini coefficient in spatial scan statistics can be helpful for the detection of irregularly shaped clusters. PMID:28129368
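
    A minimal Gini coefficient computation of the sort proposed for choosing the maximum reported cluster size; the input values are illustrative (e.g., case shares across reported clusters), not the paper's data.

```python
import numpy as np

def gini(x):
    """Gini coefficient of non-negative values via the Lorenz curve."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    lorenz = np.cumsum(x) / x.sum()
    return (n + 1 - 2 * lorenz.sum()) / n

print(gini([1, 1, 1, 1]))    # 0.0: cases spread evenly across clusters
print(gini([0, 0, 0, 10]))   # 0.75: cases concentrated in one cluster
```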

  5. Local multiplicity adjustment for the spatial scan statistic using the Gumbel distribution.

    PubMed

    Gangnon, Ronald E

    2012-03-01

    The spatial scan statistic is an important and widely used tool for cluster detection. It is based on the simultaneous evaluation of the statistical significance of the maximum likelihood ratio test statistic over a large collection of potential clusters. In most cluster detection problems, there is variation in the extent of local multiplicity across the study region. For example, using a fixed maximum geographic radius for clusters, urban areas typically have many overlapping potential clusters, whereas rural areas have relatively few. The spatial scan statistic does not account for local multiplicity variation. We describe a previously proposed local multiplicity adjustment based on a nested Bonferroni correction and propose a novel adjustment based on a Gumbel distribution approximation to the distribution of a local scan statistic. We compare the performance of all three statistics in terms of power and a novel unbiased cluster detection criterion. These methods are then applied to the well-known New York leukemia dataset and a Wisconsin breast cancer incidence dataset. © 2011, The International Biometric Society.

  6. Local multiplicity adjustment for the spatial scan statistic using the Gumbel distribution

    PubMed Central

    Gangnon, Ronald E.

    2011-01-01

    Summary The spatial scan statistic is an important and widely used tool for cluster detection. It is based on the simultaneous evaluation of the statistical significance of the maximum likelihood ratio test statistic over a large collection of potential clusters. In most cluster detection problems, there is variation in the extent of local multiplicity across the study region. For example, using a fixed maximum geographic radius for clusters, urban areas typically have many overlapping potential clusters, while rural areas have relatively few. The spatial scan statistic does not account for local multiplicity variation. We describe a previously proposed local multiplicity adjustment based on a nested Bonferroni correction and propose a novel adjustment based on a Gumbel distribution approximation to the distribution of a local scan statistic. We compare the performance of all three statistics in terms of power and a novel unbiased cluster detection criterion. These methods are then applied to the well-known New York leukemia dataset and a Wisconsin breast cancer incidence dataset. PMID:21762118
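
    The Gumbel step can be sketched as follows: fit location and scale to Monte Carlo replicates of a local scan statistic under the null, then read the upper-tail p-value off the fitted distribution. The replicates and observed value below are simulated placeholders.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
# Stand-in Monte Carlo replicates of a local scan statistic under the null
null_replicates = rng.gumbel(loc=4.0, scale=1.2, size=999)
loc, scale = stats.gumbel_r.fit(null_replicates)

observed = 9.5                                # hypothetical local maximum
p = stats.gumbel_r.sf(observed, loc, scale)   # upper-tail probability
print("approximate local p-value:", p)
```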

  7. Performance of DIMTEST-and NOHARM-Based Statistics for Testing Unidimensionality

    ERIC Educational Resources Information Center

    Finch, Holmes; Habing, Brian

    2007-01-01

    This Monte Carlo study compares the ability of the parametric bootstrap version of DIMTEST with three goodness-of-fit tests calculated from a fitted NOHARM model to detect violations of the assumption of unidimensionality in testing data. The effectiveness of the procedures was evaluated for different numbers of items, numbers of examinees,…

  8. A Procedure To Detect Test Bias Present Simultaneously in Several Items.

    ERIC Educational Resources Information Center

    Shealy, Robin; Stout, William

    A statistical procedure is presented that is designed to test for unidirectional test bias existing simultaneously in several items of an ability test, based on the assumption that test bias is incipient within the two groups' ability differences. The proposed procedure--Simultaneous Item Bias (SIB)--is based on a multidimensional item response…

  9. A Monte Carlo Approach to Unidimensionality Testing in Polytomous Rasch Models

    ERIC Educational Resources Information Center

    Christensen, Karl Bang; Kreiner, Svend

    2007-01-01

    Many statistical tests are designed to test the different assumptions of the Rasch model, but only few are directed at detecting multidimensionality. The Martin-Lof test is an attractive approach, the disadvantage being that its null distribution deviates strongly from the asymptotic chi-square distribution for most realistic sample sizes. A Monte…

  10. Outlier Detection in High-Stakes Certification Testing.

    ERIC Educational Resources Information Center

    Meijer, Rob R.

    2002-01-01

    Used empirical data from a certification test to study methods from statistical process control that have been proposed to classify an item score pattern as fitting or misfitting the underlying item response theory model in computerized adaptive testing. Results for 1,392 examinees show that different types of misfit can be distinguished. (SLD)

  11. Adaptive variation in Pinus ponderosa from Intermountain regions. II. Middle Columbia River system

    Treesearch

    Gerald Rehfeldt

    1986-01-01

    Seedling populations were grown and compared in common environments. Statistical analyses detected genetic differences between populations for numerous traits reflecting growth potential and periodicity of shoot elongation. Multiple regression models described an adaptive landscape in which populations from low elevations have a high growth potential while those from...

  12. A Comparison of Lord's Chi Square and Raju's Area Measures in Detection of DIF.

    ERIC Educational Resources Information Center

    Cohen, Allan S.; Kim, Seock-Ho

    1993-01-01

    The effectiveness of two statistical tests of the area between item response functions (exact signed area and exact unsigned area) estimated in different samples, a measure of differential item functioning (DIF), was compared with Lord's chi square. Lord's chi square was found the most effective in determining DIF. (SLD)

  13. Photoresist thin-film effects on alignment process capability

    NASA Astrophysics Data System (ADS)

    Flores, Gary E.; Flack, Warren W.

    1993-08-01

Two photoresists were selected for alignment characterization based on their dissimilar coating properties and observed differences in alignment capability. The materials are Dynachem OFPR-800 and Shipley System 8. Both photoresists were examined on two challenging alignment levels in a submicron CMOS process: a nitride level and a planarized second-level metal. An Ultratech Stepper model 1500, which features a darkfield alignment system with broadband green light for alignment signal detection, was used for this project. Initially, statistically designed linear screening experiments were performed to examine six process factors for each photoresist: viscosity, spin acceleration, spin speed, spin time, softbake time, and softbake temperature. Using the results derived from the screening experiments, a more thorough examination of the statistically significant process factors was performed. A full quadratic experimental design was conducted to examine the effects of viscosity, spin speed, and spin time coating properties on alignment. This included a characterization of both intra- and inter-wafer alignment control and alignment process capability. Insight into the differing alignment behavior is provided by analyzing the photoresists' material properties and the physical nature of the alignment detection system.

  14. [Mechanism study on leptin resistance in lung cancer cachexia rats treated by Xiaoyan Decoction].

    PubMed

    Zhang, Yun-Chao; Jia, Ying-Jie; Yang, Pei-Ying; Zhang, Xing; Li, Xiao-Jiang; Zhang, Ying; Zhu, Jin-Li; Sun, Yi-Yu; Chen, Jun; Duan, Hao-Guo; Guo, Hua; Li, Chao

    2014-12-01

To study the leptin resistance mechanism of Xiaoyan Decoction (XD) in lung cancer cachexia (LCC) rats. An LCC rat model was established. Forty rats were randomly divided into a normal control group, an LCC model group, an XD group, and a positive control group, 10 per group. After the LCC model was established, rats in the LCC model group were administered normal saline, 2 mL each time. Rats in the XD group were administered XD at a daily dose of 2 mL. Those in the positive control group were administered Medroxyprogesterone Acetate suspension (20 mg/kg) by gavage at a daily dose of 2 mL. All medication lasted for 14 days. The general condition and tumor growth were observed. Serum leptin levels and hypothalamic leptin receptor levels were detected using enzyme-linked immunosorbent assay. Contents of neuropeptide Y (NPY) and the anorexigenic gene POMC were detected using real-time PCR. Serum leptin levels were significantly lower in the LCC model group than in the normal control group (P < 0.05). Compared with the LCC model group, serum leptin levels significantly increased in the XD group (P < 0.01). Leptin receptor levels in the hypothalamus increased significantly in the LCC model group (P < 0.01); relative to these increased levels, either XD or Medroxyprogesterone Acetate effectively reduced leptin receptor levels (P < 0.01), and there was also a statistical difference between the XD group and the positive control group (P < 0.05). The NPY content was higher in the LCC model group than in the other groups (P < 0.05); there was no statistical difference in NPY between the normal control group and the two treatment groups (P > 0.05). There was a statistical difference in POMC between the normal control group and the LCC model group (P < 0.05). POMC was decreased in the XD group and the positive control group (P < 0.05), more markedly so in the XD group (P < 0.05). Leptin resistance existed in LCC rats. XD could increase serum leptin levels and reduce leptin receptor levels in the hypothalamus. LCC could be improved by elevating NPY content and reducing POMC content in the hypothalamus, thereby promoting appetite and increasing food intake through both peripheral and central pathways.

  15. Simulated performance of an order statistic threshold strategy for detection of narrowband signals

    NASA Technical Reports Server (NTRS)

    Satorius, E.; Brady, R.; Deich, W.; Gulkis, S.; Olsen, E.

    1988-01-01

    The application of order statistics to signal detection is becoming an increasingly active area of research. This is due to the inherent robustness of rank estimators in the presence of large outliers that would significantly degrade more conventional mean-level-based detection systems. A detection strategy is presented in which the threshold estimate is obtained using order statistics. The performance of this algorithm in the presence of simulated interference and broadband noise is evaluated. In this way, the robustness of the proposed strategy in the presence of the interference can be fully assessed as a function of the interference, noise, and detector parameters.
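
    A hedged sketch of an order-statistic threshold: estimating the noise floor from a chosen rank (quantile) of the spectral bins keeps a strong narrowband outlier from inflating the threshold, unlike a mean-level rule. The rank, scaling, and threshold factor below are illustrative choices, not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(10)
spectrum = rng.exponential(1.0, 1024)   # broadband noise power in 1024 bins
spectrum[100] = 60.0                    # strong narrowband interferer

# Rank-based noise estimate: the 75th percentile of exponential noise with
# unit mean sits at ln(4), so dividing by ln(4) recovers the mean while
# ignoring the outlier entirely.
k = int(0.75 * spectrum.size)
noise_est = np.sort(spectrum)[k] / np.log(4)
threshold = 10.0 * noise_est

print("detected bins:", np.flatnonzero(spectrum > threshold))
print("mean-based estimate (pulled up by the outlier):", spectrum.mean())
```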

  16. MIDAS: Regionally linear multivariate discriminative statistical mapping.

    PubMed

    Varol, Erdem; Sotiras, Aristeidis; Davatzikos, Christos

    2018-07-01

    Statistical parametric maps formed via voxel-wise mass-univariate tests, such as the general linear model, are commonly used to test hypotheses about regionally specific effects in neuroimaging cross-sectional studies where each subject is represented by a single image. Despite being informative, these techniques remain limited as they ignore multivariate relationships in the data. Most importantly, the commonly employed local Gaussian smoothing, which is important for accounting for registration errors and making the data follow Gaussian distributions, is usually chosen in an ad hoc fashion. Thus, it is often suboptimal for the task of detecting group differences and correlations with non-imaging variables. Information mapping techniques, such as searchlight, which use pattern classifiers to exploit multivariate information and obtain more powerful statistical maps, have become increasingly popular in recent years. However, existing methods may lead to important interpretation errors in practice (i.e., misidentifying a cluster as informative, or failing to detect truly informative voxels), while often being computationally expensive. To address these issues, we introduce a novel efficient multivariate statistical framework for cross-sectional studies, termed MIDAS, seeking highly sensitive and specific voxel-wise brain maps, while leveraging the power of regional discriminant analysis. In MIDAS, locally linear discriminative learning is applied to estimate the pattern that best discriminates between two groups, or predicts a variable of interest. This pattern is equivalent to local filtering by an optimal kernel whose coefficients are the weights of the linear discriminant. By composing information from all neighborhoods that contain a given voxel, MIDAS produces a statistic that collectively reflects the contribution of the voxel to the regional classifiers as well as the discriminative power of the classifiers. Critically, MIDAS efficiently assesses the statistical significance of the derived statistic by analytically approximating its null distribution without the need for computationally expensive permutation tests. The proposed framework was extensively validated using simulated atrophy in structural magnetic resonance imaging (MRI) and further tested using data from a task-based functional MRI study as well as a structural MRI study of cognitive performance. The performance of the proposed framework was evaluated against standard voxel-wise general linear models and other information mapping methods. The experimental results showed that MIDAS achieves relatively higher sensitivity and specificity in detecting group differences. Together, our results demonstrate the potential of the proposed approach to efficiently map effects of interest in both structural and functional data. Copyright © 2018. Published by Elsevier Inc.

  17. Detecting changes in ultrasound backscattered statistics by using Nakagami parameters: Comparisons of moment-based and maximum likelihood estimators.

    PubMed

    Lin, Jen-Jen; Cheng, Jung-Yu; Huang, Li-Fei; Lin, Ying-Hsiu; Wan, Yung-Liang; Tsui, Po-Hsiang

    2017-05-01

The Nakagami distribution is a useful approximation to the statistics of ultrasound backscattered signals for tissue characterization. The choice of estimator may affect the Nakagami parameter's ability to detect changes in backscattered statistics. In particular, the moment-based estimator (MBE) and the maximum likelihood estimator (MLE) are the two primary methods used to estimate the Nakagami parameters of ultrasound signals. This study explored the effects of the MBE and different MLE approximations on Nakagami parameter estimation. Ultrasound backscattered signals of different scatterer number densities were generated using a simulation model, and phantom experiments and measurements of human liver tissues were also conducted to acquire real backscattered echoes. Envelope signals were used to estimate the Nakagami parameters by the MBE, the first- and second-order approximations of the MLE (MLE₁ and MLE₂, respectively), and the Greenwood approximation (MLE_gw) for comparison. The simulation results demonstrated that, compared with the MBE and MLE₁, the MLE₂ and MLE_gw enabled more stable parameter estimation with small sample sizes. Notably, the required data length of the envelope signal was 3.6 times the pulse length. The phantom and tissue measurement results also showed that the Nakagami parameters estimated using the MLE₂ and MLE_gw could simultaneously differentiate various scatterer concentrations with lower standard deviations and reliably reflect the physical meanings associated with the backscattered statistics. Therefore, the MLE₂ and MLE_gw are suggested as estimators for the development of Nakagami-based methodologies for ultrasound tissue characterization. Copyright © 2017 Elsevier B.V. All rights reserved.
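
    The two estimator families are easy to compare on simulated envelopes: the MBE uses m = (E[R²])² / Var(R²), while scipy's fit routine gives a numerical MLE (a generic numerical fit, not the paper's specific analytic approximations).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
m_true, omega = 0.8, 1.0
r = stats.nakagami.rvs(m_true, scale=np.sqrt(omega), size=5000,
                       random_state=rng)       # simulated envelope samples

r2 = r**2
m_mbe = r2.mean() ** 2 / r2.var()              # MBE: m = E[R^2]^2 / Var(R^2)
m_mle, _, _ = stats.nakagami.fit(r, floc=0)    # numerical MLE, loc fixed at 0

print(f"true m = {m_true}, MBE = {m_mbe:.3f}, MLE = {m_mle:.3f}")
```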

  18. Fate of thiamethoxam in mesocosms and response of the zooplankton community.

    PubMed

    Lobson, C; Luong, K; Seburn, D; White, M; Hann, B; Prosser, R S; Wong, C S; Hanson, M L

    2018-05-14

    Thiamethoxam is a neonicotinoid insecticide that can reach wetlands in agro-ecosystems through runoff. The fate and effects of thiamethoxam on non-target organisms in shallow wetland ecosystems have not been well characterized. To this end, a mesocosm study was conducted with a focus on characterizing zooplankton community responses. A single pulse application of thiamethoxam (0, 25, 50, 100, 250, and 500 μg/L; n = 3) was applied to experimental systems and monitored for 8 weeks. The mean half-life of thiamethoxam across the different treatments was 3.7 days in the water column, with concentrations of <0.8 μg/L in the majority of mesocosms by 56 days. Principal response curve analysis did not show any significant concentration-dependent differences in the zooplankton community among treatments over the course of the study. The minimum detectable difference (MDD%) values for the abundance of potentially sensitive arthropod taxa (nauplius larvae, cyclopoid copepods) allowed detection of effects relative to controls as small as 42% and 59%, respectively. The MDD% values for total zooplankton abundance (including the potentially less sensitive taxonomic group Rotifera) allowed detection of effects as small as 41%. There were no statistically significant differences in zooplankton abundance or diversity between control and treated mesocosms at the end of the study. There were also no statistically significant differences for individual taxa that were sustained between sampling points or that manifested as a concentration-response. We conclude that acute exposure to thiamethoxam at environmentally relevant concentrations (typically ng/L) likely does not represent a significant adverse ecological risk to wetland zooplankton community abundance and structure. Copyright © 2018 Elsevier B.V. All rights reserved.
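
    The dissipation half-life reported above follows from first-order kinetics, C(t) = C₀e^(−kt) with t½ = ln 2 / k. A quick sketch, using hypothetical water-column concentrations chosen to be consistent with a 3.7-day half-life:

```python
import numpy as np

# Hypothetical concentrations (ug/L) sampled over days post-application.
t = np.array([0, 1, 2, 4, 7, 14], dtype=float)
c = np.array([100.0, 82.0, 69.0, 47.0, 27.0, 7.2])

# First-order decay is linear in log space: ln C = ln C0 - k t.
slope, intercept = np.polyfit(t, np.log(c), 1)
k = -slope
print(f"k = {k:.3f} per day; half-life = {np.log(2) / k:.2f} days")  # ~3.7
```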

  19. UWB pulse detection and TOA estimation using GLRT

    NASA Astrophysics Data System (ADS)

    Xie, Yan; Janssen, Gerard J. M.; Shakeri, Siavash; Tiberius, Christiaan C. J. M.

    2017-12-01

    In this paper, a novel statistical approach is presented for time-of-arrival (TOA) estimation based on first path (FP) pulse detection using a sub-Nyquist sampling ultra-wide band (UWB) receiver. The TOA measurement accuracy, which cannot be improved by averaging of the received signal, can be enhanced by the statistical processing of a number of TOA measurements. The TOA statistics are modeled and analyzed for a UWB receiver using threshold crossing detection of a pulse signal with noise. The detection and estimation scheme based on the Generalized Likelihood Ratio Test (GLRT) detector, which captures the full statistical information of the measurement data, is shown to achieve accurate TOA estimation and allows for a trade-off between the threshold level, the noise level, the amplitude and the arrival time of the first path pulse, and the accuracy of the obtained final TOA.
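
    A minimal sketch of a GLRT pulse detector of this kind, assuming (for illustration only, since the paper's receiver model is more detailed) Gaussian noise, a known pulse shape, and unknown amplitude and delay: the unknowns are replaced by their maximum likelihood estimates, giving a normalized matched-filter statistic maximized over delay.

```python
import numpy as np

def glrt_toa(x, pulse, sigma2, threshold):
    """GLRT for a pulse of known shape but unknown amplitude and delay.

    T(tau) = (s_tau . x)^2 / (sigma2 * ||s||^2) is chi^2_1 under noise only;
    the TOA estimate is the maximizing delay, accepted if T exceeds threshold.
    """
    n, m = len(x), len(pulse)
    energy = pulse @ pulse
    T = np.array([(pulse @ x[k:k + m])**2 / (sigma2 * energy)
                  for k in range(n - m + 1)])
    k_hat = int(T.argmax())
    return (k_hat if T[k_hat] > threshold else None), T

rng = np.random.default_rng(2)
pulse = np.hanning(8)
x = rng.normal(0, 0.3, 256)
x[100:108] += 1.5 * pulse                 # first-path pulse at sample 100
toa, T = glrt_toa(x, pulse, sigma2=0.09, threshold=20.0)
print("detected TOA sample:", toa)        # expected near sample 100
```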

  20. Why Are People Bad at Detecting Randomness? A Statistical Argument

    ERIC Educational Resources Information Center

    Williams, Joseph J.; Griffiths, Thomas L.

    2013-01-01

    Errors in detecting randomness are often explained in terms of biases and misconceptions. We propose and provide evidence for an account that characterizes the contribution of the inherent statistical difficulty of the task. Our account is based on a Bayesian statistical analysis, focusing on the fact that a random process is a special case of…
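
    The statistical point — a random process is a special case nested inside a family of systematic alternatives, which bounds how much evidence any sequence can give for randomness — can be illustrated with the simplest such nesting, a fair coin inside the biased-coin family. This is only an illustration of the flavour of the argument, not the authors' model, which also considers sequential dependence:

```python
from math import lgamma, log

def log_bayes_factor_random(seq):
    """log Bayes factor of 'fair random coin' vs 'coin of unknown bias'.

    P(seq | fair) = 0.5^n; with a uniform prior on the bias, the marginal
    likelihood is P(seq | biased) = h! t! / (n+1)!. Positive values favour
    the random (fair) hypothesis.
    """
    n, h = len(seq), sum(seq)
    t = n - h
    log_fair = n * log(0.5)
    log_biased = lgamma(h + 1) + lgamma(t + 1) - lgamma(n + 2)
    return log_fair - log_biased

print(log_bayes_factor_random([0, 1] * 10))        # balanced: favours "random"
print(log_bayes_factor_random([1] * 18 + [0] * 2)) # skewed: favours "biased"
```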

  1. Prediction of CpG-island function: CpG clustering vs. sliding-window methods

    PubMed Central

    2010-01-01

    Background Unmethylated stretches of CpG dinucleotides (CpG islands) are an outstanding property of mammalian genomes. Conventionally, these regions are detected by sliding-window approaches using %G + C, the CpG observed/expected ratio and length thresholds as the main parameters. More recently, clustering methods have been proposed that directly detect clusters of CpG dinucleotides as a statistical property of the genome sequence. Results We compare sliding-window to clustering (i.e. CpGcluster) predictions by applying new ways to detect putative functionality of CpG islands. Analyzing the co-localization with several genomic regions as a function of window size vs. statistical significance (p-value), CpGcluster shows a higher overlap with promoter regions and highly conserved elements, while showing less overlap with Alu retrotransposons. The major difference in the predictions was found for short islands (CpG islets), often exclusively predicted by CpGcluster. Many of these islets seem to be functional, as they are unmethylated, highly conserved and/or located within promoter regions. Finally, we show that window-based islands can spuriously overlap several differentially regulated promoters as well as different methylation domains, which might indicate a wrong merge of several CpG islands into a single, very long island. The shorter CpGcluster islands seem to be much more specific with respect to overlap with alternative transcription start sites and the detection of homogeneous methylation domains. Conclusions The main difference between sliding-window approaches and clustering methods is the length of the predicted islands. Short islands, often differentially methylated, are almost exclusively predicted by CpGcluster. This suggests that CpGcluster may be the algorithm of choice to explore the function of these short, but putatively functional, CpG islands. PMID:20500903
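
    For concreteness, the conventional sliding-window detection referred to above tests each window against %G+C and CpG observed/expected thresholds (classic Gardiner-Garden-style values shown; real tools also extend, step and merge windows). A minimal sketch:

```python
def cpg_windows(seq, win=200, gc_min=0.5, oe_min=0.6):
    """Return start positions of fixed-size windows meeting classic
    CpG-island thresholds: %G+C >= 50 and CpG obs/exp >= 0.6."""
    seq = seq.upper()
    hits = []
    for i in range(len(seq) - win + 1):
        w = seq[i:i + win]
        g, c = w.count('G'), w.count('C')
        # observed/expected CpG ratio = (#CpG * N) / (#C * #G)
        oe = w.count('CG') * win / (c * g) if c and g else 0.0
        if (g + c) / win >= gc_min and oe >= oe_min:
            hits.append(i)
    return hits

hits = cpg_windows("CG" * 150 + "AT" * 200)
print(hits[0], hits[-1], len(hits))   # islands confined to the CG-rich prefix
```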

  2. Validation of the ANSR Salmonella method for detection of Salmonella spp. in selected foods and environmental samples.

    PubMed

    Mozola, Mark; Norton, Paul; Alles, Susan; Gray, R Lucas; Tolan, Jerry; Caballero, Oscar; Pinkava, Lisa; Hosking, Edan; Luplow, Karen; Rice, Jennifer

    2013-01-01

    ANSR Salmonella is a new molecular diagnostic assay for detection of Salmonella spp. in foods and environmental samples. The test is based on the nicking enzyme amplification reaction (NEAR) isothermal nucleic acid amplification technology. The assay platform features simple instrumentation and minimal labor and, following a single-step 10-24 h enrichment (depending on sample type), an extremely short assay time of 30 min, including sample preparation. Detection is real-time, using fluorescent molecular beacon probes. Inclusivity testing was performed using a panel of 113 strains of S. enterica and S. bongori, representing 109 serovars and all genetic subgroups. With the single exception of the rare serovar S. Weslaco, all serovars and genetic subgroups were detected. Exclusivity testing of 38 non-salmonellae, mostly Enterobacteriaceae, yielded no evidence of cross-reactivity. In comparative testing of chicken carcass rinse, raw ground turkey, raw ground beef, hot dogs, and oat cereal, there were no statistically significant differences in the number of positive results obtained with the ANSR method and the U.S. Department of Agriculture-Food Safety and Inspection Service or U.S. Food and Drug Administration/Bacteriological Analytical Manual reference culture methods. In testing of swab or sponge samples from five different environmental surfaces, four trials showed no statistically significant differences in the number of positive results between the ANSR and the U.S. Food and Drug Administration/Bacteriological Analytical Manual reference methods; in the trial on the stainless steel surface, there were significantly more positive results by the ANSR method. Ruggedness experiments showed a high degree of assay robustness when deviations in reagent volumes and incubation times were introduced.

  3. Role of specific DNA mutations in the peripheral blood of colorectal cancer patients for the assessment of tumor stage and residual disease following tumor resection

    PubMed Central

    Norcic, Gregor; Jelenc, Franc; Cerkovnik, Petra; Stegel, Vida; Novakovic, Srdjan

    2016-01-01

    In the present study, the detection of tumor-specific KRAS proto-oncogene, GTPase (KRAS) and B-Raf proto-oncogene, serine/threonine kinase (BRAF) mutations in the peripheral blood of colorectal cancer (CRC) patients at all stages and with adenomas was used for the estimation of disease stage prior to surgery and of residual disease following surgery. A total of 65 CRC patients were enrolled. The primary tumor tested positive for the specific mutations (KRAS mutations in codons 12, 13, 61, 117 or 146 and BRAF mutations in codon 600) in 35 patients. In all of these patients, a specimen of normal bowel resected with the tumor was also tested for the presence of the same mutations in order to exclude germ-line mutations. Only patients who tested positive for a specific mutation in the primary tumor were included in the further analysis of tumor-specific mutations in the peripheral blood. No statistically significant differences were found between the detection rates of tumor mutations in the blood across different tumor stages (P=0.491). However, statistically significant differences in the proportions of patients with detected tumor-specific DNA mutations in the peripheral blood were found when comparing the groups of patients with R0 and R2 resections (P=0.038). Tumor-specific DNA mutations in the peripheral blood were more frequently detected in patients with incomplete surgical clearance of the tumor due to macroscopic residual disease (R2 resections). Therefore, the study concludes that the follow-up of somatic KRAS- and BRAF-mutated DNA in the peripheral blood of CRC patients may be useful in assessing the surgical clearance of the disease. PMID:27900004

  4. Application of Abbreviated Protocol of Magnetic Resonance Imaging for Breast Cancer Screening in Dense Breast Tissue.

    PubMed

    Chen, Shuang-Qing; Huang, Min; Shen, Yu-Ying; Liu, Chen-Lu; Xu, Chuan-Xiao

    2017-03-01

    The study aimed to evaluate the usefulness of an abbreviated protocol (AP) of magnetic resonance imaging (MRI) in comparison with a full diagnostic protocol (FDP) of MRI for breast cancer screening in dense breast tissue. A total of 478 female participants with dense breast tissue and negative mammography results were imaged with MRI using the AP and FDP. The AP and FDP images were analyzed separately, and the sensitivity and specificity of breast cancer detection were calculated. The chi-square test and receiver operating characteristic curves were used to assess the breast cancer diagnostic capabilities of the two protocols. Sixteen cases of breast cancer from the 478 patients with dense breasts were detected using the FDP method, with pathologic confirmation of nine cases of ductal carcinoma in situ, six cases of invasive ductal carcinoma, and one case of mucinous carcinoma. Fifteen cases of breast cancer were successfully screened using the AP method. The sensitivity showed no statistically significant difference between the AP and FDP (χ² = 0.592, P = 0.623), but the specificity showed a statistically significant difference (χ² = 4.619, P = 0.036). The receiver operating characteristic curves showed high efficacy of both methods in the detection of breast cancer in dense breast tissue (the areas under the curve were 0.931 ± 0.025 and 0.947 ± 0.024, respectively), and the ability to diagnose breast cancer was not statistically significantly different between the two methods. The AP of MRI may improve the detection rate of breast cancer in dense breast tissue, and it may be useful in efficient breast cancer screening. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.

  5. Detection of a gravitropism phenotype in glutamate receptor-like 3.3 mutants of Arabidopsis thaliana using machine vision and computation.

    PubMed

    Miller, Nathan D; Durham Brooks, Tessa L; Assadi, Amir H; Spalding, Edgar P

    2010-10-01

    Gene disruption frequently produces no phenotype in the model plant Arabidopsis thaliana, complicating studies of gene function. Functional redundancy between gene family members is one common explanation but inadequate detection methods could also be responsible. Here, newly developed methods for automated capture and processing of time series of images, followed by computational analysis employing modified linear discriminant analysis (LDA) and wavelet-based differentiation, were employed in a study of mutants lacking the Glutamate Receptor-Like 3.3 gene. Root gravitropism was selected as the process to study with high spatiotemporal resolution because the ligand-gated Ca(2+)-permeable channel encoded by GLR3.3 may contribute to the ion fluxes associated with gravity signal transduction in roots. Time series of root tip angles were collected from wild type and two different glr3.3 mutants across a grid of seed-size and seedling-age conditions previously found to be important to gravitropism. Statistical tests of average responses detected no significant difference between populations, but LDA separated both mutant alleles from the wild type. After projecting the data onto LDA solution vectors, glr3.3 mutants displayed greater population variance than the wild type in all four conditions. In three conditions the projection means also differed significantly between mutant and wild type. Wavelet analysis of the raw response curves showed that the LDA-detected phenotypes related to an early deceleration and subsequent slower-bending phase in glr3.3 mutants. These statistically significant, heritable, computation-based phenotypes generated insight into functions of GLR3.3 in gravitropism. The methods could be generally applicable to the study of phenotypes and therefore gene function.
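
    The projection-then-variance step described above can be sketched as follows, with synthetic data standing in for the tip-angle time series. This is an illustration only: the authors' modified LDA and significance procedures differ, and in practice projecting held-out data avoids testing on the same samples used to fit the axis.

```python
import numpy as np
from scipy import stats

def lda_axis(Xa, Xb, reg=1e-3):
    """Fisher discriminant direction between two groups of response curves."""
    Sw = np.cov(Xa, rowvar=False) + np.cov(Xb, rowvar=False)
    Sw += reg * np.eye(Sw.shape[0])
    w = np.linalg.solve(Sw, Xa.mean(0) - Xb.mean(0))
    return w / np.linalg.norm(w)

def variance_f_test(u, v):
    """Two-sided F-test for equality of variances of two projected samples."""
    f = np.var(u, ddof=1) / np.var(v, ddof=1)
    dfu, dfv = len(u) - 1, len(v) - 1
    p = 2 * min(stats.f.cdf(f, dfu, dfv), stats.f.sf(f, dfu, dfv))
    return f, p

rng = np.random.default_rng(3)
wt = rng.normal(0.0, 1.0, (30, 40))    # wild-type tip-angle curves (toy data)
mut = rng.normal(0.2, 1.6, (30, 40))   # mutant: shifted and more variable
w = lda_axis(wt, mut)
f, p = variance_f_test(mut @ w, wt @ w)
print(f"F = {f:.2f}, p = {p:.4f}")     # mutant shows greater projected variance
```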

  6. Detection of a Gravitropism Phenotype in glutamate receptor-like 3.3 Mutants of Arabidopsis thaliana Using Machine Vision and Computation

    PubMed Central

    Miller, Nathan D.; Durham Brooks, Tessa L.; Assadi, Amir H.; Spalding, Edgar P.

    2010-01-01

    Gene disruption frequently produces no phenotype in the model plant Arabidopsis thaliana, complicating studies of gene function. Functional redundancy between gene family members is one common explanation but inadequate detection methods could also be responsible. Here, newly developed methods for automated capture and processing of time series of images, followed by computational analysis employing modified linear discriminant analysis (LDA) and wavelet-based differentiation, were employed in a study of mutants lacking the Glutamate Receptor-Like 3.3 gene. Root gravitropism was selected as the process to study with high spatiotemporal resolution because the ligand-gated Ca2+-permeable channel encoded by GLR3.3 may contribute to the ion fluxes associated with gravity signal transduction in roots. Time series of root tip angles were collected from wild type and two different glr3.3 mutants across a grid of seed-size and seedling-age conditions previously found to be important to gravitropism. Statistical tests of average responses detected no significant difference between populations, but LDA separated both mutant alleles from the wild type. After projecting the data onto LDA solution vectors, glr3.3 mutants displayed greater population variance than the wild type in all four conditions. In three conditions the projection means also differed significantly between mutant and wild type. Wavelet analysis of the raw response curves showed that the LDA-detected phenotypes related to an early deceleration and subsequent slower-bending phase in glr3.3 mutants. These statistically significant, heritable, computation-based phenotypes generated insight into functions of GLR3.3 in gravitropism. The methods could be generally applicable to the study of phenotypes and therefore gene function. PMID:20647506

  7. A theoretical Gaussian framework for anomalous change detection in hyperspectral images

    NASA Astrophysics Data System (ADS)

    Acito, Nicola; Diani, Marco; Corsini, Giovanni

    2017-10-01

    Exploitation of temporal series of hyperspectral images is a relatively new discipline with a wide variety of possible applications in fields like remote sensing, area surveillance, defense and security, and search and rescue. In this work, we discuss how images taken at two different times can be processed to detect changes caused by the insertion, deletion or displacement of small objects in the monitored scene. This problem is known in the literature as anomalous change detection (ACD) and can be viewed as the extension, to the multitemporal case, of the well-known anomaly detection problem in a single image. In both cases, the hyperspectral images are processed blindly in an unsupervised manner and without a priori knowledge of the target spectrum. We introduce the ACD problem using an approach based on statistical decision theory and derive a common framework encompassing different ACD approaches. In particular, we clearly define the observation space, the statistical distribution of the data conditioned on the two competing hypotheses, and the procedure followed to arrive at the solution. The proposed overview places emphasis on techniques based on the multivariate Gaussian model, which allows a formal presentation of the ACD problem and the rigorous derivation of the possible solutions in a way that is both mathematically more tractable and easier to interpret. We also discuss practical problems related to the application of the detectors in the real world and present affordable solutions. Namely, we describe the ACD processing chain, including the strategies commonly adopted to compensate for pervasive radiometric changes caused by differing illumination/atmospheric conditions and to mitigate residual geometric image co-registration errors. Results obtained on real, freely available data are discussed in order to test and compare the methods within the proposed general framework.
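
    One classical detector in this Gaussian family is a chronochrome-style predictor: regress the time-2 pixel vector on the time-1 vector, so that pervasive scene-wide differences are absorbed by the linear map, then score the whitened residual. A minimal sketch of that single variant, not the paper's full framework:

```python
import numpy as np

def chronochrome_acd(X, Y, reg=1e-6):
    """Gaussian anomalous-change scores: least-squares linear prediction of
    time-2 pixels from time-1 pixels, scored by residual Mahalanobis distance.
    Rows = pixels, columns = spectral bands."""
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    Cxx = Xc.T @ Xc / len(X) + reg * np.eye(X.shape[1])
    Cyx = Yc.T @ Xc / len(X)
    L = Cyx @ np.linalg.inv(Cxx)               # linear predictor Y ~ L X
    R = Yc - Xc @ L.T                          # prediction residuals
    Ce = R.T @ R / len(R) + reg * np.eye(R.shape[1])
    return np.einsum('ij,jk,ik->i', R, np.linalg.inv(Ce), R)

rng = np.random.default_rng(4)
X = rng.normal(size=(10_000, 8))
Y = X @ rng.normal(size=(8, 8)) * 0.3 + rng.normal(size=(10_000, 8)) * 0.1
Y[0] += 2.0                                    # implanted anomalous change
scores = chronochrome_acd(X, Y)
print(scores[0], scores[1:].mean())            # pixel 0 scores far higher
```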

  8. Comparative evaluation of root canal preparations of maxillary first molars with self-adjusting file, reciproc single file, and revo-s rotary file: A micro-computed tomography study.

    PubMed

    Ahmetoglu, Fuat; Keles, Ali; Simsek, Neslihan; Ocak, M Sinan; Yologlu, Saim

    2015-01-01

    This study aimed to use micro-computed tomography (μ-CT) to evaluate the canal shaping properties of three nickel-titanium instruments, the Self-Adjusting File (SAF), Reciproc, and Revo-S rotary file, in maxillary first molars. Thirty maxillary molars were scanned preoperatively by μ-CT at 13.68 μm resolution. The teeth were randomly assigned to three groups (n = 10), the root canals were shaped with the SAF, Reciproc, and Revo-S, respectively, and the shaped root canals were rescanned. Changes in canal volumes and surface areas were compared with preoperative values. The data were analyzed using Kruskal-Wallis and Conover's post hoc tests, with p < .05 denoting a statistically significant difference. Preoperative canal volumes and surface areas were statistically similar among the three groups (p > .05). Preoperative and postoperative canal models differed significantly on all measures (p = 0.0001), but the post-instrumentation differences among the three experimental groups were not statistically significant for volume (p > .05). Surface-area changes were similar in the buccal canals for all three techniques, whereas no statistically significant difference in surface area was detected between the SAF and the Revo-S in the palatal (P) canal. All three shaping systems produced similar volume changes in all canals, but the SAF and Revo-S prepared the P canal more effectively than the Reciproc. © Wiley Periodicals, Inc.

  9. Infants with Williams syndrome detect statistical regularities in continuous speech.

    PubMed

    Cashon, Cara H; Ha, Oh-Ryeong; Graf Estes, Katharine; Saffran, Jenny R; Mervis, Carolyn B

    2016-09-01

    Williams syndrome (WS) is a rare genetic disorder associated with delays in language and cognitive development. The reasons for the language delay are unknown. Statistical learning is a domain-general mechanism recruited for early language acquisition. In the present study, we investigated whether infants with WS were able to detect the statistical structure in continuous speech. Eighteen 8- to 20-month-olds with WS were familiarized with 2 min of a continuous stream of synthesized nonsense words; the statistical structure of the speech was the only cue to word boundaries. They were tested on their ability to discriminate statistically defined "words" and "part-words" (which crossed word boundaries) in the artificial language. Despite significant cognitive and language delays, infants with WS were able to detect the statistical regularities in the speech stream. These findings suggest that an inability to track the statistical properties of speech is unlikely to be the primary basis for the delays in the onset of language observed in infants with WS. These results provide the first evidence of statistical learning by infants with developmental delays. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Minimum Detectable Dose as a Measure of Bioassay Programme Capability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carbaugh, Eugene H.

    2003-01-01

    This paper suggests that minimum detectable dose (MDD) be used to describe the capability of bioassay programs for which intakes are expected to be rare. This allows expression of the capability in units that correspond directly to primary dose limits. The concept uses the well-established analytical statistic minimum detectable amount (MDA) as the starting point and assumes MDA detection at a prescribed time post intake. The resulting dose can then be used as an indication of the adequacy or capability of the program for demonstrating compliance with the performance criteria. MDDs can be readily tabulated or plotted to demonstrate the effectiveness of different types of monitoring programs. The inclusion of cost factors for bioassay measurements can allow optimisation.

  11. Minimum detectable dose as a measure of bioassay programme capability.

    PubMed

    Carbaugh, E H

    2003-01-01

    This paper suggests that minimum detectable dose (MDD) be used to describe the capability of bioassay programmes for which intakes are expected to be rare. This allows expression of the capability in units that correspond directly to primary dose limits. The concept uses the well established analytical statistic minimum detectable amount (MDA) as the starting point, and assumes MDA detection at a prescribed time post-intake. The resulting dose can then be used as an indication of the adequacy or capability of the programme for demonstrating compliance with the performance criteria. MDDs can be readily tabulated or plotted to demonstrate the effectiveness of different types of monitoring programmes. The inclusion of cost factors for bioassay measurements can allow optimisation.
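
    The underlying arithmetic is simple: the MDA fixes a minimum detectable intake through the intake retention fraction at the scheduled measurement time, and a dose coefficient converts that intake into committed dose. A sketch with purely hypothetical numbers:

```python
# Sketch of the MDD arithmetic implied above (all numbers hypothetical):
# an intake is detectable only if, at measurement time t, the activity
# predicted per unit intake (the intake retention fraction, IRF) times the
# intake still exceeds the assay's MDA.

mda_bq = 1.0                 # assay minimum detectable amount (Bq) -- hypothetical
irf_t = 1.0e-3               # intake retention fraction at day t   -- hypothetical
dose_coeff_sv_per_bq = 1e-8  # committed dose per unit intake       -- hypothetical

mdi_bq = mda_bq / irf_t                        # minimum detectable intake (Bq)
mdd_msv = mdi_bq * dose_coeff_sv_per_bq * 1e3  # minimum detectable dose (mSv)
print(f"MDI = {mdi_bq:.3g} Bq, MDD = {mdd_msv:.3g} mSv")
```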

  12. Algorithmic detectability threshold of the stochastic block model

    NASA Astrophysics Data System (ADS)

    Kawamoto, Tatsuro

    2018-03-01

    The assumption that the values of model parameters are known or correctly learned, i.e., the Nishimori condition, is one of the requirements for the detectability analysis of the stochastic block model in statistical inference. In practice, however, there is no example demonstrating that we can know the model parameters beforehand, and there is no guarantee that the model parameters can be learned accurately. In this study, we consider the expectation-maximization (EM) algorithm with belief propagation (BP) and derive its algorithmic detectability threshold. Our analysis is not restricted to the community structure but includes general modular structures. Because the algorithm cannot always learn the planted model parameters correctly, the algorithmic detectability threshold is qualitatively different from the one with the Nishimori condition.
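
    As a point of reference for the threshold discussed above: under the Nishimori condition, the detectability limit of the symmetric stochastic block model is the Kesten-Stigum bound (Decelle et al.), which is easy to evaluate. The paper's algorithmic threshold differs from it precisely because the EM/BP algorithm must also learn the parameters. A sketch:

```python
import numpy as np

def ks_detectable(c_in, c_out, q=2):
    """Kesten-Stigum detectability for the symmetric SBM with q equal groups:
    communities are detectable (given the true parameters, i.e. under the
    Nishimori condition) iff |c_in - c_out| > q * sqrt(mean degree)."""
    c_mean = (c_in + (q - 1) * c_out) / q
    return abs(c_in - c_out) > q * np.sqrt(c_mean)

print(ks_detectable(8, 2))   # True : well inside the detectable phase
print(ks_detectable(5, 4))   # False: below the threshold
```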

  13. Multipath detection with the combination of SNR measurements - Example from urban environment

    NASA Astrophysics Data System (ADS)

    Špánik, Peter; Hefty, Ján

    2017-12-01

    Multipath is one of the most severe station-dependent error sources in both static and kinematic positioning. A relatively new and simple detection technique using Signal-to-Noise Ratio (SNR) measurements on three frequencies is presented, based on the idea of Strode and Groves. Exploiting SNR measurements is beneficial in particular because of their unambiguous character. The method rests on the fact that SNR values are closely linked to the estimation of pseudo-ranges and phase measurements during signal correlation processing. Because of this connection, a combination of SNR values can be used to detect anomalous behaviour in the received signal, although a calibration in a low-multipath environment has to be performed beforehand. Under multipath, phase measurements on different frequencies are not affected in the same manner: specular multipath, e.g. from a building wall, introduces an additional path delay that maps differently onto each carrier because of the different wavelengths. Experimental results of multipath detection in an urban environment are presented. The originally proposed method is designed to work with three different frequencies in each epoch, so only GPS Block II-F and Galileo satellites can be used. A simplification of the detection statistic to only two frequencies is therefore made, and results using the GPS and GLONASS systems are presented along with results obtained using the original formula.
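
    A generic sketch of the kind of test this enables — not Strode and Groves' exact formulation — is to calibrate the between-frequency SNR differences in a low-multipath environment and then flag epochs whose differences deviate strongly from the calibrated statistics:

```python
import numpy as np

def multipath_statistic(snr, mean_cal, cov_cal):
    """Mahalanobis-type statistic on between-frequency SNR differences for
    one epoch; grows when multipath perturbs the frequencies unequally.
    (Generic sketch only; calibration values come from a clean environment.)"""
    d = np.array([snr[0] - snr[1], snr[1] - snr[2]])   # two frequency diffs
    r = d - mean_cal
    return float(r @ np.linalg.inv(cov_cal) @ r)       # ~ chi^2_2 if clean

# Calibration statistics from a low-multipath site (hypothetical, dB-Hz):
mean_cal = np.array([3.0, 1.5])
cov_cal = np.diag([0.4, 0.4])
print(multipath_statistic([45.0, 42.2, 40.6], mean_cal, cov_cal))  # clean epoch
print(multipath_statistic([45.0, 37.0, 43.0], mean_cal, cov_cal))  # suspect epoch
```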

  14. Unsupervised change detection of multispectral images based on spatial constraint chi-squared transform and Markov random field model

    NASA Astrophysics Data System (ADS)

    Shi, Aiye; Wang, Chao; Shen, Shaohong; Huang, Fengchen; Ma, Zhenli

    2016-10-01

    The chi-squared transform (CST), as a statistical method, can describe the degree of difference between vectors. CST-based methods operate directly on information stored in the difference image and are simple and effective methods for detecting changes in remotely sensed images that have been registered and aligned. However, the technique does not take spatial information into consideration, which leads to considerable noise in the change detection result. An improved unsupervised change detection method is proposed based on a spatial constraint CST (SCCST) in combination with a Markov random field (MRF) model. First, the mean vector and covariance matrix of the difference image of the bitemporal images are estimated by an iterative trimming method. In each iteration, spatial information is injected to reduce scattered changed points (also known as "salt and pepper" noise). To determine the key parameter of the SCCST method, the confidence level, a pseudotraining dataset is constructed to estimate the optimal value. Then the result of SCCST, as an initial solution of change detection, is further improved by the MRF model. Experiments on simulated and real multitemporal and multispectral images indicate that the proposed method performs well on comprehensive indices compared with other methods.
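
    The plain CST core (before the spatial constraint and MRF refinement contributed by the paper) reduces to a Mahalanobis distance on the band-difference vectors, which is chi-squared distributed with B (bands) degrees of freedom under "no change". A minimal sketch; note that a real implementation estimates the mean and covariance by iterative trimming rather than from all pixels, as done here:

```python
import numpy as np
from scipy.stats import chi2

def cst_change_map(img1, img2, conf=0.99):
    """Flag pixels whose difference-vector Mahalanobis distance exceeds the
    chi-squared quantile at the given confidence level."""
    B = img1.shape[-1]
    D = (img2 - img1).reshape(-1, B).astype(float)
    mu = D.mean(0)
    cov_inv = np.linalg.inv(np.cov(D, rowvar=False))
    y = np.einsum('ij,jk,ik->i', D - mu, cov_inv, D - mu)
    return (y > chi2.ppf(conf, df=B)).reshape(img1.shape[:-1])

rng = np.random.default_rng(5)
t1 = rng.normal(100, 5, (64, 64, 4))
t2 = t1 + rng.normal(0, 2, (64, 64, 4))
t2[20:30, 20:30] += 15                       # simulated change patch
print(cst_change_map(t1, t2).sum(), "changed pixels flagged")
```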

  15. Statistics for the Relative Detectability of Chemicals in Weak Gaseous Plumes in LWIR Hyperspectral Imagery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Metoyer, Candace N.; Walsh, Stephen J.; Tardiff, Mark F.

    2008-10-30

    The detection and identification of weak gaseous plumes using thermal imaging data is complicated by many factors. These include variability due to atmosphere, ground and plume temperature, and background clutter. This paper presents an analysis of one formulation of the physics-based model that describes the at-sensor observed radiance. The motivating question for the analyses performed in this paper is as follows. Given a set of backgrounds, is there a way to predict the background over which the probability of detecting a given chemical will be the highest? Two statistics were developed to address this question. These statistics incorporate data from the long-wave infrared band to predict the background over which chemical detectability will be the highest. These statistics can be computed prior to data collection. As a preliminary exploration into the predictive ability of these statistics, analyses were performed on synthetic hyperspectral images. Each image contained one chemical (either carbon tetrachloride or ammonia) spread across six distinct background types. The statistics were used to generate predictions for the background ranks. Then, the predicted ranks were compared to the empirical ranks obtained from the analyses of the synthetic images. For the simplified images under consideration, the predicted and empirical ranks showed a promising amount of agreement. One statistic accurately predicted the best and worst background for detection in all of the images. Future work may include explorations of more complicated plume ingredients, background types, and noise structures.

  16. Characterization of Inclusion Populations in Mn-Si Deoxidized Steel

    NASA Astrophysics Data System (ADS)

    García-Carbajal, Alfonso; Herrera-Trejo, Martín; Castro-Cedeño, Edgar-Ivan; Castro-Román, Manuel; Martinez-Enriquez, Arturo-Isaias

    2017-12-01

    Four plant heats of Mn-Si deoxidized steel were monitored to follow the evolution of the inclusion population through ladle furnace (LF) treatment and subsequent vacuum treatment (VT). The liquid steel was sampled, and the chemical composition and size distribution of the inclusion populations were characterized. The Gumbel, generalized extreme-value (GEV) and generalized Pareto (GP) distributions were used for the statistical analysis of the inclusion size distributions. The inclusions found at the beginning of the LF treatment were mostly fully liquid SiO2-Al2O3-MnO inclusions, which then evolved into the fully liquid SiO2-Al2O3-CaO-MgO and partly liquid SiO2-CaO-MgO-(Al2O3-MgO) inclusions detected at the end of the VT. The final fully liquid inclusions had a chemical composition desirable for plastic behavior in subsequent metallurgical operations. The GP distribution was found to be unsuitable for the statistical analysis. The GEV approach led to shape parameter values different from the zero value hypothesized by the Gumbel distribution. According to the GEV approach, some of the final inclusion size distributions had statistically significant differences, whereas the Gumbel approach predicted no statistically significant differences. The heats were ranked according to indicators of inclusion cleanliness and a statistical comparison of the size distributions.
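
    The Gumbel-versus-GEV question above amounts to asking whether the fitted GEV shape parameter differs from zero, since the Gumbel distribution is the zero-shape special case of the GEV. A sketch with synthetic maxima (note SciPy's sign convention, c = −ξ, so c = 0 is again the Gumbel case):

```python
from scipy.stats import genextreme

# Synthetic block-maxima of inclusion sizes (hypothetical, in micrometres),
# drawn from a GEV with a genuinely nonzero shape parameter.
max_sizes = genextreme.rvs(c=-0.25, loc=8.0, scale=2.0, size=200,
                           random_state=6)

c_hat, loc_hat, scale_hat = genextreme.fit(max_sizes)
print(f"shape c = {c_hat:.3f}, loc = {loc_hat:.2f}, scale = {scale_hat:.2f}")
# A confidence interval for c that excludes 0 argues against the Gumbel model.
```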

  17. Finding Statistically Significant Communities in Networks

    PubMed Central

    Lancichinetti, Andrea; Radicchi, Filippo; Ramasco, José J.; Fortunato, Santo

    2011-01-01

    Community structure is one of the main structural features of networks, revealing both their internal organization and the similarity of their elementary units. Despite the large variety of methods proposed to detect communities in graphs, there is a great need for multi-purpose techniques able to handle different types of datasets and the subtleties of community structure. In this paper we present OSLOM (Order Statistics Local Optimization Method), the first method capable of detecting clusters in networks while accounting for edge directions, edge weights, overlapping communities, hierarchies and community dynamics. It is based on the local optimization of a fitness function expressing the statistical significance of clusters with respect to random fluctuations, which is estimated with tools of Extreme and Order Statistics. OSLOM can be used alone or as a refinement procedure for partitions/covers delivered by other techniques. We have also implemented sequential algorithms combining OSLOM with other fast techniques, so that the community structure of very large networks can be uncovered. Our method has performance comparable to that of the best existing algorithms on artificial benchmark graphs. Several applications to real networks are shown as well. OSLOM is implemented in freely available software (http://www.oslom.org), and we believe it will be a valuable tool in the analysis of networks. PMID:21559480

  18. Cine phase-contrast MRI evaluation of normal aqueductal cerebrospinal fluid flow according to sex and age.

    PubMed

    Unal, Ozkan; Kartum, Alp; Avcu, Serhat; Etlik, Omer; Arslan, Halil; Bora, Aydin

    2009-12-01

    The aim of this study was cerebrospinal fluid flow quantification in the cerebral aqueduct using the cine phase-contrast magnetic resonance imaging (MRI) technique in both sexes and five different age groups, to provide normative data. Sixty subjects with no cerebral pathology were included in this study and divided into five age groups: ≤14 years, 15-24 years, 25-34 years, 35-44 years, and ≥45 years. Phase, rephase, and magnitude images were acquired with a 1.5 T MR unit at the level of the cerebral aqueduct using a through-plane spoiled gradient echo cine phase-contrast sequence. At this level, peak flow velocity (cm/s), average flow rate (cm/s), average flow (L/min), volumes in cranial and caudal directions (mL), and net volumes (mL) were studied. There was a statistically significant difference in peak flow between the ≤14 years age group and the older age groups. There were no statistically significant differences in average velocity, cranial and caudal volume, net volume, or average flow parameters among the different age groups. Statistically significant differences were not detected in flow parameters between the sexes. When using cine phase-contrast MRI in the cerebral aqueduct, only the peak velocity showed a statistically significant difference between age groups; it was higher in subjects aged ≤14 years than in older age groups. When performing age-dependent clinical studies including adolescents, this should be taken into consideration.

  19. Tumor or abnormality identification from magnetic resonance images using statistical region fusion based segmentation.

    PubMed

    Subudhi, Badri Narayan; Thangaraj, Veerakumar; Sankaralingam, Esakkirajan; Ghosh, Ashish

    2016-11-01

    In this article, a statistical fusion based segmentation technique is proposed to identify different abnormalities in magnetic resonance images (MRI). The proposed scheme follows seed selection, region growing-merging, and fusion of multiple image segments. Initially, an image is divided into a number of blocks and the phase component of the Fourier transform is computed for each block. The phase component of each block reflects its gray-level variation, but neighbouring blocks are strongly correlated. Hence, a singular value decomposition (SVD) technique is applied to generate a singular value for each block, and a thresholding procedure is applied to these singular values to identify edgy and smooth regions and to select seed points for segmentation. For each seed point, a binary segmentation of the complete MRI is performed, yielding one binary image per seed point. A parcel-based statistical fusion process is then used to fuse all the binary images into multiple segments. The effectiveness of the proposed scheme is tested on identifying different abnormalities: prostatic carcinoma detection, tuberculous granuloma identification, and intracranial neoplasm or brain tumor detection. The proposed technique is validated by comparing its results against seven state-of-the-art techniques with six performance evaluation measures. Copyright © 2016 Elsevier Inc. All rights reserved.
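
    The block descriptor described above can be sketched as follows, assuming (for illustration) that the largest singular value of each block's FFT phase matrix is thresholded to separate smooth from edgy blocks; the seed selection, region growing and parcel-based fusion stages are not reproduced:

```python
import numpy as np

def block_phase_singular_values(img, bs=8):
    """For each bs-x-bs block, take the phase of its 2-D FFT and return the
    largest singular value as a compact descriptor of gray-level variation
    (sketch of the idea only)."""
    H, W = img.shape
    sv = np.zeros((H // bs, W // bs))
    for i in range(H // bs):
        for j in range(W // bs):
            block = img[i*bs:(i+1)*bs, j*bs:(j+1)*bs]
            phase = np.angle(np.fft.fft2(block))
            sv[i, j] = np.linalg.svd(phase, compute_uv=False)[0]
    return sv

rng = np.random.default_rng(7)
img = np.zeros((64, 64)); img[:, 32:] = 1.0     # one sharp vertical edge
img += rng.normal(0, 0.05, img.shape)
sv = block_phase_singular_values(img)
seeds = sv > sv.mean() + sv.std()               # crude edgy/smooth threshold
print(seeds.astype(int))
```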

  20. The importance of consumption of the epidermis in malignant melanoma and correlation with clinicopathological prognostic parameters.

    PubMed

    Seçkin, Selda; Ozgün, Elmas

    2011-01-01

    The aim of the study was to investigate the importance of consumption of the epidermis as an additional diagnostic criterion for malignant melanoma and to evaluate its relationship to clinicopathological findings. The age, gender, localization of the lesion, and histopathological parameters such as tumor type, Breslow thickness, ulceration, Clark's level, mitoses/mm², and lymphocytic infiltration were noted in 40 malignant melanoma cases. Consumption of the epidermis (COE), defined as thinning of the epidermis and loss of rete ridges, was evaluated in tumor sections and noted as (+) or (-). Furthermore, COE was compared with clinical and histopathological parameters. The Shapiro-Wilk and logistic regression tests were used for statistical analysis. Results were accepted as significant if the p value was < 0.05. COE was detected in 60% (24/40) of malignant melanoma cases. A positive association was present between COE and head and neck localization (p = 0.698), superficial spreading melanoma (p = 0.341), ulceration (p = 0.097) and brisk lymphocytic infiltration (p = 0.200), but the results were not statistically significant. COE was more frequently detected in males, but the difference was not statistically significant (p = 0.796). There was no correlation or significant statistical association between COE and age, Breslow thickness, Clark's level or the mitotic index. The detection of COE in most of the patients suggests that COE could be a histopathological criterion in the diagnosis of malignant melanoma. The frequent association between COE and the presence of ulceration could also direct attention to COE as regards prognostic importance.

  1. Statistical method evaluation for differentially methylated CpGs in base resolution next-generation DNA sequencing data.

    PubMed

    Zhang, Yun; Baheti, Saurabh; Sun, Zhifu

    2018-05-01

    High-throughput bisulfite methylation sequencing such as reduced representation bisulfite sequencing (RRBS), Agilent SureSelect Human Methyl-Seq (Methyl-seq) or whole-genome bisulfite sequencing is commonly used for base-resolution methylome research. These data are represented either by the ratio of methylated cytosines versus total coverage at a CpG site or by the numbers of methylated and unmethylated cytosines. Multiple statistical methods can be used to detect differentially methylated CpGs (DMCs) between conditions, and these methods are often the basis for the next step of differentially methylated region identification. The ratio data have the flexibility of fitting many linear models, whereas the raw count data take the coverage information into account. There is an array of options for DMC detection with each datatype; however, it is not clear which statistical method is optimal. In this study, we systematically evaluated four statistical methods on methylation ratio data and four methods on count-based data and compared their performance with regard to type I error control, sensitivity and specificity of DMC detection, and computational resource demands, using real RRBS data along with simulation. Our results show that the ratio-based tests are generally more conservative (less sensitive) than the count-based tests. However, some count-based methods have high false-positive rates and should be avoided. The beta-binomial model gives a good balance between sensitivity and specificity and is the preferred method. Selection of methods in different settings, signal versus noise, and sample size estimation are also discussed.
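
    A minimal sketch of the preferred beta-binomial approach for a single CpG: a likelihood-ratio test of group-specific versus common methylation level, with the dispersion fixed for brevity (the tools evaluated in such studies estimate it from the data):

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import betabinom, chi2

def bb_loglik(mu, rho, meth, cov):
    """Beta-binomial log-likelihood parameterized by mean mu, dispersion rho."""
    a = mu * (1 - rho) / rho
    b = (1 - mu) * (1 - rho) / rho
    return betabinom.logpmf(meth, cov, a, b).sum()

def bb_lrt(meth1, cov1, meth2, cov2, rho=0.05):
    """LRT for a group difference in methylation level at one CpG.
    Returns the chi^2_1 p-value (dispersion fixed for simplicity)."""
    nll = lambda m, c: minimize_scalar(
        lambda mu: -bb_loglik(mu, rho, m, c),
        bounds=(1e-4, 1 - 1e-4), method='bounded').fun
    ll_alt = -(nll(meth1, cov1) + nll(meth2, cov2))
    ll_null = -nll(np.r_[meth1, meth2], np.r_[cov1, cov2])
    return chi2.sf(2 * (ll_alt - ll_null), df=1)

cov1 = np.array([30, 25, 40]); meth1 = np.array([24, 21, 33])  # ~80% methylated
cov2 = np.array([28, 35, 22]); meth2 = np.array([11, 15, 9])   # ~42% methylated
print(f"p = {bb_lrt(meth1, cov1, meth2, cov2):.4g}")
```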

  2. Impact of same day vs day before pre-operative lymphoscintigraphy for sentinel lymph node biopsy for early breast cancer (local Australian experience).

    PubMed

    Huang, Yang Yang; Maurel, Amelie; Hamza, Saud; Jackson, Lee; Al-Ogaili, Zeyad

    2018-06-01

    To assess the impact of delayed vs immediate pre-operative lymphoscintigraphy (LSG) for sentinel lymph node biopsy in a single Australian tertiary breast cancer centre. Retrospective cohort study analysing patients with breast cancer or DCIS who underwent lumpectomy or mastectomy with pre-operative LSG and intra-operative sentinel lymph node biopsy from January 2015 to June 2016. A total of 182 LSG were performed. Group A patients had LSG mapping the day before surgery (n = 79) and Group B had LSG mapping on the day of surgery (n = 103). The overall LSG localisation rate was 97.3%, and no statistical difference was detected between the two groups. Sentinel lymph nodes (SLNs) were identified in 99.6% of patients. The number of nodes excised was slightly higher in Group A (1.90 vs 1.72); however, this was not statistically significant. In addition, the number of nodes on histopathology and the incidence of second-echelon nodal detection were also similar between the two groups, without statistical significance. In conclusion, the 2-day LSG protocol had no impact on overall SLNB and LSG detection rates; although slightly more second-tier nodes were seen, this did not translate into a difference in the number of harvested nodes between the two groups. The 2-day LSG allows for greater flexibility in theatre planning and more efficient use of theatre time. We recommend a dose of 40 MBq of Tc-99m pertechnetate-labelled colloid be given the day prior to surgery, within a 24-hour timeframe. © 2017 The Royal Australian and New Zealand College of Radiologists.

  3. REANALYSIS OF F-STATISTIC GRAVITATIONAL-WAVE SEARCHES WITH THE HIGHER CRITICISM STATISTIC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, M. F.; Melatos, A.; Delaigle, A.

    2013-04-01

    We propose a new method of gravitational-wave detection using a modified form of higher criticism, a statistical technique introduced by Donoho and Jin. Higher criticism is designed to detect a group of sparse, weak sources, none of which are strong enough to be reliably estimated or detected individually. We apply higher criticism as a second-pass method to synthetic F-statistic and C-statistic data for a monochromatic periodic source in a binary system and quantify the improvement relative to the first-pass methods. We find that higher criticism on C-statistic data is approximately 6% more sensitive than the C-statistic alone under optimal conditions (i.e., binary orbit known exactly), and the relative advantage increases as the error in the orbital parameters increases. Higher criticism is robust even when the source is not monochromatic (e.g., phase-wandering in an accreting system). Applying higher criticism to a phase-wandering source over multiple time intervals gives a ≳30% increase in detectability with few assumptions about the frequency evolution. By contrast, in all-sky searches for unknown periodic sources, which are dominated by the brightest source, second-pass higher criticism does not provide any benefits over a first-pass search.
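
    The (unmodified) higher criticism statistic of Donoho and Jin is simple to compute from sorted p-values: it scans for a small collective excess of unusually significant values that no single test would flag. A sketch:

```python
import numpy as np

def higher_criticism(pvals, alpha0=0.1):
    """Donoho-Jin higher criticism over the smallest fraction alpha0 of
    sorted p-values: HC = max sqrt(n)*(i/n - p_(i)) / sqrt(p_(i)(1-p_(i)))."""
    p = np.sort(np.asarray(pvals, dtype=float))
    n = len(p)
    i = np.arange(1, n + 1)
    hc = np.sqrt(n) * (i / n - p) / np.sqrt(p * (1 - p))
    k = max(1, int(alpha0 * n))
    return hc[:k].max()

rng = np.random.default_rng(8)
noise = rng.uniform(size=1000)                       # H0: uniform p-values
sparse = noise.copy()
sparse[:20] = rng.uniform(0, 1e-3, 20)               # a few weak sources
print(higher_criticism(noise), higher_criticism(sparse))
```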

  4. Impact of 1p/19q codeletion on the diagnosis and prognosis of different grades of meningioma.

    PubMed

    Basaran, Recep; Uslu, Serap; Gucluer, Berrin; Onoz, Mustafa; Isik, Nejat; Tiryaki, Mehmet; Yakicier, Cengiz; Sav, Aydin; Elmaci, Ilhan

    2016-10-01

    Meningiomas are one of the most common tumours to affect the central nervous system. Genetic mutations are important in meningeal tumourigenesis, progression and prognosis. In this study, we aimed to examine the effect of 1p/19q deletion on the diagnosis and prognosis of meningioma subtypes using the fluorescence in situ hybridization (FISH) method. Twenty-four patients with meningioma were retrospectively studied. Tumour samples were obtained from 10 typical, 11 atypical and three anaplastic malignant meningiomas. The most representative tumour sections were screened for 1p/19q deletion using the FISH method. Of the 24 patients, eight (33.3%) were women and 16 (66.7%) were men. The mean age was 56.6 years. Higher-grade meningiomas were usually seen in males and had a higher rate of deletion on 1p (p = 0.001). There was a statistically significant difference between the grades and the rate of deletion on 19q (p = 0.042), and between the grades and the rates of polysomy, monosomy and amplification on 19q (p = 0.002, p = 0.001 and p = 0.002, respectively). There was no statistical difference between 1p/19q codeletion and the grades of meningioma (p > 0.05). We detected a higher level of Ki-67 in cases with codeletion, but the difference was not statistically significant (p = 0.0553). Deletion on 1p, as well as deletion, polysomy, monosomy and amplification on 19q, are detected more frequently in high-grade meningiomas. This amplification is most likely due to the amplification of oncogenes.

  5. Evaluation of satellite rainfall estimates for drought and flood monitoring in Mozambique

    USGS Publications Warehouse

    Tote, Carolien; Patricio, Domingos; Boogaard, Hendrik; van der Wijngaart, Raymond; Tarnavsky, Elena; Funk, Christopher C.

    2015-01-01

    Satellite-derived rainfall products are useful for drought and flood early warning and overcome the problem of sparse, unevenly distributed and erratic rain gauge observations, provided their accuracy is well known. Mozambique is highly vulnerable to extreme weather events such as major droughts and floods, and thus an understanding of the strengths and weaknesses of different rainfall products is valuable. Three dekadal (10-day) gridded satellite rainfall products (TAMSAT African Rainfall Climatology And Time-series (TARCAT) v2.0, Famine Early Warning System NETwork (FEWS NET) Rainfall Estimate (RFE) v2.0, and Climate Hazards Group InfraRed Precipitation with Stations (CHIRPS)) are compared to independent gauge data (2001-2012). This is done using pairwise comparison statistics to evaluate the performance in estimating rainfall amounts and categorical statistics to assess rain-detection capabilities. The analysis was performed for different rainfall categories, over the seasonal cycle and for regions dominated by different weather systems. Overall, satellite products overestimate low and underestimate high dekadal rainfall values. The RFE and CHIRPS products perform comparably well, generally outperforming TARCAT on the majority of statistical measures of skill. TARCAT best captures the relative frequency of rainfall events, while RFE underestimates and CHIRPS overestimates the rainfall event frequency. Differences in product performance disappear with higher rainfall, and all products achieve better results during the wet season. During the cyclone season, CHIRPS shows the best results, while RFE outperforms the other products for lower dekadal rainfall. Products blending thermal infrared and passive microwave imagery perform better than infrared-only products, particularly where meteorological patterns are more complex, such as over the coastal, central and south regions of Mozambique, where precipitation is influenced by frontal systems.
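
    The categorical rain-detection skill referred to above is conventionally summarized from a 2×2 contingency table of satellite-versus-gauge rain events. A sketch with hypothetical counts:

```python
def skill_scores(hits, misses, false_alarms):
    """Standard categorical verification scores from a 2x2 contingency table."""
    pod = hits / (hits + misses)                    # probability of detection
    far = false_alarms / (hits + false_alarms)      # false alarm ratio
    csi = hits / (hits + misses + false_alarms)     # critical success index
    bias = (hits + false_alarms) / (hits + misses)  # frequency bias
    return pod, far, csi, bias

# Hypothetical dekad-level counts for one product against gauges:
print("POD=%.2f FAR=%.2f CSI=%.2f bias=%.2f" % skill_scores(620, 90, 140))
```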

  6. Could edaravone prevent gentamicin ototoxicity? An experimental study.

    PubMed

    Turan, M; Ciğer, E; Arslanoğlu, S; Börekci, H; Önal, K

    2017-02-01

    Clinical application of gentamicin may cause nephrotoxicity and ototoxicity. Ours is the first study to investigate the protective effects of edaravone against gentamicin-induced ototoxicity. We investigated the protective effect of intraperitoneal (i.p.) edaravone application against gentamicin-induced ototoxicity in guinea pigs. Fourteen guinea pigs were divided into two equal groups, a control group and a study group. One hundred sixty milligrams per kilogram of subcutaneous gentamicin and 0.3 mL of i.p. saline were administered simultaneously once daily to the seven guinea pigs in the control group (group 1). One hundred sixty milligrams per kilogram of gentamicin was administered subcutaneously and 3 mg/kg of edaravone intraperitoneally once daily for 7 days to the seven guinea pigs in the study group (group 2). Following the drug application, auditory brainstem response measurements were performed for the left ear on the 3rd and 7th days. Hearing threshold values of group 1 and group 2 measured on the 3rd day of the study were 57.14 ± 4.88 and 82.86 ± 7.56, respectively; this difference was statistically significant (p < 0.05). Hearing threshold values of group 1 and group 2 measured on the 7th day of the study were 87.14 ± 4.88 and 62.86 ± 4.88, respectively; this difference was statistically significant (p < 0.05). A statistically significant difference between the average threshold values of the edaravone-administered group 2 and those of group 1 without edaravone was found. These differences show that systemic edaravone administration could diminish the ototoxic effects of gentamicin and the severity of the hearing loss.

  7. Quantitative Amyloid Imaging in Autosomal Dominant Alzheimer's Disease: Results from the DIAN Study Group.

    PubMed

    Su, Yi; Blazey, Tyler M; Owen, Christopher J; Christensen, Jon J; Friedrichsen, Karl; Joseph-Mathurin, Nelly; Wang, Qing; Hornbeck, Russ C; Ances, Beau M; Snyder, Abraham Z; Cash, Lisa A; Koeppe, Robert A; Klunk, William E; Galasko, Douglas; Brickman, Adam M; McDade, Eric; Ringman, John M; Thompson, Paul M; Saykin, Andrew J; Ghetti, Bernardino; Sperling, Reisa A; Johnson, Keith A; Salloway, Stephen P; Schofield, Peter R; Masters, Colin L; Villemagne, Victor L; Fox, Nick C; Förster, Stefan; Chen, Kewei; Reiman, Eric M; Xiong, Chengjie; Marcus, Daniel S; Weiner, Michael W; Morris, John C; Bateman, Randall J; Benzinger, Tammie L S

    2016-01-01

    Amyloid imaging plays an important role in the research and diagnosis of dementing disorders. Substantial variation in the quantitative methods used to measure brain amyloid burden exists in the field. The aim of this work is to investigate the impact of methodological variations on the quantification of amyloid burden using data from the Dominantly Inherited Alzheimer's Network (DIAN), an autosomal dominant Alzheimer's disease population. Cross-sectional and longitudinal [11C]-Pittsburgh Compound B (PiB) PET imaging data from the DIAN study were analyzed. Four candidate reference regions were investigated for estimation of brain amyloid burden. A regional spread function based technique was also investigated for the correction of partial volume effects. Cerebellar cortex, brain-stem, and white matter regions all had stable tracer retention during the course of disease. Partial volume correction consistently improved sensitivity to group differences and longitudinal changes over time. White matter referencing improved statistical power in detecting longitudinal changes in relative tracer retention; however, the reason for this improvement is unclear and requires further investigation. Full dynamic acquisition and kinetic modeling improved statistical power, although they may add cost and time. Several technical variations to amyloid burden quantification were examined in this study. Partial volume correction emerged as the strategy that most consistently improved statistical power for the detection of both longitudinal changes and across-group differences. For the autosomal dominant Alzheimer's disease population with PiB imaging, utilizing the brainstem as a reference region with partial volume correction may be optimal for current interventional trials. Further investigation of technical issues in quantitative amyloid imaging in different study populations using different amyloid imaging tracers is warranted.

  8. Quantitative Amyloid Imaging in Autosomal Dominant Alzheimer’s Disease: Results from the DIAN Study Group

    PubMed Central

    Su, Yi; Blazey, Tyler M.; Owen, Christopher J.; Christensen, Jon J.; Friedrichsen, Karl; Joseph-Mathurin, Nelly; Wang, Qing; Hornbeck, Russ C.; Ances, Beau M.; Snyder, Abraham Z.; Cash, Lisa A.; Koeppe, Robert A.; Klunk, William E.; Galasko, Douglas; Brickman, Adam M.; McDade, Eric; Ringman, John M.; Thompson, Paul M.; Saykin, Andrew J.; Ghetti, Bernardino; Sperling, Reisa A.; Johnson, Keith A.; Salloway, Stephen P.; Schofield, Peter R.; Masters, Colin L.; Villemagne, Victor L.; Fox, Nick C.; Förster, Stefan; Chen, Kewei; Reiman, Eric M.; Xiong, Chengjie; Marcus, Daniel S.; Weiner, Michael W.; Morris, John C.; Bateman, Randall J.; Benzinger, Tammie L. S.

    2016-01-01

    Amyloid imaging plays an important role in the research and diagnosis of dementing disorders. Substantial variation in the quantitative methods used to measure brain amyloid burden exists in the field. The aim of this work is to investigate the impact of methodological variations on the quantification of amyloid burden using data from the Dominantly Inherited Alzheimer’s Network (DIAN), an autosomal dominant Alzheimer’s disease population. Cross-sectional and longitudinal [11C]-Pittsburgh Compound B (PiB) PET imaging data from the DIAN study were analyzed. Four candidate reference regions were investigated for estimation of brain amyloid burden. A regional spread function based technique was also investigated for the correction of partial volume effects. Cerebellar cortex, brain-stem, and white matter regions all had stable tracer retention during the course of disease. Partial volume correction consistently improved sensitivity to group differences and longitudinal changes over time. White matter referencing improved statistical power in detecting longitudinal changes in relative tracer retention; however, the reason for this improvement is unclear and requires further investigation. Full dynamic acquisition and kinetic modeling improved statistical power, although they may add cost and time. Several technical variations to amyloid burden quantification were examined in this study. Partial volume correction emerged as the strategy that most consistently improved statistical power for the detection of both longitudinal changes and across-group differences. For the autosomal dominant Alzheimer’s disease population with PiB imaging, utilizing the brainstem as a reference region with partial volume correction may be optimal for current interventional trials. Further investigation of technical issues in quantitative amyloid imaging in different study populations using different amyloid imaging tracers is warranted. PMID:27010959

  9. Subjective memory complaints, depressive symptoms and cognition in patients attending a memory outpatient clinic.

    PubMed

    Lehrner, J; Moser, D; Klug, S; Gleiß, A; Auff, E; Dal-Bianco, P; Pusswald, G

    2014-03-01

    The goals of this study were to establish the prevalence of subjective memory complaints (SMC) and depressive symptoms (DS) and their relation to cognitive functioning and cognitive status in an outpatient memory clinic cohort. Two hundred forty-eight cognitively healthy controls and 581 consecutive patients with cognitive complaints who fulfilled the inclusion criteria were included in the study. A statistically significant difference (p < 0.001) between the control group and the patient group regarding mean SMC was detected: 7.7% of controls reported a considerable degree of SMC, whereas 35.8% of patients did. Additionally, a statistically significant difference (p < 0.001) between controls and the patient group regarding the Beck depression score was detected: 16.6% of controls showed a clinically relevant degree of DS, whereas 48.5% of patients did. An analysis of variance revealed a statistically significant difference across all four groups (control group, SCI group, naMCI group, aMCI group) (p < 0.001): whereas 8% of controls reported a considerable degree of SMC, 34% of the SCI group, 31% of the naMCI group, and 54% of the aMCI group did. A two-factor analysis of variance with the factors cognitive status (controls, SCI, naMCI, aMCI) and depressive status (depressed vs. not depressed) and SMC as the dependent variable revealed that both factors were significant (p < 0.001), whereas their interaction was not (p = 0.820). A large proportion of patients seeking help in a memory outpatient clinic report considerable SMC, with an increasing degree from cognitively healthy elderly to aMCI. Depressive status increases SMC consistently across groups with different cognitive status.

  10. [Gohieria fusca found in dust of air-conditioner filters].

    PubMed

    Qiang, Chai; Xiao-Dong, Zhan; Wei, Guo; Chao-Pin, Li

    2017-09-25

    To investigate the contamination status of Gohieria fusca in air-conditioner filters in different places in Wuhu City, dust samples were collected from the filters of air-conditioners in dining rooms, shopping malls, hotels and households between June and September 2013, and G. fusca was detected in the dust samples. Of the 430 dust samples collected, 98 were positive for G. fusca, a breeding rate of 22.79%. The differences in the breeding rates of G. fusca among the different places were statistically significant (χ² = 18.294, P < 0.05). In a total of 510.5 g of dust samples, 783 G. fusca mites were detected, an average breeding density of 1.53 mites/g. G. fusca breeding in the dust of air-conditioner filters in Wuhu City is severe.

  11. Imaging different components of a tectonic tremor sequence in southwestern Japan using an automatic statistical detection and location method

    NASA Astrophysics Data System (ADS)

    Poiata, Natalia; Vilotte, Jean-Pierre; Bernard, Pascal; Satriano, Claudio; Obara, Kazushige

    2018-06-01

    In this study, we demonstrate the capability of an automatic network-based detection and location method to extract and analyse different components of tectonic tremor activity by analysing a 9-day energetic tectonic tremor sequence occurring at the downdip extension of the subducting slab in southwestern Japan. The applied method exploits the coherency of multiscale, frequency-selective characteristics of non-stationary signals recorded across the seismic network. Use of different characteristic functions, in the signal processing step of the method, allows to extract and locate the sources of short-duration impulsive signal transients associated with low-frequency earthquakes and of longer-duration energy transients during the tectonic tremor sequence. Frequency-dependent characteristic functions, based on higher-order statistics' properties of the seismic signals, are used for the detection and location of low-frequency earthquakes. This allows extracting a more complete (˜6.5 times more events) and time-resolved catalogue of low-frequency earthquakes than the routine catalogue provided by the Japan Meteorological Agency. As such, this catalogue allows resolving the space-time evolution of the low-frequency earthquakes activity in great detail, unravelling spatial and temporal clustering, modulation in response to tide, and different scales of space-time migration patterns. In the second part of the study, the detection and source location of longer-duration signal energy transients within the tectonic tremor sequence is performed using characteristic functions built from smoothed frequency-dependent energy envelopes. This leads to a catalogue of longer-duration energy sources during the tectonic tremor sequence, characterized by their durations and 3-D spatial likelihood maps of the energy-release source regions. The summary 3-D likelihood map for the 9-day tectonic tremor sequence, built from this catalogue, exhibits an along-strike spatial segmentation of the long-duration energy-release regions, matching the large-scale clustering features evidenced from the low-frequency earthquake's activity analysis. Further examination of the two catalogues showed that the extracted short-duration low-frequency earthquakes activity coincides in space, within about 10-15 km distance, with the longer-duration energy sources during the tectonic tremor sequence. This observation provides a potential constraint on the size of the longer-duration energy-radiating source region in relation with the clustering of low-frequency earthquakes activity during the analysed tectonic tremor sequence. We show that advanced statistical network-based methods offer new capabilities for automatic high-resolution detection, location and monitoring of different scale-components of tectonic tremor activity, enriching existing slow earthquakes catalogues. Systematic application of such methods to large continuous data sets will allow imaging the slow transient seismic energy-release activity at higher resolution, and therefore, provide new insights into the underlying multiscale mechanisms of slow earthquakes generation.
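
    The detection step described above rests on frequency-dependent characteristic functions built from higher-order statistics of the recorded signals. As a purely illustrative sketch (not the authors' code), a kurtosis-based characteristic function can be computed in a sliding window so that impulsive, LFE-like transients stand out from the background noise; the window length, sampling rate and synthetic data below are assumptions.

```python
import numpy as np
from scipy.stats import kurtosis

def kurtosis_cf(trace, fs, win_sec=2.0):
    """Sliding-window excess kurtosis: high values flag impulsive arrivals."""
    win = int(win_sec * fs)
    cf = np.zeros(len(trace))
    for i in range(win, len(trace)):
        cf[i] = kurtosis(trace[i - win:i])
    return cf

# Demo: Gaussian noise with one short impulsive, LFE-like burst
rng = np.random.default_rng(0)
fs = 100.0
x = rng.normal(size=6000)
x[3000:3020] += 8.0 * rng.normal(size=20)  # impulsive transient
cf = kurtosis_cf(x, fs)
print("characteristic function peaks near sample", int(np.argmax(cf)))
```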

  12. The HI Content of Galaxies as a Function of Local Density and Large-Scale Environment

    NASA Astrophysics Data System (ADS)

    Thoreen, Henry; Cantwell, Kelly; Maloney, Erin; Cane, Thomas; Brough Morris, Theodore; Flory, Oscar; Raskin, Mark; Crone-Odekon, Mary; ALFALFA Team

    2017-01-01

    We examine the HI content of galaxies as a function of environment, based on a catalogue of 41527 galaxies that are part of the 70% complete Arecibo Legacy Fast-ALFA (ALFALFA) survey. We use nearest-neighbor methods to characterize local environment, and a modified version of the algorithm developed for the Galaxy and Mass Assembly (GAMA) survey to classify large-scale environment as group, filament, tendril, or void. We compare the HI content in these environments using statistics that include both HI detections and the upper limits on detections from ALFALFA. The large size of the sample allows us to statistically compare the HI content in different environments for early-type galaxies as well as late-type galaxies. This work is supported by NSF grants AST-1211005 and AST-1637339, the Skidmore Faculty-Student Summer Research program, and the Schupf Scholars program.

  13. Whole-body concentrations of elements in three fish species from offshore oil platforms and natural areas in the Southern California Bight, USA

    USGS Publications Warehouse

    Love, Milton S.; Saiki, Michael K.; May, Thomas W.; Yee, Julie L.

    2013-01-01

    Forty-two elements were excluded from statistical comparisons as they (1) consisted of major cations that were unlikely to accumulate to potentially toxic concentrations; (2) were not detected by the analytical procedures; or (3) were detected at concentrations too low to yield reliable quantitative measurements. The remaining 21 elements consisted of aluminum, arsenic, barium, cadmium, chromium, cobalt, copper, gallium, iron, lead, lithium, manganese, mercury, nickel, rubidium, selenium, strontium, tin, titanium, vanadium, and zinc. Statistical comparisons of these elements indicated that none consistently exhibited higher concentrations at oil platforms than at natural areas. However, the concentrations of copper, selenium, titanium, and vanadium in Pacific sanddab were unusual because small individuals exhibited either no differences between oil platforms and natural areas or significantly lower concentrations at oil platforms than at natural areas, whereas large individuals exhibited significantly higher concentrations at oil platforms than at natural areas.

  14. Evaluation and Applications of Cloud Climatologies from CALIOP

    NASA Technical Reports Server (NTRS)

    Winker, David; Getzewitch, Brian; Vaughan, Mark

    2008-01-01

    Clouds have a major impact on the Earth radiation budget and differences in the representation of clouds in global climate models are responsible for much of the spread in predicted climate sensitivity. Existing cloud climatologies, against which these models can be tested, have many limitations. The CALIOP lidar, carried on the CALIPSO satellite, has now acquired over two years of nearly continuous cloud and aerosol observations. This dataset provides an improved basis for the characterization of 3-D global cloudiness. Global average cloud cover measured by CALIOP is about 75%, significantly higher than for existing cloud climatologies due to the sensitivity of CALIOP to optically thin cloud. Day/night biases in cloud detection appear to be small. This presentation will discuss detection sensitivity and other issues associated with producing a cloud climatology, characteristics of cloud cover statistics derived from CALIOP data, and applications of those statistics.

  15. Detecting Answer Copying Using Alternate Test Forms and Seat Locations in Small-Scale Examinations

    ERIC Educational Resources Information Center

    van der Ark, L. Andries; Emons, Wilco H. M.; Sijtsma, Klaas

    2008-01-01

    Two types of answer-copying statistics for detecting copiers in small-scale examinations are proposed. One statistic identifies the "copier-source" pair, and the other in addition suggests who is copier and who is source. Both types of statistics can be used when the examination has alternate test forms. A simulation study shows that the…

  16. CAD scheme for detection of hemorrhages and exudates in ocular fundus images

    NASA Astrophysics Data System (ADS)

    Hatanaka, Yuji; Nakagawa, Toshiaki; Hayashi, Yoshinori; Mizukusa, Yutaka; Fujita, Akihiro; Kakogawa, Masakatsu; Kawase, Kazuhide; Hara, Takeshi; Fujita, Hiroshi

    2007-03-01

    This paper describes a method for detecting hemorrhages and exudates in ocular fundus images. The detection of hemorrhages and exudates is important in order to diagnose diabetic retinopathy. Diabetic retinopathy is one of the most significant factors contributing to blindness, and early detection and treatment are important. In this study, hemorrhages and exudates were automatically detected in fundus images without using fluorescein angiograms. Subsequently, the blood vessel regions incorrectly detected as hemorrhages were eliminated by first examining the structure of the blood vessels and then evaluating the length-to-width ratio. Finally, the false positives were eliminated by checking the following features extracted from candidate images: the number of pixels, contrast, 13 features calculated from the co-occurrence matrix, two features based on gray-level difference statistics, and two features calculated from the extrema method. The sensitivity of detecting hemorrhages in the fundus images was 85% and that of detecting exudates was 77%. Our fully automated scheme could accurately detect hemorrhages and exudates.

  17. Using Statistical Process Control for detecting anomalies in multivariate spatiotemporal Earth Observations

    NASA Astrophysics Data System (ADS)

    Flach, Milan; Mahecha, Miguel; Gans, Fabian; Rodner, Erik; Bodesheim, Paul; Guanche-Garcia, Yanira; Brenning, Alexander; Denzler, Joachim; Reichstein, Markus

    2016-04-01

    The number of available Earth observations (EOs) is currently substantially increasing. Detecting anomalous patterns in these multivariate time series is an important step in identifying changes in the underlying dynamical system. Likewise, data quality issues might result in anomalous multivariate data constellations and have to be identified before corrupting subsequent analyses. In industrial applications, a common strategy is to monitor production chains with several sensors coupled to some statistical process control (SPC) algorithm. The basic idea is to raise an alarm when the sensor data depict some anomalous pattern according to the SPC, i.e. the production chain is considered 'out of control'. In fact, the industrial applications are conceptually similar to the on-line monitoring of EOs. However, algorithms used in the context of SPC or process monitoring are rarely considered for supervising multivariate spatio-temporal Earth observations. The objective of this study is to exploit the potential and transferability of SPC concepts to Earth system applications. We compare a range of different algorithms typically applied by SPC systems and evaluate their capability to detect, e.g., known extreme events in land surface processes. Specifically, two main issues are addressed: (1) identifying the most suitable combination of data pre-processing and detection algorithm for a specific type of event, and (2) analyzing the limits of the individual approaches with respect to the magnitude and spatio-temporal size of the event, as well as the data's signal-to-noise ratio. Extensive artificial data sets that represent the typical properties of Earth observations are used in this study. Our results show that the majority of the algorithms used can be considered for the detection of multivariate spatiotemporal events and directly transferred to real Earth observation data as currently assembled in different projects at the European scale, e.g. http://baci-h2020.eu/index.php/ and http://earthsystemdatacube.net/. Known anomalies such as the Russian heatwave are detected as well as anomalies which are not detectable with univariate methods.
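
    To make the SPC idea concrete, here is a minimal sketch of one classical algorithm of this family, a Hotelling's T² control chart; the in-control training period, the chi-square control limit and the injected anomaly are illustrative assumptions, not details from the study.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(1)
p = 3                                   # number of observed variables
train = rng.normal(size=(500, p))       # in-control reference period
mu = train.mean(axis=0)
Sinv = np.linalg.inv(np.cov(train, rowvar=False))

def t2(x):
    """Hotelling's T^2 distance of one observation from the in-control model."""
    d = x - mu
    return float(d @ Sinv @ d)

ucl = chi2.ppf(0.999, df=p)             # chi-square approximation of the limit
stream = rng.normal(size=(200, p))
stream[150:] += 3.0                     # injected multivariate anomaly
alarms = [t for t, x in enumerate(stream) if t2(x) > ucl]
print("first out-of-control alarm at time step", alarms[0])
```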

  18. DALMATIAN: An Algorithm for Automatic Cell Detection and Counting in 3D.

    PubMed

    Shuvaev, Sergey A; Lazutkin, Alexander A; Kedrov, Alexander V; Anokhin, Konstantin V; Enikolopov, Grigori N; Koulakov, Alexei A

    2017-01-01

    Current 3D imaging methods, including optical projection tomography, light-sheet microscopy, block-face imaging, and serial two-photon tomography, enable visualization of large samples of biological tissue. Large volumes of data obtained at high resolution require the development of automatic image processing techniques, such as algorithms for automatic cell detection or, more generally, point-like object detection. Current approaches to automated cell detection suffer from difficulties in detecting particular cell types and in handling cell populations of differing brightness, non-uniform staining, and overlapping cells. In this study, we present a set of algorithms for robust automatic cell detection in 3D. Our algorithms are suitable for, but not limited to, whole brain regions and individual brain sections. We used a watershed procedure to split regional maxima representing overlapping cells. We developed a bootstrap Gaussian fit procedure to evaluate the statistical significance of detected cells. We compared the cell detection quality of our algorithm and other software using 42 samples, representing 6 staining and imaging techniques. The results provided by our algorithm matched manual expert quantification with signal-to-noise-dependent confidence, including samples with cells of different brightness, non-uniform staining, and overlapping cells, for whole brain regions and individual tissue sections. Our algorithm provided the best cell detection quality among the tested free and commercial software.
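
    A simplified 2-D illustration of the maxima-plus-watershed splitting step is sketched below (the published pipeline works in 3-D and adds the bootstrap Gaussian-fit significance test); the synthetic image, threshold and minimum peak distance are assumptions.

```python
import numpy as np
from skimage.feature import peak_local_max
from skimage.segmentation import watershed
from skimage.filters import gaussian

# Synthetic image: two overlapping blurred "cells"
img = np.zeros((64, 64))
img[24, 24] = img[30, 34] = 1.0
img = gaussian(img, sigma=5)

mask = img > 0.2 * img.max()
coords = peak_local_max(img, min_distance=3, labels=mask.astype(int))
markers = np.zeros(img.shape, dtype=int)
markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)  # one marker per maximum
labels = watershed(-img, markers, mask=mask)  # split the touching cells
print("cells detected:", labels.max())        # expected: 2
```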

  19. Quantifying (dis)agreement between direct detection experiments in a halo-independent way

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feldstein, Brian; Kahlhoefer, Felix, E-mail: brian.feldstein@physics.ox.ac.uk, E-mail: felix.kahlhoefer@physics.ox.ac.uk

    We propose an improved method to study recent and near-future dark matter direct detection experiments with small numbers of observed events. Our method determines in a quantitative and halo-independent way whether the experiments point towards a consistent dark matter signal and identifies the best-fit dark matter parameters. To achieve true halo independence, we apply a recently developed method based on finding the velocity distribution that best describes a given set of data. For a quantitative global analysis we construct a likelihood function suitable for small numbers of events, which allows us to determine the best-fit particle physics properties of dark matter considering all experiments simultaneously. Based on this likelihood function we propose a new test statistic that quantifies how well the proposed model fits the data and how large the tension between different direct detection experiments is. We perform Monte Carlo simulations in order to determine the probability distribution function of this test statistic and to calculate the p-value for both the dark matter hypothesis and the background-only hypothesis.
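
    The Monte Carlo step can be sketched generically: pseudo-experiments are drawn under the best-fit model, the test statistic is recomputed for each, and the p-value is the fraction of simulations at least as extreme as the observation. The Poisson event model and toy statistic below are placeholders, not the paper's halo-independent likelihood.

```python
import numpy as np

rng = np.random.default_rng(2)

def test_statistic(counts, expected):
    """Toy Poisson likelihood-ratio-like statistic for small event counts."""
    return 2.0 * np.sum(expected - counts
                        + counts * np.log(np.maximum(counts, 1) / expected))

expected = np.array([2.0, 3.5, 1.2])   # best-fit predictions per experiment
observed = np.array([5, 1, 0])
t_obs = test_statistic(observed, expected)

sims = rng.poisson(expected, size=(20000, len(expected)))
t_sim = np.array([test_statistic(s, expected) for s in sims])
print("Monte Carlo p-value:", np.mean(t_sim >= t_obs))
```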

  1. Detection of occult paroxysmal atrial fibrillation by implantable long-term electrocardiographic monitoring in cryptogenic stroke and transient ischemic attack population: a study protocol for prospective matched cohort study.

    PubMed

    Petrovičová, Andrea; Kurča, Egon; Brozman, Miroslav; Hasilla, Jozef; Vahala, Pavel; Blaško, Peter; Andrášová, Andrea; Hatala, Robert; Urban, Luboš; Sivák, Štefan

    2015-12-03

    Cardio-embolic etiology is the most frequently predicted cause of cryptogenic stroke/TIA. Detection of occult paroxysmal atrial fibrillation is crucial for selection of appropriate medication. Enrolment of eligible cryptogenic stroke and TIA patients began in 2014 and will continue until 2018. The patients undergo long-term (12 months) ECG monitoring (implantable loop recorder) and testing for PITX2 (chromosome 4q25) and ZFHX3 (chromosome 16q22) gene mutations. There will be an appropriate control group of age- and sex-matched healthy volunteers. To analyse the results, descriptive statistics, statistical tests for group differences, and correlation analyses will be used. In our study we are focusing on a possible correlation between detection of atrial fibrillation by an implantable ECG recorder and PITX2 and/or ZFHX3 gene mutations in cryptogenic stroke/TIA patients. A correlation could lead to implementation of this genomic approach in cryptogenic stroke/TIA diagnostics and management. The results will be published in 2018. ClinicalTrials.gov: NCT02216370.

  2. Spatial analysis for the epidemiological study of cardiovascular diseases: A systematic literature search.

    PubMed

    Mena, Carlos; Sepúlveda, Cesar; Fuentes, Eduardo; Ormazábal, Yony; Palomo, Iván

    2018-05-07

    Cardiovascular diseases (CVDs) are the primary cause of death and disability in the world, and the detection of populations at risk as well as the localization of vulnerable areas is essential for adequate epidemiological management. Techniques developed for spatial analysis, among them geographical information systems and spatial statistics, such as cluster detection and spatial correlation, are useful for the study of the distribution of CVDs. These techniques, enabling recognition of events at different geographical levels of study (e.g., rural areas, deprived neighbourhoods, etc.), make it possible to relate CVDs to factors present in the immediate environment. The systematic literature search presented here shows that this group of diseases is clustered with regard to incidence, mortality and hospitalization as well as obesity, smoking, increased glycated haemoglobin levels, hypertension, physical activity and age. In addition, acquired variables such as income, residency (rural or urban) and education contribute to CVD clustering. Both local cluster detection and spatial regression techniques give statistical weight to the findings, providing valuable information that can influence response mechanisms in the health services by indicating locations in need of intervention and assignment of available resources.

  3. Detection of proximal caries using digital radiographic systems with different resolutions.

    PubMed

    Nikneshan, Sima; Abbas, Fatemeh Mashhadi; Sabbagh, Sedigheh

    2015-01-01

    Dental radiography is an important tool for the detection of caries, and digital radiography is the latest advancement in this regard. Spatial resolution is a characteristic of digital receptors used for describing the quality of images. This study aimed to compare the diagnostic accuracy of two digital radiographic systems with three different resolutions for the detection of noncavitated proximal caries. The study design was a diagnostic accuracy study. Seventy premolar teeth were mounted in 14 gypsum blocks. Digora Optime and RVG Access were used for obtaining digital radiographs. Six observers evaluated the proximal surfaces in radiographs for each resolution in order to determine the depth of caries based on a 4-point scale. The teeth were then histologically sectioned, and the results of histologic analysis were considered the gold standard. Data were entered using SPSS version 18 software and the Kruskal-Wallis test was used for data analysis. P < 0.05 was considered statistically significant. No significant difference was found between the different resolutions for detection of proximal caries (P > 0.05). The RVG Access system had the highest specificity (87.7%) and Digora Optime at high resolution had the lowest specificity (84.2%). Furthermore, Digora Optime had higher sensitivity for detection of caries exceeding the outer half of enamel. Judgment of oral radiologists regarding the depth of caries had higher reliability than that of restorative dentistry specialists. The three resolutions of Digora Optime and RVG Access had similar accuracy in the detection of noncavitated proximal caries.

  4. Epidermis area detection for immunofluorescence microscopy

    NASA Astrophysics Data System (ADS)

    Dovganich, Andrey; Krylov, Andrey; Nasonov, Andrey; Makhneva, Natalia

    2018-04-01

    We propose a novel image segmentation method for immunofluorescence microscopy images of skin tissue for the diagnosis of various skin diseases. The segmentation is based on machine learning algorithms. The feature vector comprises three groups of features: statistical features, Laws' texture energy measures and local binary patterns. The images are preprocessed for better learning. Different machine learning algorithms have been evaluated, and the best results have been obtained with the random forest algorithm. We use the proposed method to detect the epidermis region as part of a pemphigus diagnosis system.
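
    A minimal sketch of the texture-features-plus-random-forest idea, using local binary patterns only (the study also uses statistical features and Laws' texture energy measures); the toy patches and all parameter choices below are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage.feature import local_binary_pattern
from sklearn.ensemble import RandomForestClassifier

def lbp_histogram(patch, P=8, R=1.0):
    """Uniform-LBP histogram as a texture descriptor for one image patch."""
    img8 = (patch * 255).astype(np.uint8)
    lbp = local_binary_pattern(img8, P, R, method="uniform")
    hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
    return hist

rng = np.random.default_rng(3)
rough = [rng.random((32, 32)) for _ in range(50)]                       # "epidermis"
smooth = [gaussian_filter(rng.random((32, 32)), 3) for _ in range(50)]  # background
X = np.array([lbp_histogram(p) for p in rough + smooth])
y = np.array([1] * 50 + [0] * 50)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X[::2], y[::2])
print("held-out accuracy:", clf.score(X[1::2], y[1::2]))
```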

  5. Model-based iterative reconstruction and adaptive statistical iterative reconstruction: dose-reduced CT for detecting pancreatic calcification

    PubMed Central

    Katsura, Masaki; Akahane, Masaaki; Sato, Jiro; Matsuda, Izuru; Ohtomo, Kuni

    2016-01-01

    Background Iterative reconstruction methods have attracted attention for reducing radiation doses in computed tomography (CT). Purpose To investigate the detectability of pancreatic calcification using dose-reduced CT reconstructed with model-based iterative reconstruction (MBIR) and adaptive statistical iterative reconstruction (ASIR). Material and Methods This prospective study approved by the Institutional Review Board included 85 patients (57 men, 28 women; mean age, 69.9 years; mean body weight, 61.2 kg). Unenhanced CT was performed three times with different radiation doses (reference-dose CT [RDCT], low-dose CT [LDCT], ultralow-dose CT [ULDCT]). From RDCT, LDCT, and ULDCT, images were reconstructed with filtered-back projection (R-FBP, used for establishing the reference standard), ASIR (L-ASIR), and MBIR and ASIR (UL-MBIR and UL-ASIR), respectively. A lesion (pancreatic calcification) detection test was performed by two blinded radiologists with a five-point certainty level scale. Results Dose-length products of RDCT, LDCT, and ULDCT were 410, 97, and 36 mGy-cm, respectively. Nine patients had pancreatic calcification. The sensitivity for detecting pancreatic calcification with UL-MBIR was high (0.67–0.89) compared to L-ASIR or UL-ASIR (0.11–0.44), and a significant difference was seen between UL-MBIR and UL-ASIR for one reader (P = 0.014). The area under the receiver-operating characteristic curve for UL-MBIR (0.818–0.860) was comparable to that for L-ASIR (0.696–0.844). The specificity was lower with UL-MBIR (0.79–0.92) than with L-ASIR or UL-ASIR (0.96–0.99), and a significant difference was seen for one reader (P < 0.01). Conclusion In UL-MBIR, pancreatic calcification can be detected with high sensitivity; however, we should pay attention to the slightly lower specificity. PMID:27110389

  6. Detecting rater bias using a person-fit statistic: a Monte Carlo simulation study.

    PubMed

    Aubin, André-Sébastien; St-Onge, Christina; Renaud, Jean-Sébastien

    2018-04-01

    With the Standards voicing concern for the appropriateness of response processes, we need to explore strategies that would allow us to identify inappropriate rater response processes. Although certain statistics can be used to help detect rater bias, their use is complicated by either a lack of data about their actual power to detect rater bias or the difficulty related to their application in the context of health professions education. This exploratory study aimed to establish the worthiness of pursuing the use of l z to detect rater bias. We conducted a Monte Carlo simulation study to investigate the power of a specific detection statistic, namely the standardized likelihood l z person-fit statistic (PFS). Our primary outcome was the detection rate of biased raters, namely raters whom we manipulated into being either stringent (giving lower scores) or lenient (giving higher scores), using the l z statistic while controlling for the number of biased raters in a sample (6 levels) and the rate of bias per rater (6 levels). Overall, stringent raters (M = 0.84, SD = 0.23) were easier to detect than lenient raters (M = 0.31, SD = 0.28). More biased raters were easier to detect than less biased raters (60% bias: M = 0.62, SD = 0.37; 10% bias: M = 0.43, SD = 0.36). The PFS l z seems to offer interesting potential to identify biased raters. We observed detection rates as high as 90% for stringent raters, for whom we manipulated more than half of their checklist. Although we observed very interesting results, we cannot generalize these results to the use of PFS with estimated item/station parameters or real data. Such studies should be conducted to assess the feasibility of using PFS to identify rater bias.
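
    For readers unfamiliar with the statistic, l z standardizes the log-likelihood of a response pattern against its expectation and variance under the fitted model; strongly negative values flag misfit. The sketch below assumes a 2PL model with known item parameters and simulated dichotomous responses, which only illustrates the formula, not the study's rating design.

```python
import numpy as np

def lz(u, theta, a, b):
    """Standardized log-likelihood person-fit statistic for 0/1 responses."""
    p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    l0 = np.sum(u * np.log(p) + (1 - u) * np.log(1 - p))
    e = np.sum(p * np.log(p) + (1 - p) * np.log(1 - p))
    v = np.sum(p * (1 - p) * np.log(p / (1 - p)) ** 2)
    return (l0 - e) / np.sqrt(v)

rng = np.random.default_rng(4)
a = rng.uniform(0.8, 2.0, 30)           # item discriminations (assumed known)
b = rng.normal(0.0, 1.0, 30)            # item difficulties (assumed known)
theta = 0.5
p_true = 1.0 / (1.0 + np.exp(-a * (theta - b)))
u_fit = (rng.random(30) < p_true).astype(int)  # model-consistent pattern
u_odd = 1 - u_fit                              # aberrant pattern
print("lz, consistent:", round(lz(u_fit, theta, a, b), 2))  # near zero
print("lz, aberrant:  ", round(lz(u_odd, theta, a, b), 2))  # strongly negative
```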

  7. Quantum walks: The first detected passage time problem

    NASA Astrophysics Data System (ADS)

    Friedman, H.; Kessler, D. A.; Barkai, E.

    2017-03-01

    Even after decades of research, the problem of first passage time statistics for quantum dynamics remains a challenging topic of fundamental and practical importance. Using a projective measurement approach, with a sampling time τ, we obtain the statistics of first detection events for quantum dynamics on a lattice, with the detector located at the origin. A quantum renewal equation for a first detection wave function, in terms of which the first detection probability can be calculated, is derived. This formula gives the relation between first detection statistics and the solution of the corresponding Schrödinger equation in the absence of measurement. We illustrate our results with tight-binding quantum walk models. We examine a closed system, i.e., a ring, and reveal the intricate influence of the sampling time τ on the statistics of detection, discussing the quantum Zeno effect, half dark states, revivals, and optimal detection. The initial condition modifies the statistics of a quantum walk on a finite ring in surprising ways. In some cases, the average detection time is independent of the sampling time while in others the average exhibits multiple divergences as the sampling time is modified. For an unbounded one-dimensional quantum walk, the probability of first detection decays like (time)^(-3) with superimposed oscillations, with exceptional behavior when the sampling period τ times the tunneling rate γ is a multiple of π/2. The amplitude of the power-law decay is suppressed as τ → 0 due to the Zeno effect. Our work, an extended version of our previously published paper, predicts rich physical behaviors compared with classical Brownian motion, for which the first passage probability density decays monotonically like (time)^(-3/2), as elucidated by Schrödinger in 1915.
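
    The stroboscopic measurement protocol itself is straightforward to simulate: evolve the wave function for a sampling interval τ, record the detection probability at the origin, project that amplitude out, and repeat. The ring size, τ and the initial site below are illustrative choices, not the paper's specific cases.

```python
import numpy as np
from scipy.linalg import expm

L, gamma, tau, nmax = 10, 1.0, 0.25, 200
H = np.zeros((L, L))
for j in range(L):                       # nearest-neighbour hopping on a ring
    H[j, (j + 1) % L] = H[(j + 1) % L, j] = -gamma
U = expm(-1j * H * tau)                  # evolution over one sampling interval

psi = np.zeros(L, dtype=complex)
psi[1] = 1.0                             # start one site from the detector
F = []                                   # F[n-1]: prob. of FIRST detection at try n
for _ in range(nmax):
    psi = U @ psi
    F.append(abs(psi[0]) ** 2)           # projective measurement at the origin
    psi[0] = 0.0                         # each try removes the detected amplitude
print("total detection probability after", nmax, "attempts:", round(sum(F), 4))
```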

  8. Bedside Ultrasound in the Emergency Department to Detect Hydronephrosis for the Evaluation of Suspected Ureteric Colic.

    PubMed

    Shrestha, R; Shakya, R M; Khan A, A

    2016-01-01

    Background Renal colic is a common emergency department presentation. Hydronephrosis is an indirect sign of urinary obstruction, which may be due to an obstructing ureteric calculus, and can be detected easily by bedside ultrasound with minimal training. Objective To compare the accuracy of detection of hydronephrosis performed by the emergency physician with that of radiologists in suspected renal colic cases. Method This was a prospective observational study performed over a period of 6 months. Patients >8 years with a provisional diagnosis of renal colic in whom both the bedside ultrasound and the formal ultrasound were performed were included. The presence of hydronephrosis in both ultrasounds, and the size and location of a ureteric stone if present in the formal ultrasound, were recorded. The accuracy of the emergency physician's detection of hydronephrosis was determined using the scan reported by the radiologists as the "gold standard", as computed tomography was unavailable. Statistical analysis was executed using SPSS 17.0. Result Among the 111 included patients, 56.7% had a ureteric stone detected on formal ultrasound. The overall sensitivity, specificity, positive predictive value and negative predictive value of bedside ultrasound performed by the emergency physician for detection of hydronephrosis, compared with formal ultrasound performed by the radiologist, were 90.8%, 78.3%, 85.5% and 85.7%, respectively. Both bedside ultrasound and formal ultrasound detected hydronephrosis more often in patients with larger stones, and the difference was statistically significant (p < 0.001). Conclusion Bedside ultrasound can potentially be used as an important tool for detecting clinically significant hydronephrosis in the emergency department to evaluate suspected ureteric colic. Focused training in ultrasound could greatly improve the emergency management of these patients.
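
    As a worked example of the reported accuracy measures, the 2×2 counts below are hypothetical but chosen to be consistent with the published percentages (n = 111):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2 diagnostic accuracy measures."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Hypothetical counts reproducing 90.8% / 78.3% / 85.5% / 85.7%
print(diagnostic_metrics(tp=59, fp=10, fn=6, tn=36))
```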

  9. The detectability of brain metastases using contrast-enhanced spin-echo or gradient-echo images: a systematic review and meta-analysis.

    PubMed

    Suh, Chong Hyun; Jung, Seung Chai; Kim, Kyung Won; Pyo, Junhee

    2016-09-01

    This study aimed to compare the detectability of brain metastases using contrast-enhanced spin-echo (SE) and gradient-echo (GRE) T1-weighted images. The Ovid-MEDLINE and EMBASE databases were searched for studies on the detectability of brain metastases using contrast-enhanced SE or GRE images. The pooled proportions for the detectability of brain metastases were assessed using random-effects modeling. Heterogeneity among studies was determined using χ² statistics for the pooled estimates and the inconsistency index, I². To overcome heterogeneity, subgroup analyses according to slice thickness and lesion size were performed. A total of eight eligible studies, which included a sample size of 252 patients and 1413 brain metastases, were included. The detectability of brain metastases using SE images (89.2%) was higher than using GRE images (81.6%; adjusted 84.0%), but this difference was not statistically significant (p = 0.2385). In subgroup analysis of studies with 1-mm-thick slices and small metastases (<5 mm in diameter), 3-dimensional (3D) SE images demonstrated a higher detectability in comparison to 3D GRE images (93.7% vs 73.1% in 1-mm-thick slices; 89.5% vs 59.4% for small metastases) (p < 0.0001). Although both SE and GRE images are acceptable for detecting brain metastases, contrast-enhanced 3D SE images using 1-mm-thick slices are preferred for detecting brain metastases, especially small lesions (<5 mm in diameter).
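
    A sketch of the pooling step under stated assumptions: random-effects (DerSimonian-Laird) combination of logit-transformed proportions with Cochran's Q and the I² index. The per-study counts below are invented for illustration and are not the studies analysed here.

```python
import numpy as np

events = np.array([90, 120, 160, 75])   # detected metastases per study (toy)
totals = np.array([100, 140, 180, 90])
p = events / totals
y = np.log(p / (1 - p))                 # logit-transformed proportions
v = 1 / events + 1 / (totals - events)  # approximate within-study variances

w = 1 / v
ybar_fe = np.sum(w * y) / np.sum(w)     # fixed-effect mean
q = np.sum(w * (y - ybar_fe) ** 2)      # Cochran's Q
k = len(y)
tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
i2 = max(0.0, (q - (k - 1)) / q)        # inconsistency index I^2

wr = 1 / (v + tau2)                     # random-effects weights
pooled = 1 / (1 + np.exp(-np.sum(wr * y) / np.sum(wr)))
print(f"pooled detectability = {pooled:.3f}, I^2 = {i2:.2f}")
```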

  10. Real-time monitoring of a coffee roasting process with near infrared spectroscopy using multivariate statistical analysis: A feasibility study.

    PubMed

    Catelani, Tiago A; Santos, João Rodrigo; Páscoa, Ricardo N M J; Pezza, Leonardo; Pezza, Helena R; Lopes, João A

    2018-03-01

    This work proposes the use of near infrared (NIR) spectroscopy in diffuse reflectance mode and multivariate statistical process control (MSPC) based on principal component analysis (PCA) for real-time monitoring of the coffee roasting process. The main objective was the development of a MSPC methodology able to detect disturbances to the roasting process early, based on real-time acquisition of NIR spectra. A total of fifteen roasting batches were defined according to an experimental design to develop the MSPC models. This methodology was tested on a set of five batches in which disturbances of different nature were imposed to simulate real faulty situations. Some of these batches were used to optimize the model while the remainder were used to test the methodology. A modelling strategy based on a time sliding window provided the best results in terms of distinguishing batches with and without disturbances, using typical MSPC charts: Hotelling's T² and squared prediction error statistics. A PCA model encompassing a time window of four minutes with three principal components was able to efficiently detect all disturbances assayed. NIR spectroscopy combined with the MSPC approach proved to be an adequate auxiliary tool for coffee roasters to detect faults in a conventional roasting process in real-time. Copyright © 2017 Elsevier B.V. All rights reserved.
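
    A minimal sketch of the PCA-based monitoring statistics: fit a PCA model on in-control spectra, then chart Hotelling's T² in the score space and the squared prediction error (SPE) of the residuals for new spectra. The matrix sizes, number of components and simulated "spectra" are assumptions, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(150, 50))              # in-control "NIR spectra"
mu = X.mean(axis=0)
U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
ncomp = 3
P = Vt[:ncomp].T                            # PCA loadings
lam = s[:ncomp] ** 2 / (len(X) - 1)         # score variances

def t2_spe(x):
    """Hotelling's T^2 in the PC subspace and SPE of the residual."""
    t = (x - mu) @ P
    resid = (x - mu) - t @ P.T
    return float(np.sum(t**2 / lam)), float(resid @ resid)

x_ok = rng.normal(size=50)
x_fault = x_ok + 4.0 * rng.normal(size=50)  # disturbed spectrum
print("normal batch:", t2_spe(x_ok))
print("faulty batch:", t2_spe(x_fault))
```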

  11. New prospects for observing and cataloguing exoplanets in well-detached binaries

    NASA Astrophysics Data System (ADS)

    Schwarz, R.; Funk, B.; Zechner, R.; Bazsó, Á.

    2016-08-01

    This paper is devoted to studying the circumstances favourable to detecting circumstellar and circumbinary planets in well-detached binary-star systems using eclipse timing variations (ETVs). We investigated the dynamics of well-detached binary star systems with a star separation from 0.5 to 3 au, to determine the probability of the detection of such variations with ground-based telescopes and space telescopes (like the former missions CoRoT and Kepler and the future space missions Plato, Tess and Cheops). For the chosen star separations, both dynamical configurations (circumstellar and circumbinary) may be observable. We performed numerical simulations using the full three-body problem as the dynamical model. The dynamical stability and the ETVs are investigated by computing ETV maps for different masses of the secondary star and the exoplanet (Earth, Neptune and Jupiter size). In addition we varied the planet's and the binary's eccentricities. We conclude that many amplitudes of ETVs are large enough to detect exoplanets in binary-star systems. As an application, we compiled statistics from the catalogue of exoplanets in binary star systems introduced in this article and compared them with the parameter space used for our calculations. We further enlarged these statistics by investigating well-detached binary star systems from several other catalogues and discussed the possibility of further candidates.

  12. Do Biochemical Markers and Apa I Polymorphism in IGF-II Gene Play a Role in the Association of Birth Weight and Later BMI?

    PubMed

    Wu, Junqing; Ren, Jingchao; Li, Yuyan; Wu, Yinjie; Gao, Ersheng

    2013-01-01

    The aim of the study was to explore the mechanisms underlying the association of birth weight with later body mass index (BMI) through biochemical markers related to metabolism and the Apa I polymorphism in the IGF-II gene. A total of 300 children were selected randomly from the Macrosomia Birth Cohort in Wuxi, China. Height and weight were measured and blood samples were collected. Plasma concentrations of 8 biochemical markers were determined. The Apa I polymorphism was analyzed by polymerase chain reaction-restriction fragment length polymorphism (PCR-RFLP). Biochemical markers were determined for 296 subjects and 271 subjects were genotyped for the Apa I polymorphism. No association was found between birth weight and the 8 biochemical markers. In boys, the BMIs of the AA, AG and GG genotypes were 16.10 ± 2.24 kg/m², 17.40 ± 3.20 kg/m², and 17.65 ± 2.66 kg/m², and there was a statistically significant difference among the three genotypes. In girls, however, there was no statistically significant difference. The birth weights of the AA, AG and GG genotypes were 3751.13 ± 492.43 g, 3734.00 ± 456.88 g, and 3782.00 ± 461.78 g, with no statistically significant difference among the three genotypes. Biochemical markers are not associated with birth weight. The Apa I polymorphism may be related to childhood BMI, but it may not be associated with birth weight. Therefore, biochemical markers and the Apa I polymorphism might not play a role in the association of birth weight and BMI.

  13. Effects of peer health education on perception and practice of screening for cervical cancer among urban residential women in south-east Nigeria: a before and after study.

    PubMed

    Mbachu, Chinyere; Dim, Cyril; Ezeoke, Uche

    2017-06-09

    Effective female education on cervical cancer prevention has been shown to increase awareness and uptake of screening. However, sustaining an increase in uptake poses a challenge to control efforts. Peer health education has been used as an effective tool for ensuring sustained behavior change. This study was undertaken to assess the effectiveness of peer health education on perception, willingness to screen and uptake of cervical cancer screening by women. A before and after intervention study was undertaken in 2 urban cities in Enugu state, Nigeria among women of reproductive age attending women's meetings in Anglican churches. Multistage sampling was used to select 300 women. Peer health education was provided once monthly for 3 consecutive sessions over a period of 3 months. Data were collected at baseline and after the intervention using pre-tested questionnaires. Descriptive statistics and tests of significance of observed differences and associations were done at a p-value of <0.05. A statistically significant difference was observed in participants' individual risk perception for cervical cancer and perception of the benefits of early detection through screening. Practice of screening for cervical cancer increased by 6.8%, and the observed difference was statistically significant (p = 0.02). This was significantly associated with marital status, level of education, employment status and parity (p < 0.05). Peer health education is an effective strategy for increasing women's perception of the benefits of early detection of cervical cancer through screening. It is also effective for increasing their practice of screening for cervical cancer.

  14. Lymphovascular invasion in more than one-quarter of small rectal neuroendocrine tumors

    PubMed Central

    Kwon, Mi Jung; Kang, Ho Suk; Soh, Jae Seung; Lim, Hyun; Kim, Jong Hyeok; Park, Choong Kee; Park, Hye-Rim; Nam, Eun Sook

    2016-01-01

    AIM To identify the frequency, clinicopathological risk factors, and prognostic significance of lymphovascular invasion (LVI) in endoscopically resected small rectal neuroendocrine tumors (NETs). METHODS Between June 2005 and December 2015, 104 cases of endoscopically resected small (≤ 1 cm) rectal NET specimens at Hallym University Sacred Heart Hospital in Korea were retrospectively evaluated. We compared the detection rate of LVI in small rectal NET specimens by two methods: hematoxylin and eosin (H&E) and ancillary immunohistochemical staining (D2-40 and Elastica van Gieson); in addition, differences in the LVI detection rate between endoscopic procedures were also evaluated. Patient characteristics, prognosis and endoscopic resection results were reviewed from medical charts. RESULTS We observed LVI rates of 25.0% and 27.9% through H&E and ancillary immunohistochemical staining. The concordance rate between H&E and ancillary studies was 81.7% for detection of LVI, which showed statistically strong agreement between the two methods (κ = 0.531, P < 0.001). Two endoscopic methods were studied, including endoscopic submucosal resection with a ligation device and endoscopic submucosal dissection, and no statistically significant difference in the LVI detection rate was detected between the two (26.3% and 26.8%, P = 0.955). LVI was associated with large tumor size (> 5 mm, P = 0.007) and tumor grade 2 (P = 0.006). Among those factors, tumor grade 2 was the only independent predictive factor for the presence of LVI (HR = 4.195, 95%CI: 1.321-12.692, P = 0.015). No recurrence was observed over 28.8 mo regardless of the presence of LVI. CONCLUSION LVI may be present in a high percentage of small rectal NETs, which may not be associated with short-term prognosis. PMID:27895428
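
    The reported agreement can be reproduced with Cohen's kappa; the 2×2 table below is reconstructed from the published marginals (25.0% vs 27.9% positive, 81.7% agreement on 104 cases) and gives κ ≈ 0.531.

```python
import numpy as np

def cohens_kappa(table):
    """Cohen's kappa for a square agreement table."""
    table = np.asarray(table, dtype=float)
    n = table.sum()
    po = np.trace(table) / n                         # observed agreement
    pe = np.sum(table.sum(0) * table.sum(1)) / n**2  # chance agreement
    return (po - pe) / (1 - pe)

# rows: H&E LVI-/LVI+; columns: ancillary stains LVI-/LVI+ (reconstructed counts)
print(round(cohens_kappa([[67, 11], [8, 18]]), 3))   # -> 0.531
```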

  15. Indicator saturation: a novel approach to detect multiple breaks in geodetic time series.

    NASA Astrophysics Data System (ADS)

    Jackson, L. P.; Pretis, F.; Williams, S. D. P.

    2016-12-01

    Geodetic time series can record long-term trends, quasi-periodic signals at a variety of time scales from days to decades, and sudden breaks due to natural or anthropogenic causes. The causes of breaks range from instrument replacement to earthquakes to unknown (i.e. no attributable cause). Furthermore, breaks can be permanent or short-lived and range at least two orders of magnitude in size (mm to 100's of mm). Accounting for this range of possible signal characteristics requires a flexible time series method that can distinguish between true and false breaks, outliers and time-varying trends. One such method, Indicator Saturation (IS), comes from the field of econometrics, where analysing stochastic signals in these terms is a common problem. The IS approach differs from alternative break detection methods by considering every point in the time series as a break until it is demonstrated statistically that it is not. A linear model is constructed with a break function at every point in time, and all but statistically significant breaks are removed through a general-to-specific model selection algorithm for more variables than observations. The IS method is flexible because it allows multiple breaks of different forms (e.g. impulses, shifts in the mean, and changing trends) to be detected, while simultaneously modelling any underlying variation driven by additional covariates. We apply the IS method to identify breaks in a suite of synthetic GPS time series used for the Detection of Offsets in GPS Experiments (DOGEX). We optimise the method to maximise the ratio of true-positive to false-positive detections, which improves estimates of errors in the long-term rates of land motion currently required by the GPS community.
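
    A toy sketch of the saturation idea under stated assumptions: a step dummy is created at every time point, the dummies are estimated block by block so that each regression stays estimable, and only those significant at a tight level are retained. Real IS uses a full general-to-specific search; the block split, critical value and synthetic series here are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 200
y = 0.01 * np.arange(n) + rng.normal(0.0, 1.0, n)
y[120:] += 5.0                                     # true offset at t = 120

t = np.arange(n)
crit = 2.576                                       # ~1% two-sided critical value
kept = []
for block in np.array_split(np.arange(1, n), 4):   # candidate steps, in blocks
    Xb = np.column_stack([np.ones(n), t] + [(t >= j).astype(float) for j in block])
    beta, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    resid = y - Xb @ beta
    sigma2 = resid @ resid / (n - Xb.shape[1])
    se = np.sqrt(np.diag(sigma2 * np.linalg.pinv(Xb.T @ Xb)))
    kept += [int(j) for j, ts in zip(block, beta[2:] / se[2:]) if abs(ts) > crit]
print("retained step dummies at t =", kept)  # cluster near 120 (plus chance hits)
```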

  16. Colorectal and interval cancers of the Colorectal Cancer Screening Program in the Basque Country (Spain)

    PubMed Central

    Portillo, Isabel; Arana-Arri, Eunate; Idigoras, Isabel; Bilbao, Isabel; Martínez-Indart, Lorea; Bujanda, Luis; Gutierrez-Ibarluzea, Iñaki

    2017-01-01

    AIM To assess proportions, related conditions and survival of interval cancer (IC). METHODS The programme is linked with different clinical databases and cancer registers to allow suitable evaluation. This evaluation involves the detection of ICs after a negative faecal immunochemical test (FIT) and prior to a subsequent invitation (IC-FIT), and the detection of ICs after a positive FIT and a confirmatory diagnosis without colorectal cancer (CRC) detected, before the following recommended colonoscopy (IC-colonoscopy). We conducted a retrospective observational study of the 1193602 people invited onto the Programme from January 2009 to December 2015 (participation rate: 68.6%). RESULTS Two thousand five hundred and eighteen cancers were diagnosed through the programme; 18 cases of IC-colonoscopy were found before the recommended follow-up (43542 colonoscopies performed) and 186 IC-FIT were identified before the following invitation among the 769200 negative FITs. There was no statistically significant relation between the predictor variables of ICs and sex, age or deprivation index, but there was a relation with location and stage. Additionally, it was observed that there was less risk when the location was distal rather than proximal (OR = 0.28, 95%CI: 0.20-0.40, P < 0.0001), with no statistical significance when the location was in the rectum as opposed to proximal. When comparing the screen-detected cancers (SCs) with ICs, significant differences in survival were found (P < 0.001); the 5-year survival was 91.6% for SCs and 77.8% for IC-FIT. CONCLUSION These findings in a population-based CRC screening programme indicate the need for population-based studies that continue analyzing related factors to improve IC detection and reduce harm. PMID:28487610

  17. Effectiveness of teaching International Caries Detection and Assessment System II and its e-learning program to freshman dental students on occlusal caries detection

    PubMed Central

    El-Damanhoury, Hatem M.; Fakhruddin, Kausar Sadia; Awad, Manal A.

    2014-01-01

    Objective: To assess the feasibility of teaching the International Caries Detection and Assessment System (ICDAS) II and its e-learning program as tools for occlusal caries detection to freshman dental students, in comparison to dental graduates with 2 years of experience. Materials and Methods: Eighty-four freshmen and 32 dental graduates examined the occlusal surfaces of molars/premolars (n = 72) after a lecture and a hands-on workshop. The same procedure was repeated 1 month after training with the ICDAS II e-learning program. Validation of ICDAS II codes was done histologically. Intra- and inter-examiner reproducibility of ICDAS II severity scores were assessed before and after e-learning (Fleiss's kappa). Results: The kappa values showed that intra-examiner reproducibility ranged from 0.53 (ICDAS II code cut-off ≥ 1) to 0.70 (ICDAS II code cut-off ≥ 3) for undergraduates and from 0.69 (ICDAS II code cut-off ≥ 1) to 0.95 (ICDAS II code cut-off ≥ 3) for graduates. Inter-examiner reproducibility ranged from 0.64 (ICDAS II code cut-off ≥ 1) to 0.89 (ICDAS II code cut-off ≥ 3). No statistically significant difference was found between the two groups in intra-examiner agreement for assessing ICDAS II codes. A highly statistically significant difference (P ≤ 0.01) in the correct identification of codes 1, 2, and 4 from before to after e-learning was observed in both groups. The bias indices for the undergraduate group were higher than those of the graduate group. Conclusions: Early exposure of students to ICDAS II is a valuable method of teaching caries detection, and its e-learning program significantly improves their caries diagnostic skills. PMID:25512730

  18. Automatically detect and track infrared small targets with kernel Fukunaga-Koontz transform and Kalman prediction.

    PubMed

    Liu, Ruiming; Liu, Erqi; Yang, Jie; Zeng, Yong; Wang, Fanglin; Cao, Yuan

    2007-11-01

    Fukunaga-Koontz transform (FKT), stemming from principal component analysis (PCA), is used in many pattern recognition and image-processing fields. It cannot capture the higher-order statistical properties of natural images, so its detection performance is not satisfactory. PCA has been extended into kernel PCA in order to capture higher-order statistics. However, thus far no researchers have definitively proposed a kernel FKT (KFKT) or studied its detection performance. For accurately detecting potential small targets in infrared images, we first extend FKT into KFKT to capture the higher-order statistical properties of images. Then a framework based on Kalman prediction and KFKT, which can automatically detect and track small targets, is developed. Results of experiments show that KFKT outperforms FKT and the proposed framework is competent to automatically detect and track infrared point targets.
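
    The linear transform at the heart of the method can be sketched in a few lines: whitening the summed class covariances makes the two whitened covariances share eigenvectors with eigenvalues summing to one, so eigenvectors with eigenvalues near 1 favour the target class and those near 0 favour the background (KFKT replaces these covariances with their kernel-space counterparts). The toy data below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
X1 = rng.normal(size=(500, 5)) * np.array([3, 1, 1, 1, 1])  # "target" class
X2 = rng.normal(size=(500, 5)) * np.array([1, 1, 1, 1, 3])  # "background" class
S1 = np.cov(X1, rowvar=False)
S2 = np.cov(X2, rowvar=False)

evals, evecs = np.linalg.eigh(S1 + S2)          # whiten S1 + S2
W = evecs @ np.diag(evals ** -0.5) @ evecs.T

D1, _ = np.linalg.eigh(W @ S1 @ W.T)            # shared FKT eigenvectors
print("class-1 eigenvalues:", np.round(D1, 2))  # near 1 -> target subspace
print("class-2 eigenvalues:", np.round(1 - D1, 2))
```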

  19. A meta-analysis of the published literature on the effectiveness of antimicrobial soaps.

    PubMed

    Montville, Rebecca; Schaffner, Donald W

    2011-11-01

    The goal of this research was to conduct a systematic quantitative analysis of the existing data in the literature in order to determine if there is a difference between antimicrobial and nonantimicrobial soaps and to identify the methodological factors that might affect this difference. Data on hand washing efficacy and experimental conditions (sample size, wash duration, soap quantity, challenge organism, inoculum size, and neutralization method) from published studies were compiled and transferred to a relational database. A total of 25 publications, containing 374 observations, met the study selection criteria. The majority of the studies included fewer than 15 observations with each treatment and included a direct comparison between nonantimicrobial soap and antimicrobial soap. Although differences in efficacy between antimicrobial and nonantimicrobial soap were small (∼0.5-log CFU reduction difference), antimicrobial soap produced consistently statistically significantly greater reductions. This difference was true for any of the antimicrobial compounds investigated where n was >20 (chlorhexidine gluconate, iodophor, triclosan, or povidone). Average log reductions were statistically significantly greater (∼2 log CFU) when either gram-positive or gram-negative transient organisms were deliberately added to hands compared with experiments done with resident hand flora (∼0.5 log CFU). Our findings support the importance of using a high initial inoculum on the hands, well above the detection limit. The inherent variability in hand washing seen in the published literature underscores the importance of using a sufficiently large sample size to detect differences when they occur.

  1. Change Detection of High-Resolution Remote Sensing Images Based on Adaptive Fusion of Multiple Features

    NASA Astrophysics Data System (ADS)

    Wang, G. H.; Wang, H. B.; Fan, W. F.; Liu, Y.; Chen, C.

    2018-04-01

    Traditional change detection algorithms depend mainly on the spectral information of image patches and fail to effectively mine and fuse the complementary strengths of multiple image features. Borrowing ideas from object-oriented analysis, this article proposes a remote sensing image change detection algorithm based on the fusion of multiple features. First, image objects are obtained by multi-scale segmentation; colour histograms and linear gradient histograms are then calculated for the various objects. The earth mover's distance (EMD) statistical operator is used to measure the colour distance and the straight-line edge feature distance between corresponding objects from the two acquisition dates, and object heterogeneity is constructed by combining the colour feature distance and the edge straight-line distance with adaptive weighting. Finally, change detection results for image objects are obtained by curvature analysis of the heterogeneity histogram. The experimental results show that the method can fully fuse colour and edge line features, thus improving the accuracy of change detection.
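
    A sketch of the per-object distance computation under stated assumptions: the 1-D EMD between an object's colour histograms at the two dates via scipy, combined with an edge-feature distance through a simple adaptive weight (the weighting rule and all numbers below are illustrative, not the paper's exact formulation).

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(8)
bins = np.arange(16).astype(float)          # histogram bin centres
h_t1 = rng.random(16); h_t1 /= h_t1.sum()   # object colour histogram, date 1
h_t2 = rng.random(16); h_t2 /= h_t2.sum()   # object colour histogram, date 2

d_color = wasserstein_distance(bins, bins, h_t1, h_t2)  # 1-D EMD
d_edge = 0.4                                # toy straight-line edge distance

w = d_color / (d_color + d_edge)            # illustrative adaptive weight
heterogeneity = w * d_color + (1 - w) * d_edge
print("object heterogeneity:", round(float(heterogeneity), 3))
```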

  2. Changing pattern of fascioliasis prevalence early in the 3rd millennium in Dakahlia Governorate, Egypt: an update.

    PubMed

    Adarosy, H A; Gad, Y Z; El-Baz, S A; El-Shazly, A M

    2013-04-01

    Fascioliasis is an important food- and water-borne parasitic zoonosis caused by liver flukes of the genus Fasciola (Digenea: Fasciolidae) with worldwide distribution. In Egypt, fascioliasis has been encountered in nearly all Egyptian Governorates, particularly in the Nile Delta and specifically in Dakahlia. All enrolled cases were subjected to complete history taking, clinical examination, routine investigations and abdominal ultrasonography. Stool analysis, IHA and ELISA were used for fascioliasis diagnosis. Rural areas showed a higher prevalence of fascioliasis than urban areas, but the difference was not significant (χ² = 0.042, P = 0.837). Regarding human fascioliasis in the examined centers, no statistically significant difference was detected (χ² = 2.824, P = 0.243). Regarding gender variation, the difference was statistically insignificant (χ² = 0.166, P = 0.683). The difference between the age groups was statistically insignificant (χ² = 3.882, P = 0.274). Clinically, 7 cases (35%) were asymptomatic and the other 13 cases (65%) had different clinical pictures. Abdominal pain, anemia, eosinophilia, and tender hepatomegaly were seen in 70%, 80%, 70%, and 10%, respectively. Of them, 11 cases showed positive abdominal ultrasonographic findings suggestive of fascioliasis.

  3. Mammographic Screening at Age 40 or 45? What Difference Does It Make? The Potential Impact of American Cancer Society Mammography Screening Guidelines.

    PubMed

    Fancher, Crystal E; Scott, Anthony; Allen, Ahkeel; Dale, Paul

    2017-08-01

    This is a 10-year retrospective chart review evaluating the potential impact of the most recent American Cancer Society mammography screening guidelines, which exclude female patients aged 40 to 44 years from routine annual screening mammography. Instead, they recommend screening mammography starting at age 45, with the option to begin screening earlier if the patient desires. The institutional cancer registry was systematically searched to identify all women aged 40 to 44 years treated for breast cancer over a 10-year period. These women were separated into two cohorts: screening mammography detected cancer (SMDC) and nonscreening mammography detected cancer (NSMDC). Statistical analysis of the cohorts was performed for lymph node status (SLN), five-year disease-free survival, and five-year overall survival. Women with SMDC had a significantly lower incidence of SLN-positive cancer than the NSMDC group, 9 of 63 (14.3%) versus 36 of 81 (44%; P < 0.001). The five-year disease-free survival was 84 per cent for SMDC and 80 per cent for NSMDC; this difference was not statistically significant. The five-year overall survival was statistically significantly different at 94 per cent for the SMDC group versus 80 per cent for the NSMDC group (P < 0.05). This review demonstrates the significance of mammographic screening for early detection and treatment of breast cancer. Mammographic screening in women aged 40 to 44 detected tumors with fewer nodal metastases, resulting in improved survival and reaffirming the need for annual mammographic screening in this age group.

  4. Ultra-low-dose Ultrasound Molecular Imaging for the Detection of Angiogenesis in a Mouse Murine Tumor Model: How Little Can We See?

    PubMed Central

    Wang, Shiying; Herbst, Elizabeth B.; Mauldin, F. William; Diakova, Galina B.; Klibanov, Alexander L.; Hossack, John A.

    2016-01-01

    Objectives The objective of this study is to evaluate the minimum microbubble dose for ultrasound molecular imaging to achieve statistically significant detection of angiogenesis in a mouse model. Materials and Methods The pre-burst minus post-burst method was implemented on a Verasonics ultrasound research scanner using a multi-frame compounding pulse inversion imaging sequence. Biotinylated lipid (distearoyl phosphatidylcholine, DSPC-based) microbubbles conjugated with anti-vascular endothelial growth factor receptor 2 (VEGFR2) antibody (MBVEGFR2) or isotype control antibody (MBControl) were injected into mice carrying adenocarcinoma xenografts. Injection doses ranging from 5 × 10⁴ to 1 × 10⁷ microbubbles per mouse were evaluated to determine the minimum diagnostically effective dose. Results The proposed imaging sequence achieved statistically significant detection (p < 0.05, n = 5) of VEGFR2 in tumors with a minimum MBVEGFR2 injection dose of only 5 × 10⁴ microbubbles per mouse (DSPC at 0.053 ng/g mouse body mass). Non-specific adhesion of MBControl at the same injection dose was negligible. Additionally, the targeted contrast ultrasound signal of MBVEGFR2 decreased with lower microbubble doses, while non-specific adhesion of MBControl increased with higher microbubble doses. Conclusions 5 × 10⁴ microbubbles per animal is the lowest injection dose on record for ultrasound molecular imaging to achieve statistically significant detection of molecular targets in vivo. These findings provide further guidance for future development of clinically translatable ultrasound molecular imaging applications using lower microbubble doses. PMID:27654582
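
    The pre-burst minus post-burst logic isolates bound-microbubble signal: frames are averaged before and after a destructive pulse, and the difference is compared between targeted and control agents. A minimal sketch with hypothetical per-animal values (the paper's raw data are not given):

    ```python
    import numpy as np
    from scipy.stats import ttest_ind

    def targeted_signal(pre_frames, post_frames):
        """Pre-burst minus post-burst intensity: signal from bound
        microbubbles only, since free-flowing bubbles replenish after
        the destructive pulse while bound ones do not."""
        return np.mean(pre_frames, axis=0) - np.mean(post_frames, axis=0)

    # Hypothetical per-animal ROI signals (n = 5 per group, arbitrary units).
    mb_vegfr2 = np.array([4.1, 3.8, 5.0, 4.4, 3.6])
    mb_control = np.array([0.4, 0.7, 0.2, 0.5, 0.6])
    t, p = ttest_ind(mb_vegfr2, mb_control)
    print(f"t = {t:.2f}, p = {p:.4f}")  # p < 0.05 -> significant detection
    ```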

  5. Empirical Tryout of a New Statistic for Detecting Temporally Inconsistent Responders.

    PubMed

    Kerry, Matthew J

    2018-01-01

    Statistical screening of self-report data is often advised to support the quality of analyzed responses, for example to reduce insufficient effort responding (IER). One recently introduced index for detecting outliers, based on Mahalanobis's D for cross-sectional designs, replaces centered scores with difference scores between repeated-measure items and is termed person temporal consistency (D²ptc). Although the adapted D²ptc index demonstrated usefulness in simulation datasets, it has not been applied to empirical data. The current study addresses D²ptc's low uptake by critically appraising its performance across three empirical applications. Independent samples were selected to represent a range of scenarios commonly encountered by organizational researchers. First, in Sample 1, a repeated measure of future time perspective (FTP) in experienced working adults (age > 40 years; n = 620) indicated that temporal inconsistency was significantly related to respondent age and item reverse-scoring. Second, in repeated measures of team efficacy aggregations, D²ptc successfully detected team-level inconsistency across repeated performance cycles. Third, in an experimental dataset of subjective life expectancy, D²ptc indicated significantly more stable responding in experimental conditions than in controls. The empirical findings support D²ptc's flexible and useful application to distinct study designs. Discussion centers on current limitations and further extensions that may be of value to psychologists screening self-report data to strengthen response quality and the meaningfulness of inferences from repeated-measures self-reports. Taken together, the findings support the usefulness of the newly devised statistic for detecting IER and other extreme response patterns.
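
    A minimal reading of the index: compute item-level difference scores between the two administrations and apply the usual Mahalanobis machinery to them. The centering step and the chi-square flagging rule below are assumptions about implementation details the abstract does not spell out.

    ```python
    import numpy as np

    def d2_ptc(time1, time2):
        """Person temporal consistency index: Mahalanobis D^2 computed on
        item-level difference scores (time2 - time1) rather than on the
        usual mean-centered scores. Rows are respondents, columns items."""
        diffs = np.asarray(time2, float) - np.asarray(time1, float)
        centered = diffs - diffs.mean(axis=0)
        cov_inv = np.linalg.pinv(np.cov(centered, rowvar=False))
        return np.einsum("ij,jk,ik->i", centered, cov_inv, centered)

    # Respondents with D^2 above a chi-square cutoff (df = number of
    # items, e.g. scipy.stats.chi2.ppf(0.999, df=k)) would be flagged
    # as temporally inconsistent responders.
    ```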

  6. Microscopic Examination of Gallbladder Stones Improves Rate of Detection of Clonorchis sinensis Infection

    PubMed Central

    Ma, Rui-hong; Luo, Xiao-bing; Zheng, Pei-ming; Luo, Zhen-liang; Yang, Liu-qing

    2013-01-01

    To improve the rate of detection of Clonorchis sinensis infection, we compared different specimens from patients with cholecystolithiasis. Feces, gallbladder bile, and gallbladder stones collected from 179 consecutive patients with cholecystolithiasis underwent microscopic examination, and according to the results, 30 egg-positive and 30 egg-negative fecal, gallbladder bile, and gallbladder stone specimens, respectively, underwent real-time fluorescent PCR. The detection rates of eggs in feces, bile, and gallbladder stones were 30.7%, 44.7%, and 69.8%, respectively, and the differences were statistically significant (P < 0.01). The PCR results confirmed that the eggs in the specimens were C. sinensis eggs. Eggs in the feces were “fresh” and in the gallbladder stones were “old.” Microscopic examination of gallbladder stones may improve the detection rates of C. sinensis infection, which is important for developing individualized treatments to prevent the recurrence of gallbladder stones and to prevent the occurrence of severe liver damage and cholangiocarcinoma. PMID:23698535

  7. Single photon detection and signal analysis for high sensitivity dosimetry based on optically stimulated luminescence with beryllium oxide

    NASA Astrophysics Data System (ADS)

    Radtke, J.; Sponner, J.; Jakobi, C.; Schneider, J.; Sommer, M.; Teichmann, T.; Ullrich, W.; Henniger, J.; Kormoll, T.

    2018-01-01

    Single photon detection applied to optically stimulated luminescence (OSL) dosimetry is a promising approach due to the low level of luminescence light and the known statistical behavior of single photon events. Time-resolved detection allows a variety of different and independent data analysis methods to be applied. Furthermore, amplitude-modulated stimulation impresses time and frequency information onto the OSL light and therefore allows additional means of analysis. Given the impressed frequency information, data analysis using Fourier transform algorithms or other digital filters can separate the OSL signal from unwanted light or events generated by other phenomena. This potentially lowers the detection limits of low dose measurements and might improve the reproducibility and stability of the obtained data. In this work, an OSL system based on a single photon detector, a fast and accurate stimulation unit and an FPGA is presented. Different analysis algorithms applied to the single photon data are discussed.
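
    A sketch of the frequency-domain separation idea: photon events are binned into a count series, and the amplitude of the OSL component is read off the FFT at the known stimulation frequency. All parameters and the simulated signal are illustrative.

    ```python
    import numpy as np

    fs = 10_000.0                 # bin rate in Hz (illustrative)
    f_stim = 137.0                # stimulation modulation frequency in Hz
    t = np.arange(0, 1.0, 1.0 / fs)

    rng = np.random.default_rng(0)
    lam = 0.5 + 0.15 * (1 + np.sin(2 * np.pi * f_stim * t))  # mean counts/bin
    counts = rng.poisson(lam)     # single-photon counts incl. shot noise

    # The OSL component appears at f_stim; unmodulated background (dark
    # counts, ambient light) spreads over other frequencies.
    spectrum = np.fft.rfft(counts - counts.mean())
    freqs = np.fft.rfftfreq(counts.size, d=1.0 / fs)
    k = np.argmin(np.abs(freqs - f_stim))
    amplitude = 2.0 * np.abs(spectrum[k]) / counts.size
    print(f"modulation amplitude ~ {amplitude:.3f} counts/bin (true 0.15)")
    ```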

  8. Fermi and Swift Gamma-Ray Burst Afterglow Population Studies

    NASA Technical Reports Server (NTRS)

    Racusin, Judith L.; Oates, S. R.; Schady, P.; Burrows, D. N.; dePasquale, M.; Donato, D.; Gehrels, N.; Koch, S.; McEnery, J.; Piran, T.

    2011-01-01

    The new and extreme population of GRBs detected by Fermi-LAT shows several new features in high energy gamma-rays that are providing interesting and unexpected clues into GRB prompt and afterglow emission mechanisms. Over the last 6 years, it has been Swift that has provided the robust dataset of UV/optical and X-ray afterglow observations that opened many windows into components of GRB emission structure. The relationship between the LAT-detected GRBs and the well studied, fainter, less energetic GRBs detected by Swift-BAT is only beginning to be explored by multi-wavelength studies. We explore the large sample of GRBs detected by BAT only, BAT and Fermi-GBM, and GBM and LAT, focusing on these samples separately in order to search for statistically significant differences between the populations, using only those GRBs with measured redshifts in order to physically characterize these objects. We disentangle which differences are instrumental selection effects versus intrinsic properties, in order to better understand the nature of the special characteristics of the LAT bursts.
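
    A two-sample Kolmogorov-Smirnov test is a standard tool for this kind of population comparison, e.g. asking whether the redshift distributions of two detector samples are consistent with a common parent population. The abstract does not name the tests used, and the values below are hypothetical.

    ```python
    from scipy.stats import ks_2samp

    # Hypothetical redshift samples for two detector populations.
    z_bat = [0.6, 1.1, 1.9, 2.4, 0.9, 3.1, 1.5, 2.0]
    z_lat = [0.7, 1.0, 1.4, 0.3, 1.8, 0.9]
    stat, p = ks_2samp(z_bat, z_lat)
    print(f"KS statistic = {stat:.3f}, p = {p:.3f}")
    # A small p would indicate the two populations differ in redshift.
    ```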

  9. Comparing distinct ground-based lightning location networks covering the Netherlands

    NASA Astrophysics Data System (ADS)

    de Vos, Lotte; Leijnse, Hidde; Schmeits, Maurice; Beekhuis, Hans; Poelman, Dieter; Evers, Läslo; Smets, Pieter

    2015-04-01

    Lightning can be detected using a ground-based sensor network. The Royal Netherlands Meteorological Institute (KNMI) monitors lightning activity in the Netherlands with the so-called FLITS system, a network combining SAFIR-type sensors that makes use of Very High Frequency (VHF) as well as Low Frequency (LF) sensors. KNMI has recently decided to replace FLITS by data from a sub-continental network operated by Météorage which makes use of LF sensors only (KNMI Lightning Detection Network, or KLDN). KLDN is compared to the FLITS system, as well as to the Met Office's long-range Arrival Time Difference network (ATDnet), which measures Very Low Frequency (VLF). Special focus is on the ability to detect Cloud to Ground (CG) and Cloud to Cloud (CC) lightning in the Netherlands. Relative detection efficiency of individual flashes, and of lightning activity in a more general sense, is calculated over a period of almost 5 years. Additionally, the detection efficiency of each system is compared to a ground truth constructed from flashes that are detected by both of the other datasets. Finally, infrasound data is used as a fourth lightning data source for several case studies. Relative performance is found to vary strongly with location and time. As expected, FLITS detects significantly more CC lightning (because of the strong aptitude of VHF antennas to detect CC), though KLDN and ATDnet detect more CG lightning. We analyze statistics computed over the entire 5-year period, looking at CG as well as total lightning (CC and CG combined). Statistics that are considered are the Probability of Detection (POD) and the so-called Lightning Activity Detection (LAD). POD is defined as the percentage of reference flashes that the system detects. LAD is defined as the fraction of predefined area boxes and time periods in which the system records one or more flashes, given that the reference detects at least one flash there. The reference for these statistics is taken to be either another dataset, or a dataset consisting of flashes detected by two datasets. Extreme thunderstorm case evaluation shows that the weather alert criterion for severe thunderstorms is reached by FLITS when this is not the case in KLDN and ATDnet, suggesting the need for KNMI to modify that weather alert criterion when using KLDN.
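
    The POD and LAD definitions above translate directly into code. A minimal sketch over boolean detection grids (area boxes × time periods); the toy arrays are illustrative.

    ```python
    import numpy as np

    def pod(system_detected_ref_flashes, n_ref_flashes):
        """Probability of Detection: share of reference flashes that the
        system also detects."""
        return system_detected_ref_flashes / n_ref_flashes

    def lad(system_grid, reference_grid):
        """Lightning Activity Detection: among area/time boxes where the
        reference records at least one flash, the fraction where the
        system does too. Grids are boolean (n_boxes, n_periods) arrays."""
        active = reference_grid.astype(bool)
        hits = system_grid.astype(bool) & active
        return hits.sum() / active.sum()

    # Toy example: 4 area boxes x 3 periods.
    ref = np.array([[1, 0, 1], [0, 0, 1], [1, 1, 0], [0, 0, 0]])
    sys = np.array([[1, 0, 0], [0, 0, 1], [1, 1, 0], [1, 0, 0]])
    print(f"LAD = {lad(sys, ref):.2f}")  # 4 of 5 reference-active boxes hit
    ```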

  10. On the choice of statistical models for estimating occurrence and extinction from animal surveys

    USGS Publications Warehouse

    Dorazio, R.M.

    2007-01-01

    In surveys of natural animal populations the number of animals that are present and available to be detected at a sample location is often low, resulting in few or no detections. Low detection frequencies are especially common in surveys of imperiled species; however, the choice of sampling method and protocol also may influence the size of the population that is vulnerable to detection. In these circumstances, probabilities of animal occurrence and extinction will generally be estimated more accurately if the models used in data analysis account for differences in abundance among sample locations and for the dependence between site-specific abundance and detection. Simulation experiments are used to illustrate conditions wherein these types of models can be expected to outperform alternative estimators of population site occupancy and extinction. © 2007 by the Ecological Society of America.
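
    One model family with the property the paper calls for, detection that depends on site-specific abundance, is the Royle-Nichols formulation, where p(N) = 1 - (1 - r)^N with N Poisson-distributed. The sketch below fits it by maximum likelihood to simulated detection histories; it illustrates the model class, not the paper's specific analysis.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import poisson

    def neg_log_lik(params, y, K=50):
        """Royle-Nichols-style likelihood for binary detection histories
        y (sites x surveys), marginalizing site abundance N up to K."""
        lam, r = np.exp(params[0]), 1 / (1 + np.exp(-params[1]))
        n = np.arange(K + 1)
        prior = poisson.pmf(n, lam)          # P(N = n)
        p = 1 - (1 - r) ** n                 # detection prob given N = n
        ll = 0.0
        for site in y:                       # site: one row of detections
            like_n = np.prod(np.where(site[:, None], p, 1 - p), axis=0)
            ll += np.log(prior @ like_n + 1e-300)
        return -ll

    rng = np.random.default_rng(1)
    N = rng.poisson(1.2, size=100)                       # true abundances
    y = rng.random((100, 5)) < (1 - 0.7 ** N)[:, None]   # 5 surveys, r = 0.3
    fit = minimize(neg_log_lik, x0=[0.0, 0.0], args=(y,), method="Nelder-Mead")
    lam_hat, r_hat = np.exp(fit.x[0]), 1 / (1 + np.exp(-fit.x[1]))
    print(f"lambda ~ {lam_hat:.2f}, r ~ {r_hat:.2f}")
    ```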

  11. Strengthening the Validity of Population-Based Suicide Rate Comparisons: An Illustration Using U.S. Military and Civilian Data

    ERIC Educational Resources Information Center

    Eaton, Karen M.; Messer, Stephen C.; Garvey Wilson, Abigail L.; Hoge, Charles W.

    2006-01-01

    The objectives of this study were to generate precise estimates of suicide rates in the military while controlling for factors contributing to rate variability such as demographic differences and classification bias, and to develop a simple methodology for the determination of statistically derived thresholds for detecting significant rate…

  12. Statistical Models for Linguistic Variation in Online Media

    ERIC Educational Resources Information Center

    Kulkarni, Vivek

    2017-01-01

    Language on the Internet and social media varies due to time, geography, and social factors. For example, consider an online chat forum where people from different regions across the world interact. In such scenarios, it is important to track and detect regional variation in language. A person from the UK, who is in conversation with someone from…

  13. A Laboratory Exercise for Ecology Teaching: The Use of Photographs in Detecting Dispersion Patterns in Animals

    ERIC Educational Resources Information Center

    Lenton, G. M.

    1975-01-01

    Photographs of a beetle, Catamerus rugosus, were taken at different stages in its life cycle. Students were asked to relate these to real life and carry out a statistical analysis to determine the degree of dispersion of animals. Results demonstrate a change in dispersion throughout the life cycle. (Author/EB)
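
    The standard classroom statistic for such quadrat counts is the index of dispersion (variance-to-mean ratio), with (n - 1)·I referred to a chi-square distribution. The counts below are hypothetical.

    ```python
    import numpy as np
    from scipy.stats import chi2

    # Index-of-dispersion test on quadrat counts from a photograph.
    # I = s^2/xbar is ~1 for random (Poisson) dispersion, <1 for uniform,
    # >1 for clumped; (n-1)*I is compared to a chi-square distribution.
    counts = np.array([0, 3, 1, 7, 0, 0, 5, 2, 0, 6, 1, 0])  # hypothetical
    n = counts.size
    I = counts.var(ddof=1) / counts.mean()
    stat = (n - 1) * I
    p = chi2.sf(stat, df=n - 1)
    print(f"I = {I:.2f}, chi2 = {stat:.1f}, p = {p:.4f} (I > 1: clumping)")
    ```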

  14. The relationship study between image features and detection probability based on psychology experiments

    NASA Astrophysics Data System (ADS)

    Lin, Wei; Chen, Yu-hua; Wang, Ji-yuan; Gao, Hong-sheng; Wang, Ji-jun; Su, Rong-hua; Mao, Wei

    2011-04-01

    Detection probability is an important index for representing and estimating target detectability, and it provides a basis for target recognition and decision-making. Obtaining detection probability empirically, however, requires considerable time and manpower, and because interpreters differ in practical knowledge and experience, the data obtained often vary widely. By studying the relationship between image features and perception quantity through psychological experiments, a probability model was established as follows. First, four image features that directly affect detection were extracted and quantified, and four corresponding feature similarity degrees between target and background were defined. Second, the relationship between each single-feature similarity degree and perception quantity was established on psychophysical principles, and target-interpretation experiments were designed involving about five hundred observers and two hundred images. To reduce correlation among the image features, numerous synthetic images were generated, each differing in only a single feature: brightness, chromaticity, texture, or shape. The model parameters were then determined by analyzing and fitting the large body of experimental data. Finally, statistical decision theory and the experimental results were used to relate perception quantity to target detection probability. Verified against extensive practical target interpretation, the model yields target detection probability quickly and objectively.
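
    The abstract does not give the fitted model, but its structure, four target/background feature similarity degrees mapped to a detection probability, can be illustrated with a logistic form. The weights, intercept, and link function below are assumptions for illustration only.

    ```python
    import numpy as np

    # Illustrative form for a model of the kind described: similarity
    # degrees (brightness, chromaticity, texture, shape) in [0, 1] mapped
    # to detection probability. Weights and link are NOT the paper's.
    def detection_probability(sim, w=(2.5, 1.8, 1.2, 1.5), b=-4.0):
        """Higher target/background similarity means the target blends in,
        so detection probability falls as similarity rises."""
        score = b + np.dot(w, 1.0 - np.asarray(sim))  # dissimilarity drives detection
        return 1.0 / (1.0 + np.exp(-score))

    print(detection_probability([0.9, 0.8, 0.7, 0.9]))  # well-camouflaged target
    print(detection_probability([0.2, 0.3, 0.4, 0.2]))  # conspicuous target
    ```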

  15. Effectiveness of DIAGNOdent in Detecting Root Caries Without Dental Scaling Among Community-dwelling Elderly.

    PubMed

    Zhang, Wen; McGrath, Colman; Lo, Edward C M

    The purpose of this clinical research was to analyze the effectiveness of DIAGNOdent in detecting root caries without dental scaling. The status of 750 exposed, unfilled root surfaces was assessed by visual-tactile examination and DIAGNOdent before and after root scaling. The sensitivity and specificity of different cut-off DIAGNOdent values in diagnosing root caries with reference to visual-tactile criteria were evaluated on root surfaces without visible plaque/calculus. DIAGNOdent values from sound and carious root surfaces were compared using the nonparametric Mann-Whitney U-test, with the level of statistical significance set at 0.05. On root surfaces without plaque/calculus, significantly different (p < 0.05) DIAGNOdent readings were obtained from sound root surfaces (12.2 ± 11.1), active carious root surfaces (37.6 ± 31.7) and inactive carious root surfaces (20.9 ± 10.5) before scaling. On root surfaces with visible plaque, DIAGNOdent readings from active carious root surfaces (29.6 ± 20.8) and inactive carious root surfaces (27.0 ± 7.2) were not statistically significantly different (p > 0.05). Furthermore, on root surfaces with visible calculus, all DIAGNOdent readings from sound root surfaces were > 50, which might be misinterpreted as carious. After scaling, DIAGNOdent readings from sound root surfaces (8.1 ± 11.3), active carious root surfaces (37.9 ± 31.9) and inactive carious root surfaces (24.9 ± 11.5) showed significant differences (p < 0.05). A cut-off value between 10 and 15 yielded the highest combined sensitivity and specificity in detecting root caries on root surfaces without visible plaque/calculus before scaling, although both were only around 70%. These findings suggest that on exposed, unfilled root surfaces without visible plaque/calculus, DIAGNOdent can be used as an adjunct to visual-tactile criteria in detecting root-surface status without pre-treatment by dental scaling.
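
    Choosing the cut-off amounts to sweeping a threshold over the readings and computing sensitivity and specificity against the visual-tactile reference at each value. A minimal sketch with hypothetical readings:

    ```python
    import numpy as np

    # Sweep DIAGNOdent cut-offs and compute sensitivity/specificity
    # against a visual-tactile reference standard. Data are hypothetical.
    readings = np.array([5, 8, 11, 14, 16, 22, 30, 38, 45, 9, 13, 19])
    carious = np.array([0, 0, 0, 1, 0, 1, 1, 1, 1, 0, 1, 1], bool)

    for cutoff in range(5, 30, 5):
        positive = readings >= cutoff
        sens = (positive & carious).sum() / carious.sum()
        spec = (~positive & ~carious).sum() / (~carious).sum()
        print(f"cutoff {cutoff:2d}: sensitivity {sens:.2f}, specificity {spec:.2f}")
    ```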

  16. Influence of Intracanal Materials in Vertical Root Fracture Pathway Detection with Cone-beam Computed Tomography.

    PubMed

    Dutra, Kamile Leonardi; Pachêco-Pereira, Camila; Bortoluzzi, Eduardo Antunes; Flores-Mir, Carlos; Lagravère, Manuel O; Corrêa, Márcio

    2017-07-01

    Investigating the vertical root fracture (VRF) pathway under different clinical scenarios may help to diagnose this condition properly. We aimed to determine the capability and intrareliability of VRF pathway detection through cone-beam computed tomographic (CBCT) imaging as well as analyze the influence of different intracanal and crown materials. VRFs were mechanically induced in 30 teeth, and 4 clinical situations were reproduced in vitro: no filling, gutta-percha, post, and metal crown. A Prexion (San Mateo, CA) 3-dimensional tomographic device was used to generate 104 CBCT scans. The VRF pathway was determined using landmarks in the Avizo software (Version 8.1; FEI Visualization Sciences Group, Burlington, MA) by 1 observer, repeated 3 times. Analysis of variance and post hoc tests were applied to compare groups. Intrareliability demonstrated excellent agreement (intraclass correlation coefficient mean = 0.93). Descriptive analysis showed that the fracture line measurement was smaller in the post and metal crown groups than in the no-filling and gutta-percha groups. The 1-way analysis of variance test found statistically significant differences among the groups' measurements. The Bonferroni correction showed statistically significant differences between the no-filling and gutta-percha groups versus the post and metal crown groups. The VRF pathway can be accurately detected in a nonfilled tooth using limited field of view CBCT imaging. The presence of gutta-percha generated a low beam hardening artifact that did not hinder the VRF extent. The presence of an intracanal gold post made the fracture line appear smaller than it really was in the sagittal images; in the axial images, a VRF was only detected when the apical third was involved. The presence of a metal crown did not generate additional artifacts on the root surface compared to the intracanal gold post by itself. Copyright © 2017 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
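
    The analysis pipeline named in the abstract, one-way ANOVA followed by Bonferroni-corrected pairwise comparisons, looks like this in outline; the fracture-line measurements below are hypothetical.

    ```python
    from itertools import combinations
    from scipy.stats import f_oneway, ttest_ind

    # Hypothetical fracture-line measurements (mm) for the four in vitro
    # conditions, mirroring the ANOVA + Bonferroni analysis above.
    groups = {
        "no filling":   [10.1, 9.8, 10.4, 10.0],
        "gutta-percha": [9.9, 10.2, 9.7, 10.1],
        "post":         [7.2, 6.9, 7.5, 7.1],
        "metal crown":  [7.0, 7.3, 6.8, 7.2],
    }
    F, p = f_oneway(*groups.values())
    print(f"ANOVA: F = {F:.2f}, p = {p:.4g}")

    pairs = list(combinations(groups, 2))
    alpha = 0.05 / len(pairs)                  # Bonferroni correction
    for a, b in pairs:
        t, p = ttest_ind(groups[a], groups[b])
        print(f"{a} vs {b}: p = {p:.4g} {'*' if p < alpha else ''}")
    ```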

  17. Prevalence of Candida albicans and Candida dubliniensis in caries-free and caries-active children in relation to the oral microbiota-a clinical study.

    PubMed

    Al-Ahmad, A; Auschill, T M; Dakhel, R; Wittmer, A; Pelz, K; Heumann, C; Hellwig, E; Arweiler, N B

    2016-11-01

    The correlation between caries and the oral prevalence of Candida spp. in children is contradictory in the literature, with authors focusing on Candida albicans as the Candida species most often isolated from the oral cavity. The aim of the present study was therefore to compare caries-free and caries-active children regarding their oral carriage of Candida spp. Twenty-six caries-free (CF group) and 26 caries-active children (CA group) were included in this study. Three different types of specimens, saliva, plaque and, in the case of caries, infected dentine, were microbiologically analyzed for aerobic and anaerobic microorganisms and their counts. Special attention was given to the differentiation between C. albicans and Candida dubliniensis. Additionally, different biochemical tests, VITEK 2 (VITEK®2, bioMérieux, Marcy-l'Etoile, France) and 16S and 18S ribosomal DNA (rDNA) sequencing were applied for identification. The detection of C. albicans did not differ between the CF and CA groups. C. dubliniensis was never detected in any specimen of the CF group but occurred in one quarter of the CA group (27% in plaque, 23% in saliva), a statistically significant difference between the two groups (p < 0.05). In six of these cases, C. dubliniensis was detected concomitantly in saliva and plaque, and once only in plaque. The CA group harbored statistically significantly more Streptococcus mutans than the CF group, revealing a correlation between S. mutans and C. dubliniensis in the caries group. This is the first study reporting frequent detection of C. dubliniensis in caries-active children, which may have been underestimated so far owing to difficulties in differentiating this yeast species from C. albicans. Microbiological diagnostics, especially of oral Candida species, are an important determinant for identifying etiological factors of dental caries in children.

  18. Three-dimensional volume-rendering technique in the angiographic follow-up of intracranial aneurysms embolized with coils.

    PubMed

    Zhou, Bing; Li, Ming-Hua; Wang, Wu; Xu, Hao-Wen; Cheng, Yong-De; Wang, Jue

    2010-03-01

    The authors conducted a study to evaluate the advantages of a 3D volume-rendering technique (VRT) in follow-up digital subtraction (DS) angiography of coil-embolized intracranial aneurysms. One hundred nine patients with 121 intracranial aneurysms underwent endovascular coil embolization and at least 1 follow-up DS angiography session at the authors' institution. Two neuroradiologists independently evaluated the conventional 2D DS angiograms, rotational angiograms, and 3D VRT images obtained at the interventional procedures and at DS angiography follow-up. If multiple follow-up sessions were performed, the final follow-up was mainly considered. The authors compared the 3 techniques for their ability to detect aneurysm remnants (including aneurysm neck and sac remnants) and parent artery stenosis based on the angiographic follow-up. The Kruskal-Wallis test was used for group comparisons, and the kappa test was used to measure interobserver agreement. Statistical analyses were performed using commercially available software. There was a highly statistically significant difference among 2D DS angiography, rotational angiography, and 3D VRT results (χ² = 9.9613, p = 0.0069) in detecting an aneurysm remnant. Further comparisons disclosed statistical significance between 3D VRT and rotational angiography (χ² = 4.9754, p = 0.0257); high statistical significance between 3D VRT and 2D DS angiography (χ² = 8.9169, p = 0.0028); and no significant difference between rotational angiography and 2D DS angiography (χ² = 0.5648, p = 0.4523). There was no statistically significant difference among the 3 techniques in detecting parent artery stenosis (χ² = 2.5164, p = 0.2842). One case, in which parent artery stenosis was diagnosed by 2D DS angiography and rotational angiography, was excluded by 3D VRT following observation of multiple views. The kappa test showed good agreement between the 2 observers. The 3D VRT is more sensitive in detecting aneurysm remnants than 2D DS angiography and rotational angiography and is helpful for identifying parent artery stenosis. The authors recommend this technique for the angiographic follow-up of patients with coil-embolized aneurysms.
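
    The two statistics named in the abstract are easy to reproduce in outline: a Kruskal-Wallis test across the three techniques and Cohen's kappa for interobserver agreement. The ratings below are hypothetical, and the kappa implementation is a generic one, not the authors' software.

    ```python
    import numpy as np
    from scipy.stats import kruskal

    # Hypothetical remnant-detection outcomes (1 = remnant seen) per case
    # for the three techniques.
    dsa_2d = [0, 0, 1, 0, 1, 0]
    rotational = [1, 0, 1, 1, 0, 1]
    vrt_3d = [1, 1, 1, 1, 0, 1]
    H, p = kruskal(dsa_2d, rotational, vrt_3d)
    print(f"Kruskal-Wallis: H = {H:.3f}, p = {p:.4f}")

    def cohen_kappa(r1, r2):
        """Cohen's kappa for two raters' categorical ratings."""
        r1, r2 = np.asarray(r1), np.asarray(r2)
        cats = np.union1d(r1, r2)
        po = np.mean(r1 == r2)                                      # observed
        pe = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in cats) # chance
        return (po - pe) / (1 - pe)

    print(f"kappa = {cohen_kappa([1, 1, 0, 1, 0, 1], [1, 1, 0, 0, 0, 1]):.2f}")
    ```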

  19. Quantifying the effect of colorization enhancement on mammogram images

    NASA Astrophysics Data System (ADS)

    Wojnicki, Paul J.; Uyeda, Elizabeth; Micheli-Tzanakou, Evangelia

    2002-04-01

    Current radiological displays provide only grayscale images of mammograms. Limiting the image space to grayscale provides only luminance differences and textures as cues for object recognition within the image. However, color can be an important and significant cue in the detection of shapes and objects. Increasing detection ability allows the radiologist to interpret the images in more detail, improving object recognition and diagnostic accuracy. Color detection experiments using our stimulus system have demonstrated that an observer can only detect an average of 140 levels of grayscale. An optimally colorized image can allow a user to distinguish 250-1000 different levels, hence increasing potential image feature detection by 2-7 times. By implementing a colorization map that follows the luminance map of the original grayscale image, the luminance profile is preserved and color is isolated as the enhancement mechanism. The effect of this enhancement mechanism on the shape, frequency composition and statistical characteristics of the Visual Evoked Potential (VEP) is analyzed and presented. Thus, the effectiveness of the image colorization is measured quantitatively using the VEP.
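
    One way to realize a luminance-preserving colorization map is to apply a colormap and then rescale each pixel's RGB so its Rec. 709 luminance matches the original gray level; clipping makes the match approximate at the extremes. The colormap choice and the rescaling rule are assumptions, not the paper's method.

    ```python
    import numpy as np
    import matplotlib.cm as cm

    def colorize_preserving_luminance(gray):
        """Map a grayscale image in [0, 1] through a colormap, then
        rescale each pixel's RGB so its Rec. 709 luminance matches the
        original gray value, isolating hue as the added cue. Clipping
        makes the match approximate near the ends of the range."""
        rgb = cm.viridis(gray)[..., :3]                    # drop alpha
        luma = rgb @ np.array([0.2126, 0.7152, 0.0722])    # Rec. 709 luminance
        scale = np.where(luma > 0, gray / np.maximum(luma, 1e-6), 0.0)
        return np.clip(rgb * scale[..., None], 0.0, 1.0)

    gray = np.linspace(0, 1, 256).reshape(16, 16)
    color = colorize_preserving_luminance(gray)
    ```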

  20. Probabilistic Signal Recovery and Random Matrices

    DTIC Science & Technology

    2016-12-08

    Probabilistic signal recovery and random matrices, with applications in statistics, biomedical data analysis, quantization, dimension reduction, and network science; topics include high-dimensional inference and geometry. Related publications include C. Le, E. Levina, and R. Vershynin, low-rank approximation with applications to community detection in networks, Annals of Statistics 44 (2016), 373-400.
