Sample records for scale statistical analysis

  1. Temporal scaling and spatial statistical analyses of groundwater level fluctuations

    NASA Astrophysics Data System (ADS)

    Sun, H.; Yuan, L., Sr.; Zhang, Y.

    2017-12-01

    Natural dynamics such as groundwater level fluctuations can exhibit multifractionality and/or multifractality, likely due to multi-scale aquifer heterogeneity and controlling factors, whose statistics require efficient quantification methods. This study explores multifractionality and non-Gaussian properties in groundwater dynamics, expressed by time series of daily level fluctuations at three wells located in the lower Mississippi valley, after removing the seasonal cycle. First, using time-scale multifractional analysis, a systematic statistical method is developed to analyze groundwater level fluctuations quantified by the time-scale local Hurst exponent (TS-LHE). Results show that the TS-LHE does not remain constant, implying that the fractal-scaling behavior changes with time and location. Hence, we can distinguish the potentially location-dependent scaling feature, which may characterize the hydrologic dynamic system. Second, spatial statistical analysis shows that the increments of groundwater level fluctuations exhibit a heavy-tailed, non-Gaussian distribution, which is better quantified by a Lévy stable distribution. Monte Carlo simulations of the fluctuation process also show that the linear fractional stable motion model depicts the transient dynamics (i.e., the fractal, non-Gaussian property) of groundwater level well, while fractional Brownian motion is inadequate to describe natural processes with anomalous dynamics. Analysis of temporal scaling and spatial statistics may therefore provide useful information and quantification for further understanding the nature of complex dynamics in hydrology.
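
The heavy-tailed, non-Gaussian increments described above can be illustrated with a small sketch (not the authors' code): sample excess kurtosis cleanly separates Gaussian increments from Lévy-stable ones. The stability parameter α = 1.5 and the sample sizes are arbitrary choices for the demonstration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic "increments": Gaussian vs. heavy-tailed Levy stable (alpha < 2).
gauss = rng.normal(0.0, 1.0, 5000)
stable = stats.levy_stable.rvs(1.5, 0.0, size=5000, random_state=42)

# Sample excess kurtosis flags the heavy, non-Gaussian tails:
print(stats.kurtosis(gauss))   # near 0 for Gaussian increments
print(stats.kurtosis(stable))  # large for heavy-tailed increments
```

For α < 2 the stable distribution has infinite theoretical kurtosis, so the sample value grows with sample size instead of settling near zero.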

  2. Detection of crossover time scales in multifractal detrended fluctuation analysis

    NASA Astrophysics Data System (ADS)

    Ge, Erjia; Leung, Yee

    2013-04-01

    Fractal analysis is employed in this paper as a scale-based method for identifying the scaling behavior of time series. Many spatial and temporal processes exhibiting complex multi(mono)-scaling behaviors are fractals. One of the important concepts in fractals is the crossover time scale(s) that separates distinct regimes having different fractal scaling behaviors. A common method is multifractal detrended fluctuation analysis (MF-DFA). The detection of crossover time scale(s) is, however, relatively subjective, since it has been made without rigorous statistical procedures and has generally been determined by eyeballing or subjective observation. Crossover time scales so determined may be spurious and problematic, and may not reflect the genuine underlying scaling behavior of a time series. The purpose of this paper is to propose a statistical procedure to model complex fractal scaling behaviors and reliably identify the crossover time scales under MF-DFA. The scaling-identification regression model, grounded on a solid statistical foundation, is first proposed to describe the multi-scaling behaviors of fractals. Through regression analysis and statistical inference, we can (1) identify crossover time scales that cannot be detected by eyeballing, (2) determine the number and locations of the genuine crossover time scales, (3) give confidence intervals for the crossover time scales, and (4) establish a statistically significant regression model depicting the underlying scaling behavior of a time series. To substantiate our argument, the regression model is applied to analyze the multi-scaling behaviors of avian-influenza outbreaks, water consumption, daily mean temperature, and rainfall in Hong Kong. Through the proposed model, we gain a deeper understanding of fractals in general and a statistical approach for identifying multi-scaling behavior under MF-DFA in particular.
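
A minimal stand-in for the scaling-identification idea (not the authors' model): fit two straight lines to log F(s) versus log s and take the breakpoint with the least total squared error as the crossover estimate. The synthetic slopes (0.8 and 0.3) and the crossover at s = 100 are assumptions for the demo.

```python
import numpy as np

def find_crossover(log_s, log_F):
    """Fit two straight lines to log F(s) vs log s; return the breakpoint
    index minimizing total squared error (a toy stand-in for the paper's
    scaling-identification regression model)."""
    best = (np.inf, None)
    for b in range(3, len(log_s) - 3):        # need >= 3 points per segment
        sse = 0.0
        for seg in (slice(0, b), slice(b, None)):
            coef = np.polyfit(log_s[seg], log_F[seg], 1)
            resid = log_F[seg] - np.polyval(coef, log_s[seg])
            sse += np.sum(resid ** 2)
        if sse < best[0]:
            best = (sse, b)
    return best[1]

# Synthetic fluctuation function with a crossover at s = 100 (log s = 2):
log_s = np.linspace(1, 3, 41)                  # scales 10 .. 1000
log_F = np.where(log_s < 2, 0.8 * log_s, 1.6 + 0.3 * (log_s - 2))
b = find_crossover(log_s, log_F + 0.01 * np.sin(7 * log_s))  # mild "noise"
print(10 ** log_s[b])   # estimated crossover scale, near 100
```

A statistically rigorous version would add inference on the breakpoint, e.g. a confidence interval from the profile of the residual sum of squares, as the paper proposes.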

  3. Manipulating measurement scales in medical statistical analysis and data mining: A review of methodologies

    PubMed Central

    Marateb, Hamid Reza; Mansourian, Marjan; Adibi, Peyman; Farina, Dario

    2014-01-01

    Background: Selecting the correct statistical test and data mining method depends highly on the measurement scale of the data, the type of variables, and the purpose of the analysis. Different measurement scales are studied in detail, and statistical comparison, modeling, and data mining methods are reviewed using several medical examples. We present two ordinal-variable clustering examples, as ordinal variables are more challenging to analyze, using the Wisconsin Breast Cancer Data (WBCD). Ordinal-to-interval scale conversion example: a breast cancer database of nine 10-level ordinal variables for 683 patients was analyzed with two ordinal-scale clustering methods. The performance of the clustering methods was assessed by comparison with the gold-standard groups of malignant and benign cases that had been identified by clinical tests. Results: The sensitivity and accuracy of the two clustering methods were 98% and 96%, respectively. Their specificity was comparable. Conclusion: By using a clustering algorithm appropriate to the measurement scale of the variables in the study, high performance is obtained. Moreover, descriptive and inferential statistics, as well as the modeling approach, must be selected based on the scale of the variables. PMID:24672565
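
The reported sensitivity, specificity, and accuracy can be computed from cluster labels against a clinical gold standard as follows. This is a generic sketch; the labels below are made up for illustration, not WBCD data.

```python
import numpy as np

def diagnostic_metrics(pred, gold, positive=1):
    """Sensitivity, specificity and accuracy of predicted labels against a
    gold standard (e.g. malignant = positive, benign = negative)."""
    pred, gold = np.asarray(pred), np.asarray(gold)
    tp = np.sum((pred == positive) & (gold == positive))
    tn = np.sum((pred != positive) & (gold != positive))
    fp = np.sum((pred == positive) & (gold != positive))
    fn = np.sum((pred != positive) & (gold == positive))
    return tp / (tp + fn), tn / (tn + fp), (tp + tn) / len(gold)

gold = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]   # hypothetical gold-standard labels
pred = [1, 1, 1, 0, 0, 0, 0, 0, 0, 1]   # hypothetical cluster assignments
sens, spec, acc = diagnostic_metrics(pred, gold)
print(sens, spec, acc)  # 0.75 0.8333... 0.8
```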

  4. A Framework for Establishing Standard Reference Scale of Texture by Multivariate Statistical Analysis Based on Instrumental Measurement and Sensory Evaluation.

    PubMed

    Zhi, Ruicong; Zhao, Lei; Xie, Nan; Wang, Houyin; Shi, Bolin; Shi, Jingye

    2016-01-13

    A framework for establishing a standard reference scale of texture is proposed, based on multivariate statistical analysis of instrumental measurements and sensory evaluation. Multivariate statistical analysis is conducted to rapidly select typical reference samples with the characteristics of universality, representativeness, stability, substitutability, and traceability. The soundness of the framework is verified by establishing a standard reference scale for the texture attribute hardness with well-known Chinese foods. More than 100 food products in 16 categories were tested by instrumental measurement (TPA test), and the results were analyzed with cluster analysis, principal component analysis, relative standard deviation, and analysis of variance. As a result, nine kinds of foods were selected to construct the hardness standard reference scale. The results indicate that the regression between the estimated sensory value and the instrumentally measured value is significant (R(2) = 0.9765), which fits well with Stevens's theory. The research provides a reliable theoretical basis and practical guide for establishing quantitative standard reference scales for food texture characteristics.
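
Stevens's power law S = k·Iⁿ becomes linear on log-log axes, so the sensory-instrumental regression can be sketched as below. The hardness values, exponent, and noise level are hypothetical, not the paper's data.

```python
import numpy as np

# Hypothetical instrumental hardness readings and sensory scores; per
# Stevens's power law S = k * I^n, regress on log-log axes and report R^2.
instrumental = np.array([2.0, 5.0, 10.0, 22.0, 45.0, 90.0, 180.0, 400.0, 800.0])
sensory = 1.8 * instrumental ** 0.45 \
          * np.exp(np.random.default_rng(0).normal(0, 0.05, 9))  # mild scatter

x, y = np.log(instrumental), np.log(sensory)
n, log_k = np.polyfit(x, y, 1)                 # slope = exponent n
resid = y - (n * x + log_k)
r2 = 1 - np.sum(resid**2) / np.sum((y - y.mean())**2)
print(n, np.exp(log_k), r2)   # exponent near 0.45, k near 1.8, R^2 near 1
```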

  5. [Statistical validity of the Mexican Food Security Scale and the Latin American and Caribbean Food Security Scale].

    PubMed

    Villagómez-Ornelas, Paloma; Hernández-López, Pedro; Carrasco-Enríquez, Brenda; Barrios-Sánchez, Karina; Pérez-Escamilla, Rafael; Melgar-Quiñónez, Hugo

    2014-01-01

    This article validates the statistical consistency of two food security scales: the Mexican Food Security Scale (EMSA) and the Latin American and Caribbean Food Security Scale (ELCSA). Validity tests were conducted to verify that both scales are consistent instruments, composed of independent, properly calibrated, and adequately sorted items arranged in a continuum of severity. The following tests were performed: sorting of items; Cronbach's alpha analysis; parallelism of prevalence curves; Rasch models; and sensitivity analysis through hypothesis tests on mean differences. The tests showed that both scales meet the required attributes and are robust statistical instruments for food security measurement. This is relevant given that the lack-of-access-to-food indicator, included in multidimensional poverty measurement in Mexico, is calculated with EMSA.
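
Cronbach's alpha, one of the validity tests listed, can be computed directly from an item-score matrix. A minimal sketch with made-up household responses (not EMSA/ELCSA data):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of sum score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the sum score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Toy dichotomous responses to a 4-item scale (rows = households):
scores = [[1, 1, 1, 0],
          [1, 1, 0, 0],
          [0, 0, 0, 0],
          [1, 1, 1, 1],
          [0, 1, 0, 0]]
print(cronbach_alpha(scores))   # 0.8
```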

  6. Statistical analysis of subjective preferences for video enhancement

    NASA Astrophysics Data System (ADS)

    Woods, Russell L.; Satgunam, PremNandhini; Bronstad, P. Matthew; Peli, Eli

    2010-02-01

    Measuring preferences for moving video quality is harder than for static images due to the fleeting and variable nature of moving video. Subjective preferences for image quality can be tested by observers indicating their preference for one image over another. Such pairwise comparisons can be analyzed using Thurstone scaling (Farrell, 1999). Thurstone (1927) scaling is widely used in applied psychology, marketing, food tasting and advertising research. Thurstone analysis constructs an arbitrary perceptual scale for the items that are compared (e.g. enhancement levels). However, Thurstone scaling does not determine the statistical significance of the differences between items on that perceptual scale. Recent papers have provided inferential statistical methods that produce an outcome similar to Thurstone scaling (Lipovetsky and Conklin, 2004). Here, we demonstrate that binary logistic regression can analyze preferences for enhanced video.
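
Thurstone Case V scaling from pairwise preference counts can be sketched as follows. The win counts are hypothetical, and the unit-normal-deviate averaging is the standard Case V recipe rather than the authors' exact procedure:

```python
import numpy as np
from scipy.stats import norm

def thurstone_case_v(wins):
    """Thurstone Case V scale values from a pairwise win-count matrix:
    wins[i, j] = number of times item i was preferred over item j."""
    wins = np.asarray(wins, dtype=float)
    trials = wins + wins.T
    trials[trials == 0] = 1.0             # avoid 0/0 on the diagonal
    p = wins / trials                     # preference proportions
    np.fill_diagonal(p, 0.5)
    p = np.clip(p, 0.01, 0.99)            # avoid infinite z-scores
    z = norm.ppf(p)                       # unit-normal deviates
    return z.mean(axis=1)                 # row means give the scale values

# Hypothetical counts for three enhancement levels, 20 comparisons per pair:
wins = np.array([[ 0, 14, 18],
                 [ 6,  0, 13],
                 [ 2,  7,  0]])
scale = thurstone_case_v(wins)
print(scale)   # level 0 scores highest, level 2 lowest
```

As the abstract notes, this yields an arbitrary perceptual scale; significance of the differences would come from an inferential model such as binary logistic regression on the paired outcomes.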

  7. How Attitudes towards Statistics Courses and the Field of Statistics Predicts Statistics Anxiety among Undergraduate Social Science Majors: A Validation of the Statistical Anxiety Scale

    ERIC Educational Resources Information Center

    O'Bryant, Monique J.

    2017-01-01

    The aim of this study was to validate an instrument that can be used by instructors or social scientists who are interested in evaluating statistics anxiety. The psychometric properties of the English version of the Statistical Anxiety Scale (SAS) were examined through a confirmatory factor analysis of scores from a sample of 323 undergraduate…

  8. A scaling procedure for the response of an isolated system with high modal overlap factor

    NASA Astrophysics Data System (ADS)

    De Rosa, S.; Franco, F.

    2008-10-01

    The paper deals with a numerical approach that reduces some physical sizes of the solution domain to compute the dynamic response of an isolated system; it has been named Asymptotical Scaled Modal Analysis (ASMA). The proposed numerical procedure alters the input data needed to obtain the classic modal responses in order to increase the frequency band of validity of the discrete or continuous coordinate model, through the definition of a proper scaling coefficient. It is demonstrated that the computational cost remains acceptable while the frequency range of analysis increases. Moreover, with reference to the flexural vibrations of a rectangular plate, the paper compares ASMA with statistical energy analysis and the energy distribution approach. Some insights are also given about the limits of the scaling coefficient. Finally, it is shown that the linear dynamic response predicted with the scaling procedure has the same quality and characteristics as statistical energy analysis, but it can be useful when the system cannot be solved appropriately by standard Statistical Energy Analysis (SEA).

  9. Statistical analysis of vegetation and stormwater runoff in an urban watershed during summer and winter storms in Portland, Oregon, U.S

    Treesearch

    Geoffrey H. Donovan; David T. Butry; Megan Y. Mao

    2016-01-01

    Past research has examined the effect of urban trees, and other vegetation, on stormwater runoff using hydrological models or small-scale experiments. However, there has been no statistical analysis of the influence of vegetation on runoff in an intact urban watershed, and it is not clear how results from small-scale studies scale up to the city level. Researchers...

  10. Wavelet analysis of polarization maps of polycrystalline biological fluids networks

    NASA Astrophysics Data System (ADS)

    Ushenko, Y. A.

    2011-12-01

    An optical model of human joint synovial fluid is proposed. The statistical (statistical moments), correlation (autocorrelation function), and self-similar (log-log dependencies of the power spectrum) structure of two-dimensional polarization distributions (polarization maps) of synovial fluid has been analyzed. It is shown that differentiating polarization maps of joint synovial fluid samples in different physiological states requires scale-discriminative analysis. To bring out the small-scale domain structure of synovial fluid polarization maps, wavelet analysis has been used. A set of parameters characterizing the statistical, correlation, and self-similar structure of the wavelet-coefficient distributions of polarization domains at different scales has been determined for diagnosing and differentiating polycrystalline network transformations connected with pathological processes.

  11. Scaling up to address data science challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendelberger, Joanne R.

    Statistics and Data Science provide a variety of perspectives and technical approaches for exploring and understanding Big Data. Partnerships between scientists from different fields such as statistics, machine learning, computer science, and applied mathematics can lead to innovative approaches for addressing problems involving increasingly large amounts of data in a rigorous and effective manner that takes advantage of advances in computing. This article explores various challenges in Data Science and highlights statistical approaches that can facilitate analysis of large-scale data, including sampling and data reduction methods, techniques for effective analysis and visualization of large-scale simulations, and algorithms and procedures for efficient processing.

  12. Scaling up to address data science challenges

    DOE PAGES

    Wendelberger, Joanne R.

    2017-04-27

    Statistics and Data Science provide a variety of perspectives and technical approaches for exploring and understanding Big Data. Partnerships between scientists from different fields such as statistics, machine learning, computer science, and applied mathematics can lead to innovative approaches for addressing problems involving increasingly large amounts of data in a rigorous and effective manner that takes advantage of advances in computing. This article explores various challenges in Data Science and highlights statistical approaches that can facilitate analysis of large-scale data, including sampling and data reduction methods, techniques for effective analysis and visualization of large-scale simulations, and algorithms and procedures for efficient processing.

  13. EVALUATION OF A NEW MEAN SCALED AND MOMENT ADJUSTED TEST STATISTIC FOR SEM.

    PubMed

    Tong, Xiaoxiao; Bentler, Peter M

    2013-01-01

    Recently a new mean scaled and skewness adjusted test statistic was developed for evaluating structural equation models in small samples and with potentially nonnormal data, but this statistic has received only limited evaluation. The performance of this statistic is compared to normal theory maximum likelihood and two well-known robust test statistics. A modification to the Satorra-Bentler scaled statistic is developed for the condition that sample size is smaller than degrees of freedom. The behavior of the four test statistics is evaluated with a Monte Carlo confirmatory factor analysis study that varies seven sample sizes and three distributional conditions obtained using Headrick's fifth-order transformation to nonnormality. The new statistic performs badly in most conditions except under the normal distribution. The goodness-of-fit χ(2) test based on maximum-likelihood estimation performed well under normal distributions as well as under a condition of asymptotic robustness. The Satorra-Bentler scaled test statistic performed best overall, while the mean scaled and variance adjusted test statistic outperformed the others at small and moderate sample sizes under certain distributional conditions.

  14. Data series embedding and scale invariant statistics.

    PubMed

    Michieli, I; Medved, B; Ristov, S

    2010-06-01

    Data sequences acquired from bio-systems such as human gait data, heart rate interbeat data, or DNA sequences exhibit complex dynamics that is frequently described by long-memory or power-law decay of the autocorrelation function. One way of characterizing this dynamics is through scale invariant statistics, or "fractal-like" behavior. Several methods have been proposed for quantifying the scale invariant parameters of physiological signals. Among them, the most common are detrended fluctuation analysis, sample mean variance analyses, power spectral density analysis, R/S analysis, and, recently, in the realm of the multifractal approach, wavelet analysis. In this paper it is demonstrated that embedding the time series data in a high-dimensional pseudo-phase space reveals scale invariant statistics in a simple fashion. The procedure is applied to different stride-interval data sets from human gait measurement time series (PhysioBank data library). Results show that the introduced mapping adequately separates long-memory from random behavior. Smaller gait data sets were analyzed, and scale-free trends over limited scale intervals were successfully detected. The method was verified on artificially produced time series with known scaling behavior and varying noise content. The possibility of the method falsely detecting long-range dependence in artificially generated short-range-dependent series was also investigated. (c) 2009 Elsevier B.V. All rights reserved.
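
The embedding step itself is a standard time-delay construction. A generic sketch (the dimension and delay below are arbitrary choices):

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Embed a 1-D series in a `dim`-dimensional pseudo-phase space with
    delay `tau`: row i is (x[i], x[i+tau], ..., x[i+(dim-1)*tau])."""
    x = np.asarray(x)
    n = len(x) - (dim - 1) * tau          # number of embedded points
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

x = np.arange(10)
E = delay_embed(x, dim=3, tau=2)
print(E.shape)   # (6, 3)
print(E[0])      # [0 2 4]
```

Scale-invariant statistics are then read off from how point-cloud measures in this embedded space change with resolution.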

  15. Autocorrelation and cross-correlation in time series of homicide and attempted homicide

    NASA Astrophysics Data System (ADS)

    Machado Filho, A.; da Silva, M. F.; Zebende, G. F.

    2014-04-01

    We propose in this paper to establish the relationship between homicides and attempted homicides by non-stationary time-series analysis. This analysis is carried out by detrended fluctuation analysis (DFA), detrended cross-correlation analysis (DCCA), and the DCCA cross-correlation coefficient, ρ(n). Through this analysis we can identify a positive cross-correlation between homicides and attempted homicides. At the same time, looked at from the point of view of autocorrelation (DFA), the analysis can be more informative depending on the time scale: at short scales (days) we cannot identify autocorrelations; on the scale of weeks DFA presents anti-persistent behavior; and at long time scales (n > 90 days) DFA presents persistent behavior. Finally, the application of this type of statistical analysis proved to be efficient and, in this sense, this paper can contribute to more accurate descriptive statistics of crime.
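
The DCCA cross-correlation coefficient is the ratio ρ(n) = F²_DCCA / (F_DFA,x · F_DFA,y). A compact sketch with overlapping boxes, run on synthetic series rather than the crime data:

```python
import numpy as np

def dcca_coefficient(x, y, n):
    """DCCA cross-correlation coefficient rho(n), using overlapping boxes
    of n+1 points with local linear detrending of the integrated profiles."""
    X = np.cumsum(x - np.mean(x))         # integrated profiles
    Y = np.cumsum(y - np.mean(y))
    t = np.arange(n + 1, dtype=float)
    f_xy = f_xx = f_yy = 0.0
    for i in range(len(X) - n):
        cx = np.polyfit(t, X[i : i + n + 1], 1)   # local linear trends
        cy = np.polyfit(t, Y[i : i + n + 1], 1)
        rx = X[i : i + n + 1] - np.polyval(cx, t)  # detrended residuals
        ry = Y[i : i + n + 1] - np.polyval(cy, t)
        f_xy += np.mean(rx * ry)
        f_xx += np.mean(rx * rx)
        f_yy += np.mean(ry * ry)
    return f_xy / np.sqrt(f_xx * f_yy)

rng = np.random.default_rng(1)
a = rng.normal(size=500)
b = a + 0.5 * rng.normal(size=500)        # positively coupled series
print(dcca_coefficient(a, b, 10))          # clearly positive
print(dcca_coefficient(a, a, 10))          # exactly 1 for identical series
```

By construction ρ(n) lies in [-1, 1], with ρ = 1 for identical series, which makes it a convenient scale-by-scale analogue of the ordinary correlation coefficient.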

  16. Multi-scale statistical analysis of coronal solar activity

    DOE PAGES

    Gamborino, Diana; del-Castillo-Negrete, Diego; Martinell, Julio J.

    2016-07-08

    Multi-filter images of the solar corona are used to obtain temperature maps that are analyzed using techniques based on proper orthogonal decomposition (POD) in order to extract dynamical and structural information at various scales. Exploring active regions before and after a solar flare and comparing them with quiet regions, we show that the multi-scale behavior presents distinct statistical properties for each case that can be used to characterize the level of activity in a region. Information about the nature of heat transport can also be extracted from the analysis.
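
POD of a snapshot matrix reduces to a singular value decomposition. A minimal sketch on synthetic "temperature" snapshots; the field, amplitudes, and noise level are assumptions, not coronal data:

```python
import numpy as np

# Rows = grid points, columns = snapshots in time.
rng = np.random.default_rng(0)
grid = np.linspace(0, 1, 64)
snapshots = np.column_stack([
    np.sin(np.pi * grid) * (1 + 0.3 * np.sin(0.2 * k))   # one coherent mode
    + 0.01 * rng.normal(size=64)                          # small-scale noise
    for k in range(40)
])

# POD: SVD of the mean-subtracted snapshot matrix.
mean_field = snapshots.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(snapshots - mean_field, full_matrices=False)
energy = s**2 / np.sum(s**2)       # fraction of variance captured per mode
print(energy[:3])                   # first mode dominates
```

The energy spectrum across modes is the kind of multi-scale statistic that can distinguish active from quiet regions: coherent activity concentrates energy in few modes, while disorder spreads it out.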

  17. Scaling Laws in Canopy Flows: A Wind-Tunnel Analysis

    NASA Astrophysics Data System (ADS)

    Segalini, Antonio; Fransson, Jens H. M.; Alfredsson, P. Henrik

    2013-08-01

    An analysis of velocity statistics and spectra measured above a wind-tunnel forest model is reported. Several measurement stations downstream of the forest edge have been investigated, and it is observed that, while the mean velocity profile adjusts quickly to the new canopy boundary condition, the turbulence lags behind and shows a continuous penetration towards the free stream along the canopy model. The statistical profiles illustrate this growth and do not collapse when plotted as a function of the vertical coordinate. However, when the statistics are plotted as a function of the local mean velocity (normalized with a characteristic velocity scale), they do collapse, independently of the streamwise position and free-stream velocity. A new scaling for the spectra of all three velocity components is proposed, based on the velocity variance and the integral time scale. This normalization improves the collapse of the spectra compared with existing scalings adopted in atmospheric measurements, and allows the determination of a universal function that provides the velocity spectrum. Furthermore, a comparison of the proposed scaling laws for two canopy densities is shown, demonstrating that the vertical velocity variance is the statistical quantity most sensitive to the characteristics of the canopy roughness.

  18. The Math Problem: Advertising Students' Attitudes toward Statistics

    ERIC Educational Resources Information Center

    Fullerton, Jami A.; Kendrick, Alice

    2013-01-01

    This study used the Students' Attitudes toward Statistics Scale (STATS) to measure attitude toward statistics among a national sample of advertising students. A factor analysis revealed four underlying factors make up the attitude toward statistics construct--"Interest & Future Applicability," "Confidence," "Statistical Tools," and "Initiative."…

  19. Scaled test statistics and robust standard errors for non-normal data in covariance structure analysis: a Monte Carlo study.

    PubMed

    Chou, C P; Bentler, P M; Satorra, A

    1991-11-01

    Research studying robustness of maximum likelihood (ML) statistics in covariance structure analysis has concluded that test statistics and standard errors are biased under severe non-normality. An estimation procedure known as asymptotic distribution free (ADF), making no distributional assumption, has been suggested to avoid these biases. Corrections to the normal theory statistics to yield more adequate performance have also been proposed. This study compares the performance of a scaled test statistic and robust standard errors for two models under several non-normal conditions and also compares these with the results from ML and ADF methods. Both ML and ADF test statistics performed rather well in one model and considerably worse in the other. In general, the scaled test statistic seemed to behave better than the ML test statistic and the ADF statistic performed the worst. The robust and ADF standard errors yielded more appropriate estimates of sampling variability than the ML standard errors, which were usually downward biased, in both models under most of the non-normal conditions. ML test statistics and standard errors were found to be quite robust to the violation of the normality assumption when data had either symmetric and platykurtic distributions, or non-symmetric and zero kurtotic distributions.

  20. Statistical Analysis of Small-Scale Magnetic Flux Emergence Patterns: A Useful Subsurface Diagnostic?

    NASA Astrophysics Data System (ADS)

    Lamb, Derek A.

    2016-10-01

    While sunspots follow a well-defined pattern of emergence in space and time, small-scale flux emergence is assumed to occur randomly at all times in the quiet Sun. HMI's full-disk coverage, high cadence, spatial resolution, and duty cycle allow us to probe that basic assumption. Some case studies of emergence suggest that temporal clustering on spatial scales of 50-150 Mm may occur. If clustering is present, it could serve as a diagnostic of large-scale subsurface magnetic field structures. We present the results of a manual survey of small-scale flux emergence events over a short time period, and a statistical analysis addressing the question of whether these events show spatio-temporal behavior that is anything other than random.

  1. Exploring Rating Quality in Rater-Mediated Assessments Using Mokken Scale Analysis

    ERIC Educational Resources Information Center

    Wind, Stefanie A.; Engelhard, George, Jr.

    2016-01-01

    Mokken scale analysis is a probabilistic nonparametric approach that offers statistical and graphical tools for evaluating the quality of social science measurement without placing potentially inappropriate restrictions on the structure of a data set. In particular, Mokken scaling provides a useful method for evaluating important measurement…

  2. Data management in large-scale collaborative toxicity studies: how to file experimental data for automated statistical analysis.

    PubMed

    Stanzel, Sven; Weimer, Marc; Kopp-Schneider, Annette

    2013-06-01

    High-throughput screening approaches are carried out for the toxicity assessment of a large number of chemical compounds. In such large-scale in vitro toxicity studies several hundred or thousand concentration-response experiments are conducted. The automated evaluation of concentration-response data using statistical analysis scripts saves time and yields more consistent results in comparison to data analysis performed by the use of menu-driven statistical software. Automated statistical analysis requires that concentration-response data are available in a standardised data format across all compounds. To obtain consistent data formats, a standardised data management workflow must be established, including guidelines for data storage, data handling and data extraction. In this paper two procedures for data management within large-scale toxicological projects are proposed. Both procedures are based on Microsoft Excel files as the researcher's primary data format and use a computer programme to automate the handling of data files. The first procedure assumes that data collection has not yet started whereas the second procedure can be used when data files already exist. Successful implementation of the two approaches into the European project ACuteTox is illustrated. Copyright © 2012 Elsevier Ltd. All rights reserved.

  3. A large-scale perspective on stress-induced alterations in resting-state networks

    NASA Astrophysics Data System (ADS)

    Maron-Katz, Adi; Vaisvaser, Sharon; Lin, Tamar; Hendler, Talma; Shamir, Ron

    2016-02-01

    Stress is known to induce large-scale neural modulations. However, its neural effect once the stressor is removed, and how this effect relates to subjective experience, are not fully understood. Here we used a statistically sound data-driven approach to investigate alterations in large-scale resting-state functional connectivity (rsFC) induced by acute social stress. We compared rsfMRI profiles of 57 healthy male subjects before and after stress induction. Using a parcellation-based univariate statistical analysis, we identified a large-scale rsFC change involving 490 parcel pairs. To characterize this change, we employed statistical enrichment analysis, identifying anatomical structures that were significantly interconnected by these pairs. This analysis revealed strengthening of thalamo-cortical connectivity and weakening of cross-hemispheric parieto-temporal connectivity. These alterations were further found to be associated with changes in subjective stress reports. Integrating report-based information on stress sustainment 20 minutes post induction revealed a single significant rsFC change, between the right amygdala and the precuneus, which inversely correlated with the level of subjective recovery. Our study demonstrates the value of enrichment analysis for exploring large-scale network reorganization patterns, and provides new insight into stress-induced neural modulations and their relation to subjective experience.
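
Enrichment of an anatomical structure among altered parcel pairs is commonly assessed with a hypergeometric (over-representation) test. A sketch with made-up counts; the paper's exact enrichment statistic may differ:

```python
from scipy.stats import hypergeom

# Hypothetical counts for one anatomical structure:
M = 20   # total parcel pairs examined
n = 5    # pairs that connect the structure of interest
N = 5    # pairs flagged as significantly altered
k = 4    # flagged pairs that involve the structure

# P(X >= k) under random draws without replacement:
p_value = hypergeom.sf(k - 1, M, n, N)
print(p_value)   # ~0.0049: the structure is over-represented
```

In practice one such test is run per structure, followed by multiple-comparison correction across structures.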

  4. Robust Fixed-Structure Control

    DTIC Science & Technology

    1994-10-30

    "Deterministic Foundation for Statistical Energy Analysis," J. Sound Vibr., to appear. D. S. Bernstein and S. P. Bhat, "Lyapunov Stability, Semistability…S. Bernstein, "Power Flow, Energy Balance, and Statistical Energy Analysis for Large Scale, Interconnected Systems," Proc. Amer. Contr. Conf., pp

  5. Statistical analysis of hydrological response in urbanising catchments based on adaptive sampling using inter-amount times

    NASA Astrophysics Data System (ADS)

    ten Veldhuis, Marie-Claire; Schleiss, Marc

    2017-04-01

    In this study, we introduce an alternative approach for the analysis of hydrological flow time series, using an adaptive sampling framework based on inter-amount times (IATs). The main difference from conventional flow time series is the rate at which low and high flows are sampled: the unit of analysis for IATs is a fixed flow amount instead of a fixed time window. We analyse statistical distributions of flows and IATs across a wide range of sampling scales to investigate the sensitivity of statistical properties such as quantiles, variance, skewness, scaling parameters and flashiness indicators to the sampling scale. We do this based on streamflow time series for 17 (semi-)urbanised basins in North Carolina, US, ranging from 13 km2 to 238 km2 in size. Results show that adaptive sampling of flow time series based on inter-amounts leads to a more balanced representation of low flow and peak flow values in the statistical distribution. While conventional sampling gives a lot of weight to low flows, as these are most ubiquitous in flow time series, IAT sampling gives relatively more weight to high flow values, since a given flow amount is accumulated in a shorter time. As a consequence, IAT sampling gives more information about the tail of the distribution associated with high flows, while conventional sampling gives relatively more information about low flow periods. We will present results of statistical analyses across a range of subdaily to seasonal scales and will highlight some interesting insights that can be derived from IAT statistics with respect to basin flashiness and the impact of urbanisation on hydrological response.
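
Inter-amount-time sampling can be sketched as follows: accumulate volume and record the time needed to collect each successive fixed amount. The flow values, time step, and amount below are arbitrary illustrations:

```python
import numpy as np

def inter_amount_times(flow, dt, amount):
    """Times needed to accumulate each successive fixed flow `amount`,
    from a flow series sampled every `dt` (linear interpolation in time)."""
    volume = np.concatenate([[0.0], np.cumsum(flow) * dt])   # cumulative volume
    targets = np.arange(amount, volume[-1], amount)
    # time at which each target volume is first reached:
    t = np.interp(targets, volume, np.arange(len(volume)) * dt)
    return np.diff(np.concatenate([[0.0], t]))               # inter-amount times

# Constant flow -> equal IATs; a flood peak -> short IATs during the peak.
steady = inter_amount_times(np.full(100, 2.0), dt=1.0, amount=10.0)
peaky = inter_amount_times(np.concatenate([np.full(50, 1.0),
                                           np.full(10, 20.0),
                                           np.full(50, 1.0)]), 1.0, 10.0)
print(steady[:3])                 # [5. 5. 5.]
print(peaky.min(), peaky.max())   # short IATs in the peak, long in recession
```

Short IATs correspond to high-flow periods, so the IAT distribution weights the flood tail more heavily than fixed-time sampling does.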

  6. Avalanche Statistics Identify Intrinsic Stellar Processes near Criticality in KIC 8462852

    NASA Astrophysics Data System (ADS)

    Sheikh, Mohammed A.; Weaver, Richard L.; Dahmen, Karin A.

    2016-12-01

    The star KIC8462852 (Tabby's star) has shown anomalous drops in light flux. We perform a statistical analysis of the more numerous smaller dimming events by using methods found useful for avalanches in ferromagnetism and plastic flow. Scaling exponents for avalanche statistics and temporal profiles of the flux during the dimming events are close to mean field predictions. Scaling collapses suggest that this star may be near a nonequilibrium critical point. The large events are interpreted as avalanches marked by modified dynamics, limited by the system size, and not within the scaling regime.
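
Scaling exponents for avalanche-size distributions are typically estimated with the continuous power-law maximum-likelihood estimator of Clauset et al.; this is a generic sketch on synthetic sizes, not necessarily the authors' fitting procedure:

```python
import numpy as np

def powerlaw_mle(x, xmin):
    """Continuous power-law exponent MLE (Clauset et al., 2009):
    alpha_hat = 1 + n / sum(ln(x / xmin)) over samples with x >= xmin."""
    x = np.asarray(x, dtype=float)
    x = x[x >= xmin]
    return 1.0 + len(x) / np.sum(np.log(x / xmin))

# Synthetic avalanche sizes drawn from p(x) ~ x^(-2.5) via inverse CDF:
rng = np.random.default_rng(3)
u = rng.uniform(size=5000)
sizes = 1.0 * (1 - u) ** (-1 / (2.5 - 1))   # xmin = 1
print(powerlaw_mle(sizes, xmin=1.0))        # close to 2.5
```

Nearness to a critical point would additionally be supported by scaling collapses of the avalanche temporal profiles, as the abstract describes.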

  7. Psychometric Analysis of Role Conflict and Ambiguity Scales in Academia

    ERIC Educational Resources Information Center

    Khan, Anwar; Yusoff, Rosman Bin Md.; Khan, Muhammad Muddassar; Yasir, Muhammad; Khan, Faisal

    2014-01-01

    A comprehensive psychometric analysis of Rizzo et al.'s (1970) Role Conflict & Ambiguity (RCA) scales was performed after its distribution among 600 academic staff working in six universities of Pakistan. The reliability analysis included the calculation of Cronbach's alpha coefficients and inter-item statistics, whereas validity was determined by…

  8. A primer on the study of transitory dynamics in ecological series using the scale-dependent correlation analysis.

    PubMed

    Rodríguez-Arias, Miquel Angel; Rodó, Xavier

    2004-03-01

    Here we describe a practical, step-by-step primer to scale-dependent correlation (SDC) analysis. The analysis of transitory processes is an important but often neglected topic in ecological studies, because few statistical techniques detect temporary features accurately enough. We introduce SDC analysis, a statistical and graphical method to study transitory processes at any temporal or spatial scale. SDC analysis, by combining conventional procedures with simple, well-known statistical techniques, becomes an improved time-domain analogue of wavelet analysis. We use several simple synthetic series to describe the method, a more complex example full of transitory features to compare SDC and wavelet analysis, and finally selected ecological series to illustrate the methodology. The SDC analysis of time series of copepod abundances in the North Sea indicates that ENSO is the main climatic driver of short-term changes in population dynamics. SDC also uncovers some long-term, unexpected features in the population. Similarly, the SDC analysis of Nicholson's blowflies data locates where the proposed models fail and provides new insights into the mechanism that drives the apparent vanishing of the population cycle during the second half of the series.
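
A time-domain SDC computation boils down to correlating all pairs of equal-length fragments of two series at a chosen scale s. A minimal sketch with a shared transitory oscillation; the series, offsets, and noise level are made up:

```python
import numpy as np

def sdc_matrix(x, y, s):
    """Correlations between all pairs of length-`s` fragments of two series:
    entry (i, j) correlates x[i:i+s] with y[j:j+s]. High off-diagonal values
    flag lagged, transitory coupling at scale s."""
    nx, ny = len(x) - s + 1, len(y) - s + 1
    out = np.empty((nx, ny))
    for i in range(nx):
        for j in range(ny):
            out[i, j] = np.corrcoef(x[i:i + s], y[j:j + s])[0, 1]
    return out

# Two noisy series sharing one transitory oscillation, offset by 20 samples:
rng = np.random.default_rng(7)
burst = np.sin(np.linspace(0, 4 * np.pi, 40))
x = 0.2 * rng.normal(size=120); x[30:70] += burst
y = 0.2 * rng.normal(size=120); y[50:90] += burst
m = sdc_matrix(x, y, s=40)
i, j = np.unravel_index(np.argmax(m), m.shape)
print(i, j, j - i)   # strongest coupling where the fragments align: lag ~ 20
```

A full SDC analysis adds significance screening of the fragment correlations; this sketch only locates the strongest one.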

  9. Demonstration of Wavelet Techniques in the Spectral Analysis of Bypass Transition Data

    NASA Technical Reports Server (NTRS)

    Lewalle, Jacques; Ashpis, David E.; Sohn, Ki-Hyeon

    1997-01-01

    A number of wavelet-based techniques for the analysis of experimental data are developed and illustrated. A multiscale analysis based on the Mexican hat wavelet is demonstrated as a tool for acquiring physical and quantitative information not obtainable by standard signal analysis methods. Experimental data for the analysis came from simultaneous hot-wire velocity traces in a bypass transition of the boundary layer on a heated flat plate. A pair of traces (two components of velocity) at one location was excerpted. A number of ensemble and conditional statistics related to dominant time scales for energy and momentum transport were calculated. The analysis revealed a lack of energy-dominant time scales inside turbulent spots but identified transport-dominant scales inside spots that account for the largest part of the Reynolds stress. Momentum transport was much more intermittent than were energetic fluctuations. This work is the first step in a continuing study of the spatial evolution of these scale-related statistics, the goal being to apply the multiscale analysis results to improve the modeling of transitional and turbulent industrial flows.
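
A multiscale analysis with the Mexican hat (Ricker) wavelet, as used above, reduces to one convolution of the signal per scale. The sketch below is a minimal illustration, not the authors' code; the width grid and test signal are arbitrary.

```python
import numpy as np

def ricker(points, a):
    """Mexican hat (Ricker) wavelet of width parameter a, sampled at `points` ticks."""
    t = np.arange(points) - (points - 1) / 2.0
    amp = 2.0 / (np.sqrt(3.0 * a) * np.pi ** 0.25)   # unit-energy normalisation
    return amp * (1.0 - (t / a) ** 2) * np.exp(-0.5 * (t / a) ** 2)

def cwt_ricker(signal, widths):
    """Continuous wavelet transform: one convolution row per width (scale)."""
    return np.array([
        np.convolve(signal, ricker(min(10 * w, len(signal)), w), mode="same")
        for w in widths
    ])

sig = np.cos(2 * np.pi * np.arange(512) / 32.0)
coeffs = cwt_ricker(sig, widths=[1, 4, 8, 16])   # rows = scales, columns = time
```

Conditional statistics (e.g. inside versus outside turbulent spots) can then be computed on the rows of `coeffs`, each of which isolates fluctuations near one time scale.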

  10. Statistical analysis of an inter-laboratory comparison of small-scale safety and thermal testing of RDX

    DOE PAGES

    Brown, Geoffrey W.; Sandstrom, Mary M.; Preston, Daniel N.; ...

    2014-11-17

    In this study, the Integrated Data Collection Analysis (IDCA) program has conducted a proficiency test for small-scale safety and thermal (SSST) testing of homemade explosives (HMEs). Described here are statistical analyses of the results from this test for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Class 5 Type II standard. The material was tested as a well-characterized standard several times during the proficiency test to assess differences among participants and the range of results that may arise for well-behaved explosive materials.

  11. Resilience Among Students at the Basic Enlisted Submarine School

    DTIC Science & Technology

    2016-12-01

    reported resilience. The Hayes’ Macro in the Statistical Package for the Social Sciences (SPSS) was used to uncover factors relevant to mediation analysis. Findings suggest that the encouragement of…to Stressful Experiences Scale RTC Recruit Training Command SPSS Statistical Package for the Social Sciences SS Social Support SWB Subjective Well

  12. Exploring the validity and statistical utility of a racism scale among Black men who have sex with men: a pilot study.

    PubMed

    Smith, William Pastor

    2013-09-01

    The primary purpose of this two-phased study was to examine the structural validity and statistical utility of a racism scale specific to Black men who have sex with men (MSM) who resided in the Washington, DC, metropolitan area and Baltimore, Maryland. Phase I involved pretesting a 10-item racism measure with 20 Black MSM. Based on pretest findings, the scale was adapted into a 21-item racism scale for use in collecting data on 166 respondents in Phase II. Exploratory factor analysis of the 21-item racism scale resulted in a 19-item, two-factor solution. The two factors or subscales were the following: General Racism and Relationships and Racism. Confirmatory factor analysis was used in testing construct validity of the factored racism scale. Specifically, the two racism factors were combined with three homophobia factors into a confirmatory factor analysis model. Both the comparative and incremental fit indices were equal to .90, suggesting an adequate convergence of the racism and homophobia dimensions into a single social oppression construct. Statistical utility of the two racism subscales was demonstrated when regression analysis revealed that the gay-identified men versus bisexual-identified men in the sample were more likely to experience increased racism within the context of intimate relationships and less likely to be exposed to repeated experiences of general racism. Overall, the findings in this study highlight the importance of continuing to explore the psychometric properties of a racism scale that accounts for the unique psychosocial concerns experienced by Black MSM.

  13. Can power-law scaling and neuronal avalanches arise from stochastic dynamics?

    PubMed

    Touboul, Jonathan; Destexhe, Alain

    2010-02-11

    The presence of self-organized criticality in biology is often evidenced by a power-law scaling of event size distributions, which can be measured by linear regression on logarithmic axes. We show here that such a procedure does not necessarily mean that the system exhibits self-organized criticality. We first provide an analysis of multisite local field potential (LFP) recordings of brain activity and show that event size distributions defined as negative LFP peaks can be close to power-law distributions. However, this result is not robust to change in detection threshold, or when tested using more rigorous statistical analyses such as the Kolmogorov-Smirnov test. Similar power-law scaling is observed for surrogate signals, suggesting that power-law scaling may be a generic property of thresholded stochastic processes. We next investigate this problem analytically, and show that, indeed, stochastic processes can produce spurious power-law scaling without the presence of underlying self-organized criticality. However, this power-law is only apparent in logarithmic representations, and does not survive more rigorous analysis such as the Kolmogorov-Smirnov test. The same analysis was also performed on an artificial network known to display self-organized criticality. In this case, both the graphical representations and the rigorous statistical analysis reveal with no ambiguity that the avalanche size is distributed as a power-law. We conclude that logarithmic representations can lead to spurious power-law scaling induced by the stochastic nature of the phenomenon. This apparent power-law scaling does not constitute a proof of self-organized criticality, which should be demonstrated by more stringent statistical tests.
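
The paper's central caution is that a straight line on log-log axes is far weaker evidence for a power law than a distributional test. The two procedures it compares can be sketched in a few lines: fit a tail exponent by linear regression on the empirical CCDF in log-log coordinates, then apply the more stringent Kolmogorov-Smirnov test. The sample below is drawn from a genuine power law (a Pareto distribution); the exponent and sample size are arbitrary, and this is our sketch, not the authors' analysis code.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Sample from a genuine power law (Pareto, shape/tail exponent b) ...
b = 1.5
data = stats.pareto.rvs(b, size=2000, random_state=rng)

# ... estimate the exponent by linear regression on the log-log empirical CCDF
# (the graphical procedure the paper criticises as insufficient on its own) ...
xs = np.sort(data)
ccdf = 1.0 - np.arange(1, len(xs) + 1) / len(xs)
mask = ccdf > 0
slope, intercept = np.polyfit(np.log(xs[mask]), np.log(ccdf[mask]), 1)

# ... then apply the Kolmogorov-Smirnov test against the candidate distribution.
d, p = stats.kstest(data, "pareto", args=(b,))
```

For a true power law both diagnostics agree (negative log-log slope, small KS statistic); for a thresholded stochastic process only the graphical one would look convincing, which is the spurious-scaling trap described above.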

  14. Ion-Scale Wave Properties and Enhanced Ion Heating across the Magnetopause during Kelvin-Helmholtz Instability

    NASA Astrophysics Data System (ADS)

    Nykyri, K.; Moore, T.; Dimmock, A. P.

    2017-12-01

    In the Earth's magnetosphere, the magnetotail plasma sheet ions are much hotter than in the shocked solar wind. On the dawn-sector, the cold-component ions are more abundant and hotter by 30-40 percent when compared to the dusk sector. Recent statistical studies of the flank magnetopause and magnetosheath have shown that the level of temperature asymmetry of the magnetosheath is unable to account for this, so additional physical mechanisms must be at play, either at the magnetopause or plasma sheet, that contribute to this asymmetry. In this study, we perform a statistical analysis on the ion-scale wave properties in the three main plasma regimes common to flank magnetopause boundary crossings when the boundary is unstable to KHI: hot and tenuous magnetospheric, cold and dense magnetosheath, and mixed [Hasegawa et al., 2004]. These statistics of ion-scale wave properties are compared to observations of fast magnetosonic wave modes that have recently been linked to Kelvin-Helmholtz vortex centered ion heating [Moore et al., 2016]. The statistical analysis shows that during KH events there is enhanced non-adiabatic heating calculated during (temporal) ion scale wave intervals when compared to non-KH events.

  15. One Hundred Ways to be Non-Fickian - A Rigorous Multi-Variate Statistical Analysis of Pore-Scale Transport

    NASA Astrophysics Data System (ADS)

    Most, Sebastian; Nowak, Wolfgang; Bijeljic, Branko

    2015-04-01

    Fickian transport in groundwater flow is the exception rather than the rule. Transport in porous media is frequently simulated via particle methods (i.e. particle tracking random walk (PTRW) or continuous time random walk (CTRW)). These methods formulate transport as a stochastic process of particle position increments. At the pore scale, geometry and micro-heterogeneities prohibit the commonly made assumption of independent and normally distributed increments to represent dispersion. Many recent particle methods seek to loosen this assumption. Hence, it is important to get a better understanding of the processes at pore scale. For our analysis we track the positions of 10,000 particles migrating through the pore space over time. The data we use come from micro CT scans of a homogeneous sandstone and encompass about 10 grain sizes. Based on those images we discretize the pore structure and simulate flow at the pore scale based on the Navier-Stokes equation. This flow field realistically describes flow inside the pore space and we do not need to add artificial dispersion during the transport simulation. Next, we use particle tracking random walk and simulate pore-scale transport. Finally, we use the obtained particle trajectories to do a multivariate statistical analysis of the particle motion at the pore scale. Our analysis is based on copulas. Every multivariate joint distribution is a combination of its univariate marginal distributions. The copula represents the dependence structure of those univariate marginals and is therefore useful to observe correlation and non-Gaussian interactions (i.e. non-Fickian transport). The first goal of this analysis is to better understand the validity regions of commonly made assumptions. We are investigating three different transport distances: 1) The distance where the statistical dependence between particle increments can be modelled as an order-one Markov process. 
This would be the Markovian distance for the process, where the validity of yet-unexplored non-Gaussian-but-Markovian random walks starts. 2) The distance where bivariate statistical dependence simplifies to a multi-Gaussian dependence based on simple linear correlation (validity of correlated PTRW/CTRW). 3) The distance of complete statistical independence (validity of classical PTRW/CTRW). The second objective is to reveal the characteristic dependencies influencing transport the most. Those dependencies can be very complex. Copulas are highly capable of representing linear dependence as well as non-linear dependence. With that tool we are able to detect persistent characteristics dominating transport even across different scales. The results derived from our experimental data set suggest that there are many more non-Fickian aspects of pore-scale transport than the univariate statistics of longitudinal displacements. Non-Fickianity can also be found in transverse displacements, and in the relations between increments at different time steps. Also, the dependence found is non-linear (i.e. beyond simple correlation) and persists over long distances. Thus, our results strongly support the further refinement of techniques like correlated PTRW or correlated CTRW towards non-linear statistical relations.
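
A minimal version of the copula-style dependence check described above is to map successive increment pairs to their ranks, the empirical-copula coordinates, and measure rank correlation: any structure left after the rank transform is dependence beyond the marginals, i.e. a violation of the independent-increment (Fickian) assumption. The AR(1) toy increments below stand in for real particle displacement data; all parameters and names are ours, for illustration only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Toy displacement increments with memory (AR(1)), standing in for
# pore-scale particle data; phi controls the strength of the memory.
n, phi = 5000, 0.6
incr = np.empty(n)
incr[0] = rng.normal()
for k in range(1, n):
    incr[k] = phi * incr[k - 1] + rng.normal()

# Pseudo-observations: ranks of successive increment pairs scaled into (0, 1).
# These are the empirical-copula coordinates; the marginals are now uniform,
# so any remaining association is pure dependence structure.
u = stats.rankdata(incr[:-1]) / n
v = stats.rankdata(incr[1:]) / n
rho, _ = stats.spearmanr(u, v)   # nonzero => non-independent increments
```

For independent increments `rho` would fluctuate around zero; repeating the check at growing lags gives a crude estimate of the decorrelation distances 1)-3) discussed above.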

  16. Robust Mokken Scale Analysis by Means of the Forward Search Algorithm for Outlier Detection

    ERIC Educational Resources Information Center

    Zijlstra, Wobbe P.; van der Ark, L. Andries; Sijtsma, Klaas

    2011-01-01

    Exploratory Mokken scale analysis (MSA) is a popular method for identifying scales from larger sets of items. As with any statistical method, in MSA the presence of outliers in the data may result in biased results and wrong conclusions. The forward search algorithm is a robust diagnostic method for outlier detection, which we adapt here to…

  17. Relationship between Hounsfield unit in CT scan and gray scale in CBCT

    NASA Astrophysics Data System (ADS)

    Kamaruddin, Noorshaida; Rajion, Zainul Ahmad; Yusof, Asilah; Aziz, Mohd Ezane

    2016-12-01

    Cone-beam computed tomography (CBCT) is an imaging system which has advantages over computed tomography (CT). Recently, CBCT has become widely used for oral and maxillofacial imaging. In CT scan, the Hounsfield Unit (HU) is proportional to the degree of x-ray attenuation by the tissue. In CBCT, the degree of x-ray attenuation is shown by gray scale (voxel value). The aim of the present (in vitro) study was to investigate the relationship between gray scale in CBCT and HU in CT scan. In this descriptive study, an anthropomorphic head phantom was scanned with CBCT and CT scanners. Gray scales and HUs were detected on images at the crown of the teeth and the trabecular and cortical bone of the mandible. The images were analyzed to obtain the gray scale and HU values. The obtained values were then used to investigate the relationship between CBCT gray scales and HUs. For the statistical analysis, t-test, Pearson's correlation and regression analysis were used. The differences between the gray scale of CBCT and HU of CT were statistically not significant, whereas the Pearson's correlation coefficients demonstrated a statistically significant correlation between CBCT gray scale and CT HU values. Considering the fact that gray scale in CBCT is important in pre-assessment evaluation of bone density before implant treatments, CBCT is recommended because of its lower dose and cost compared to CT scan.
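
The statistical workflow described (paired t-test for systematic differences, Pearson correlation for association, least-squares regression to map gray values onto HU) can be sketched in a few lines. The paired readings below are invented for illustration and are not data from the study.

```python
import numpy as np
from scipy import stats

# Hypothetical paired readings at matching anatomical sites:
# CBCT gray values and CT Hounsfield units (illustrative numbers only).
gray = np.array([420.0, 760.0, 980.0, 1310.0, 1650.0, 1890.0])
hu   = np.array([350.0, 690.0, 940.0, 1280.0, 1540.0, 1820.0])

t_stat, t_p = stats.ttest_rel(gray, hu)      # paired t-test: systematic offset?
r, r_p = stats.pearsonr(gray, hu)            # strength of linear association
fit = stats.linregress(gray, hu)             # calibration line: HU ~ a + b*gray

# Predict the HU equivalent of a CBCT gray value of 1000 via the fitted line.
predicted_hu = fit.intercept + fit.slope * 1000.0
```

A high `r` with a non-significant paired difference is exactly the pattern the abstract reports: the two scales track each other even though neither is a drop-in replacement for the other without calibration.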

  18. SU-F-I-10: Spatially Local Statistics for Adaptive Image Filtering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iliopoulos, AS; Sun, X; Floros, D

    Purpose: To facilitate adaptive image filtering operations, addressing spatial variations in both noise and signal. Such issues are prevalent in cone-beam projections, where physical effects such as X-ray scattering result in spatially variant noise, violating common assumptions of homogeneous noise and challenging conventional filtering approaches to signal extraction and noise suppression. Methods: We present a computational mechanism for probing into and quantifying the spatial variance of noise throughout an image. The mechanism builds a pyramid of local statistics at multiple spatial scales; local statistical information at each scale includes (weighted) mean, median, standard deviation, median absolute deviation, as well as histogram or dynamic range after local mean/median shifting. Based on inter-scale differences of local statistics, the spatial scope of distinguishable noise variation is detected in a semi- or un-supervised manner. Additionally, we propose and demonstrate the incorporation of such information in globally parametrized (i.e., non-adaptive) filters, effectively transforming the latter into spatially adaptive filters. The multi-scale mechanism is materialized by efficient algorithms and implemented in parallel CPU/GPU architectures. Results: We demonstrate the impact of local statistics for adaptive image processing and analysis using cone-beam projections of a Catphan phantom, fitted within an annulus to increase X-ray scattering. The effective spatial scope of local statistics calculations is shown to vary throughout the image domain, necessitating multi-scale noise and signal structure analysis. Filtering results with and without spatial filter adaptation are compared visually, illustrating improvements in imaging signal extraction and noise suppression, and in preserving information in low-contrast regions. 
Conclusion: Local image statistics can be incorporated in filtering operations to equip them with spatial adaptivity to spatial signal/noise variations. An efficient multi-scale computational mechanism is developed to curtail processing latency. Spatially adaptive filtering may impact subsequent processing tasks such as reconstruction and numerical gradient computations for deformable registration. NIH Grant No. R01-184173.

  19. An Analysis of Methods Used to Examine Gender Differences in Computer-Related Behavior.

    ERIC Educational Resources Information Center

    Kay, Robin

    1992-01-01

    Review of research investigating gender differences in computer-related behavior examines statistical and methodological flaws. Issues addressed include sample selection, sample size, scale development, scale quality, the use of univariate and multivariate analyses, regressional analysis, construct definition, construct testing, and the…

  20. Revised Perturbation Statistics for the Global Scale Atmospheric Model

    NASA Technical Reports Server (NTRS)

    Justus, C. G.; Woodrum, A.

    1975-01-01

    Magnitudes and scales of atmospheric perturbations about the monthly mean for the thermodynamic variables and wind components are presented by month at various latitudes. These perturbation statistics are a revision of the random perturbation data required for the global scale atmospheric model program and are from meteorological rocket network statistical summaries in the 22 to 65 km height range and NASA grenade and pitot tube data summaries in the region up to 90 km. The observed perturbations in the thermodynamic variables were adjusted to make them consistent with constraints required by the perfect gas law and the hydrostatic equation. Vertical scales were evaluated by Buell's depth of pressure system equation and from vertical structure function analysis. Tables of magnitudes and vertical scales are presented for each month at latitude 10, 30, 50, 70, and 90 degrees.

  1. Statistical characterisation of COSMO Sky-Med X-SAR retrieved precipitation fields by scale-invariance analysis

    NASA Astrophysics Data System (ADS)

    Deidda, Roberto; Mascaro, Giuseppe; Hellies, Matteo; Baldini, Luca; Roberto, Nicoletta

    2013-04-01

    COSMO Sky-Med (CSK) is an important programme of the Italian Space Agency aiming at supporting environmental monitoring and management of exogenous, endogenous and anthropogenic risks through X-band Synthetic Aperture Radar (X-SAR) on board four satellites forming a constellation. Most typical SAR applications are focused on land or ocean observation. However, X-band SAR can detect precipitation, which produces a specific signature caused by the combination of attenuation of surface returns induced by precipitation and enhancement of backscattering determined by the hydrometeors in the SAR resolution volume. Within the CSK programme, we conducted an intercomparison between the statistical properties of precipitation fields derived from CSK SARs and those derived from the CNR Polar 55C (C-band) ground-based weather radar located in Rome (Italy). This contribution presents the main results of this research, which was aimed at the robust characterisation of rainfall statistical properties across different scales by means of scale-invariance analysis and multifractal theory. The analysis was performed on a dataset of more than two years of precipitation observations collected by the CNR Polar 55C radar and rainfall fields derived from available images collected by the CSK satellites during intense rainfall events. Scale-invariance laws and multifractal properties were detected in the most intense rainfall events derived from the CNR Polar 55C radar for spatial scales from 4 km to 64 km. The analysis of X-SAR retrieved rainfall fields, although based on few images, led to similar results and confirmed the existence of scale-invariance and multifractal properties for scales larger than 4 km. These outcomes encourage investigating SAR methodologies for the future development of meteo-hydrological forecasting models based on multifractal theory.

  2. Statistical scaling of pore-scale Lagrangian velocities in natural porous media.

    PubMed

    Siena, M; Guadagnini, A; Riva, M; Bijeljic, B; Pereira Nunes, J P; Blunt, M J

    2014-08-01

    We investigate the scaling behavior of sample statistics of pore-scale Lagrangian velocities in two different rock samples, Bentheimer sandstone and Estaillades limestone. The samples are imaged using X-ray computed tomography with micron-scale resolution. The scaling analysis relies on the study of the way qth-order sample structure functions (statistical moments of order q of absolute increments) of Lagrangian velocities depend on separation distances, or lags, traveled along the mean flow direction. In the sandstone block, sample structure functions of all orders exhibit a power-law scaling within a clearly identifiable intermediate range of lags. Sample structure functions associated with the limestone block display two diverse power-law regimes, which we infer to be related to two overlapping spatially correlated structures. In both rocks and for all orders q, we observe linear relationships between logarithmic structure functions of successive orders at all lags (a phenomenon that is typically known as extended power scaling, or extended self-similarity). The scaling behavior of Lagrangian velocities is compared with the one exhibited by porosity and specific surface area, which constitute two key pore-scale geometric observables. The statistical scaling of the local velocity field reflects the behavior of these geometric observables, with the occurrence of power-law-scaling regimes within the same range of lags for sample structure functions of Lagrangian velocity, porosity, and specific surface area.
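
The qth-order structure functions underlying this analysis are straightforward to compute from a sampled record: average the qth power of absolute increments at each lag, then read the scaling exponent off a log-log fit. The sketch below uses a Brownian-motion stand-in, for which the second-order exponent is known to be 1; the function names and parameters are ours, not the authors'.

```python
import numpy as np

def structure_function(v, q, lags):
    """qth-order structure function: mean |v(x+lag) - v(x)|**q at each lag."""
    return np.array([np.mean(np.abs(v[lag:] - v[:-lag]) ** q) for lag in lags])

rng = np.random.default_rng(2)
v = np.cumsum(rng.normal(size=20000))   # Brownian stand-in for a sampled record
lags = np.array([1, 2, 4, 8, 16, 32])

s2 = structure_function(v, 2, lags)
# Power-law scaling S_q(lag) ~ lag**zeta_q: estimate zeta_2 on log-log axes.
# For Brownian increments the exact exponent is zeta_2 = 1.
zeta2, _ = np.polyfit(np.log(lags), np.log(s2), 1)
```

Extended self-similarity, as reported above, corresponds to plotting log S_q against log S_p (rather than against log lag) and observing a linear relation over a wider range of lags.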

  3. Effect of filter type on the statistics of energy transfer between resolved and subfilter scales from a-priori analysis of direct numerical simulations of isotropic turbulence

    NASA Astrophysics Data System (ADS)

    Buzzicotti, M.; Linkmann, M.; Aluie, H.; Biferale, L.; Brasseur, J.; Meneveau, C.

    2018-02-01

    The effects of different filtering strategies on the statistical properties of the resolved-to-subfilter scale (SFS) energy transfer are analysed in forced homogeneous and isotropic turbulence. We carry out a-priori analyses of the statistical characteristics of SFS energy transfer by filtering data obtained from direct numerical simulations with up to 2048³ grid points as a function of the filter cutoff scale. In order to quantify the dependence of extreme events and anomalous scaling on the filter, we compare a sharp Fourier Galerkin projector, a Gaussian filter and a novel class of Galerkin projectors with non-sharp spectral filter profiles. Of interest is the importance of Galilean invariance and we confirm that local SFS energy transfer displays intermittency scaling in both skewness and flatness as a function of the cutoff scale. Furthermore, we quantify the robustness of scaling as a function of the filtering type.

  4. A Mokken scale analysis of the peer physical examination questionnaire.

    PubMed

    Vaughan, Brett; Grace, Sandra

    2018-01-01

    Peer physical examination (PPE) is a teaching and learning strategy utilised in most health profession education programs. Perceptions of participating in PPE have been described in the literature, focusing on areas of the body students are willing, or unwilling, to examine. A small number of questionnaires exist to evaluate these perceptions; however, none has described the measurement properties that may allow them to be used longitudinally. The present study undertook a Mokken scale analysis of the Peer Physical Examination Questionnaire (PPEQ) to evaluate its dimensionality and structure when used with Australian osteopathy students. Students enrolled in Year 1 of the osteopathy programs at Victoria University (Melbourne, Australia) and Southern Cross University (Lismore, Australia) were invited to complete the PPEQ prior to their first practical skills examination class. R, an open-source statistics program, was used to generate the descriptive statistics and perform a Mokken scale analysis. Mokken scale analysis is a non-parametric item response theory approach that is used to cluster items measuring a latent construct. Initial analysis suggested the PPEQ did not form a single scale. Further analysis identified three subscales: 'comfort', 'concern', and 'professionalism and education'. The properties of each subscale suggested they were unidimensional with variable internal structures. The 'comfort' subscale was the strongest of the three identified. All subscales demonstrated acceptable reliability estimation statistics (McDonald's omega > 0.75), supporting the calculation of a sum score for each subscale. The subscales identified are consistent with the literature. The 'comfort' subscale may be useful to longitudinally evaluate student perceptions of PPE. Further research is required to evaluate changes with PPE and the utility of the questionnaire with other health profession education programs.
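
Mokken scaling rests on Loevinger's scalability coefficient H, computed per item pair as the observed covariance divided by the maximum covariance achievable given the item popularities. A minimal NumPy version for dichotomous items, with a perfect Guttman pattern as a sanity check, might look like the sketch below; this is our illustration, not the R `mokken` package the study used.

```python
import numpy as np

def loevinger_h(items):
    """Pairwise Loevinger H for dichotomous items (rows = respondents).

    H_ij = cov(X_i, X_j) / cov_max, where cov_max is the largest covariance
    achievable given the item popularities; H_ij = 1 on a perfect Guttman scale.
    """
    items = np.asarray(items, dtype=float)
    p = items.mean(axis=0)                 # item popularities
    k = items.shape[1]
    h = np.ones((k, k))
    for i in range(k):
        for j in range(i + 1, k):
            cov = np.mean(items[:, i] * items[:, j]) - p[i] * p[j]
            lo, hi = sorted((p[i], p[j]))
            cov_max = lo * (1.0 - hi)      # max covariance given the marginals
            h[i, j] = h[j, i] = cov / cov_max
    return h

# Perfect Guttman data: a respondent endorses an item only if they also
# endorse every easier item, so every pairwise H equals 1.
guttman = np.array([[0, 0, 0],
                    [1, 0, 0],
                    [1, 1, 0],
                    [1, 1, 1]])
h = loevinger_h(guttman)
```

In exploratory MSA, items are clustered so that within-scale coefficients exceed a lower bound (conventionally H > 0.3), which is the criterion behind the three subscales reported above.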

  5. A Virtual Study of Grid Resolution on Experiments of a Highly-Resolved Turbulent Plume

    NASA Astrophysics Data System (ADS)

    Maisto, Pietro M. F.; Marshall, Andre W.; Gollner, Michael J.; Fire Protection Engineering Department Collaboration

    2017-11-01

    An accurate representation of sub-grid scale turbulent mixing is critical for modeling fire plumes and smoke transport. In this study, PLIF and PIV diagnostics are used with the saltwater modeling technique to provide highly-resolved instantaneous field measurements in unconfined turbulent plumes useful for statistical analysis, physical insight, and model validation. The effect of resolution was investigated employing a virtual interrogation window (of varying size) applied to the high-resolution field measurements. Motivated by LES low-pass filtering concepts, the high-resolution experimental data in this study can be analyzed within the interrogation windows (i.e. statistics at the sub-grid scale) and on interrogation windows (i.e. statistics at the resolved scale). A dimensionless resolution threshold (L/D*) criterion was determined to achieve converged statistics on the filtered measurements. Such a criterion was then used to establish the relative importance between large and small-scale turbulence phenomena while investigating specific scales for the turbulent flow. First order data sets start to collapse at a resolution of 0.3D*, while for second and higher order statistical moments the interrogation window size drops down to 0.2D*.

  6. Statistical characterization of short wind waves from stereo images of the sea surface

    NASA Astrophysics Data System (ADS)

    Mironov, Alexey; Yurovskaya, Maria; Dulov, Vladimir; Hauser, Danièle; Guérin, Charles-Antoine

    2013-04-01

    We propose a methodology to extract short-scale statistical characteristics of the sea surface topography by means of stereo image reconstruction. The possibilities and limitations of the technique are discussed and tested on a data set acquired from an oceanographic platform at the Black Sea. The analysis shows that reconstruction of the topography based on the stereo method is an efficient way to derive non-trivial statistical properties of surface short- and intermediate-waves (say from 1 centimeter to 1 meter). Most technical issues pertaining to this type of datasets (limited range of scales, lacunarity of data or irregular sampling) can be partially overcome by appropriate processing of the available points. The proposed technique also allows one to avoid linear interpolation, which dramatically corrupts properties of retrieved surfaces. The processing technique imposes that the field of elevation be polynomially detrended, which has the effect of filtering out the large scales. Hence the statistical analysis can only address the small-scale components of the sea surface. The precise cut-off wavelength, which is approximately half the patch size, can be obtained by applying a high-pass frequency filter on the reference gauge time records. The results obtained for the one- and two-point statistics of small-scale elevations are shown to be consistent, at least in order of magnitude, with the corresponding gauge measurements as well as other experimental measurements available in the literature. The calculation of the structure functions provides a powerful tool to investigate spectral and statistical properties of the field of elevations. Experimental parametrization of the third-order structure function, the so-called skewness function, is one of the most important and original outcomes of this study. This function is of primary importance in analytical scattering models from the sea surface and was up to now unavailable in field conditions. 
Due to the lack of precise reference measurements for the small-scale wave field, we could not quantify exactly the accuracy of the retrieval technique. However, it appeared clearly that the obtained accuracy is good enough for the estimation of second-order statistical quantities (such as the correlation function), acceptable for third-order quantities (such as the skewness function) and insufficient for fourth-order quantities (such as kurtosis). Therefore, the stereo technique in the present stage should not be thought of as a self-contained universal tool to characterize the surface statistics. Instead, it should be used in conjunction with other well-calibrated but sparse reference measurements (such as wave gauges) for cross-validation and calibration. It then completes the statistical analysis inasmuch as it provides a snapshot of the three-dimensional field and allows for the evaluation of higher-order spatial statistics.

  7. Cloud-based solution to identify statistically significant MS peaks differentiating sample categories.

    PubMed

    Ji, Jun; Ling, Jeffrey; Jiang, Helen; Wen, Qiaojun; Whitin, John C; Tian, Lu; Cohen, Harvey J; Ling, Xuefeng B

    2013-03-23

    Mass spectrometry (MS) has evolved to become the primary high-throughput tool for proteomics-based biomarker discovery. Until now, multiple challenges in protein MS data analysis remain: large-scale and complex data set management; MS peak identification and indexing; and high-dimensional peak differential analysis with concurrent statistical tests based on the false discovery rate (FDR). "Turnkey" solutions are needed for biomarker investigations to rapidly process MS data sets and identify statistically significant peaks for subsequent validation. Here we present an efficient and effective solution, which provides experimental biologists easy access to "cloud" computing capabilities to analyze MS data. The web portal can be accessed at http://transmed.stanford.edu/ssa/. The web application supports online uploading and analysis of large-scale MS data through a simple user interface. This bioinformatic tool will facilitate the discovery of potential protein biomarkers using MS.
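
FDR control across thousands of peak-wise tests is commonly implemented with the Benjamini-Hochberg step-up procedure; the abstract does not state which procedure this tool uses, so the sketch below is a generic illustration rather than its actual implementation, and the p-values are invented.

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini-Hochberg step-up procedure.

    Returns a boolean 'discovery' mask over the input p-values, controlling
    the false discovery rate at level alpha: reject the hypotheses with the
    k smallest p-values, where k is the largest rank with p_(k) <= alpha*k/m.
    """
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    thresholds = alpha * np.arange(1, m + 1) / m
    below = p[order] <= thresholds
    reject = np.zeros(m, dtype=bool)
    if below.any():
        cutoff = np.max(np.nonzero(below)[0])   # largest qualifying rank
        reject[order[: cutoff + 1]] = True
    return reject

# Hypothetical per-peak p-values from a differential analysis.
pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.2, 0.9]
mask = benjamini_hochberg(pvals, alpha=0.05)
```

Note the step-up aspect: 0.039 alone would fail its per-rank threshold, and whether it is rejected depends on the whole ensemble of p-values, which is why peak-wise thresholding cannot substitute for a proper FDR procedure.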

  8. A review of empirical research related to the use of small quantitative samples in clinical outcome scale development.

    PubMed

    Houts, Carrie R; Edwards, Michael C; Wirth, R J; Deal, Linda S

    2016-11-01

    There has been a notable increase in the advocacy of using small-sample designs as an initial quantitative assessment of item and scale performance during the scale development process. This is particularly true in the development of clinical outcome assessments (COAs), where Rasch analysis has been advanced as an appropriate statistical tool for evaluating the developing COAs using a small sample. We review the benefits such methods are purported to offer from both a practical and statistical standpoint and detail several problematic areas, including both practical and statistical theory concerns, with respect to the use of quantitative methods, including Rasch-consistent methods, with small samples. The feasibility of obtaining accurate information and the potential negative impacts of misusing large-sample statistical methods with small samples during COA development are discussed.

  9. A Descriptive Study of Individual and Cross-Cultural Differences in Statistics Anxiety

    ERIC Educational Resources Information Center

    Baloglu, Mustafa; Deniz, M. Engin; Kesici, Sahin

    2011-01-01

    The present study investigated individual and cross-cultural differences in statistics anxiety among 223 Turkish and 237 American college students. A 2 x 2 between-subjects factorial multivariate analysis of covariance (MANCOVA) was performed on the six dependent variables which are the six subscales of the Statistical Anxiety Rating Scale.…

  10. Which are the most useful scales for predicting repeat self-harm? A systematic review evaluating risk scales using measures of diagnostic accuracy

    PubMed Central

    Quinlivan, L; Cooper, J; Davies, L; Hawton, K; Gunnell, D; Kapur, N

    2016-01-01

    Objectives The aims of this review were to calculate the diagnostic accuracy statistics of risk scales following self-harm and consider which might be the most useful scales in clinical practice. Design Systematic review. Methods We based our search terms on those used in the systematic reviews carried out for the National Institute for Health and Care Excellence self-harm guidelines (2012) and evidence update (2013), and updated the searches through to February 2015 (CINAHL, EMBASE, MEDLINE, and PsycINFO). Methodological quality was assessed and three reviewers extracted data independently. We limited our analysis to cohort studies in adults using the outcome of repeat self-harm or attempted suicide. We calculated diagnostic accuracy statistics including measures of global accuracy. Statistical pooling was not possible due to heterogeneity. Results The eight papers included in the final analysis varied widely according to methodological quality and the content of scales employed. Overall, sensitivity of scales ranged from 6% (95% CI 5% to 6%) to 97% (95% CI 94% to 98%). The positive predictive value (PPV) ranged from 5% (95% CI 3% to 9%) to 84% (95% CI 80% to 87%). The diagnostic OR ranged from 1.01 (95% CI 0.434 to 2.5) to 16.3 (95% CI 12.5 to 21.4). Scales with high sensitivity tended to have low PPVs. Conclusions It is difficult to be certain which, if any, are the most useful scales for self-harm risk assessment. No scale performs well enough to be recommended for routine clinical use. Further robust prospective studies are warranted to evaluate risk scales following an episode of self-harm. Diagnostic accuracy statistics should be considered in relation to specific service needs, and scales should only be used as an adjunct to assessment. PMID:26873046
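For orientation, the diagnostic accuracy statistics reported in this review (sensitivity, specificity, PPV, diagnostic odds ratio) all derive from a 2x2 table of scale result against outcome. A minimal sketch, with purely hypothetical counts:

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Diagnostic accuracy statistics from a 2x2 table of scale result vs outcome."""
    sensitivity = tp / (tp + fn)   # scale-positive among those who repeated self-harm
    specificity = tn / (tn + fp)   # scale-negative among those who did not
    ppv = tp / (tp + fp)           # repeaters among the scale-positive
    npv = tn / (tn + fn)           # non-repeaters among the scale-negative
    dor = (tp * tn) / (fp * fn)    # diagnostic odds ratio
    return sensitivity, specificity, ppv, npv, dor

# Hypothetical counts for illustration only (not from the review).
sens, spec, ppv, npv, dor = diagnostic_accuracy(tp=80, fp=120, fn=20, tn=280)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} PPV={ppv:.2f} DOR={dor:.1f}")
```

This example also shows the trade-off the review observes: a sensitivity of 0.80 coexists with a PPV of only 0.40 because repetition is relatively uncommon among the scale-positive group.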

  11. [The research protocol VI: How to choose the appropriate statistical test. Inferential statistics].

    PubMed

    Flores-Ruiz, Eric; Miranda-Novales, María Guadalupe; Villasís-Keever, Miguel Ángel

    2017-01-01

    The statistical analysis can be divided in two main components: descriptive analysis and inferential analysis. An inference is to elaborate conclusions from the tests performed with the data obtained from a sample of a population. Statistical tests are used in order to establish the probability that a conclusion obtained from a sample is applicable to the population from which it was obtained. However, choosing the appropriate statistical test in general poses a challenge for novice researchers. To choose the statistical test it is necessary to take into account three aspects: the research design, the number of measurements and the scale of measurement of the variables. Statistical tests are divided into two sets, parametric and nonparametric. Parametric tests can only be used if the data show a normal distribution. Choosing the right statistical test will make it easier for readers to understand and apply the results.
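The parametric-versus-nonparametric decision described above can be sketched for the simplest case of two independent groups; the normality check and the specific tests below are illustrative choices with synthetic data, not a prescription from the article:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
group_a = rng.normal(10.0, 2.0, size=40)   # synthetic measurements, group A
group_b = rng.normal(13.0, 2.0, size=40)   # synthetic measurements, group B

# Step 1: check the distributional assumption (normality) in each group.
normal = all(stats.shapiro(g).pvalue > 0.05 for g in (group_a, group_b))

# Step 2: choose the test accordingly -- parametric if normality holds,
# nonparametric otherwise.
if normal:
    stat, p = stats.ttest_ind(group_a, group_b)      # Student's t test
    test_name = "independent-samples t test"
else:
    stat, p = stats.mannwhitneyu(group_a, group_b)   # Mann-Whitney U test
    test_name = "Mann-Whitney U test"

print(test_name, f"p={p:.4g}")
```

In a full application the research design and number of measurements would also enter the decision (e.g. paired designs call for paired tests); this sketch covers only the scale/distribution branch.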

  12. Adaptation and validation of the Spanish version of the Dieting Peer Competitiveness Scale to adolescents of both genders.

    PubMed

    Pamies-Aubalat, Lidia; Quiles-Marcos, Yolanda; Núñez-Núñez, Rosa M

    2013-12-01

    This study examined the Dieting Peer Competitiveness Scale, an instrument for evaluating dieting-related social comparison among peers in young people. This instrumental study had two aims. The first was to present preliminary psychometric data on the Spanish version of the Dieting Peer Competitiveness Scale, including statistical item analysis, an examination of the instrument's internal structure, and a reliability analysis, based on a sample of 1067 secondary school adolescents. The second was to conduct a confirmatory factor analysis of the scale's internal structure and to gather evidence of validity, based on a sample of 1075 adolescents.

  13. Guidelines for Genome-Scale Analysis of Biological Rhythms.

    PubMed

    Hughes, Michael E; Abruzzi, Katherine C; Allada, Ravi; Anafi, Ron; Arpat, Alaaddin Bulak; Asher, Gad; Baldi, Pierre; de Bekker, Charissa; Bell-Pedersen, Deborah; Blau, Justin; Brown, Steve; Ceriani, M Fernanda; Chen, Zheng; Chiu, Joanna C; Cox, Juergen; Crowell, Alexander M; DeBruyne, Jason P; Dijk, Derk-Jan; DiTacchio, Luciano; Doyle, Francis J; Duffield, Giles E; Dunlap, Jay C; Eckel-Mahan, Kristin; Esser, Karyn A; FitzGerald, Garret A; Forger, Daniel B; Francey, Lauren J; Fu, Ying-Hui; Gachon, Frédéric; Gatfield, David; de Goede, Paul; Golden, Susan S; Green, Carla; Harer, John; Harmer, Stacey; Haspel, Jeff; Hastings, Michael H; Herzel, Hanspeter; Herzog, Erik D; Hoffmann, Christy; Hong, Christian; Hughey, Jacob J; Hurley, Jennifer M; de la Iglesia, Horacio O; Johnson, Carl; Kay, Steve A; Koike, Nobuya; Kornacker, Karl; Kramer, Achim; Lamia, Katja; Leise, Tanya; Lewis, Scott A; Li, Jiajia; Li, Xiaodong; Liu, Andrew C; Loros, Jennifer J; Martino, Tami A; Menet, Jerome S; Merrow, Martha; Millar, Andrew J; Mockler, Todd; Naef, Felix; Nagoshi, Emi; Nitabach, Michael N; Olmedo, Maria; Nusinow, Dmitri A; Ptáček, Louis J; Rand, David; Reddy, Akhilesh B; Robles, Maria S; Roenneberg, Till; Rosbash, Michael; Ruben, Marc D; Rund, Samuel S C; Sancar, Aziz; Sassone-Corsi, Paolo; Sehgal, Amita; Sherrill-Mix, Scott; Skene, Debra J; Storch, Kai-Florian; Takahashi, Joseph S; Ueda, Hiroki R; Wang, Han; Weitz, Charles; Westermark, Pål O; Wijnen, Herman; Xu, Ying; Wu, Gang; Yoo, Seung-Hee; Young, Michael; Zhang, Eric Erquan; Zielinski, Tomasz; Hogenesch, John B

    2017-10-01

    Genome biology approaches have made enormous contributions to our understanding of biological rhythms, particularly in identifying outputs of the clock, including RNAs, proteins, and metabolites, whose abundance oscillates throughout the day. These methods hold significant promise for future discovery, particularly when combined with computational modeling. However, genome-scale experiments are costly and laborious, yielding "big data" that are conceptually and statistically difficult to analyze. There is no obvious consensus regarding design or analysis. Here we discuss the relevant technical considerations to generate reproducible, statistically sound, and broadly useful genome-scale data. Rather than suggest a set of rigid rules, we aim to codify principles by which investigators, reviewers, and readers of the primary literature can evaluate the suitability of different experimental designs for measuring different aspects of biological rhythms. We introduce CircaInSilico, a web-based application for generating synthetic genome biology data to benchmark statistical methods for studying biological rhythms. Finally, we discuss several unmet analytical needs, including applications to clinical medicine, and suggest productive avenues to address them.
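A common baseline statistical method for detecting such daily oscillations (a generic technique, not specific to these guidelines or to CircaInSilico) is cosinor regression, which fits a cosine of fixed period by ordinary least squares via the linear cos/sin reparameterization:

```python
import numpy as np

def cosinor_fit(t, y, period=24.0):
    """Least-squares cosinor fit: y ~ mesor + A*cos(2*pi*t/period - phi).

    Uses the linear reparameterization y = M + b1*cos(wt) + b2*sin(wt),
    so the fit reduces to ordinary least squares."""
    w = 2 * np.pi / period
    X = np.column_stack([np.ones_like(t), np.cos(w * t), np.sin(w * t)])
    m, b1, b2 = np.linalg.lstsq(X, y, rcond=None)[0]
    amplitude = np.hypot(b1, b2)
    acrophase = np.arctan2(b2, b1)   # radians; peak time = acrophase / w
    return m, amplitude, acrophase

# Synthetic transcript abundance sampled every 2 h for 48 h, 24 h rhythm
# peaking at t = 6 h, plus measurement noise.
t = np.arange(0, 48, 2.0)
rng = np.random.default_rng(1)
y = 5.0 + 2.0 * np.cos(2 * np.pi * (t - 6.0) / 24.0) + rng.normal(0, 0.2, t.size)
mesor, amp, phase = cosinor_fit(t, y)
print(f"mesor={mesor:.2f} amplitude={amp:.2f} peak={phase * 24 / (2 * np.pi):.1f} h")
```

The design questions the guidelines address (sampling interval, number of cycles, replication) directly govern how well such a fit can resolve amplitude and phase.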

  14. Guidelines for Genome-Scale Analysis of Biological Rhythms

    PubMed Central

    Hughes, Michael E.; Abruzzi, Katherine C.; Allada, Ravi; Anafi, Ron; Arpat, Alaaddin Bulak; Asher, Gad; Baldi, Pierre; de Bekker, Charissa; Bell-Pedersen, Deborah; Blau, Justin; Brown, Steve; Ceriani, M. Fernanda; Chen, Zheng; Chiu, Joanna C.; Cox, Juergen; Crowell, Alexander M.; DeBruyne, Jason P.; Dijk, Derk-Jan; DiTacchio, Luciano; Doyle, Francis J.; Duffield, Giles E.; Dunlap, Jay C.; Eckel-Mahan, Kristin; Esser, Karyn A.; FitzGerald, Garret A.; Forger, Daniel B.; Francey, Lauren J.; Fu, Ying-Hui; Gachon, Frédéric; Gatfield, David; de Goede, Paul; Golden, Susan S.; Green, Carla; Harer, John; Harmer, Stacey; Haspel, Jeff; Hastings, Michael H.; Herzel, Hanspeter; Herzog, Erik D.; Hoffmann, Christy; Hong, Christian; Hughey, Jacob J.; Hurley, Jennifer M.; de la Iglesia, Horacio O.; Johnson, Carl; Kay, Steve A.; Koike, Nobuya; Kornacker, Karl; Kramer, Achim; Lamia, Katja; Leise, Tanya; Lewis, Scott A.; Li, Jiajia; Li, Xiaodong; Liu, Andrew C.; Loros, Jennifer J.; Martino, Tami A.; Menet, Jerome S.; Merrow, Martha; Millar, Andrew J.; Mockler, Todd; Naef, Felix; Nagoshi, Emi; Nitabach, Michael N.; Olmedo, Maria; Nusinow, Dmitri A.; Ptáček, Louis J.; Rand, David; Reddy, Akhilesh B.; Robles, Maria S.; Roenneberg, Till; Rosbash, Michael; Ruben, Marc D.; Rund, Samuel S.C.; Sancar, Aziz; Sassone-Corsi, Paolo; Sehgal, Amita; Sherrill-Mix, Scott; Skene, Debra J.; Storch, Kai-Florian; Takahashi, Joseph S.; Ueda, Hiroki R.; Wang, Han; Weitz, Charles; Westermark, Pål O.; Wijnen, Herman; Xu, Ying; Wu, Gang; Yoo, Seung-Hee; Young, Michael; Zhang, Eric Erquan; Zielinski, Tomasz; Hogenesch, John B.

    2017-01-01

    Genome biology approaches have made enormous contributions to our understanding of biological rhythms, particularly in identifying outputs of the clock, including RNAs, proteins, and metabolites, whose abundance oscillates throughout the day. These methods hold significant promise for future discovery, particularly when combined with computational modeling. However, genome-scale experiments are costly and laborious, yielding “big data” that are conceptually and statistically difficult to analyze. There is no obvious consensus regarding design or analysis. Here we discuss the relevant technical considerations to generate reproducible, statistically sound, and broadly useful genome-scale data. Rather than suggest a set of rigid rules, we aim to codify principles by which investigators, reviewers, and readers of the primary literature can evaluate the suitability of different experimental designs for measuring different aspects of biological rhythms. We introduce CircaInSilico, a web-based application for generating synthetic genome biology data to benchmark statistical methods for studying biological rhythms. Finally, we discuss several unmet analytical needs, including applications to clinical medicine, and suggest productive avenues to address them. PMID:29098954

  15. Use of Fourier Transforms to define landscape scales of analysis for disturbances: A case study of thinned and unthinned forest stands

    Treesearch

    J. E. Lundquist; R. A. Sommerfeld

    2002-01-01

    Various disturbances such as disease and management practices cause canopy gaps that change patterns of forest stand structure. This study examined the usefulness of digital image analysis using aerial photos, Fourier Transforms, and cluster analysis to investigate how different spatial statistics are affected by spatial scale. The specific aims were to: 1) evaluate how...

  16. A study on phenomenology of Dhat syndrome in men in a general medical setting

    PubMed Central

    Prakash, Sathya; Sharan, Pratap; Sood, Mamta

    2016-01-01

    Background: “Dhat syndrome” is believed to be a culture-bound syndrome of the Indian subcontinent. Although many studies have been performed, many have methodological limitations and there is a lack of agreement in many areas. Aims: The aim is to study the phenomenology of “Dhat syndrome” in men and to explore the possibility of subtypes within this entity. Settings and Design: It is a cross-sectional descriptive study conducted at a sex and marriage counseling clinic of a tertiary care teaching hospital in Northern India. Materials and Methods: An operational definition and assessment instrument for “Dhat syndrome” was developed after taking all concerned stakeholders into account and review of literature. It was applied on 100 patients along with socio-demographic profile, Hamilton Depression Rating Scale, Hamilton Anxiety Rating Scale, Mini International Neuropsychiatric Interview, and Postgraduate Institute Neuroticism Scale. Statistical Analysis: For statistical analysis, descriptive statistics, group comparisons, and Pearson's product moment correlations were carried out. Factor analysis and cluster analysis were done to determine the factor structure and subtypes of “Dhat syndrome.” Results: A diagnostic and assessment instrument for “Dhat syndrome” has been developed and the phenomenology in 100 patients has been described. Both the health beliefs scale and associated symptoms scale demonstrated a three-factor structure. The patients with “Dhat syndrome” could be categorized into three clusters based on severity. Conclusions: There appears to be a significant agreement among various stakeholders on the phenomenology of “Dhat syndrome” although some differences exist. “Dhat syndrome” could be subtyped into three clusters based on severity. PMID:27385844

  17. AQAK: A Library Anxiety Scale for Undergraduate Students

    ERIC Educational Resources Information Center

    Anwar, Mumtaz A.; Al-Qallaf, Charlene L.; Al-Kandari, Noriah M.; Al-Ansari, Husain A.

    2012-01-01

    The library environment has drastically changed since 1992 when Bostick's Library Anxiety Scale was developed. This project aimed to develop a scale specifically for undergraduate students. A three-stage study was conducted, using students of Kuwait University. A variety of statistical measures, including factor analysis, were used to process the…

  18. [Psychometric properties of scales of professional competence and treatment by health personnel in outpatient hospital surgeries].

    PubMed

    Bermejo Alegría, Rosa M; Hidalgo Montesinos, M Dolores; Parra Hidalgo, Pedro; Más Castillo, Adelia; Gomis Cebrián, Rafael

    2011-04-01

    The aim of this study was to analyze the psychometric properties of two scales that assess the perceived quality and patient satisfaction with outpatient surgery in the Health Service of Murcia. These scales assess the degree of Professional Competence (PC) and Personnel Treatment (PT). The scales were administered to a sample of 2017 users of outpatient surgery in the Health Service of Murcia during the years 2008 and 2009. Exploratory factor analysis indicates a unidimensional structure for each scale. Internal consistency was adequate: .68 for PC and .75 for PT. The correlation between the PC scale and patients' global satisfaction was positive and statistically significant. The correlation between the PT scale and patients' global satisfaction was also statistically significant. The scales have shown their utility to detect areas of improvement and to plan intervention strategies.

  19. Explore the Usefulness of Person-Fit Analysis on Large-Scale Assessment

    ERIC Educational Resources Information Center

    Cui, Ying; Mousavi, Amin

    2015-01-01

    The current study applied the person-fit statistic, l[subscript z], to data from a Canadian provincial achievement test to explore the usefulness of conducting person-fit analysis on large-scale assessments. Item parameter estimates were compared before and after the misfitting student responses, as identified by l[subscript z], were removed. The…
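The l_z statistic used in this study is the standardized log-likelihood of a response pattern under the fitted IRT model: large negative values flag patterns the model considers improbable. A minimal sketch for the dichotomous Rasch case, with made-up item difficulties:

```python
import numpy as np

def lz_person_fit(responses, theta, b):
    """Standardized log-likelihood person-fit statistic l_z for the
    dichotomous Rasch model. Large negative values flag misfit."""
    u = np.asarray(responses, dtype=float)
    p = 1.0 / (1.0 + np.exp(-(theta - np.asarray(b))))   # Rasch P(correct)
    l0 = np.sum(u * np.log(p) + (1 - u) * np.log(1 - p))        # observed log-lik
    expected = np.sum(p * np.log(p) + (1 - p) * np.log(1 - p))  # E[l0]
    variance = np.sum(p * (1 - p) * np.log(p / (1 - p)) ** 2)   # Var[l0]
    return (l0 - expected) / np.sqrt(variance)

b = np.linspace(-2, 2, 9)              # hypothetical item difficulties
fitting = [1, 1, 1, 1, 1, 0, 0, 0, 0]  # passes easy items, fails hard ones
misfit  = [0, 0, 0, 0, 1, 1, 1, 1, 1]  # reversed (Guttman-inverted) pattern
print(lz_person_fit(fitting, 0.0, b), lz_person_fit(misfit, 0.0, b))
```

The expected pattern yields a positive l_z, while the reversed pattern yields a strongly negative one, which is the kind of response the study removes before re-estimating item parameters.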

  20. Statistical Analysis of Large-Scale Structure of Universe

    NASA Astrophysics Data System (ADS)

    Tugay, A. V.

    While galaxy cluster catalogs were compiled many decades ago, other structural elements of the cosmic web have been detected with confidence only in the most recent work. For example, extragalactic filaments have been traced in recent years through velocity fields and the SDSS galaxy distribution. The large-scale structure of the Universe could also be mapped in the future using ATHENA observations in X-rays and SKA observations in the radio band. Until detailed observations become available for most of the volume of the Universe, integral statistical parameters can be used to describe it. Methods such as the galaxy correlation function, the power spectrum, statistical moments, and peak statistics are commonly used for this purpose. The parameters of the power spectrum and other statistics are important for constraining models of dark matter, dark energy, inflation, and brane cosmology. In the present work we describe the growth of large-scale density fluctuations in the one- and three-dimensional cases using Fourier harmonics of the hydrodynamical parameters. As a result, we obtain a power-law relation for the matter power spectrum.
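A power-law matter power spectrum of the kind obtained here can be illustrated in one dimension: build a Gaussian random field with a prescribed P(k) ∝ k^(-2) (an arbitrary illustrative slope, not the paper's result) and recover the slope from the estimated spectrum of the realization:

```python
import numpy as np

rng = np.random.default_rng(2)
n, box = 4096, 1.0

# Construct a 1D random field with power-law spectrum P(k) ~ k^-2 by assigning
# Fourier amplitudes sqrt(P(k)) with random phases, then transforming back.
k = np.fft.rfftfreq(n, d=box / n) * 2 * np.pi   # angular wavenumbers
amp = np.zeros_like(k)
amp[1:] = k[1:] ** -1.0                          # sqrt(P) for P ~ k^-2
phases = np.exp(2j * np.pi * rng.random(k.size))
delta = np.fft.irfft(amp * phases, n=n)          # real-space density fluctuation

# Estimate the power spectrum of the realization and recover the spectral
# slope by a log-log least-squares fit over well-resolved wavenumbers.
pk = np.abs(np.fft.rfft(delta)) ** 2
sel = slice(1, n // 4)
slope = np.polyfit(np.log(k[sel]), np.log(pk[sel]), 1)[0]
print(f"fitted spectral slope ~ {slope:.2f}")
```

The fitted slope recovers the input value of -2, which is the basic consistency check applied when comparing a predicted power-law spectrum against simulated or observed fluctuation fields.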

  1. Brownian motion or Lévy walk? Stepping towards an extended statistical mechanics for animal locomotion.

    PubMed

    Gautestad, Arild O

    2012-09-07

    Animals moving under the influence of spatio-temporal scaling and long-term memory generate a kind of space-use pattern that has proved difficult to model within a coherent theoretical framework. An extended kind of statistical mechanics is needed, accounting for both the effects of spatial memory and scale-free space use, and put into a context of ecological conditions. Simulations illustrating the distinction between scale-specific and scale-free locomotion are presented. The results show how observational scale (time lag between relocations of an individual) may critically influence the interpretation of the underlying process. In this respect, a novel protocol is proposed as a method to distinguish between some main movement classes. For example, the 'power law in disguise' paradox-from a composite Brownian motion consisting of a superposition of independent movement processes at different scales-may be resolved by shifting the focus from pattern analysis at one particular temporal resolution towards a more process-oriented approach involving several scales of observation. A more explicit consideration of system complexity within a statistical mechanical framework, supplementing the more traditional mechanistic modelling approach, is advocated.
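The "power law in disguise" effect described here can be reproduced in a few lines: superposing exponential (scale-specific) step-length distributions yields a pooled distribution far heavier-tailed than any single component, which at one observational scale can be mistaken for Lévy-walk behaviour. The three scales below are arbitrary illustrative values:

```python
import numpy as np

rng = np.random.default_rng(3)

# Composite Brownian motion: each movement mode has exponential step lengths
# (coefficient of variation CV = 1), but pooling modes whose means span
# several scales produces a much heavier-tailed step-length distribution.
scales = [1.0, 10.0, 100.0]
modes = [rng.exponential(s, size=20000) for s in scales]
pooled = np.concatenate(modes)

cv = lambda x: x.std() / x.mean()   # coefficient of variation
print("per-mode CV:", [round(cv(m), 2) for m in modes])
print("pooled  CV:", round(cv(pooled), 2))
```

Each component's CV sits near 1, as any exponential must, while the pooled mixture's CV is roughly 2, the heavy-tail signature that pattern analysis at a single temporal resolution can misread as a genuine power law.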

  2. Modified Distribution-Free Goodness-of-Fit Test Statistic.

    PubMed

    Chun, So Yeon; Browne, Michael W; Shapiro, Alexander

    2018-03-01

    Covariance structure analysis and its structural equation modeling extensions have become one of the most widely used methodologies in social sciences such as psychology, education, and economics. An important issue in such analysis is to assess the goodness of fit of a model under analysis. One of the most popular test statistics used in covariance structure analysis is the asymptotically distribution-free (ADF) test statistic introduced by Browne (Br J Math Stat Psychol 37:62-83, 1984). The ADF statistic can be used to test models without any specific distribution assumption (e.g., multivariate normal distribution) of the observed data. Despite its advantage, it has been shown in various empirical studies that unless sample sizes are extremely large, this ADF statistic could perform very poorly in practice. In this paper, we provide a theoretical explanation for this phenomenon and further propose a modified test statistic that improves the performance in samples of realistic size. The proposed statistic deals with the possible ill-conditioning of the involved large-scale covariance matrices.

  3. Spatial analyses for nonoverlapping objects with size variations and their application to coral communities.

    PubMed

    Muko, Soyoka; Shimatani, Ichiro K; Nozawa, Yoko

    2014-07-01

    Spatial distributions of individuals are conventionally analysed by representing objects as dimensionless points, in which spatial statistics are based on centre-to-centre distances. However, if organisms expand without overlapping and show size variations, as is the case for encrusting corals, interobject spacing is crucial for spatial associations where interactions occur. We introduced new pairwise statistics using minimum distances between objects and demonstrated their utility when examining encrusting coral community data. We also calculated the conventional point process statistics and the grid-based statistics to clarify the advantages and limitations of each spatial statistical method. For simplicity, coral colonies were approximated by disks in these demonstrations. Focusing on short-distance effects, the use of minimum distances revealed that almost all coral genera were aggregated at a scale of 1-25 cm. However, when fragmented colonies (ramets) were treated as a genet, a genet-level analysis indicated weak or no aggregation, suggesting that most corals were randomly distributed and that fragmentation was the primary cause of colony aggregations. In contrast, point process statistics showed larger aggregation scales, presumably because centre-to-centre distances included both intercolony spacing and colony sizes (radii). The grid-based statistics were able to quantify the patch (aggregation) scale of colonies, but the scale was strongly affected by colony size. Our approach quantitatively showed repulsive effects between an aggressive genus and a competitively weak genus; the grid-based statistics (covariance function) also showed repulsion, although the spatial scale indicated by the statistics was not directly interpretable in ecological terms. The use of minimum distances together with previously proposed spatial statistics helped us to extend our understanding of the spatial patterns of nonoverlapping objects that vary in size and the associated specific scales. © 2013 The Authors. Journal of Animal Ecology © 2013 British Ecological Society.
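Under the disk approximation used in the study, the minimum (edge-to-edge) distance at the heart of this approach reduces to the centre-to-centre distance minus the two radii, floored at zero; a minimal sketch with hypothetical colony positions:

```python
import numpy as np

def min_gap(c1, r1, c2, r2):
    """Minimum edge-to-edge distance between two nonoverlapping disks,
    as opposed to the conventional centre-to-centre distance."""
    centre_dist = np.hypot(*(np.asarray(c1) - np.asarray(c2)))
    return max(0.0, centre_dist - r1 - r2)

# Two colonies 10 cm apart centre-to-centre, with radii 4 cm and 3 cm:
# the spacing relevant for contact interactions is only 3 cm.
print(min_gap((0, 0), 4.0, (10, 0), 3.0))
```

This is why centre-to-centre statistics inflate apparent aggregation scales for large colonies: the colony radii are folded into every pairwise distance.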

  4. Generalized statistical mechanics approaches to earthquakes and tectonics.

    PubMed

    Vallianatos, Filippos; Papadakis, Giorgos; Michas, Georgios

    2016-12-01

    Despite the extreme complexity that characterizes the mechanism of the earthquake generation process, simple empirical scaling relations apply to the collective properties of earthquakes and faults in a variety of tectonic environments and scales. The physical characterization of those properties and the scaling relations that describe them attract a wide scientific interest and are incorporated in the probabilistic forecasting of seismicity in local, regional and planetary scales. Considerable progress has been made in the analysis of the statistical mechanics of earthquakes, which, based on the principle of entropy, can provide a physical rationale to the macroscopic properties frequently observed. The scale-invariant properties, the (multi) fractal structures and the long-range interactions that have been found to characterize fault and earthquake populations have recently led to the consideration of non-extensive statistical mechanics (NESM) as a consistent statistical mechanics framework for the description of seismicity. The consistency between NESM and observations has been demonstrated in a series of publications on seismicity, faulting, rock physics and other fields of geosciences. The aim of this review is to present in a concise manner the fundamental macroscopic properties of earthquakes and faulting and how these can be derived by using the notions of statistical mechanics and NESM, providing further insights into earthquake physics and fault growth processes.
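The central object in NESM is the Tsallis q-exponential, which replaces the exponential tails of Boltzmann-Gibbs statistics with power-law tails for q > 1 and recovers the ordinary exponential as q → 1; a minimal sketch:

```python
import numpy as np

def q_exp(x, q):
    """Tsallis q-exponential, the building block of NESM distributions.
    Reduces to the ordinary exponential in the limit q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    base = 1.0 + (1.0 - q) * np.asarray(x, dtype=float)
    return np.where(base > 0, base ** (1.0 / (1.0 - q)), 0.0)

# For q > 1 the tail of q_exp(-x, q) decays as a power law, far more slowly
# than the exponential tail of the Boltzmann-Gibbs (q = 1) case.
x = np.array([0.0, 1.0, 5.0])
print(q_exp(-x, 1.5))   # heavy, power-law-like tail
print(np.exp(-x))       # exponential tail (q = 1 limit)
```

This heavy tail is what lets NESM-based distributions accommodate the scale-invariant, long-range-correlated behaviour observed in earthquake and fault populations.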

  5. Generalized statistical mechanics approaches to earthquakes and tectonics

    PubMed Central

    Papadakis, Giorgos; Michas, Georgios

    2016-01-01

    Despite the extreme complexity that characterizes the mechanism of the earthquake generation process, simple empirical scaling relations apply to the collective properties of earthquakes and faults in a variety of tectonic environments and scales. The physical characterization of those properties and the scaling relations that describe them attract a wide scientific interest and are incorporated in the probabilistic forecasting of seismicity in local, regional and planetary scales. Considerable progress has been made in the analysis of the statistical mechanics of earthquakes, which, based on the principle of entropy, can provide a physical rationale to the macroscopic properties frequently observed. The scale-invariant properties, the (multi) fractal structures and the long-range interactions that have been found to characterize fault and earthquake populations have recently led to the consideration of non-extensive statistical mechanics (NESM) as a consistent statistical mechanics framework for the description of seismicity. The consistency between NESM and observations has been demonstrated in a series of publications on seismicity, faulting, rock physics and other fields of geosciences. The aim of this review is to present in a concise manner the fundamental macroscopic properties of earthquakes and faulting and how these can be derived by using the notions of statistical mechanics and NESM, providing further insights into earthquake physics and fault growth processes. PMID:28119548

  6. A Third Moment Adjusted Test Statistic for Small Sample Factor Analysis.

    PubMed

    Lin, Johnny; Bentler, Peter M

    2012-01-01

    Goodness-of-fit testing in factor analysis is based on the assumption that the test statistic is asymptotically chi-square; but this property may not hold in small samples even when the factors and errors are normally distributed in the population. Robust methods such as Browne's asymptotically distribution-free method and Satorra and Bentler's mean-scaling statistic were developed under the presumption of non-normality in the factors and errors. This paper finds a new application to the case where factors and errors are normally distributed in the population but the skewness of the obtained test statistic is still high due to sampling error in the observed indicators. An extension of the Satorra-Bentler statistic is proposed that not only scales the mean but also adjusts the degrees of freedom based on the skewness of the obtained test statistic in order to improve its robustness in small samples. A simple simulation study shows that this third moment adjusted statistic asymptotically performs on par with previously proposed methods, and at a very small sample size offers superior Type I error rates under a properly specified model. Data from Mardia, Kent and Bibby's study of students tested for their ability in five content areas that were either open or closed book were used to illustrate the real-world performance of this statistic.

  7. Correlation of MRI Visual Scales with Neuropsychological Profile in Mild Cognitive Impairment of Parkinson's Disease.

    PubMed

    Vasconcellos, Luiz Felipe; Pereira, João Santos; Adachi, Marcelo; Greca, Denise; Cruz, Manuela; Malak, Ana Lara; Charchat-Fichman, Helenice; Spitz, Mariana

    2017-01-01

    Few studies have evaluated magnetic resonance imaging (MRI) visual scales in Parkinson's disease-Mild Cognitive Impairment (PD-MCI). We selected 79 PD patients and 92 controls (CO) to perform neurologic and neuropsychological evaluation. Brain MRI was performed to evaluate the following scales: Global Cortical Atrophy (GCA), Fazekas, and medial temporal atrophy (MTA). The analysis revealed that both PD groups (amnestic and nonamnestic) showed worse performance on several tests when compared to CO. Memory, executive function, and attention impairment were more severe in amnestic PD-MCI group. Overall analysis of frequency of MRI visual scales by MCI subtype did not reveal any statistically significant result. Statistically significant inverse correlation was observed between GCA scale and Mini-Mental Status Examination (MMSE), Montreal Cognitive Assessment (MoCA), semantic verbal fluency, Stroop test, figure memory test, trail making test (TMT) B, and Rey Auditory Verbal Learning Test (RAVLT). The MTA scale correlated with Stroop test and Fazekas scale with figure memory test, digit span, and Stroop test according to the subgroup evaluated. Visual scales by MRI in MCI should be evaluated by cognitive domain and might be more useful in more severely impaired MCI or dementia patients.

  8. Planck 2015 results. XVI. Isotropy and statistics of the CMB

    NASA Astrophysics Data System (ADS)

    Planck Collaboration; Ade, P. A. R.; Aghanim, N.; Akrami, Y.; Aluri, P. K.; Arnaud, M.; Ashdown, M.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartolo, N.; Basak, S.; Battaner, E.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bock, J. J.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Boulanger, F.; Bucher, M.; Burigana, C.; Butler, R. C.; Calabrese, E.; Cardoso, J.-F.; Casaponsa, B.; Catalano, A.; Challinor, A.; Chamballu, A.; Chiang, H. C.; Christensen, P. R.; Church, S.; Clements, D. L.; Colombi, S.; Colombo, L. P. L.; Combet, C.; Contreras, D.; Couchot, F.; Coulais, A.; Crill, B. P.; Cruz, M.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Désert, F.-X.; Diego, J. M.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Ducout, A.; Dupac, X.; Efstathiou, G.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Fantaye, Y.; Fergusson, J.; Fernandez-Cobos, R.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A. A.; Franceschi, E.; Frejsel, A.; Frolov, A.; Galeotta, S.; Galli, S.; Ganga, K.; Gauthier, C.; Ghosh, T.; Giard, M.; Giraud-Héraud, Y.; Gjerløw, E.; González-Nuevo, J.; Górski, K. M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J. E.; Hansen, F. K.; Hanson, D.; Harrison, D. L.; Henrot-Versillé, S.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Hobson, M.; Holmes, W. A.; Hornstrup, A.; Hovest, W.; Huang, Z.; Huffenberger, K. M.; Hurier, G.; Jaffe, A. H.; Jaffe, T. R.; Jones, W. C.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Kim, J.; Kisner, T. S.; Knoche, J.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lähteenmäki, A.; Lamarre, J.-M.; Lasenby, A.; Lattanzi, M.; Lawrence, C. R.; Leonardi, R.; Lesgourgues, J.; Levrier, F.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; Liu, H.; López-Caniego, M.; Lubin, P. M.; Macías-Pérez, J. 
F.; Maggio, G.; Maino, D.; Mandolesi, N.; Mangilli, A.; Marinucci, D.; Maris, M.; Martin, P. G.; Martínez-González, E.; Masi, S.; Matarrese, S.; McGehee, P.; Meinhold, P. R.; Melchiorri, A.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mikkelsen, K.; Mitra, S.; Miville-Deschênes, M.-A.; Molinari, D.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Murphy, J. A.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C. B.; Nørgaard-Nielsen, H. U.; Noviello, F.; Novikov, D.; Novikov, I.; Oxborrow, C. A.; Paci, F.; Pagano, L.; Pajot, F.; Pant, N.; Paoletti, D.; Pasian, F.; Patanchon, G.; Pearson, T. J.; Perdereau, O.; Perotto, L.; Perrotta, F.; Pettorino, V.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Popa, L.; Pratt, G. W.; Prézeau, G.; Prunet, S.; Puget, J.-L.; Rachen, J. P.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Renzi, A.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Rossetti, M.; Rotti, A.; Roudier, G.; Rubiño-Martín, J. A.; Rusholme, B.; Sandri, M.; Santos, D.; Savelainen, M.; Savini, G.; Scott, D.; Seiffert, M. D.; Shellard, E. P. S.; Souradeep, T.; Spencer, L. D.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sunyaev, R.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Trombetti, T.; Tucci, M.; Tuovinen, J.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Vielva, P.; Villa, F.; Wade, L. A.; Wandelt, B. D.; Wehus, I. K.; Yvon, D.; Zacchei, A.; Zibin, J. P.; Zonca, A.

    2016-09-01

    We test the statistical isotropy and Gaussianity of the cosmic microwave background (CMB) anisotropies using observations made by the Planck satellite. Our results are based mainly on the full Planck mission for temperature, but also include some polarization measurements. In particular, we consider the CMB anisotropy maps derived from the multi-frequency Planck data by several component-separation methods. For the temperature anisotropies, we find excellent agreement between results based on these sky maps over both a very large fraction of the sky and a broad range of angular scales, establishing that potential foreground residuals do not affect our studies. Tests of skewness, kurtosis, multi-normality, N-point functions, and Minkowski functionals indicate consistency with Gaussianity, while a power deficit at large angular scales is manifested in several ways, for example low map variance. The results of a peak statistics analysis are consistent with the expectations of a Gaussian random field. The "Cold Spot" is detected with several methods, including map kurtosis, peak statistics, and mean temperature profile. We thoroughly probe the large-scale dipolar power asymmetry, detecting it with several independent tests, and address the subject of a posteriori correction. Tests of directionality suggest the presence of angular clustering from large to small scales, but at a significance that is dependent on the details of the approach. We perform the first examination of polarization data, finding the morphology of stacked peaks to be consistent with the expectations of statistically isotropic simulations. Where they overlap, these results are consistent with the Planck 2013 analysis based on the nominal mission data and provide our most thorough view of the statistics of the CMB fluctuations to date.
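At their simplest, the skewness and kurtosis tests applied to these maps are one-point moment tests against a Gaussian null; the following toy sketch (synthetic data standing in for map pixels, not Planck products) illustrates the idea:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# One-point Gaussianity check in miniature: compare sample skewness and
# kurtosis of a "map" against the Gaussian expectation. The quadratic
# distortion below is an arbitrary illustrative non-Gaussianity.
sigma = 100e-6                                        # ~100 uK fluctuations
gaussian_map = rng.normal(0.0, sigma, size=50000)
non_gaussian = gaussian_map + 0.3 * gaussian_map ** 2 / sigma

for name, m in [("Gaussian", gaussian_map), ("non-Gaussian", non_gaussian)]:
    p_skew = stats.skewtest(m).pvalue       # H0: skewness of a normal sample
    p_kurt = stats.kurtosistest(m).pvalue   # H0: kurtosis of a normal sample
    print(f"{name}: skewtest p={p_skew:.3g}, kurtosistest p={p_kurt:.3g}")
```

The quadratic term induces a strong positive skewness that the moment tests flag decisively, whereas the pure Gaussian realization is consistent with the null; the Planck analysis applies the same logic with far more elaborate statistics and simulation-based nulls.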

  9. Planck 2015 results: XVI. Isotropy and statistics of the CMB

    DOE PAGES

    Ade, P. A. R.; Aghanim, N.; Akrami, Y.; ...

    2016-09-20

    In this paper, we test the statistical isotropy and Gaussianity of the cosmic microwave background (CMB) anisotropies using observations made by the Planck satellite. Our results are based mainly on the full Planck mission for temperature, but also include some polarization measurements. In particular, we consider the CMB anisotropy maps derived from the multi-frequency Planck data by several component-separation methods. For the temperature anisotropies, we find excellent agreement between results based on these sky maps over both a very large fraction of the sky and a broad range of angular scales, establishing that potential foreground residuals do not affect our studies. Tests of skewness, kurtosis, multi-normality, N-point functions, and Minkowski functionals indicate consistency with Gaussianity, while a power deficit at large angular scales is manifested in several ways, for example low map variance. The results of a peak statistics analysis are consistent with the expectations of a Gaussian random field. The “Cold Spot” is detected with several methods, including map kurtosis, peak statistics, and mean temperature profile. We thoroughly probe the large-scale dipolar power asymmetry, detecting it with several independent tests, and address the subject of a posteriori correction. Tests of directionality suggest the presence of angular clustering from large to small scales, but at a significance that is dependent on the details of the approach. We perform the first examination of polarization data, finding the morphology of stacked peaks to be consistent with the expectations of statistically isotropic simulations. Finally, where they overlap, these results are consistent with the Planck 2013 analysis based on the nominal mission data and provide our most thorough view of the statistics of the CMB fluctuations to date.

  10. Confirmatory Factor Analysis of the Scales for Diagnosing Attention Deficit Hyperactivity Disorder (SCALES)

    ERIC Educational Resources Information Center

    Ryser, Gail R.; Campbell, Hilary L.; Miller, Brian K.

    2010-01-01

    The diagnostic criteria for attention deficit hyperactivity disorder have evolved over time, with current versions of the "Diagnostic and Statistical Manual" (4th edition, text revision; "DSM-IV-TR") suggesting that two constellations of symptoms may be present alone or in combination. The SCALES instrument for diagnosing attention deficit…

  11. Lévy meets Poisson: a statistical artifact may lead to erroneous recategorization of Lévy walk as Brownian motion.

    PubMed

    Gautestad, Arild O

    2013-03-01

    The flow of GPS data on animal space use is challenging old paradigms, such as the issue of the scale-free Lévy walk versus scale-specific Brownian motion. Since these movement classes often require different protocols with respect to ecological analyses, further theoretical development in this field is important. I describe central concepts such as scale-specific versus scale-free movement and the difference between mechanistic and statistical-mechanical levels of analysis. Next, I report how a specific sampling scheme may have produced much confusion: a Lévy walk may be wrongly categorized as Brownian motion if the duration of a move, or bout, is used as a proxy for step length and a move is subjectively defined. Hence, the categorization and recategorization of movement class compliance surrounding the Lévy walk controversy may have been based on a statistical artifact. This issue may be avoided by collecting relocations at a fixed rate, at a temporal scale that minimizes over- and undersampling.
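    The sampling artifact described above can be illustrated with a toy simulation (all distributions and thresholds here are illustrative assumptions, not the paper's model): a scale-free walk has power-law step lengths, whereas bout durations produced by a Poisson-like interruption process are exponential, so using durations as a step-length proxy erases the heavy tail.

```python
import random

random.seed(1)

# True Lévy-walk step lengths: Pareto tail, P(L > l) = 1/l for l > 1.
true_steps = [random.paretovariate(1.0) for _ in range(20000)]

# Bout-based protocol: if moves are ended by a Poisson-like interruption
# process, recorded durations are exponential regardless of the true
# step-length distribution.
proxy_steps = [random.expovariate(1.0) for _ in range(20000)]

def tail_fraction(xs, threshold):
    """Fraction of values exceeding a threshold -- a crude probe of the tail."""
    return sum(x > threshold for x in xs) / len(xs)

# The scale-free walk keeps substantial probability mass far in the tail;
# the exponential "moves" look Brownian-like, with essentially none.
heavy = tail_fraction(true_steps, 20.0)
light = tail_fraction(proxy_steps, 20.0)
```

Under these assumptions the proxy data would pass for scale-specific motion even though the underlying walk is scale-free.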

  12. Inference from the small scales of cosmic shear with current and future Dark Energy Survey data

    DOE PAGES

    MacCrann, N.; Aleksić, J.; Amara, A.; ...

    2016-11-05

    Cosmic shear is sensitive to fluctuations in the cosmological matter density field, including on small physical scales, where matter clustering is affected by baryonic physics in galaxies and galaxy clusters, such as star formation, supernovae feedback and AGN feedback. While this muddies any cosmological information contained in small-scale cosmic shear measurements, it also means that cosmic shear has the potential to constrain baryonic physics and galaxy formation. We perform an analysis of the Dark Energy Survey (DES) Science Verification (SV) cosmic shear measurements, now extended to smaller scales, and using the Mead et al. 2015 halo model to account for baryonic feedback. While the SV data has limited statistical power, we demonstrate using a simulated likelihood analysis that the final DES data will have the statistical power to differentiate among baryonic feedback scenarios. We also explore some of the difficulties in interpreting the small scales in cosmic shear measurements, presenting estimates of the size of several other systematic effects that make inference from small scales difficult, including uncertainty in the modelling of intrinsic alignment on nonlinear scales, `lensing bias', and shape measurement selection effects. For the latter two, we make use of novel image simulations. While future cosmic shear datasets have the statistical power to constrain baryonic feedback scenarios, there are several systematic effects that require improved treatments in order to make robust conclusions about baryonic feedback.

  13. Improved dynamical scaling analysis using the kernel method for nonequilibrium relaxation.

    PubMed

    Echinaka, Yuki; Ozeki, Yukiyasu

    2016-10-01

    The dynamical scaling analysis for the Kosterlitz-Thouless transition in the nonequilibrium relaxation method is improved by the use of Bayesian statistics and the kernel method. This allows data to be fitted to a scaling function without using any parametric model function, which makes the results more reliable and reproducible and enables automatic and faster parameter estimation. Applying this method, we introduce the bootstrap method and propose a numerical discrimination for the transition type.

  14. Meta-analysis inside and outside particle physics: two traditions that should converge?

    PubMed

    Baker, Rose D; Jackson, Dan

    2013-06-01

    The use of meta-analysis in medicine and epidemiology really took off in the 1970s. However, in high-energy physics, the Particle Data Group has been carrying out meta-analyses of measurements of particle masses and other properties since 1957. Curiously, there has been virtually no interaction between those working inside and outside particle physics. In this paper, we use statistical models to study two major differences in practice. The first is the usefulness of systematic errors, which physicists are now beginning to quote in addition to statistical errors. The second is whether it is better to treat heterogeneity by scaling up errors as do the Particle Data Group or by adding a random effect as does the rest of the community. Besides fitting models, we derive and use an exact test of the error-scaling hypothesis. We also discuss the other methodological differences between the two streams of meta-analysis. Our conclusion is that systematic errors are not currently very useful and that the conventional random effects model, as routinely used in meta-analysis, has a useful role to play in particle physics. The moral we draw for statisticians is that we should be more willing to explore 'grassroots' areas of statistical application, so that good statistical practice can flow both from and back to the statistical mainstream. Copyright © 2012 John Wiley & Sons, Ltd.
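    The two heterogeneity treatments contrasted in this abstract can be sketched side by side (the measurements and errors below are made-up numbers for illustration): the Particle Data Group scales all errors by S = sqrt(chi²/dof), while the conventional random-effects approach adds an estimated between-study variance tau² to each study's variance.

```python
import math

# Hypothetical measurements y_i with standard errors s_i (made-up numbers).
y = [10.2, 9.8, 10.9, 9.5, 10.4]
s = [0.3, 0.4, 0.35, 0.3, 0.5]

w = [1.0 / si**2 for si in s]
ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
se_fixed = 1.0 / math.sqrt(sum(w))  # naive fixed-effect standard error

# Heterogeneity about the weighted mean.
chi2 = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))
dof = len(y) - 1

# Particle Data Group practice: scale all errors up by S = sqrt(chi2/dof)
# when chi2/dof exceeds one.
S = max(1.0, math.sqrt(chi2 / dof))
se_pdg = S * se_fixed

# DerSimonian-Laird random effects: add a between-study variance tau2
# to each study's variance before re-weighting.
C = sum(w) - sum(wi**2 for wi in w) / sum(w)
tau2 = max(0.0, (chi2 - dof) / C)
se_re = 1.0 / math.sqrt(sum(1.0 / (si**2 + tau2) for si in s))
```

With these numbers both treatments inflate the naive error; they differ in how the extra uncertainty is apportioned among studies of different precision.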

  15. Integrated GIS and multivariate statistical analysis for regional scale assessment of heavy metal soil contamination: A critical review.

    PubMed

    Hou, Deyi; O'Connor, David; Nathanail, Paul; Tian, Li; Ma, Yan

    2017-12-01

    Heavy metal soil contamination is associated with potential toxicity to humans or ecotoxicity. Scholars have increasingly used a combination of geographical information science (GIS) with geostatistical and multivariate statistical analysis techniques to examine the spatial distribution of heavy metals in soils at a regional scale. A review of such studies showed that most soil sampling programs were based on grid patterns and composite sampling methodologies. Many programs intended to characterize various soil types and land use types. The most often used sampling depth intervals were 0-0.10 m, or 0-0.20 m, below surface; and the sampling densities used ranged from 0.0004 to 6.1 samples per km², with a median of 0.4 samples per km². The most widely used spatial interpolators were inverse distance weighted interpolation and ordinary kriging; and the most often used multivariate statistical analysis techniques were principal component analysis and cluster analysis. The review also identified several determining and correlating factors in heavy metal distribution in soils, including soil type, soil pH, soil organic matter, land use type, Fe, Al, and heavy metal concentrations. The major natural and anthropogenic sources of heavy metals were found to derive from lithogenic origin, roadway and transportation, atmospheric deposition, wastewater and runoff from industrial and mining facilities, fertilizer application, livestock manure, and sewage sludge. This review argues that the full potential of integrated GIS and multivariate statistical analysis for assessing heavy metal distribution in soils on a regional scale has not yet been fully realized.
It is proposed that future research be conducted to map multivariate results in GIS to pinpoint specific anthropogenic sources, to analyze temporal trends in addition to spatial patterns, to optimize modeling parameters, and to expand the use of different multivariate analysis tools beyond principal component analysis (PCA) and cluster analysis (CA). Copyright © 2017 Elsevier Ltd. All rights reserved.
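    As a concrete illustration of one of the interpolators named above, a minimal inverse distance weighted (IDW) estimator might look like this (sample locations and concentrations are hypothetical):

```python
import math

# Hypothetical soil samples: (x_km, y_km, concentration in mg/kg).
samples = [(0.0, 0.0, 30.0), (1.0, 0.0, 50.0), (0.0, 1.0, 40.0), (1.0, 1.0, 80.0)]

def idw(x, y, pts, power=2.0):
    """Inverse distance weighted estimate at location (x, y)."""
    num = den = 0.0
    for px, py, v in pts:
        d = math.hypot(x - px, y - py)
        if d == 0.0:
            return v  # exact hit on a sample point
        weight = d ** -power
        num += weight * v
        den += weight
    return num / den

centre = idw(0.5, 0.5, samples)  # equidistant from all four samples
```

At the centre point all weights are equal, so the estimate reduces to the plain average of the four samples; ordinary kriging would instead weight samples by a fitted spatial covariance model.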

  16. Multilevel Latent Class Analysis for Large-Scale Educational Assessment Data: Exploring the Relation between the Curriculum and Students' Mathematical Strategies

    ERIC Educational Resources Information Center

    Fagginger Auer, Marije F.; Hickendorff, Marian; Van Putten, Cornelis M.; Béguin, Anton A.; Heiser, Willem J.

    2016-01-01

    A first application of multilevel latent class analysis (MLCA) to educational large-scale assessment data is demonstrated. This statistical technique addresses several of the challenges that assessment data offers. Importantly, MLCA allows modeling of the often ignored teacher effects and of the joint influence of teacher and student variables.…

  17. Reynolds number dependence of relative dispersion statistics in isotropic turbulence

    NASA Astrophysics Data System (ADS)

    Sawford, Brian L.; Yeung, P. K.; Hackl, Jason F.

    2008-06-01

    Direct numerical simulation results for a range of relative dispersion statistics over Taylor-scale Reynolds numbers up to 650 are presented in an attempt to observe and quantify inertial subrange scaling and, in particular, Richardson's t³ law. The analysis includes the mean-square separation and a range of important but less-studied differential statistics for which the motion is defined relative to that at time t = 0. It seeks to unambiguously identify and quantify the Richardson scaling by demonstrating convergence with both the Reynolds number and initial separation. According to these criteria, the standard compensated plots for these statistics in inertial subrange scaling show clear evidence of a Richardson range but with an imprecise estimate for the Richardson constant. A modified version of the cube-root plots introduced by Ott and Mann [J. Fluid Mech. 422, 207 (2000)] confirms such convergence. It has been used to yield more precise estimates for Richardson's constant g which decrease with Taylor-scale Reynolds numbers over the range of 140-650. Extrapolation to the large Reynolds number limit gives an asymptotic value for Richardson's constant in the range g = 0.55-0.57, depending on the functional form used to make the extrapolation.
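    The cube-root idea behind the Ott and Mann style plots can be sketched with synthetic data: if the mean-square separation follows Richardson's law ⟨r²⟩ = g ε t³, then ⟨r²⟩^(1/3) is linear in t and a straight-line fit recovers g. This is only a schematic check with noise-free synthetic values, not the DNS analysis.

```python
# Synthetic check: <r^2>(t) = g*eps*t^3 implies <r^2>^(1/3) is linear in t
# with slope (g*eps)^(1/3), so a least-squares line recovers g.
g_true, eps = 0.55, 1.0
times = [0.5 + 0.1 * k for k in range(30)]
msd = [g_true * eps * t**3 for t in times]

cube_root = [m ** (1.0 / 3.0) for m in msd]
n = len(times)
tbar = sum(times) / n
cbar = sum(cube_root) / n
slope = (sum((t - tbar) * (c - cbar) for t, c in zip(times, cube_root))
         / sum((t - tbar) ** 2 for t in times))
g_est = slope ** 3 / eps
```

The advantage of fitting in cube-root coordinates is that a t³ range appears as a straight line, which is easier to identify and fit than a narrow plateau in a compensated plot.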

  18. Likert scales, levels of measurement and the "laws" of statistics.

    PubMed

    Norman, Geoff

    2010-12-01

    Reviewers of research reports frequently criticize the choice of statistical methods. While some of these criticisms are well-founded, frequently the use of various parametric methods such as analysis of variance, regression, and correlation is faulted because: (a) the sample size is too small, (b) the data may not be normally distributed, or (c) the data are from Likert scales, which are ordinal, so parametric statistics cannot be used. In this paper, I dissect these arguments and show that many studies, dating back to the 1930s, consistently show that parametric statistics are robust with respect to violations of these assumptions. Hence, challenges like those above are unfounded, and parametric methods can be utilized without concern for "getting the wrong answer".
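    The robustness claim can be probed with a small simulation (a rough sketch with assumed sample sizes and a normal approximation to the t threshold): drawing both groups from the same skewed five-point Likert distribution, the false-positive rate of a Welch t-test stays close to its nominal 5% level.

```python
import math
import random

random.seed(7)

def welch_t(a, b):
    """Welch t statistic for two independent samples."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

# Both groups draw from the same skewed five-point distribution, so every
# "significant" result is a false positive.
levels, weights = [1, 2, 3, 4, 5], [0.35, 0.30, 0.20, 0.10, 0.05]
trials, false_pos = 2000, 0
for _ in range(trials):
    a = random.choices(levels, weights, k=30)
    b = random.choices(levels, weights, k=30)
    if abs(welch_t(a, b)) > 1.96:  # ~5% two-sided level, normal approximation
        false_pos += 1
rate = false_pos / trials
```

Despite ordinal, non-normal data, the empirical type I error rate remains near the nominal level, which is the sense in which the parametric test is "robust" here.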

  19. Mini-Mental Status Examination: mixed Rasch model item analysis derived two different cognitive dimensions of the MMSE.

    PubMed

    Schultz-Larsen, Kirsten; Kreiner, Svend; Lomholt, Rikke Kirstine

    2007-03-01

    This study, published in two companion papers, assesses properties of the Mini-Mental State Examination (MMSE) with the purpose of improving the efficiency of methods of screening for cognitive impairment and dementia. An item analysis by conventional and mixed Rasch models was used to explore empirically derived cognitive dimensions of the MMSE, to assess item bias, and to construct diagnostic cut-points. The scores of 1,189 elderly residents were analyzed. Two dimensions of cognitive function, which are statistically and conceptually different from those obtained in previous studies, were derived. The corresponding sum scales were (1) the age-correlated MMSE scale (A-MMSE scale: orientation to time, attention/calculation, naming, repetition, and three-stage command) and (2) the non-age-correlated MMSE scale (B-MMSE scale: orientation to place, registration, recall, reading, and copying). The "writing" item was not included due to differential effects of age and sex. The analysis also showed that the study sample consisted of two cognitively different groups of elderly people. The findings indicate that a two-scale solution is a stable and statistically supported framework for interpreting data obtained by means of the MMSE. Supplementary analyses are presented in the companion paper to explore the performance of this item response theory calibration as a screening test for dementia.

  20. Evaluation of Solid Rocket Motor Component Data Using a Commercially Available Statistical Software Package

    NASA Technical Reports Server (NTRS)

    Stefanski, Philip L.

    2015-01-01

    Commercially available software packages today allow users to quickly perform the routine evaluations of (1) descriptive statistics to numerically and graphically summarize both sample and population data, (2) inferential statistics that draw conclusions about a given population from samples taken of it, (3) probability determinations that can be used to generate estimates of reliability allowables, and finally (4) the setup of designed experiments and analysis of their data to identify significant material and process characteristics for application in both product manufacturing and performance enhancement. This paper presents examples of analysis and experimental design work that has been conducted using Statgraphics® statistical software to obtain useful information with regard to solid rocket motor propellants and internal insulation material. Data were obtained from a number of programs (Shuttle, Constellation, and Space Launch System) and sources that include solid propellant burn rate strands, tensile specimens, sub-scale test motors, full-scale operational motors, rubber insulation specimens, and sub-scale rubber insulation analog samples. Besides facilitating the experimental design process to yield meaningful results, statistical software has demonstrated its ability to quickly perform complex data analyses and yield significant findings that might otherwise have gone unnoticed. One caveat to these successes is that useful results derive not only from the inherent power of the software package, but also from the skill and understanding of the data analyst.

  1. Evidence, temperature, and the laws of thermodynamics.

    PubMed

    Vieland, Veronica J

    2014-01-01

    A primary purpose of statistical analysis in genetics is the measurement of the strength of evidence for or against hypotheses. As with any type of measurement, a properly calibrated measurement scale is necessary if we want to be able to meaningfully compare degrees of evidence across genetic data sets, across different types of genetic studies and/or across distinct experimental modalities. In previous papers in this journal and elsewhere, my colleagues and I have argued that geneticists ought to care about the scale on which statistical evidence is measured, and we have proposed the Kelvin temperature scale as a template for a context-independent measurement scale for statistical evidence. Moreover, we have claimed that, mathematically speaking, evidence and temperature may be one and the same thing. At first blush, this might seem absurd. Temperature is a property of systems following certain laws of nature (in particular, the 1st and 2nd Law of Thermodynamics) involving very physical quantities (e.g., energy) and processes (e.g., mechanical work). But what do the laws of thermodynamics have to do with statistical systems? Here I address that question. © 2014 S. Karger AG, Basel.

  2. Decoding the spatial signatures of multi-scale climate variability - a climate network perspective

    NASA Astrophysics Data System (ADS)

    Donner, R. V.; Jajcay, N.; Wiedermann, M.; Ekhtiari, N.; Palus, M.

    2017-12-01

    In recent years, the application of complex networks as a versatile tool for analyzing complex spatio-temporal data has gained increasing interest. Establishing this approach as a new paradigm in climatology has already provided valuable insights into key spatio-temporal climate variability patterns across scales, including novel perspectives on the dynamics of the El Niño-Southern Oscillation or the emergence of extreme precipitation patterns in monsoonal regions. In this work, we report first attempts to employ network analysis for disentangling multi-scale climate variability. Specifically, we introduce the concept of scale-specific climate networks, which comprises a sequence of networks representing the statistical association structure between variations at distinct time scales. For this purpose, we consider global surface air temperature reanalysis data and subject the corresponding time series at each grid point to a complex-valued continuous wavelet transform. From this time-scale decomposition, we obtain three types of signals per grid point and scale - amplitude, phase and reconstructed signal, the statistical similarity of which is then represented by three complex networks associated with each scale. We provide a detailed analysis of the resulting connectivity patterns reflecting the spatial organization of climate variability at each chosen time-scale. Global network characteristics like transitivity or network entropy are shown to provide a new view on the (global average) relevance of different time scales in climate dynamics. Beyond expected trends originating from the increasing smoothness of fluctuations at longer scales, network-based statistics reveal different degrees of fragmentation of spatial co-variability patterns at different scales and zonal shifts among the key players of climate variability from tropically to extra-tropically dominated patterns when moving from inter-annual to decadal scales and beyond.
The obtained results demonstrate the potential usefulness of systematically exploiting scale-specific climate networks, whose general patterns are in line with existing climatological knowledge, but provide vast opportunities for further quantifications at local, regional and global scales that are yet to be explored.
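    A minimal sketch of the network-construction step described above, with toy grid-point series and an assumed correlation threshold (none of these numbers come from the reanalysis data):

```python
import math

# Toy grid-point anomaly series (illustrative values only).
series = {
    "A": [0.1, 0.4, -0.2, 0.3, -0.1, 0.2],
    "B": [0.2, 0.5, -0.1, 0.4, -0.2, 0.1],
    "C": [-0.3, -0.5, 0.2, -0.4, 0.3, -0.2],
    "D": [0.0, 0.1, 0.3, -0.2, 0.1, -0.3],
}

def pearson(x, y):
    """Pearson correlation of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Link two grid points when |correlation| exceeds an assumed threshold.
nodes = list(series)
edges = {(u, v) for i, u in enumerate(nodes) for v in nodes[i + 1:]
         if abs(pearson(series[u], series[v])) > 0.8}
```

In a scale-specific climate network, this construction would be repeated once per wavelet scale, and statistics such as transitivity would then be compared across the resulting sequence of networks.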

  3. Ion-Scale Wave Properties and Enhanced Ion Heating Across the Low-Latitude Boundary Layer During Kelvin-Helmholtz Instability

    NASA Astrophysics Data System (ADS)

    Moore, T. W.; Nykyri, K.; Dimmock, A. P.

    2017-11-01

    In the Earth's magnetosphere, the magnetotail plasma sheet ions are much hotter than in the shocked solar wind. On the dawn sector, the cold-component ions are more abundant and hotter by 30-40% when compared to the dusk sector. Recent statistical studies of the flank magnetopause and magnetosheath have shown that the level of temperature asymmetry of the magnetosheath is unable to account for this, so additional physical mechanisms, either at the magnetopause or in the plasma sheet, must contribute to this asymmetry. In this study, we perform a statistical analysis of the ion-scale wave properties in the three main plasma regimes common to flank magnetopause boundary crossings when the boundary is unstable to Kelvin-Helmholtz instability (KHI): hot and tenuous magnetospheric, cold and dense magnetosheath, and mixed (Hasegawa et al., 2004). These statistics of ion-scale wave properties are compared to observations of fast magnetosonic wave modes that have recently been linked to Kelvin-Helmholtz (KH) vortex centered ion heating (Moore et al., 2016). The statistical analysis shows that during KH events there is enhanced nonadiabatic heating calculated during ion scale wave intervals when compared to non-KH events. This suggests that during KH events there is more free energy for ion-scale wave generation, which in turn can heat ions more effectively when compared to cases when KH waves are absent. This may contribute to the dawn favored temperature asymmetry of the plasma sheet; recent studies suggest KH waves favor the dawn flank during Parker-Spiral interplanetary magnetic field.

  4. Lightweight and Statistical Techniques for Petascale Debugging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Barton

    2014-06-30

    This project investigated novel techniques for debugging scientific applications on petascale architectures. In particular, we developed lightweight tools that narrow the problem space when bugs are encountered. We also developed techniques that either limit the number of tasks and the code regions to which a developer must apply a traditional debugger or that apply statistical techniques to provide direct suggestions of the location and type of error. We extend previous work on the Stack Trace Analysis Tool (STAT), that has already demonstrated scalability to over one hundred thousand MPI tasks. We also extended statistical techniques developed to isolate programming errors in widely used sequential or threaded applications in the Cooperative Bug Isolation (CBI) project to large scale parallel applications. Overall, our research substantially improved productivity on petascale platforms through a tool set for debugging that complements existing commercial tools. Previously, Office Of Science application developers relied either on primitive manual debugging techniques based on printf or they use tools, such as TotalView, that do not scale beyond a few thousand processors. However, bugs often arise at scale and substantial effort and computation cycles are wasted in either reproducing the problem in a smaller run that can be analyzed with the traditional tools or in repeated runs at scale that use the primitive techniques. New techniques that work at scale and automate the process of identifying the root cause of errors were needed. These techniques significantly reduced the time spent debugging petascale applications, thus leading to a greater overall amount of time for application scientists to pursue the scientific objectives for which the systems are purchased.
    We developed a new paradigm for debugging at scale: techniques that reduced the debugging scenario to a scale suitable for traditional debuggers, e.g., by narrowing the search for the root-cause analysis to a small set of nodes or by identifying equivalence classes of nodes and sampling our debug targets from them. We implemented these techniques as lightweight tools that efficiently work on the full scale of the target machine. We explored four lightweight debugging refinements: generic classification parameters, such as stack traces, application-specific classification parameters, such as global variables, statistical data acquisition techniques and machine learning based approaches to perform root cause analysis. Work done under this project can be divided into two categories, new algorithms and techniques for scalable debugging, and foundation infrastructure work on our MRNet multicast-reduction framework for scalability, and Dyninst binary analysis and instrumentation toolkits.
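    The equivalence-class idea can be sketched in a few lines: group tasks by a classification parameter such as the stack trace, then hand a traditional debugger one representative per class. The traces below are hypothetical, not STAT output.

```python
# Hypothetical per-rank stack traces; real tools collect these at scale.
traces = {
    0: ("main", "solve", "mpi_wait"),
    1: ("main", "solve", "mpi_wait"),
    2: ("main", "io_write"),
    3: ("main", "solve", "mpi_wait"),
    4: ("main", "io_write"),
}

# Group ranks with identical traces into equivalence classes.
classes = {}
for rank, trace in sorted(traces.items()):
    classes.setdefault(trace, []).append(rank)

# Attach a traditional debugger to one representative rank per class.
representatives = sorted(ranks[0] for ranks in classes.values())
```

Instead of attaching a debugger to every task, the developer only inspects one rank per behavioral class, which is what makes the approach viable at petascale.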

  5. Aerosol, a health hazard during ultrasonic scaling: A clinico-microbiological study.

    PubMed

    Singh, Akanksha; Shiva Manjunath, R G; Singla, Deepak; Bhattacharya, Hirak S; Sarkar, Arijit; Chandra, Neeraj

    2016-01-01

    Ultrasonic scaling is a routinely used treatment to remove plaque and calculus from tooth surfaces. These scalers use water as a coolant, which is splattered during the vibration of the tip. The splatter, when mixed with the saliva and plaque of patients, renders the aerosol highly infectious and acts as a major risk factor for transmission of disease. In spite of necessary protection, the operator might sometimes get infected because of the infectious nature of the splatter. The aim was to evaluate the aerosol contamination produced during ultrasonic scaling with the help of microbiological analysis. This clinico-microbiological study consisted of twenty patients. Two agar plates were used for each patient; the first was kept at the center of the operatory room 20 min before the treatment, while the second agar plate was kept 40 cm away from the patient's chest during the treatment. Both agar plates were sent for microbiological analysis. The statistical analysis was done with the help of STATA 11.0 (StataCorp. 2013. Stata Statistical Software, Release 13. College Station, TX: StataCorp LP, 4905 Lakeway Drive College Station, Texas, USA) statistical software, and P < 0.001 was considered statistically significant. The results for bacterial count were highly significant when compared before and during the treatment. Gram staining showed the presence of Staphylococcus and Streptococcus species in high numbers. The aerosols and splatters produced during dental procedures have the potential to spread infection to dental personnel. Therefore, proper precautions should be taken to minimize the risk of infection to the operator.

  6. Effects of perceived parental attitudes on children's views of smoking.

    PubMed

    Ozturk, Candan; Kahraman, Seniha; Bektas, Murat

    2013-01-01

    The aim of this study was to examine the effects of perceived parental attitudes on children's views of smoking. The study sample consisted of 250 children attending grades 6, 7 and 8. Data were collected via a socio-demographic survey questionnaire, the Parental Attitude Scale (PAS) and the Decisional Balance Scale (DBS). Data analysis covered percentages, medians, one-way analysis of variance (ANOVA) and post-hoc tests using a statistical package. There were 250 participants; 117 were male, 133 were female. The mean age was 13.1 ± 0.98 for the females and 13.3 ± 0.88 for the males. A statistically significant difference was found in the children's mean scores for the 'pros' subscale on the Decisional Balance Scale (DBS) according to perceived parental attitudes (F=3.172, p=0.025). There were no statistically significant differences in the DBS 'cons' subscale scores by perceived parental attitudes. It was determined that while perceived parental attitudes affect children's views on the advantages of smoking, they have no effect on children's views on its disadvantages.

  7. High-resolution Statistics of Solar Wind Turbulence at Kinetic Scales Using the Magnetospheric Multiscale Mission

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chasapis, Alexandros; Matthaeus, W. H.; Parashar, T. N.

    Using data from the Magnetospheric Multiscale (MMS) and Cluster missions obtained in the solar wind, we examine second-order and fourth-order structure functions at varying spatial lags normalized to ion inertial scales. The analysis includes direct two-spacecraft results and single-spacecraft results employing the familiar Taylor frozen-in flow approximation. Several familiar statistical results, including the spectral distribution of energy and the scale-dependent kurtosis, are extended down to unprecedented spatial scales of ∼6 km, approaching electron scales. The Taylor approximation is also confirmed at those small scales, although small deviations are present in the kinetic range. The kurtosis is seen to attain very high values at sub-proton scales, supporting the previously reported suggestion that monofractal behavior may be due to high-frequency plasma waves at kinetic scales.
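    The two statistics named above can be sketched for a synthetic signal (Gaussian noise standing in for a field component; real solar wind increments are non-Gaussian at kinetic scales): the order-n structure function S_n(lag) and the scale-dependent kurtosis S_4/S_2², which equals 3 for Gaussian increments.

```python
import random

random.seed(3)
# Gaussian noise standing in for a time series of one field component.
signal = [random.gauss(0.0, 1.0) for _ in range(10000)]

def structure_function(x, lag, order):
    """S_n(lag) = <|x(i+lag) - x(i)|^n>, averaged over the series."""
    diffs = [abs(x[i + lag] - x[i]) ** order for i in range(len(x) - lag)]
    return sum(diffs) / len(diffs)

def kurtosis(x, lag):
    """Scale-dependent kurtosis S_4/S_2^2; equals 3 for Gaussian increments."""
    return structure_function(x, lag, 4) / structure_function(x, lag, 2) ** 2

k10 = kurtosis(signal, 10)
```

For intermittent turbulence, this kurtosis grows as the lag shrinks; the very high sub-proton-scale values reported above are deviations from the Gaussian baseline of 3 that this toy signal reproduces.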

  8. A statistical forecast model using the time-scale decomposition technique to predict rainfall during flood period over the middle and lower reaches of the Yangtze River Valley

    NASA Astrophysics Data System (ADS)

    Hu, Yijia; Zhong, Zhong; Zhu, Yimin; Ha, Yao

    2018-04-01

    In this paper, a statistical forecast model using the time-scale decomposition method is established for seasonal prediction of the rainfall during the flood period (FPR) over the middle and lower reaches of the Yangtze River Valley (MLYRV). This method decomposes the rainfall over the MLYRV into three time-scale components, namely, an interannual component with periods shorter than 8 years, an interdecadal component with periods of 8 to 30 years, and a component with periods longer than 30 years. Then, predictors are selected for the three time-scale components of FPR through correlation analysis. Finally, a statistical forecast model is established using the multiple linear regression technique to predict the three time-scale components of the FPR, respectively. The results show that this forecast model can capture the interannual and interdecadal variation of FPR. The hindcast of FPR over the 14 years from 2001 to 2014 shows that the FPR can be predicted successfully in 11 out of the 14 years. This forecast model performs better than a model using the traditional scheme without time-scale decomposition. Therefore, the statistical forecast model using the time-scale decomposition technique has good skill and application value in the operational prediction of FPR over the MLYRV.
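    The decomposition step can be sketched with a simple moving-average filter (window length and rainfall values are illustrative, not the paper's 8- and 30-year cutoffs): the slow component plays the role of the interdecadal part and the residual the interannual part, and the two reconstruct the original series exactly.

```python
# Illustrative seasonal rainfall series (arbitrary units).
rain = [3.1, 2.8, 3.5, 4.0, 2.9, 3.3, 4.2, 3.8, 2.7, 3.6, 4.1, 3.0]

def moving_average(x, window):
    """Centred moving average with shrinking windows at the series edges."""
    half = window // 2
    out = []
    for i in range(len(x)):
        lo, hi = max(0, i - half), min(len(x), i + half + 1)
        out.append(sum(x[lo:hi]) / (hi - lo))
    return out

slow = moving_average(rain, 5)               # low-frequency component
fast = [r - s for r, s in zip(rain, slow)]   # high-frequency residual

# The components add back to the original series exactly, so predictors can
# be fitted to each component separately and the forecasts summed.
recon = [s + f for s, f in zip(slow, fast)]
```

Fitting a separate multiple linear regression to each component, then summing the component forecasts, is the essence of the scheme described above.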

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad Allen

    EDENx is a multivariate data visualization tool that allows interactive user driven analysis of large-scale data sets with high dimensionality. EDENx builds on our earlier system, called EDEN, to enable analysis of more dimensions and larger scale data sets. EDENx provides an initial overview of summary statistics for each variable in the data set under investigation. EDENx allows the user to interact with graphical summary plots of the data to investigate subsets and their statistical associations. These plots include histograms, binned scatterplots, binned parallel coordinate plots, timeline plots, and graphical correlation indicators. From the EDENx interface, a user can select a subsample of interest and launch a more detailed data visualization via the EDEN system. EDENx is best suited for high-level, aggregate analysis tasks while EDEN is more appropriate for detail data investigations.

  10. Psychometric properties of the Portuguese version of place attachment scale for youth in residential care.

    PubMed

    Magalhães, Eunice; Calheiros, María M

    2015-01-01

    Despite significant scientific advances in the place attachment literature, no instruments have been specifically developed or adapted for residential care. A total of 410 adolescents (11-18 years old) participated in this study. The place attachment scale evaluates five dimensions: Place identity, Place dependence, Institutional bonding, Caregiver bonding, and Friend bonding. Data analysis included descriptive statistics, content validity, construct validity (confirmatory factor analysis), concurrent validity based on correlations with satisfaction with life and with the institution, and reliability evidence. The relationship with individual characteristics and placement length was also examined. Content validity analysis revealed that more than half of the panellists perceived all the items as relevant for assessing the construct in residential care. The five-dimension structure showed good fit statistics, and concurrent validity evidence was found, with significant correlations with satisfaction with life and with the institution. Acceptable internal consistency values and specific gender differences were found. The preliminary psychometric properties of this scale suggest its potential for use with youth in care.

  11. Additive scales in degenerative disease--calculation of effect sizes and clinical judgment.

    PubMed

    Riepe, Matthias W; Wilkinson, David; Förstl, Hans; Brieden, Andreas

    2011-12-16

    The therapeutic efficacy of an intervention is often assessed in clinical trials by scales that measure multiple diverse activities and are added to produce a cumulative global score. Medical communities and health care systems subsequently use these data to calculate pooled effect sizes to compare treatments. This is done because major doubt has been cast over the clinical relevance of statistically significant findings that rely on p values, which carry the potential to report chance findings. Hence, pooling the results of clinical studies into a meta-analysis with a statistical calculus has been assumed to be a more definitive way of deciding on efficacy. We simulate therapeutic effects as measured with additive scales in patient cohorts with different disease severity and assess the limitations of effect size calculations for additive scales, which we prove mathematically. We demonstrate that the major problem, which cannot be overcome by current numerical methods, is the complex nature and neurobiological foundation of clinical psychiatric endpoints in particular and additive scales in general. This is particularly relevant for endpoints used in dementia research. 'Cognition' is composed of functions such as memory, attention, orientation and many more. These individual functions decline in varied and non-linear ways. Here we demonstrate that in progressive diseases, cumulative values from multidimensional scales are subject to distortion by the limitations of the additive scale. The non-linearity of the decline of function impedes the calculation of effect sizes based on cumulative values from these multidimensional scales. Statistical analysis needs to be guided by the boundaries of the biological condition. Alternatively, we suggest a different approach that avoids the error imposed by over-analysis of cumulative global scores from additive scales.

  12. Statistical geometric affinity in human brain electric activity

    NASA Astrophysics Data System (ADS)

    Chornet-Lurbe, A.; Oteo, J. A.; Ros, J.

    2007-05-01

    The representation of human electroencephalogram (EEG) records by neurophysiologists demands standardized time-amplitude scales for their correct conventional interpretation. In a suite of graphical experiments involving scaling affine transformations, we have been able to convert electroencephalogram samples corresponding to any particular sleep phase and relaxed wakefulness into each other. We propound a statistical explanation for this finding in terms of data collapse. As a sequel, we determine characteristic time and amplitude scales and outline a possible physical interpretation. An analysis of characteristic times based on lacunarity is also carried out, as well as a study of the synchrony between left and right EEG channels.

  13. Robust Detection of Examinees with Aberrant Answer Changes

    ERIC Educational Resources Information Center

    Belov, Dmitry I.

    2015-01-01

    The statistical analysis of answer changes (ACs) has uncovered multiple testing irregularities on large-scale assessments and is now routinely performed at testing organizations. However, AC data have uncertainty caused by technological or human factors. Therefore, existing statistics (e.g., number of wrong-to-right ACs) used to detect examinees…

  14. A Third Moment Adjusted Test Statistic for Small Sample Factor Analysis

    PubMed Central

    Lin, Johnny; Bentler, Peter M.

    2012-01-01

    Goodness-of-fit testing in factor analysis is based on the assumption that the test statistic is asymptotically chi-square, but this property may not hold in small samples even when the factors and errors are normally distributed in the population. Robust methods such as Browne's asymptotically distribution-free method and Satorra-Bentler's mean-scaling statistic were developed under the presumption of non-normality in the factors and errors. This paper finds a new application for such methods in the case where factors and errors are normally distributed in the population but the skewness of the obtained test statistic is still high due to sampling error in the observed indicators. An extension of Satorra-Bentler's statistic is proposed that not only scales the mean but also adjusts the degrees of freedom based on the skewness of the obtained test statistic in order to improve its robustness in small samples. A simple simulation study shows that this third-moment-adjusted statistic asymptotically performs on par with previously proposed methods, and at very small sample sizes offers superior Type I error rates under a properly specified model. Data from Mardia, Kent, and Bibby's study of students tested for their ability in five content areas, with either open- or closed-book exams, were used to illustrate the real-world performance of this statistic. PMID:23144511
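
    The mean-scaling and moment-matching ideas behind the Satorra-Bentler family of corrections can be illustrated numerically. This is a schematic sketch using simulated null draws of an inflated statistic, not the paper's analytic third-moment adjustment:

```python
import numpy as np

def mean_scaled_statistic(t_obs, t_null, df):
    """Satorra-Bentler-style mean scaling: rescale an observed fit
    statistic so its null-distribution mean matches the nominal
    chi-square degrees of freedom (t_null: draws under the null)."""
    c = np.mean(t_null) / df          # estimated scaling factor
    return t_obs / c

def satterthwaite_df(t_null):
    """Mean-and-variance (Satterthwaite-style) adjustment: choose a
    scale and df so the first two moments match a scaled chi-square."""
    m, v = np.mean(t_null), np.var(t_null)
    df_adj = 2 * m ** 2 / v
    scale = v / (2 * m)
    return df_adj, scale

rng = np.random.default_rng(1)
null = 1.5 * rng.chisquare(10, 20000)   # an inflated "test statistic"
t = mean_scaled_statistic(30.0, null, df=10)
df_adj, scale = satterthwaite_df(null)
print(round(t, 1), round(df_adj, 1))
```

    The paper's proposal goes one step further, using the third moment (skewness) of the statistic to adjust the degrees of freedom; the two-moment version above shows the mechanics of the correction.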

  15. Statistical analysis of corn yields responding to climate variability at various spatio-temporal resolutions

    NASA Astrophysics Data System (ADS)

    Jiang, H.; Lin, T.

    2017-12-01

    Rain-fed corn production systems are subject to sub-seasonal variations of precipitation and temperature during the growing season. Because each growth phase involves different physiological processes, plants require different optimal environmental conditions during each phase. However, this temporal heterogeneity in the response to climate variability over the crop lifecycle is often simplified into fixed, constant responses in large-scale statistical modeling analyses. To capture the time-varying growing requirements in large-scale statistical analysis, we develop and compare statistical models at various spatial and temporal resolutions to quantify the relationship between corn yield and weather factors for 12 corn belt states from 1981 to 2016. The study compares three spatial resolutions (county, agricultural district, and state scale) and three temporal resolutions (crop growth phase, monthly, and growing season) to characterize the effects of spatial and temporal variability. Our results show that the agricultural-district model with growth-phase resolution can explain 52% of the variation in corn yield caused by temperature and precipitation variability. It provides a practical model structure that balances the overfitting problem of the county-specific model against the weak explanatory power of the state-specific model. In the US corn belt, precipitation has a positive impact on corn yield throughout the growing season except during the vegetative stage, while sensitivity to extreme heat is highest from the silking to dough phases. The results also show that the northern counties of the corn belt are less affected by extreme heat but are more vulnerable to water deficiency.

  16. Effects of Heterogeneity on Spatial Pattern Analysis of Wild Pistachio Trees in Zagros Woodlands, Iran

    NASA Astrophysics Data System (ADS)

    Erfanifard, Y.; Rezayan, F.

    2014-10-01

    Vegetation heterogeneity biases second-order summary statistics, e.g., Ripley's K-function, applied to spatial pattern analysis in ecology. Second-order investigation based on Ripley's K-function and related statistics (i.e., the L-function and the pair correlation function g) is widely used in ecology to develop hypotheses about underlying processes by characterizing spatial patterns of vegetation. The aim of this study was to demonstrate the effects of underlying heterogeneity of wild pistachio (Pistacia atlantica Desf.) trees on the second-order summary statistics of point pattern analysis in a part of the Zagros woodlands, Iran. The spatial distribution of 431 wild pistachio trees was accurately mapped in a 40 ha stand in the Wild Pistachio & Almond Research Site, Fars province, Iran. Three commonly used second-order summary statistics (the K-, L-, and g-functions) were applied to analyse their spatial pattern. The two-sample Kolmogorov-Smirnov goodness-of-fit test showed that the observed pattern significantly followed an inhomogeneous Poisson process null model in the study region. The results also showed that the heterogeneous pattern of wild pistachio trees biased the homogeneous forms of the K-, L-, and g-functions, indicating stronger aggregation of the trees at scales of 0-50 m than actually existed, and apparent aggregation at scales of 150-200 m where the trees were in fact regularly distributed. Consequently, we showed that heterogeneity of point patterns may bias the results of homogeneous second-order summary statistics, and we suggest applying inhomogeneous summary statistics with related null models for the spatial pattern analysis of heterogeneous vegetation.
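
    For reference, a naive version of Ripley's K-function (homogeneous, ignoring the edge corrections a real analysis would apply) can be sketched as follows, on a synthetic completely random pattern rather than the pistachio data:

```python
import numpy as np

def ripley_k(points, r, area):
    """Naive estimator of Ripley's K at radii r for a point pattern in
    a window of the given area: K(r) = (pair count within r) / (n * lambda).
    No edge correction, so values near the window size are biased low."""
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)            # exclude self-pairs
    lam = n / area                          # estimated intensity
    return np.array([(d < ri).sum() / (n * lam) for ri in r])

rng = np.random.default_rng(2)
pts = rng.uniform(0, 100, size=(400, 2))   # CSR in a 100 x 100 window
r = np.array([5.0, 10.0])
k = ripley_k(pts, r, area=100 * 100)
# under complete spatial randomness K(r) is close to pi * r^2
print(np.round(k / (np.pi * r ** 2), 1))
```

    The heterogeneity problem the authors describe appears when the intensity `lam` is not constant over the window: the homogeneous estimator then reads intensity trends as spurious aggregation, which is why inhomogeneous variants are recommended.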

  17. Low energy peripheral scaling in nucleon-nucleon scattering and uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Ruiz Simo, I.; Amaro, J. E.; Ruiz Arriola, E.; Navarro Pérez, R.

    2018-03-01

    We analyze the peripheral structure of the nucleon-nucleon interaction for LAB energies below 350 MeV. To this end we transform the scattering matrix into the impact parameter representation by analyzing the scaled phase shifts (L + 1/2) δ JLS (p) and the scaled mixing parameters (L + 1/2) ɛ JLS (p) in terms of the impact parameter b = (L + 1/2)/p. According to the eikonal approximation, at large angular momentum L these functions should become a universal function of b, independent of L. This allows us to discuss in a rather transparent way the role of statistical and systematic uncertainties in the different long-range components of the two-body potential. Implications for peripheral waves obtained in chiral perturbation theory interactions to fifth order (N5LO), or from the large body of NN data considered in the SAID partial wave analysis, are also drawn by comparing them with other phenomenological high-quality interactions, likewise constructed to fit scattering data. We find that both the N5LO and SAID peripheral waves disagree by more than 5σ with the Granada-2013 statistical analysis, by more than 2σ with the 6 statistically equivalent potentials fitting the Granada-2013 database, and by about 1σ with the historical set of 13 high-quality potentials developed since the 1993 Nijmegen analysis.

  18. Clinimetrics and clinical psychometrics: macro- and micro-analysis.

    PubMed

    Tomba, Elena; Bech, Per

    2012-01-01

    Clinimetrics was introduced three decades ago to specify the domain of clinical markers in clinical medicine (indexes or rating scales). In this perspective, clinical validity is the platform for selecting the various indexes or rating scales (macro-analysis). Psychometric validation of these indexes or rating scales is the measuring aspect (micro-analysis). Clinical judgment analysis by experienced psychiatrists is included in the macro-analysis and the item response theory models are especially preferred in the micro-analysis when using the total score as a sufficient statistic. Clinical assessment tools covering severity of illness scales, prognostic measures, issues of co-morbidity, longitudinal assessments, recovery, stressors, lifestyle, psychological well-being, and illness behavior have been identified. The constructive dialogue in clinimetrics between clinical judgment and psychometric validation procedures is outlined for generating developments of clinical practice in psychiatry. Copyright © 2012 S. Karger AG, Basel.

  19. Care dependency of hospitalized children: testing the Care Dependency Scale for Paediatrics in a cross-cultural comparison.

    PubMed

    Tork, Hanan; Dassen, Theo; Lohrmann, Christa

    2009-02-01

    This paper is a report of a study to examine the psychometric properties of the Care Dependency Scale for Paediatrics in Germany and Egypt and to compare the care dependency of school-age children in both countries. Cross-cultural differences in the care dependency of older adults have been documented in the literature, but little is known about the differences and similarities in children's care dependency across cultures. A convenience sample of 258 school-age children from Germany and Egypt participated in the study in 2005. The reliability of the Care Dependency Scale for Paediatrics was assessed in terms of internal consistency and interrater reliability. Factor analysis (principal component analysis) was employed to verify the construct validity. A Visual Analogue Scale was used to investigate the criterion-related validity. Good internal consistency was detected for both the Arabic and German versions. Factor analysis revealed one factor for both versions. Pearson's correlation between the Care Dependency Scale for Paediatrics and the Visual Analogue Scale was statistically significant for both versions, indicating criterion-related validity. Statistically significant differences between the participants were detected regarding the mean sum score on the Care Dependency Scale for Paediatrics. The Care Dependency Scale for Paediatrics is a reliable and valid tool for assessing the care dependency of children and is recommended for assessing the care dependency of children of different ethnic origins. Differences in care dependency between German and Egyptian children were detected, which might be due to cultural differences.

  20. Statistical analysis of kinetic energy entrainment in a model wind turbine array boundary layer

    NASA Astrophysics Data System (ADS)

    Cal, Raul Bayoan; Hamilton, Nicholas; Kang, Hyung-Suk; Meneveau, Charles

    2012-11-01

    For large wind farms, kinetic energy must be entrained from the flow above the wind turbines to replenish wakes and enable power extraction in the array. Various statistical features of turbulence causing vertical entrainment of mean-flow kinetic energy are studied using hot-wire velocimetry data taken in a model wind farm in a scaled wind tunnel experiment. Conditional statistics and spectral decompositions are employed to characterize the most relevant turbulent flow structures and determine their length-scales. Sweep and ejection events are shown to be the largest contributors to the vertical kinetic energy flux, although their relative contribution depends upon the location in the wake. Sweeps are shown to be dominant in the region above the wind turbine array. A spectral analysis of the data shows that large scales of the flow, about the size of the rotor diameter in length or larger, dominate the vertical entrainment. The flow is more incoherent below the array, causing decreased vertical fluxes there. The results show that improving the rate of vertical kinetic energy entrainment into wind turbine arrays is a standing challenge and would require modifying the large-scale structures of the flow. This work was funded in part by the National Science Foundation (CBET-0730922, CBET-1133800 and CBET-0953053).

  1. Advantages of Social Network Analysis in Educational Research

    ERIC Educational Resources Information Center

    Ushakov, K. M.; Kukso, K. N.

    2015-01-01

    Currently one of the main tools for large-scale studies of schools is statistical analysis. Although it is the most common method and offers the greatest opportunities for analysis, there are other quantitative methods for studying schools, such as network analysis. We discuss the potential advantages that network analysis has for educational…

  2. Rolling-Element Fatigue Testing and Data Analysis - A Tutorial

    NASA Technical Reports Server (NTRS)

    Vlcek, Brian L.; Zaretsky, Erwin V.

    2011-01-01

    In order to rank bearing materials, lubricants, and other design variables using bench-type rolling-element fatigue testing of bearing components and full-scale rolling-element bearing tests, the investigator needs to be cognizant of the variables that affect rolling-element fatigue life and be able to maintain and control them within an acceptable experimental tolerance. Once these variables are controlled, the number of tests and the test conditions must be specified to assure reasonable statistical certainty of the final results. There is a reasonable correlation between the results from elemental test rigs and those obtained with full-scale bearings. Using the statistical methods of W. Weibull and L. Johnson, the minimum number of tests required can be determined. This paper brings together and discusses the technical aspects of rolling-element fatigue testing and data analysis, and makes recommendations to assure quality and reliable testing of rolling-element specimens and full-scale rolling-element bearings.
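
    Weibull-style fatigue-life analysis is commonly carried out by median-rank regression, the least-squares form of the classical probability-plot method; a sketch on synthetic lives (the rank estimate and data are illustrative assumptions, not the paper's procedure):

```python
import numpy as np

def weibull_fit(lives):
    """Two-parameter Weibull fit by median-rank regression:
    linearize F = 1 - exp(-(t/eta)^beta) as
    ln(-ln(1 - F)) = beta * ln(t) - beta * ln(eta)."""
    lives = np.sort(np.asarray(lives, float))
    n = lives.size
    ranks = np.arange(1, n + 1)
    f = (ranks - 0.3) / (n + 0.4)          # Benard's median-rank estimate
    x = np.log(lives)
    y = np.log(-np.log(1 - f))             # linearized Weibull CDF
    slope, intercept = np.polyfit(x, y, 1)
    beta = slope                            # Weibull slope (shape)
    eta = np.exp(-intercept / slope)        # characteristic life (scale)
    return beta, eta

rng = np.random.default_rng(6)
lives = rng.weibull(1.5, 30) * 100.0        # synthetic fatigue lives
beta, eta = weibull_fit(lives)
print(round(beta, 1), round(eta))
```

    The fitted Weibull slope `beta` is the quantity used to compare life scatter between test series, and confidence on it tightens as the number of tests grows, which is the statistical-certainty question the paper addresses.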

  3. Meta-analysis of prediction model performance across multiple studies: Which scale helps ensure between-study normality for the C-statistic and calibration measures?

    PubMed

    Snell, Kym Ie; Ensor, Joie; Debray, Thomas Pa; Moons, Karel Gm; Riley, Richard D

    2017-01-01

    If individual participant data are available from multiple studies or clusters, then a prediction model can be externally validated multiple times. This allows the model's discrimination and calibration performance to be examined across different settings. Random-effects meta-analysis can then be used to quantify overall (average) performance and heterogeneity in performance. This typically assumes a normal distribution of 'true' performance across studies. We conducted a simulation study to examine this normality assumption for various performance measures relating to a logistic regression prediction model. We simulated data across multiple studies with varying degrees of variability in baseline risk or predictor effects and then evaluated the shape of the between-study distribution in the C-statistic, calibration slope, calibration-in-the-large, and E/O statistic, and possible transformations thereof. We found that a normal between-study distribution was usually reasonable for the calibration slope and calibration-in-the-large; however, the distributions of the C-statistic and E/O were often skewed across studies, particularly in settings with large variability in the predictor effects. Normality was vastly improved when using the logit transformation for the C-statistic and the log transformation for E/O, and therefore we recommend these scales to be used for meta-analysis. An illustrated example is given using a random-effects meta-analysis of the performance of QRISK2 across 25 general practices.
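
    The recommended logit-scale pooling of the C-statistic can be sketched with a standard DerSimonian-Laird random-effects estimator; the per-study C-statistics and standard errors below are hypothetical:

```python
import numpy as np

def dersimonian_laird(y, v):
    """DerSimonian-Laird random-effects pooling of per-study
    estimates y with within-study variances v."""
    w = 1 / v
    ybar = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - ybar) ** 2)                     # Cochran's Q
    k = len(y)
    tau2 = max(0.0, (q - (k - 1)) /
               (np.sum(w) - np.sum(w ** 2) / np.sum(w)))  # between-study var
    wstar = 1 / (v + tau2)
    return np.sum(wstar * y) / np.sum(wstar), tau2

logit = lambda p: np.log(p / (1 - p))
inv_logit = lambda x: 1 / (1 + np.exp(-x))

# hypothetical C-statistics and standard errors from 5 validation studies
c = np.array([0.72, 0.75, 0.68, 0.80, 0.74])
se_c = np.array([0.02, 0.03, 0.025, 0.02, 0.03])
y = logit(c)
v = (se_c / (c * (1 - c))) ** 2     # delta-method variance on the logit scale
pooled_logit, tau2 = dersimonian_laird(y, v)
print(round(inv_logit(pooled_logit), 3))
```

    Meta-analyzing on the logit scale and back-transforming the pooled value, as above, is exactly the transformation the simulation study recommends for the C-statistic; the log scale plays the same role for E/O.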

  4. How large a dataset should be in order to estimate scaling exponents and other statistics correctly in studies of solar wind turbulence

    NASA Astrophysics Data System (ADS)

    Rowlands, G.; Kiyani, K. H.; Chapman, S. C.; Watkins, N. W.

    2009-12-01

    Quantitative analysis of solar wind fluctuations is often performed in the context of intermittent turbulence and centers on methods to quantify statistical scaling, such as power spectra and structure functions, which assume a stationary process. The solar wind exhibits large-scale secular changes, so the question arises as to whether the timeseries of the fluctuations is non-stationary. One approach is to seek local stationarity by parsing the time interval over which statistical analysis is performed; hence, natural systems such as the solar wind unavoidably provide observations over restricted intervals. Consequently, due to the reduction of sample size leading to poorer estimates, a stationary stochastic process (time series) can yield anomalous time variation in the scaling exponents, suggestive of nonstationarity. The variance in the estimates of scaling exponents computed from an interval of N observations is known, for finite-variance processes and certain statistical estimators, to vary as ~1/N as N becomes large; however, the convergence to this behavior depends on the details of the process and may be slow. We study the variation in the scaling of second-order moments of the time-series increments with N for a variety of synthetic and "real world" time series, and we find that, in particular for heavy-tailed processes, for realizable N one is far from this ~1/N limiting behavior. We propose a semiempirical estimate for the minimum N needed to make a meaningful estimate of the scaling exponents for model stochastic processes and compare these with some "real world" time series from the solar wind. With fewer datapoints a stationary timeseries becomes indistinguishable from a nonstationary process, and we illustrate this with nonstationary synthetic datasets. Reference article: K. H. Kiyani, S. C. Chapman and N. W. Watkins, Phys. Rev. E 79, 036109 (2009).
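
    The dependence of scaling-exponent estimates on record length N can be illustrated with a second-order structure-function estimator applied to synthetic Brownian motion, a finite-variance process whose second-order exponent is known to be 1 (this is a toy stand-in for the solar wind data, not the paper's analysis):

```python
import numpy as np

def scaling_exponent(x, lags):
    """Estimate the second-order structure-function exponent zeta(2)
    from S2(tau) = <|x(t+tau) - x(t)|^2> ~ tau^zeta via a log-log fit."""
    s2 = [np.mean((x[l:] - x[:-l]) ** 2) for l in lags]
    slope, _ = np.polyfit(np.log(lags), np.log(s2), 1)
    return slope

rng = np.random.default_rng(3)
lags = np.array([1, 2, 4, 8, 16])
spread = {}
for n in (10 ** 3, 10 ** 5):
    est = [scaling_exponent(np.cumsum(rng.normal(size=n)), lags)
           for _ in range(50)]                  # 50 independent records
    spread[n] = np.std(est)
# estimator spread shrinks with record length for this finite-variance case
print(spread[10 ** 5] < spread[10 ** 3])
```

    For heavy-tailed processes the paper's point is that this shrinkage is far slower than the ~1/N variance law suggests, so short records can mimic nonstationarity.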

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ade, P. A. R.; Aghanim, N.; Akrami, Y.

    In this paper, we test the statistical isotropy and Gaussianity of the cosmic microwave background (CMB) anisotropies using observations made by the Planck satellite. Our results are based mainly on the full Planck mission for temperature, but also include some polarization measurements. In particular, we consider the CMB anisotropy maps derived from the multi-frequency Planck data by several component-separation methods. For the temperature anisotropies, we find excellent agreement between results based on these sky maps over both a very large fraction of the sky and a broad range of angular scales, establishing that potential foreground residuals do not affect our studies. Tests of skewness, kurtosis, multi-normality, N-point functions, and Minkowski functionals indicate consistency with Gaussianity, while a power deficit at large angular scales is manifested in several ways, for example low map variance. The results of a peak statistics analysis are consistent with the expectations of a Gaussian random field. The "Cold Spot" is detected with several methods, including map kurtosis, peak statistics, and mean temperature profile. We thoroughly probe the large-scale dipolar power asymmetry, detecting it with several independent tests, and address the subject of a posteriori correction. Tests of directionality suggest the presence of angular clustering from large to small scales, but at a significance that is dependent on the details of the approach. We perform the first examination of polarization data, finding the morphology of stacked peaks to be consistent with the expectations of statistically isotropic simulations. Finally, where they overlap, these results are consistent with the Planck 2013 analysis based on the nominal mission data and provide our most thorough view of the statistics of the CMB fluctuations to date.

  6. Ship detection using STFT sea background statistical modeling for large-scale oceansat remote sensing image

    NASA Astrophysics Data System (ADS)

    Wang, Lixia; Pei, Jihong; Xie, Weixin; Liu, Jinyuan

    2018-03-01

    Large-scale oceansat remote sensing images cover a large area of sea surface, whose fluctuations can be considered a non-stationary process. The Short-Time Fourier Transform (STFT) is a suitable analysis tool for time-varying non-stationary signals. In this paper, a novel ship detection method using 2-D STFT sea-background statistical modeling for large-scale oceansat remote sensing images is proposed. First, the large-scale oceansat remote sensing image is divided into small sub-blocks, and the 2-D STFT is applied to each sub-block individually. Second, the 2-D STFT spectra of the sub-blocks are studied, and a clear difference in characteristics between sea background and non-sea background is found. Finally, a statistical model for all valid frequency points in the STFT spectrum of the sea background is given, and a ship detection method based on this 2-D STFT spectrum modeling is proposed. The experimental results show that the proposed algorithm can detect ship targets with a high recall rate and a low miss rate.
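
    The sub-block spectral step can be sketched as a windowed block-wise 2-D FFT (the discrete analogue of a 2-D STFT); the image below is synthetic and the paper's statistical model of valid frequency points is omitted:

```python
import numpy as np

def blockwise_spectrum(img, bs=32):
    """Split an image into non-overlapping bs x bs sub-blocks and
    return the shifted 2-D Fourier magnitude spectrum of each,
    with a Hanning window to reduce edge leakage."""
    h, w = img.shape
    win = np.outer(np.hanning(bs), np.hanning(bs))
    specs = {}
    for i in range(0, h - bs + 1, bs):
        for j in range(0, w - bs + 1, bs):
            block = img[i:i + bs, j:j + bs] * win
            specs[(i, j)] = np.abs(np.fft.fftshift(np.fft.fft2(block)))
    return specs

rng = np.random.default_rng(4)
sea = rng.normal(0, 1, (64, 64))          # synthetic "sea clutter"
sea[40:46, 40:46] += 10.0                  # a bright ship-like blob
specs = blockwise_spectrum(sea)
# sum low-frequency energy (central 8x8 bins) per block; the block
# containing the target stands out from the sea-background blocks
energy = {k: s[12:20, 12:20].sum() for k, s in specs.items()}
print(max(energy, key=energy.get))
```

    A real detector would replace the simple energy comparison with the paper's statistical model of the sea-background spectrum and threshold deviations from it.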

  7. Descriptive Statistics for Modern Test Score Distributions: Skewness, Kurtosis, Discreteness, and Ceiling Effects.

    PubMed

    Ho, Andrew D; Yu, Carol C

    2015-06-01

    Many statistical analyses benefit from the assumption that unconditional or conditional distributions are continuous and normal. More than 50 years ago in this journal, Lord and Cook chronicled departures from normality in educational tests, and Micceri similarly showed that the normality assumption is met rarely in educational and psychological practice. In this article, the authors extend these previous analyses to state-level educational test score distributions that are an increasingly common target of high-stakes analysis and interpretation. Among 504 scale-score and raw-score distributions from state testing programs from recent years, nonnormal distributions are common and are often associated with particular state programs. The authors explain how scaling procedures from item response theory lead to nonnormal distributions as well as unusual patterns of discreteness. The authors recommend that distributional descriptive statistics be calculated routinely to inform model selection for large-scale test score data, and they illustrate consequences of nonnormality using sensitivity studies that compare baseline results to those from normalized score scales.
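
    The routine distributional descriptives recommended here reduce to sample skewness and excess kurtosis; a minimal sketch, with a simulated ceiling effect standing in for a censored score scale:

```python
import numpy as np

def skewness(x):
    """Sample skewness: third moment of the standardized scores."""
    z = (x - x.mean()) / x.std()
    return np.mean(z ** 3)

def excess_kurtosis(x):
    """Sample excess kurtosis: fourth standardized moment minus 3."""
    z = (x - x.mean()) / x.std()
    return np.mean(z ** 4) - 3.0

rng = np.random.default_rng(5)
normal = rng.normal(size=100_000)
# a "scale score" censored at a maximum, producing a ceiling effect
ceiling = np.minimum(rng.normal(size=100_000), 1.0)
print(round(skewness(normal), 2), round(skewness(ceiling), 2))
```

    The ceiling-affected variable comes out strongly left-skewed, which is exactly the kind of departure from normality the authors report for state test-score distributions.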

  8. Quantitative Analysis of Repertoire-Scale Immunoglobulin Properties in Vaccine-Induced B-Cell Responses

    DTIC Science & Technology

    2017-05-10

    …repertoire-wide properties. Finally, through the use of appropriate statistical analyses, the repertoire profiles can be quantitatively compared and… …cell response to eVLP and quantitatively compare GC B-cell repertoires from immunization conditions. We partitioned the resulting clonotype… Quantitative analysis of repertoire-scale immunoglobulin properties in vaccine-induced B-cell responses. Ilja V. Khavrutskii, Sidhartha Chaudhury

  9. Operational Consequences of Literacy Gap.

    DTIC Science & Technology

    1980-05-01

    …Comprehension Scores on the Safety and Sanitation Content… Statistics on Experimental Groups' Performance by Sex and Content… Analysis of Variance of Experimental Groups by Sex and Content… Mean Comprehension Scores Broken Down by Content, Subject RGL and Reading Time… Analysis of… ratings along a scale of difficulty which parallels the school grade scale. Burkett (1975) and Klare (1963; 1974-1975) provide summaries of the extensive…

  10. Introduction to bioinformatics.

    PubMed

    Can, Tolga

    2014-01-01

    Bioinformatics is an interdisciplinary field mainly involving molecular biology and genetics, computer science, mathematics, and statistics. Data-intensive, large-scale biological problems are addressed from a computational point of view. The most common problems are modeling biological processes at the molecular level and making inferences from collected data. A bioinformatics solution usually involves the following steps: collect statistics from biological data; build a computational model; solve a computational modeling problem; test and evaluate a computational algorithm. This chapter gives a brief introduction to bioinformatics by first providing an introduction to biological terminology and then discussing some classical bioinformatics problems organized by the types of data sources. Sequence analysis is the analysis of DNA and protein sequences for clues regarding function and includes subproblems such as identification of homologs, multiple sequence alignment, searching for sequence patterns, and evolutionary analyses. Protein structures are three-dimensional data, and the associated problems are structure prediction (secondary and tertiary), analysis of protein structures for clues regarding function, and structural alignment. Gene expression data are usually represented as matrices, and the analysis of microarray data mostly involves statistical analysis, classification, and clustering approaches. Biological networks such as gene regulatory networks, metabolic pathways, and protein-protein interaction networks are usually modeled as graphs, and graph theoretic approaches are used to solve the associated problems, such as the construction and analysis of large-scale networks.

  11. Exploring Rating Quality in Rater-Mediated Assessments Using Mokken Scale Analysis

    PubMed Central

    Wind, Stefanie A.; Engelhard, George

    2015-01-01

    Mokken scale analysis is a probabilistic nonparametric approach that offers statistical and graphical tools for evaluating the quality of social science measurement without placing potentially inappropriate restrictions on the structure of a data set. In particular, Mokken scaling provides a useful method for evaluating important measurement properties, such as invariance, in contexts where response processes are not well understood. Because rater-mediated assessments involve complex interactions among many variables, including assessment contexts, student artifacts, rubrics, individual rater characteristics, and others, rater-assigned scores are suitable candidates for Mokken scale analysis. The purposes of this study are to describe a suite of indices that can be used to explore the psychometric quality of data from rater-mediated assessments and to illustrate the substantive interpretation of Mokken-based statistics and displays in this context. Techniques that are commonly used in polytomous applications of Mokken scaling are adapted for use with rater-mediated assessments, with a focus on the substantive interpretation related to individual raters. Overall, the findings suggest that indices of rater monotonicity, rater scalability, and invariant rater ordering based on Mokken scaling provide diagnostic information at the level of individual raters related to the requirements for invariant measurement. These Mokken-based indices serve as an additional suite of diagnostic tools for exploring the quality of data from rater-mediated assessments that can supplement rating quality indices based on parametric models. PMID:29795883
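
    The scalability coefficients underlying Mokken scale analysis are Loevinger's H statistics; a sketch of the overall H for dichotomous scores, simulated from a simple latent-trait model (an illustrative assumption, not the rater data analyzed here):

```python
import numpy as np

def scalability_H(data):
    """Overall Loevinger's H for dichotomous items (n_obs x k 0/1 matrix):
    the ratio of summed inter-item covariances to their maxima given the
    item popularities (for p_i <= p_j the maximum covariance is p_i(1-p_j))."""
    k = data.shape[1]
    p = data.mean(axis=0)
    cov = np.cov(data, rowvar=False, bias=True)
    num = den = 0.0
    for i in range(k):
        for j in range(i + 1, k):
            pi, pj = sorted((p[i], p[j]))
            num += cov[i, j]
            den += pi * (1 - pj)
    return num / den

rng = np.random.default_rng(7)
theta = rng.normal(size=2000)                      # latent trait
difficulty = np.array([-1.0, -0.3, 0.4, 1.1])
# near-Guttman responses driven by the common trait plus noise
items = (theta[:, None] + rng.normal(0, 0.5, (2000, 4)) > difficulty).astype(int)
print(round(scalability_H(items), 2))
```

    By the usual Mokken convention, H >= 0.5 indicates a strong scale; per-item and per-rater variants of the same ratio give the scalability indices discussed above.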

  12. Statistical Analysis of Big Data on Pharmacogenomics

    PubMed Central

    Fan, Jianqing; Liu, Han

    2013-01-01

    This paper discusses statistical methods for estimating complex correlation structure from large pharmacogenomic datasets. We selectively review several prominent statistical methods for estimating large covariance matrix for understanding correlation structure, inverse covariance matrix for network modeling, large-scale simultaneous tests for selecting significantly differently expressed genes and proteins and genetic markers for complex diseases, and high dimensional variable selection for identifying important molecules for understanding molecule mechanisms in pharmacogenomics. Their applications to gene network estimation and biomarker selection are used to illustrate the methodological power. Several new challenges of Big data analysis, including complex data distribution, missing data, measurement error, spurious correlation, endogeneity, and the need for robust statistical methods, are also discussed. PMID:23602905

  13. A collaborative sequential meta-analysis of individual patient data from randomized trials of endovascular therapy and tPA vs. tPA alone for acute ischemic stroke: ThRombEctomy And tPA (TREAT) analysis: statistical analysis plan for a sequential meta-analysis performed within the VISTA-Endovascular collaboration.

    PubMed

    MacIsaac, Rachael L; Khatri, Pooja; Bendszus, Martin; Bracard, Serge; Broderick, Joseph; Campbell, Bruce; Ciccone, Alfonso; Dávalos, Antoni; Davis, Stephen M; Demchuk, Andrew; Diener, Hans-Christoph; Dippel, Diederik; Donnan, Geoffrey A; Fiehler, Jens; Fiorella, David; Goyal, Mayank; Hacke, Werner; Hill, Michael D; Jahan, Reza; Jauch, Edward; Jovin, Tudor; Kidwell, Chelsea S; Liebeskind, David; Majoie, Charles B; Martins, Sheila Cristina Ouriques; Mitchell, Peter; Mocco, J; Muir, Keith W; Nogueira, Raul; Saver, Jeffrey L; Schonewille, Wouter J; Siddiqui, Adnan H; Thomalla, Götz; Tomsick, Thomas A; Turk, Aquilla S; White, Philip; Zaidat, Osama; Lees, Kennedy R

    2015-10-01

Endovascular treatment has been shown to restore blood flow effectively. Second-generation medical devices such as stent retrievers are now showing overwhelming efficacy in clinical trials, particularly in conjunction with intravenous recombinant tissue plasminogen activator. This statistical analysis plan, utilizing a novel, sequential approach, describes a prospective, individual patient data analysis of endovascular therapy in conjunction with intravenous recombinant tissue plasminogen activator agreed upon by the Thrombectomy and Tissue Plasminogen Activator Collaborative Group. The protocol specifies the primary outcome for efficacy as 'favorable' outcome, defined by the ordinal distribution of the modified Rankin Scale measured at three months poststroke, but with modified Rankin Scale categories 5 and 6 collapsed into a single category. The primary analysis will aim to answer the questions: 'what is the treatment effect of endovascular therapy with intravenous recombinant tissue plasminogen activator compared to intravenous tissue plasminogen activator alone on the full modified Rankin Scale at three months?' and 'to what extent do key patient characteristics influence the treatment effect of endovascular therapy?'. Key secondary outcomes include the effect of endovascular therapy on death within 90 days; analyses of the modified Rankin Scale using dichotomized methods; and effects of endovascular therapy on symptomatic intracranial hemorrhage. Several secondary analyses will be considered, as well as expanding the patient cohort to intravenous recombinant tissue plasminogen activator-ineligible patients, should data allow. This collaborative meta-analysis of individual participant data from randomized trials of endovascular therapy vs. control in conjunction with intravenous thrombolysis will demonstrate the efficacy and generalizability of endovascular therapy with intravenous thrombolysis as a concomitant medication. © 2015 World Stroke Organization.

  14. Assessing Attitudes towards Statistics among Medical Students: Psychometric Properties of the Serbian Version of the Survey of Attitudes Towards Statistics (SATS)

    PubMed Central

    Stanisavljevic, Dejana; Trajkovic, Goran; Marinkovic, Jelena; Bukumiric, Zoran; Cirkovic, Andja; Milic, Natasa

    2014-01-01

    Background Medical statistics has become important and relevant for future doctors, enabling them to practice evidence-based medicine. Recent studies report that students’ attitudes towards statistics play an important role in their statistics achievement. The aim of the study was to test the psychometric properties of the Serbian version of the Survey of Attitudes Towards Statistics (SATS) in order to acquire a valid instrument to measure attitudes within the Serbian educational context. Methods The validation study was performed on a cohort of 417 medical students who were enrolled in an obligatory introductory statistics course. The SATS adaptation was based on an internationally accepted methodology for translation and cultural adaptation. Psychometric properties of the Serbian version of the SATS were analyzed through examination of its factorial structure and internal consistency. Results Most medical students held positive attitudes towards statistics. The average total SATS score was above neutral (4.3±0.8) and varied from 1.9 to 6.2. Confirmatory factor analysis validated the six-factor structure of the questionnaire (Affect, Cognitive Competence, Value, Difficulty, Interest and Effort). Values for the fit indices TLI (0.940) and CFI (0.961) were above the cut-off of ≥0.90. The RMSEA value of 0.064 (0.051–0.078) was below the suggested value of ≤0.08. Cronbach’s alpha for the entire scale was 0.90, indicating scale reliability. In a multivariate regression model, self-rating of ability in mathematics and current grade point average were significantly associated with the total SATS score after adjusting for age and gender. Conclusion The present study provided evidence for the appropriate metric properties of the Serbian version of the SATS. Confirmatory factor analysis validated the six-factor structure of the scale. The SATS may be a reliable and valid instrument for identifying medical students’ attitudes towards statistics in the Serbian educational context. PMID:25405489

  15. Assessing attitudes towards statistics among medical students: psychometric properties of the Serbian version of the Survey of Attitudes Towards Statistics (SATS).

    PubMed

    Stanisavljevic, Dejana; Trajkovic, Goran; Marinkovic, Jelena; Bukumiric, Zoran; Cirkovic, Andja; Milic, Natasa

    2014-01-01

Medical statistics has become important and relevant for future doctors, enabling them to practice evidence-based medicine. Recent studies report that students' attitudes towards statistics play an important role in their statistics achievement. The aim of the study was to test the psychometric properties of the Serbian version of the Survey of Attitudes Towards Statistics (SATS) in order to acquire a valid instrument to measure attitudes within the Serbian educational context. The validation study was performed on a cohort of 417 medical students who were enrolled in an obligatory introductory statistics course. The SATS adaptation was based on an internationally accepted methodology for translation and cultural adaptation. Psychometric properties of the Serbian version of the SATS were analyzed through examination of its factorial structure and internal consistency. Most medical students held positive attitudes towards statistics. The average total SATS score was above neutral (4.3±0.8) and varied from 1.9 to 6.2. Confirmatory factor analysis validated the six-factor structure of the questionnaire (Affect, Cognitive Competence, Value, Difficulty, Interest and Effort). Values for the fit indices TLI (0.940) and CFI (0.961) were above the cut-off of ≥0.90. The RMSEA value of 0.064 (0.051-0.078) was below the suggested value of ≤0.08. Cronbach's alpha for the entire scale was 0.90, indicating scale reliability. In a multivariate regression model, self-rating of ability in mathematics and current grade point average were significantly associated with the total SATS score after adjusting for age and gender. The present study provided evidence for the appropriate metric properties of the Serbian version of the SATS. Confirmatory factor analysis validated the six-factor structure of the scale. The SATS may be a reliable and valid instrument for identifying medical students' attitudes towards statistics in the Serbian educational context.
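Cronbach's alpha, the reliability statistic reported for the SATS (0.90 for the full scale), is straightforward to compute from a respondents-by-items score matrix. A stdlib-only sketch; the small response matrix is invented for illustration, not taken from the study:

```python
# Cronbach's alpha: internal-consistency reliability of a multi-item scale.
# The tiny response matrix below is made up, not data from the SATS study.
from statistics import variance

def cronbach_alpha(scores):
    """scores: list of respondents, each a list of item ratings."""
    k = len(scores[0])                        # number of items
    items = list(zip(*scores))                # transpose to per-item columns
    item_var = sum(variance(col) for col in items)
    total_var = variance([sum(row) for row in scores])
    return k / (k - 1) * (1 - item_var / total_var)

responses = [[3, 4, 3], [4, 5, 4], [2, 2, 3], [5, 5, 5]]
alpha = cronbach_alpha(responses)
```

Alpha approaches 1 when the items co-vary strongly, i.e. measure a common construct; values above roughly 0.7-0.8 are conventionally read as acceptable reliability.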

  16. Global sensitivity analysis of multiscale properties of porous materials

    NASA Astrophysics Data System (ADS)

    Um, Kimoon; Zhang, Xuan; Katsoulakis, Markos; Plechac, Petr; Tartakovsky, Daniel M.

    2018-02-01

Ubiquitous uncertainty about pore geometry inevitably undermines the veracity of pore- and multi-scale simulations of transport phenomena in porous media. It raises two fundamental issues: sensitivity of effective material properties to pore-scale parameters, and statistical parameterization of Darcy-scale models that accounts for pore-scale uncertainty. Homogenization-based maps of pore-scale parameters onto their Darcy-scale counterparts facilitate both sensitivity analysis (SA) and uncertainty quantification. We treat uncertain geometric characteristics of a hierarchical porous medium as random variables to conduct global SA and to derive probabilistic descriptors of effective diffusion coefficients and the effective sorption rate. Our analysis is formulated in terms of a solute diffusing through a fluid-filled pore space while sorbing to the solid matrix. Yet it is sufficiently general to be applied to other multiscale porous-media phenomena that are amenable to homogenization.
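Global SA of this kind typically reports Sobol indices, e.g. the first-order index S1 = Var(E[Y|X1]) / Var(Y). The toy additive model and discrete uniform inputs below are illustrative assumptions (not the paper's pore-scale setup), chosen so the index can be computed exactly by enumeration:

```python
# First-order Sobol sensitivity index by exact enumeration for a toy model
# Y = X1 + 2*X2 with independent inputs uniform on {0,...,4}. Illustrative
# assumption only; the paper's model maps pore geometry to effective properties.
from itertools import product

def pop_var(xs):
    """Population variance of a list of numbers."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

grid = range(5)                                   # X1, X2 uniform on {0,...,4}
y = {(x1, x2): x1 + 2 * x2 for x1, x2 in product(grid, grid)}

total_var = pop_var(list(y.values()))             # Var(Y)
# E[Y | X1 = x1], averaging over X2
cond_means = [sum(y[(x1, x2)] for x2 in grid) / len(grid) for x1 in grid]
s1 = pop_var(cond_means) / total_var              # Var(E[Y|X1]) / Var(Y)
```

For this additive model the analytic answer is S1 = Var(X1) / (Var(X1) + 4·Var(X2)) = 2/10; in realistic settings the conditional expectations are estimated by Monte Carlo sampling rather than enumeration.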

  17. Joint lavage associated with triamcinolone hexacetonide injection in knee osteoarthritis: a randomized double-blind controlled study.

    PubMed

    Parmigiani, Leandro; Furtado, Rita N V; Lopes, Roberta V; Ribeiro, Luiza H C; Natour, Jamil

    2010-11-01

To compare the medium-term effectiveness and tolerance of joint lavage (JL) combined with triamcinolone hexacetonide (TH) intra-articular injection (IAI) versus IAI with TH alone for treatment of primary osteoarthritis (OA) of the knee. A randomized, double-blind, controlled study was carried out on 60 patients with primary OA of the knee, randomized into two intervention groups: the JL/TH group (joint lavage in combination with TH intra-articular injection) and the TH group (TH intra-articular injection alone). Patients were followed for 12 weeks by a blinded observer using the following outcome measurements: visual analogue scale for pain at rest and in movement, goniometry, WOMAC, Lequesne's index, timed 50-ft walk, perception of improvement, Likert scale for improvement assessment, use of nonsteroidal anti-inflammatory drugs and analgesics, and local side effects. There were no statistical differences in the inter-group analysis for any of the variables studied over the 12-week period, although both groups demonstrated statistical improvement in the intra-group evaluation (except for the Likert scale according to the patient and the use of anti-inflammatory drugs). In the Kellgren-Lawrence scale (KL) 2 and 3 sub-analysis, there was a statistical difference regarding joint flexion among patients classified as KL 2, favoring the TH group (p=0.03). For the KL 3 patients, there were statistical differences favoring the JL/TH group regarding the Lequesne index (p=0.021), WOMAC pain score (p=0.01), and Likert scale according to the patient (p=0.028) and the physician (p=0.034). The combination of joint lavage and IAI with TH was not more effective than IAI with TH alone in the treatment of primary OA of the knee. However, KL 3 patients may receive a major benefit from this combination.

  18. Effects of forcing time scale on the simulated turbulent flows and turbulent collision statistics of inertial particles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosa, B., E-mail: bogdan.rosa@imgw.pl; Parishani, H.; Department of Earth System Science, University of California, Irvine, California 92697-3100

    2015-01-15

In this paper, we study systematically the effects of the forcing time scale in the large-scale stochastic forcing scheme of Eswaran and Pope [“An examination of forcing in direct numerical simulations of turbulence,” Comput. Fluids 16, 257 (1988)] on the simulated flow structures and statistics of forced turbulence. Using direct numerical simulations, we find that the forcing time scale affects the flow dissipation rate and flow Reynolds number. Other flow statistics can be predicted using the altered flow dissipation rate and flow Reynolds number, except when the forcing time scale is made unrealistically large to yield a Taylor microscale flow Reynolds number of 30 or less. We then study the effects of the forcing time scale on the kinematic collision statistics of inertial particles. We show that the radial distribution function and the radial relative velocity may depend on the forcing time scale when it becomes comparable to the eddy turnover time. This dependence, however, can be largely explained in terms of the altered flow Reynolds number and the changing range of flow length scales present in the turbulent flow. We argue that removing this dependence is important when studying the Reynolds number dependence of the turbulent collision statistics. The results are also compared to those based on a deterministic forcing scheme to better understand the role of large-scale forcing, relative to that of the small-scale turbulence, on turbulent collision of inertial particles. To further elucidate the correlation between the altered flow structures and the dynamics of inertial particles, a conditional analysis has been performed, showing that the regions of higher collision rate of inertial particles are well correlated with regions of lower vorticity. Regions of higher concentration of pairs at contact are found to be highly correlated with regions of high energy dissipation rate.

  19. Spatial, temporal and frequency based climate change assessment in Columbia River Basin using multi downscaled-scenarios

    NASA Astrophysics Data System (ADS)

    Rana, Arun; Moradkhani, Hamid

    2016-07-01

Uncertainties in climate modelling are well documented in the literature. Outputs of Global Climate Models (GCMs) are often downscaled to represent climatic parameters on a regional scale. In the present work, we have analyzed the changes in precipitation and temperature for the future scenario period 2070-2099 with respect to the historical period 1970-2000 from statistically downscaled GCM projections in the Columbia River Basin (CRB). Analysis is performed using two different statistically downscaled climate projections (each with ten downscaled GCM products, for RCP 4.5 and RCP 8.5, from the CMIP5 dataset), namely those from the Bias Correction and Spatial Downscaling (BCSD) technique generated at Portland State University and from the Multivariate Adaptive Constructed Analogs (MACA) technique generated at the University of Idaho, totaling 40 different scenarios. Both the BCSD and MACA datasets are downscaled from observed data for both scenario projections, i.e., RCP4.5 and RCP8.5. Analysis is performed using spatial change (yearly scale), temporal change (monthly scale), percentile change (seasonal scale), quantile change (yearly scale), and wavelet analysis (yearly scale) of the future period relative to the historical period, at a scale of 1/16th of a degree for the entire CRB region. Results indicate varied spatial change patterns across the Columbia River Basin, especially in the western part of the basin. At temporal scales, winter precipitation has higher variability than summer, and vice versa for temperature. Most of the models indicate considerable positive changes in quantiles and percentiles for both precipitation and temperature. Wavelet analysis provided insights into possible explanations for the changes in precipitation.

  20. Development of the Research Competencies Scale

    ERIC Educational Resources Information Center

    Swank, Jacqueline M.; Lambie, Glenn W.

    2016-01-01

    The authors present the development of the Research Competencies Scale (RCS). The purpose of this article is threefold: (a) present a rationale for the RCS, (b) review statistical analysis procedures used in developing the RCS, and (c) offer implications for counselor education, the enhancement of scholar-researchers, and future research.

  1. Brownian motion or Lévy walk? Stepping towards an extended statistical mechanics for animal locomotion

    PubMed Central

    Gautestad, Arild O.

    2012-01-01

    Animals moving under the influence of spatio-temporal scaling and long-term memory generate a kind of space-use pattern that has proved difficult to model within a coherent theoretical framework. An extended kind of statistical mechanics is needed, accounting for both the effects of spatial memory and scale-free space use, and put into a context of ecological conditions. Simulations illustrating the distinction between scale-specific and scale-free locomotion are presented. The results show how observational scale (time lag between relocations of an individual) may critically influence the interpretation of the underlying process. In this respect, a novel protocol is proposed as a method to distinguish between some main movement classes. For example, the ‘power law in disguise’ paradox—from a composite Brownian motion consisting of a superposition of independent movement processes at different scales—may be resolved by shifting the focus from pattern analysis at one particular temporal resolution towards a more process-oriented approach involving several scales of observation. A more explicit consideration of system complexity within a statistical mechanical framework, supplementing the more traditional mechanistic modelling approach, is advocated. PMID:22456456

  2. Development and validity of a scale to measure workplace culture of health.

    PubMed

    Kwon, Youngbum; Marzec, Mary L; Edington, Dee W

    2015-05-01

To describe the development of and test the validity and reliability of the Workplace Culture of Health (COH) scale. Exploratory factor analysis and confirmatory factor analysis were performed on data from a health care organization (N = 627). To verify the factor structure, confirmatory factor analysis was performed on a second data set from a medical equipment manufacturer (N = 226). The COH scale comprised five orthogonal factors: senior leadership and policies, programs and rewards, quality assurance, supervisor support, and coworker support. With regard to construct validity (convergent and discriminant) and reliability, two different US companies showed the same factorial structure, satisfactory fit statistics, and suitable internal and external consistency. The COH scale represents a reliable and valid scale to assess the workplace environment and culture for supporting health.

  3. Inferring Small Scale Dynamics from Aircraft Measurements of Tracers

    NASA Technical Reports Server (NTRS)

    Sparling, L. C.; Einaudi, Franco (Technical Monitor)

    2000-01-01

    The millions of ER-2 and DC-8 aircraft measurements of long-lived tracers in the Upper Troposphere/Lower Stratosphere (UT/LS) hold enormous potential as a source of statistical information about subgrid scale dynamics. Extracting this information however can be extremely difficult because the measurements are made along a 1-D transect through fields that are highly anisotropic in all three dimensions. Some of the challenges and limitations posed by both the instrumentation and platform are illustrated within the context of the problem of using the data to obtain an estimate of the dissipation scale. This presentation will also include some tutorial remarks about the conditional and two-point statistics used in the analysis.

  4. Effects of a finite outer scale on the measurement of atmospheric-turbulence statistics with a Hartmann wave-front sensor.

    PubMed

    Feng, Shen; Wenhan, Jiang

    2002-06-10

Phase-structure and aperture-averaged slope-correlation functions with a finite outer scale are derived based on the Taylor hypothesis and a generalized spectrum, such as the von Kármán model. The effects of the finite outer scale on measuring and determining the character of atmospheric-turbulence statistics are shown, especially for an approximately 4-m class telescope and subaperture. The phase structure function and atmospheric coherence length based on the Kolmogorov model are approximations of the formalism we have derived. The analysis shows that it cannot be determined whether a deviation from the power-law parameter of Kolmogorov turbulence is caused by real variations of the spectrum or by the effect of the finite outer scale.

  5. Self-similarity of waiting times in fracture systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Niccolini, G.; Bosia, F.; Carpinteri, A.

    2009-08-15

Experimental and numerical results are presented for a fracture experiment carried out on a fiber-reinforced element under flexural loading, and a statistical analysis is performed for acoustic emission waiting-time distributions. By an optimization procedure, a recently proposed scaling law describing these distributions for different event magnitude scales is confirmed by both experimental and numerical data, thus reinforcing the idea that fracture of heterogeneous materials has scaling properties similar to those found for earthquakes. Analysis of the different scaling parameters obtained for experimental and numerical data leads us to formulate the hypothesis that the type of scaling function obtained depends on the level of correlation among fracture events in the system.

  6. Statistical Analysis of Large Scale Structure by the Discrete Wavelet Transform

    NASA Astrophysics Data System (ADS)

    Pando, Jesus

    1997-10-01

The discrete wavelet transform (DWT) is developed as a general statistical tool for the study of large-scale structures (LSS) in astrophysics. The DWT is used in all aspects of structure identification, including cluster analysis, spectrum and two-point correlation studies, scale-scale correlation analysis, and measurement of deviations from Gaussian behavior. The techniques developed are demonstrated on 'academic' signals, on simulated models of the Lyman-α (Lyα) forests, and on observational data of the Lyα forests. This technique can detect clustering in the Ly-α clouds where traditional techniques such as the two-point correlation function have failed. The position and strength of these clusters in both real and simulated data are determined, and it is shown that clusters exist on scales as large as at least 20 h^-1 Mpc at significance levels of 2-4 σ. Furthermore, it is found that the strength distribution of the clusters can be used to distinguish between real data and simulated samples even where other traditional methods have failed to detect differences. Second, a method for measuring the power spectrum of a density field using the DWT is developed. All common features determined by the usual Fourier power spectrum can be calculated by the DWT. These features, such as the index of a power law or typical scales, can be detected even when the samples are geometrically complex, the samples are incomplete, or the mean density on larger scales is not known (the infrared uncertainty). Using this method, the spectra of Ly-α forests in both simulated and real samples are calculated. Third, a method for measuring hierarchical clustering is introduced. Because hierarchical evolution is characterized by a set of rules of how larger dark matter halos are formed by the merging of smaller halos, scale-scale correlations of the density field should be one of the most sensitive quantities in determining the merging history. We show that these correlations can be completely determined by the correlations between discrete wavelet coefficients on adjacent scales and at nearly the same spatial position, C^{2,2}_{j,j+1}. Scale-scale correlations on two samples of the QSO Ly-α forest absorption spectra are computed. Lastly, higher-order statistics are developed to detect deviations from Gaussian behavior. These higher-order statistics are necessary to fully characterize the Ly-α forests because the usual second-order statistics, such as the two-point correlation function or power spectrum, give inconclusive results. It is shown how this technique takes advantage of the locality of the DWT to circumvent the central limit theorem. A non-Gaussian spectrum is defined, and this spectrum reveals not only the magnitude but also the scales of non-Gaussianity. When applied to simulated and observational samples of the Ly-α clouds, it is found that different popular models of structure formation have different spectra while two independent observational data sets have the same spectra. Moreover, the non-Gaussian spectra of real data sets are significantly different from the spectra of various possible random samples. (Abstract shortened by UMI.)
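The scale-scale correlation idea, correlating squared wavelet coefficients on adjacent scales at nearly the same position, can be sketched with a Haar DWT. The normalization below is a simplified stand-in for the thesis's scale-scale correlation statistic, and the signals are made up:

```python
# Haar DWT and a simplified scale-scale correlation of squared coefficients.
# Illustrative sketch; the normalization is an assumption, not the thesis's
# exact statistic, and the signals are invented.

def haar_step(signal):
    """One Haar level: (smooth averages, detail coefficients)."""
    half = len(signal) // 2
    smooth = [(signal[2 * i] + signal[2 * i + 1]) / 2 for i in range(half)]
    detail = [(signal[2 * i] - signal[2 * i + 1]) / 2 for i in range(half)]
    return smooth, detail

def scale_scale_corr(signal):
    """Normalized correlation of squared Haar details on adjacent scales."""
    smooth, fine = haar_step(signal)      # finest-scale details
    _, coarse = haar_step(smooth)         # next coarser scale
    f2 = [w * w for w in fine]
    c2 = [w * w for w in coarse]
    n = len(c2)
    # pair each coarse coefficient with the two fine ones beneath it
    num = sum(c2[k] * (f2[2 * k] + f2[2 * k + 1]) / 2 for k in range(n)) / n
    den = (sum(c2) / n) * (sum(f2) / len(f2))
    return num / den if den else 0.0

smooth, detail = haar_step([4, 2, 5, 5])
```

A value near 1 indicates no correlation between fluctuation power on adjacent scales, while values above 1 signal the kind of scale-scale coupling expected from hierarchical merging.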

  7. A General Procedure to Assess the Internal Structure of a Noncognitive Measure--The Student360 Insight Program (S360) Time Management Scale. Research Report. ETS RR-11-42

    ERIC Educational Resources Information Center

    Ling, Guangming; Rijmen, Frank

    2011-01-01

    The factorial structure of the Time Management (TM) scale of the Student 360: Insight Program (S360) was evaluated based on a national sample. A general procedure with a variety of methods was introduced and implemented, including the computation of descriptive statistics, exploratory factor analysis (EFA), and confirmatory factor analysis (CFA).…

  8. Scaling and stochastic cascade properties of NEMO oceanic simulations and their potential value for GCM evaluation and downscaling

    NASA Astrophysics Data System (ADS)

    Verrier, Sébastien; Crépon, Michel; Thiria, Sylvie

    2014-09-01

    Spectral scaling properties have already been evidenced in oceanic numerical simulations and have been subject to several interpretations. They can be used to evaluate classical turbulence theories that predict scaling with specific exponents, and to evaluate the quality of GCM outputs from a statistical and multiscale point of view. However, a more complete framework based on multifractal cascades is able to generalize the classical but restrictive second-order spectral framework to other moment orders, providing an accurate description of the probability distributions of the fields at multiple scales. The predictions of this formalism still needed systematic verification in oceanic GCMs, while they have recently been confirmed for their atmospheric counterparts by several papers. The present paper is devoted to a systematic analysis of several oceanic fields produced by the NEMO oceanic GCM. Attention is focused on regional, idealized configurations that make it possible to evaluate the NEMO engine core from a scaling point of view, free of the limitations introduced by land masks. Based on classical multifractal analysis tools, multifractal properties were evidenced for several oceanic state variables (sea surface temperature and salinity, velocity components, etc.). While first-order structure functions estimated a different nonconservativity parameter H in two scaling ranges, the multi-order statistics of turbulent fluxes were scaling over almost the whole available range. This multifractal scaling was then parameterized with the help of the universal multifractal framework, providing parameters that are consistent with the existing empirical literature. Finally, we argue that knowledge of these properties may be useful for oceanographers. The framework seems well suited for the statistical evaluation of OGCM outputs. Moreover, it also provides practical solutions for simulating subpixel variability stochastically for GCM downscaling purposes. As an independent perspective, the existence of multifractal properties in oceanic flows also seems interesting for investigating scale dependencies in remote sensing inversion algorithms.

  9. Multi objective climate change impact assessment using multi downscaled climate scenarios

    NASA Astrophysics Data System (ADS)

    Rana, Arun; Moradkhani, Hamid

    2016-04-01

Outputs of Global Climate Models (GCMs) are often downscaled to represent climatic parameters at regional and global scales. In the present study, we have analyzed the changes in precipitation and temperature for the future scenario period 2070-2099 with respect to the historical period 1970-2000 from a set of statistically downscaled GCM projections for the Columbia River Basin (CRB). Analysis is performed using two different statistically downscaled climate projections, namely the Bias Correction and Spatial Downscaling (BCSD) technique generated at Portland State University and the Multivariate Adaptive Constructed Analogs (MACA) technique generated at the University of Idaho, totaling 40 different scenarios. Analysis is performed on spatial, temporal, and frequency-based parameters in the future period at a scale of 1/16th of a degree for the entire CRB region. Results indicate varied spatial change patterns across the Columbia River Basin, especially in the western part of the basin. At temporal scales, winter precipitation has higher variability than summer, and vice versa for temperature. Frequency analysis provided insights into possible explanations for the changes in precipitation.

  10. Utilizing Wavelet Analysis to assess hydrograph change in northwestern North America

    NASA Astrophysics Data System (ADS)

    Tang, W.; Carey, S. K.

    2017-12-01

Historical streamflow data in the mountainous regions of northwestern North America suggest that changes in flows are driven by warming temperatures, declining snowpack and glacier extent, and large-scale teleconnections. However, few sites have robust long-term records for statistical analysis, and previous research has focused on high- and low-flow indices along with trend analysis using the Mann-Kendall test and other similar approaches. Furthermore, there has been less emphasis on ascertaining the drivers of changes in the shape of the streamflow hydrograph compared with traditional flow metrics. In this work, we utilize wavelet analysis to evaluate changes in hydrograph characteristics for snowmelt-driven rivers in northwestern North America across a range of scales. Results suggest that wavelets can be used to detect a lengthening and advancement of freshet with a corresponding decline in peak flows. Furthermore, the gradual transition of flows from nival to pluvial regimes in more southerly catchments is evident in the wavelet spectral power through time. This method of change detection is challenged by evaluating the statistical significance of changes in wavelet spectra as related to hydrograph form, yet ongoing work seeks to link these patterns to driving weather and climate along with larger-scale teleconnections.

  11. Statistical analysis of mesoscale rainfall: Dependence of a random cascade generator on large-scale forcing

    NASA Technical Reports Server (NTRS)

Over, Thomas M.; Gupta, Vijay K.

    1994-01-01

    Under the theory of independent and identically distributed random cascades, the probability distribution of the cascade generator determines the spatial and the ensemble properties of spatial rainfall. Three sets of radar-derived rainfall data in space and time are analyzed to estimate the probability distribution of the generator. A detailed comparison between instantaneous scans of spatial rainfall and simulated cascades using the scaling properties of the marginal moments is carried out. This comparison highlights important similarities and differences between the data and the random cascade theory. Differences are quantified and measured for the three datasets. Evidence is presented to show that the scaling properties of the rainfall can be captured to the first order by a random cascade with a single parameter. The dependence of this parameter on forcing by the large-scale meteorological conditions, as measured by the large-scale spatial average rain rate, is investigated for these three datasets. The data show that this dependence can be captured by a one-to-one function. Since the large-scale average rain rate can be diagnosed from the large-scale dynamics, this relationship demonstrates an important linkage between the large-scale atmospheric dynamics and the statistical cascade theory of mesoscale rainfall. Potential application of this research to parameterization of runoff from the land surface and regional flood frequency analysis is briefly discussed, and open problems for further research are presented.
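A discrete multiplicative random cascade of the kind fitted here can be simulated in a few lines. The one-parameter "beta model" generator below (cells die with probability 1-p and survive with weight 1/p, so E[W] = 1) is a standard textbook choice assumed for illustration, not necessarily the generator estimated from the radar data:

```python
# Discrete multiplicative random cascade with a one-parameter beta-model
# generator. Illustrative sketch; the generator distribution estimated in the
# paper from radar rainfall may differ.
import random

def cascade(levels, p, rng):
    """Branching-number-2 cascade; returns 2**levels cell masses."""
    field = [1.0]                              # unit mass on the whole domain
    for _ in range(levels):
        nxt = []
        for mass in field:
            for _ in range(2):                 # split each cell into two offspring
                # Beta-model generator: W = 1/p with probability p, else 0 (E[W] = 1).
                w = (1.0 / p) if rng.random() < p else 0.0
                nxt.append(mass * w / 2.0)     # the /2 conserves mean mass per split
        field = nxt
    return field

field = cascade(6, p=0.7, rng=random.Random(42))
```

Because E[W] = 1, mean mass is conserved across levels, while the intermittency (the fraction of dead cells) is controlled by the single parameter p, mirroring the paper's finding that a one-parameter generator captures first-order scaling and that the parameter can be tied to the large-scale average rain rate.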

  12. THE TRANSLATION, VALIDATION AND CULTURAL ADAPTATION OF FUNCTIONAL ASSESSMENT OF CHRONIC ILLNESS THERAPY - SPIRITUAL WELL-BEING 12 (FACIT-SP12) SCALE IN GREEK LANGUAGE.

    PubMed

    Fradelos, Evangelos C; Tzavella, Foteini; Koukia, Evmorfia; Tsaras, Konstantinos; Papathanasiou, Ioanna V; Aroni, Adamantia; Alikari, Victoria; Ralli, Maria; Bredle, Jason; Zyga, Sofia

    2016-06-01

    According to the World Health Organization (WHO), spirituality is an important domain of quality of life, especially in terminal, life-threatening chronic diseases. For many people, spirituality and religion are not only important dimensions of their existence but also a source of support that contributes to wellbeing and coping with the everyday difficulties of life. The aim of the study was the translation of the FACIT Spiritual Well-Being Scale (FACIT-Sp12) into Greek and the validation of the scale for the Greek population. The FACIT-Sp12 is an anonymous self-administered questionnaire containing twelve closed questions scored on a five-point Likert scale (0=Not at all, 1=A little bit, 2=Somewhat, 3=Quite a bit, 4=Very much). The questionnaire was translated into Greek and then back-translated into English to check for inconsistencies. The sample of the study was 183 chronic kidney disease patients undergoing hemodialysis. Exploratory factor analysis, using principal components analysis with Varimax rotation, was performed to check the construct validity of the questionnaire. Test-retest reliability and internal consistency were also examined. Statistical analysis was performed using SPSS 21.0, with the statistical significance level set at p=0.05. The final Greek version of the questionnaire includes all twelve questions. The mean age of the participants was 61.81±13.9 years. Three factors were extracted from the statistical analysis. The Cronbach's alpha coefficient was 0.77 for the total questionnaire, and for the subscales it was 0.70 for "meaning", 0.73 for "peace" and 0.87 for "faith". Among the three subscales, "meaning" had the highest score (mean 12.49, SD=2.865). The FACIT Spiritual Well-Being Scale (FACIT-Sp12) is a valid and reliable three-dimensional questionnaire that can be used for assessing spirituality and spiritual wellbeing in the Greek population.
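For readers unfamiliar with the internal-consistency statistic reported above, Cronbach's alpha is simple to compute from a respondents-by-items score matrix. The Likert responses below are invented for illustration, not the study's data.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# toy data: three 0-4 Likert items answered by five respondents
scores = np.array([
    [3, 4, 3],
    [2, 2, 1],
    [4, 4, 4],
    [1, 2, 1],
    [3, 3, 2],
])
print(round(cronbach_alpha(scores), 2))  # -> 0.96
```

Values around 0.7 or higher, as reported for the three FACIT-Sp12 subscales, are conventionally taken as acceptable internal consistency.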

  13. Best PhD Thesis Prize: Statistical analysis of ALFALFA galaxies: insights in galaxy

    NASA Astrophysics Data System (ADS)

    Papastergis, E.

    2013-09-01

    We use the rich dataset of local universe galaxies detected by the ALFALFA 21cm survey to study the statistical properties of gas-bearing galaxies. In particular, we measure the number density of galaxies as a function of their baryonic mass ("baryonic mass function") and rotational velocity ("velocity width function"), and we characterize their clustering properties ("two-point correlation function"). These statistical distributions are determined by both the properties of dark matter on small scales, as well as by the complex baryonic processes through which galaxies form over cosmic time. We interpret the ALFALFA measurements with the aid of publicly available cosmological N-body simulations and we present some key results related to galaxy formation and small-scale cosmology.

  14. Methods for evaluating temporal groundwater quality data and results of decadal-scale changes in chloride, dissolved solids, and nitrate concentrations in groundwater in the United States, 1988-2010

    USGS Publications Warehouse

    Lindsey, Bruce D.; Rupert, Michael G.

    2012-01-01

    Decadal-scale changes in groundwater quality were evaluated by the U.S. Geological Survey National Water-Quality Assessment (NAWQA) Program. Samples of groundwater collected from wells during 1988-2000 - a first sampling event representing the decade ending the 20th century - were compared on a pair-wise basis to samples from the same wells collected during 2001-2010 - a second sampling event representing the decade beginning the 21st century. The data set consists of samples from 1,236 wells in 56 well networks, representing major aquifers and urban and agricultural land-use areas, with analytical results for chloride, dissolved solids, and nitrate. Statistical analysis was done on a network basis rather than by individual wells. Although spanning slightly more or less than a 10-year period, the two-sample comparison between the first and second sampling events is referred to as an analysis of decadal-scale change based on a step-trend analysis. The 22 principal aquifers represented by these 56 networks account for nearly 80 percent of the estimated withdrawals of groundwater used for drinking-water supply in the Nation. Well networks where decadal-scale changes in concentrations were statistically significant were identified using the Wilcoxon-Pratt signed-rank test. For the statistical analysis of chloride, dissolved solids, and nitrate concentrations at the network level, more than half revealed no statistically significant change over the decadal period. However, for networks that had statistically significant changes, increased concentrations outnumbered decreased concentrations by a large margin. Statistically significant increases of chloride concentrations were identified for 43 percent of 56 networks. Dissolved solids concentrations increased significantly in 41 percent of the 54 networks with dissolved solids data, and nitrate concentrations increased significantly in 23 percent of 56 networks. 
At least one of the three - chloride, dissolved solids, or nitrate - had a statistically significant increase in concentration in 66 percent of the networks. Statistically significant decreases in concentrations were identified in 4 percent of the networks for chloride, 2 percent of the networks for dissolved solids, and 9 percent of the networks for nitrate. A larger percentage of urban land-use networks had statistically significant increases in chloride, dissolved solids, and nitrate concentrations than agricultural land-use networks. In order to assess the magnitude of statistically significant changes, the median of the differences between constituent concentrations from the first full-network sampling event and those from the second full-network sampling event was calculated using the Turnbull method. The largest median decadal increases in chloride concentrations were in networks in the Upper Illinois River Basin (67 mg/L) and in the New England Coastal Basins (34 mg/L), whereas the largest median decadal decrease in chloride concentrations was in the Upper Snake River Basin (1 mg/L). The largest median decadal increases in dissolved solids concentrations were in networks in the Rio Grande Valley (260 mg/L) and the Upper Illinois River Basin (160 mg/L). The largest median decadal decrease in dissolved solids concentrations was in the Apalachicola-Chattahoochee-Flint River Basin (6.0 mg/L). The largest median decadal increases in nitrate as nitrogen (N) concentrations were in networks in the South Platte River Basin (2.0 mg/L as N) and the San Joaquin-Tulare Basins (1.0 mg/L as N). The largest median decadal decrease in nitrate concentrations was in the Santee River Basin and Coastal Drainages (0.63 mg/L). The magnitude of change in networks with statistically significant increases typically was much larger than the magnitude of change in networks with statistically significant decreases. 
The magnitude of change was greatest for chloride in the urban land-use networks and greatest for dissolved solids and nitrate in the agricultural land-use networks. Analysis of data from all networks combined indicated statistically significant increases for chloride, dissolved solids, and nitrate. Although chloride, dissolved solids, and nitrate concentrations were typically less than the drinking-water standards and guidelines, a statistical test was used to determine whether or not the proportion of samples exceeding the drinking-water standard or guideline changed significantly between the first and second full-network sampling events. The proportion of samples exceeding the U.S. Environmental Protection Agency (USEPA) Secondary Maximum Contaminant Level for dissolved solids (500 milligrams per liter) increased significantly between the first and second full-network sampling events when evaluating all networks combined at the national level. Also, for all networks combined, the proportion of samples exceeding the USEPA Maximum Contaminant Level (MCL) of 10 mg/L as N for nitrate increased significantly. One network in the Delmarva Peninsula had a significant increase in the proportion of samples exceeding the MCL for nitrate. A subset of 261 wells was sampled every other year (biennially) to evaluate decadal-scale changes using a time-series analysis. The analysis of the biennial data set showed that changes were generally similar to the findings from the analysis of decadal-scale change that was based on a step-trend analysis. Because of the small number of wells in a network with biennial data (typically 4-5 wells), the time-series analysis is more useful for understanding water-quality responses to changes in site-specific conditions rather than as an indicator of the change for the entire network.
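The paired step-trend test used above can be illustrated with a small sketch. The chloride concentrations below are invented, and for simplicity this normal-approximation version drops zero differences rather than implementing the Pratt adjustment used in the report.

```python
import numpy as np

def wilcoxon_signed_rank_z(before, after):
    """Normal-approximation Wilcoxon signed-rank z statistic for paired samples.
    Zero differences are dropped (a simplification of the Pratt variant)."""
    d = np.asarray(after, float) - np.asarray(before, float)
    d = d[d != 0.0]
    n = d.size
    ranks = np.abs(d).argsort().argsort() + 1.0  # ranks of |d|; ties not handled
    w_plus = ranks[d > 0].sum()                  # sum of ranks of positive diffs
    mean = n * (n + 1) / 4.0
    sd = np.sqrt(n * (n + 1) * (2 * n + 1) / 24.0)
    return (w_plus - mean) / sd

# hypothetical chloride concentrations (mg/L) for one well network,
# paired by well: decade-1 vs decade-2 sampling events
decade1 = np.array([12.0, 8.5, 15.2, 9.9, 11.4, 20.1, 7.3, 13.8, 10.0, 16.5])
decade2 = np.array([14.1, 9.0, 18.9, 9.9, 13.0, 24.5, 8.1, 15.2, 11.7, 19.0])

z = wilcoxon_signed_rank_z(decade1, decade2)
print(round(z, 2))  # -> 2.67, beyond the two-sided 5% threshold of 1.96
```

A network would then be flagged as having a statistically significant decadal increase when |z| exceeds the chosen critical value.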

  15. Development of the Ethical and Legal Issues in Counseling Self-Efficacy Scale

    ERIC Educational Resources Information Center

    Mullen, Patrick R.; Lambie, Glenn W.; Conley, Abigail H.

    2014-01-01

    The authors present the development of the Ethical and Legal Issues in Counseling Self-Efficacy Scale (ELICSES). The purpose of this article is threefold: (a) present a rationale for the ELICSES, (b) review statistical analysis procedures used to develop the ELICSES, and (c) offer implications for future research and counselor education.

  16. The Theory of Intelligence and Its Measurement

    ERIC Educational Resources Information Center

    Jensen, A. R.

    2011-01-01

    Mental chronometry (MC) studies cognitive processes measured by time. It provides an absolute, ratio scale. The limitations of instrumentation and statistical analysis caused the early studies in MC to be eclipsed by the "paper-and-pencil" psychometric tests started by Binet. However, they use an age-normed, rather than a ratio scale, which…

  17. A Psychometric Investigation of the Marlowe-Crowne Social Desirability Scale Using Rasch Measurement

    ERIC Educational Resources Information Center

    Seol, Hyunsoo

    2007-01-01

    The author used Rasch measurement to examine the reliability and validity of 382 Korean university students' scores on the Marlowe-Crowne Social Desirability Scale (MCSDS; D. P. Crowne and D. Marlowe, 1960). Results revealed that item-fit statistics and principal component analysis with standardized residuals provide evidence of MCSDS'…

  18. Structure of Student Time Management Scale (STMS)

    ERIC Educational Resources Information Center

    Balamurugan, M.

    2013-01-01

    With the aim of constructing a Student Time Management Scale (STMS), the initial version was administered and data were collected from 523 standard eleventh students. (Mean age = 15.64). The data obtained were subjected to Reliability and Factor analysis using PASW Statistical software version 18. From 42 items 14 were dropped, resulting in the…

  19. Multi-Scale Surface Descriptors

    PubMed Central

    Cipriano, Gregory; Phillips, George N.; Gleicher, Michael

    2010-01-01

    Local shape descriptors compactly characterize regions of a surface, and have been applied to tasks in visualization, shape matching, and analysis. Classically, curvature has been used as a shape descriptor; however, this differential property characterizes only an infinitesimal neighborhood. In this paper, we provide shape descriptors for surface meshes designed to be multi-scale, that is, capable of characterizing regions of varying size. These descriptors capture statistically the shape of a neighborhood around a central point by fitting a quadratic surface. They therefore mimic differential curvature, are efficient to compute, and encode anisotropy. We show how simple variants of mesh operations can be used to compute the descriptors without resorting to expensive parameterizations, and additionally provide a statistical approximation for reduced computational cost. We show how these descriptors apply to a number of uses in visualization, analysis, and matching of surfaces, particularly to tasks in protein surface analysis. PMID:19834190

  20. Field-Scale Soil Moisture Observations in Irrigated Agriculture Fields Using the Cosmic-ray Neutron Rover

    NASA Astrophysics Data System (ADS)

    Franz, T. E.; Avery, W. A.; Finkenbiner, C. E.; Wang, T.; Brocca, L.

    2014-12-01

    Approximately 40% of global food production comes from irrigated agriculture. With the increasing demand for food, even greater pressures will be placed on water resources within these systems. In this work we aimed to characterize the spatial and temporal patterns of soil moisture at the field scale (~500 m) using the newly developed cosmic-ray neutron rover near Waco, NE. Here we mapped soil moisture of 144 quarter-section fields (a mix of maize, soybean, and natural areas) each week during the 2014 growing season (May to September). The 11 x 11 km study domain also contained 3 stationary cosmic-ray neutron probes for independent validation of the rover surveys. Basic statistical analysis of the domain indicated a strong inverted parabolic relationship between the mean and variance of soil moisture. The relationships between the mean and higher-order moments were not as strong. Geostatistical analysis indicated that the range of the soil moisture semi-variogram was significantly shorter during periods of heavy irrigation than during non-irrigated periods. Scaling analysis indicated strong power-law behavior between the variance of soil moisture and averaging area, with minimal dependence of the slope of the power-law function on mean soil moisture. Statistical relationships derived from the rover dataset offer a novel set of observations that will be useful in: 1) calibrating and validating land surface models, 2) calibrating and validating crop models, 3) estimating soil moisture covariance for statistical downscaling of remote sensing products such as SMOS and SMAP, and 4) providing center-pivot-scale mean soil moisture data for optimal irrigation timing and volume amounts.
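The semi-variogram range analysis mentioned above rests on the empirical semivariogram, which is easy to compute directly. The sketch below uses an invented synthetic soil-moisture field, not the rover data.

```python
import numpy as np

def semivariogram(coords, values, bins):
    """Empirical isotropic semivariogram: half the mean squared difference
    of values, grouped into bins of pairwise separation distance."""
    coords = np.asarray(coords, float)
    values = np.asarray(values, float)
    i, j = np.triu_indices(len(values), k=1)          # all point pairs
    h = np.linalg.norm(coords[i] - coords[j], axis=1)  # separation distances
    sq = 0.5 * (values[i] - values[j]) ** 2
    idx = np.digitize(h, bins)
    return np.array([sq[idx == b].mean() if np.any(idx == b) else np.nan
                     for b in range(1, len(bins))])

# synthetic soil-moisture field with spatial structure on a ~50 m length scale
rng = np.random.default_rng(1)
xy = rng.uniform(0, 500, size=(300, 2))               # 300 points in a 500 m square
base = np.sin(xy[:, 0] / 50.0) + np.sin(xy[:, 1] / 50.0)
vals = 0.25 + 0.05 * base + rng.normal(0, 0.01, 300)
gamma = semivariogram(xy, vals, bins=np.arange(0, 300, 50))
print(gamma.shape)  # one semivariance value per distance bin -> (5,)
```

The "range" reported in the abstract is the separation distance at which such a curve levels off; shorter ranges under heavy irrigation mean soil moisture decorrelates over shorter distances.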

  1. A statistical analysis of the elastic distortion and dislocation density fields in deformed crystals

    DOE PAGES

    Mohamed, Mamdouh S.; Larson, Bennett C.; Tischler, Jonathan Z.; ...

    2015-05-18

    The statistical properties of the elastic distortion fields of dislocations in deforming crystals are investigated using the method of discrete dislocation dynamics to simulate dislocation structures and dislocation density evolution under tensile loading. Probability distribution functions (PDF) and pair correlation functions (PCF) of the simulated internal elastic strains and lattice rotations are generated for tensile strain levels up to 0.85%. The PDFs of simulated lattice rotation are compared with sub-micrometer resolution three-dimensional X-ray microscopy measurements of rotation magnitudes and deformation length scales in 1.0% and 2.3% compression strained Cu single crystals to explore the linkage between experiment and the theoretical analysis. The statistical properties of the deformation simulations are analyzed through determinations of the Nye and Kröner dislocation density tensors. The significance of the magnitudes and the length scales of the elastic strain and the rotation parts of dislocation density tensors are demonstrated, and their relevance to understanding the fundamental aspects of deformation is discussed.

  2. Analysis of a genetically structured variance heterogeneity model using the Box-Cox transformation.

    PubMed

    Yang, Ye; Christensen, Ole F; Sorensen, Daniel

    2011-02-01

    Over recent years, statistical support for the presence of genetic factors operating at the level of the environmental variance has come from fitting a genetically structured heterogeneous variance model to field or experimental data in various species. Misleading results may arise due to skewness of the marginal distribution of the data. To investigate how the scale of measurement affects inferences, the genetically structured heterogeneous variance model is extended to accommodate the family of Box-Cox transformations. Litter size data in rabbits and pigs that had previously been analysed in the untransformed scale were reanalysed in a scale equal to the mode of the marginal posterior distribution of the Box-Cox parameter. In the rabbit data, the statistical evidence for a genetic component at the level of the environmental variance is considerably weaker than that resulting from an analysis in the original metric. In the pig data, the statistical evidence is stronger, but the coefficient of correlation between additive genetic effects affecting mean and variance changes sign, compared to the results in the untransformed scale. The study confirms that inferences on variances can be strongly affected by the presence of asymmetry in the distribution of data. We recommend that to avoid one important source of spurious inferences, future work seeking support for a genetic component acting on environmental variation using a parametric approach based on normality assumptions confirms that these are met.
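The Box-Cox family used above has the form y(λ) = (y^λ − 1)/λ, with the log transform at λ = 0. As a hedged stand-in for the Bayesian posterior-mode approach in the paper, the sketch below profiles the standard normal-model log-likelihood over λ on synthetic skewed data; the grid and sample are illustrative.

```python
import numpy as np

def boxcox(y, lam):
    """Box-Cox transform of positive data y."""
    y = np.asarray(y, float)
    return np.log(y) if lam == 0 else (y**lam - 1.0) / lam

def boxcox_loglik(y, lam):
    """Profile log-likelihood of lambda under a normal model for the
    transformed data, including the Jacobian term (lam - 1) * sum(log y)."""
    z = boxcox(y, lam)
    n = y.size
    return -0.5 * n * np.log(z.var()) + (lam - 1.0) * np.log(y).sum()

# strongly right-skewed data: a log-normal sample, so lambda near 0 should win
rng = np.random.default_rng(7)
y = rng.lognormal(mean=1.0, sigma=0.8, size=2000)
grid = np.linspace(-1.0, 1.0, 41)
best = grid[np.argmax([boxcox_loglik(y, l) for l in grid])]
print(abs(best) <= 0.1)  # the estimate sits close to the log transform -> True
```

Re-examining inferences at the likelihood-preferred λ, rather than on the raw scale, is exactly the safeguard against skewness-driven artifacts that the abstract recommends.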

  3. Statistical analysis of hydrological response in urbanising catchments based on adaptive sampling using inter-amount times

    NASA Astrophysics Data System (ADS)

    ten Veldhuis, Marie-Claire; Schleiss, Marc

    2017-04-01

    Urban catchments are typically characterised by a more flashy nature of the hydrological response compared to natural catchments. Predicting flow changes associated with urbanisation is not straightforward, as they are influenced by interactions between impervious cover, basin size, drainage connectivity and stormwater management infrastructure. In this study, we present an alternative approach to statistical analysis of hydrological response variability and basin flashiness, based on the distribution of inter-amount times. We analyse inter-amount time distributions of high-resolution streamflow time series for 17 (semi-)urbanised basins in North Carolina, USA, ranging from 13 to 238 km2 in size. We show that in the inter-amount-time framework, sampling frequency is tuned to the local variability of the flow pattern, resulting in a different representation and weighting of high and low flow periods in the statistical distribution. This leads to important differences in the way the distribution quantiles, mean, coefficient of variation and skewness vary across scales and results in lower mean intermittency and improved scaling. Moreover, we show that inter-amount-time distributions can be used to detect regulation effects on flow patterns, identify critical sampling scales and characterise flashiness of hydrological response. The possibility to use both the classical approach and the inter-amount-time framework to identify minimum observable scales and analyse flow data opens up interesting areas for future research.
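The inter-amount-time idea above can be sketched as follows: instead of sampling flow at fixed time steps, one records the time needed to accumulate each successive fixed amount of discharge. The series below are artificial, and the linear interpolation of the cumulative-volume curve is a simplification of the authors' method.

```python
import numpy as np

def inter_amount_times(flow, dt, amount):
    """Times needed to accumulate each successive fixed `amount` of flow,
    by linear interpolation of the cumulative volume curve."""
    cum = np.concatenate([[0.0], np.cumsum(flow) * dt])
    t = np.arange(cum.size) * dt
    targets = np.arange(amount, cum[-1], amount)
    crossing_times = np.interp(targets, cum, t)
    return np.diff(np.concatenate([[0.0], crossing_times]))

# steady vs flashy hydrograph (flashy: rare, short, intense pulses)
steady = np.full(1000, 1.0)
flashy = np.zeros(1000)
flashy[::50] = 49.3
iat_steady = inter_amount_times(steady, 1.0, 10.0)
iat_flashy = inter_amount_times(flashy, 1.0, 10.0)
print(iat_flashy.std() > iat_steady.std())  # True: flashiness shows up as IAT spread
```

In this framework sampling is dense during high flows and sparse during low flows, which is why the distribution of inter-amount times carries a direct signature of basin flashiness.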

  4. A note on statistical analysis of shape through triangulation of landmarks

    PubMed Central

    Rao, C. Radhakrishna

    2000-01-01

    In an earlier paper, the author jointly with S. Suryawanshi proposed statistical analysis of shape through triangulation of landmarks on objects. It was observed that the angles of the triangles are invariant to scaling, location, and rotation of objects. No distinction was made between an object and its reflection. The present paper provides the methodology of shape discrimination when reflection is also taken into account and makes suggestions for modifications to be made when some of the landmarks are collinear. PMID:10737780

  5. Online Statistical Modeling (Regression Analysis) for Independent Responses

    NASA Astrophysics Data System (ADS)

    Made Tirta, I.; Anggraeni, Dian; Pandutama, Martinus

    2017-06-01

    Regression analysis (statistical modelling) is among the statistical methods most frequently needed in analyzing quantitative data, especially to model the relationship between response and explanatory variables. Nowadays, statistical models have been developed in various directions to model various types of complex relationships in data. A rich variety of advanced and recent statistical modelling approaches is mostly available in open source software (one of them is R). However, these advanced statistical modelling tools are not very friendly to novice R users, since they are based on programming scripts or a command line interface. Our research aims to develop a web interface (based on R and Shiny), so that the most recent and advanced statistical modelling approaches are readily available, accessible and applicable on the web. We have previously made an interface in the form of an e-tutorial for several modern and advanced statistical modelling approaches in R, especially for independent responses (including linear models/LM, generalized linear models/GLM, generalized additive models/GAM and generalized additive models for location, scale and shape/GAMLSS). In this research we unified them in the form of data analysis, including models using computer-intensive statistics (bootstrap and Markov chain Monte Carlo/MCMC). All are readily accessible in our online Virtual Statistics Laboratory. The web interface makes statistical modelling easier to apply and easier to compare in order to find the most appropriate model for the data.

  6. Developing the Evaluation Scale to Determine the Impact of Body Language in an Argument: Reliability & Validity Analysis

    ERIC Educational Resources Information Center

    Karadag, Engin; Caliskan, Nihat; Yesil, Rustu

    2008-01-01

    This research aims to develop a scale to observe the body language used during an argument. A sample group of 266 teacher candidates studying in the departments of Class, Turkish or Social Sciences at the Faculty of Education was used in this study. A logical and statistical approach was pursued during the development of the scale. An…

  7. Disaster Metrics: Evaluation of de Boer's Disaster Severity Scale (DSS) Applied to Earthquakes.

    PubMed

    Bayram, Jamil D; Zuabi, Shawki; McCord, Caitlin M; Sherak, Raphael A G; Hsu, Edberdt B; Kelen, Gabor D

    2015-02-01

    Quantitative measurement of the medical severity following multiple-casualty events (MCEs) is an important goal in disaster medicine. In 1990, de Boer proposed a 13-point, 7-parameter scale called the Disaster Severity Scale (DSS). Parameters include cause, duration, radius, number of casualties, nature of injuries, rescue time, and effect on the surrounding community. Hypothesis: This study aimed to examine the reliability and dimensionality (number of salient themes) of de Boer's DSS through its application to 144 discrete earthquake events. A search for earthquake events was conducted via the National Oceanic and Atmospheric Administration (NOAA) and US Geological Survey (USGS) databases. Two experts in the field of disaster medicine independently reviewed and assigned scores for parameters for which no data were readily available (nature of injuries, rescue time, and effect on the surrounding community), and differences were reconciled via consensus. Principal Component Analysis was performed using SPSS Statistics for Windows Version 22.0 (IBM Corp; Armonk, New York USA) to evaluate the reliability and dimensionality of the DSS. A total of 144 individual earthquakes from 2003 through 2013 were identified and scored. Of 13 points possible, the mean score was 6.04, the mode 5, the minimum 4, the maximum 11, and the standard deviation 2.23. Three parameters in the DSS had zero variance (ie, each parameter received the same score in all 144 earthquakes). Because of their zero contribution to variance, these three parameters (cause, duration, and radius) were removed before the statistical analysis. Cronbach's alpha, a coefficient of internal consistency, for the remaining four parameters was robust at 0.89. Principal Component Analysis showed uni-dimensional characteristics, with only one component having an eigenvalue greater than one (3.17). The 4-parameter DSS, however, suffered from restriction of scoring range at both the parameter and scale levels. 
De Boer's DSS in its 7-parameter format fails to hold statistically in a dataset of 144 earthquakes subjected to analysis. A modified 4-parameter scale was found to assess medical severity more directly, but remains flawed due to range restriction at both the individual-parameter and scale levels. Further research is needed in the field of disaster metrics to develop a scale that is reliable across its complete set of parameters, capable of finer discrimination, and uni-dimensional in its measurement of the medical severity of MCEs.
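The eigenvalue-greater-than-one (Kaiser) criterion used above is easy to reproduce from a correlation matrix. The sketch below applies it to synthetic single-factor item scores (invented data, not the earthquake scores).

```python
import numpy as np

def kaiser_count(scores):
    """Number of principal components of the item correlation matrix with
    eigenvalue > 1 -- the criterion used to judge uni-dimensionality."""
    r = np.corrcoef(np.asarray(scores, float), rowvar=False)
    return int(np.sum(np.linalg.eigvalsh(r) > 1.0))

# toy 4-item scale driven by a single latent severity factor
rng = np.random.default_rng(3)
latent = rng.normal(size=500)
items = latent[:, None] * np.array([0.9, 0.8, 0.85, 0.7]) + rng.normal(0, 0.4, (500, 4))
print(kaiser_count(items))  # -> 1: a single dominant component
```

A count of one, as found for the 4-parameter DSS, is what a uni-dimensional scale should produce.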

  8. Scaling laws and fluctuations in the statistics of word frequencies

    NASA Astrophysics Data System (ADS)

    Gerlach, Martin; Altmann, Eduardo G.

    2014-11-01

    In this paper, we combine statistical analysis of written texts and simple stochastic models to explain the appearance of scaling laws in the statistics of word frequencies. The average vocabulary of an ensemble of fixed-length texts is known to scale sublinearly with the total number of words (Heaps’ law). Analyzing the fluctuations around this average in three large databases (Google-ngram, English Wikipedia, and a collection of scientific articles), we find that the standard deviation scales linearly with the average (Taylor's law), in contrast to the prediction of decaying fluctuations obtained using simple sampling arguments. We explain both scaling laws (Heaps’ and Taylor's) by modeling the usage of words as a Poisson process with a fat-tailed distribution of word frequencies (Zipf's law) and topic-dependent frequencies of individual words (as in topic models). Considering topical variations leads to quenched averages, turns the vocabulary size into a non-self-averaging quantity, and explains the empirical observations. For the numerous practical applications relying on estimations of vocabulary size, our results show that uncertainties remain large even for long texts. We show how to account for these uncertainties in measurements of the lexical richness of texts with different lengths.
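The sublinear vocabulary growth discussed above can be reproduced with a minimal sampling simulation (the vocabulary size, Zipf exponent, and text lengths are all illustrative choices, not the paper's fitted values): words are drawn from a Zipf frequency distribution and distinct types are counted.

```python
import numpy as np

rng = np.random.default_rng(0)
W = 50_000                               # vocabulary of word types
freq = 1.0 / np.arange(1, W + 1) ** 1.0  # Zipf's law, exponent 1
freq /= freq.sum()

def vocab_size(n_tokens):
    """Vocabulary (distinct-type count) of one random text of n_tokens words."""
    counts = rng.multinomial(n_tokens, freq)
    return int(np.count_nonzero(counts))

n1, n2 = 10_000, 100_000
v1, v2 = vocab_size(n1), vocab_size(n2)
# Heaps' law: a 10x longer text has far less than 10x the vocabulary
print(v2 / v1 < n2 / n1)  # -> True
```

Repeating the draw many times at fixed length and examining the spread of the vocabulary counts is the fluctuation analysis (Taylor's law) that the paper performs on real corpora.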

  9. Aftershock Sequences and Seismic-Like Organization of Acoustic Events Produced by a Single Propagating Crack

    NASA Astrophysics Data System (ADS)

    Alizee, D.; Bonamy, D.

    2017-12-01

    In inhomogeneous brittle solids like rocks, concrete or ceramics, one usually distinguishes nominally brittle fracture, driven by the propagation of a single crack, from quasibrittle fracture, which results from the accumulation of many microcracks. The latter is accompanied by intermittent sharp noise, as revealed e.g. by the acoustic emission observed in lab-scale compressive fracture experiments or, at the geophysical scale, in seismic activity. In both cases, statistical analyses have revealed a complex time-energy organization into aftershock sequences obeying a range of robust empirical scaling laws (the Omori-Utsu, productivity and Bath's laws) that help carry out seismic hazard analysis and damage mitigation. These laws are usually conjectured to emerge from the collective dynamics of microcrack nucleation. In the experiments presented at AGU, we will show that such a statistical organization is not specific to quasi-brittle multicracking situations, but also rules the acoustic events produced by a single crack slowly driven in an artificial rock made of sintered polymer beads. This simpler situation has advantageous properties (statistical stationarity in particular) permitting us to uncover the origins of these seismic laws: both the productivity law and Bath's law result from the scale-free statistics of event energy, and the Omori-Utsu law results from the scale-free statistics of inter-event times. This yields analytically derived predictions on how the associated parameters are related. Surprisingly, the so-obtained relations are also compatible with observations from lab-scale compressive fracture experiments, suggesting that, in these complex multicracking situations, the organization into aftershock sequences and the associated seismic laws are also ruled by the propagation of individual microcrack fronts, and not by the collective, stress-mediated nucleation of microcracks. 
Conversely, the relations are not fulfilled in seismology signals, suggesting that additional ingredients should be taken into account.
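The scale-free statistics invoked above are typically quantified by a maximum-likelihood power-law exponent. A numpy sketch on synthetic Pareto-distributed event energies (the true exponent 1.7 is an arbitrary illustrative value, not from the experiments):

```python
import numpy as np

def powerlaw_mle(x, xmin):
    """Maximum-likelihood exponent of P(x) ~ x**(-alpha) for x >= xmin
    (the continuous Hill estimator commonly used for event energies)."""
    x = np.asarray(x, float)
    x = x[x >= xmin]
    return 1.0 + x.size / np.sum(np.log(x / xmin))

# synthetic acoustic-event energies: Pareto with alpha = 1.7, xmin = 1
rng = np.random.default_rng(5)
u = rng.uniform(size=100_000)
energies = (1.0 - u) ** (-1.0 / 0.7)   # inverse-CDF sampling
print(round(powerlaw_mle(energies, 1.0), 1))  # -> 1.7
```

Fitting such exponents to event energies and inter-event times is the first step toward the parameter relations (productivity, Bath's and Omori-Utsu laws) that the abstract derives.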

  10. Demonstrating microbial co-occurrence pattern analyses within and between ecosystems

    PubMed Central

    Williams, Ryan J.; Howe, Adina; Hofmockel, Kirsten S.

    2014-01-01

    Co-occurrence patterns are used in ecology to explore interactions between organisms and environmental effects on coexistence within biological communities. Analysis of co-occurrence patterns among microbial communities has ranged from simple pairwise comparisons between all community members to direct hypothesis testing between focal species. However, co-occurrence patterns are rarely studied across multiple ecosystems or multiple scales of biological organization within the same study. Here we outline an approach to produce co-occurrence analyses that are focused at three different scales: co-occurrence patterns between ecosystems at the community scale, modules of co-occurring microorganisms within communities, and co-occurring pairs within modules that are nested within microbial communities. To demonstrate our co-occurrence analysis approach, we gathered publicly available 16S rRNA amplicon datasets to compare and contrast microbial co-occurrence at different taxonomic levels across different ecosystems. We found differences in community composition and co-occurrence that reflect environmental filtering at the community scale and consistent pairwise occurrences that may be used to infer ecological traits about poorly understood microbial taxa. However, we also found that conclusions derived from applying network statistics to microbial relationships can vary depending on the taxonomic level chosen and criteria used to build co-occurrence networks. We present our statistical analysis and code for public use in analysis of co-occurrence patterns across microbial communities. PMID:25101065
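A minimal version of the pairwise co-occurrence step described above, using a simple correlation threshold on an invented taxa-by-samples abundance table (real analyses typically add significance filtering and multiple-test correction, as the abstract's caveats imply):

```python
import numpy as np

def cooccurrence_edges(abundance, threshold=0.7):
    """Pairs of taxa whose abundance profiles across samples are strongly
    positively correlated -- the edge set of a simple co-occurrence network."""
    r = np.corrcoef(np.asarray(abundance, float))
    iu = np.triu_indices_from(r, k=1)         # each taxon pair once
    mask = r[iu] > threshold
    return [(int(a), int(b)) for a, b in zip(iu[0][mask], iu[1][mask])]

# toy abundance table: rows are taxa, columns are 8 samples
abundance = np.array([
    [10, 20, 30, 40, 35, 25, 15, 5],   # taxon 0
    [11, 22, 28, 41, 33, 27, 14, 6],   # taxon 1: tracks taxon 0
    [40, 30, 20, 10,  8, 18, 33, 42],  # taxon 2: anti-correlated with 0
])
print(cooccurrence_edges(abundance))  # -> [(0, 1)]
```

As the abstract warns, the resulting network, and any statistics computed on it, depends heavily on the chosen threshold and taxonomic level.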

  11. Field-aligned currents' scale analysis performed with the Swarm constellation

    NASA Astrophysics Data System (ADS)

    Lühr, Hermann; Park, Jaeheung; Gjerloev, Jesper W.; Rauberg, Jan; Michaelis, Ingo; Merayo, Jose M. G.; Brauer, Peter

    2015-01-01

    We present a statistical study of the temporal- and spatial-scale characteristics of different field-aligned current (FAC) types derived with the Swarm satellite formation. We divide FACs into two classes: small-scale FACs, up to some 10 km, which are carried predominantly by kinetic Alfvén waves, and large-scale FACs with sizes of more than 150 km. For determining temporal variability we consider measurements at the same point, the orbital crossovers near the poles, but at different times. From correlation analysis we obtain a persistence period for small-scale FACs of order 10 s, while large-scale FACs can be regarded as stationary for more than 60 s. For the first time we investigate the longitudinal scales. Large-scale FACs differ between dayside and nightside. On the nightside the longitudinal extension is on average 4 times the latitudinal width, while on the dayside, particularly in the cusp region, latitudinal and longitudinal scales are comparable.

  12. Predicting Slag Generation in Sub-Scale Test Motors Using a Neural Network

    NASA Technical Reports Server (NTRS)

    Wiesenberg, Brent

    1999-01-01

    Generation of slag (aluminum oxide) is an important issue for the Reusable Solid Rocket Motor (RSRM). Thiokol performed testing to quantify the relationship between raw material variations and slag generation in solid propellants by testing sub-scale motors cast with propellant containing various combinations of aluminum fuel and ammonium perchlorate (AP) oxidizer particle sizes. The test data were analyzed using statistical methods and an artificial neural network. This paper primarily addresses the neural network results with some comparisons to the statistical results. The neural network showed that the particle sizes of both the aluminum and unground AP have a measurable effect on slag generation. The neural network analysis showed that aluminum particle size is the dominant driver in slag generation, about 40% more influential than AP. The network predictions of the amount of slag produced during firing of sub-scale motors were 16% better than the predictions of a statistically derived empirical equation. Another neural network successfully characterized the slag generated during full-scale motor tests. The success is attributable to the ability of neural networks to characterize multiple complex factors including interactions that affect slag generation.

  13. Sensitivity of the Halstead and Wechsler Test Batteries to brain damage: Evidence from Reitan's original validation sample.

    PubMed

    Loring, David W; Larrabee, Glenn J

    2006-06-01

    The Halstead-Reitan Battery has been instrumental in the development of neuropsychological practice in the United States. Although Reitan administered both the Wechsler-Bellevue Intelligence Scale and Halstead's test battery when evaluating Halstead's theory of biologic intelligence, the relative sensitivity of each test battery to brain damage continues to be an area of controversy. Because Reitan did not perform direct parametric analysis to contrast group performances, we reanalyze Reitan's original validation data from both Halstead (Reitan, 1955) and Wechsler batteries (Reitan, 1959a) and calculate effect sizes and probability levels using traditional parametric approaches. Eight of the 10 tests comprising Halstead's original Impairment Index, as well as the Impairment Index itself, statistically differentiated patients with unequivocal brain damage from controls. In addition, 13 of 14 Wechsler measures including Full-Scale IQ also differed statistically between groups (Brain Damage Full-Scale IQ = 96.2; Control Group Full Scale IQ = 112.6). We suggest that differences in the statistical properties of each battery (e.g., raw scores vs. standardized scores) likely contribute to classification characteristics including test sensitivity and specificity.
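The effect-size side of this reanalysis can be illustrated with a pooled-standard-deviation Cohen's d. The Full-Scale IQ means come from the abstract; the standard deviations and group sizes below are invented purely to make the sketch runnable.

```python
# Sketch: Cohen's d from group summary statistics (the kind of
# standardized contrast computed in the reanalysis). SDs and Ns here
# are assumptions, not the study's values.
from math import sqrt

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Standardized mean difference with a pooled standard deviation."""
    pooled = sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled

# Control FSIQ 112.6 vs brain-damage FSIQ 96.2 (SDs and Ns assumed).
d = cohens_d(112.6, 14.0, 50, 96.2, 14.0, 50)
print(round(d, 2))  # → 1.17
```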

  14. Multifractal analysis of geophysical time series in the urban lake of Créteil (France).

    NASA Astrophysics Data System (ADS)

    Mezemate, Yacine; Tchiguirinskaia, Ioulia; Bonhomme, Celine; Schertzer, Daniel; Lemaire, Bruno Jacques; Vinçon leite, Brigitte; Lovejoy, Shaun

    2013-04-01

Urban water bodies take part in the environmental quality of the cities. They regulate heat, contribute to the beauty of landscape and give some space for leisure activities (aquatic sports, swimming). As they are often artificial, they are only a few meters deep. This confers specific properties on them. Indeed, they are particularly sensitive to global environmental changes, including climate change, eutrophication and contamination by micro-pollutants due to the urbanization of the watershed. Monitoring their quality has become a major challenge for urban areas. The need for a tool for predicting short-term proliferation of potentially toxic phytoplankton therefore arises. In lakes, the behavior of biological and physical (temperature) fields is mainly driven by the turbulence regime in the water. Turbulence is highly nonlinear, nonstationary and intermittent. This is why statistical tools are needed to characterize the evolution of the fields. The knowledge of the probability distribution of all the statistical moments of a given field is necessary to fully characterize it. This possibility is offered by the multifractal analysis based on the assumption of scale invariance. To investigate the effect of space-time variability of temperature, chlorophyll and dissolved oxygen on the cyanobacteria proliferation in the urban lake of Creteil (France), a spectral analysis is first performed on each time series (or on subsamples) to have an overall estimate of their scaling behaviors. Then a multifractal analysis (Trace Moment, Double Trace Moment) estimates the statistical moments of different orders. This analysis is adapted to the specific properties of the studied time series, i.e., the presence of large scale gradients. 
The nonlinear behavior of the scaling functions K(q) confirms that the investigated aquatic time series are indeed multifractal and highly intermittent. The knowledge of the universal multifractal parameters is the key to calculating the different statistical moments and thus making some predictions on the fields. In conclusion, the relationships between the fields will be highlighted with a discussion on the cross predictability of the different fields. This draws a prospective for the use of this kind of time series analysis in the field of limnology. The authors acknowledge the financial support from the R2DS-PLUMMME and Climate-KIC BlueGreenDream projects.
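The trace-moment step can be sketched in Python: estimate K(q) as the log-log slope of the q-th moment of the coarse-grained field against the scale ratio. The multiplicative-cascade test signal and the dyadic scaling range below are illustrative assumptions, not the lake data; for a multifractal signal the estimated K(q) is nonlinear in q, with K(1) = 0 for a conserved field.

```python
# Sketch of a trace-moment estimate of K(q) on a toy multiplicative
# cascade (a standard synthetic multifractal), not the measured series.
import random
from math import log

def cascade(levels, p=0.7):
    """Discrete conserved binomial cascade of length 2**levels."""
    eps = [1.0]
    for _ in range(levels):
        nxt = []
        for e in eps:
            w = (2 * p, 2 * (1 - p))
            if random.random() < 0.5:
                w = w[::-1]
            nxt += [e * w[0], e * w[1]]
        eps = nxt
    return eps

def coarse_grain(eps, box):
    return [sum(eps[i:i + box]) / box for i in range(0, len(eps), box)]

def K_of_q(eps, q):
    """Slope of log <eps_lambda^q> versus log(lambda) over dyadic scales."""
    n = len(eps)
    mean = sum(eps) / n
    eps = [e / mean for e in eps]            # normalize so <eps> = 1
    xs, ys = [], []
    box = 1
    while box <= n // 8:
        field = coarse_grain(eps, box)
        lam = n / box                        # scale ratio
        moment = sum(f ** q for f in field) / len(field)
        xs.append(log(lam))
        ys.append(log(moment))
        box *= 2
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    return num / sum((a - mx) ** 2 for a in xs)

random.seed(2)
eps = cascade(10)                            # 1024 values
```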

  15. Factors Affecting the Inter-annual to Centennial Time Scale Variability of All Indian Summer Monsoon Rainfall

    NASA Astrophysics Data System (ADS)

    Malik, Abdul; Brönnimann, Stefan

    2016-04-01

The All Indian Summer Monsoon Rainfall (AISMR) is highly important for the livelihood of more than 1 billion people living in the Indian sub-continent. The agriculture of this region is heavily dependent on seasonal (JJAS) monsoon rainfall. An early start or a slight delay of the monsoon, or an early withdrawal or prolonged monsoon season, may upset farmers' agricultural plans and cause significant reductions in crop yield, and hence economic loss. Understanding AISMR is also vital because it is part of the global atmospheric circulation system. Several studies show that AISMR is influenced by internal climate forcings (ICFs), viz. ENSO, AMO, and PDO, as well as external climate forcings (ECFs), viz. greenhouse gases, volcanic eruptions, and Total Solar Irradiance (TSI). We investigate the influence of ICFs and ECFs on AISMR using a recently developed statistical technique called De-trended Partial-Cross-Correlation Analysis (DPCCA). DPCCA can analyse a complex system of several interlinked variables. Often, climatic variables, being cross-correlated, are simultaneously tele-connected with several other variables, and it is not easy to isolate their intrinsic relationship. In the presence of non-stationarities and background signals the calculated correlation coefficients can be overestimated and erroneous. The DPCCA method removes the non-stationarities and partials out the influence of background signals from the variables being cross-correlated, and thus gives a robust estimate of correlation. We have performed the analysis using NOAA Reconstructed SSTs and a homogenised instrumental AISMR data set from 1854 to 1999. By employing the DPCCA method we find that there is a statistically insignificant negative intrinsic relation (excluding the influence of ICFs, and ECFs except TSI) between AISMR and TSI on decadal to centennial time scales. 
The ICFs considerably modulate the relation between AISMR and solar activity on 50-80 year time scales and transform this relationship into a statistically significant positive one. We conclude that the positive relation between AISMR and solar activity, as found by other authors, is due to the combined effect of AMO, PDO, and multi-decadal ENSO variability on AISMR. Solar activity influences the ICFs, and this influence is then transmitted to AISMR. Further, we find a statistically significant positive intrinsic relation between AISMR and AMO on 26 to 100 year time scales, which is modulated by ICFs (PDO, ENSO) and ECFs. PDO, ENSO, and solar activity weaken this intrinsic relationship, whereas the combined effect of ECFs (solar activity, volcanic eruptions, CO2, and tropospheric aerosol optical depth) strengthens it on 70 to 100 year time scales. There is a negative intrinsic relation between AISMR and PDO which is not statistically significant at any time scale. However, this relationship becomes statistically significant in the presence of the combined effect of North Atlantic SSTs and ENSO (4-39 year time scales) and the individual effect of TSI (3-26 year time scales) on AISMR. We also find a statistically significant negative relationship between AISMR and ENSO on inter-annual to centennial time scales, the strength of which is modulated by solar activity on 3 to 40 year time scales.
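DPCCA builds on detrended cross-correlation analysis, with the influence of covariates additionally partialled out. A minimal Python sketch of the underlying DCCA cross-correlation coefficient (without the "partial" step, and on synthetic series rather than SST/AISMR data) is:

```python
# Sketch of the DCCA coefficient: integrate each series into a profile,
# detrend linearly in non-overlapping windows, and form the ratio of
# the detrended covariance to the detrended variances.
import random
from math import sqrt

def profile(x):
    """Cumulative sum of deviations from the mean."""
    m = sum(x) / len(x)
    out, s = [], 0.0
    for v in x:
        s += v - m
        out.append(s)
    return out

def detrended(seg):
    """Residuals of a linear least-squares fit within one window."""
    n = len(seg)
    mx = (n - 1) / 2
    my = sum(seg) / n
    num = sum((i - mx) * (y - my) for i, y in enumerate(seg))
    den = sum((i - mx) ** 2 for i in range(n))
    b = num / den
    a = my - b * mx
    return [y - (a + b * i) for i, y in enumerate(seg)]

def dcca_rho(x, y, win):
    px, py = profile(x), profile(y)
    fxy = fxx = fyy = 0.0
    for s in range(0, len(x) - win + 1, win):
        rx = detrended(px[s:s + win])
        ry = detrended(py[s:s + win])
        fxy += sum(a * b for a, b in zip(rx, ry))
        fxx += sum(a * a for a in rx)
        fyy += sum(b * b for b in ry)
    return fxy / sqrt(fxx * fyy)

random.seed(0)
x = [random.gauss(0, 1) for _ in range(512)]
y = [v + 0.2 * random.gauss(0, 1) for v in x]   # strongly coupled pair
```

The window length plays the role of the time scale, which is how scale-dependent relationships such as those above are obtained.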

  16. Feature-Based Statistical Analysis of Combustion Simulation Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, J; Krishnamoorthy, V; Liu, S

    2011-11-18

We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling among other disciplines. They are also characterized by coherent structure or organized motion, i.e. nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information, and hence, fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. Hence, it is of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g. temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields, e.g. temperature, as well as length-scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as cumulative distribution functions (CDFs), histograms, or time-series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. 
We highlight the utility of this new framework for combustion science; however, it is applicable to many other science domains.
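The streaming pre-computation of per-feature statistical moments can be sketched with a single-pass, mergeable accumulator (a standard parallel variance update); whether the framework uses exactly this formulation is an assumption, and the values pushed below are arbitrary.

```python
# Sketch: single-pass moment accumulation that can be merged across
# features or processors, the kind of update streaming statistics rely on.
class Moments:
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def push(self, x):
        """Welford's online update with one new sample."""
        self.n += 1
        d = x - self.mean
        self.mean += d / self.n
        self.m2 += d * (x - self.mean)

    def merge(self, other):
        """Combine two accumulators exactly (parallel/mergeable form)."""
        if other.n == 0:
            return self
        n = self.n + other.n
        d = other.mean - self.mean
        self.m2 += other.m2 + d * d * self.n * other.n / n
        self.mean += d * other.n / n
        self.n = n
        return self

    @property
    def variance(self):
        """Population variance of everything pushed so far."""
        return self.m2 / self.n if self.n else 0.0
```

Because accumulators merge exactly, moments for a feature can be assembled from the moments of its sub-regions, which is what makes a merge-tree-based pre-process feasible.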

  17. DEIVA: a web application for interactive visual analysis of differential gene expression profiles.

    PubMed

    Harshbarger, Jayson; Kratz, Anton; Carninci, Piero

    2017-01-07

Differential gene expression (DGE) analysis is a technique to identify statistically significant differences in RNA abundance for genes or arbitrary features between different biological states. The result of a DGE test is typically further analyzed using statistical software, spreadsheets or custom ad hoc algorithms. We identified a need for a web-based system to share DGE statistical test results, and to locate and identify genes in DGE statistical test results, with a very low barrier of entry. We have developed DEIVA, a free and open-source, browser-based single-page application (SPA) with a strong emphasis on user friendliness, which enables locating and identifying single or multiple genes in an immediate, interactive, and intuitive manner. By design, DEIVA scales with very large numbers of users and datasets. Compared to existing software, DEIVA offers a unique combination of design decisions that enable inspection and analysis of DGE statistical test results with an emphasis on ease of use.
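The core filtering operation such a tool performs can be sketched as selecting genes from a DGE result table by significance and fold-change cutoffs; the field names, gene identifiers, and values below are illustrative, not DEIVA's data model.

```python
# Sketch: locate genes in a DGE result table by adjusted p-value and
# absolute log2 fold change (toy records, hypothetical field names).
def significant(results, alpha=0.05, min_abs_lfc=1.0):
    return [r["gene"] for r in results
            if r["padj"] < alpha and abs(r["log2fc"]) >= min_abs_lfc]

dge = [
    {"gene": "GENE1", "log2fc":  2.3, "padj": 0.001},
    {"gene": "GENE2", "log2fc": -0.4, "padj": 0.020},  # significant but small
    {"gene": "GENE3", "log2fc": -1.8, "padj": 0.004},
    {"gene": "GENE4", "log2fc":  3.1, "padj": 0.240},  # large but not significant
]
print(significant(dge))  # → ['GENE1', 'GENE3']
```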

  18. Interactive Exploration and Analysis of Large-Scale Simulations Using Topology-Based Data Segmentation.

    PubMed

    Bremer, Peer-Timo; Weber, Gunther; Tierny, Julien; Pascucci, Valerio; Day, Marcus S; Bell, John B

    2011-09-01

Large-scale simulations are increasingly being used to study complex scientific and engineering phenomena. As a result, advanced visualization and data analysis are also becoming an integral part of the scientific process. Often, a key step in extracting insight from these large simulations involves the definition, extraction, and evaluation of features in the space and time coordinates of the solution. However, in many applications, these features involve a range of parameters and decisions that will affect the quality and direction of the analysis. Examples include particular level sets of a specific scalar field, or local inequalities between derived quantities. A critical step in the analysis is to understand how these arbitrary parameters/decisions impact the statistical properties of the features, since such a characterization will help to evaluate the conclusions of the analysis as a whole. We present a new topological framework that in a single pass extracts and encodes entire families of possible feature definitions as well as their statistical properties. For each time step we construct a hierarchical merge tree, a highly compact yet flexible feature representation. While this data structure is more than two orders of magnitude smaller than the raw simulation data, it allows us to extract a set of features for any given parameter selection in a postprocessing step. Furthermore, we augment the trees with additional attributes making it possible to gather a large number of useful global, local, as well as conditional statistics that would otherwise be extremely difficult to compile. We also use this representation to create tracking graphs that describe the temporal evolution of the features over time. Our system provides a linked-view interface to explore the time-evolution of the graph interactively alongside the segmentation, thus making it possible to perform extensive data analysis in a very efficient manner. 
We demonstrate our framework by extracting and analyzing burning cells from a large-scale turbulent combustion simulation. In particular, we show how the statistical analysis enabled by our techniques provides new insight into the combustion process.
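The per-threshold feature extraction that a merge tree supports can be illustrated in one dimension: a hypothetical sketch that groups cells above an arbitrary threshold into connected components ("burning cells"), which is the query a merge tree answers for every threshold at once.

```python
# Sketch (1-D, single threshold): extract connected superlevel-set
# components of a scalar field. Field values and threshold are toy data.
def superlevel_features(values, t):
    """Return (start, end, max_value) for each run of cells >= t."""
    feats, cur = [], None
    for i, v in enumerate(values):
        if v >= t:
            if cur is None:
                cur = [i, i, v]          # open a new component
            else:
                cur[1] = i               # extend the current component
                cur[2] = max(cur[2], v)
        elif cur is not None:
            feats.append(tuple(cur))
            cur = None
    if cur is not None:
        feats.append(tuple(cur))
    return feats

field = [0.1, 0.9, 0.8, 0.2, 0.7, 0.95, 0.3]
print(superlevel_features(field, 0.6))  # → [(1, 2, 0.9), (4, 5, 0.95)]
```

A merge tree encodes how such components appear and merge as the threshold varies, so no fixed threshold needs to be chosen before the analysis.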

  19. Hydrometeorological variability on a large french catchment and its relation to large-scale circulation across temporal scales

    NASA Astrophysics Data System (ADS)

    Massei, Nicolas; Dieppois, Bastien; Fritier, Nicolas; Laignel, Benoit; Debret, Maxime; Lavers, David; Hannah, David

    2015-04-01

In the present context of global changes, considerable efforts have been deployed by the hydrological scientific community to improve our understanding of the impacts of climate fluctuations on water resources. Both observational and modeling studies have been extensively employed to characterize hydrological changes and trends, assess the impact of climate variability or provide future scenarios of water resources. With the aim of better understanding hydrological changes, it is crucially important to determine how and to what extent trends and long-term oscillations detectable in hydrological variables are linked to global climate oscillations. In this work, we develop an approach associating large-scale/local-scale correlation, empirical statistical downscaling and wavelet multiresolution decomposition of monthly precipitation and streamflow over the Seine river watershed, and the North Atlantic sea level pressure (SLP), in order to gain additional insights into the atmospheric patterns associated with the regional hydrology. We hypothesized that: i) atmospheric patterns may change according to the different temporal wavelengths defining the variability of the signals; and ii) definition of those hydrological/circulation relationships for each temporal wavelength may improve the determination of large-scale predictors of local variations. The results showed that the large-scale/local-scale links were not necessarily constant according to time-scale (i.e., for the different frequencies characterizing the signals), resulting in changing spatial patterns across scales. This was then taken into account by developing an empirical statistical downscaling (ESD) modeling approach which integrated discrete wavelet multiresolution analysis for reconstructing local hydrometeorological processes (predictand: precipitation and streamflow on the Seine river catchment) based on a large-scale predictor (SLP over the Euro-Atlantic sector) on a monthly time-step. 
This approach basically consisted in (1) decomposing both signals (SLP field and precipitation or streamflow) using discrete wavelet multiresolution analysis and synthesis, (2) generating one statistical downscaling model per time-scale, and (3) summing up all scale-dependent models to obtain a final reconstruction of the predictand. The results obtained revealed a significant improvement of the reconstructions for both precipitation and streamflow when using the multiresolution ESD model instead of basic ESD; in addition, the scale-dependent spatial patterns associated with the model matched quite well those obtained from scale-dependent composite analysis. In particular, the multiresolution ESD model handled very well the significant changes in variance through time observed in either precipitation or streamflow. For instance, the post-1980 period, which had been characterized by particularly high amplitudes in interannual-to-interdecadal variability associated with flood and extremely low-flow/drought periods (e.g., winter 2001, summer 2003), could not be reconstructed without integrating wavelet multiresolution analysis into the model. Further investigations would be required to address the issue of the stationarity of the large-scale/local-scale relationships and to test the capability of the multiresolution ESD model for interannual-to-interdecadal forecasting. In terms of methodological approach, further investigations may concern a fully comprehensive sensitivity analysis of the modeling to the parameters of the multiresolution approach (different families of scaling and wavelet functions used, number of coefficients/degree of smoothness, etc.).
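The decompose/model/sum scheme can be sketched with a Haar multiresolution decomposition (a simplifying assumption; the wavelet family actually used is not specified here). Its additive components, one per scale plus the overall mean, reconstruct the signal exactly, which is what allows one downscaling model per scale whose predictions can then be summed.

```python
# Sketch: additive Haar multiresolution components of a dyadic-length
# series; they sum pointwise back to the original signal exactly.
def upsample(seq, n):
    """Repeat each coarse value so the component spans length n."""
    r = n // len(seq)
    out = []
    for v in seq:
        out += [v] * r
    return out

def haar_mra(x):
    """One detail component per scale plus the final smooth (the mean)."""
    n = len(x)
    comps = []
    approx = list(x)
    while len(approx) > 1:
        means, detail = [], []
        for i in range(0, len(approx), 2):
            m = (approx[i] + approx[i + 1]) / 2
            means.append(m)
            detail += [approx[i] - m, approx[i + 1] - m]
        comps.append(upsample(detail, n))
        approx = means
    comps.append([approx[0]] * n)        # coarsest smooth component
    return comps

x = [3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0, 6.0]   # toy monthly series
comps = haar_mra(x)
```

In the downscaling application, a separate SLP-based regression would be fit to each component (step 2) and the per-scale predictions summed (step 3).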

  20. Application of statistical mechanical methods to the modeling of social networks

    NASA Astrophysics Data System (ADS)

    Strathman, Anthony Robert

    With the recent availability of large-scale social data sets, social networks have become open to quantitative analysis via the methods of statistical physics. We examine the statistical properties of a real large-scale social network, generated from cellular phone call-trace logs. We find this network, like many other social networks, to be assortative (r = 0.31) and clustered (i.e., strongly transitive, C = 0.21). We measure fluctuation scaling to identify the presence of internal structure in the network and find that structural inhomogeneity effectively disappears at the scale of a few hundred nodes, though there is no sharp cutoff. We introduce an agent-based model of social behavior, designed to model the formation and dissolution of social ties. The model is a modified Metropolis algorithm containing agents operating under the basic sociological constraints of reciprocity, communication need and transitivity. The model introduces the concept of a social temperature. We go on to show that this simple model reproduces the global statistical network features (incl. assortativity, connected fraction, mean degree, clustering, and mean shortest path length) of the real network data and undergoes two phase transitions, one from a "gas" to a "liquid" state and the second from a liquid to a glassy state, as a function of this social temperature.
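A Metropolis tie-formation step with a social temperature can be sketched as follows. The toy Hamiltonian used here (each tie costs energy, each closed triangle, i.e. transitivity, is rewarded) is an invented assumption for illustration, not the dissertation's model, which also includes reciprocity and communication need.

```python
# Sketch: Metropolis dynamics on an undirected tie network at social
# temperature T, with an assumed edge-cost/triangle-reward energy.
import random
from math import exp

def energy(adj, edge_cost=1.0, tri_reward=0.5):
    """Assumed toy Hamiltonian: ties cost energy, triangles are rewarded."""
    n = len(adj)
    edges = sum(adj[i][j] for i in range(n) for j in range(i + 1, n))
    tris = sum(adj[i][j] * adj[j][k] * adj[i][k]
               for i in range(n) for j in range(i + 1, n)
               for k in range(j + 1, n))
    return edge_cost * edges - tri_reward * tris

def metropolis_step(adj, T):
    i, j = random.sample(range(len(adj)), 2)
    e0 = energy(adj)
    adj[i][j] = adj[j][i] = 1 - adj[i][j]        # propose toggling a tie
    dE = energy(adj) - e0
    if dE > 0 and random.random() >= exp(-dE / T):
        adj[i][j] = adj[j][i] = 1 - adj[i][j]    # reject: undo the toggle

random.seed(7)
n = 10
adj = [[0] * n for _ in range(n)]
for _ in range(1500):
    metropolis_step(adj, T=0.5)
```

Sweeping T in such a model is what exposes phase-transition-like changes in global network statistics.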

  1. Preliminary results from a method to update timber resource statistics in North Carolina

    Treesearch

    Glenn P. Catts; Noel D. Cost; Raymond L. Czaplewski; Paul W. Snook

    1987-01-01

    Forest Inventory and Analysis units of the USDA Forest Service produce timber resource statistics every 8 to 10 years. Midcycle surveys are often performed to update inventory estimates. This requires timely identification of forest lands. There are several kinds of remotely sensed data that are suitable for this purpose. Medium scale color infrared aerial photography...

  2. A statistical power analysis of woody carbon flux from forest inventory data

    Treesearch

    James A. Westfall; Christopher W. Woodall; Mark A. Hatfield

    2013-01-01

    At a national scale, the carbon (C) balance of numerous forest ecosystem C pools can be monitored using a stock change approach based on national forest inventory data. Given the potential influence of disturbance events and/or climate change processes, the statistical detection of changes in forest C stocks is paramount to maintaining the net sequestration status of...
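The kind of detection-power calculation at issue can be sketched as the power of a two-sided z-test for a true mean change in plot-level carbon stock; the effect size, plot standard deviation, and sample size below are invented for illustration.

```python
# Sketch: power of a two-sided z-test (alpha = 0.05) to detect a mean
# change delta, given a plot-level SD and n plots (all values assumed).
from math import erf, sqrt

def norm_cdf(z):
    return 0.5 * (1 + erf(z / sqrt(2)))

def power(delta, sd, n):
    z_crit = 1.96                      # two-sided alpha = 0.05
    shift = delta / (sd / sqrt(n))     # standardized true change
    return 1 - norm_cdf(z_crit - shift) + norm_cdf(-z_crit - shift)

# e.g. detecting a 0.5 Mg C/ha change with plot SD 5 and 1000 plots
p = power(0.5, 5.0, 1000)
```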

  3. An Asynchronous Many-Task Implementation of In-Situ Statistical Analysis using Legion.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pebay, Philippe Pierre; Bennett, Janine Camille

    2015-11-01

    In this report, we propose a framework for the design and implementation of in-situ analy- ses using an asynchronous many-task (AMT) model, using the Legion programming model together with the MiniAero mini-application as a surrogate for full-scale parallel scientific computing applications. The bulk of this work consists of converting the Learn/Derive/Assess model which we had initially developed for parallel statistical analysis using MPI [PTBM11], from a SPMD to an AMT model. In this goal, we propose an original use of the concept of Legion logical regions as a replacement for the parallel communication schemes used for the only operation ofmore » the statistics engines that require explicit communication. We then evaluate this proposed scheme in a shared memory environment, using the Legion port of MiniAero as a proxy for a full-scale scientific application, as a means to provide input data sets of variable size for the in-situ statistical analyses in an AMT context. We demonstrate in particular that the approach has merit, and warrants further investigation, in collaboration with ongoing efforts to improve the overall parallel performance of the Legion system.« less

  4. A new ionospheric storm scale based on TEC and foF2 statistics

    NASA Astrophysics Data System (ADS)

    Nishioka, Michi; Tsugawa, Takuya; Jin, Hidekatsu; Ishii, Mamoru

    2017-01-01

In this paper, we propose the I-scale, a new ionospheric storm scale for general users in various regions in the world. With the I-scale, ionospheric storms can be classified at any season, local time, and location. Since the ionospheric condition largely depends on many factors such as solar irradiance, energy input from the magnetosphere, and lower atmospheric activity, it had been difficult to scale ionospheric storms, which are mainly caused by solar and geomagnetic activities. In this study, statistical analysis was carried out for total electron content (TEC) and F2 layer critical frequency (foF2) in Japan for 18 years from 1997 to 2014. Seasonal, local time, and latitudinal dependences of TEC and foF2 variabilities are excluded by normalizing each percentage variation using their statistical standard deviations. The I-scale is defined by setting thresholds on the normalized values, yielding seven categories: I0, IP1, IP2, IP3, IN1, IN2, and IN3. I0 represents a quiet state, and IP1 (IN1), IP2 (IN2), and IP3 (IN3) represent moderate, strong, and severe positive (negative) storms, respectively. The proposed I-scale can be used for other locations, such as polar and equatorial regions. It is considered that the proposed I-scale can be a standardized scale to help the users to assess the impact of space weather on their systems.
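The classification logic can be sketched as normalizing a percentage deviation by its climatological standard deviation and binning the result. The 2/4/6-sigma thresholds below are invented for illustration; the paper defines its own thresholds.

```python
# Hypothetical sketch of I-scale-style binning: normalize a percentage
# deviation by its standard deviation, then assign one of seven
# categories (thresholds here are assumptions, not the paper's).
def i_scale(pct_deviation, sigma, thresholds=(2.0, 4.0, 6.0)):
    z = pct_deviation / sigma
    sign = "P" if z > 0 else "N"
    mag = sum(abs(z) >= t for t in thresholds)
    return "I0" if mag == 0 else f"I{sign}{mag}"

print(i_scale(5.0, 10.0))    # → I0  (quiet)
print(i_scale(25.0, 10.0))   # → IP1 (moderate positive storm)
print(i_scale(-45.0, 10.0))  # → IN2 (strong negative storm)
print(i_scale(70.0, 10.0))   # → IP3 (severe positive storm)
```

Normalizing by season- and local-time-dependent standard deviations is what makes the same thresholds usable at any location and time.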

  5. Analysis of the Empathic Concern Subscale of the Emotional Response Questionnaire in a Study Evaluating the Impact of a 3D Cultural Simulation.

    PubMed

    Everson, Naleya; Levett-Jones, Tracy; Pitt, Victoria; Lapkin, Samuel; Van Der Riet, Pamela; Rossiter, Rachel; Jones, Donovan; Gilligan, Conor; Courtney Pratt, Helen

    2018-04-25

Background: Empathic concern has been found to decline in health professional students, yet few effective educational programs exist and there is a lack of validated scales. Previous analyses of the Empathic Concern scale of the Emotional Response Questionnaire have reported both one and two latent constructs. Aim: To evaluate the impact of simulation on nursing students' empathic concern and test the psychometric properties of the Empathic Concern scale. Methods: The study used a one-group pre-test post-test design with a convenience sample of 460 nursing students. Empathic concern was measured pre- and post-simulation with the Empathic Concern scale, and factor analysis was undertaken to investigate the structure of the scale. Results: There was a statistically significant increase in Empathic Concern scores between pre-simulation 5.57 (SD = 1.04) and post-simulation 6.10 (SD = 0.95). Factor analysis of the Empathic Concern scale identified one latent dimension. Conclusion: Immersive simulation may promote empathic concern. The Empathic Concern scale measured a single latent construct in this cohort.

  6. Bayesian inference for joint modelling of longitudinal continuous, binary and ordinal events.

    PubMed

    Li, Qiuju; Pan, Jianxin; Belcher, John

    2016-12-01

In medical studies, repeated measurements of continuous, binary and ordinal outcomes are routinely collected from the same patient. Instead of modelling each outcome separately, in this study we propose to jointly model the trivariate longitudinal responses, so as to take account of the inherent association between the different outcomes and thus improve statistical inferences. This work is motivated by a large cohort study in the North West of England, involving trivariate responses from each patient: Body Mass Index, Depression (Yes/No) ascertained with a cut-off score of at least 8 on the Hospital Anxiety and Depression Scale, and Pain Interference generated from the Medical Outcomes Study 36-item short-form health survey with values returned on an ordinal scale 1-5. There are some well-established methods for combined continuous and binary, or even continuous and ordinal responses, but little work has been done on the joint analysis of continuous, binary and ordinal responses. We propose conditional joint random-effects models, which take into account the inherent association between the continuous, binary and ordinal outcomes. Bayesian analysis methods are used to make statistical inferences. Simulation studies show that, by jointly modelling the trivariate outcomes, standard deviations of the estimates of parameters in the models are smaller and much more stable, leading to more efficient parameter estimates and reliable statistical inferences. In the real data analysis, the proposed joint analysis yields a much smaller deviance information criterion value than the separate analysis, and shows other good statistical properties too. © The Author(s) 2014.

  7. Interpreting statistics of small lunar craters

    NASA Technical Reports Server (NTRS)

    Schultz, P. H.; Gault, D.; Greeley, R.

    1977-01-01

    Some of the wide variations in the crater-size distributions in lunar photography and in the resulting statistics were interpreted as different degradation rates on different surfaces, different scaling laws in different targets, and a possible population of endogenic craters. These possibilities are reexamined for statistics of 26 different regions. In contrast to most other studies, crater diameters as small as 5 m were measured from enlarged Lunar Orbiter framelets. According to the results of the reported analysis, the different crater distribution types appear to be most consistent with the hypotheses of differential degradation and a superposed crater population. Differential degradation can account for the low level of equilibrium in incompetent materials such as ejecta deposits, mantle deposits, and deep regoliths where scaling law changes and catastrophic processes introduce contradictions with other observations.

  8. Statistical considerations in the development of injury risk functions.

    PubMed

    McMurry, Timothy L; Poplin, Gerald S

    2015-01-01

    We address 4 frequently misunderstood and important statistical ideas in the construction of injury risk functions. These include the similarities of survival analysis and logistic regression, the correct scale on which to construct pointwise confidence intervals for injury risk, the ability to discern which form of injury risk function is optimal, and the handling of repeated tests on the same subject. The statistical models are explored through simulation and examination of the underlying mathematics. We provide recommendations for the statistically valid construction and correct interpretation of single-predictor injury risk functions. This article aims to provide useful and understandable statistical guidance to improve the practice in constructing injury risk functions.
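One of the points raised above, the correct scale for pointwise confidence intervals, can be illustrated for a logistic risk function: the interval is formed on the linear-predictor (logit) scale and then mapped through the logistic function, which keeps the band inside [0, 1]. The coefficients and standard error below are invented for illustration.

```python
# Sketch: pointwise CI for injury risk built on the logit scale and
# back-transformed (coefficients and SE are assumed toy values).
from math import exp

def logistic(z):
    return 1.0 / (1.0 + exp(-z))

def risk_with_ci(x, beta0, beta1, se_eta, z=1.96):
    eta = beta0 + beta1 * x                   # linear predictor
    lo, hi = eta - z * se_eta, eta + z * se_eta
    return logistic(eta), logistic(lo), logistic(hi)

risk, lo, hi = risk_with_ci(x=50.0, beta0=-6.0, beta1=0.1, se_eta=0.4)
```

Constructing the interval directly on the probability scale instead could produce bounds below 0 or above 1, which is one of the pitfalls the article addresses.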

  9. Statistical analysis of catalogs of extragalactic objects. II - The Abell catalog of rich clusters

    NASA Technical Reports Server (NTRS)

    Hauser, M. G.; Peebles, P. J. E.

    1973-01-01

    The results of a power-spectrum analysis are presented for the distribution of clusters in the Abell catalog. Clear and direct evidence is found for superclusters with small angular scale, in agreement with the recent study of Bogart and Wagoner (1973). It is also found that the degree and angular scale of the apparent superclustering varies with distance in the manner expected if the clustering is intrinsic to the spatial distribution rather than a consequence of patchy local obscuration.

  10. Development and validation of the irritable bowel syndrome scale under the system of quality of life instruments for chronic diseases QLICD-IBS: combinations of classical test theory and generalizability theory.

    PubMed

    Lei, Pingguang; Lei, Guanghe; Tian, Jianjun; Zhou, Zengfen; Zhao, Miao; Wan, Chonghua

    2014-10-01

This paper aims to develop the irritable bowel syndrome (IBS) scale of the system of Quality of Life Instruments for Chronic Diseases (QLICD-IBS) using a modular approach and to validate it by both classical test theory and generalizability theory. The QLICD-IBS was developed based on programmed decision procedures with multiple nominal and focus group discussions, in-depth interviews, and quantitative statistical procedures. One hundred twelve inpatients with IBS provided quality-of-life data, measured three times before and after treatment. The psychometric properties of the scale were evaluated with respect to validity, reliability, and responsiveness employing correlation analysis, factor analyses, multi-trait scaling analysis, t tests, and also G studies and D studies of generalizability theory analysis. Multi-trait scaling analysis, correlation, and factor analyses confirmed good construct validity and criterion-related validity when using SF-36 as a criterion. Test-retest reliability coefficients (Pearson r and intra-class correlation (ICC)) for the overall score and all domains were higher than 0.80; the internal consistency α for all domains at two measurements were higher than 0.70 except for the social domain (0.55 and 0.67, respectively). The overall score and scores for all domains/facets had statistically significant changes after treatments, with moderate or higher effect sizes (standardized response mean, SRM) ranging from 0.72 to 1.02 at domain levels. G coefficients and the index of dependability (Ф coefficients) further confirmed the reliability of the scale with more exact variance components. The QLICD-IBS has good validity, reliability, and responsiveness, offers some notable strengths, and can be used as a quality of life instrument for patients with IBS.
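The internal-consistency coefficient reported above (Cronbach's alpha) can be sketched directly from its definition; the 3-item, 5-respondent response matrix below is invented toy data, not the QLICD-IBS sample.

```python
# Sketch: Cronbach's alpha from a list of per-item score lists
# (one inner list per item, one entry per respondent; toy data).
def cronbach_alpha(items):
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = sum(var(it) for it in items)
    totals = [sum(it[r] for it in items) for r in range(n)]
    return k / (k - 1) * (1 - item_vars / var(totals))

items = [
    [3, 4, 2, 5, 4],
    [3, 5, 2, 4, 4],
    [2, 4, 3, 5, 4],
]
alpha = cronbach_alpha(items)
```

Values above 0.70, as reported for most QLICD-IBS domains, are conventionally taken to indicate acceptable internal consistency.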

  11. Comparison of direct numerical simulation databases of turbulent channel flow at Reτ = 180

    NASA Astrophysics Data System (ADS)

    Vreman, A. W.; Kuerten, J. G. M.

    2014-01-01

    Direct numerical simulation (DNS) databases are compared to assess the accuracy and reproducibility of standard and non-standard turbulence statistics of incompressible plane channel flow at Reτ = 180. Two fundamentally different DNS codes are shown to produce maximum relative deviations below 0.2% for the mean flow, below 1% for the root-mean-square velocity and pressure fluctuations, and below 2% for the three components of the turbulent dissipation. Relatively fine grids and long statistical averaging times are required. An analysis of dissipation spectra demonstrates that the enhanced resolution is necessary for an accurate representation of the smallest physical scales in the turbulent dissipation. The results are related to the physics of turbulent channel flow in several ways. First, the reproducibility supports the hitherto unproven theoretical hypothesis that the statistically stationary state of turbulent channel flow is unique. Second, the peaks of dissipation spectra provide information on length scales of the small-scale turbulence. Third, the computed means and fluctuations of the convective, pressure, and viscous terms in the momentum equation show the importance of the different forces in the momentum equation relative to each other. The Galilean transformation that leads to minimum peak fluctuation of the convective term is determined. Fourth, an analysis of higher-order statistics is performed. The skewness of the longitudinal derivative of the streamwise velocity is stronger than expected (-1.5 at y+ = 30). This skewness and also the strong near-wall intermittency of the normal velocity are related to coherent structures.

  12. Large-scale gene function analysis with the PANTHER classification system.

    PubMed

    Mi, Huaiyu; Muruganujan, Anushya; Casagrande, John T; Thomas, Paul D

    2013-08-01

    The PANTHER (protein annotation through evolutionary relationship) classification system (http://www.pantherdb.org/) is a comprehensive system that combines gene function, ontology, pathways and statistical analysis tools that enable biologists to analyze large-scale, genome-wide data from sequencing, proteomics or gene expression experiments. The system is built with 82 complete genomes organized into gene families and subfamilies, and their evolutionary relationships are captured in phylogenetic trees, multiple sequence alignments and statistical models (hidden Markov models or HMMs). Genes are classified according to their function in several different ways: families and subfamilies are annotated with ontology terms (Gene Ontology (GO) and PANTHER protein class), and sequences are assigned to PANTHER pathways. The PANTHER website includes a suite of tools that enable users to browse and query gene functions, and to analyze large-scale experimental data with a number of statistical tests. It is widely used by bench scientists, bioinformaticians, computer scientists and systems biologists. In the 2013 release of PANTHER (v.8.0), in addition to an update of the data content, we redesigned the website interface to improve both user experience and the system's analytical capability. This protocol provides a detailed description of how to analyze genome-wide experimental data with the PANTHER classification system.

  13. Rasch analysis suggested three unidimensional domains for Affiliate Stigma Scale: additional psychometric evaluation.

    PubMed

    Chang, Chih-Cheng; Su, Jian-An; Tsai, Ching-Shu; Yen, Cheng-Fang; Liu, Jiun-Horng; Lin, Chung-Ying

    2015-06-01

The aim was to examine the psychometrics of the Affiliate Stigma Scale using two rigorous psychometric approaches: classical test theory (CTT; traditional) and Rasch analysis (modern). Differential item functioning (DIF) items were also tested using Rasch analysis. Caregivers of relatives with mental illness (n = 453; mean age: 53.29 ± 13.50 years) were recruited from southern Taiwan. Each participant filled out three questionnaires (the Affiliate Stigma Scale, the Rosenberg Self-Esteem Scale, and the Beck Anxiety Inventory) plus a background information sheet. CTT analyses showed that the Affiliate Stigma Scale had satisfactory internal consistency (α = 0.85-0.94) and concurrent validity (Rosenberg Self-Esteem Scale: r = -0.52 to -0.46; Beck Anxiety Inventory: r = 0.27-0.34). Rasch analyses supported the unidimensionality of the three domains of the Affiliate Stigma Scale and indicated four DIF items (affect domain: 1; cognitive domain: 3) across gender. Our findings, based on rigorous statistical analysis, verified the psychometrics of the Affiliate Stigma Scale and identified its DIF items. We conclude that the three domains of the Affiliate Stigma Scale can be used separately and are suitable for measuring the affiliate stigma of caregivers of relatives with mental illness. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Rasch Based Analysis of Oral Proficiency Test Data.

    ERIC Educational Resources Information Center

    Nakamura, Yuji

    2001-01-01

    This paper examines the rating scale data of oral proficiency tests analyzed by a Rasch Analysis focusing on an item map and factor analysis. In discussing the item map, the difficulty order of six items and students' answering patterns are analyzed using descriptive statistics and measures of central tendency of test scores. The data ranks the…

  15. ANALYSIS TO ACCOUNT FOR SMALL AGE RANGE CATEGORIES IN DISTRIBUTIONS OF WATER CONSUMPTION AND BODY WEIGHT IN THE U.S. USING CSFII DATA

    EPA Science Inventory

    Statistical population based estimates of water ingestion play a vital role in many types of exposure and risk analysis. A significant large scale analysis of water ingestion by the population of the United States was recently completed and is documented in the report titled ...

  16. Robustness of Type I Error and Power in Set Correlation Analysis of Contingency Tables.

    ERIC Educational Resources Information Center

    Cohen, Jacob; Nee, John C. M.

    1990-01-01

    The analysis of contingency tables via set correlation allows the assessment of subhypotheses involving contrast functions of the categories of the nominal scales. The robustness of such methods with regard to Type I error and statistical power was studied via a Monte Carlo experiment. (TJH)

  17. Dependence of exponents on text length versus finite-size scaling for word-frequency distributions

    NASA Astrophysics Data System (ADS)

    Corral, Álvaro; Font-Clos, Francesc

    2017-08-01

    Some authors have recently argued that a finite-size scaling law for the text-length dependence of word-frequency distributions cannot be conceptually valid. Here we give solid quantitative evidence for the validity of this scaling law, using both careful statistical tests and analytical arguments based on the generalized central-limit theorem applied to the moments of the distribution (and obtaining a novel derivation of Heaps' law as a by-product). We also find that the picture of word-frequency distributions with power-law exponents that decrease with text length [X. Yan and P. Minnhagen, Physica A 444, 828 (2016), 10.1016/j.physa.2015.10.082] does not stand with rigorous statistical analysis. Instead, we show that the distributions are perfectly described by power-law tails with stable exponents, whose values are close to 2, in agreement with the classical Zipf's law. Some misconceptions about scaling are also clarified.
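Power-law tails with a stable exponent, as found in this record, can be estimated directly by maximum likelihood. A minimal numpy sketch (an illustrative Hill-type estimator run on synthetic data, not the authors' analysis pipeline):

```python
import numpy as np

def hill_tail_exponent(data, xmin):
    """MLE of alpha for a continuous power-law tail p(x) ~ x**(-alpha),
    x >= xmin (the Hill estimator)."""
    tail = np.asarray(data, dtype=float)
    tail = tail[tail >= xmin]
    return 1.0 + tail.size / np.log(tail / xmin).sum()

# Synthetic check: sample a pure power law with alpha = 2 (Zipf-like)
# by inverse-transform sampling, then recover the exponent.
rng = np.random.default_rng(42)
u = rng.random(50_000)
alpha_true = 2.0
samples = (1.0 - u) ** (-1.0 / (alpha_true - 1.0))  # xmin = 1
alpha_hat = hill_tail_exponent(samples, xmin=1.0)
```

For real word-frequency data, a careful choice of xmin and goodness-of-fit testing, of the kind the authors emphasize, are essential before trusting the estimate.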

  18. Scaling of global input-output networks

    NASA Astrophysics Data System (ADS)

    Liang, Sai; Qi, Zhengling; Qu, Shen; Zhu, Ji; Chiu, Anthony S. F.; Jia, Xiaoping; Xu, Ming

    2016-06-01

    Examining scaling patterns of networks can help understand how structural features relate to the behavior of the networks. Input-output networks consist of industries as nodes and inter-industrial exchanges of products as links. Previous studies consider limited measures for node strengths and link weights, and also ignore the impact of dataset choice. We consider a comprehensive set of indicators in this study that are important in economic analysis, and also examine the impact of dataset choice, by studying input-output networks in individual countries and the entire world. Results show that Burr, Log-Logistic, Log-normal, and Weibull distributions can better describe scaling patterns of global input-output networks. We also find that dataset choice has limited impacts on the observed scaling patterns. Our findings can help examine the quality of economic statistics, estimate missing data in economic statistics, and identify key nodes and links in input-output networks to support economic policymaking.
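Comparing candidate heavy-tailed distributions such as those above comes down to fitting each by maximum likelihood and checking goodness of fit. A hedged numpy-only sketch for the log-normal case (synthetic node strengths; the study also considered Burr, log-logistic, and Weibull fits, which follow the same pattern):

```python
import numpy as np
from math import erf, sqrt

def lognormal_mle(x):
    """MLE of (mu, sigma) for a log-normal sample."""
    logs = np.log(x)
    return logs.mean(), logs.std(ddof=0)

def ks_statistic_lognormal(x, mu, sigma):
    """One-sample Kolmogorov-Smirnov distance between the empirical CDF
    and the fitted log-normal CDF."""
    x = np.sort(x)
    n = x.size
    z = (np.log(x) - mu) / sigma
    cdf = np.array([0.5 * (1 + erf(v / sqrt(2))) for v in z])
    ecdf_hi = np.arange(1, n + 1) / n
    ecdf_lo = np.arange(0, n) / n
    return max(np.abs(cdf - ecdf_hi).max(), np.abs(cdf - ecdf_lo).max())

# Synthetic "node strengths" drawn from a known log-normal law.
rng = np.random.default_rng(1)
strengths = rng.lognormal(mean=2.0, sigma=0.7, size=5_000)
mu_hat, sigma_hat = lognormal_mle(strengths)
D = ks_statistic_lognormal(strengths, mu_hat, sigma_hat)
```

Fitting the rival distributions to the same sample and ranking them by KS distance (or log-likelihood) is the usual way such "better describe" statements are made concrete.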

  19. Is math anxiety in the secondary classroom limiting physics mastery? A study of math anxiety and physics performance

    NASA Astrophysics Data System (ADS)

    Mercer, Gary J.

    This quantitative study examined the relationship between secondary students with math anxiety and physics performance in an inquiry-based constructivist classroom. The Revised Math Anxiety Rating Scale was used to evaluate math anxiety levels. The results were then compared to the performance on a physics standardized final examination. A simple correlation was performed, followed by a multivariate regression analysis to examine effects based on gender and prior math background. The correlation showed statistical significance between math anxiety and physics performance. The regression analysis showed statistical significance for math anxiety, physics performance, and prior math background, but did not show statistical significance for math anxiety, physics performance, and gender.

  20. A meta-analysis and statistical modelling of nitrates in groundwater at the African scale

    NASA Astrophysics Data System (ADS)

    Ouedraogo, Issoufou; Vanclooster, Marnik

    2016-06-01

Contamination of groundwater with nitrate poses a major health risk to millions of people across Africa. Assessing the space-time distribution of this contamination, as well as understanding the factors that explain it, is important for managing sustainable drinking water at the regional scale. This study aims to assess the variables that contribute to nitrate pollution in groundwater at the African scale by statistical modelling. We compiled a literature database of nitrate concentrations in groundwater (around 250 studies) and combined it with digital maps of physical attributes such as soil, geology, climate, hydrogeology, and anthropogenic data for statistical model development. The maximum, mean, and minimum observed nitrate concentrations were analysed. In total, 13 explanatory variables were screened to explain the observed nitrate pollution in groundwater. For the mean nitrate concentration, four variables are retained in the statistical explanatory model: (1) depth to groundwater (shallow groundwater, typically < 50 m); (2) recharge rate; (3) aquifer type; and (4) population density. The first three variables represent the intrinsic vulnerability of groundwater systems to pollution, while the last is a proxy for anthropogenic pollution pressure. The model explains 65 % of the variation of mean nitrate contamination in groundwater at the African scale. Using the same proxy information, we could develop a statistical model for the maximum nitrate concentrations that explains 42 % of the nitrate variation. For the maximum concentrations, other environmental attributes such as soil type, slope, rainfall, climate class, and region type improve the prediction of maximum nitrate concentrations at the African scale. For the minimum nitrate concentrations, the data did not satisfy normality assumptions, so no statistical model was developed.
The data-based statistical model presented here represents an important step towards developing tools that will allow us to accurately predict nitrate distribution at the African scale and thus may support groundwater monitoring and water management aimed at protecting groundwater systems. Yet the models should be further refined and validated when more detailed and harmonized data become available, and/or combined with more conceptual descriptions of the fate of nutrients in the hydrosystem.
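An explanatory model of this kind is essentially a multiple regression whose R² gives the fraction of variance explained (the 65 % reported for mean nitrate). A minimal numpy sketch on synthetic data; the four predictor columns below merely stand in for the retained variables (depth to groundwater, recharge rate, aquifer type, population density), and all coefficients are invented:

```python
import numpy as np

def fit_ols(X, y):
    """Ordinary least squares with intercept; returns coefficients and R^2."""
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    ss_res = (resid ** 2).sum()
    ss_tot = ((y - y.mean()) ** 2).sum()
    return beta, 1.0 - ss_res / ss_tot

# Synthetic stand-in for ~250 literature sites with 4 predictors.
rng = np.random.default_rng(7)
n = 250
X = rng.normal(size=(n, 4))
y = (1.0 - 0.8 * X[:, 0] + 0.5 * X[:, 1] + 0.3 * X[:, 2] + 0.6 * X[:, 3]
     + rng.normal(scale=1.0, size=n))
beta, r2 = fit_ols(X, y)
```

With real data the categorical predictors (aquifer type, climate class) would enter as dummy variables, and R² would be judged against held-out observations rather than the fitting sample.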

  1. Quantifying predictability in a model with statistical features of the atmosphere

    PubMed Central

    Kleeman, Richard; Majda, Andrew J.; Timofeyev, Ilya

    2002-01-01

    The Galerkin truncated inviscid Burgers equation has recently been shown by the authors to be a simple model with many degrees of freedom, with many statistical properties similar to those occurring in dynamical systems relevant to the atmosphere. These properties include long time-correlated, large-scale modes of low frequency variability and short time-correlated “weather modes” at smaller scales. The correlation scaling in the model extends over several decades and may be explained by a simple theory. Here a thorough analysis of the nature of predictability in the idealized system is developed by using a theoretical framework developed by R.K. This analysis is based on a relative entropy functional that has been shown elsewhere by one of the authors to measure the utility of statistical predictions precisely. The analysis is facilitated by the fact that most relevant probability distributions are approximately Gaussian if the initial conditions are assumed to be so. Rather surprisingly this holds for both the equilibrium (climatological) and nonequilibrium (prediction) distributions. We find that in most cases the absolute difference in the first moments of these two distributions (the “signal” component) is the main determinant of predictive utility variations. Contrary to conventional belief in the ensemble prediction area, the dispersion of prediction ensembles is generally of secondary importance in accounting for variations in utility associated with different initial conditions. This conclusion has potentially important implications for practical weather prediction, where traditionally most attention has focused on dispersion and its variability. PMID:12429863
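For Gaussian distributions, the relative entropy used here splits exactly into a dispersion part and a "signal" part driven by the difference in means, which is what makes the signal-dominance conclusion easy to check. A small sketch for the one-dimensional case (the numeric values are illustrative only, not the paper's):

```python
import numpy as np

def gaussian_relative_entropy(mu_p, var_p, mu_q, var_q):
    """KL(p||q) for 1-D Gaussians, split into a 'dispersion' part and a
    'signal' part driven by the shift of the prediction mean mu_p away
    from the climatological mean mu_q."""
    dispersion = 0.5 * (np.log(var_q / var_p) + var_p / var_q - 1.0)
    signal = 0.5 * (mu_p - mu_q) ** 2 / var_q
    return dispersion + signal, signal, dispersion

# Prediction vs. climatology: a shifted mean with nearly unchanged
# spread, so the signal term dominates the predictive utility.
total, signal, dispersion = gaussian_relative_entropy(
    mu_p=1.0, var_p=0.9, mu_q=0.0, var_q=1.0)
```

When the prediction ensemble barely sharpens the climatological spread (var_p close to var_q), the dispersion term is second order while the signal term is not, mirroring the paper's finding.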

  2. Tract-Based Spatial Statistics in Preterm-Born Neonates Predicts Cognitive and Motor Outcomes at 18 Months.

    PubMed

    Duerden, E G; Foong, J; Chau, V; Branson, H; Poskitt, K J; Grunau, R E; Synnes, A; Zwicker, J G; Miller, S P

    2015-08-01

    Adverse neurodevelopmental outcome is common in children born preterm. Early sensitive predictors of neurodevelopmental outcome such as MR imaging are needed. Tract-based spatial statistics, a diffusion MR imaging analysis method, performed at term-equivalent age (40 weeks) is a promising predictor of neurodevelopmental outcomes in children born very preterm. We sought to determine the association of tract-based spatial statistics findings before term-equivalent age with neurodevelopmental outcome at 18-months corrected age. Of 180 neonates (born at 24-32-weeks' gestation) enrolled, 153 had DTI acquired early at 32 weeks' postmenstrual age and 105 had DTI acquired later at 39.6 weeks' postmenstrual age. Voxelwise statistics were calculated by performing tract-based spatial statistics on DTI that was aligned to age-appropriate templates. At 18-month corrected age, 166 neonates underwent neurodevelopmental assessment by using the Bayley Scales of Infant Development, 3rd ed, and the Peabody Developmental Motor Scales, 2nd ed. Tract-based spatial statistics analysis applied to early-acquired scans (postmenstrual age of 30-33 weeks) indicated a limited significant positive association between motor skills and axial diffusivity and radial diffusivity values in the corpus callosum, internal and external/extreme capsules, and midbrain (P < .05, corrected). In contrast, for term scans (postmenstrual age of 37-41 weeks), tract-based spatial statistics analysis showed a significant relationship between both motor and cognitive scores with fractional anisotropy in the corpus callosum and corticospinal tracts (P < .05, corrected). Tract-based spatial statistics in a limited subset of neonates (n = 22) scanned at <30 weeks did not significantly predict neurodevelopmental outcomes. 
The strength of the association between fractional anisotropy values and neurodevelopmental outcome scores increased from early-to-late-acquired scans in preterm-born neonates, consistent with brain dysmaturation in this population. © 2015 by American Journal of Neuroradiology.

  3. An analysis of the cognitive deficit of schizophrenia based on the Piaget developmental theory.

    PubMed

    Torres, Alejandro; Olivares, Jose M; Rodriguez, Angel; Vaamonde, Antonio; Berrios, German E

    2007-01-01

    The objective of the study was to evaluate from the perspective of the Piaget developmental model the cognitive functioning of a sample of patients diagnosed with schizophrenia. Fifty patients with schizophrenia (Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition) and 40 healthy matched controls were evaluated by means of the Longeot Logical Thought Evaluation Scale. Only 6% of the subjects with schizophrenia reached the "formal period," and 70% remained at the "concrete operations" stage. The corresponding figures for the control sample were 25% and 15%, respectively. These differences were statistically significant. The samples were specifically differentiable on the permutation, probabilities, and pendulum tests of the scale. The Longeot Logical Thought Evaluation Scale can discriminate between subjects with schizophrenia and healthy controls.

  4. The Chinese version of the Outcome Expectations for Exercise scale: validation study.

    PubMed

    Lee, Ling-Ling; Chiu, Yu-Yun; Ho, Chin-Chih; Wu, Shu-Chen; Watson, Roger

    2011-06-01

The English nine-item Outcome Expectations for Exercise (OEE) scale has been tested and found to be reliable and valid for use in various settings, particularly among older people, with good internal consistency and validity. Data on the use of the OEE scale among older Chinese people living in the community, and on how cultural differences might affect its administration, are limited. The aim was to test the validity and reliability of the Chinese version of the Outcome Expectations for Exercise scale among older people. A cross-sectional validation study was designed to test the Chinese version of the OEE scale (OEE-C). Reliability was examined by testing both the internal consistency of the overall scale and the squared multiple correlation coefficient for the single-item measure. The validity of the scale was tested on the basis of both a traditional psychometric test and a confirmatory factor analysis using structural equation modelling. The Mokken Scaling Procedure (MSP) was used to investigate whether there were any hierarchical, cumulative sets of items in the measure. The OEE-C scale was tested in a group of older people in Taiwan (n=108, mean age=77.1). There was acceptable internal consistency (alpha=.85) and model fit in the scale. Evidence of the validity of the measure was demonstrated by the tests for criterion-related validity and construct validity. There was a statistically significant correlation between exercise outcome expectations and exercise self-efficacy (r=.34, p<.01). A Mokken Scaling Procedure analysis found that all nine items of the scale were retained and that the resulting scale was reliable and statistically significant (p=.0008). The results obtained in the present study provide acceptable evidence of reliability and validity for the Chinese Outcome Expectations for Exercise scale when used with older people in Taiwan.
Future testing of the OEE-C scale needs to be carried out to see whether these results are generalisable to older Chinese people living in urban areas. Copyright © 2010 Elsevier Ltd. All rights reserved.

  5. Interdisciplinary evaluation of dysphagia: clinical swallowing evaluation and videoendoscopy of swallowing.

    PubMed

    Sordi, Marina de; Mourão, Lucia Figueiredo; Silva, Ariovaldo Armando da; Flosi, Luciana Claudia Leite

    2009-01-01

Patients with dysphagia have impairments in many aspects, and an interdisciplinary approach is fundamental to defining diagnosis and treatment. A joint approach to the clinical and videoendoscopic evaluations is paramount. To study the correlation between the clinical assessment (ACD) and the videoendoscopic (VED) assessment of swallowing by classifying the degree of severity and through qualitative/descriptive analyses of the procedures. Cross-sectional, descriptive, and comparative. Held from March to December of 2006, at the otolaryngology/dysphagia ward of a hospital in the countryside of São Paulo. Thirty dysphagic patients with different disorders were assessed by ACD and VED. The data were classified by means of severity scales and qualitative/descriptive analysis. The correlation between the ACD and VED severity scales pointed to a statistically significant low agreement (kappa = 0.4) (p = 0.006). The correlation between the qualitative/descriptive analyses pointed to an excellent and statistically significant agreement (kappa = 0.962) (p < 0.001) for the entire sample. The low agreement between the severity scales points to the need to perform both procedures, reinforcing VED as a doable procedure. The descriptive qualitative analysis pointed to an excellent agreement, and such data reinforce our need to understand swallowing as a process.

  6. Seeking a fingerprint: analysis of point processes in actigraphy recording

    NASA Astrophysics Data System (ADS)

    Gudowska-Nowak, Ewa; Ochab, Jeremi K.; Oleś, Katarzyna; Beldzik, Ewa; Chialvo, Dante R.; Domagalik, Aleksandra; Fąfrowicz, Magdalena; Marek, Tadeusz; Nowak, Maciej A.; Ogińska, Halszka; Szwed, Jerzy; Tyburczyk, Jacek

    2016-05-01

    Motor activity of humans displays complex temporal fluctuations which can be characterised by scale-invariant statistics, thus demonstrating that structure and fluctuations of such kinetics remain similar over a broad range of time scales. Previous studies on humans regularly deprived of sleep or suffering from sleep disorders predicted a change in the invariant scale parameters with respect to those for healthy subjects. In this study we investigate the signal patterns from actigraphy recordings by means of characteristic measures of fractional point processes. We analyse spontaneous locomotor activity of healthy individuals recorded during a week of regular sleep and a week of chronic partial sleep deprivation. Behavioural symptoms of lack of sleep can be evaluated by analysing statistics of duration times during active and resting states, and alteration of behavioural organisation can be assessed by analysis of power laws detected in the event count distribution, distribution of waiting times between consecutive movements and detrended fluctuation analysis of recorded time series. We claim that among different measures characterising complexity of the actigraphy recordings and their variations implied by chronic sleep distress, the exponents characterising slopes of survival functions in resting states are the most effective biomarkers distinguishing between healthy and sleep-deprived groups.
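The survival-function slopes proposed here as biomarkers are simple to estimate: build the empirical survival function of the state durations and fit its tail in log-log coordinates. A hedged numpy sketch on synthetic Pareto-distributed durations (not the study's actigraphy data):

```python
import numpy as np

def survival_function(durations):
    """Empirical survival function S(t) = P(T > t) at the sorted durations."""
    t = np.sort(np.asarray(durations, dtype=float))
    s = 1.0 - np.arange(1, t.size + 1) / t.size
    return t, s

def loglog_slope(t, s, s_min=0.01):
    """Least-squares slope of log S(t) vs log t, skipping the noisy
    extreme tail where S(t) < s_min."""
    mask = s >= s_min
    return np.polyfit(np.log(t[mask]), np.log(s[mask]), 1)[0]

# Synthetic resting-state durations drawn from a Pareto law with
# S(t) = t**-1.5 for t >= 1, so the fitted slope should be near -1.5.
rng = np.random.default_rng(3)
durations = (1.0 - rng.random(20_000)) ** (-1.0 / 1.5)
t, s = survival_function(durations)
slope = loglog_slope(t, s)
```

Comparing such fitted slopes between groups (here, rested vs. sleep-deprived) is the kind of discrimination the abstract describes.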

  7. Characteristics of At-Risk Students in NELS:88. National Education Longitudinal Study of 1988. Statistical Analysis Report. Contractor Report. NCES 92-042

    ERIC Educational Resources Information Center

    Kaufman, Phillip; Bradbury, Denise

    1992-01-01

    The National Education Longitudinal Study of 1988 (NELS:88) is a large-scale, national longitudinal study designed and sponsored by the National Center for Education Statistics (NCES), with support from other government agencies. Beginning in the spring of 1988 with a cohort of eighth graders (25,000) attending public and private schools across…

  8. Detecting and characterizing high-frequency oscillations in epilepsy: a case study of big data analysis

    NASA Astrophysics Data System (ADS)

    Huang, Liang; Ni, Xuan; Ditto, William L.; Spano, Mark; Carney, Paul R.; Lai, Ying-Cheng

    2017-01-01

    We develop a framework to uncover and analyse dynamical anomalies from massive, nonlinear and non-stationary time series data. The framework consists of three steps: preprocessing of massive datasets to eliminate erroneous data segments, application of the empirical mode decomposition and Hilbert transform paradigm to obtain the fundamental components embedded in the time series at distinct time scales, and statistical/scaling analysis of the components. As a case study, we apply our framework to detecting and characterizing high-frequency oscillations (HFOs) from a big database of rat electroencephalogram recordings. We find a striking phenomenon: HFOs exhibit on-off intermittency that can be quantified by algebraic scaling laws. Our framework can be generalized to big data-related problems in other fields such as large-scale sensor data and seismic data analysis.

  9. The Genomic HyperBrowser: an analysis web server for genome-scale data

    PubMed Central

    Sandve, Geir K.; Gundersen, Sveinung; Johansen, Morten; Glad, Ingrid K.; Gunathasan, Krishanthi; Holden, Lars; Holden, Marit; Liestøl, Knut; Nygård, Ståle; Nygaard, Vegard; Paulsen, Jonas; Rydbeck, Halfdan; Trengereid, Kai; Clancy, Trevor; Drabløs, Finn; Ferkingstad, Egil; Kalaš, Matúš; Lien, Tonje; Rye, Morten B.; Frigessi, Arnoldo; Hovig, Eivind

    2013-01-01

    The immense increase in availability of genomic scale datasets, such as those provided by the ENCODE and Roadmap Epigenomics projects, presents unprecedented opportunities for individual researchers to pose novel falsifiable biological questions. With this opportunity, however, researchers are faced with the challenge of how to best analyze and interpret their genome-scale datasets. A powerful way of representing genome-scale data is as feature-specific coordinates relative to reference genome assemblies, i.e. as genomic tracks. The Genomic HyperBrowser (http://hyperbrowser.uio.no) is an open-ended web server for the analysis of genomic track data. Through the provision of several highly customizable components for processing and statistical analysis of genomic tracks, the HyperBrowser opens for a range of genomic investigations, related to, e.g., gene regulation, disease association or epigenetic modifications of the genome. PMID:23632163

  11. Ensuring Positiveness of the Scaled Difference Chi-square Test Statistic.

    PubMed

    Satorra, Albert; Bentler, Peter M

    2010-06-01

A scaled difference test statistic T̃(d) that can be computed by hand from the standard output of structural equation modeling (SEM) software was proposed in Satorra and Bentler (2001). The statistic T̃(d) is asymptotically equivalent to the scaled difference test statistic T̄(d) introduced in Satorra (2000), which requires more involved computations beyond the standard output of SEM software. The test statistic T̃(d) has been widely used in practice, but in some applications it is negative due to negativity of its associated scaling correction. Using the implicit function theorem, this note develops an improved scaling correction leading to a new scaled difference statistic T̄(d) that avoids negative chi-square values.
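The hand calculation referred to here is well known: with nested models' uncorrected chi-squares T0 and T1, degrees of freedom d0 > d1, and scaling correction factors c0 and c1, the 2001 statistic divides the chi-square difference by cd = (d0·c0 − d1·c1)/(d0 − d1), and it is this cd that can turn negative. A small sketch (the numeric values are made up):

```python
def scaled_difference_chisq(t0, c0, df0, t1, c1, df1):
    """Satorra-Bentler (2001) scaled chi-square difference test,
    computed by hand from standard SEM output.

    t0, t1 -- uncorrected ML chi-square values of the nested models
    c0, c1 -- their scaling correction factors (requires df0 > df1)
    Returns (scaled difference statistic, df difference, correction cd)."""
    dd = df0 - df1
    # cd can turn negative in practice -- the problem the note addresses.
    cd = (df0 * c0 - df1 * c1) / dd
    return (t0 - t1) / cd, dd, cd

# Worked example with made-up SEM output values.
td, dd, cd = scaled_difference_chisq(t0=120.0, c0=1.20, df0=40,
                                     t1=90.0, c1=1.10, df1=35)
```

The scaled statistic td is then referred to a chi-square distribution with dd degrees of freedom; the note's improved correction replaces cd so the result cannot go negative.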

  12. Environmental Studies: Mathematical, Computational and Statistical Analyses

    DTIC Science & Technology

    1993-03-03

    mathematical analysis addresses the seasonally and longitudinally averaged circulation which is under the influence of a steady forcing located asymmetrically...employed, as has been suggested for some situations. A general discussion of how interfacial phenomena influence both the original contamination process...describing the large-scale advective and dispersive behaviour of contaminants transported by groundwater and the uncertainty associated with field-scale

  13. On the self-preservation of turbulent jet flows with variable viscosity

    NASA Astrophysics Data System (ADS)

    Danaila, Luminita; Gauding, Michael; Varea, Emilien; Turbulence; mixing Team

    2017-11-01

The concept of self-preservation has played an important role in shaping the understanding of turbulent flows. The assumption of complete self-preservation imposes certain constraints on the dynamics of the flow, allowing one-point or two-point statistics to be expressed through an appropriately chosen unique length scale. Determining this length scale and its scaling is of high relevance for modeling. In this work, we study turbulent jet flows with variable viscosity from the self-preservation perspective. Turbulent flows encountered in engineering and environmental applications are often characterized by fluctuations of viscosity resulting, for instance, from variations of temperature or species composition. Starting from the transport equation for the moments of the mixture fraction increment, constraints for self-preservation are derived. The analysis is based on direct numerical simulations of turbulent jet flows in which the viscosity of the host and jet fluids differs. It is shown that fluctuations of viscosity do not affect the decay exponents of the turbulent energy or the dissipation but modify the scaling of two-point statistics in the dissipative range. Moreover, the analysis reveals that complete self-preservation in turbulent flows with variable viscosity cannot be achieved. Financial support from Labex EMC3 and FEDER is gratefully acknowledged.

  14. A space-time multifractal analysis on radar rainfall sequences from central Poland

    NASA Astrophysics Data System (ADS)

    Licznar, Paweł; Deidda, Roberto

    2014-05-01

    Rainfall downscaling belongs to most important tasks of modern hydrology. Especially from the perspective of urban hydrology there is real need for development of practical tools for possible rainfall scenarios generation. Rainfall scenarios of fine temporal scale reaching single minutes are indispensable as inputs for hydrological models. Assumption of probabilistic philosophy of drainage systems design and functioning leads to widespread application of hydrodynamic models in engineering practice. However models like these covering large areas could not be supplied with only uncorrelated point-rainfall time series. They should be rather supplied with space time rainfall scenarios displaying statistical properties of local natural rainfall fields. Implementation of a Space-Time Rainfall (STRAIN) model for hydrometeorological applications in Polish conditions, such as rainfall downscaling from the large scales of meteorological models to the scale of interest for rainfall-runoff processes is the long-distance aim of our research. As an introduction part of our study we verify the veracity of the following STRAIN model assumptions: rainfall fields are isotropic and statistically homogeneous in space; self-similarity holds (so that, after having rescaled the time by the advection velocity, rainfall is a fully homogeneous and isotropic process in the space-time domain); statistical properties of rainfall are characterized by an "a priori" known multifractal behavior. We conduct a space-time multifractal analysis on radar rainfall sequences selected from the Polish national radar system POLRAD. Radar rainfall sequences covering the area of 256 km x 256 km of original 2 km x 2 km spatial resolution and 15 minutes temporal resolution are used as study material. Attention is mainly focused on most severe summer convective rainfalls. It is shown that space-time rainfall can be considered with a good approximation to be a self-similar multifractal process. 
Multifractal analysis is carried out assuming Taylor's hypothesis to hold; the advection velocity needed to rescale the time dimension is assumed to be approximately 16 km/h. This assumption is verified by analyzing autocorrelation functions along the x and y directions of "rainfall cubes" and along the time axis rescaled with the assumed advection velocity. In general, for the analyzed rainfall sequences, scaling is observed for spatial scales ranging from 4 to 256 km and for time scales from 15 min to 16 hours. In most cases, however, a scaling break is identified for spatial scale levels between 4 and 8, corresponding to spatial dimensions of 16 km to 32 km. It is assumed that the occurrence of the scaling break at these particular scales under central Poland conditions could be at least partly explained by the rainfall mesoscale gap (on the edge of the meso-gamma, storm-scale, and meso-beta scale).
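    As a rough illustration of the kind of moment-scaling analysis used in such studies, the sketch below (entirely synthetic data, not the POLRAD sequences) estimates structure-function scaling exponents ζ(q) for a one-dimensional series: a ζ(q) that is linear in q indicates simple self-similar (monofractal) scaling, while curvature in ζ(q) indicates multifractality.

```python
import numpy as np

def structure_function_exponents(x, qs, lags):
    """Estimate scaling exponents zeta(q) from structure functions
    S_q(tau) = <|x(t+tau) - x(t)|^q> ~ tau^zeta(q).
    Linear zeta(q): monofractal; curved zeta(q): multifractal."""
    log_lags = np.log(lags)
    zeta = []
    for q in qs:
        # log of the q-th absolute moment of increments at each lag
        logS = [np.log(np.mean(np.abs(x[lag:] - x[:-lag]) ** q)) for lag in lags]
        slope, _ = np.polyfit(log_lags, logS, 1)  # log-log slope = zeta(q)
        zeta.append(slope)
    return np.array(zeta)

rng = np.random.default_rng(0)
walk = np.cumsum(rng.standard_normal(20000))  # Brownian motion: zeta(q) = q/2
qs = np.array([0.5, 1.0, 1.5, 2.0, 3.0])
zeta = structure_function_exponents(walk, qs, lags=np.arange(1, 65, 4))
print(np.round(zeta, 2))  # approximately q/2 for this monofractal test signal
```

    For real rainfall fields the analysis is carried out over space and rescaled time jointly; this one-dimensional version only shows the moment-fitting mechanics.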

  15. A study on phenomenology of Dhat syndrome in men in a general medical setting.

    PubMed

    Prakash, Sathya; Sharan, Pratap; Sood, Mamta

    2016-01-01

    "Dhat syndrome" is believed to be a culture-bound syndrome of the Indian subcontinent. Although numerous studies have been performed, many have methodological limitations, and there is a lack of agreement in many areas. The aim was to study the phenomenology of "Dhat syndrome" in men and to explore the possibility of subtypes within this entity. This is a cross-sectional descriptive study conducted at a sex and marriage counseling clinic of a tertiary care teaching hospital in Northern India. An operational definition and assessment instrument for "Dhat syndrome" was developed after consulting all concerned stakeholders and reviewing the literature. It was applied to 100 patients, along with a socio-demographic profile, the Hamilton Depression Rating Scale, the Hamilton Anxiety Rating Scale, the Mini International Neuropsychiatric Interview, and the Postgraduate Institute Neuroticism Scale. For statistical analysis, descriptive statistics, group comparisons, and Pearson's product-moment correlations were carried out. Factor analysis and cluster analysis were done to determine the factor structure and subtypes of "Dhat syndrome." A diagnostic and assessment instrument for "Dhat syndrome" was developed, and the phenomenology in 100 patients is described. Both the health beliefs scale and the associated symptoms scale demonstrated a three-factor structure. The patients with "Dhat syndrome" could be categorized into three clusters based on severity. There appears to be significant agreement among various stakeholders on the phenomenology of "Dhat syndrome," although some differences exist. "Dhat syndrome" could be subtyped into three clusters based on severity.

  16. Statistical downscaling of GCM simulations to streamflow using relevance vector machine

    NASA Astrophysics Data System (ADS)

    Ghosh, Subimal; Mujumdar, P. P.

    2008-01-01

    General circulation models (GCMs), the climate models often used in assessing the impact of climate change, operate on a coarse scale, and thus the simulation results obtained from GCMs are not directly useful for hydrology at the comparatively smaller river-basin scale. This article presents a statistical downscaling methodology based on sparse Bayesian learning and the Relevance Vector Machine (RVM) to model streamflow at the river-basin scale for the monsoon period (June, July, August, September) using GCM-simulated climatic variables. NCEP/NCAR reanalysis data are used to train the model, establishing a statistical relationship between streamflow and climatic variables. The relationship thus obtained is used to project future streamflow from GCM simulations. The statistical methodology involves principal component analysis, fuzzy clustering, and the RVM, with different kernel functions used for comparison. The model is applied to the Mahanadi river basin in India. The results obtained using the RVM are compared with those of the state-of-the-art Support Vector Machine (SVM) to present the advantages of RVMs over SVMs. A decreasing trend is observed for the monsoon streamflow of the Mahanadi, attributable to projected surface warming under the CCSR/NIES GCM and the B2 scenario.
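    Scikit-learn has no RVM implementation, but its ARDRegression estimator uses the same sparse Bayesian learning principle. The sketch below mirrors the PCA-plus-sparse-Bayesian-regression idea on purely invented stand-in data (the array sizes, latent structure, and "predictor"/"streamflow" labels are assumptions for illustration, not taken from the study):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import ARDRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
# Synthetic stand-in: 40 gridded "climate predictors" driven by 5 latent factors,
# and a "streamflow" target depending on one latent factor.
latent = rng.standard_normal((300, 5))
X = latent @ rng.standard_normal((5, 40)) + 0.05 * rng.standard_normal((300, 40))
y = 2.0 * latent[:, 0] + 0.1 * rng.standard_normal(300)

# PCA compresses the correlated predictor field; ARD learns a sparse linear map.
model = make_pipeline(PCA(n_components=5), ARDRegression())
model.fit(X[:200], y[:200])
r2 = model.score(X[200:], y[200:])  # out-of-sample R^2
print(round(r2, 2))
```

    A kernelized RVM would additionally map the compressed predictors through a kernel before the sparse Bayesian step; this linear version only shows the pipeline shape.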

  17. Impact of Rating Scale Categories on Reliability and Fit Statistics of the Malay Spiritual Well-Being Scale using Rasch Analysis.

    PubMed

    Daher, Aqil Mohammad; Ahmad, Syed Hassan; Winn, Than; Selamat, Mohd Ikhsan

    2015-01-01

    Few studies have employed item response theory in examining reliability. We conducted this study to examine the effect of Rating Scale Categories (RSCs) on the reliability and fit statistics of the Malay Spiritual Well-Being Scale (SWBS), employing the Rasch model. The Malay SWBS, with the original six RSCs and with newly structured three- and four-category RSCs, was distributed randomly among three different samples of 50 participants each. The mean age of respondents in the three samples ranged between 36 and 39 years. The majority in all samples were female, and Islam was the most prevalent religion among the respondents. The predominant race was Malay, followed by Chinese and Indian. The original six RSCs indicated the best targeting (0.99) and the smallest model error (0.24). The infit mean square (Mnsq) and Z-standard (Zstd) statistics of the six RSCs were 1.1 and -0.1, respectively. The six RSCs achieved the highest person and item reliabilities, 0.86 and 0.85 respectively. These reliabilities yielded the highest person (2.46) and item (2.38) separation indices compared with the other RSCs. The person and item reliability and, to a lesser extent, the fit statistics were better with the six RSCs than with the four and three RSCs.

  18. Practical statistics in pain research.

    PubMed

    Kim, Tae Kyun

    2017-10-01

    Pain is subjective, while the statistics related to pain research are objective. This review was written to help researchers involved in pain research make statistical decisions. The main issues concern the level of measurement of the scales often used in pain research, the choice between parametric and nonparametric statistical methods, and problems arising from repeated measurements. In the field of pain research, parametric statistics have often been applied erroneously. This is closely related to the scales of the data and to repeated measurements. The levels of measurement are nominal, ordinal, interval, and ratio, and the level of a scale affects the choice between parametric and nonparametric methods. The most frequently used pain assessment scale in pain research is the ordinal scale, which includes the visual analogue scale (VAS). Another view, however, considers the VAS to be an interval or ratio scale, so that the use of parametric statistics is practically acceptable in some cases. Repeated measurements of the same subjects always complicate statistics: the measurements are inevitably correlated with each other, which precludes the application of one-way ANOVA, in which independence between the measurements is necessary. Repeated-measures ANOVA (RM-ANOVA), however, permits comparison between correlated measurements as long as the sphericity assumption is satisfied. In conclusion, parametric statistical methods should be used only when the assumptions of parametric statistics, such as normality and sphericity, are established.
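    The parametric-versus-nonparametric choice described above can be illustrated with SciPy on simulated paired pain scores (all numbers invented): the Wilcoxon signed-rank test is the nonparametric choice for ordinal pain data, the paired t-test is the parametric alternative, and Shapiro-Wilk screens the normality of the paired differences.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical ordinal NRS pain scores (0-10) before and after an intervention
before = rng.integers(4, 9, size=30)
after = np.clip(before - rng.integers(0, 4, size=30), 0, 10)

# Nonparametric choice for ordinal, paired data: Wilcoxon signed-rank test
w_stat, w_p = stats.wilcoxon(before, after)
# Parametric alternative (only if one treats the scale as interval/ratio)
t_stat, t_p = stats.ttest_rel(before, after)
# Screen normality of the paired differences before trusting the t-test
sh_stat, sh_p = stats.shapiro(before - after)
print(w_p < 0.05, t_p < 0.05)
```

    For more than two repeated measurements, the analogous pairing is RM-ANOVA (with a sphericity check) versus the nonparametric Friedman test.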

  19. Study on the Measurement and Calculation of Environmental Pollution Bearing Index of China’s Pig Scale

    NASA Astrophysics Data System (ADS)

    Leng, Bi-Bin; Gong, Jian; Zhang, Wen-bo; Ji, Xue-Qiang

    2017-11-01

    Motivated by the environmental pollution caused by large-scale pig breeding, SPSS statistical software and the factor analysis method were used to calculate the environmental pollution bearing index of China's scale pig breeding from 2006 to 2015. The results showed that, as the scale of breeding increased, the density of live pig farming and the amount of fertilizer applied in agricultural production increased. However, owing to improved national environmental awareness, industrial wastewater discharge was greatly reduced. Overall, China's hog farming environmental pollution bearing index is rising.

  20. Macro scale models for freight railroad terminals.

    DOT National Transportation Integrated Search

    2016-03-02

    The project developed a yard capacity model for macro-level analysis. The study considers the detailed sequencing and scheduling in classification yards and their impacts on yard capacity, simulates typical freight railroad terminals, and statistic...

  1. Multi-scale Characterization and Modeling of Surface Slope Probability Distribution for ~20-km Diameter Lunar Craters

    NASA Astrophysics Data System (ADS)

    Mahanti, P.; Robinson, M. S.; Boyd, A. K.

    2013-12-01

    Craters of ~20-km diameter and above significantly shaped the lunar landscape. The statistical nature of the slope distribution on their walls and floors dominates the overall slope distribution statistics for the lunar surface. Slope statistics are inherently useful for characterizing the current topography of the surface, determining accurate photometric and surface scattering properties, and defining lunar surface trafficability [1-4]. Earlier experimental studies on the statistical nature of lunar surface slopes were restricted either by resolution limits (Apollo-era photogrammetric studies) or by model error considerations (photoclinometric and radar scattering studies), so the true nature of the slope probability distribution was not discernible at baselines smaller than a kilometer [2,3,5]. Accordingly, historical modeling of lunar surface slope probability distributions, for applications such as scattering theory development or rover traversability assessment, is more general in nature (using simple statistical models such as the Gaussian distribution [1,2,5,6]). With the advent of high-resolution, high-precision topographic models of the Moon [7,8], slopes in lunar craters can now be obtained at baselines as low as 6 meters, allowing unprecedented multi-scale (multiple-baseline) modeling of slope probability distributions. Topographic analysis (Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC) 2-m digital elevation models (DEMs)) of ~20-km diameter Copernican lunar craters revealed generally steep slopes on interior walls (30° to 36°, locally exceeding 40°) over 15-meter baselines [9]. In this work, we extend the analysis from a probability distribution modeling point of view, using NAC DEMs to characterize the slope statistics for the floors and walls of the same ~20-km Copernican lunar craters.
The difference in slope standard deviations between the Gaussian approximation and the actual distribution (2-meter sampling) was computed over multiple scales. This slope analysis showed that local slope distributions are non-Gaussian for both crater walls and floors. Over larger baselines (~100 meters), crater wall slope probability distributions approximate Gaussian distributions better, but have long distribution tails. Crater floor probability distributions, however, were always asymmetric (for the baseline scales analyzed) and less affected by baseline scale variations. Accordingly, our results suggest that the use of long-tailed probability distributions (such as the Cauchy distribution) and a baseline-dependent multi-scale model can be more effective in describing the slope statistics of lunar topography. References: [1] Moore, H. (1971), JGR, 75(11). [2] Marcus, A. H. (1969), JGR, 74(22). [3] R. J. Pike (1970), U.S. Geological Survey Working Paper. [4] N. C. Costes, J. E. Farmer and E. B. George (1972), NASA Technical Report TR R-401. [5] M. N. Parker and G. L. Tyler (1973), Radio Science, 8(3), 177-184. [6] Alekseev, V. A. et al. (1968), Soviet Astronomy, Vol. 11, p. 860. [7] Burns et al. (2012), Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XXXIX-B4, 483-488. [8] Smith et al. (2010), GRL 37, L18204, DOI: 10.1029/2010GL043751. [9] Wagner R., Robinson, M., Speyerer E., Mahanti, P., LPSC 2013, #2924.

  2. Psychometric Validation of the Malaysian Chinese Version of the EORTC QLQ-C30 in Colorectal Cancer Patients.

    PubMed

    Magaji, Bello Arkilla; Moy, Foong Ming; Roslani, April Camilla; Law, Chee Wei; Sagap, Ismail

    2015-01-01

    Colorectal cancer is the second most frequent cancer in Malaysia. We aimed to assess the validity and reliability of the Malaysian Chinese version of the European Organization for Research and Treatment of Cancer (EORTC) core Quality of Life Questionnaire (QLQ-C30) in patients with colorectal cancer. Translated versions of the QLQ-C30 were obtained from the EORTC. A cross-sectional study design was used to obtain data from patients receiving treatment at two teaching hospitals in Kuala Lumpur, Malaysia. The Malaysian Chinese version of the QLQ-C30 was self-administered in 96 patients, while the Karnofsky Performance Scale (KPS) was rated by attending surgeons. Statistical analysis included reliability, convergent and discriminant validity, and known-groups comparisons. Statistical significance was based on a p value ≤0.05. The internal consistencies of the Malaysian Chinese version were acceptable [Cronbach's alpha (α) ≥ 0.70] in the global health status/overall quality of life (GHS/QOL) and functioning scales, except for the cognitive scale (α ≤ 0.32) at all levels of analysis and the social/family functioning scale (α = 0.63) in patients without a stoma. All questionnaire items fulfilled the criteria for convergent and discriminant validity except question number 5, whose correlations with the role (r = 0.62) and social/family (r = 0.41) functioning scales were higher than with the physical functioning scale (r = 0.34). The test-retest coefficients for the GHS/QOL, the functioning scales, and most of the symptom scales were moderate to high (r = 0.58 to 1.00). Patients with a stoma reported statistically significantly lower physical functioning (p = 0.015) and social/family functioning (p = 0.013), and higher constipation (p = 0.010) and financial difficulty (p = 0.037), compared with patients without a stoma. There was no significant difference between patients with high and low KPS scores. The Malaysian Chinese version of the QLQ-C30 is a valid and reliable measure of HRQOL in patients with colorectal cancer.
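    Cronbach's alpha, the internal-consistency statistic used throughout this validation, is straightforward to compute directly from an item-score matrix; the sketch below uses invented item scores purely for illustration:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(9)
# Hypothetical scale: 5 items all driven by one latent trait (hence consistent)
latent = rng.standard_normal((200, 1))
consistent = latent + 0.5 * rng.standard_normal((200, 5))
alpha = cronbach_alpha(consistent)
print(round(alpha, 2))  # high alpha: items measure the same construct
```

    Items unrelated to the latent trait would drive alpha toward 0, which is the pattern the cognitive scale (α ≤ 0.32) showed in this study.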

  3. Use of the Oxford Handicap Scale at hospital discharge to predict Glasgow Outcome Scale at 6 months in patients with traumatic brain injury.

    PubMed

    Perel, Pablo; Edwards, Phil; Shakur, Haleema; Roberts, Ian

    2008-11-06

    Traumatic brain injury (TBI) is an important cause of acquired disability. In evaluating the effectiveness of clinical interventions for TBI, it is important to measure disability accurately. The Glasgow Outcome Scale (GOS) is the most widely used outcome measure in randomised controlled trials (RCTs) in TBI patients. However, the GOS is generally measured at 6 months after discharge, by which time loss to follow-up may have occurred. The objectives of this study were to evaluate, among TBI patients, the association between a simple disability scale at hospital discharge, the Oxford Handicap Scale (OHS), and the GOS at 6 months, and the predictive validity of the OHS. The study was a secondary analysis of a randomised clinical trial among TBI patients (MRC CRASH Trial). A Spearman correlation was estimated to evaluate the association between the OHS and the GOS. The validity of different dichotomies of the OHS for predicting the GOS at 6 months was assessed by calculating sensitivity, specificity, and the C statistic. Univariate and multivariate logistic regression models were fitted with the OHS as the explanatory variable, and for each model we analysed discrimination and calibration. We found that the OHS is highly correlated with the GOS at 6 months (Spearman correlation 0.75), with evidence of a linear relationship between the two scales. The OHS dichotomy that separates patients with severe dependency or death showed the greatest discrimination (C statistic 0.843). Among survivors at hospital discharge, the OHS showed very good discrimination (C statistic 0.78) and excellent calibration when used to predict the GOS outcome at 6 months. We have shown that the OHS, a simple disability scale available at hospital discharge, can accurately predict disability at 6 months according to the GOS.
OHS could be used to improve the design and analysis of clinical trials in TBI patients and may also provide a valuable clinical tool for physicians to improve communication with patients and relatives when assessing a patient's prognosis at hospital discharge.

  4. A quadratically regularized functional canonical correlation analysis for identifying the global structure of pleiotropy with NGS data

    PubMed Central

    Zhu, Yun; Fan, Ruzong; Xiong, Momiao

    2017-01-01

    Investigating the pleiotropic effects of genetic variants can increase statistical power, provide important information for a deep understanding of the complex genetic structure of disease, and offer powerful tools for designing effective treatments with fewer side effects. However, the current multiple-phenotype association analysis paradigm lacks breadth (the number of phenotypes and genetic variants jointly analyzed at the same time) and depth (the hierarchical structure of phenotypes and genotypes). A key issue for high-dimensional pleiotropic analysis is to effectively extract informative internal representations and features from high-dimensional genotype and phenotype data. To exploit the correlation information of genetic variants, effectively reduce data dimensions, and overcome critical barriers to the development of novel statistical methods and computational algorithms for genetic pleiotropic analysis, we propose a new statistical method, referred to as quadratically regularized functional CCA (QRFCCA), for association analysis. It combines three approaches: (1) quadratically regularized matrix factorization, (2) functional data analysis, and (3) canonical correlation analysis (CCA). Large-scale simulations show that QRFCCA has much higher power than the ten competing statistics while retaining appropriate type I error rates. To further evaluate performance, QRFCCA and the ten other statistics are applied to the whole-genome sequencing dataset from the TwinsUK study. Using QRFCCA, we identify a total of 79 genes with rare variants and 67 genes with common variants significantly associated with the 46 traits. The results show that QRFCCA substantially outperforms the ten other statistics. PMID:29040274

  5. Videodermoscopy does not enhance diagnosis of scalp contact dermatitis due to topical minoxidil.

    PubMed

    Tosti, Antonella; Donati, Aline; Vincenzi, Colombina; Fabbrocini, Gabriella

    2009-07-01

    Videodermoscopy (VD) is a noninvasive diagnostic tool that provides useful information for the differential diagnosis of scalp disorders. The aim of this study was to investigate whether dermoscopy may help the clinician in the diagnosis of contact dermatitis of the scalp. We analyzed dermoscopic images taken from 7 patients with contact dermatitis due to topical minoxidil, 6 patients complaining of intense scalp itching during treatment with topical minoxidil but with negative patch tests, and 19 controls. The following dermoscopic patterns described for scalp diseases were evaluated: vascular patterns (simple loops, twisted loops and arborizing lines), follicular/perifollicular patterns (yellow dots, empty ostia, white dots, peripilar signs), white scales, yellow scales, follicular plugging, hair diameter diversity, honeycomb pattern, and short regrowing hairs. Findings were graded from 0 to 4 according to severity at 20-fold magnification. Statistical analysis included univariate analysis and the chi-square test, using SPSS version 12. There were no statistical differences in the vascular patterns and scales among the 3 groups. We were not able to detect dermoscopic features that could help the clinician distinguish scalp contact dermatitis due to topical minoxidil from other conditions that cause severe scalp itching. In particular, minoxidil contact dermatitis does not produce an increase in, or alterations of, the morphology of the scalp vessels, or significant scalp scaling, when evaluated with dermoscopy.

  6. Fundamental studies of stress distributions and stress relaxation in oxide scales on high temperature alloys. [Final progress report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shores, D.A.; Stout, J.H.; Gerberich, W.W.

    1993-06-01

    This report summarizes a three-year study of the stresses arising in the oxide scale and underlying metal during high-temperature oxidation, and of scale cracking. In-situ XRD was developed to measure strains during oxidation above 1000°C on pure metals. Acoustic emission was used to observe scale fracture during isothermal oxidation and cooling, and statistical analysis was used to infer mechanical aspects of cracking. A microscratch technique was used to measure the fracture toughness of the scale/metal interface. A theoretical model was evaluated for the development and relaxation of stresses in the scale and metal substrate during oxidation.

  7. The effect of OPC Factor on energy levels in healthy adults ages 45-65: a phase IIb randomized controlled trial.

    PubMed

    LaRiccia, Patrick J; Farrar, John T; Sammel, Mary D; Gallo, Joseph J

    2008-07-01

    To determine the efficacy of the food supplement OPC Factor in increasing energy levels in healthy adults aged 45 to 65. Randomized, placebo-controlled, triple-blind crossover study. Twenty-five (25) healthy adults recruited from the University of Pennsylvania Health System. OPC Factor™ (AlivenLabs, Lebanon, TN), a food supplement that contains oligomeric proanthocyanidins from grape seeds and pine bark along with other nutrient supplements including vitamins and minerals, was in the form of an effervescent powder. The placebo was similar in appearance and taste. Five outcome measurements were performed: (1) energy subscale scores of the Activation-Deactivation Adjective Check List (AD ACL); (2) one global question of percent energy change (Global Energy Percent Change); (3) one global question of energy change measured on a Likert scale (Global Energy Scale Change); (4) one global question of percent overall status change (Global Overall Status Percent Change); and (5) one global question of overall status change measured on a Likert scale (Global Overall Status Scale Change). There were no carryover/period effects in the groups randomized to the Placebo/Active Product sequence versus the Active Product/Placebo sequence. Examination of the AD ACL energy subscale scores for the Active Product versus Placebo comparison revealed no significant difference in the intention-to-treat (IT) analysis or the treatment-received (TR) analysis. However, Global Energy Percent Change (p = 0.06) and Global Energy Scale Change (p = 0.09) both closely approached conventional levels of statistical significance for the active product in the IT analysis. Global Energy Percent Change (p = 0.05) and Global Energy Scale Change (p = 0.04) reached statistical significance in the TR analysis. A cumulative percent-responders analysis graph indicated greater response rates for the active product. OPC Factor may increase energy levels in healthy adults aged 45-65 years.
A larger study is recommended. ClinicalTrials.gov identifier: NCT03318019.

  8. Statistical Models for the Analysis of Zero-Inflated Pain Intensity Numeric Rating Scale Data.

    PubMed

    Goulet, Joseph L; Buta, Eugenia; Bathulapalli, Harini; Gueorguieva, Ralitza; Brandt, Cynthia A

    2017-03-01

    Pain intensity is often measured in clinical and research settings using the 0 to 10 numeric rating scale (NRS). NRS scores are recorded as discrete values, and in some samples they may display a high proportion of zeroes and a right-skewed distribution. Despite this, statistical methods for normally distributed data are frequently used in the analysis of NRS data. We present results from an observational cross-sectional study examining the association of NRS scores with patient characteristics using data collected from a large cohort of 18,935 veterans in Department of Veterans Affairs care diagnosed with a potentially painful musculoskeletal disorder. The mean (variance) NRS pain was 3.0 (7.5), and 34% of patients reported no pain (NRS = 0). We compared the following statistical models for analyzing NRS scores: linear regression, generalized linear models (Poisson and negative binomial), zero-inflated and hurdle models for data with an excess of zeroes, and a cumulative logit model for ordinal data. We examined model fit, interpretability of results, and whether conclusions about the predictor effects changed across models. In this study, models that accommodate zero inflation provided a better fit than the other models. These models should be considered for the analysis of NRS data with a large proportion of zeroes. We examined and analyzed pain data from a large cohort of veterans with musculoskeletal disorders. We found that many reported no current pain on the NRS on the diagnosis date. We present several alternative statistical methods for the analysis of pain intensity data with a large proportion of zeroes. Published by Elsevier Inc.
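    A zero-inflated Poisson model of the kind compared in this study can be fit by direct maximum likelihood; the minimal sketch below uses simulated scores with roughly the reported 34% structural zeroes and a mean near 3 (invented data, not the veterans cohort), mixing a point mass at zero with a Poisson count component:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit, gammaln

rng = np.random.default_rng(3)
n = 5000
# Simulate zero-inflated "pain scores": with prob pi report 0, else Poisson(mu)
pi_true, mu_true = 0.34, 3.0
is_structural_zero = rng.random(n) < pi_true
y = np.where(is_structural_zero, 0, rng.poisson(mu_true, size=n))

def zip_negloglik(params, y):
    # params = [logit(pi), log(mu)]: unconstrained parameterization
    pi, mu = expit(params[0]), np.exp(params[1])
    ll_zero = np.log(pi + (1 - pi) * np.exp(-mu))               # P(Y = 0)
    ll_pos = np.log(1 - pi) - mu + y * np.log(mu) - gammaln(y + 1)
    return -np.sum(np.where(y == 0, ll_zero, ll_pos))

res = minimize(zip_negloglik, x0=[0.0, 0.0], args=(y,), method="Nelder-Mead")
pi_hat, mu_hat = expit(res.x[0]), np.exp(res.x[1])
print(round(pi_hat, 2), round(mu_hat, 2))  # near the simulated 0.34 and 3.0
```

    Note that P(Y = 0) correctly includes both structural zeroes and incidental Poisson zeroes, which is what distinguishes the zero-inflated model from the hurdle model also discussed in the abstract.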

  9. Quality Assessments of Long-Term Quantitative Proteomic Analysis of Breast Cancer Xenograft Tissues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Jian-Ying; Chen, Lijun; Zhang, Bai

    The identification of protein biomarkers requires large-scale analysis of human specimens to achieve statistical significance. In this study, we evaluated the long-term reproducibility of an iTRAQ (isobaric tags for relative and absolute quantification) based quantitative proteomics strategy using one channel for universal normalization across all samples. A total of 307 liquid chromatography tandem mass spectrometric (LC-MS/MS) analyses were completed, generating 107 one-dimensional (1D) LC-MS/MS datasets and 8 offline two-dimensional (2D) LC-MS/MS datasets (25 fractions for each set) for human-in-mouse breast cancer xenograft tissues representative of basal and luminal subtypes. Such large-scale studies require the implementation of robust metrics to assess the contributions of technical and biological variability in the qualitative and quantitative data. Accordingly, we developed a quantification confidence score based on the quality of each peptide-spectrum match (PSM) to remove quantification outliers from each analysis. After combining confidence score filtering and statistical analysis, reproducible protein identification and quantitative results were achieved from LC-MS/MS datasets collected over a 16 month period.

  10. Allometric scaling: analysis of LD50 data.

    PubMed

    Burzala-Kowalczyk, Lidia; Jongbloed, Geurt

    2011-04-01

    The need to identify toxicologically equivalent doses across different species is a major issue in toxicology and risk assessment. In this article, we investigate interspecies scaling based on the allometric equation applied to the single oral LD50 data previously analyzed by Rhomberg and Wolff. We focus on the statistical approach, namely regression analysis of the data. In contrast to Rhomberg and Wolff's analysis of species pairs, we perform an overall analysis based on the whole data set. Our study shows that if one assumes a single scaling rule for all species and substances in the data set, then β = 1 is the most natural choice among the set of candidates known in the literature. In fact, we obtain quite narrow confidence intervals for this parameter. However, the estimate of the variance in the model is relatively high, resulting in rather wide prediction intervals. © 2010 Society for Risk Analysis.
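    The allometric equation dose = a·W^β becomes linear on log scales, so β can be estimated by ordinary least-squares regression of log dose on log body weight. The sketch below uses invented body weights and doses generated with β = 1 (not the Rhomberg and Wolff data):

```python
import numpy as np
from scipy import stats

# Hypothetical body weights (kg) spanning mouse- to human-scale species,
# and doses generated from dose = a * W**beta with beta = 1 plus log-normal noise
W = np.array([0.02, 0.2, 0.3, 2.0, 4.0, 12.0, 70.0])
noise = np.exp(0.1 * np.random.default_rng(5).standard_normal(W.size))
dose = 50.0 * W ** 1.0 * noise

# Linear fit in log-log space: slope estimates beta, stderr gives its precision
fit = stats.linregress(np.log(W), np.log(dose))
beta = fit.slope
ci_half_width = 1.96 * fit.stderr
print(round(beta, 2))  # near 1 for body-weight (isometric) scaling
```

    The narrow confidence interval on β alongside a large residual variance mirrors the paper's conclusion: the scaling exponent is well determined even when individual-substance predictions remain wide.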

  11. Synoptic scale forecast skill and systematic errors in the MASS 2.0 model. [Mesoscale Atmospheric Simulation System

    NASA Technical Reports Server (NTRS)

    Koch, S. E.; Skillman, W. C.; Kocin, P. J.; Wetzel, P. J.; Brill, K. F.

    1985-01-01

    The synoptic scale performance characteristics of MASS 2.0 are determined by comparing filtered 12-24 hr model forecasts to same-case forecasts made by the National Meteorological Center's synoptic-scale Limited-area Fine Mesh model. Characteristics of the two systems are contrasted, and the analysis methodology used to determine statistical skill scores and systematic errors is described. The overall relative performance of the two models in the sample is documented, and important systematic errors uncovered are presented.

  12. Angular filter refractometry analysis using simulated annealing [An improved method for characterizing plasma density profiles using angular filter refractometry

    DOE PAGES

    Angland, P.; Haberberger, D.; Ivancic, S. T.; ...

    2017-10-30

    Here, a new method of analysis for angular filter refractometry images was developed to characterize laser-produced, long-scale-length plasmas, using an annealing algorithm to iteratively converge upon a solution. Angular filter refractometry (AFR) is a novel technique used to characterize the density profiles of laser-produced, long-scale-length plasmas. A synthetic AFR image is constructed from a user-defined density profile described by eight parameters, and the algorithm systematically alters the parameters until the comparison is optimized. The optimization and the statistical uncertainty calculation are based on minimization of the χ² test statistic. The algorithm was successfully applied to experimental data of plasma expanding from a flat, laser-irradiated target, resulting in an average uncertainty in the density profile of 5-10% in the region of interest.
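    The core idea, annealing a parameterized profile until its χ² against the data is minimized, can be sketched with SciPy's dual_annealing on a toy two-parameter exponential density profile (synthetic data; the actual analysis fits an eight-parameter profile to AFR images):

```python
import numpy as np
from scipy.optimize import dual_annealing

rng = np.random.default_rng(11)
# Synthetic "measured" profile: n(x) = n0 * exp(-x/L) plus Gaussian noise
x = np.linspace(0.0, 5.0, 60)
n0_true, L_true, sigma = 10.0, 1.5, 0.2
data = n0_true * np.exp(-x / L_true) + sigma * rng.standard_normal(x.size)

def chi2(params):
    # chi-squared misfit between the parameterized profile and the data
    n0, L = params
    model = n0 * np.exp(-x / L)
    return np.sum(((data - model) / sigma) ** 2)

# Annealing explores the bounded parameter space, then polishes locally
res = dual_annealing(chi2, bounds=[(1.0, 20.0), (0.1, 5.0)])
n0_hat, L_hat = res.x
print(round(n0_hat, 1), round(L_hat, 1))  # near the true (10.0, 1.5)
```

    Annealing pays off when the χ² surface has many local minima, as with an eight-parameter profile; for this smooth two-parameter toy a plain local minimizer would also succeed.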

  13. Angular filter refractometry analysis using simulated annealing [An improved method for characterizing plasma density profiles using angular filter refractometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Angland, P.; Haberberger, D.; Ivancic, S. T.

    Here, a new method of analysis for angular filter refractometry images was developed to characterize laser-produced, long-scale-length plasmas, using an annealing algorithm to iteratively converge upon a solution. Angular filter refractometry (AFR) is a novel technique used to characterize the density profiles of laser-produced, long-scale-length plasmas. A synthetic AFR image is constructed from a user-defined density profile described by eight parameters, and the algorithm systematically alters the parameters until the comparison is optimized. The optimization and the statistical uncertainty calculation are based on minimization of the χ² test statistic. The algorithm was successfully applied to experimental data of plasma expanding from a flat, laser-irradiated target, resulting in an average uncertainty in the density profile of 5-10% in the region of interest.

  14. Automated Box-Cox Transformations for Improved Visual Encoding.

    PubMed

    Maciejewski, Ross; Pattath, Avin; Ko, Sungahn; Hafen, Ryan; Cleveland, William S; Ebert, David S

    2013-01-01

    The concept of preconditioning data (utilizing a power transformation as an initial step) for analysis and visualization is well established within the statistical community and is employed as part of statistical modeling and analysis. Such transformations condition the data to various inherent assumptions of statistical inference procedures, as well as making the data more symmetric and easier to visualize and interpret. In this paper, we explore the use of the Box-Cox family of power transformations to semiautomatically adjust visual parameters. We focus on time-series scaling, axis transformations, and color binning for choropleth maps. We illustrate the usage of this transformation through various examples, and discuss the value and some issues in semiautomatically using these transformations for more effective data visualization.
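    SciPy's boxcox chooses the power-transform parameter λ by maximum likelihood, which is exactly the kind of preconditioning step described above. A minimal example on synthetic right-skewed data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Right-skewed positive data (lognormal), a common case before visual encoding
raw = rng.lognormal(mean=0.0, sigma=1.0, size=2000)

# Box-Cox with lambda chosen by maximum likelihood
transformed, lam = stats.boxcox(raw)
print(round(lam, 2))  # near 0: Box-Cox approaches a log transform here

# The transform symmetrizes the data, easing binning and axis scaling
skew_before = stats.skew(raw)
skew_after = stats.skew(transformed)
print(round(skew_before, 1), round(skew_after, 1))
```

    For choropleth color binning, the estimated λ would be applied to the mapped variable before computing bin edges, so that equal-width bins in transformed space correspond to perceptually useful bins in the skewed original.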

  15. Executive Order 12898 and Social, Economic, and Sociopolitical Factors Influencing Toxic Release Inventory Facility Location in EPA Region 6: A Multi-Scale Spatial Assessment of Environmental Justice

    ERIC Educational Resources Information Center

    Moore, Andrea Lisa

    2013-01-01

    Toxic Release Inventory facilities are among the many environmental hazards shown to create environmental inequities in the United States. This project examined four factors associated with Toxic Release Inventory, specifically, manufacturing facility location at multiple spatial scales using spatial analysis techniques (i.e., O-ring statistic and…

  16. Study of Personnel Attrition and Revocation within U.S. Marine Corps Air Traffic Control Specialties

    DTIC Science & Technology

    2012-03-01

Entrance Processing Stations (MEPS) and recruit depots, to include non-cognitive testing, such as Navy Computer Adaptive Personality Scales (NCAPS)…Revocation, Selection, MOS, Regression, Probit, dProbit, STATA, Statistics, Marginal Effects, ASVAB, AFQT, Composite Scores, Screening, NCAPS…Navy Computer Adaptive Personality Scales (NCAPS), during recruitment. It is also recommended that an economic analysis be conducted comparing the…

  17. New dimensions from statistical graphics for GIS (geographic information system) analysis and interpretation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCord, R.A.; Olson, R.J.

    1988-01-01

Environmental research and assessment activities at Oak Ridge National Laboratory (ORNL) include the analysis of spatial and temporal patterns of ecosystem response at a landscape scale. Analysis through use of a geographic information system (GIS) involves an interaction between the user and thematic data sets frequently expressed as maps. A portion of GIS analysis has a mathematical or statistical aspect, especially for the analysis of temporal patterns. ARC/INFO is an excellent tool for manipulating GIS data and producing the appropriate map graphics. INFO also has some limited ability to produce statistical tabulation. At ORNL we have extended our capabilities by graphically interfacing ARC/INFO and SAS/GRAPH to provide a combined mapping and statistical graphics environment. With the data management, statistical, and graphics capabilities of SAS added to ARC/INFO, we have expanded the analytical and graphical dimensions of the GIS environment. Pie or bar charts, frequency curves, hydrographs, or scatter plots as produced by SAS can be added to maps from attribute data associated with ARC/INFO coverages. Numerous, small, simplified graphs can also become a source of complex map "symbols." These additions extend the dimensions of GIS graphics to include time, details of the thematic composition, distribution, and interrelationships. 7 refs., 3 figs.

  18. In silico identification and comparative analysis of differentially expressed genes in human and mouse tissues

    PubMed Central

    Pao, Sheng-Ying; Lin, Win-Li; Hwang, Ming-Jing

    2006-01-01

    Background Screening for differentially expressed genes on the genomic scale and comparative analysis of the expression profiles of orthologous genes between species to study gene function and regulation are becoming increasingly feasible. Expressed sequence tags (ESTs) are an excellent source of data for such studies using bioinformatic approaches because of the rich libraries and tremendous amount of data now available in the public domain. However, any large-scale EST-based bioinformatics analysis must deal with the heterogeneous, and often ambiguous, tissue and organ terms used to describe EST libraries. Results To deal with the issue of tissue source, in this work, we carefully screened and organized more than 8 million human and mouse ESTs into 157 human and 108 mouse tissue/organ categories, to which we applied an established statistical test using different thresholds of the p value to identify genes differentially expressed in different tissues. Further analysis of the tissue distribution and level of expression of human and mouse orthologous genes showed that tissue-specific orthologs tended to have more similar expression patterns than those lacking significant tissue specificity. On the other hand, a number of orthologs were found to have significant disparity in their expression profiles, hinting at novel functions, divergent regulation, or new ortholog relationships. Conclusion Comprehensive statistics on the tissue-specific expression of human and mouse genes were obtained in this very large-scale, EST-based analysis. These statistical results have been organized into a database, freely accessible at our website, for easy searching of human and mouse tissue-specific genes and for investigating gene expression profiles in the context of comparative genomics. Comparative analysis showed that, although highly tissue-specific genes tend to exhibit similar expression profiles in human and mouse, there are significant exceptions, indicating that orthologous genes, while sharing basic genomic properties, could result in distinct phenotypes. PMID:16626500

  19. Large-angle correlations in the cosmic microwave background

    NASA Astrophysics Data System (ADS)

    Efstathiou, George; Ma, Yin-Zhe; Hanson, Duncan

    2010-10-01

It has been argued recently by Copi et al. (2009) that the lack of large angular correlations of the CMB temperature field provides strong evidence against the standard, statistically isotropic, inflationary Lambda cold dark matter (ΛCDM) cosmology. We compare various estimators of the temperature correlation function, showing how they depend on assumptions of statistical isotropy and how they perform on the Wilkinson Microwave Anisotropy Probe (WMAP) 5-yr Internal Linear Combination (ILC) maps with and without a sky cut. We show that the low multipole harmonics that determine the large-scale features of the temperature correlation function can be reconstructed accurately from the data that lie outside the sky cuts. The reconstructions are only weakly dependent on the assumed statistical properties of the temperature field. The temperature correlation functions computed from these reconstructions are in good agreement with those computed from the ILC map over the whole sky. We conclude that the large-scale angular correlation function for our realization of the sky is well determined. A Bayesian analysis of the large-scale correlations is presented, which shows that the data cannot exclude the standard ΛCDM model. We discuss the differences between our results and those of Copi et al. Either there exists a violation of statistical isotropy as claimed by Copi et al., or these authors have overestimated the significance of the discrepancy because of a posteriori choices of estimator, statistic and sky cut.

  20. Scaling analysis of the non-Abelian quasiparticle tunneling in Z_k FQH states

    NASA Astrophysics Data System (ADS)

    Li, Qi; Jiang, Na; Wan, Xin; Hu, Zi-Xiang

    2018-06-01

Quasiparticle tunneling between two counter-propagating edges through point contacts could provide information on their statistics. Previous studies of short-distance tunneling display a scaling behavior, especially in the conformal limit of zero tunneling distance. The scaling exponents for non-Abelian quasiparticle tunneling exhibit some non-trivial behaviors. In this work, we revisit the quasiparticle tunneling amplitudes and their scaling behavior over the full range of the tunneling distance by putting the electrons on the surface of a cylinder. The edge-edge distance can be smoothly tuned by varying the aspect ratio of a finite-size cylinder. We analyze the scaling behavior of the quasiparticles for the Z_k Read–Rezayi states, both in the short and long tunneling-distance regions. The finite-size scaling analysis automatically gives a critical length scale at which the anomalous correction appears. We demonstrate that this length scale is related to the size of the quasiparticle, at which the backscattering between the two counter-propagating edges starts to be significant.

  1. An Overview of Interrater Agreement on Likert Scales for Researchers and Practitioners

    PubMed Central

    O'Neill, Thomas A.

    2017-01-01

Applications of interrater agreement (IRA) statistics for Likert scales are plentiful in research and practice. IRA may be implicated in job analysis, performance appraisal, panel interviews, and any other approach to gathering systematic observations. Any rating system involving subject-matter experts can also benefit from IRA as a measure of consensus. Further, IRA is fundamental to aggregation in multilevel research, which is becoming increasingly common in order to address nesting. Although several technical descriptions of a few specific IRA statistics exist, this paper aims to provide a tractable orientation to common IRA indices to support application. The introductory overview is written with the intent of facilitating contrasts among IRA statistics by critically reviewing equations, interpretations, strengths, and weaknesses. Statistics considered include rwg, rwg*, r′wg, rwg(p), average deviation (AD), awg, standard deviation (Swg), and the coefficient of variation (CVwg). Equations support quick calculation and contrasting of different agreement indices. The article also includes a "quick reference" table and three figures in order to help readers identify how IRA statistics differ and how interpretations of IRA will depend strongly on the statistic employed. A brief consideration of recommended practices involving statistical and practical cutoff standards is presented, and conclusions are offered in light of the current literature. PMID:28553257
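The uniform-null logic behind the rwg family can be shown in a few lines of Python. This is a minimal sketch of the James, Demaree and Wolf single-item index, not code from the paper; the rating vectors are invented for illustration.

```python
import numpy as np

def rwg(ratings, n_options):
    # Observed rating variance relative to the variance of a uniform
    # "no agreement" null distribution over the n_options Likert anchors:
    # sigma^2_EU = (A^2 - 1) / 12 for A anchors.
    s2 = np.var(ratings, ddof=1)
    sigma2_eu = (n_options ** 2 - 1) / 12.0
    return 1.0 - s2 / sigma2_eu

# Seven raters on a 5-point item with strong consensus around 4:
high = rwg([4, 4, 4, 5, 4, 4, 4], n_options=5)
# Ratings spread across the whole scale, little consensus:
low = rwg([1, 2, 3, 4, 5, 1, 5], n_options=5)
```

`high` comes out near 1 and `low` drops below 0 (observed variance exceeding the uniform-null variance), illustrating why interpretation depends strongly on the statistic and null chosen.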

  2. Assessment of abuse liability of Tramadol among experienced drug users: Double-blind crossover randomized controlled trial.

    PubMed

    Das, Mrinmay; Jain, Raka; Dhawan, Anju; Kaur, Amandeep

Tramadol is a widely used opioid analgesic. Different preclinical, clinical, and postmarketing surveillance studies show conflicting results regarding the abuse potential of this drug. A randomized double-blind complete crossover study was conducted at the National Drug Dependence Treatment Centre, All India Institute of Medical Sciences, New Delhi. Ten subjects were enrolled, yielding 120 observations in total (each subject was assessed at baseline and at 5, 45, and 240 minutes). Subjects with a history of substance abuse were included after detoxification and informed consent. Assessment was done using the modified single dose opiate questionnaire, morphine benzedrine group (MBG), pentobarbital chlorpromazine alcohol group (PCAG), and two bipolar visual analogue scales (VAS) after intramuscular administration of three drugs, Tramadol (100 mg), Buprenorphine (0.6 mg), and Placebo (normal saline), at 5-day intervals. In the intra-group analysis, there was a statistically significant increase in scores on all four scales from baseline to all three time points after Tramadol and Buprenorphine administration. In the inter-group analysis, statistically higher scores were seen for Buprenorphine in comparison to Tramadol at 5, 45, and 240 minutes on the MBG scale; the score was significantly higher for Buprenorphine on the VAS for pleasurable effect at 45 and 240 minutes, but not at baseline and 5 minutes. There was no significant difference in score at any point of time between Tramadol and Buprenorphine on the PCAG scale and the VAS for sedative/alertness effect. The scores were statistically insignificant in the case of Placebo. All the subjects liked Buprenorphine most, followed by Tramadol and then Placebo. Tramadol has abuse potential (even in therapeutic doses) greater than Placebo but less than or comparable to Buprenorphine.

  3. Statistical analysis of microgravity experiment performance using the degrees of success scale

    NASA Technical Reports Server (NTRS)

    Upshaw, Bernadette; Liou, Ying-Hsin Andrew; Morilak, Daniel P.

    1994-01-01

    This paper describes an approach to identify factors that significantly influence microgravity experiment performance. Investigators developed the 'degrees of success' scale to provide a numerical representation of success. A degree of success was assigned to 293 microgravity experiments. Experiment information including the degree of success rankings and factors for analysis was compiled into a database. Through an analysis of variance, nine significant factors in microgravity experiment performance were identified. The frequencies of these factors are presented along with the average degree of success at each level. A preliminary discussion of the relationship between the significant factors and the degree of success is presented.

  4. On Mars too, expect macroweather

    NASA Astrophysics Data System (ADS)

    Boisvert, Jean-Philippe; Lovejoy, Shaun; Muller, Jan-Peter

    2015-04-01

Terrestrial atmospheric and oceanic spectra show drastic transitions at τw ≈ 10 days and τow ≈ 1 year respectively; this has been theorized as the lifetime of planetary-scale structures. For wind and temperature, the forms of the low and high frequency parts of the spectra (macroweather, weather) as well as the τw can be theoretically estimated, the latter depending notably on the solar induced turbulent energy flux. We extend the theory to other planets and test it using Viking lander and reanalysis data from Mars. When the Martian spectra are scaled by the theoretical amount, they agree very well with their terrestrial atmospheric counterparts. Although the usual interpretation of Martian atmospheric dynamics is highly mechanistic (e.g. wave and tidal explanations are invoked), trace moment analysis of the reanalysis fields shows that the statistics well respect the predictions of multiplicative cascade theories. This shows that statistical scaling can be compatible with conventional deterministic thinking. However, since we are usually interested in statistical knowledge, it is the former not the latter that is of primary interest. We discuss the implications for understanding planetary fluid dynamical systems.

  5. Lagrangian single-particle turbulent statistics through the Hilbert-Huang transform.

    PubMed

    Huang, Yongxiang; Biferale, Luca; Calzavarini, Enrico; Sun, Chao; Toschi, Federico

    2013-04-01

The Hilbert-Huang transform is applied to analyze single-particle Lagrangian velocity data from numerical simulations of hydrodynamic turbulence. The velocity trajectory is described in terms of a set of intrinsic mode functions C_i(t) and of their instantaneous frequency ω_i(t). On the basis of this decomposition we define the ω-conditioned statistical moments of the C_i modes, named q-order Hilbert spectra (HS). We show that such quantities have enhanced scaling properties as compared to traditional Fourier transform- or correlation-based (structure functions) statistical indicators, thus providing better insights into the turbulent energy transfer process. We present clear empirical evidence that the energy-like quantity, i.e., the second-order HS, displays a linear scaling in time in the inertial range, as expected from a dimensional analysis. We also measure high-order moment scaling exponents in a direct way, without resorting to the extended self-similarity procedure. This leads to an estimate of the Lagrangian structure function exponents which are consistent with the multifractal prediction in the Lagrangian frame as proposed by Biferale et al. [Phys. Rev. Lett. 93, 064502 (2004)].
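The instantaneous-frequency step of a Hilbert-Huang analysis can be sketched as follows. The full HHT first extracts the intrinsic mode functions by empirical mode decomposition, which is omitted here; a single pure tone stands in for one mode, and the FFT-based analytic signal is a standard construction (the same quantity `scipy.signal.hilbert` returns).

```python
import numpy as np

def analytic_signal(x):
    # Analytic signal via the FFT: zero out negative frequencies,
    # double the positive ones.
    n = len(x)
    spec = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.fft.ifft(spec * h)

fs = 1000.0                                   # sampling rate, Hz
t = np.arange(0, 1, 1 / fs)                   # 1000 samples
x = np.sin(2 * np.pi * 50.0 * t)              # a 50 Hz tone as one "mode"

# Instantaneous frequency = derivative of the unwrapped analytic phase.
phase = np.unwrap(np.angle(analytic_signal(x)))
inst_freq = np.diff(phase) * fs / (2 * np.pi)
```

For this tone the instantaneous frequency sits at 50 Hz; in the paper's setting, conditioning moments of each mode on ω_i(t) computed this way yields the q-order Hilbert spectra.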

  6. Time tracking and interaction of energy-eddies at different scales

    NASA Astrophysics Data System (ADS)

    Cardesa, Jose I.; Vela-Martin, Alberto; Jimenez, Javier

    2016-11-01

We study the energy cascade through coherent structures obtained in time-resolved simulations of incompressible, statistically steady isotropic turbulence. The structures are defined as geometrically connected regions of the flow with high kinetic energy. We compute the latter by band-pass filtering the velocity field around a scale r. We analyse the dynamics of structures extracted with different r, which are a proxy for eddies containing energy at those r. We find that the size of these "energy-eddies" scales with r, while their lifetime scales with the local eddy-turnover time r^(2/3) ε^(-1/3), where ε is the energy dissipation averaged over all space and time. Furthermore, a statistical analysis over the lives of the eddies shows a slight predominance of the splitting over the merging process. When we isolate the eddies which do not interact with other eddies of the same scale, we observe a parent-child dependence by which, on average, structures are born at scale r during the decaying part of the life of a structure at scale r' > r. The energy-eddy at r' lives in the same region of space as that at r. Finally, we investigate how interactions between eddies at the same scale are echoed across other scales. Funded by the ERC project Coturb.

  7. Grain-Size Based Additivity Models for Scaling Multi-rate Uranyl Surface Complexation in Subsurface Sediments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Xiaoying; Liu, Chongxuan; Hu, Bill X.

This study statistically analyzed a grain-size based additivity model that has been proposed to scale reaction rates and parameters from laboratory to field. The additivity model assumed that reaction properties in a sediment, including surface area, reactive site concentration, reaction rate, and extent, can be predicted from the field-scale grain size distribution by linearly adding the reaction properties of individual grain size fractions. This study focused on the statistical analysis of the additivity model with respect to reaction rate constants, using multi-rate uranyl (U(VI)) surface complexation reactions in a contaminated sediment as an example. Experimental data of rate-limited U(VI) desorption in a stirred flow-cell reactor were used to estimate the statistical properties of multi-rate parameters for individual grain size fractions. The statistical properties of the rate constants for the individual grain size fractions were then used to analyze the statistical properties of the additivity model to predict rate-limited U(VI) desorption in the composite sediment, and to evaluate the relative importance of individual grain size fractions to the overall U(VI) desorption. The result indicated that the additivity model provided a good prediction of the U(VI) desorption in the composite sediment. However, the rate constants were not directly scalable using the additivity model, and U(VI) desorption in individual grain size fractions has to be simulated in order to apply the additivity model. An approximate additivity model for directly scaling rate constants was subsequently proposed and evaluated. The results showed that the approximate model provided a good prediction of the experimental results within statistical uncertainty. This study also found that a gravel size fraction (2-8 mm), which is often ignored in modeling U(VI) sorption and desorption, is statistically significant to the U(VI) desorption in the sediment.
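The additivity model's central assumption, that a composite-sediment property is a mass-weighted linear sum over grain-size fractions, can be stated in two lines. The fractions and property values below are invented for illustration and are not the study's measurements.

```python
import numpy as np

# Hypothetical grain-size fractions of a composite sediment: mass fractions
# (summing to 1, including a gravel fraction) and a per-fraction reactive
# property such as reactive site concentration per gram.
mass_frac = np.array([0.30, 0.45, 0.20, 0.05])
site_prop = np.array([8.0, 3.5, 1.2, 0.4])

# Additivity model: the composite property is the mass-weighted linear
# sum of the individual grain-size-fraction properties.
composite = float(np.dot(mass_frac, site_prop))
```

The study's point is that this linear addition works for reaction extents but not directly for rate constants, which is why the per-fraction desorption kinetics still have to be simulated.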

  8. Resilience Scale-25 Spanish version: validation and assessment in eating disorders.

    PubMed

    Las Hayas, Carlota; Calvete, Esther; Gómez del Barrio, Andrés; Beato, Luís; Muñoz, Pedro; Padierna, Jesús Ángel

    2014-08-01

To validate a Spanish version of the Wagnild and Young Resilience Scale-25 (RS-25), and to assess and compare scores on the scale among women from the general population, eating disorder (ED) patients and recovered ED patients. This is a cross-sectional study. ED participants were invited to participate by their respective therapists. The sample from the general population was gathered via an open online survey. Participants (N general population=279; N ED patients=124; and N recovered ED patients=45) completed the RS-25, the World Health Organization Quality of Life Scale-BREF and the Hospital Anxiety and Depression Scale. The mean age of participants ranged from 28.87 to 30.42 years. Statistical analysis included a multi-group confirmatory factor analysis and ANOVA. The two-factor model of the RS-25 produced excellent fit indexes. Measurement invariance across samples was generally supported. The ANOVA found statistically significant differences in the RS-25 mean scores between the ED patients (Mean=103.13, SD=31.32) and the recovered ED participants (Mean=138.42, SD=22.26), and between the ED patients and the general population participants (Mean=136.63, SD=19.56). The Spanish version of the RS-25 is a psychometrically sound measurement tool in samples of ED patients. Resilience is lower in people diagnosed with ED than in recovered individuals and the general population. Copyright © 2014 Elsevier Ltd. All rights reserved.
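The group comparison reported above can be illustrated with a one-way ANOVA F statistic computed on synthetic scores drawn to match the reported means and SDs. This is an illustrative sketch only; the actual analysis used the real RS-25 responses.

```python
import numpy as np

def f_oneway(*groups):
    # One-way ANOVA F statistic: between-group vs within-group variance.
    pooled = np.concatenate(groups)
    grand = pooled.mean()
    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    df_b, df_w = len(groups) - 1, len(pooled) - len(groups)
    return (ss_between / df_b) / (ss_within / df_w)

rng = np.random.default_rng(2)
# Synthetic RS-25-like scores matching the reported group statistics.
patients = rng.normal(103.13, 31.32, 124)
recovered = rng.normal(138.42, 22.26, 45)
general = rng.normal(136.63, 19.56, 279)
F = f_oneway(patients, recovered, general)
```

With group means this far apart relative to the pooled spread, F is large, consistent with the statistically significant differences the paper reports.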

  9. Young adolescents' engagement in dietary behaviour - the impact of gender, socio-economic status, self-efficacy and scientific literacy. Methodological aspects of constructing measures in nutrition literacy research using the Rasch model.

    PubMed

    Guttersrud, Øystein; Petterson, Kjell Sverre

    2015-10-01

The present study validates a revised scale measuring individuals' level of the 'engagement in dietary behaviour' aspect of 'critical nutrition literacy' and describes how background factors affect this aspect of Norwegian tenth-grade students' nutrition literacy. Data were gathered electronically during a field trial of a standardised sample test in science. Test items and questionnaire constructs were distributed evenly across four electronic field-test booklets. Data management and analysis were performed using the RUMM2030 item analysis package and the IBM SPSS Statistics 20 statistical software package. Students responded on computers at school. Seven hundred and forty tenth-grade students at twenty-seven randomly sampled public schools were enrolled in the field-test study. The engagement in dietary behaviour scale and the self-efficacy in science scale were distributed to 178 of these students. The dietary behaviour scale and the self-efficacy in science scale proved to be valid, reliable and well-targeted instruments for constructing measures. Girls and students with high self-efficacy reported higher engagement in dietary behaviour than other students. Socio-economic status and scientific literacy, measured as ability in science by applying an achievement test, showed no correlation significantly different from zero with students' engagement in dietary behaviour.

  10. The Dissipation Rate Transport Equation and Subgrid-Scale Models in Rotating Turbulence

    NASA Technical Reports Server (NTRS)

    Rubinstein, Robert; Ye, Zhou

    1997-01-01

The dissipation rate transport equation remains the most uncertain part of turbulence modeling. The difficulties are increased when external agencies like rotation prevent straightforward dimensional analysis from determining the correct form of the modelled equation. In this work, the dissipation rate transport equation and subgrid scale models for rotating turbulence are derived from an analytical statistical theory of rotating turbulence. In the strong rotation limit, the theory predicts a turbulent steady state in which the inertial range energy spectrum scales as k^(-2) and the turbulent time scale is the inverse rotation rate. This scaling has been derived previously by heuristic arguments.

  11. From random microstructures to representative volume elements

    NASA Astrophysics Data System (ADS)

    Zeman, J.; Šejnoha, M.

    2007-06-01

A unified treatment of random microstructures proposed in this contribution opens the way to efficient solutions of large-scale real world problems. The paper introduces a notion of statistically equivalent periodic unit cell (SEPUC) that replaces, in a computational step, the actual complex geometries on an arbitrary scale. A SEPUC is constructed such that its morphology conforms with images of real microstructures. Here, the well-established two-point probability function and the lineal path function are employed to classify, from the statistical point of view, the geometrical arrangement of various material systems. Examples of statistically equivalent unit cells constructed for a unidirectional fibre tow, a plain weave textile composite and an irregular-coursed masonry wall are given. A specific result promoting the applicability of the SEPUC as a tool for the derivation of homogenized effective properties that are subsequently used in an independent macroscopic analysis is also presented.
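The two-point probability function S2(r) used above to classify microstructures can be estimated directly from a binary phase-indicator field. This 1-D sketch uses a synthetic uncorrelated microstructure (the paper works with 2-D images of real materials); all values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)
# Synthetic binary 1-D "microstructure": 1 = inclusion phase, 0 = matrix.
m = (rng.random(20000) < 0.35).astype(float)

def s2(m, r):
    # Two-point probability: the chance that two points a distance r
    # apart both fall in phase 1.
    if r == 0:
        return float(np.mean(m * m))
    return float(np.mean(m[:-r] * m[r:]))

phi = float(m.mean())        # volume fraction of phase 1
near = s2(m, 0)              # equals phi at r = 0
far = s2(m, 5000)            # decays to phi**2 for an uncorrelated medium
```

Matching the measured S2(r) (and the lineal path function) of a candidate periodic cell to that of the real microstructure is what makes the cell "statistically equivalent".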

  12. An overview of data acquisition, signal coding and data analysis techniques for MST radars

    NASA Technical Reports Server (NTRS)

    Rastogi, P. K.

    1986-01-01

An overview is given of the data acquisition, signal processing, and data analysis techniques that are currently in use with high power MST/ST (mesosphere stratosphere troposphere/stratosphere troposphere) radars. This review supplements the works of Rastogi (1983) and Farley (1984) presented at previous MAP workshops. A general description is given of data acquisition and signal processing operations, and they are characterized on the basis of their disparate time scales. Signal coding is then discussed, with a brief description of frequently used codes and their limitations. Finally, several aspects of statistical data processing are discussed, such as signal statistics, power spectrum and autocovariance analysis, and outlier removal techniques.

  13. Analysing and correcting the differences between multi-source and multi-scale spatial remote sensing observations.

    PubMed

    Dong, Yingying; Luo, Ruisen; Feng, Haikuan; Wang, Jihua; Zhao, Jinling; Zhu, Yining; Yang, Guijun

    2014-01-01

    Differences exist among analysis results of agriculture monitoring and crop production based on remote sensing observations, which are obtained at different spatial scales from multiple remote sensors in the same time period and processed by the same algorithms, models or methods. These differences can be quantitatively described from three main aspects, i.e. multiple remote sensing observations, crop parameter estimation models, and spatial scale effects of surface parameters. Our research proposed a new method to analyse and correct the differences between multi-source and multi-scale spatial remote sensing surface reflectance datasets, aiming to provide references for further studies in agricultural applications with multiple remotely sensed observations from different sources. The new method was constructed on the basis of the physical and mathematical properties of multi-source and multi-scale reflectance datasets. Theories of statistics were used to extract statistical characteristics of the multiple surface reflectance datasets and to quantitatively analyse the spatial variations of these characteristics at multiple spatial scales. Then, taking the surface reflectance at the small spatial scale as the baseline data, theories of the Gaussian distribution were used to correct the multiple surface reflectance datasets on the basis of the obtained physical characteristics, mathematical distribution properties, and their spatial variations. The proposed method was verified with two sets of multiple satellite images, obtained in two experimental fields located in Inner Mongolia and Beijing, China, with different degrees of homogeneity of the underlying surfaces. Experimental results indicate that differences between surface reflectance datasets at multiple spatial scales could be effectively corrected over non-homogeneous underlying surfaces, providing a database for further multi-source and multi-scale crop growth monitoring and yield prediction, and their corresponding consistency analysis and evaluation.

  14. Analysing and Correcting the Differences between Multi-Source and Multi-Scale Spatial Remote Sensing Observations

    PubMed Central

    Dong, Yingying; Luo, Ruisen; Feng, Haikuan; Wang, Jihua; Zhao, Jinling; Zhu, Yining; Yang, Guijun

    2014-01-01

    Differences exist among analysis results of agriculture monitoring and crop production based on remote sensing observations, which are obtained at different spatial scales from multiple remote sensors in the same time period and processed by the same algorithms, models or methods. These differences can be quantitatively described from three main aspects, i.e. multiple remote sensing observations, crop parameter estimation models, and spatial scale effects of surface parameters. Our research proposed a new method to analyse and correct the differences between multi-source and multi-scale spatial remote sensing surface reflectance datasets, aiming to provide references for further studies in agricultural applications with multiple remotely sensed observations from different sources. The new method was constructed on the basis of the physical and mathematical properties of multi-source and multi-scale reflectance datasets. Theories of statistics were used to extract statistical characteristics of the multiple surface reflectance datasets and to quantitatively analyse the spatial variations of these characteristics at multiple spatial scales. Then, taking the surface reflectance at the small spatial scale as the baseline data, theories of the Gaussian distribution were used to correct the multiple surface reflectance datasets on the basis of the obtained physical characteristics, mathematical distribution properties, and their spatial variations. The proposed method was verified with two sets of multiple satellite images, obtained in two experimental fields located in Inner Mongolia and Beijing, China, with different degrees of homogeneity of the underlying surfaces. Experimental results indicate that differences between surface reflectance datasets at multiple spatial scales could be effectively corrected over non-homogeneous underlying surfaces, providing a database for further multi-source and multi-scale crop growth monitoring and yield prediction, and their corresponding consistency analysis and evaluation. PMID:25405760
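A minimal sketch of a moment-matching correction under the Gaussian assumption follows. It standardizes the coarse-scale band and rescales it to the fine-scale baseline's mean and standard deviation; the paper's method additionally accounts for spatial variations of these statistics, and all arrays here are synthetic.

```python
import numpy as np

def match_gaussian(coarse, baseline):
    # Standardize the coarse-scale band, then rescale it to the mean and
    # standard deviation of the fine-scale baseline band.
    z = (coarse - coarse.mean()) / coarse.std()
    return z * baseline.std() + baseline.mean()

rng = np.random.default_rng(3)
fine = rng.normal(0.20, 0.03, 10000)     # fine-scale baseline reflectance
coarse = rng.normal(0.26, 0.05, 10000)   # biased coarse-scale observation
corrected = match_gaussian(coarse, fine)
```

By construction the corrected band reproduces the baseline's first two moments, which is the sense in which the multi-source datasets become statistically consistent.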

  15. Confirmatory Factor Analysis of Persian Adaptation of Multidimensional Students' Life Satisfaction Scale (MSLSS)

    ERIC Educational Resources Information Center

    Hatami, Gissou; Motamed, Niloofar; Ashrafzadeh, Mahshid

    2010-01-01

The validity and reliability of the Persian adaptation of the MSLSS were checked in 12-18 year old middle and high school students (430 students in grades 6-12 in Bushehr port, Iran) using confirmatory factor analysis with the LISREL statistical package. Internal consistency reliability estimates (Cronbach's coefficient [alpha]) were all above the…

  16. Using Multi-Group Confirmatory Factor Analysis to Evaluate Cross-Cultural Research: Identifying and Understanding Non-Invariance

    ERIC Educational Resources Information Center

    Brown, Gavin T. L.; Harris, Lois R.; O'Quin, Chrissie; Lane, Kenneth E.

    2017-01-01

    Multi-group confirmatory factor analysis (MGCFA) allows researchers to determine whether a research inventory elicits similar response patterns across samples. If statistical equivalence in responding is found, then scale score comparisons become possible and samples can be said to be from the same population. This paper illustrates the use of…

  17. Analysis of high-resolution foreign exchange data of USD-JPY for 13 years

    NASA Astrophysics Data System (ADS)

    Mizuno, Takayuki; Kurihara, Shoko; Takayasu, Misako; Takayasu, Hideki

    2003-06-01

We analyze high-resolution foreign exchange data consisting of 20 million data points of USD-JPY over 13 years to report firm statistical laws in the distributions and correlations of exchange rate fluctuations. A conditional probability density analysis clearly shows the existence of trend-following movements at a time scale of 8 ticks, about 1 min.
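The conditional-probability logic behind detecting trend-following can be illustrated on a synthetic increment series, where a weak positive autocorrelation plays the role of the observed effect. The data below are simulated, not USD-JPY ticks.

```python
import numpy as np

rng = np.random.default_rng(5)
# Synthetic tick-increment series with weak positive autocorrelation,
# standing in for trend-following behaviour.
eps = rng.normal(size=100000)
dx = np.empty_like(eps)
dx[0] = eps[0]
for i in range(1, len(eps)):
    dx[i] = 0.2 * dx[i - 1] + eps[i]     # AR(1) with coefficient 0.2

# Conditional means: the average next move given the sign of the last one.
up = float(dx[1:][dx[:-1] > 0].mean())     # positive if trends persist
down = float(dx[1:][dx[:-1] < 0].mean())   # negative if trends persist
```

A full conditional probability density analysis would bin the previous move and estimate the whole distribution of the next one; the conditional means above are the simplest summary of the same dependence.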

  18. Effect of Variable Spatial Scales on USLE-GIS Computations

    NASA Astrophysics Data System (ADS)

    Patil, R. J.; Sharma, S. K.

    2017-12-01

    Use of appropriate spatial scale is very important in Universal Soil Loss Equation (USLE) based spatially distributed soil erosion modelling. This study aimed at assessment of annual rates of soil erosion at different spatial scales/grid sizes and at analysing how changes in spatial scale affect USLE-GIS computations through simulation and statistical variability. Efforts have been made in this study to recommend an optimum spatial scale for further USLE-GIS computations for management and planning in the study area. The present research study was conducted in Shakkar River watershed, situated in Narsinghpur and Chhindwara districts of Madhya Pradesh, India. Remote Sensing and GIS techniques were integrated with the Universal Soil Loss Equation (USLE) to predict the spatial distribution of soil erosion in the study area at four different spatial scales, viz. 30 m, 50 m, 100 m, and 200 m. Rainfall data, a soil map, a digital elevation model (DEM), an executable C++ program, and a satellite image of the area were used to prepare the thematic maps for the various USLE factors. Annual rates of soil erosion were estimated for 15 years (1992 to 2006) at the four grid sizes. The statistical analysis of the four estimated datasets showed that the sediment loss dataset at the 30 m spatial scale had the minimum standard deviation (2.16), variance (4.68), and percent deviation from observed values (2.68-18.91%), and the highest coefficient of determination (R2 = 0.874) among the four datasets. Thus, it is recommended to adopt this spatial scale for USLE-GIS computations in the study area due to its minimum statistical variability and better agreement with the observed sediment loss data. This study also indicates a large scope for use of finer spatial scales in spatially distributed soil erosion modelling.
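The dataset comparison above reduces to a handful of summary statistics; a sketch with invented sediment-loss numbers (not the study's data) shows how the standard deviation, variance, percent deviation, and R2 criteria can be computed for one grid size:

```python
import numpy as np

# Hypothetical 15-year sediment-loss series (t/ha/yr) for one grid size,
# plus observed values; all numbers are illustrative.
rng = np.random.default_rng(1)
observed = rng.uniform(5, 20, size=15)
modelled = observed + rng.normal(0, 1.5, size=15)   # e.g. the 30 m run

std = modelled.std(ddof=1)
var = modelled.var(ddof=1)
pct_dev = 100 * np.abs(modelled - observed) / observed

# Coefficient of determination of modelled vs observed values
ss_res = np.sum((observed - modelled) ** 2)
ss_tot = np.sum((observed - observed.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

print(f"std={std:.2f}  var={var:.2f}  "
      f"%dev=[{pct_dev.min():.1f}, {pct_dev.max():.1f}]  R2={r2:.3f}")
```

Repeating this for each grid size and picking the scale with the lowest variability and highest R2 mirrors the selection criterion the study describes.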

  19. Challenges of Big Data Analysis.

    PubMed

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-06-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On the one hand, Big Data hold great promise for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This article gives an overview of the salient features of Big Data and how these features drive a paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in a high-confidence set and point out that exogenous assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions.
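The spurious-correlation effect mentioned above is easy to reproduce: with a fixed sample size, the maximum sample correlation between a response and a growing set of independent noise predictors keeps rising even though every true correlation is zero. A minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50                       # small, fixed sample size
y = rng.normal(size=n)

# Maximum absolute sample correlation between y and p *independent*
# noise predictors: it grows with p even though the population
# correlation is exactly zero for every predictor.
max_corr = {}
for p in (10, 1000, 50_000):
    X = rng.normal(size=(n, p))
    Xc = X - X.mean(axis=0)          # column-centred predictors
    yc = y - y.mean()
    r = (Xc.T @ yc) / (np.sqrt((Xc ** 2).sum(axis=0))
                       * np.sqrt((yc ** 2).sum()))
    max_corr[p] = np.abs(r).max()
    print(f"p={p:>6}: max |corr| = {max_corr[p]:.2f}")
```

With tens of thousands of pure-noise predictors, the best-looking correlation can appear quite strong, which is exactly why variable screening on Big Data needs care.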

  1. Turbulence Statistics of a Buoyant Jet in a Stratified Environment

    NASA Astrophysics Data System (ADS)

    McCleney, Amy Brooke

    Using non-intrusive optical diagnostics, turbulence statistics for a round, incompressible, buoyant, and vertical jet discharging freely into a stably, linearly stratified environment are studied and compared to a reference case of a neutrally buoyant jet in a uniform environment. This is part of a validation campaign for computational fluid dynamics (CFD). Buoyancy forces are known to significantly affect the jet evolution in a stratified environment. Despite their ubiquity in numerous natural and man-made flows, available data on these jets are limited, which constrains our understanding of the underlying physical processes. In particular, there is a dearth of velocity field data, which makes it challenging to validate numerical codes currently used for modeling these important flows. Herein, jet near- and far-field behaviors are obtained with a combination of planar laser induced fluorescence (PLIF) and multi-scale time-resolved particle image velocimetry (TR-PIV) for Reynolds numbers up to 20,000. Deploying non-intrusive optical diagnostics in a variable density environment is challenging in liquids. The refractive index is strongly affected by the density, which introduces optical aberrations and occlusions that prevent the resolution of the flow. One solution consists of using index-matched fluids with different densities. Here, a pair of aqueous solutions, isopropanol and NaCl, is identified that satisfies these requirements. In fact, they provide a density difference of up to 5%, which is the largest reported for such fluid pairs. Additionally, by design, the kinematic viscosities of the solutions are identical. This greatly simplifies the analysis and subsequent simulations of the data. The spectral and temperature dependences of the solutions are fully characterized. In the near-field, shear layer roll-up is analyzed and characterized as a function of initial velocity profile. 
In the far-field, turbulence statistics are reported for two different scales, one capturing the entire jet at near Taylor microscale resolution, and the other, thanks to the careful refractive index matching of the liquids, resolving the Taylor scale at near Kolmogorov scale resolution. This is accomplished using a combination of TR-PIV and long-distance micro-PIV. The turbulence statistics at various downstream locations and magnifications are obtained for density differences of 0%, 1%, and 3%. To validate the experimental methodology and provide a reference case for validation, the effect of initial velocity profile on the neutrally buoyant jet in the self-preserving regime is studied at two Reynolds numbers of 10,000 and 20,000. For the neutrally buoyant jet, it is found that, independent of initial conditions, the jet follows a self-similar behavior in the far-field; however, the spreading rate is strongly dependent on initial velocity profile. High magnification analysis at the small turbulent length scales shows a flow field where the mean statistics compare well to the larger field of view case. Investigation of the near-field shows the jet is strongly influenced by buoyancy, where an increase in vortex ring formation frequency and number of pairings occurs. The buoyant jet with a 1% density difference shows an alteration of the centerline velocity decay, but the radial distribution of the mean axial velocity collapses well at all measurement locations. Jet formation dramatically changes for a buoyant jet with a 3% density difference, where the jet reaches a terminal height and spreads out horizontally at its neutral buoyancy location. Analysis of both the mean axial velocity and strain rates shows the jet is no longer self-similar; for example, the mean centerline velocity does not decay uniformly as the jet develops. 
The centerline strain rates at this density difference also show trends which are strongly influenced by the altered centerline velocity. The overall centerline analysis shows that turbulence suppression occurs as a result of the stratification for both the 1% and 3% density differences. Analysis of the kinetic energy budget shows that the mean convection, production, transportation, and dissipation of energy are altered by stratification. High-resolution data of the jet enable flow structures to be captured in the neutrally buoyant region of the flow. Vortices of different sizes are identified. Longer data sets are necessary to perform a statistical analysis of their distribution and to compare them to the homogeneous-environment case. This multi-scale analysis shows potential for studying energy transfer between length scales.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, Janine Camille; Thompson, David; Pebay, Philippe Pierre

    Statistical analysis is typically used to reduce the dimensionality of and infer meaning from data. A key challenge of any statistical analysis package aimed at large-scale, distributed data is to address the orthogonal issues of parallel scalability and numerical stability. Many statistical techniques, e.g., descriptive statistics or principal component analysis, are based on moments and co-moments and, using robust online update formulas, can be computed in an embarrassingly parallel manner, amenable to a map-reduce style implementation. In this paper we focus on contingency tables, through which numerous derived statistics such as joint and marginal probability, point-wise mutual information, information entropy, and χ² independence statistics can be directly obtained. However, contingency tables can become large as data size increases, requiring a correspondingly large amount of communication between processors. This potential increase in communication prevents optimal parallel speedup and is the main difference with moment-based statistics (which we discussed in [1]), where the amount of inter-processor communication is independent of data size. Here we present the design trade-offs which we made to implement the computation of contingency tables in parallel. We also study the parallel speedup and scalability properties of our open source implementation. In particular, we observe optimal speed-up and scalability when the contingency statistics are used in their appropriate context, namely, when the data input is not quasi-diffuse.
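The map-reduce structure of the computation can be sketched with plain Counters standing in for per-processor partial contingency tables (a toy illustration, not the paper's implementation):

```python
from collections import Counter
from functools import reduce

# Map step: each "processor" builds a partial contingency table, a
# Counter keyed by (row-category, column-category) pairs.
def map_partial(records):
    return Counter((a, b) for a, b in records)

# Two data chunks, as if held by two processors.
chunks = [
    [("x", 1), ("x", 2), ("y", 1)],
    [("x", 1), ("y", 2), ("y", 2)],
]

# Reduce step: merge partial tables by addition.  The communication
# cost of this step grows with the number of distinct cells, which is
# why quasi-diffuse inputs scale poorly, as the paper notes.
table = reduce(lambda c1, c2: c1 + c2, (map_partial(c) for c in chunks))

total = sum(table.values())
print(table[("x", 1)])           # joint count for cell (x, 1) -> 2
print(table[("x", 1)] / total)   # joint probability estimate
```

Marginals, pointwise mutual information, and χ² statistics all follow directly from the merged table by summing over rows or columns.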

  3. Statistical Analysis of Instantaneous Frequency Scaling Factor as Derived From Optical Disdrometer Measurements At KQ Bands

    NASA Technical Reports Server (NTRS)

    Zemba, Michael; Nessel, James; Houts, Jacquelynne; Luini, Lorenzo; Riva, Carlo

    2016-01-01

    The rain rate data and statistics of a location are often used in conjunction with models to predict rain attenuation. However, the true attenuation is a function not only of rain rate, but also of the drop size distribution (DSD). Generally, models utilize an average drop size distribution (Laws and Parsons or Marshall and Palmer [1]). However, individual rain events may deviate from these models significantly if their DSD is not well approximated by the average. Therefore, characterizing the relationship between the DSD and attenuation is valuable in improving modeled predictions of rain attenuation statistics. The DSD may also be used to derive the instantaneous frequency scaling factor and thus validate frequency scaling models. Since June of 2014, NASA Glenn Research Center (GRC) and the Politecnico di Milano (POLIMI) have jointly conducted a propagation study in Milan, Italy utilizing the 20 and 40 GHz beacon signals of the Alphasat TDP#5 Aldo Paraboni payload. The Ka- and Q-band beacon receivers provide a direct measurement of the signal attenuation while concurrent weather instrumentation provides measurements of the atmospheric conditions at the receiver. Among these instruments is a Thies Clima Laser Precipitation Monitor (optical disdrometer) which yields droplet size distributions (DSD); this DSD information can be used to derive a scaling factor that scales the measured 20 GHz data to expected 40 GHz attenuation. Given the capability to both predict and directly observe 40 GHz attenuation, this site is uniquely situated to assess and characterize such predictions. Previous work using this data has examined the relationship between the measured drop-size distribution and the measured attenuation of the link [2]. 
The focus of this paper now turns to a deeper analysis of the scaling factor, including the prediction error as a function of attenuation level, the correlation between the scaling factor and the rain rate, and the temporal variability of the drop size distribution both within a given rain event and across different varieties of rain events. Index Terms: drop size distribution, frequency scaling, propagation losses, radiowave propagation.
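For long-term statistics, a commonly used frequency-scaling relation is the one given in ITU-R P.618. The sketch below follows the form of that formula, but the constants should be verified against the recommendation before use; the 5 dB fade and the 39.4 GHz beacon frequency are illustrative inputs:

```python
def phi(f_ghz: float) -> float:
    """Frequency-dependence term of the ITU-R P.618 scaling formula."""
    return f_ghz ** 2 / (1 + 1e-4 * f_ghz ** 2)

def scale_attenuation(a1_db: float, f1_ghz: float, f2_ghz: float) -> float:
    """Scale rain attenuation measured at f1 to frequency f2 (long-term).

    Implements A2 = A1 * (phi2/phi1)^(1 - H), with
    H = 1.12e-3 * (phi2/phi1)^0.5 * (phi1 * A1)^0.55,
    as the P.618 frequency-scaling formula is commonly stated.
    """
    r = phi(f2_ghz) / phi(f1_ghz)
    h = 1.12e-3 * r ** 0.5 * (phi(f1_ghz) * a1_db) ** 0.55
    return a1_db * r ** (1 - h)

# e.g. scale a 5 dB fade measured on the 20 GHz beacon up to 39.4 GHz
print(f"{scale_attenuation(5.0, 20.0, 39.4):.1f} dB")
```

The instantaneous factor derived from the disdrometer DSD would replace this long-term statistical relation event by event, which is precisely the comparison the paper pursues.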

  5. Memory matters: influence from a cognitive map on animal space use.

    PubMed

    Gautestad, Arild O

    2011-10-21

    A vertebrate individual's cognitive map provides a capacity for site fidelity and long-distance returns to favorable patches. Fractal-geometrical analysis of individual space use based on a collection of telemetry fixes makes it possible to verify the influence of a cognitive map on the spatial scatter of habitat use and also to what extent space use has been of a scale-specific versus a scale-free kind. This approach rests on a statistical mechanical level of system abstraction, where micro-scale details of behavioral interactions are coarse-grained to macro-scale observables like the fractal dimension of space use. In this manner, the magnitude of the fractal dimension becomes a proxy variable for distinguishing between main classes of habitat exploration and site fidelity, like memory-less (Markovian) Brownian motion and Lévy walk, and memory-enhanced space use like Multi-scaled Random Walk (MRW). In this paper, previous analyses are extended by exploring MRW simulations under three scenarios: (1) central place foraging, (2) behavioral adaptation to resource depletion (avoidance of latest visited locations) and (3) transition from MRW towards Lévy walk by narrowing memory capacity to a trailing time window. A generalized statistical-mechanical theory with the power to model cognitive map influence on individual space use will be important for statistical analyses of animal habitat preferences and the mechanics behind site fidelity and home ranges. Copyright © 2011 Elsevier Ltd. All rights reserved.
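The fractal dimension used as the proxy variable can be estimated by box counting over a set of telemetry fixes. A sketch on a synthetic memory-less (Brownian) path, the reference case against which memory-enhanced, scale-free space use would show a different dimension:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic "telemetry fixes": a 2-D Brownian path, standing in for a
# memory-less movement track (not real animal data).
steps = rng.normal(size=(20_000, 2))
fixes = np.cumsum(steps, axis=0)
fixes -= fixes.min(axis=0)
fixes /= fixes.max()            # normalise into the unit square

def box_count(points, k):
    """Number of occupied boxes on a k x k grid over the unit square."""
    idx = np.minimum((points * k).astype(int), k - 1)
    return len({(i, j) for i, j in idx})

ks = np.array([4, 8, 16, 32, 64])
counts = np.array([box_count(fixes, k) for k in ks])

# Box-counting dimension = slope of log N(k) versus log k
D = np.polyfit(np.log(ks), np.log(counts), 1)[0]
print(f"estimated box-counting dimension: {D:.2f}")
```

With finite samples the estimate sits between the scale-specific (D near 1) and space-filling (D near 2) extremes; the paper's point is that the magnitude of D separates the movement classes.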

  6. Spousal Caregiver Burden and Its Relation with Disability in Schizophrenia

    PubMed Central

    Arun, R.; Inbakamal, S.; Tharyan, Anna; Premkumar, Prasanna S.

    2018-01-01

    Background: Schizophrenia, a chronic psychiatric disorder, can affect one's productivity and psychosocial functioning. In the Indian context, the responsibility of caring for persons with schizophrenia falls increasingly on their spouses. Spousal caregiver experience and its relation with disability in schizophrenia need to be studied. Materials and Methods: We conducted a cross-sectional study among 52 outpatients with schizophrenia and their spouses attending a tertiary psychiatric center. The objectives were: (a) to explore spousal caregiver burden in schizophrenia and (b) to assess the relation between disability and spousal caregiver burden. The study adopted recommended ethical principles. Scales such as the Burden Assessment Schedule, the Indian Disability Evaluation and Assessment Scale (IDEAS), and the Positive and Negative Syndrome Scale were used to collect appropriate data. Descriptive analysis, bivariate analysis, and multivariate analysis were done in SPSS software version 16.0. Results: The mean spousal caregiver burden score was 73.5 (standard deviation: 14.0). In bivariate analysis, disability, duration of schizophrenia, severity of schizophrenia, place of residence, and socioeconomic status had statistically significant relations with spousal caregiver burden. Adjusted for spouses' age, gender, and other significant factors in bivariate analysis, the IDEAS global disability score (2.6 [confidence interval 0.5–3.8], P = 0.013) retained a statistically significant association with spousal caregiver burden. Conclusion: Spouses of persons with schizophrenia experience significant caregiver burden. Disability was found to be the most powerful determinant of spousal caregiver burden in the sample. Focus on disability alleviation in the management of schizophrenia may help reduce spousal caregiver burden. PMID:29403125

  7. A Preliminary Analysis of the Theoretical Parameters of Organizational Learning.

    DTIC Science & Technology

    1995-09-01

    PARAMETERS OF ORGANIZATIONAL LEARNING THESIS Presented to the Faculty of the Graduate School of Logistics and Acquisition Management of the Air...Organizational Learning Parameters in the Knowledge Acquisition Category ... 2-3. Organizational Learning Parameters in the Information Distribution Category...Learning Refined Scale 4-94 4-145. Composition of Refined Scale 4 Knowledge Flow 4-95 4-146. Cronbach's Alpha Statistics for the Complete Knowledge Flow

  8. Precarious employment in Chile: psychometric properties of the Chilean version of Employment Precariousness Scale in private sector workers.

    PubMed

    Vives-Vergara, Alejandra; González-López, Francisca; Solar, Orielle; Bernales-Baksai, Pamela; González, María José; Benach, Joan

    2017-04-20

    The purpose of this study is to perform a psychometric analysis (acceptability, reliability and factor structure) of the Chilean version of the new Employment Precariousness Scale (EPRES). The data are drawn from a sample of 4,248 private salaried workers with a formal contract from the first Chilean Employment Conditions, Work, Health and Quality of Life (ENETS) survey, applied to a nationally representative sample of the Chilean workforce in 2010. Item- and scale-level statistics were computed to assess scaling properties, acceptability and reliability. The six-dimensional factor structure was examined with confirmatory factor analysis. The scale exhibited high acceptability (roughly 80%) and reliability (Cronbach's alpha 0.83) and the factor structure was confirmed. One subscale (rights) demonstrated poorer metric properties without compromising the overall scale. The Chilean version of the Employment Precariousness Scale (EPRES-Ch) demonstrated good metric properties, pointing to its suitability for use in epidemiologic and public health research.
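Cronbach's alpha, the reliability statistic reported here, is a short computation over an item-score matrix; a sketch on simulated correlated items (all values invented, not the EPRES data):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total score
    return k / (k - 1) * (1 - item_vars / total_var)

# Illustrative 6-item scale driven by one latent trait: correlated items
# yield a high alpha; uncorrelated items would push it toward zero.
rng = np.random.default_rng(3)
trait = rng.normal(size=(500, 1))
items = trait + 0.8 * rng.normal(size=(500, 6))
alpha = cronbach_alpha(items)
print(f"alpha = {alpha:.2f}")
```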

  9. Scalings of intermittent structures in magnetohydrodynamic turbulence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhdankin, Vladimir, E-mail: zhdankin@jila.colorado.edu; Boldyrev, Stanislav; Space Science Institute, Boulder, Colorado 80301

    Turbulence is ubiquitous in plasmas, leading to rich dynamics characterized by irregularity, irreversibility, energy fluctuations across many scales, and energy transfer across many scales. Another fundamental and generic feature of turbulence, although sometimes overlooked, is the inhomogeneous dissipation of energy in space and in time. This is a consequence of intermittency, the scale-dependent inhomogeneity of dynamics caused by fluctuations in the turbulent cascade. Intermittency causes turbulent plasmas to self-organize into coherent dissipative structures, which may govern heating, diffusion, particle acceleration, and radiation emissions. In this paper, we present recent progress on understanding intermittency in incompressible magnetohydrodynamic turbulence with a strong guide field. We focus on the statistical analysis of intermittent dissipative structures, which occupy a small fraction of the volume but arguably account for the majority of energy dissipation. We show that, in our numerical simulations, intermittent structures in the current density, vorticity, and Elsässer vorticities all have nearly identical statistical properties. We propose phenomenological explanations for the scalings based on general considerations of Elsässer vorticity structures. Finally, we examine the broader implications of intermittency for astrophysical systems.

  10. Ensuring Positiveness of the Scaled Difference Chi-Square Test Statistic

    ERIC Educational Resources Information Center

    Satorra, Albert; Bentler, Peter M.

    2010-01-01

    A scaled difference test statistic T̃_d that can be computed from standard software of structural equation models (SEM) by hand calculations was proposed in Satorra and Bentler (Psychometrika 66:507-514, 2001). The statistic T̃_d is asymptotically equivalent to the scaled difference test statistic T̄_…
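The 2001 statistic referred to above is computed by hand from the chi-squares, degrees of freedom, and scaling corrections of two nested fitted models; a hedged sketch (the input numbers below are invented, not from the paper):

```python
def scaled_chi2_difference(t0, df0, c0, t1, df1, c1):
    """Satorra-Bentler (2001) scaled chi-square difference test.

    t0, df0, c0: chi-square, df, and scaling correction of the more
                 restricted model; t1, df1, c1: the same for the less
                 restricted model it is nested in.
    Returns (scaled difference statistic, df of the difference).

    The denominator cd can come out negative in small samples, which is
    the "positiveness" problem the 2010 paper addresses with a modified
    correction.
    """
    cd = (df0 * c0 - df1 * c1) / (df0 - df1)
    return (t0 - t1) / cd, df0 - df1

td, dfd = scaled_chi2_difference(t0=95.2, df0=40, c0=1.20,
                                 t1=80.1, df1=35, c1=1.15)
print(f"T_d = {td:.2f} on {dfd} df")
```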

  11. A Study on Multi-Scale Background Error Covariances in 3D-Var Data Assimilation

    NASA Astrophysics Data System (ADS)

    Zhang, Xubin; Tan, Zhe-Min

    2017-04-01

    The construction of background error covariances is a key component of three-dimensional variational data assimilation. Background errors exist at different scales, and interact with one another, in numerical weather prediction. However, the influence of these errors and their interactions cannot be represented in the background error covariance statistics when these are estimated by the leading methods. So, it is necessary to construct background error covariances influenced by multi-scale interactions among errors. With the NMC method, this article first estimates the background error covariances at given model-resolution scales. Then the information of errors at scales larger and smaller than the given ones is introduced, respectively, using different nesting techniques, to estimate the corresponding covariances. The comparison of the three background error covariance statistics influenced by information of errors at different scales reveals that the background error variances increase, particularly at large scales and higher levels, when the information of larger-scale errors is introduced through the lateral boundary condition provided by a lower-resolution model. On the other hand, the variances decrease at medium scales at the higher levels, while they show slight increases at lower levels in the nested domain, especially at medium and small scales, when the information of smaller-scale errors is introduced by nesting a higher-resolution model. In addition, the introduction of information of larger- (smaller-) scale errors leads to larger (smaller) horizontal and vertical correlation scales of background errors. Considering the multivariate correlations, the Ekman coupling increases (decreases) with the information of larger- (smaller-) scale errors included, whereas the geostrophic coupling in the free atmosphere weakens in both situations. 
The three covariances obtained in the above work are used in a data assimilation and model forecast system, respectively, and then analysis-forecast cycles for a period of 1 month are conducted. Through the comparison of both analyses and forecasts from this system, it is found that the trends in analysis increments as information of different-scale errors is introduced are consistent with the trends in the variances and correlations of background errors. In particular, introduction of smaller-scale errors leads to larger-amplitude analysis increments for winds at medium scales at the heights of both the high- and low-level jets. Analysis increments for both temperature and humidity are also greater at the corresponding scales at middle and upper levels in this case. These analysis increments improve the intensity of the jet-convection system, which includes jets at different levels and the coupling between them associated with latent heat release, and these changes in the analyses contribute to better forecasts of winds and temperature in the corresponding areas. When smaller-scale errors are included, analysis increments for humidity increase significantly at large scales at lower levels, moistening the southern analyses. This humidification helps correct the dry bias there and eventually improves the forecast skill for humidity. Moreover, inclusion of larger- (smaller-) scale errors is beneficial for the forecast quality of heavy (light) precipitation at large (small) scales due to the amplification (diminution) of intensity and area in precipitation forecasts, but tends to overestimate (underestimate) light (heavy) precipitation.
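The NMC estimation step mentioned above amounts to taking a sample covariance of differences between pairs of forecasts valid at the same time (e.g. 48 h minus 24 h); a minimal sketch with synthetic fields standing in for model output:

```python
import numpy as np

# NMC-method sketch: background error covariance estimated from
# forecast differences.  The "forecast" fields here are synthetic
# stand-ins, not output of any real NWP model.
rng = np.random.default_rng(4)
n_state, n_pairs = 50, 200
f48 = rng.normal(size=(n_pairs, n_state))              # 48 h forecasts
f24 = f48 + 0.5 * rng.normal(size=(n_pairs, n_state))  # paired 24 h runs

d = f48 - f24                      # forecast differences, same valid time
d -= d.mean(axis=0)                # remove the mean difference
B = (d.T @ d) / (n_pairs - 1)      # sample background error covariance

print(B.shape)                                          # (50, 50)
print(f"mean error variance: {np.diag(B).mean():.2f}")
```

The multi-scale construction the paper proposes then modifies what goes into the difference fields (nested domains, lateral boundary conditions) before this covariance is formed.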

  12. Statistical Model to Analyze Quantitative Proteomics Data Obtained by 18O/16O Labeling and Linear Ion Trap Mass Spectrometry

    PubMed Central

    Jorge, Inmaculada; Navarro, Pedro; Martínez-Acedo, Pablo; Núñez, Estefanía; Serrano, Horacio; Alfranca, Arántzazu; Redondo, Juan Miguel; Vázquez, Jesús

    2009-01-01

    Statistical models for the analysis of protein expression changes by stable isotope labeling are still poorly developed, particularly for data obtained by 16O/18O labeling. Besides, large-scale test experiments to validate the null hypothesis are lacking. Although the study of mechanisms underlying biological actions promoted by vascular endothelial growth factor (VEGF) on endothelial cells is of considerable interest, quantitative proteomics studies on this subject are scarce and have been performed after exposing cells to the factor for long periods of time. In this work we present the largest quantitative proteomics study to date on the short-term effects of VEGF on human umbilical vein endothelial cells by 18O/16O labeling. Current statistical models based on normality and variance homogeneity were found unsuitable to describe the null hypothesis in a large-scale test experiment performed on these cells, producing false expression changes. A random effects model was developed including four different sources of variance at the spectrum-fitting, scan, peptide, and protein levels. With the new model the number of outliers at scan and peptide levels was negligible in three large-scale experiments, and only one false protein expression change was observed in the test experiment among more than 1000 proteins. The new model allowed the detection of significant protein expression changes upon VEGF stimulation for 4 and 8 h. The consistency of the changes observed at 4 h was confirmed by a replica at a smaller scale and further validated by Western blot analysis of some proteins. Most of the observed changes have not been described previously and are consistent with a pattern of protein expression that dynamically changes over time following the evolution of the angiogenic response. 
With this statistical model the 18O labeling approach emerges as a very promising and robust alternative to perform quantitative proteomics studies at a depth of several thousand proteins. PMID:19181660
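The layered random-effects idea, one variance component per level, can be sketched with simulated log-ratios; the component values below are invented, not estimates from the paper, and only two of the four levels are recovered for brevity:

```python
import numpy as np

# Nested random-effects sketch: each measured log2-ratio is a protein
# effect + a peptide deviation + scan-level noise.  Component standard
# deviations (0.10, 0.20, 0.30) are illustrative.
rng = np.random.default_rng(5)
n_prot, n_pep, n_scan = 200, 4, 3
prot = rng.normal(0, 0.10, size=(n_prot, 1, 1))
pep = rng.normal(0, 0.20, size=(n_prot, n_pep, 1))
scan = rng.normal(0, 0.30, size=(n_prot, n_pep, n_scan))
x = prot + pep + scan

# Method-of-moments recovery of scan- and peptide-level variances:
# within-peptide scatter estimates the scan variance; the spread of
# peptide means, corrected for scan noise, estimates the peptide variance.
scan_var = x.var(axis=2, ddof=1).mean()
pep_means = x.mean(axis=2)
pep_var = pep_means.var(axis=1, ddof=1).mean() - scan_var / n_scan
print(f"scan sd ~ {np.sqrt(scan_var):.2f}, "
      f"peptide sd ~ {np.sqrt(pep_var):.2f}")
```

A full analysis in the paper's spirit would add the spectrum-fitting and protein levels and test each measurement against its level-appropriate variance rather than a single pooled one.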

  13. Development and validation of the coronary heart disease scale under the system of quality of life instruments for chronic diseases QLICD-CHD: combinations of classical test theory and Generalizability Theory.

    PubMed

    Wan, Chonghua; Li, Hezhan; Fan, Xuejin; Yang, Ruixue; Pan, Jiahua; Chen, Wenru; Zhao, Rong

    2014-06-04

    Quality of life (QOL) for patients with coronary heart disease (CHD) is now a worldwide concern, yet disease-specific instruments are scarce and none has been developed by the modular approach. This paper aims to develop the CHD scale of the system of Quality of Life Instruments for Chronic Diseases (QLICD-CHD) by the modular approach and to validate it by both classical test theory and Generalizability Theory. The QLICD-CHD was developed based on programmed decision procedures with multiple nominal and focus group discussions, in-depth interviews, pre-testing and quantitative statistical procedures. 146 inpatients with CHD provided the data, with QOL measured three times before and after treatment. The psychometric properties of the scale were evaluated with respect to validity, reliability and responsiveness, employing correlation analysis, factor analyses, multi-trait scaling analysis, t-tests, and G studies and D studies of Generalizability Theory analysis. Multi-trait scaling analysis, correlation and factor analyses confirmed good construct validity and criterion-related validity when using the SF-36 as a criterion. The internal consistency α and test-retest reliability coefficients (Pearson r and intra-class correlations, ICC) for the overall instrument and all domains were higher than 0.70 and 0.80, respectively. The overall scale and all domains except the social domain showed statistically significant changes after treatment, with moderate effect sizes (standardized response mean, SRM) ranging from 0.32 to 0.67. G-coefficients and indexes of dependability (Ф coefficients) further confirmed the reliability of the scale with more exact variance components. The QLICD-CHD has good validity and reliability, moderate responsiveness, and several strengths, and can be used as a quality of life instrument for patients with CHD. 
However, in order to obtain better reliability, the numbers of items for social domain should be increased or the items' quality, not quantity, should be improved.

  14. Modes and emergent time scales of embayed beach dynamics

    NASA Astrophysics Data System (ADS)

    Ratliff, Katherine M.; Murray, A. Brad

    2014-10-01

    In this study, we use a simple numerical model (the Coastline Evolution Model) to explore alongshore transport-driven shoreline dynamics within generalized embayed beaches (neglecting cross-shore effects). Using principal component analysis (PCA), we identify two primary orthogonal modes of shoreline behavior that describe shoreline variation about its unchanging mean position: the rotation mode, which has been previously identified and describes changes in the mean shoreline orientation, and a newly identified breathing mode, which represents changes in shoreline curvature. Wavelet analysis of the PCA mode time series reveals characteristic time scales of these modes (typically years to decades) that emerge within even a statistically constant white-noise wave climate (without changes in external forcing), suggesting that these time scales can arise from internal system dynamics. The time scales of both modes increase linearly with shoreface depth, suggesting that the embayed beach sediment transport dynamics exhibit a diffusive scaling.
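
    The two PCA modes described above can be illustrated with a minimal sketch: build a synthetic shoreline whose variability is, by construction, a mix of a rotation (tilt) pattern and a breathing (curvature) pattern, then recover them as orthogonal modes via an SVD-based PCA. All data and parameter values here are invented for illustration, not taken from the model runs in the study.

```python
import numpy as np

# Synthetic shoreline: rows = time steps, columns = alongshore positions.
rng = np.random.default_rng(0)
n_t, n_x = 200, 50
x = np.linspace(-1.0, 1.0, n_x)
rotation = x                            # linear tilt: change in mean orientation
breathing = x**2 - np.mean(x**2)        # curvature change, zero-mean
a = rng.standard_normal(n_t)            # mode amplitudes over time
b = rng.standard_normal(n_t)
shoreline = np.outer(a, rotation) + 0.5 * np.outer(b, breathing)

# PCA via SVD of the time-by-space anomaly matrix
anom = shoreline - shoreline.mean(axis=0)
U, s, Vt = np.linalg.svd(anom, full_matrices=False)
var_frac = s**2 / np.sum(s**2)          # variance explained per mode
print(np.round(var_frac[:3], 3))
```

    Because the anomaly matrix here is exactly rank two, the first two modes account for essentially all the variance; in the study's model output the same decomposition separates the rotation and breathing modes.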

  15. Time-oriented hierarchical method for computation of principal components using subspace learning algorithm.

    PubMed

    Jankovic, Marko; Ogawa, Hidemitsu

    2004-10-01

    Principal Component Analysis (PCA) and Principal Subspace Analysis (PSA) are classic techniques in statistical data analysis, feature extraction and data compression. Given a set of multivariate measurements, PCA and PSA provide a smaller set of "basis vectors" with less redundancy, and a subspace spanned by them, respectively. Artificial neurons and neural networks have been shown to perform PSA and PCA when gradient ascent (descent) learning rules are used, which is related to the constrained maximization (minimization) of statistical objective functions. Due to their low complexity, such algorithms and their implementation in neural networks are potentially useful in cases of tracking slow changes of correlations in the input data or in updating eigenvectors with new samples. In this paper we propose a PCA learning algorithm that is fully homogeneous with respect to neurons. The algorithm is obtained by modifying one of the best-known PSA learning algorithms, the Subspace Learning Algorithm (SLA). The modification is based on the Time-Oriented Hierarchical Method (TOHM), which uses two distinct time scales. On the faster time scale, the PSA algorithm governs the "behavior" of all output neurons. On the slower scale, output neurons compete to fulfill their "own interests": on this scale, the basis vectors in the principal subspace are rotated toward the principal eigenvectors. The paper closes with a brief analysis of how (and why) the time-oriented hierarchical method can be used to transform any existing neural network PSA method into a PCA method.
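
    As context for the modification described above, the baseline SLA update (Oja's subspace rule) can be sketched in a few lines. This is the unmodified PSA rule, not the paper's TOHM variant, and the dimensions, learning rates, and data below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Data with a dominant 2-D principal subspace (first two coordinates)
rng = np.random.default_rng(1)
d, m, eta = 5, 2, 0.005
C = np.diag([2.0, 1.5, 0.1, 0.1, 0.1])
X = rng.multivariate_normal(np.zeros(d), C, size=5000)

W = 0.1 * rng.standard_normal((d, m))   # d inputs, m output neurons
for lr in (eta, eta / 5):               # second pass with a smaller step
    for x in X:
        y = W.T @ x                     # neuron outputs
        W += lr * (np.outer(x, y) - W @ np.outer(y, y))   # SLA update

# Columns of W converge to an orthonormal basis of the principal subspace
# (not to individual eigenvectors; the TOHM rotation step addresses that).
print(np.round(W.T @ W, 2))
```

    The printed Gram matrix is approximately the identity, reflecting the near-orthonormality of the learned basis.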

  16. Development of a Multidimensional Functional Health Scale for Older Adults in China.

    PubMed

    Mao, Fanzhen; Han, Yaofeng; Chen, Junze; Chen, Wei; Yuan, Manqiong; Alicia Hong, Y; Fang, Ya

    2016-05-01

    A first step to achieve successful aging is assessing the functional wellbeing of older adults. This study reports the development of a culturally appropriate brief scale (the Multidimensional Functional Health Scale for Chinese Elderly, MFHSCE) to assess the functional health of Chinese elderly. Using systematic literature review, the Delphi method, cultural adaptation, synthetic statistical item selection, Cronbach's alpha, and confirmatory factor analysis, we developed the item pool, conducted two rounds of item selection, and performed a psychometric evaluation. Synthetic statistical item selection and psychometric evaluation were carried out on 539 and 2,032 older adults, respectively. The MFHSCE consists of 30 items, covering activities of daily living, social relationships, physical health, mental health, cognitive function, and economic resources. The Cronbach's alpha was 0.92, and the comparative fit index was 0.917. The MFHSCE has good internal consistency and construct validity; it is also concise and easy to use in general practice, especially in communities in China.
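
    One of the evaluation tools mentioned above, Cronbach's alpha, has a short closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A generic sketch using simulated item scores rather than the MFHSCE data:

```python
import numpy as np

def cronbach_alpha(scores):
    """scores: (n_respondents, k_items) array of item scores."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

# Simulated respondents: 8 items driven by one latent trait plus noise
rng = np.random.default_rng(2)
latent = rng.standard_normal((500, 1))
items = latent + 0.5 * rng.standard_normal((500, 8))
print(round(cronbach_alpha(items), 2))
```

    Strongly correlated items yield a high alpha; perfectly redundant items (e.g., four copies of the same item) yield exactly 1.0.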

  17. Volatility return intervals analysis of the Japanese market

    NASA Astrophysics Data System (ADS)

    Jung, W.-S.; Wang, F. Z.; Havlin, S.; Kaizoji, T.; Moon, H.-T.; Stanley, H. E.

    2008-03-01

    We investigate scaling and memory effects in return intervals between price volatilities above a certain threshold q for the Japanese stock market using daily and intraday data sets. We find that the distribution of return intervals can be approximated by a scaling function that depends only on the ratio between the return interval τ and its mean <τ>. We also find memory effects such that a large (or small) return interval follows a large (or small) interval by investigating the conditional distribution and mean return interval. The results are similar to previous studies of other markets and indicate that similar statistical features appear in different financial markets. We also compare our results between the period before and after the big crash at the end of 1989. We find that scaling and memory effects of the return intervals show similar features although the statistical properties of the returns are different.
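
    The basic quantity in this analysis, the return interval between volatility exceedances of a threshold q, is straightforward to compute. A hedged sketch on synthetic uncorrelated data, which by construction lacks the memory effects reported above:

```python
import numpy as np

# Stand-in for |returns|: iid absolute Gaussian values, not market data
rng = np.random.default_rng(3)
volatility = np.abs(rng.standard_normal(100_000))
q = 2.0
t = np.flatnonzero(volatility > q)    # times of threshold exceedances
tau = np.diff(t)                      # return intervals between exceedances
scaled = tau / tau.mean()             # τ / <τ>: the scaling variable
print(len(tau), round(scaled.mean(), 3))
```

    Plotting the distribution of the scaled intervals for several q values would test the collapse onto a single scaling function described in the abstract.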

  18. Confirmatory Factor Analysis of the Malay Version of the Confusion, Hubbub and Order Scale (CHAOS-6) among Myocardial Infarction Survivors in a Malaysian Cardiac Healthcare Facility.

    PubMed

    Ganasegeran, Kurubaran; Selvaraj, Kamaraj; Rashid, Abdul

    2017-08-01

    The six-item Confusion, Hubbub and Order Scale (CHAOS-6) has been validated as a reliable tool to measure levels of household disorder. We aimed to investigate the goodness of fit and reliability of a new Malay version of the CHAOS-6. The original English version of the CHAOS-6 underwent forward-backward translation into the Malay language. The finalised Malay version was administered to 105 myocardial infarction survivors in a Malaysian cardiac health facility. We performed confirmatory factor analyses (CFAs) using structural equation modelling. A path diagram and fit statistics were generated to determine the Malay version's validity, and composite reliability was tested to determine the scale's reliability. All 105 myocardial infarction survivors participated in the study. The CFA yielded a six-item, one-factor model with excellent fit statistics. Composite reliability for the single-factor CHAOS-6 was 0.65, indicating acceptable reliability for Malay speakers. The Malay version of the CHAOS-6 was reliable and showed the best fit statistics for our study sample. We thus offer a simple, brief, validated, reliable and novel instrument to measure household chaos, the Skala Kecelaruan, Keriuhan & Tertib Terubahsuai (CHAOS-6), for the Malaysian population.

  19. Clinical Evaluation of Dental Restorative Materials

    DTIC Science & Technology

    1989-01-01

    use of an Actuarial Life Table Survival Analysis procedure. The median survival time for anterior composites was 13.5 years, as compared to 12.1 years...dental materials. For the first time in clinical biomaterials research, we used a statistical approach of Survival Analysis which utilized the... analysis has been established to assure uniformity in usage. This scale is now in use by clinical investigators throughout the country. Its use at the

  20. Assessment of statistical methods used in library-based approaches to microbial source tracking.

    PubMed

    Ritter, Kerry J; Carruthers, Ethan; Carson, C Andrew; Ellender, R D; Harwood, Valerie J; Kingsley, Kyle; Nakatsu, Cindy; Sadowsky, Michael; Shear, Brian; West, Brian; Whitlock, John E; Wiggins, Bruce A; Wilbur, Jayson D

    2003-12-01

    Several commonly used statistical methods for fingerprint identification in microbial source tracking (MST) were examined to assess the effectiveness of pattern-matching algorithms to correctly identify sources. Although numerous statistical methods have been employed for source identification, no widespread consensus exists as to which is most appropriate. A large-scale comparison of several MST methods, using identical fecal sources, presented a unique opportunity to assess the utility of several popular statistical methods. These included discriminant analysis, nearest neighbour analysis, maximum similarity and average similarity, along with several measures of distance or similarity. Threshold criteria for excluding uncertain or poorly matched isolates from final analysis were also examined for their ability to reduce false positives and increase prediction success. Six independent libraries used in the study were constructed from indicator bacteria isolated from fecal materials of humans, seagulls, cows and dogs. Three of these libraries were constructed using the rep-PCR technique and three relied on antibiotic resistance analysis (ARA). Five of the libraries were constructed using Escherichia coli and one using Enterococcus spp. (ARA). Overall, the outcome of this study suggests a high degree of variability across statistical methods. Despite large differences in correct classification rates among the statistical methods, no single statistical approach emerged as superior. Thresholds failed to consistently increase rates of correct classification and improvement was often associated with substantial effective sample size reduction. Recommendations are provided to aid in selecting appropriate analyses for these types of data.
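
    As a rough illustration of one of the compared approaches, the sketch below classifies a fingerprint by its nearest library match and applies a threshold criterion to exclude poorly matched isolates. The binary fingerprint encoding, source profiles, and threshold value are all invented for the example and do not reproduce the rep-PCR or ARA data used in the study.

```python
import numpy as np

# Invented library: 40 binary fingerprints per source, each source biased
# toward a different band pattern.
rng = np.random.default_rng(4)
sources = ["human", "gull", "cow", "dog"]
library, labels = [], []
for i, s in enumerate(sources):
    p = np.full(20, 0.2)
    p[i * 5:(i + 1) * 5] = 0.9          # source-specific high-probability bands
    library.append(rng.random((40, 20)) < p)
    labels += [s] * 40
library = np.vstack(library).astype(float)

def classify(fp, threshold=0.6):
    # Simple-matching similarity of the unknown to every library isolate
    sims = (library == fp).mean(axis=1)
    best = int(np.argmax(sims))
    # Threshold criterion: exclude poorly matched isolates from final calls
    return labels[best] if sims[best] >= threshold else "unclassified"

print(classify(library[0]))
```

    Raising the threshold trades fewer false positives for more "unclassified" isolates, the trade-off the study examined.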

  1. Combination of statistical and physically based methods to assess shallow slide susceptibility at the basin scale

    NASA Astrophysics Data System (ADS)

    Oliveira, Sérgio C.; Zêzere, José L.; Lajas, Sara; Melo, Raquel

    2017-07-01

    Approaches used to assess shallow slide susceptibility at the basin scale are conceptually different depending on the use of statistical or physically based methods. The former are based on the assumption that the same causes are more likely to produce the same effects, whereas the latter are based on the comparison between forces which tend to promote movement along the slope and the counteracting forces that are resistant to motion. Within this general framework, this work tests two hypotheses: (i) although conceptually and methodologically distinct, the statistical and deterministic methods generate similar shallow slide susceptibility results regarding the model's predictive capacity and spatial agreement; and (ii) the combination of shallow slide susceptibility maps obtained with statistical and physically based methods, for the same study area, generates a more reliable susceptibility model for shallow slide occurrence. These hypotheses were tested at a small test site (13.9 km2) located north of Lisbon (Portugal), using a statistical method (the information value method, IV) and a physically based method (the infinite slope method, IS). The landslide susceptibility maps produced with the statistical and deterministic methods were combined into a new landslide susceptibility map. The latter was based on a set of integration rules defined by the cross tabulation of the susceptibility classes of both maps and analysis of the corresponding contingency tables. The results demonstrate a higher predictive capacity of the new shallow slide susceptibility map, which combines the independent results obtained with statistical and physically based models. Moreover, the combination of the two models allowed the identification of areas where the results of the information value and the infinite slope methods are contradictory. Thus, these areas were classified as uncertain and deserve additional investigation at a more detailed scale.
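
    The information value method used here scores each class of a predictive factor by the log ratio of the landslide density within the class to the density over the whole area. A minimal sketch on invented raster data; this is one common formulation of IV, not necessarily the exact variant applied in the study:

```python
import numpy as np

def information_value(landslide, factor_class):
    """landslide: boolean per pixel; factor_class: int class id per pixel."""
    iv = {}
    total_density = landslide.mean()
    for c in np.unique(factor_class):
        in_class = factor_class == c
        class_density = landslide[in_class].mean()
        if class_density > 0:
            iv[c] = np.log(class_density / total_density)
        else:
            iv[c] = -np.inf     # class never hosts landslides
    return iv

# Invented example: three slope classes, steeper classes slide more often
rng = np.random.default_rng(5)
slope_class = rng.integers(0, 3, size=10_000)
p = np.array([0.01, 0.05, 0.20])[slope_class]
landslide = rng.random(10_000) < p
iv = information_value(landslide, slope_class)
print({int(c): round(v, 2) for c, v in iv.items()})
```

    Summing the IV scores of all factors at each pixel gives the susceptibility value that is then ranked into classes.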

  2. [Histologic assessment of tissue healing of hyaline cartilage by use of semiquantitative evaluation scale].

    PubMed

    Vukasović, Andreja; Ivković, Alan; Jezek, Davor; Cerovecki, Ivan; Vnuk, Drazen; Kreszinger, Mario; Hudetz, Damir; Pećina, Marko

    2011-01-01

    Articular cartilage is an avascular and aneural tissue lacking lymph drainage, hence its inability to repair spontaneously following injury; it thus offers an interesting model for scientific research. A number of methods have been suggested to enhance cartilage repair, but none has yet produced significant success, and their application makes it necessary to evaluate their results. The objective of this study was to analyze the effects of TGF-beta gene transduced bone marrow clot on articular cartilage defects using the ICRS visual histological assessment scale. The research was conducted on 28 skeletally mature sheep that were randomly assigned to four groups and in which femoral chondral defects were surgically created. The articular surfaces were then treated with TGF-beta1 gene transduced bone marrow clot (TGF group), GFP transduced bone marrow clot (GFP group), or untransduced bone marrow clot (BM group), or left untreated (NC group). The analysis was performed by visual examination of cartilage samples, and results were scored using the ICRS visual histological assessment scale. The results were subsequently subjected to statistical assessment using Kruskal-Wallis and Mann-Whitney tests. The Kruskal-Wallis test yielded a statistically significant difference with respect to cell distribution. The Mann-Whitney test showed statistically significant differences between the TGF and NC groups (P = 0.002), as well as between the BM and NC groups (P = 0.002 with Bonferroni correction). Twenty-six of the twenty-eight samples were subjected to histologic and subsequent statistical analysis; two were discarded due to faulty histology technique. Our results support a positive effect of TGF-beta1 gene transduced bone marrow clot in the restoration of articular cartilage defects. However, additional research in this field is necessary. 
One of the significant drawbacks of histologic assessment of cartilage samples was the errors in histologic preparation, because of which some samples had to be discarded and the analytical quality of the others was significantly impaired. Defects of structures surrounding the articular cartilage, e.g., subchondral bone or connective tissue, might also impair the quality of histologic analysis. Additional analyses, e.g., polarizing microscopy, should be performed to determine the degree of integration of the newly formed tissue with the surrounding cartilage. The semiquantitative ICRS scale, although of great practical value, has limitations as to the objectivity of the assessment, given its dependence on the analytical ability of the evaluator and the accuracy of semiquantitative analysis compared with quantitative methods. Overall, the histologic analysis indicated that the application of TGF-beta1 gene transduced bone marrow clot could have measurable clinical effects on articular cartilage repair. The ICRS visual histological assessment scale is a valuable analytical method for cartilage repair evaluation, and further analyses of the method's value would be of great importance.
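
    The test sequence described above (a Kruskal-Wallis test across the four groups, followed by pairwise Mann-Whitney tests with Bonferroni correction) can be sketched with SciPy. The scores below are simulated stand-ins, not the study's ICRS data:

```python
import numpy as np
from scipy import stats

# Simulated ICRS-like scores for four hypothetical treatment groups
rng = np.random.default_rng(6)
groups = {
    "TGF": rng.normal(12, 2, 7),
    "GFP": rng.normal(10, 2, 7),
    "BM":  rng.normal(11, 2, 7),
    "NC":  rng.normal(6, 2, 7),
}

# Omnibus test across all groups
H, p_kw = stats.kruskal(*groups.values())

# Pairwise follow-up with Bonferroni correction (6 pairs among 4 groups)
n_comparisons = 6
u, p_raw = stats.mannwhitneyu(groups["TGF"], groups["NC"])
p_bonf = min(1.0, p_raw * n_comparisons)
print(round(p_kw, 4), round(p_bonf, 4))
```

    Multiplying the raw p-value by the number of comparisons (capped at 1) is the standard Bonferroni adjustment applied in the study.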

  3. Testing statistical isotropy in cosmic microwave background polarization maps

    NASA Astrophysics Data System (ADS)

    Rath, Pranati K.; Samal, Pramoda Kumar; Panda, Srikanta; Mishra, Debesh D.; Aluri, Pavan K.

    2018-04-01

    We apply our symmetry-based Power tensor technique to test the conformity of PLANCK polarization maps with statistical isotropy. On a wide range of angular scales (l = 40 - 150), our preliminary analysis detects many statistically anisotropic multipoles in the foreground-cleaned full-sky PLANCK polarization maps, viz., COMMANDER and NILC. We also study the effect of residual foregrounds that may still be present in the Galactic plane, using both the common UPB77 polarization mask and the polarization masks specific to the individual component separation methods. However, some of the statistically anisotropic modes still persist, most significantly in the NILC map. We further probed the data for any coherent alignments across multipoles in several bins from the chosen multipole range.

  4. Detecting transitions in protein dynamics using a recurrence quantification analysis based bootstrap method.

    PubMed

    Karain, Wael I

    2017-11-28

    Proteins undergo conformational transitions over different time scales. These transitions are closely intertwined with the protein's function. Numerous standard techniques such as principal component analysis are used to detect these transitions in molecular dynamics simulations. In this work, we add a new method that has the ability to detect transitions in dynamics based on the recurrences in the dynamical system. It combines bootstrapping and recurrence quantification analysis. We start from the assumption that a protein has a "baseline" recurrence structure over a given period of time. Any statistically significant deviation from this recurrence structure, as inferred from complexity measures provided by recurrence quantification analysis, is considered a transition in the dynamics of the protein. We apply this technique to a 132 ns long molecular dynamics simulation of the β-Lactamase Inhibitory Protein BLIP. We are able to detect conformational transitions in the nanosecond range in the recurrence dynamics of the BLIP protein during the simulation. The results compare favorably to those extracted using the principal component analysis technique. The recurrence quantification analysis based bootstrap technique is able to detect transitions between different dynamics states for a protein over different time scales. It is not limited to linear dynamics regimes, and can be generalized to any time scale. It also has the potential to be used to cluster frames in molecular dynamics trajectories according to the nature of their recurrence dynamics. One shortcoming of this method is the need for large enough time windows to ensure good statistical quality of the recurrence complexity measures needed to detect the transitions.
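
    As a minimal illustration of the recurrence structure the method builds on, the sketch below computes the recurrence rate (the simplest RQA measure) for a strongly recurrent trajectory and for noise. The signals and the distance threshold are illustrative assumptions, not the BLIP trajectory data:

```python
import numpy as np

def recurrence_rate(traj, eps):
    """Fraction of distinct point pairs closer than eps (recurrence rate)."""
    d = np.linalg.norm(traj[:, None, :] - traj[None, :, :], axis=-1)
    R = d < eps
    n = len(traj)
    return (R.sum() - n) / (n * (n - 1))    # exclude the diagonal

rng = np.random.default_rng(7)
t = np.linspace(0, 20 * np.pi, 400)
orbit = np.column_stack([np.sin(t), np.cos(t)])   # strongly recurrent dynamics
noise = rng.standard_normal((400, 2))             # weakly recurrent
print(round(recurrence_rate(orbit, 0.3), 3),
      round(recurrence_rate(noise, 0.3), 3))
```

    In the bootstrap scheme described above, such measures are computed in sliding windows and compared against their resampled baseline distribution.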

  5. Assessment of change in dynamic psychotherapy.

    PubMed

    Høglend, P; Bøgwald, K P; Amlo, S; Heyerdahl, O; Sørbye, O; Marble, A; Sjaastad, M C; Bentsen, H

    2000-01-01

    Five scales have been developed to assess changes that are consistent with the therapeutic rationales and procedures of dynamic psychotherapy. Seven raters evaluated 50 patients before and 36 patients again after brief dynamic psychotherapy. A factor analysis indicated that the scales represent a dimension that is discriminable from general symptoms. A summary measure, Dynamic Capacity, was rated with acceptable reliability by a single rater. However, average scores of three raters were needed for good reliability of change ratings. The scales seem to be sufficiently fine-grained to capture statistically and clinically significant changes during brief dynamic psychotherapy.

  6. Atomic-scale phase composition through multivariate statistical analysis of atom probe tomography data.

    PubMed

    Keenan, Michael R; Smentkowski, Vincent S; Ulfig, Robert M; Oltman, Edward; Larson, David J; Kelly, Thomas F

    2011-06-01

    We demonstrate for the first time that multivariate statistical analysis techniques can be applied to atom probe tomography data to estimate the chemical composition of a sample at the full spatial resolution of the atom probe in three dimensions. Whereas the raw atom probe data provide the specific identity of an atom at a precise location, the multivariate results can be interpreted in terms of the probabilities that an atom representing a particular chemical phase is situated there. When aggregated to the size scale of a single atom (∼0.2 nm), atom probe spectral-image datasets are huge and extremely sparse. In fact, the average spectrum will have somewhat less than one total count per spectrum due to imperfect detection efficiency. These conditions, under which the variance in the data is completely dominated by counting noise, test the limits of multivariate analysis, and an extensive discussion of how to extract the chemical information is presented. Efficient numerical approaches to performing principal component analysis (PCA) on these datasets, which may number hundreds of millions of individual spectra, are put forward, and it is shown that PCA can be computed in a few seconds on a typical laptop computer.
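
    The numerical setting described above (huge, extremely sparse spectral-image matrices) is exactly where a truncated SVD on a sparse matrix pays off. A small-scale sketch with synthetic counts, using an uncentered SVD as the usual sparse workaround (mean subtraction would destroy sparsity); this is a stand-in for, not a reproduction of, the paper's PCA formulation:

```python
import numpy as np
from scipy.sparse import random as sparse_random
from scipy.sparse.linalg import svds

# Synthetic spectral image: rows = voxels, columns = mass-to-charge bins,
# with on average less than one count per spectrum (density 0.003 x 300 bins).
n_voxels, n_bins = 20_000, 300
counts = sparse_random(n_voxels, n_bins, density=0.003, random_state=8,
                       data_rvs=lambda k: np.ones(k))

# Truncated SVD of the sparse count matrix: never forms a dense covariance
U, s, Vt = svds(counts.tocsc(), k=5)
print(s.shape, Vt.shape)    # 5 singular values, 5 spectral loading vectors
```

    The rows of Vt play the role of spectral loadings; scores for each voxel come from U scaled by s.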

  7. Development and Validation of a Teaching Practice Scale (TISS) for Instructors of Introductory Statistics at the College Level

    ERIC Educational Resources Information Center

    Hassad, Rossi A.

    2009-01-01

    This study examined the teaching practices of 227 college instructors of introductory statistics (from the health and behavioral sciences). Using primarily multidimensional scaling (MDS) techniques, a two-dimensional, 10-item teaching practice scale, TISS (Teaching of Introductory Statistics Scale), was developed and validated. The two dimensions…

  8. Influence of eye biometrics and corneal micro-structure on noncontact tonometry.

    PubMed

    Jesus, Danilo A; Majewska, Małgorzata; Krzyżanowska-Berkowska, Patrycja; Iskander, D Robert

    2017-01-01

    Tonometry is widely used as the main screening tool supporting glaucoma diagnosis. Still, its accuracy could be improved if full knowledge about the variation of the corneal biomechanical properties was available. In this study, Optical Coherence Tomography (OCT) speckle statistics are used to infer the organisation of the corneal micro-structure and hence, to analyse its influence on intraocular pressure (IOP) measurements. Fifty-six subjects were recruited for this prospective study. Macro and micro-structural corneal parameters as well as subject age were considered. Macro-structural analysis included the parameters that are associated with the ocular anatomy, such as central corneal thickness (CCT), corneal radius, axial length, anterior chamber depth and white-to-white corneal diameter. Micro-structural parameters which included OCT speckle statistics were related to the internal organisation of the corneal tissue and its physiological changes during lifetime. The corneal speckle obtained from OCT was modelled with the Generalised Gamma (GG) distribution that is characterised with a scale parameter and two shape parameters. In macro-structure analysis, only CCT showed a statistically significant correlation with IOP (R2 = 0.25, p<0.001). The scale parameter and the ratio of the shape parameters of GG distribution showed statistically significant correlation with IOP (R2 = 0.19, p<0.001 and R2 = 0.17, p<0.001, respectively). For the studied group, a weak, although significant correlation was found between age and IOP (R2 = 0.053, p = 0.04). Forward stepwise regression showed that CCT and the scale parameter of the Generalised Gamma distribution can be combined in a regression model (R2 = 0.39, p<0.001) to study the role of the corneal structure on IOP. We show, for the first time, that corneal micro-structure influences the IOP measurements obtained from noncontact tonometry. 
OCT speckle statistics can be employed to learn about the corneal micro-structure and hence, to further calibrate the IOP measurements.
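
    Fitting a Generalised Gamma distribution with one scale and two shape parameters, as used above for the OCT speckle, can be sketched with SciPy's gengamma. The sample below is simulated rather than OCT data, and the fit pins the location at zero since intensities are nonnegative:

```python
import numpy as np
from scipy import stats

# Simulated speckle-like sample from a Generalised Gamma distribution
rng = np.random.default_rng(9)
speckle = stats.gengamma.rvs(a=2.0, c=1.5, scale=0.8, size=5000,
                             random_state=rng)

# Maximum-likelihood fit with the location fixed at zero (floc=0)
a_hat, c_hat, loc_hat, scale_hat = stats.gengamma.fit(speckle, floc=0)
print(round(a_hat, 2), round(c_hat, 2), round(scale_hat, 2))
```

    The fitted scale parameter and the ratio of the two shape parameters are the quantities the study correlated with IOP.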

  9. Influence of eye biometrics and corneal micro-structure on noncontact tonometry

    PubMed Central

    Majewska, Małgorzata; Krzyżanowska-Berkowska, Patrycja; Iskander, D. Robert

    2017-01-01

    Purpose Tonometry is widely used as the main screening tool supporting glaucoma diagnosis. Still, its accuracy could be improved if full knowledge about the variation of the corneal biomechanical properties was available. In this study, Optical Coherence Tomography (OCT) speckle statistics are used to infer the organisation of the corneal micro-structure and hence, to analyse its influence on intraocular pressure (IOP) measurements. Methods Fifty-six subjects were recruited for this prospective study. Macro and micro-structural corneal parameters as well as subject age were considered. Macro-structural analysis included the parameters that are associated with the ocular anatomy, such as central corneal thickness (CCT), corneal radius, axial length, anterior chamber depth and white-to-white corneal diameter. Micro-structural parameters which included OCT speckle statistics were related to the internal organisation of the corneal tissue and its physiological changes during lifetime. The corneal speckle obtained from OCT was modelled with the Generalised Gamma (GG) distribution that is characterised with a scale parameter and two shape parameters. Results In macro-structure analysis, only CCT showed a statistically significant correlation with IOP (R2 = 0.25, p<0.001). The scale parameter and the ratio of the shape parameters of GG distribution showed statistically significant correlation with IOP (R2 = 0.19, p<0.001 and R2 = 0.17, p<0.001, respectively). For the studied group, a weak, although significant correlation was found between age and IOP (R2 = 0.053, p = 0.04). Forward stepwise regression showed that CCT and the scale parameter of the Generalised Gamma distribution can be combined in a regression model (R2 = 0.39, p<0.001) to study the role of the corneal structure on IOP. Conclusions We show, for the first time, that corneal micro-structure influences the IOP measurements obtained from noncontact tonometry. 
OCT speckle statistics can be employed to learn about the corneal micro-structure and hence, to further calibrate the IOP measurements. PMID:28472178

  10. Development and Validation of the Caring Loneliness Scale.

    PubMed

    Karhe, Liisa; Kaunonen, Marja; Koivisto, Anna-Maija

    2016-12-01

    The Caring Loneliness Scale (CARLOS) includes 5 categories derived from earlier qualitative research. This article assesses the reliability and construct validity of a scale designed to measure patient experiences of loneliness in a professional caring relationship. Statistical analysis with 4 different sample sizes included Cronbach's alpha and exploratory factor analysis with principal axis factoring extraction. The sample size of 250 gave the most useful and comprehensible structure, but all 4 samples yielded the underlying content of loneliness experiences. The initial 5 categories were reduced to 4 factors with 24 items, with Cronbach's alpha ranging from .77 to .90. The findings support the reliability and validity of CARLOS for the assessment of Finnish breast cancer and heart surgery patients' experiences, but, as with all instruments, further validation is needed.

  11. Psychometric evaluation of the Persian version of the Templer's Death Anxiety Scale in cancer patients.

    PubMed

    Soleimani, Mohammad Ali; Yaghoobzadeh, Ameneh; Bahrami, Nasim; Sharif, Saeed Pahlevan; Sharif Nia, Hamid

    2016-10-01

    In this study, 398 Iranian cancer patients completed the 15-item Templer's Death Anxiety Scale (TDAS). Tests of internal consistency, principal components analysis, and confirmatory factor analysis were conducted to assess the internal consistency and factorial validity of the Persian TDAS. The construct reliability statistic and average variance extracted were also calculated to measure construct reliability, convergent validity, and discriminant validity. Principal components analysis indicated a 3-component solution, which was generally supported in the confirmatory analysis. However, acceptable cutoffs for construct reliability, convergent validity, and discriminant validity were not fulfilled for the three subscales that were derived from the principal component analysis. This study demonstrated both the advantages and potential limitations of using the TDAS with Persian-speaking cancer patients.

  12. Comparative Analysis Between Computed and Conventional Inferior Alveolar Nerve Block Techniques.

    PubMed

    Araújo, Gabriela Madeira; Barbalho, Jimmy Charles Melo; Dias, Tasiana Guedes de Souza; Santos, Thiago de Santana; Vasconcellos, Ricardo José de Holanda; de Morais, Hécio Henrique Araújo

    2015-11-01

    The aim of this randomized, double-blind, controlled trial was to compare the computed and conventional inferior alveolar nerve block techniques in symmetrically positioned inferior third molars. Both computed and conventional anesthetic techniques were performed in 29 healthy patients (58 surgeries) aged between 18 and 40 years. The anesthetic of choice was 2% lidocaine with 1:200,000 epinephrine. Pain after anesthetic infiltration was assessed with the Visual Analogue Scale, and patient satisfaction was evaluated using the Likert Scale. Heart and respiratory rates, mean time to perform the technique, and the need for additional anesthesia were also evaluated. Mean pain scores were higher for the conventional technique than for the computed technique (3.45 ± 2.73 versus 2.86 ± 1.96), but the difference was not statistically significant (P > 0.05). Patient satisfaction showed no statistically significant differences. The mean times to perform the computed and conventional techniques were 3.85 and 1.61 minutes, respectively, a statistically significant difference (P < 0.001). The computed anesthetic technique showed lower mean pain perception, but the difference from the conventional technique was not statistically significant.

  13. The influence of depression and anxiety in the development of heart failure after coronary angioplasty.

    PubMed

    Gegenava, T; Gegenava, M; Kavtaradze, G

    2009-03-01

    The aim of our study was to investigate the association between a history of depressive episodes and anxiety and complications in patients 6 months after coronary artery angioplasty. The research was conducted on 70 patients in whom a grade of coronary occlusion unresponsive to therapeutic treatment and requiring coronary angioplasty had been established. Complications were assessed in 60 patients 6 months after coronary angioplasty. Depression was evaluated with the Beck depression scale, and anxiety was assessed with the Spielberger State-Trait Anxiety Scale. Statistical analysis of the data was performed by methods of variation statistics using Student's criterion in the STATISTICA 5.0 package. Complications were found in 36 (60%) patients; 24 (40%) patients had no complications. No statistically significant differences in the degree of depression and anxiety were found between the time of coronary angioplasty and 6 months afterwards. Our study demonstrated that complications occurred in patients who had a high degree of depression and anxiety.

  14. A Longitudinal Analysis of the Influence of a Peer Run Warm Line Phone Service on Psychiatric Recovery.

    PubMed

    Dalgin, Rebecca Spirito; Dalgin, M Halim; Metzger, Scott J

    2018-05-01

    This article focuses on the impact of a peer-run warm line as part of the psychiatric recovery process. It utilized data including the Recovery Assessment Scale (RAS), community integration measures, and crisis service usage. Longitudinal statistical analysis was completed on 48 sets of data from 2011, 2012, and 2013. Although no statistically significant differences were observed for the RAS score, community integration data showed increases in visits to primary care doctors, leisure/recreation activities, and socialization with others. This study highlights the complexity of psychiatric recovery and suggests that nonclinical peer services like peer-run warm lines may be critical to the process.

  15. MWASTools: an R/bioconductor package for metabolome-wide association studies.

    PubMed

    Rodriguez-Martinez, Andrea; Posma, Joram M; Ayala, Rafael; Neves, Ana L; Anwar, Maryam; Petretto, Enrico; Emanueli, Costanza; Gauguier, Dominique; Nicholson, Jeremy K; Dumas, Marc-Emmanuel

    2018-03-01

    MWASTools is an R package designed to provide an integrated pipeline to analyse metabonomic data in large-scale epidemiological studies. Key functionalities of our package include: quality control analysis; metabolome-wide association analysis using various models (partial correlations, generalized linear models); visualization of statistical outcomes; metabolite assignment using statistical total correlation spectroscopy (STOCSY); and biological interpretation of metabolome-wide association studies results. The MWASTools R package is implemented in R (version ≥ 3.4) and is available from Bioconductor: https://bioconductor.org/packages/MWASTools/. Contact: m.dumas@imperial.ac.uk. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  16. Spatiotemporal characterization of soil moisture fields in agricultural areas using cosmic-ray neutron probes and data fusion

    NASA Astrophysics Data System (ADS)

    Franz, Trenton; Wang, Tiejun

    2015-04-01

    Approximately 40% of global food production comes from irrigated agriculture. With the increasing demand for food, even greater pressure will be placed on water resources within these systems. In this work we aimed to characterize the spatial and temporal patterns of soil moisture at the field scale (~500 m) using the newly developed cosmic-ray neutron rover near Waco, NE, USA. Here we mapped the soil moisture of 144 quarter-section fields (a mix of maize, soybean, and natural areas) each week during the 2014 growing season (May to September). The 12 by 12 km study domain also contained three stationary cosmic-ray neutron probes for independent validation of the rover surveys. Basic statistical analysis of the domain indicated a strong relationship between the mean and variance of soil moisture at several averaging scales. The relationships between the mean and higher-order moments were not significant. Scaling analysis indicated strong power-law behavior between the variance of soil moisture and the averaging area, with minimal dependence of the slope of the power-law function on mean soil moisture. In addition, we combined the data from the three stationary cosmic-ray neutron probes and mobile surveys using linear regression to derive a daily soil moisture product at 1, 3, and 12 km spatial resolutions for the entire growing season. The statistical relationships derived from the rover dataset offer a novel set of observations that will be useful in: 1) calibrating and validating land surface models, 2) calibrating and validating crop models, 3) estimating soil moisture covariances for statistical downscaling of remote sensing products such as SMOS and SMAP, and 4) providing daily center-pivot-scale mean soil moisture data for optimal irrigation timing and volume.
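The power-law relationship between soil-moisture variance and averaging area reported above amounts to a straight line in log-log space; a minimal sketch, with the areas, prefactor and exponent assumed purely for illustration:

```python
import numpy as np

# Hypothetical variance-vs-averaging-area pairs following var = c * A**beta
areas = np.array([1.0, 3.0, 9.0, 27.0, 81.0])  # averaging areas (assumed units)
variances = 0.02 * areas ** (-0.35)            # synthetic power-law data

# The slope of log(variance) vs log(area) recovers the scaling exponent beta
beta, log_c = np.polyfit(np.log(areas), np.log(variances), 1)
```

For noiseless synthetic data the fitted slope matches the generating exponent exactly; with field data the fit quality (e.g. R²) would indicate how well the power law holds.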

  17. Structures in magnetohydrodynamic turbulence: Detection and scaling

    NASA Astrophysics Data System (ADS)

    Uritsky, V. M.; Pouquet, A.; Rosenberg, D.; Mininni, P. D.; Donovan, E. F.

    2010-11-01

    We present a systematic analysis of statistical properties of turbulent current and vorticity structures at a given time using cluster analysis. The data stem from numerical simulations of decaying three-dimensional magnetohydrodynamic turbulence in the absence of an imposed uniform magnetic field; the magnetic Prandtl number is taken equal to unity, and we use a periodic box with grids of up to 1536^3 points and with Taylor Reynolds numbers up to 1100. The initial conditions are either an X-point configuration embedded in three dimensions, the so-called Orszag-Tang vortex, or an Arnold-Beltrami-Childress configuration with a fully helical velocity and magnetic field. In each case two snapshots are analyzed, separated by one turn-over time, starting just after the peak of dissipation. We show that the algorithm is able to select a large number of structures (in excess of 8000) for each snapshot and that the statistical properties of these clusters are remarkably similar for the two snapshots as well as for the two flows under study in terms of scaling laws for the cluster characteristics, with the structures in the vorticity and in the current behaving in the same way. We also study the effect of Reynolds number on cluster statistics, and we finally analyze the properties of these clusters in terms of their velocity-magnetic-field correlation. Self-organized criticality features have been identified in the dissipative range of scales. A different scaling arises in the inertial range, which cannot be identified for the moment with a known self-organized criticality class consistent with magnetohydrodynamics. We suggest that this range can be governed by turbulence dynamics as opposed to criticality and propose an interpretation of intermittency in terms of propagation of local instabilities.
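Structure detection of this general kind (thresholding a field and labeling connected clusters, then collecting per-cluster sizes for scaling statistics) can be sketched as below; the toy 2-D field, the threshold multiple, and the use of scipy.ndimage are illustrative assumptions, not the authors' algorithm:

```python
import numpy as np
from scipy import ndimage

# Synthetic 2-D "current density" field with two separated bright blobs
field = np.zeros((32, 32))
field[4:8, 4:8] = 1.0      # structure 1
field[20:26, 18:24] = 2.0  # structure 2 (more intense)

# Threshold at a multiple of the RMS, then label connected clusters
threshold = 2 * field.std()
mask = field > threshold
labels, n_clusters = ndimage.label(mask)

# Per-cluster sizes (cell counts) feed the cluster-scaling statistics
sizes = ndimage.sum(mask, labels, range(1, n_clusters + 1))
```

A histogram of `sizes` over many snapshots would then be examined for the power-law (scaling) behavior the abstract discusses.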

  18. Viewpoint: observations on scaled average bioequivalence.

    PubMed

    Patterson, Scott D; Jones, Byron

    2012-01-01

    The two one-sided test procedure (TOST) has been used for average bioequivalence testing since 1992 and is required when marketing new formulations of an approved drug. TOST is known to require comparatively large numbers of subjects to demonstrate bioequivalence for highly variable drugs, defined as those drugs having intra-subject coefficients of variation greater than 30%. However, TOST has been shown to protect public health when multiple generic formulations enter the marketplace following patent expiration. Recently, scaled average bioequivalence (SABE) has been proposed as an alternative statistical analysis procedure for such products by multiple regulatory agencies. SABE testing requires that a three-period partial replicate cross-over or full replicate cross-over design be used. Following a brief summary of SABE analysis methods applied to existing data, we will consider three statistical ramifications of the proposed additional decision rules and the potential impact of implementation of scaled average bioequivalence in the marketplace using simulation. It is found that a constraint being applied is biased, that bias may also result from the common problem of missing data and that the SABE methods allow for much greater changes in exposure when generic-generic switching occurs in the marketplace. Copyright © 2011 John Wiley & Sons, Ltd.
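The TOST decision for average bioequivalence on the log scale can be sketched as below; the per-subject log-differences are hypothetical, and the 0.80-1.25 limits are the conventional regulatory bounds rather than anything specific to this paper:

```python
import numpy as np
from scipy import stats

# Hypothetical per-subject log-AUC differences (test minus reference),
# as in a paired/crossover design; not real study data.
log_diff = np.array([0.05, -0.02, 0.10, 0.03, -0.04, 0.06,
                     0.01, 0.02, 0.04, -0.01, 0.07, 0.00])
n = len(log_diff)
se = log_diff.std(ddof=1) / np.sqrt(n)

# TOST at the 5% level is equivalent to requiring the 90% CI of the
# mean log-difference to lie inside (ln 0.80, ln 1.25).
t_crit = stats.t.ppf(0.95, df=n - 1)
lo = log_diff.mean() - t_crit * se
hi = log_diff.mean() + t_crit * se
bioequivalent = (lo > np.log(0.80)) and (hi < np.log(1.25))
```

For highly variable drugs the abstract's point is that `se` grows with intra-subject variability, so the interval widens and many more subjects are needed, which motivates the scaled (SABE) alternatives.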

  19. Assessment of stigma associated with tuberculosis in Mexico.

    PubMed

    Moya, E M; Biswas, A; Chávez Baray, S M; Martínez, O; Lomeli, B

    2014-12-21

    Stigma is a major barrier to health care access and impacts the quality of life for individuals affected by tuberculosis (TB). Assessing TB stigma is essential to addressing health disparities. However, no such instrument was available in Mexico at the time of our study. This study examined the adaptability of the TB and human immunodeficiency virus (HIV) stigma scales previously used in Thailand. The original scale, developed in English, was linguistically adapted to Spanish and administered to 217 individuals affected by TB in five states in Mexico. The TB-HIV stigma subscales were designed to assess individual and community perspectives. Additional data collected included general information and socio-demographics. Assessment of psychometric properties included basic statistical tests, evaluation of Cronbach's alpha and factor analysis. We found no statistically significant differences in stigma scores by location, age, marital status or education. Factor analysis did not create any new factors. Internal consistency reliability coefficients were satisfactory (Cronbach α = 0.876-0.912). The use of the stigma scales has implications for 1) health improvements, 2) research on stigma and health disparities, and 3) TB and HIV stigma interventions. Further research is needed to examine transferability among larger and randomly selected Spanish-speaking populations.
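Cronbach's alpha, the internal-consistency coefficient reported above, can be computed directly from an item-score matrix; the respondents and item scores below are hypothetical:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical 5-respondent, 4-item stigma-scale responses
scores = [[3, 4, 3, 4],
          [2, 2, 3, 2],
          [4, 4, 4, 5],
          [1, 2, 1, 2],
          [3, 3, 4, 3]]
alpha = cronbach_alpha(scores)
```

Values in the 0.87-0.91 range, as in the abstract, are generally read as satisfactory internal consistency.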

  20. A Confirmatory Factor Analysis of the Structure of Statistics Anxiety Measure: An examination of four alternative models

    PubMed Central

    Vahedi, Shahram; Farrokhi, Farahman

    2011-01-01

    Objective The aim of this study is to explore the confirmatory factor analysis results of the Persian adaptation of Statistics Anxiety Measure (SAM), proposed by Earp. Method The validity and reliability assessments of the scale were performed on 298 college students chosen randomly from Tabriz University in Iran. Confirmatory factor analysis (CFA) was carried out to determine the factor structures of the Persian adaptation of SAM. Results As expected, the second order model provided a better fit to the data than the three alternative models. Conclusions Hence, SAM provides an equally valid measure for use among college students. The study both expands and adds support to the existing body of math anxiety literature. PMID:22952530

  1. Understanding the k^-5/3 to k^-2.4 spectral break in aircraft wind data

    NASA Astrophysics Data System (ADS)

    Pinel, J.; Lovejoy, S.; Schertzer, D. J.; Tuck, A.

    2010-12-01

    A fundamental issue in atmospheric dynamics is to understand how the statistics of fluctuations of various fields vary with their space-time scale. The classical - and still “standard” - model dates back to Kraichnan and Charney’s work on 2-D and geostrophic (quasi-2-D) turbulence at the end of the 1960s and early 1970s. It postulates an isotropic 2-D turbulent regime at large scales and an isotropic 3-D regime at small scales, separated by a “dimensional transition” (once called a “mesoscale gap”) near the pressure scale height of ≈10 km. By the early 1980s a quite different model emerged, the 23/9-D scaling model, in which the dynamics were postulated to be dominated (over wide scale ranges) by a strongly anisotropic scale-invariant cascade mechanism, with structures becoming flatter and flatter at larger and larger scales in a scaling manner: the isotropy assumptions were discarded but the scaling and cascade assumptions retained. Today, thanks to the revolution in geodata and atmospheric models - both in quality and quantity - the 23/9-D model can explain the observed horizontal cascade structures in remotely sensed radiances, in meteorological “reanalyses”, in meteorological models, in high-resolution dropsonde vertical analyses, in lidar vertical sections, etc. All of these analyses directly contradict the standard model, which predicts drastic “dimensional transitions” for scalar quantities. Indeed, until recently the only unexplained feature was a scale break in aircraft spectra of the (vector) horizontal wind somewhere between about 40 and 200 km. However - contrary to repeated claims - and thanks to a reanalysis of the historical papers - the transition that had been observed since the 1980s was not between k^-5/3 and k^-3 but rather between k^-5/3 and k^-2.4. By 2009, the standard model was thus hanging by a thread.
    This thread was cut when careful analysis of scientific aircraft data allowed the 23/9-D model to explain the large-scale k^-2.4 regime as an artefact of the aircraft following a sloping trajectory: at large enough scales, the spectrum is simply dominated by vertical rather than horizontal fluctuations, which have the required k^-2.4 form. Since aircraft frequently follow gently sloping isobars, this neatly removes the last obstacle to wide-range anisotropic scaling models, finally opening the door to an urgently needed consensus on the statistical structure of the atmosphere. However, objections remain: at large enough scales, do isobaric and isoheight spectra really have different exponents? In this presentation we study this issue in more detail than before by analyzing data measured by commercial aircraft through the Tropospheric Airborne Meteorological Data Reporting (TAMDAR) system over CONUS during 2009. The TAMDAR system allows us to calculate the statistical properties of the wind field on constant-pressure and constant-altitude levels. Various statistical exponents were calculated (velocity increments in terms of horizontal displacement, vertical displacement, pressure and time), and we show what we learned and how this analysis can help to settle the question.
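Estimating spectral exponents on either side of a break, as done throughout this analysis, reduces to log-log fits over two wavenumber ranges; a minimal sketch, with the wavenumber grid and break scale assumed for illustration:

```python
import numpy as np

# Synthetic 1-D spectrum with a break: k^-5/3 below k_b, k^-2.4 above,
# matched so the spectrum is continuous at the break
k = np.logspace(-3, 0, 200)   # wavenumbers (arbitrary units)
k_b = 0.05                    # assumed break scale
E = np.where(k < k_b,
             k ** (-5 / 3),
             k_b ** (-5 / 3 + 2.4) * k ** (-2.4))

# Log-log least squares in each regime recovers the spectral slopes
low, high = k < k_b, k >= k_b
s_low = np.polyfit(np.log(k[low]), np.log(E[low]), 1)[0]
s_high = np.polyfit(np.log(k[high]), np.log(E[high]), 1)[0]
```

On real aircraft spectra the fitted ranges must be chosen well away from the break and from the dissipation/noise scales, which is part of why the k^-3 versus k^-2.4 question stayed open so long.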

  2. Scaling analysis of the non-Abelian quasiparticle tunneling in [Formula: see text] FQH states.

    PubMed

    Li, Qi; Jiang, Na; Wan, Xin; Hu, Zi-Xiang

    2018-06-27

    Quasiparticle tunneling between two counter-propagating edges through point contacts could provide information on quasiparticle statistics. Previous studies of short-distance tunneling display a scaling behavior, especially in the conformal limit of zero tunneling distance. The scaling exponents for non-Abelian quasiparticle tunneling exhibit some non-trivial behaviors. In this work, we revisit the quasiparticle tunneling amplitudes and their scaling behavior over the full range of tunneling distance by putting the electrons on the surface of a cylinder. The edge-edge distance can be smoothly tuned by varying the aspect ratio of a finite-size cylinder. We analyze the scaling behavior of the quasiparticles for the Read-Rezayi [Formula: see text] states for [Formula: see text] and 4, both in the short and the long tunneling-distance regions. The finite-size scaling analysis automatically gives us a critical length scale at which an anomalous correction appears. We demonstrate that this length scale is related to the size of the quasiparticle, at which the backscattering between the two counter-propagating edges starts to be significant.

  3. A Stochastic Model of Space-Time Variability of Mesoscale Rainfall: Statistics of Spatial Averages

    NASA Technical Reports Server (NTRS)

    Kundu, Prasun K.; Bell, Thomas L.

    2003-01-01

    A characteristic feature of rainfall statistics is that they depend on the space and time scales over which rain data are averaged. A previously developed spectral model of rain statistics, designed to capture this property, predicts power-law scaling behavior for the second-moment statistics of area-averaged rain rate on the averaging length scale L as L → 0. In the present work a more efficient method of estimating the model parameters is presented, and used to fit the model to the statistics of area-averaged rain rate derived from gridded radar precipitation data from TOGA COARE. Statistical properties of the data and the model predictions are compared over a wide range of averaging scales. An extension of the spectral model scaling relations to describe the dependence of the average fraction of grid boxes within an area containing nonzero rain (the "rainy area fraction") on the grid scale L is also explored.

  4. Meta-analysis of gene-level associations for rare variants based on single-variant statistics.

    PubMed

    Hu, Yi-Juan; Berndt, Sonja I; Gustafsson, Stefan; Ganna, Andrea; Hirschhorn, Joel; North, Kari E; Ingelsson, Erik; Lin, Dan-Yu

    2013-08-08

    Meta-analysis of genome-wide association studies (GWASs) has led to the discoveries of many common variants associated with complex human diseases. There is a growing recognition that identifying "causal" rare variants also requires large-scale meta-analysis. The fact that association tests with rare variants are performed at the gene level rather than at the variant level poses unprecedented challenges in the meta-analysis. First, different studies may adopt different gene-level tests, so the results are not compatible. Second, gene-level tests require multivariate statistics (i.e., components of the test statistic and their covariance matrix), which are difficult to obtain. To overcome these challenges, we propose to perform gene-level tests for rare variants by combining the results of single-variant analysis (i.e., p values of association tests and effect estimates) from participating studies. This simple strategy is possible because of an insight that multivariate statistics can be recovered from single-variant statistics, together with the correlation matrix of the single-variant test statistics, which can be estimated from one of the participating studies or from a publicly available database. We show both theoretically and numerically that the proposed meta-analysis approach provides accurate control of the type I error and is as powerful as joint analysis of individual participant data. This approach accommodates any disease phenotype and any study design and produces all commonly used gene-level tests. An application to the GWAS summary results of the Genetic Investigation of ANthropometric Traits (GIANT) consortium reveals rare and low-frequency variants associated with human height. The relevant software is freely available. Copyright © 2013 The American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
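The paper's key insight, that a gene-level statistic can be recovered from single-variant statistics plus their correlation matrix, can be sketched for a simple burden-type test; the z-scores, LD matrix and equal weights below are hypothetical illustrations, not the GIANT data or the paper's exact formulation:

```python
import numpy as np

# Hypothetical single-variant summary statistics for one gene (3 rare variants)
z = np.array([1.8, 1.2, 2.1])   # per-variant meta-analyzed z-scores
R = np.array([[1.0, 0.2, 0.1],  # correlation (LD) matrix of the variant
              [0.2, 1.0, 0.3],  # test statistics, e.g. estimated from a
              [0.1, 0.3, 1.0]]) # reference panel or one participating study
w = np.ones(3)                  # equal weights => burden-type test

# Gene-level burden z-score: weighted sum of z-scores, with the variance
# of the sum reconstructed from the correlation matrix
z_gene = (w @ z) / np.sqrt(w @ R @ w)
```

The same multivariate ingredients (the vector of single-variant statistics and their covariance) also suffice for variance-component tests such as SKAT, which is why recovering them from summary data makes joint-analysis-equivalent meta-analysis possible.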

  5. The Effects of Perioperative Music Interventions in Pediatric Surgery: A Systematic Review and Meta-Analysis of Randomized Controlled Trials.

    PubMed

    van der Heijden, Marianne J E; Oliai Araghi, Sadaf; van Dijk, Monique; Jeekel, Johannes; Hunink, M G Myriam

    2015-01-01

    Music interventions are widely used, but have not yet gained a place in guidelines for pediatric surgery or pediatric anesthesia. In this systematic review and meta-analysis we examined the effects of music interventions on pain, anxiety and distress in children undergoing invasive surgery. We searched 25 electronic databases from their first available date until October 2014. Included were all randomized controlled trials with a parallel group, crossover or cluster design that included pediatric patients from 1 month to 18 years old undergoing minimally invasive or invasive surgical procedures, and receiving either live music therapy or recorded music. 4846 records were retrieved from the searches, 26 full text reports were evaluated and data was extracted by two independent investigators. Pain was measured with the Visual Analogue Scale, the Coloured Analogue Scale and the Facial Pain Scale. Anxiety and distress were measured with an emotional index scale (not validated), the Spielberger short State Trait Anxiety Inventory and a Facial Affective Scale. Three RCTs were eligible for inclusion encompassing 196 orthopedic, cardiac and day surgery patients (age of 1 day to 18 years) receiving either live music therapy or recorded music. Overall a statistically significant positive effect was demonstrated on postoperative pain (SMD -1.07, 95% CI -2.08 to -0.07) and on anxiety and distress (SMD -0.34, 95% CI -0.66 to -0.01, and SMD -0.50, 95% CI -0.84 to -0.16). This systematic review and meta-analysis indicates that music interventions may have a statistically significant effect in reducing post-operative pain, anxiety and distress in children undergoing a surgical procedure. Evidence from this review and other reviews suggests music therapy may be considered for clinical use.

  6. Improvement of burn pain management through routine pain monitoring and pain management protocol.

    PubMed

    Yang, Hyeong Tae; Hur, Giyeun; Kwak, In-Suk; Yim, Haejun; Cho, Yong Suk; Kim, Dohern; Hur, Jun; Kim, Jong Hyun; Lee, Boung Chul; Seo, Cheong Hoon; Chun, Wook

    2013-06-01

    Pain management is an important aspect of burn management. We developed a routine pain monitoring system and pain management protocol for burn patients. The purpose of this study is to evaluate the effectiveness of our new pain management system. From May 2011 to November 2011, a prospective study was performed with 107 burn patients. We performed control group (n=58) data analysis and then developed the pain management protocol and monitoring system. Next, we applied our protocol to patients and performed protocol group (n=49) data analysis, and compared this to control group data. Data analysis was performed using the Numeric Rating Scale (NRS) of background pain and procedural pain, Clinician-Administered PTSD Scale (CAPS), Hamilton Depression Rating Scale (HDRS), State-Trait Anxiety Inventory Scale (STAIS), and Holmes and Rahe Stress Scale (HRSS). The NRS of background pain for the protocol group was significantly decreased compared to the control group (2.8±2.0 versus 3.9±1.9), and the NRS of procedural pain for the protocol group was significantly decreased compared to the control group (3.7±2.5 versus 4.8±2.8). CAPS and HDRS were decreased in the protocol group, but without statistical significance. STAIS and HRSS were decreased in the protocol group, but only the STAIS reached statistical significance. Our new pain management system was effective in burn pain management. However, adequate pain management can only be accomplished by a continuous and thorough effort. Therefore, pain control protocols and pain monitoring systems need to be under constant revision and improvement using creative ideas and approaches. Copyright © 2012 Elsevier Ltd and ISBI. All rights reserved.

  7. African-American college student attitudes toward physics and their effect on achievement

    NASA Astrophysics Data System (ADS)

    Drake, Carl Timothy

    The purpose of this study was to investigate factors affecting the attitudes that African-American college students have towards introductory college physics. The population targeted for this study consisted of African-American males and females enrolled in introductory college physics classes at an urban public historically black college or university (HBCU) located in the southeastern United States. Nine of the Fennema-Sherman Mathematics Attitude Scales, modified for physics, were used to analyze the attitudes of the 135 participants enrolled in an introductory college physics class. The nine scales used to measure the students' attitudes were the Attitude Toward Success in Physics Scale (AS), the Physics as a Male Domain Scale (MD), the Mother Scale (M), the Father Scale (F), the Teacher Scale (T), the Confidence in Learning Physics Scale (C), the Physics Anxiety Scale (A), the Effectance Motivation Scale in Physics (E), and the Physics Usefulness Scale (U). Hypothesis I states that there is a significant difference in the domain scores of African-American college students on the Fennema-Sherman Mathematics Attitude Scales adapted for physics. Using a repeated measures ANOVA, it was found that there was a significant difference between the attitudes of African-Americans on the nine attitude scales, F(8,992) = 43.09, p < .001. Hypothesis II states that there is a statistically significant difference in domain scores between African-American males and African-American females on the Fennema-Sherman Mathematics Attitude Scales. Using a MANOVA, it was found that there was not a significant difference between the domain scores of African-American males and African-American females, F(8, 116) = .38, p > .05. Hypothesis III states that there is a statistically significant relationship between attitude towards physics and achievement for African-American students.
    It predicted that students with positive attitudes toward physics would have a higher level of achievement. Multiple linear regression analysis revealed that there was a significant relationship between a positive attitude toward physics and achievement in the class. The result of the analysis implied that 18.9% of the variance in course grades could be explained by the domain scales.

  8. Intelligent Performance Analysis with a Natural Language Interface

    NASA Astrophysics Data System (ADS)

    Juuso, Esko K.

    2017-09-01

    Performance improvement is taken as the primary goal in asset management. Advanced data analysis is needed to efficiently integrate condition monitoring data into operation and maintenance. Intelligent stress and condition indices have been developed for control and condition monitoring by combining generalized norms with efficient nonlinear scaling. These nonlinear scaling methodologies can also be used to handle performance measures used for management, since management-oriented indicators can be presented on the same scale as intelligent condition and stress indices. Performance indicators are responses of the process, machine or system to the stress contributions analyzed from process and condition monitoring data. Scaled values are used directly in intelligent temporal analysis to calculate fluctuations and trends. All these methodologies can be used in prognostics and fatigue prediction. The meanings of the variables are beneficial in extracting expert knowledge and representing information in natural language. The idea of dividing the problems into variable-specific meanings and directions of interaction provides various improvements for performance monitoring and decision making. The integrated temporal analysis and uncertainty processing facilitates the efficient use of domain expertise. Measurements can be monitored with generalized statistical process control (GSPC) based on the same scaling functions.

  9. Temporal multiscaling characteristics of particulate matter PM10 and ground-level ozone O3 concentrations in the Caribbean region

    NASA Astrophysics Data System (ADS)

    Plocoste, Thomas; Calif, Rudy; Jacoby-Koaly, Sandra

    2017-11-01

    A good knowledge of the intermittency of atmospheric pollutants is crucial for air pollution management. We consider here particulate matter PM10 and ground-level ozone O3 time series from the Guadeloupe archipelago, which experiences a tropical, humid climate in the Caribbean zone. The aim of this paper is to study their scaling statistics in the framework of fully developed turbulence and Kolmogorov's theory. Firstly, we estimate their Fourier power spectra and consider their scaling properties in physical space. The computed power spectra follow a power-law behavior for both pollutants. Thereafter we study the scaling behavior of the PM10 and O3 time series. Contrary to numerous studies where multifractal detrended fluctuation analysis is applied, here the classical structure function analysis is used to extract the scaling exponent function ζ(q); this function provides a full characterization of a process at all intensities and all scales. The obtained results show that PM10 and O3 possess intermittent and multifractal properties. The singularity spectrum MS(α) also confirms the multifractal features of both pollutants. The originality of this work comes from a statistical modeling performed on ζ(q) and MS(α) with a lognormal model to compute the intermittency parameter μ. By contrast with PM10, which mainly depends on puffs of Saharan dust (synoptic scale), O3 is more intermittent due to the variability of its local precursors. The results presented in this paper can help to better understand the mechanisms governing the dynamics of PM10 and O3 in the context of the Caribbean islands.
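The structure-function estimate of ζ(q) can be sketched on a synthetic monofractal series, here an ordinary Brownian walk for which ζ(q) = q/2; this illustrates the method itself, not the pollutant data:

```python
import numpy as np

rng = np.random.default_rng(42)
x = np.cumsum(rng.standard_normal(32768))  # Brownian walk, H = 0.5

# q-th order structure function S_q(tau) = <|x(t+tau) - x(t)|^q>;
# its log-log slope versus tau estimates zeta(q) (= q*H for a monofractal)
taus = np.array([1, 2, 4, 8, 16, 32, 64])

def zeta(q):
    s = [np.mean(np.abs(x[t:] - x[:-t]) ** q) for t in taus]
    return np.polyfit(np.log(taus), np.log(s), 1)[0]

zeta1, zeta2 = zeta(1), zeta(2)
```

For a multifractal process such as the PM10 or O3 series, ζ(q) would bend away from a straight line in q, and that curvature is what the lognormal fit summarizes through the intermittency parameter μ.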

  10. Mathematical Analysis of Vehicle Delivery Scale of Bike-Sharing Rental Nodes

    NASA Astrophysics Data System (ADS)

    Zhai, Y.; Liu, J.; Liu, L.

    2018-04-01

    To address the lack of a scientific, reasonable basis for judging the vehicle-delivery scale of bike-sharing rental nodes and the insufficient optimization of scheduling decisions, this paper analyses, based on the features of bike-sharing usage, the applicability of a discrete-time, discrete-state Markov chain, and proves it to be irreducible, aperiodic and positive recurrent. On this basis, the paper concludes that the limit-state (steady-state) probability distribution of the bike-sharing Markov chain exists and is independent of the initial probability distribution. The paper then analyses the difficulty of estimating the transition-probability-matrix parameters and of solving the system of linear equations in the traditional solution algorithm for the bike-sharing Markov chain. To improve feasibility, this paper proposes a "virtual two-node vehicle scale solution" algorithm, which treats all nodes other than the node to be solved as a single virtual node, and derives the corresponding transition probability matrix, the steady-state system of linear equations, and the computational methods for the steady-state scale, steady-state arrival time and scheduling decision of the node to be solved. Finally, the paper evaluates the rationality and accuracy of the steady-state probabilities of the proposed algorithm by comparison with the traditional algorithm. By solving the steady-state scale of the nodes one by one, the proposed algorithm is shown to be highly feasible, because it lowers the computational difficulty and reduces the number of statistics to be collected, which will help bike-sharing companies optimize the scale and scheduling of nodes.
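The steady-state computation at the heart of such an analysis can be sketched for a small hypothetical system; the three-node transition matrix and fleet size below are invented for illustration, not taken from the paper:

```python
import numpy as np

# Hypothetical 3-node bike-sharing transition matrix (rows sum to 1):
# P[i, j] = probability that a bike at node i ends the next period at node j
P = np.array([[0.6, 0.3, 0.1],
              [0.2, 0.5, 0.3],
              [0.1, 0.4, 0.5]])

# Steady state: solve pi = pi P with sum(pi) = 1, via the left eigenvector
# of P associated with eigenvalue 1
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()

# Expected steady-state fleet at each node for a hypothetical 300-bike system
fleet = 300 * pi
```

Because the chain is irreducible, aperiodic and positive recurrent, this stationary vector is unique and independent of where the bikes start, which is exactly the property the paper proves before sizing the nodes.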

  11. Estimating monetary damages from flooding in the United States under a changing climate

    EPA Science Inventory

    A national-scale analysis of potential changes in monetary damages from flooding under climate change. The approach uses empirically based statistical relationships between historical precipitation and flood damage records from 18 hydrologic regions of the United States, along w...

  12. A Monte Carlo–Based Bayesian Approach for Measuring Agreement in a Qualitative Scale

    PubMed Central

    Pérez Sánchez, Carlos Javier

    2014-01-01

    Agreement analysis has been an active research area whose techniques have been widely applied in psychology and other fields. However, statistical agreement among raters has been mainly considered from a classical statistics point of view. Bayesian methodology is a viable alternative that allows the inclusion of subjective initial information coming from expert opinions, personal judgments, or historical data. A Bayesian approach is proposed by providing a unified Monte Carlo–based framework to estimate all types of measures of agreement in a qualitative scale of response. The approach is conceptually simple and it has a low computational cost. Both informative and non-informative scenarios are considered. In case no initial information is available, the results are in line with the classical methodology, but provide more information on the measures of agreement. For the informative case, some guidelines are presented to elicit the prior distribution. The approach has been applied to two applications related to schizophrenia diagnosis and sensory analysis. PMID:29881002
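A minimal Monte Carlo-based Bayesian agreement analysis of this flavor, here for Cohen's kappa with a uniform Dirichlet prior and a hypothetical 2x2 rating table (the paper's framework covers many more agreement measures):

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical 2x2 agreement table between two raters (counts)
counts = np.array([[40, 5],
                   [7, 48]])

# Dirichlet posterior over the cell probabilities (Dirichlet(1) prior),
# then a Monte Carlo sample of Cohen's kappa for each posterior draw
draws = rng.dirichlet(counts.ravel() + 1, size=5000).reshape(-1, 2, 2)
po = draws[:, 0, 0] + draws[:, 1, 1]                      # observed agreement
pe = (draws.sum(axis=2) * draws.sum(axis=1)).sum(axis=1)  # chance agreement
kappa = (po - pe) / (1 - pe)

kappa_mean = kappa.mean()
ci = np.percentile(kappa, [2.5, 97.5])                    # credible interval
```

Unlike a single classical point estimate, the posterior sample yields a full distribution for kappa, so credible intervals and probabilities of exceeding an agreement benchmark come for free.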

  13. Simulation of parametric model towards the fixed covariate of right censored lung cancer data

    NASA Astrophysics Data System (ADS)

    Afiqah Muhamad Jamil, Siti; Asrul Affendi Abdullah, M.; Kek, Sie Long; Ridwan Olaniran, Oyebayo; Enera Amran, Syahila

    2017-09-01

    In this study, a simulation procedure was applied to measure the fixed covariate of right-censored data using a parametric survival model. The scale and shape parameters were varied to differentiate the analysis of the parametric regression survival model. Statistically, the biases, mean biases and coverage probabilities were used in this analysis. Consequently, different sample sizes of 50, 100, 150 and 200 were employed to distinguish the impact of the parametric regression model on right-censored data. The R statistical software was used to develop the simulation code for right-censored data. In addition, the final right-censored simulation model was compared with right-censored lung cancer data from Malaysia. It was found that varying the shape and scale parameters across different sample sizes helps to improve the simulation strategy for right-censored data, and that the Weibull regression survival model provides a suitable fit for simulating the survival of lung cancer patients in Malaysia.
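Generating right-censored Weibull survival data of the kind simulated in such studies can be sketched as follows; the shape, scale, sample size and administrative censoring time are assumed values, not the paper's settings:

```python
import numpy as np

rng = np.random.default_rng(123)

# Assumed Weibull survival times: shape k = 1.5, scale lam = 10 (months)
k, lam, n = 1.5, 10.0, 200
event_times = lam * rng.weibull(k, size=n)

# Administrative right-censoring at a fixed study end (12 months)
censor_time = 12.0
observed = np.minimum(event_times, censor_time)
event = event_times <= censor_time   # False = right-censored observation

censoring_rate = 1 - event.mean()
```

A simulation study then fits the survival model to many such replicates and summarizes bias, mean bias and coverage probability of the parameter estimates across sample sizes.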

  14. Triacylglycerol Analysis in Human Milk and Other Mammalian Species: Small-Scale Sample Preparation, Characterization, and Statistical Classification Using HPLC-ELSD Profiles.

    PubMed

    Ten-Doménech, Isabel; Beltrán-Iturat, Eduardo; Herrero-Martínez, José Manuel; Sancho-Llopis, Juan Vicente; Simó-Alfonso, Ernesto Francisco

    2015-06-24

    In this work, a method for the separation of triacylglycerols (TAGs) present in human milk and from other mammalian species by reversed-phase high-performance liquid chromatography using a core-shell particle packed column with UV and evaporative light-scattering detectors is described. Under optimal conditions, a mobile phase containing acetonitrile/n-pentanol at 10 °C gave an excellent resolution among more than 50 TAG peaks. A small-scale method for fat extraction in these milks (particularly of interest for human milk samples) using minimal amounts of sample and reagents was also developed. The proposed extraction protocol and the traditional method were compared, giving similar results, with respect to the total fat and relative TAG contents. Finally, a statistical study based on linear discriminant analysis on the TAG composition of different types of milks (human, cow, sheep, and goat) was carried out to differentiate the samples according to their mammalian origin.
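A small sketch of the statistical classification step, assuming hypothetical two-peak TAG profiles; a hand-rolled two-class Fisher discriminant stands in here for the full linear discriminant analysis used in the paper:

```python
import random

def fisher_lda_2d(class_a, class_b):
    """Two-class Fisher discriminant for 2-D features (e.g. relative areas of
    two TAG peaks). Returns the weight vector and decision threshold."""
    def mean(pts):
        n = len(pts)
        return [sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n]
    ma, mb = mean(class_a), mean(class_b)
    s = [[0.0, 0.0], [0.0, 0.0]]          # pooled within-class scatter
    for pts, m in ((class_a, ma), (class_b, mb)):
        for p in pts:
            d = [p[0] - m[0], p[1] - m[1]]
            for i in range(2):
                for j in range(2):
                    s[i][j] += d[i] * d[j]
    det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
    inv = [[s[1][1] / det, -s[0][1] / det], [-s[1][0] / det, s[0][0] / det]]
    dm = [ma[0] - mb[0], ma[1] - mb[1]]
    w = [inv[0][0] * dm[0] + inv[0][1] * dm[1],
         inv[1][0] * dm[0] + inv[1][1] * dm[1]]
    thr = 0.5 * (w[0] * (ma[0] + mb[0]) + w[1] * (ma[1] + mb[1]))
    return w, thr

def score(w, p):
    return w[0] * p[0] + w[1] * p[1]

rng = random.Random(1)
# hypothetical percent-area features for two milk types (not measured values)
human = [(rng.gauss(30, 2), rng.gauss(10, 2)) for _ in range(40)]
cow = [(rng.gauss(22, 2), rng.gauss(18, 2)) for _ in range(40)]
w, thr = fisher_lda_2d(human, cow)
acc = (sum(score(w, p) > thr for p in human)
       + sum(score(w, p) <= thr for p in cow)) / 80
```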

  15. Calculation of Weibull strength parameters and Batdorf flaw-density constants for volume- and surface-flaw-induced fracture in ceramics

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Gyekenyesi, John P.

    1989-01-01

    The calculation of shape and scale parameters of the two-parameter Weibull distribution is described using the least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. Detailed procedures are given for evaluating 90 percent confidence intervals for maximum likelihood estimates of shape and scale parameters, the unbiased estimates of the shape parameters, and the Weibull mean values and corresponding standard deviations. Furthermore, the necessary steps are described for detecting outliers and for calculating the Kolmogorov-Smirnov and the Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull distribution. It is also shown how to calculate the Batdorf flaw-density constants by using the Weibull distribution statistical parameters. The techniques described were verified with several example problems from the open literature and were coded in the Structural Ceramics Analysis and Reliability Evaluation (SCARE) design program.
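For the complete-sample case, maximum likelihood estimation of the two Weibull parameters reduces to a one-dimensional root-finding problem for the shape, after which the scale follows in closed form. A stdlib-only sketch (not the SCARE implementation):

```python
import math, random

def weibull_mle(t):
    """ML shape and scale for a complete (uncensored) positive sample."""
    logs = [math.log(x) for x in t]
    mean_log = sum(logs) / len(t)

    def g(m):
        # profile-likelihood equation; its root is the shape estimate,
        # and g is monotonically increasing in m
        tm = [x ** m for x in t]
        return sum(x * l for x, l in zip(tm, logs)) / sum(tm) - 1.0 / m - mean_log

    lo, hi = 0.01, 50.0
    for _ in range(80):                       # bisection
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            hi = mid
        else:
            lo = mid
    shape = 0.5 * (lo + hi)
    scale = (sum(x ** shape for x in t) / len(t)) ** (1.0 / shape)
    return shape, scale

rng = random.Random(2)
sample = [rng.weibullvariate(1.0, 3.0) for _ in range(2000)]  # scale 1, shape 3
shape, scale = weibull_mle(sample)
```

Censored samples and confidence bounds need the likelihood to be modified as in the report; the closed-form scale step above only holds for the complete-sample case.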

  16. Rasch analysis for psychometric improvement of science attitude rating scales

    NASA Astrophysics Data System (ADS)

    Oon, Pey-Tee; Fan, Xitao

    2017-04-01

    Students' attitude towards science (SAS) is often a subject of investigation in science education research. Rating-scale surveys are commonly used in the study of SAS. The present study illustrates how Rasch analysis can be used to provide psychometric information about SAS rating scales. The analyses were conducted on a 20-item SAS scale used in an existing dataset of the Trends in International Mathematics and Science Study (TIMSS 2011). Data from all eighth-grade participants from Hong Kong and Singapore (N = 9942) were retrieved for analysis. Additional insights from Rasch analysis that are not commonly available from conventional test and item analyses were discussed, such as measurement invariance of SAS, unidimensionality of the SAS construct, optimal use of the SAS rating categories, and the item difficulty hierarchy in the SAS scale. Recommendations on how the TIMSS items measuring SAS could be better designed were also discussed. The study also highlights the importance of using Rasch estimates for the statistical parametric tests (e.g. ANOVA, t-test) that are common in science education research for group comparisons.

  17. Influence of the time scale on the construction of financial networks.

    PubMed

    Emmert-Streib, Frank; Dehmer, Matthias

    2010-09-30

    In this paper we investigate the definition and formation of financial networks. Specifically, we study the influence of the time scale on their construction. For our analysis we use correlation-based networks obtained from the daily closing prices of stock market data. More precisely, we use the stocks that currently comprise the Dow Jones Industrial Average (DJIA) and estimate financial networks where nodes correspond to stocks and edges correspond to nonvanishing correlation coefficients. That is, an edge is included in the network only if the corresponding correlation coefficient is statistically significantly different from zero. This construction procedure results in unweighted, undirected networks. By separating the time series of stock prices into non-overlapping intervals, we obtain one network per interval. The length of these intervals corresponds to the time scale of the data, whose influence on the construction of the networks is studied in this paper. Numerical analysis of four different measures in dependence on the time scale for the construction of networks allows us to gain insights about the intrinsic time scale of the stock market with respect to a meaningful graph-theoretical analysis.
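The construction rule can be sketched with a Fisher z-test on each pairwise correlation; the ticker names and return series below are synthetic, and the normal approximation stands in for whatever exact significance test the authors used:

```python
import math, random

def corr(x, y):
    """Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def network(returns, z_crit=1.96):
    """Unweighted, undirected edge set: keep a pair only when the Fisher
    z-transformed correlation differs significantly from zero."""
    names = list(returns)
    n_obs = len(returns[names[0]])
    edges = set()
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            r = corr(returns[a], returns[b])
            z = 0.5 * math.log((1 + r) / (1 - r)) * math.sqrt(n_obs - 3)
            if abs(z) > z_crit:
                edges.add((a, b))
    return edges

rng = random.Random(3)
market = [rng.gauss(0, 1) for _ in range(250)]   # common factor, ~one year
rets = {"AAA": [m + rng.gauss(0, 0.5) for m in market],
        "BBB": [m + rng.gauss(0, 0.5) for m in market],
        "CCC": [rng.gauss(0, 1) for _ in range(250)]}   # independent stock
edges = network(rets)
```

Rebuilding the edge set on intervals of different lengths (the `n_obs` above) is exactly where the time-scale dependence studied in the paper enters.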

  18. Spatially explicit spectral analysis of point clouds and geospatial data

    USGS Publications Warehouse

    Buscombe, Daniel D.

    2015-01-01

    The increasing use of spatially explicit analyses of high-resolution spatially distributed data (imagery and point clouds) for the purposes of characterising spatial heterogeneity in geophysical phenomena necessitates the development of custom analytical and computational tools. In recent years, such analyses have become the basis of, for example, automated texture characterisation and segmentation, roughness and grain size calculation, and feature detection and classification, from a variety of data types. In this work, much use has been made of statistical descriptors of localised spatial variations in amplitude variance (roughness); however, the horizontal scale (wavelength) and spacing of roughness elements are rarely considered. This is despite the fact that the ratio of characteristic vertical to horizontal scales is not constant and can yield important information about physical scaling relationships. Spectral analysis is a hitherto under-utilised but powerful means to acquire statistical information about relevant amplitude and wavelength scales, simultaneously and with computational efficiency. Further, quantifying spatially distributed data in the frequency domain lends itself to the development of stochastic models for probing the underlying mechanisms which govern the spatial distribution of geological and geophysical phenomena. The software package PySESA (Python program for Spatially Explicit Spectral Analysis) has been developed for generic analyses of spatially distributed data in both the spatial and frequency domains. Developed predominantly in Python, it accesses libraries written in Cython and C++ for efficiency. It is open source and modular, and therefore readily incorporated into, and combined with, other data analysis tools and frameworks, with particular utility for supporting research in the fields of geomorphology, geophysics, hydrography, photogrammetry and remote sensing. 
The analytical and computational structure of the toolbox is described, and its functionality is illustrated with an example of high-resolution bathymetric point-cloud data collected with a multibeam echosounder.
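One simple way to read off a characteristic roughness wavelength, as discussed above, is a periodogram of a profile; the sketch below uses a naive O(n^2) DFT on a synthetic bed profile and is not the PySESA implementation:

```python
import math, random

def periodogram(x, dx=1.0):
    """Naive DFT periodogram; returns (wavelength, power) pairs for the
    positive frequencies, for reading off dominant roughness wavelengths."""
    n = len(x)
    mean = sum(x) / n
    out = []
    for k in range(1, n // 2 + 1):
        re = sum((x[t] - mean) * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum((x[t] - mean) * math.sin(2 * math.pi * k * t / n) for t in range(n))
        out.append((n * dx / k, (re * re + im * im) / n))
    return out

rng = random.Random(4)
# synthetic bed profile: ripples with a 16-sample wavelength plus noise
profile = [math.sin(2 * math.pi * t / 16) + rng.gauss(0, 0.3) for t in range(128)]
spec = periodogram(profile)
dominant_wavelength = max(spec, key=lambda p: p[1])[0]
```

In practice one would use an FFT and window the data; the point here is only that amplitude and wavelength information are recovered simultaneously, as the abstract argues.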

  19. Multivariate analysis of heavy metal contamination using river sediment cores of Nankan River, northern Taiwan

    NASA Astrophysics Data System (ADS)

    Lee, An-Sheng; Lu, Wei-Li; Huang, Jyh-Jaan; Chang, Queenie; Wei, Kuo-Yen; Lin, Chin-Jung; Liou, Sofia Ya Hsuan

    2016-04-01

    Owing to Taiwan's geology and climate, its rivers generally carry large amounts of suspended particles. After these particles settle, they become sediments, which are good sorbents for heavy metals in river systems. Consequently, sediments record the contamination footprint in regions of low flow energy, such as estuaries. Seven sediment cores were collected along the Nankan River, northern Taiwan, which is seriously contaminated by factory, household and agricultural inputs. Physico-chemical properties of these cores were derived from the Itrax-XRF Core Scanner and grain size analysis. In order to interpret these complex data matrices, multivariate statistical techniques (cluster analysis, factor analysis and discriminant analysis) were applied in this study. The statistical results indicate four types of sediment. One of them represents contamination events and shows high concentrations of Cu, Zn, Pb, Ni and Fe, and low concentrations of Si and Zr. Furthermore, three possible contamination sources for this type of sediment were revealed by factor analysis. The combination of sediment analysis and multivariate statistical techniques provides new insights into the contamination depositional history of the Nankan River and could be similarly applied to other river systems to determine the scale of anthropogenic contamination.
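A toy sketch of the clustering step on hypothetical standardized element signatures (high Cu/Zn and low Si marking contaminated layers); plain k-means with caller-supplied seeds stands in for the paper's cluster/factor/discriminant toolchain:

```python
import random

def kmeans(points, centers, iters=25):
    """Plain k-means with caller-supplied initial centers."""
    for _ in range(iters):
        groups = [[] for _ in centers]
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers]
            groups[d.index(min(d))].append(p)
        # recompute each center as its group mean (keep old center if empty)
        centers = [[sum(col) / len(g) for col in zip(*g)] if g else c
                   for g, c in zip(groups, centers)]
    return centers, groups

rng = random.Random(5)
# hypothetical standardized (z-scored) signatures per layer: [Cu/Zn, Si]
contaminated = [[rng.gauss(5, 0.5), rng.gauss(-3, 0.5)] for _ in range(30)]
background = [[rng.gauss(0, 0.5), rng.gauss(0, 0.5)] for _ in range(30)]
layers = contaminated + background
# deterministic seeding: one candidate layer from each end of the core stack
centers, groups = kmeans(layers, [list(layers[0]), list(layers[-1])])
sizes = sorted(len(g) for g in groups)
```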

  20. Universal sequence map (USM) of arbitrary discrete sequences

    PubMed Central

    2002-01-01

    Background For over a decade the idea of representing biological sequences in a continuous coordinate space has maintained its appeal but not been fully realized. The basic idea is that any sequence of symbols may define trajectories in the continuous space while conserving all of its statistical properties. Ideally, such a representation would allow scale-independent sequence analysis – without the context of fixed memory length. A simple example would consist of being able to infer the homology between two sequences solely by comparing the coordinates of any two homologous units. Results We have successfully identified such an iterative function for bijective mapping ψ of discrete sequences into objects of continuous state space that enables scale-independent sequence analysis. The technique, named Universal Sequence Mapping (USM), is applicable to sequences with an arbitrary length and an arbitrary number of unique units, and generates a representation where map distance estimates sequence similarity. The novel USM procedure is based on earlier work by these and other authors on the properties of Chaos Game Representation (CGR). The latter enables the representation of sequences with four unit types (like DNA) as an order-free Markov chain transition table. The properties of USM are illustrated with test data and can be verified for other data by using the accompanying web-based tool: http://bioinformatics.musc.edu/~jonas/usm/. Conclusions USM is shown to enable a statistical mechanics approach to sequence analysis. The scale-independent representation frees sequence analysis from the need to assume a memory length in the investigation of syntactic rules. PMID:11895567
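The CGR iteration that USM generalizes can be sketched in a few lines; the key property, that sequences sharing a k-symbol suffix map within 2^-k of each other, is what lets map distance estimate sequence similarity:

```python
def cgr(seq):
    """Chaos-game representation for DNA: each symbol moves the current point
    halfway toward that symbol's corner of the unit square."""
    corners = {"A": (0.0, 0.0), "C": (0.0, 1.0), "G": (1.0, 1.0), "T": (1.0, 0.0)}
    x, y = 0.5, 0.5
    path = []
    for s in seq:
        cx, cy = corners[s]
        x, y = (x + cx) / 2.0, (y + cy) / 2.0
        path.append((x, y))
    return path

# the two sequences share the 6-symbol suffix "GTACGT", so their final
# coordinates differ by at most 2**-6 in each dimension
p1 = cgr("AACGTACGT")[-1]
p2 = cgr("TTTGTACGT")[-1]
```

USM extends this idea to alphabets of any size; the halving step is what encodes recent history at exponentially finer resolution, giving the memory-length-free property claimed in the abstract.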

  1. Measurement of self-evaluative motives: a shopping scenario.

    PubMed

    Wajda, Theresa A; Kolbe, Richard; Hu, Michael Y; Cui, Annie Peng

    2008-08-01

    To develop measures of consumers' self-evaluative motives of Self-verification, Self-enhancement, and Self-improvement within the context of a mall shopping environment, an initial set of 49 items was generated by conducting three focus-group sessions. These items were subsequently converted into shopping-dependent motive statements. 250 undergraduate college students responded on a 7-point scale to each statement as it related to the acquisition of recent personal shopping goods. An exploratory factor analysis yielded five factors, accounting for 57.7% of the variance, three of which corresponded to the Self-verification motive (five items), the Self-enhancement motive (three items), and the Self-improvement motive (six items). These 14 items, along with 9 reconstructed items, yielded 23 items that were retained and subjected to additional testing. In a final round of data collection, 169 college students provided data for exploratory factor analysis, and 11 items were used in confirmatory factor analysis. Analysis indicated that the 11-item scale adequately captured measures of the three self-evaluative motives. However, further data reduction produced a 9-item scale with a marked improvement in statistical fit over the 11-item scale.

  2. Statistical analysis of time transfer data from Timation 2. [US Naval Observatory and Australia

    NASA Technical Reports Server (NTRS)

    Luck, J. M.; Morgan, P.

    1974-01-01

    Between July 1973 and January 1974, three time transfer experiments using the Timation 2 satellite were conducted to measure time differences between the U.S. Naval Observatory and Australia. Statistical tests showed that the results are unaffected by the satellite's position with respect to the sunrise/sunset line or by its closest approach azimuth at the Australian station. Further tests revealed that forward predictions of time scale differences, based on the measurements, can be made with high confidence.

  3. Transition-Region Ultraviolet Explosive Events in IRIS Si IV: A Statistical Analysis

    NASA Astrophysics Data System (ADS)

    Bartz, Allison

    2018-01-01

    Explosive events (EEs) in the solar transition region are characterized by broad, non-Gaussian line profiles with wings at Doppler velocities exceeding the speed of sound. We present a statistical analysis of 23 IRIS (Interface Region Imaging Spectrograph) sit-and-stare observations, observed between April 2014 and March 2017. Using the IRIS Si IV 1394 Å and 1403 Å spectral windows and the 1400 Å slit-jaw images, we have identified 581 EEs. We found that most EEs last less than 20 min and have a spatial scale along the slit of less than 10”, agreeing with measurements in previous work. We observed most EEs in active regions, regardless of date of observation, but selection bias of IRIS observations cannot be ruled out. We also present preliminary findings of optical depth effects from our statistical study.

  4. Accounting for standard errors of vision-specific latent trait in regression models.

    PubMed

    Wong, Wan Ling; Li, Xiang; Li, Jialiang; Wong, Tien Yin; Cheng, Ching-Yu; Lamoureux, Ecosse L

    2014-07-11

    To demonstrate the effectiveness of a Hierarchical Bayesian (HB) approach in a modeling framework for association effects that accounts for SEs of vision-specific latent traits assessed using Rasch analysis, a systematic literature review was conducted in four major ophthalmic journals to evaluate Rasch analyses performed on vision-specific instruments. The HB approach was used to synthesize the Rasch model and the multiple linear regression model for the assessment of association effects related to vision-specific latent traits. The effectiveness of this novel HB one-stage "joint-analysis" approach, which allows all model parameters to be estimated simultaneously, was compared with the frequently used two-stage "separate-analysis" approach in our simulation study (Rasch analysis followed by traditional statistical analyses without adjustment for the SE of the latent trait). Sixty-six reviewed articles performed evaluation and validation of vision-specific instruments using Rasch analysis, and 86.4% (n = 57) performed further statistical analyses on the Rasch-scaled data using traditional statistical methods; none took into consideration the SEs of the estimated Rasch-scaled scores. On real data, the two models differed in effect size estimation and in the identification of "independent risk factors." Simulation results showed that our proposed HB one-stage "joint-analysis" approach produces greater accuracy (an average 5-fold decrease in bias) with comparable power and precision in the estimation of associations when compared with the frequently used two-stage "separate-analysis" procedure, despite accounting for greater uncertainty due to the latent trait. Association analyses of patient-reported data scaled using Rasch analysis techniques typically do not take the SE of the latent trait into account. 
The HB one-stage "joint-analysis" is a better approach, producing accurate effect size estimations and information about the independent association of exposure variables with vision-specific latent traits. Copyright 2014 The Association for Research in Vision and Ophthalmology, Inc.

  5. Simulating statistics of lightning-induced and man-made fires

    NASA Astrophysics Data System (ADS)

    Krenn, R.; Hergarten, S.

    2009-04-01

    The frequency-area distributions of forest fires show power-law behavior with scaling exponents α in a quite narrow range, relating wildfire research to the theoretical framework of self-organized criticality. Examples of self-organized critical behavior can be found in computer simulations of simple cellular automata. The established self-organized critical Drossel-Schwabl forest fire model (DS-FFM) is one of the most widespread models in this context. Despite its qualitative agreement with event-size statistics from nature, its applicability is still questioned. Apart from general concerns that the DS-FFM apparently oversimplifies the complex nature of forest dynamics, it significantly overestimates the frequency of large fires. We present a straightforward modification of the model rules that increases the scaling exponent α by approximately 1/3 and brings the simulated event-size statistics close to those observed in nature. In addition, combined simulations of both the original and the modified model predict a dependence of the overall distribution on the ratio of lightning-induced and man-made fires, as well as a difference between their respective event-size statistics. The increase of the scaling exponent with decreasing lightning probability, as well as the splitting of the partial distributions, is confirmed by analysis of the Canadian Large Fire Database. As a consequence, lightning-induced and man-made forest fires cannot be treated separately in wildfire modeling, hazard assessment and forest management.
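A toy version of the DS-FFM rules (tree growth with probability p, lightning with probability f, instantaneous burning of the struck cluster); the parameter values are illustrative only, and a study of the event-size distribution would need much larger lattices and longer runs:

```python
import random

def forest_fire(size=32, p=0.1, f=0.002, steps=40000, seed=6):
    """Toy Drossel-Schwabl automaton on a periodic square lattice.
    Returns the list of fire sizes (burned-cluster areas)."""
    rng = random.Random(seed)
    tree = [[False] * size for _ in range(size)]
    fires = []
    for _ in range(steps):
        i, j = rng.randrange(size), rng.randrange(size)
        if rng.random() < f:
            if tree[i][j]:                        # lightning struck a tree
                stack, burned = [(i, j)], 0
                tree[i][j] = False
                while stack:                      # burn the connected cluster
                    a, b = stack.pop()
                    burned += 1
                    for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        na, nb = (a + da) % size, (b + db) % size
                        if tree[na][nb]:
                            tree[na][nb] = False
                            stack.append((na, nb))
                fires.append(burned)
        elif not tree[i][j] and rng.random() < p:
            tree[i][j] = True                     # tree growth on empty site
    return fires

fires = forest_fire()
```

The histogram of `fires` is the simulated event-size statistic whose power-law exponent the paper's modified rules shift toward the observed values.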

  6. An open-access CMIP5 pattern library for temperature and precipitation: Description and methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lynch, Cary D.; Hartin, Corinne A.; Bond-Lamberty, Benjamin

    Pattern scaling is used to efficiently emulate general circulation models and explore uncertainty in climate projections under multiple forcing scenarios. Pattern scaling methods assume that local climate changes scale with a global mean temperature increase, allowing spatial patterns to be generated for multiple models for any future emission scenario. For uncertainty quantification and probabilistic statistical analysis, a library of patterns with descriptive statistics for each file would be beneficial, but such a library does not presently exist. Of the possible techniques used to generate patterns, the two most prominent are the delta and least-squares regression methods. We explore the differences and statistical significance between patterns generated by each method and assess the performance of the generated patterns across methods and scenarios. Differences in patterns across seasons between methods and epochs were largest at high latitudes (60-90°N/S). Bias and mean errors between modeled and pattern-predicted output from the linear regression method were smaller than for patterns generated by the delta method. Across scenarios, differences in the linear regression method patterns were more statistically significant, especially at high latitudes. We found that pattern generation methodologies were able to approximate the forced signal of change to within ≤ 0.5°C, but the choice of pattern generation methodology for pattern scaling purposes should be informed by user goals and criteria. Finally, this paper describes our library of least-squares regression patterns from all CMIP5 models for temperature and precipitation on an annual and sub-annual basis, along with the code used to generate these patterns.

  7. An open-access CMIP5 pattern library for temperature and precipitation: Description and methodology

    DOE PAGES

    Lynch, Cary D.; Hartin, Corinne A.; Bond-Lamberty, Benjamin; ...

    2017-05-15

    Pattern scaling is used to efficiently emulate general circulation models and explore uncertainty in climate projections under multiple forcing scenarios. Pattern scaling methods assume that local climate changes scale with a global mean temperature increase, allowing spatial patterns to be generated for multiple models for any future emission scenario. For uncertainty quantification and probabilistic statistical analysis, a library of patterns with descriptive statistics for each file would be beneficial, but such a library does not presently exist. Of the possible techniques used to generate patterns, the two most prominent are the delta and least-squares regression methods. We explore the differences and statistical significance between patterns generated by each method and assess the performance of the generated patterns across methods and scenarios. Differences in patterns across seasons between methods and epochs were largest at high latitudes (60-90°N/S). Bias and mean errors between modeled and pattern-predicted output from the linear regression method were smaller than for patterns generated by the delta method. Across scenarios, differences in the linear regression method patterns were more statistically significant, especially at high latitudes. We found that pattern generation methodologies were able to approximate the forced signal of change to within ≤ 0.5°C, but the choice of pattern generation methodology for pattern scaling purposes should be informed by user goals and criteria. Finally, this paper describes our library of least-squares regression patterns from all CMIP5 models for temperature and precipitation on an annual and sub-annual basis, along with the code used to generate these patterns.
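The two pattern-generation methods can be contrasted on a synthetic single-grid-cell series; the 1.8x local amplification, trend, and noise level are invented for illustration, and each function returns the scaling pattern value (local change per degree of global mean change) for that cell:

```python
import random

def delta_pattern(local, tglob, epoch=30):
    """Delta method: epoch-mean local change divided by epoch-mean global
    change (last `epoch` years minus first `epoch` years)."""
    base_l = sum(local[:epoch]) / epoch
    base_g = sum(tglob[:epoch]) / epoch
    fut_l = sum(local[-epoch:]) / epoch
    fut_g = sum(tglob[-epoch:]) / epoch
    return (fut_l - base_l) / (fut_g - base_g)

def regression_pattern(local, tglob):
    """Least-squares method: OLS slope of local temperature on global mean."""
    n = len(local)
    mg, ml = sum(tglob) / n, sum(local) / n
    num = sum((g - mg) * (l - ml) for g, l in zip(tglob, local))
    den = sum((g - mg) ** 2 for g in tglob)
    return num / den

rng = random.Random(7)
tglob = [0.02 * year for year in range(120)]             # global mean warming
local = [1.8 * g + rng.gauss(0, 0.15) for g in tglob]    # cell warms 1.8x global
p_delta = delta_pattern(local, tglob)
p_reg = regression_pattern(local, tglob)
```

The regression estimate uses every year rather than two epochs, which is one intuition for the smaller bias and mean errors reported above.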

  8. Spline analysis of the mandible in human subjects with class III malocclusion.

    PubMed

    Singh, G D; McNamara, J A; Lozanoff, S

    1997-05-01

    This study determines deformations that contribute to a Class III mandibular morphology, employing thin-plate spline (TPS) analysis. A total of 133 lateral cephalographs of prepubertal children of European-American descent with either a Class I molar occlusion or a Class III malocclusion were compared. The cephalographs were traced and checked, and eight homologous landmarks on the mandible were identified and digitized. The datasets were scaled to an equivalent size and subjected to statistical analyses. These tests indicated significant differences between average Class I and Class III mandibular morphologies. When the sample was subdivided into seven age- and sex-matched groups, statistical differences were maintained for each group. TPS analysis indicated that both affine (uniform) and non-affine transformations contribute towards the total spline, and towards the average mandibular morphology at each age group. For the non-affine transformations, partial warp 5 had the highest magnitude, indicating large-scale deformations of the mandibular configuration between articulare and pogonion. In contrast, partial warp 1 indicated localized shape changes in the mandibular symphyseal region. It is concluded that large spatial-scale deformations affect the body of the mandible, in combination with localized distortions further anteriorly. These deformations may represent a developmental elongation of the mandibular corpus antero-posteriorly that, allied with symphyseal changes, leads to the appearance of a Class III prognathic mandibular profile.

  9. Multi-time-scale hydroclimate dynamics of a regional watershed and links to large-scale atmospheric circulation: Application to the Seine river catchment, France

    NASA Astrophysics Data System (ADS)

    Massei, N.; Dieppois, B.; Hannah, D. M.; Lavers, D. A.; Fossa, M.; Laignel, B.; Debret, M.

    2017-03-01

    In the present context of global change, considerable efforts have been deployed by the hydrological scientific community to improve our understanding of the impacts of climate fluctuations on water resources. Both observational and modeling studies have been extensively employed to characterize hydrological changes and trends, assess the impact of climate variability, and provide future scenarios of water resources. With the aim of better understanding hydrological changes, it is of crucial importance to determine how, and to what extent, trends and long-term oscillations detectable in hydrological variables are linked to global climate oscillations. In this work, we develop an approach combining correlation between large and local scales, empirical statistical downscaling, and wavelet multiresolution decomposition of monthly precipitation and streamflow over the Seine river watershed and of the North Atlantic sea level pressure (SLP), in order to gain additional insight into the atmospheric patterns associated with the regional hydrology. We hypothesized that: (i) atmospheric patterns may change according to the different temporal wavelengths defining the variability of the signals; and (ii) defining those hydrological/circulation relationships for each temporal wavelength may improve the determination of large-scale predictors of local variations. The results showed that the links between large and local scales were not necessarily constant across time-scales (i.e. for the different frequencies characterizing the signals), resulting in changing spatial patterns across scales. This was then taken into account by developing an empirical statistical downscaling (ESD) modeling approach that integrated discrete wavelet multiresolution analysis for reconstructing monthly regional hydrometeorological processes (predictands: precipitation and streamflow on the Seine river catchment) from a large-scale predictor (SLP over the Euro-Atlantic sector). 
This approach consisted of three steps: (1) decomposing the large-scale climate and hydrological signals (SLP field, precipitation or streamflow) using discrete wavelet multiresolution analysis; (2) generating a statistical downscaling model per time-scale; (3) summing all scale-dependent models to obtain a final reconstruction of the predictand. The results obtained revealed a significant improvement of the reconstructions for both precipitation and streamflow when using the multiresolution ESD model instead of basic ESD. In particular, the multiresolution ESD model handled very well the significant changes in variance through time observed in either precipitation or streamflow. For instance, the post-1980 period, which was characterized by particularly high amplitudes in interannual-to-interdecadal variability associated with alternating flood and extremely low-flow/drought periods (e.g., winter/spring 2001, summer 2003), could not be reconstructed without integrating wavelet multiresolution analysis into the model. In accordance with previous studies, the wavelet components detected in SLP, precipitation and streamflow on interannual to interdecadal time-scales could be interpreted in terms of the influence of the Gulf Stream oceanic front on atmospheric circulation.
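The three-step, scale-by-scale idea can be sketched with an additive Haar multiresolution in place of the paper's wavelet choice; the per-scale model here is a simple no-intercept regression, and the signals are synthetic:

```python
import random

def haar_components(x, levels):
    """Additive Haar multiresolution: `levels` detail components plus a smooth
    remainder, all at the original length, summing exactly to x.
    Requires len(x) divisible by 2**levels."""
    def coarsen(a):
        return [(a[2 * i] + a[2 * i + 1]) / 2 for i in range(len(a) // 2)]
    def expand(a, factor):
        return [v for v in a for _ in range(factor)]
    comps, cur, factor = [], x[:], 1
    for _ in range(levels):
        nxt = coarsen(cur)
        comps.append([u - v for u, v in
                      zip(expand(cur, factor), expand(nxt, 2 * factor))])
        cur, factor = nxt, 2 * factor
    comps.append(expand(cur, factor))      # smooth remainder
    return comps

def scalewise_downscale(pred, obs, levels=3):
    """Step 1: decompose both signals; step 2: one regression per scale;
    step 3: sum the per-scale fitted components."""
    pc = haar_components(pred, levels)
    oc = haar_components(obs, levels)
    recon = [0.0] * len(obs)
    for p, o in zip(pc, oc):
        beta = sum(a * b for a, b in zip(p, o)) / sum(a * a for a in p)
        recon = [r + beta * a for r, a in zip(recon, p)]
    return recon

rng = random.Random(8)
slp = [rng.gauss(0, 1) for _ in range(256)]             # large-scale predictor
flow = [2.0 * s + rng.gauss(0, 0.2) for s in slp]       # predictand
recon = scalewise_downscale(slp, flow)
err = sum((a - b) ** 2 for a, b in zip(recon, flow)) / len(flow)
```

The exact additivity of the components (they sum back to the original series) is what makes step 3, summing the scale-dependent models, legitimate.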

  10. Analysis of the fluctuations of the tumour/host interface

    NASA Astrophysics Data System (ADS)

    Milotti, Edoardo; Vyshemirsky, Vladislav; Stella, Sabrina; Dogo, Federico; Chignola, Roberto

    2017-11-01

    In a recent analysis of metabolic scaling in solid tumours we found a scaling law that interpolates between the power laws μ ∝ V and μ ∝ V^(2/3), where μ is the metabolic rate expressed as the glucose absorption rate and V is the tumour volume. The scaling law fits both in vitro and in vivo data quite well; however, we also observed marked fluctuations that are associated with the specific biological properties of individual tumours. Here we analyse these fluctuations, in an attempt to find the population-wide distribution of an important parameter (A) which expresses the total extent of the interface between the solid tumour and the non-cancerous environment. Heuristic considerations suggest that the values of the A parameter follow a lognormal distribution, and, allowing for the large uncertainties of the experimental data, our statistical analysis confirms this.
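Testing a lognormal hypothesis for a positive parameter can be sketched by fitting on the log scale and checking the skewness of the logs (near zero if the logs are Gaussian); the sample of A values below is synthetic, not the paper's data:

```python
import math, random

def fit_lognormal(values):
    """Moment estimates of (mu, sigma) on the log scale."""
    logs = [math.log(v) for v in values]
    n = len(logs)
    mu = sum(logs) / n
    sigma = math.sqrt(sum((l - mu) ** 2 for l in logs) / (n - 1))
    return mu, sigma

def log_skewness(values):
    """Sample skewness of the log-values; near zero supports lognormality."""
    logs = [math.log(v) for v in values]
    n = len(logs)
    m = sum(logs) / n
    s2 = sum((l - m) ** 2 for l in logs) / n
    m3 = sum((l - m) ** 3 for l in logs) / n
    return m3 / s2 ** 1.5

rng = random.Random(9)
A = [math.exp(rng.gauss(0.5, 0.8)) for _ in range(3000)]   # synthetic A values
mu, sigma = fit_lognormal(A)
skew = log_skewness(A)
```

A full analysis would add a formal goodness-of-fit test and propagate the measurement uncertainties the abstract mentions.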

  11. Nonlinear filtering properties of detrended fluctuation analysis

    NASA Astrophysics Data System (ADS)

    Kiyono, Ken; Tsujimoto, Yutaka

    2016-11-01

    Detrended fluctuation analysis (DFA) has been widely used for quantifying long-range correlation and fractal scaling behavior. In DFA, to avoid spurious detection of scaling behavior caused by a nonstationary trend embedded in the analyzed time series, a detrending procedure using piecewise least-squares fitting has been applied. However, it has been pointed out that the nonlinear filtering properties involved with detrending may induce instabilities in the scaling exponent estimation. To understand this issue, we investigate the adverse effects of the DFA detrending procedure on the statistical estimation. We show that the detrending procedure using piecewise least-squares fitting results in the nonuniformly weighted estimation of the root-mean-square deviation and that this property could induce an increase in the estimation error. In addition, for comparison purposes, we investigate the performance of a centered detrending moving average analysis with a linear detrending filter and sliding window DFA and show that these methods have better performance than the standard DFA.
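For reference, standard DFA-1 (the procedure whose piecewise least-squares detrending filter is analyzed above) can be sketched as follows; for white noise the log-log slope should come out near alpha = 0.5:

```python
import math, random

def dfa(x, scales):
    """Standard DFA-1: integrate, detrend linearly in non-overlapping windows,
    return (scale, rms fluctuation) pairs; the log-log slope estimates alpha."""
    mean = sum(x) / len(x)
    y, s = [], 0.0
    for v in x:
        s += v - mean
        y.append(s)                            # integrated profile
    out = []
    for n in scales:
        sq, count = 0.0, 0
        for start in range(0, len(y) - n + 1, n):
            seg = y[start:start + n]
            t = list(range(n))
            mt, ms = (n - 1) / 2, sum(seg) / n
            beta = (sum((a - mt) * (b - ms) for a, b in zip(t, seg))
                    / sum((a - mt) ** 2 for a in t))     # local linear trend
            a0 = ms - beta * mt
            sq += sum((b - (a0 + beta * a)) ** 2 for a, b in zip(t, seg))
            count += n
        out.append((n, math.sqrt(sq / count)))
    return out

rng = random.Random(10)
noise = [rng.gauss(0, 1) for _ in range(4096)]
pts = dfa(noise, [8, 16, 32, 64, 128])
logs = [(math.log(n), math.log(f)) for n, f in pts]
alpha = (logs[-1][1] - logs[0][1]) / (logs[-1][0] - logs[0][0])
```

The piecewise fit inside each window is exactly the detrending step whose nonuniform weighting the paper analyzes.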

  12. Statistical properties of edge plasma turbulence in the Large Helical Device

    NASA Astrophysics Data System (ADS)

    Dewhurst, J. M.; Hnat, B.; Ohno, N.; Dendy, R. O.; Masuzaki, S.; Morisaki, T.; Komori, A.

    2008-09-01

    Ion saturation current (Isat) measurements made by three tips of a Langmuir probe array in the Large Helical Device are analysed for two plasma discharges. Absolute moment analysis is used to quantify properties on different temporal scales of the measured signals, which are bursty and intermittent. Strong coherent modes in some datasets are found to distort this analysis and are consequently removed from the time series by applying bandstop filters. Absolute moment analysis of the filtered data reveals two regions of power-law scaling, with the temporal scale τ ≈ 40 µs separating the two regimes. A comparison is made with similar results from the Mega-Amp Spherical Tokamak. The probability density function is studied and a monotonic relationship between connection length and skewness is found. Conditional averaging is used to characterize the average temporal shape of the largest intermittent bursts.

  13. Tree-space statistics and approximations for large-scale analysis of anatomical trees.

    PubMed

    Feragen, Aasa; Owen, Megan; Petersen, Jens; Wille, Mathilde M W; Thomsen, Laura H; Dirksen, Asger; de Bruijne, Marleen

    2013-01-01

    Statistical analysis of anatomical trees is hard to perform due to differences in the topological structure of the trees. In this paper we define statistical properties of leaf-labeled anatomical trees with geometric edge attributes by considering the anatomical trees as points in the geometric space of leaf-labeled trees. This tree-space is a geodesic metric space where any two trees are connected by a unique shortest path, which corresponds to a tree deformation. However, tree-space is not a manifold, and the usual strategy of performing statistical analysis in a tangent space and projecting onto tree-space is not available. Using tree-space and its shortest paths, a variety of statistical properties, such as mean, principal component, hypothesis testing and linear discriminant analysis can be defined. For some of these properties it is still an open problem how to compute them; others (like the mean) can be computed, but efficient alternatives are helpful in speeding up algorithms that use means iteratively, like hypothesis testing. In this paper, we take advantage of a very large dataset (N = 8016) to obtain computable approximations, under the assumption that the data trees parametrize the relevant parts of tree-space well. Using the developed approximate statistics, we illustrate how the structure and geometry of airway trees vary across a population and show that airway trees with Chronic Obstructive Pulmonary Disease come from a different distribution in tree-space than healthy ones. Software is available from http://image.diku.dk/aasa/software.php.

  14. Nonlinear analysis of pupillary dynamics.

    PubMed

    Onorati, Francesco; Mainardi, Luca Tommaso; Sirca, Fabiola; Russo, Vincenzo; Barbieri, Riccardo

    2016-02-01

    Pupil size reflects autonomic response to different environmental and behavioral stimuli, and its dynamics have been linked to other autonomic correlates such as cardiac and respiratory rhythms. The aim of this study is to assess the nonlinear characteristics of pupil size of 25 normal subjects who participated in a psychophysiological experimental protocol with four experimental conditions, namely “baseline”, “anger”, “joy”, and “sadness”. Nonlinear measures, such as sample entropy, correlation dimension, and largest Lyapunov exponent, were computed on reconstructed signals of spontaneous fluctuations of pupil dilation. Nonparametric statistical tests were performed on surrogate data to verify that the nonlinear measures are an intrinsic characteristic of the signals. We then developed and applied a piecewise linear regression model to detrended fluctuation analysis (DFA). Two joinpoints and three scaling intervals were identified: slope α0, at slow time scales, represents a persistent nonstationary long-range correlation, whereas α1 and α2, at middle and fast time scales, respectively, represent long-range power-law correlations, similarly to DFA applied to heart rate variability signals. Of the computed complexity measures, α0 showed statistically significant differences among experimental conditions (p<0.001). Our results suggest that (a) pupil size at constant light condition is characterized by nonlinear dynamics, (b) three well-defined and distinct long-memory processes exist at different time scales, and (c) autonomic stimulation is partially reflected in nonlinear dynamics.
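A minimal sketch of the detrended fluctuation analysis underlying the three scaling intervals described above, assuming first-order (linear) detrending; the piecewise regression with joinpoints would then be fit to log F(n) versus log n. The white-noise test signal is an illustrative assumption, not the pupillometry data.

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis: RMS fluctuation F(n) at window size n."""
    y = np.cumsum(x - np.mean(x))               # integrated profile
    F = []
    for n in scales:
        n_win = len(y) // n
        segments = y[: n_win * n].reshape(n_win, n)
        t = np.arange(n)
        resid = []
        for seg in segments:                    # linear (DFA-1) detrend per window
            coef = np.polyfit(t, seg, 1)
            resid.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(resid)))
    return np.array(F)

rng = np.random.default_rng(1)
white = rng.normal(size=50_000)
scales = np.unique(np.logspace(1, 3, 12).astype(int))
F = dfa(white, scales)
alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
print(round(alpha, 2))  # near 0.5 for uncorrelated noise
```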

  15. Construction of cosmic string induced temperature anisotropy maps with CMBFAST and statistical analysis

    NASA Astrophysics Data System (ADS)

    Simatos, N.; Perivolaropoulos, L.

    2001-01-01

    We use the publicly available code CMBFAST, as modified by Pogosian and Vachaspati, to simulate the effects of wiggly cosmic strings on the cosmic microwave background (CMB). Using the modified CMBFAST code, which takes into account vector modes and models wiggly cosmic strings by the one-scale model, we go beyond the angular power spectrum to construct CMB temperature maps with a resolution of a few degrees. The statistics of these maps are then studied using conventional and recently proposed statistical tests optimized for the detection of hidden temperature discontinuities induced by the Gott-Kaiser-Stebbins effect. We show, however, that these realistic maps cannot be distinguished in a statistically significant way from purely Gaussian maps with an identical power spectrum.

  16. Scaling characteristics of one-dimensional fractional diffusion processes in the presence of power-law distributed random noise

    NASA Astrophysics Data System (ADS)

    Nezhadhaghighi, Mohsen Ghasemi

    2017-08-01

    Here, we present results of numerical simulations and the scaling characteristics of one-dimensional random fluctuations with heavy-tailed probability distribution functions. Assuming that the distribution function of the random fluctuations obeys Lévy statistics with a power-law scaling exponent, we investigate the fractional diffusion equation in the presence of μ-stable Lévy noise. We study the scaling properties of the global width and two-point correlation functions and then compare the analytical and numerical results for the growth exponent β and the roughness exponent α. We also investigate the fractional Fokker-Planck equation for heavy-tailed random fluctuations. We show that the fractional diffusion processes in the presence of μ-stable Lévy noise display special scaling properties in the probability distribution function (PDF). Finally, we numerically study the scaling properties of the heavy-tailed random fluctuations by using the diffusion entropy analysis. This method is based on the evaluation of the Shannon entropy of the PDF generated by the random fluctuations, rather than on the measurement of the global width of the process. We apply the diffusion entropy analysis to extract the growth exponent β and to confirm the validity of our numerical analysis.

  17. Scaling characteristics of one-dimensional fractional diffusion processes in the presence of power-law distributed random noise.

    PubMed

    Nezhadhaghighi, Mohsen Ghasemi

    2017-08-01

    Here, we present results of numerical simulations and the scaling characteristics of one-dimensional random fluctuations with heavy-tailed probability distribution functions. Assuming that the distribution function of the random fluctuations obeys Lévy statistics with a power-law scaling exponent, we investigate the fractional diffusion equation in the presence of μ-stable Lévy noise. We study the scaling properties of the global width and two-point correlation functions and then compare the analytical and numerical results for the growth exponent β and the roughness exponent α. We also investigate the fractional Fokker-Planck equation for heavy-tailed random fluctuations. We show that the fractional diffusion processes in the presence of μ-stable Lévy noise display special scaling properties in the probability distribution function (PDF). Finally, we numerically study the scaling properties of the heavy-tailed random fluctuations by using the diffusion entropy analysis. This method is based on the evaluation of the Shannon entropy of the PDF generated by the random fluctuations, rather than on the measurement of the global width of the process. We apply the diffusion entropy analysis to extract the growth exponent β and to confirm the validity of our numerical analysis.
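The diffusion entropy analysis described in this record can be sketched as follows; the Gaussian random-walk test case and the histogram binning are illustrative assumptions. The scaling exponent δ is the slope of the Shannon entropy S(t) against ln t, with δ = 0.5 for ordinary Gaussian diffusion.

```python
import numpy as np

def diffusion_entropy(steps, times, bins=40):
    """Shannon entropy S(t) of the displacement PDF at each time t.
    For scaling PDFs, S(t) = const + delta * ln(t)."""
    traj = np.cumsum(steps, axis=1)            # many independent walks
    S = []
    for t in times:
        p, edges = np.histogram(traj[:, t - 1], bins=bins, density=True)
        w = np.diff(edges)
        mask = p > 0
        S.append(-np.sum(p[mask] * np.log(p[mask]) * w[mask]))
    return np.array(S)

rng = np.random.default_rng(2)
steps = rng.normal(size=(5_000, 1000))         # 5000 Gaussian random walks
times = np.unique(np.logspace(1, 3, 10).astype(int))
S = diffusion_entropy(steps, times)
delta, _ = np.polyfit(np.log(times), S, 1)
print(round(delta, 2))  # near 0.5 for ordinary diffusion
```

For μ-stable Lévy steps, the same construction would recover δ > 0.5, which is how the analysis distinguishes anomalous from ordinary diffusion.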

  18. IRIS COLOUR CLASSIFICATION SCALES – THEN AND NOW

    PubMed Central

    Grigore, Mariana; Avram, Alina

    2015-01-01

    Eye colour is one of the most obvious phenotypic traits of an individual. Since the first documented classification scale developed in 1843, there have been numerous attempts to classify the iris colour. In the past centuries, iris colour classification scales have had various colour categories and mostly relied on comparison of an individual’s eye with painted glass eyes. Once photography techniques were refined, standard iris photographs replaced painted eyes, but this did not solve the problem of painted/printed colour variability in time. Early clinical scales were easy to use, but lacked objectivity and were not standardised or statistically tested for reproducibility. The era of automated iris colour classification systems came with technological development. Spectrophotometry, digital analysis of high-resolution iris images, hyperspectral analysis of the real human iris and dedicated iris colour analysis software all accomplished an objective, accurate iris colour classification, but are quite expensive and limited in use to research environments. Iris colour classification systems evolved continuously due to their use in a wide range of studies, especially in the fields of anthropology, epidemiology and genetics. Despite the wide range of existing scales, there has up to the present been no generally accepted iris colour classification scale. PMID:27373112

  19. IRIS COLOUR CLASSIFICATION SCALES--THEN AND NOW.

    PubMed

    Grigore, Mariana; Avram, Alina

    2015-01-01

    Eye colour is one of the most obvious phenotypic traits of an individual. Since the first documented classification scale developed in 1843, there have been numerous attempts to classify the iris colour. In the past centuries, iris colour classification scales have had various colour categories and mostly relied on comparison of an individual's eye with painted glass eyes. Once photography techniques were refined, standard iris photographs replaced painted eyes, but this did not solve the problem of painted/printed colour variability in time. Early clinical scales were easy to use, but lacked objectivity and were not standardised or statistically tested for reproducibility. The era of automated iris colour classification systems came with technological development. Spectrophotometry, digital analysis of high-resolution iris images, hyperspectral analysis of the real human iris and dedicated iris colour analysis software all accomplished an objective, accurate iris colour classification, but are quite expensive and limited in use to research environments. Iris colour classification systems evolved continuously due to their use in a wide range of studies, especially in the fields of anthropology, epidemiology and genetics. Despite the wide range of existing scales, there has up to the present been no generally accepted iris colour classification scale.

  20. The changing landscape of astrostatistics and astroinformatics

    NASA Astrophysics Data System (ADS)

    Feigelson, Eric D.

    2017-06-01

    The history and current status of the cross-disciplinary fields of astrostatistics and astroinformatics are reviewed. Astronomers need a wide range of statistical methods for both data reduction and science analysis. With the proliferation of high-throughput telescopes, efficient large scale computational methods are also becoming essential. However, astronomers receive only weak training in these fields during their formal education. Interest in the fields is rapidly growing with conferences organized by scholarly societies, textbooks and tutorial workshops, and research studies pushing the frontiers of methodology. R, the premier language of statistical computing, can provide an important software environment for the incorporation of advanced statistical and computational methodology into the astronomical community.

  1. Nonlinear dynamics of the cellular-automaton ``game of Life''

    NASA Astrophysics Data System (ADS)

    Garcia, J. B. C.; Gomes, M. A. F.; Jyh, T. I.; Ren, T. I.; Sales, T. R. M.

    1993-11-01

    A statistical analysis of the ``game of Life'' due to Conway [Berlekamp, Conway, and Guy, Winning Ways for Your Mathematical Plays (Academic, New York, 1982), Vol. 2] is reported. The results are based on extensive computer simulations starting with uncorrelated distributions of live sites at t=0. The number n(s,t) of clusters of s live sites at time t, the mean cluster size s¯(t), and the diversity of sizes among other statistical functions are obtained. The dependence of the statistical functions with the initial density of live sites is examined. Several scaling relations as well as static and dynamic critical exponents are found.
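Statistics such as n(s,t) and the mean cluster size start from identifying clusters of live sites on the lattice. A minimal flood-fill sketch, assuming 4-connectivity (the paper's connectivity convention is not stated here):

```python
import numpy as np

def cluster_sizes(grid):
    """Sizes of 4-connected clusters of live sites (1s), found by flood fill."""
    grid = np.asarray(grid)
    seen = np.zeros_like(grid, dtype=bool)
    sizes = []
    rows, cols = grid.shape
    for i in range(rows):
        for j in range(cols):
            if grid[i, j] and not seen[i, j]:
                stack, size = [(i, j)], 0
                seen[i, j] = True
                while stack:
                    r, c = stack.pop()
                    size += 1
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        nr, nc = r + dr, c + dc
                        if 0 <= nr < rows and 0 <= nc < cols \
                                and grid[nr, nc] and not seen[nr, nc]:
                            seen[nr, nc] = True
                            stack.append((nr, nc))
                sizes.append(size)
    return sorted(sizes)

# A three-cell row (e.g. a blinker) and an isolated live cell:
g = [[0, 1, 1, 1, 0],
     [0, 0, 0, 0, 0],
     [0, 0, 0, 0, 1]]
print(cluster_sizes(g))  # [1, 3]
```

From the size list, n(s,t) is the histogram of sizes at time t and the mean cluster size is its average.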

  2. Anima: Modular Workflow System for Comprehensive Image Data Analysis

    PubMed Central

    Rantanen, Ville; Valori, Miko; Hautaniemi, Sampsa

    2014-01-01

    Modern microscopes produce vast amounts of image data, and computational methods are needed to analyze and interpret these data. Furthermore, a single image analysis project may require tens or hundreds of analysis steps, starting from data import and pre-processing to segmentation and statistical analysis, and ending with visualization and reporting. To manage such large-scale image data analysis projects, we present here a modular workflow system called Anima. Anima is designed for comprehensive and efficient image data analysis development, and it contains several features that are crucial in high-throughput image data analysis: programming language independence, batch processing, easily customized data processing, interoperability with other software via application programming interfaces, and advanced multivariate statistical analysis. The utility of Anima is shown with two case studies focusing on testing different algorithms developed in different imaging platforms and an automated prediction of alive/dead C. elegans worms by integrating several analysis environments. Anima is fully open source and available with documentation at www.anduril.org/anima. PMID:25126541

  3. Excerpts from selected LANDSAT 1 final reports in geology

    NASA Technical Reports Server (NTRS)

    Short, N. M.; Smith, A.; Baker, R.

    1976-01-01

    The standard formats for the summaries of selected LANDSAT geological data are presented as checklists. These include: (1) value of LANDSAT data to geology, (2) geologic benefits, (3) follow up studies, (4) cost benefits, (5) optimistic working scales, (6) statistical analysis, and (7) enhancement effects.

  4. Thinking big: linking rivers to landscapes

    Treesearch

    Joan O’Callaghan; Ashley E. Steel; Kelly M. Burnett

    2012-01-01

    Exploring relationships between landscape characteristics and rivers is an emerging field, enabled by the proliferation of satellite data, advances in statistical analysis, and increased emphasis on large-scale monitoring. Landscape features such as road networks, underlying geology, and human developments determine the characteristics of the rivers flowing through...

  5. Mass detection, localization and estimation for wind turbine blades based on statistical pattern recognition

    NASA Astrophysics Data System (ADS)

    Colone, L.; Hovgaard, M. K.; Glavind, L.; Brincker, R.

    2018-07-01

    A method for mass change detection on wind turbine blades using natural frequencies is presented. The approach is based on two statistical tests. The first test decides if there is a significant mass change and the second test is a statistical group classification based on Linear Discriminant Analysis. The frequencies are identified by means of Operational Modal Analysis using natural excitation. Based on the assumption of Gaussianity of the frequencies, a multi-class statistical model is developed by combining finite element model sensitivities in 10 classes of change location on the blade, the smallest area being 1/5 of the span. The method is experimentally validated for a full scale wind turbine blade in a test setup and loaded by natural wind. Mass change from natural causes was imitated with sand bags and the algorithm was observed to perform well with an experimental detection rate of 1, localization rate of 0.88 and mass estimation rate of 0.72.

  6. Statistical Surrogate Modeling of Atmospheric Dispersion Events Using Bayesian Adaptive Splines

    NASA Astrophysics Data System (ADS)

    Francom, D.; Sansó, B.; Bulaevskaya, V.; Lucas, D. D.

    2016-12-01

    Uncertainty in the inputs of complex computer models, including atmospheric dispersion and transport codes, is often assessed via statistical surrogate models. Surrogate models are computationally efficient statistical approximations of expensive computer models that enable uncertainty analysis. We introduce Bayesian adaptive spline methods for producing surrogate models that capture the major spatiotemporal patterns of the parent model, while satisfying all the necessities of flexibility, accuracy and computational feasibility. We present novel methodological and computational approaches motivated by a controlled atmospheric tracer release experiment conducted at the Diablo Canyon nuclear power plant in California. Traditional methods for building statistical surrogate models often do not scale well to experiments with large amounts of data. Our approach is well suited to experiments involving large numbers of model inputs, large numbers of simulations, and functional output for each simulation. Our approach allows us to perform global sensitivity analysis with ease. We also present an approach to calibration of simulators using field data.

  7. Automatic small bowel tumor diagnosis by using multi-scale wavelet-based analysis in wireless capsule endoscopy images.

    PubMed

    Barbosa, Daniel C; Roupar, Dalila B; Ramos, Jaime C; Tavares, Adriano C; Lima, Carlos S

    2012-01-11

    Wireless capsule endoscopy has been introduced as an innovative, non-invasive diagnostic technique for evaluation of the gastrointestinal tract, reaching places where conventional endoscopy is unable to reach. However, the output of this technique is an 8-hour video, whose analysis by the expert physician is very time-consuming. Thus, a computer-assisted diagnosis tool to help physicians evaluate CE exams faster and more accurately is an important technical challenge and an excellent economic opportunity. The set of features proposed in this paper to code textural information is based on statistical modeling of second-order textural measures extracted from co-occurrence matrices. To cope with both joint and marginal non-Gaussianity of second-order textural measures, higher-order moments are used. These statistical moments are taken from the two-dimensional color-scale feature space, where two different scales are considered. Second- and higher-order moments of textural measures are computed from the co-occurrence matrices of images synthesized by the inverse wavelet transform of the wavelet transform containing only the selected scales for the three color channels. The dimensionality of the data is reduced by using Principal Component Analysis. The proposed textural features are then used as the input of a classifier based on artificial neural networks. Classification performances of 93.1% specificity and 93.9% sensitivity are achieved on real data. These promising results open the path towards a deeper study regarding the applicability of this algorithm in computer-aided diagnosis systems to assist physicians in their clinical practice.

  8. Statistical physics approaches to financial fluctuations

    NASA Astrophysics Data System (ADS)

    Wang, Fengzhong

    2009-12-01

    Complex systems attract many researchers from various scientific fields. Financial markets are one of these widely studied complex systems. Statistical physics, which was originally developed to study large systems, provides novel ideas and powerful methods to analyze financial markets. The study of financial fluctuations characterizes market behavior, and helps to better understand the underlying market mechanism. Our study focuses on volatility, a fundamental quantity to characterize financial fluctuations. We examine equity data of the entire U.S. stock market during 2001 and 2002. To analyze the volatility time series, we develop a new approach, called return interval analysis, which examines the time intervals between two successive volatilities exceeding a given threshold. We find that the return interval distribution displays scaling over a wide range of thresholds. This scaling is valid for a range of time windows, from one minute up to one day. Moreover, our results are similar for commodities, interest rates, currencies, and for stocks of different countries. Further analysis shows some systematic deviations from a scaling law, which we can attribute to nonlinear correlations in the volatility time series. We also find a memory effect in return intervals for different time scales, which is related to the long-term correlations in the volatility. To further characterize the mechanism of price movement, we simulate the volatility time series using two different models, fractionally integrated generalized autoregressive conditional heteroscedasticity (FIGARCH) and fractional Brownian motion (fBm), and test these models with the return interval analysis. We find that both models can mimic time memory but only fBm shows scaling in the return interval distribution. In addition, we examine the volatility of daily opening to closing and of closing to opening. We find that each volatility distribution has a power-law tail. Using the detrended fluctuation analysis (DFA) method, we show long-term auto-correlations in these volatility time series. We also analyze return, the actual price changes of stocks, and find that the returns over the two sessions are often anti-correlated.
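The basic construction behind return interval analysis can be sketched as follows, using a stand-in volatility series rather than the equity data studied above; the scaling analysis in the record additionally rescales the intervals by their mean before comparing thresholds.

```python
import numpy as np

def return_intervals(vol, threshold):
    """Time intervals between successive values exceeding `threshold`."""
    idx = np.flatnonzero(vol > threshold)
    return np.diff(idx)

rng = np.random.default_rng(3)
vol = np.abs(rng.normal(size=100_000))     # stand-in volatility series
q = np.quantile(vol, 0.99)                 # threshold at the upper 1% tail
tau = return_intervals(vol, q)
print(round(tau.mean()))  # ~100: mean interval is about 1 / exceedance probability
```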

  9. An instrument to assess the statistical intensity of medical research papers.

    PubMed

    Nieminen, Pentti; Virtanen, Jorma I; Vähänikkilä, Hannu

    2017-01-01

    There is widespread evidence that statistical methods play an important role in original research articles, especially in medical research. The evaluation of statistical methods and reporting in journals suffers from a lack of standardized methods for assessing the use of statistics. The objective of this study was to develop and evaluate an instrument to assess the statistical intensity in research articles in a standardized way. A checklist-type measure scale was developed by selecting and refining items from previous reports about the statistical contents of medical journal articles and from published guidelines for statistical reporting. A total of 840 original medical research articles that were published between 2007 and 2015 in 16 journals were evaluated to test the scoring instrument. The total sum of all items was used to assess the intensity between sub-fields and journals. Inter-rater agreement was examined using a random sample of 40 articles. Four raters read and evaluated the selected articles using the developed instrument. The scale consisted of 66 items. The total summary score adequately discriminated between research articles according to their study design characteristics. The new instrument could also discriminate between journals according to their statistical intensity. The inter-observer agreement measured by the intraclass correlation coefficient (ICC) was 0.88 between all four raters. Individual item analysis showed very high agreement between the rater pairs; the percentage agreement ranged from 91.7% to 95.2%. A reliable and applicable instrument for evaluating the statistical intensity in research papers was developed. It is a helpful tool for comparing the statistical intensity between sub-fields and journals. The novel instrument may be applied in manuscript peer review to identify papers in need of additional statistical review.

  10. The General Assessment of Personality Disorder (GAPD): factor structure, incremental validity of self-pathology, and relations to DSM-IV personality disorders.

    PubMed

    Hentschel, Annett G; Livesley, W John

    2013-01-01

    Recent developments in the classification of personality disorder, especially moves toward more dimensional systems, create the need to assess general personality disorder apart from individual differences in personality pathology. The General Assessment of Personality Disorder (GAPD) is a self-report questionnaire designed to evaluate general personality disorder. The measure evaluates 2 major components of disordered personality: self or identity problems and interpersonal dysfunction. This study explores whether there is a single factor reflecting general personality pathology as proposed by the Diagnostic and Statistical Manual of Mental Disorders (5th ed.), whether self-pathology has incremental validity over interpersonal pathology as measured by GAPD, and whether GAPD scales relate significantly to Diagnostic and Statistical Manual of Mental Disorders (4th ed. [DSM-IV]) personality disorders. Based on responses from a German psychiatric sample of 149 participants, parallel analysis yielded a 1-factor model. Self Pathology scales of the GAPD increased the predictive validity of the Interpersonal Pathology scales of the GAPD. The GAPD scales showed a moderate to high correlation for 9 of 12 DSM-IV personality disorders.

  11. A review of downscaling procedures - a contribution to the research on climate change impacts at city scale

    NASA Astrophysics Data System (ADS)

    Smid, Marek; Costa, Ana; Pebesma, Edzer; Granell, Carlos; Bhattacharya, Devanjan

    2016-04-01

    Humankind is now predominantly urban-based, and the majority of continuing population growth will take place in urban agglomerations. Urban systems are not only major drivers of climate change, but also the impact hot spots. Furthermore, climate change impacts are commonly managed at city scale. Therefore, assessing climate change impacts on urban systems is a very relevant subject of research. Climate and its impacts on all levels (local, meso and global scale), and also the inter-scale dependencies of those processes, should be subject to detailed analysis. While global and regional projections of future climate are currently available, local-scale information is lacking. Hence, statistical downscaling methodologies represent a potentially efficient way to help close this gap. In general, methodological reviews of downscaling procedures cover the various methods according to their application (e.g. downscaling for hydrological modelling). Some of the most recent and comprehensive studies, such as the ESSEM COST Action ES1102 (VALUE), use the concepts of Perfect Prog and MOS. Other examples of classification schemes of downscaling techniques consider three main categories: linear methods, weather classifications and weather generators. Downscaling and climate modelling represent a multidisciplinary field, where researchers from various backgrounds intersect their efforts, resulting in specific terminology, which may be somewhat confusing. For instance, Polynomial Regression (also called Surface Trend Analysis) is a statistical technique; in the context of spatial interpolation procedures, it is commonly classified as a deterministic technique, whereas kriging approaches are classified as stochastic. Furthermore, the terms "statistical" and "stochastic" (frequently used as names of sub-classes in downscaling methodological reviews) are not always considered synonymous, even though both terms could be seen as identical, since they refer to methods handling input modelling factors as variables with certain probability distributions. In addition, recent development is moving towards multi-step methodologies containing deterministic and stochastic components. This evolution leads to the introduction of new terms like hybrid or semi-stochastic approaches, which makes efforts to systematically classify downscaling methods into the previously defined categories even more challenging. This work presents a review of statistical downscaling procedures, which classifies the methods in two steps. In the first step, we describe several techniques that produce a single climatic surface based on observations. The methods are classified into two categories using an approximation to the broadest consensual statistical terms: linear and non-linear methods. The second step covers techniques that use simulations to generate alternative surfaces, which correspond to different realizations of the same processes. Those simulations are essential because real observational data are limited, and such procedures are crucial for modelling extremes. This work emphasises the link between statistical downscaling methods and the research of climate change impacts at city scale.

  12. The Lévy flight foraging hypothesis: forgetting about memory may lead to false verification of Brownian motion.

    PubMed

    Gautestad, Arild O; Mysterud, Atle

    2013-01-01

    The Lévy flight foraging hypothesis predicts a transition from scale-free Lévy walk (LW) to scale-specific Brownian motion (BM) as an animal moves from resource-poor towards resource-rich environment. However, the LW-BM continuum implies a premise of memory-less search, which contradicts the cognitive capacity of vertebrates. We describe methods to test if apparent support for LW-BM transitions may rather be a statistical artifact from movement under varying intensity of site fidelity. A higher frequency of returns to previously visited patches (stronger site fidelity) may erroneously be interpreted as a switch from LW towards BM. Simulations of scale-free, memory-enhanced space use illustrate how the ratio between return events and scale-free exploratory movement translates to varying strength of site fidelity. An expanded analysis of GPS data of 18 female red deer, Cervus elaphus, strengthens previous empirical support of memory-enhanced and scale-free space use in a northern forest ecosystem. A statistical mechanical model architecture that describes foraging under environment-dependent variation of site fidelity may allow for higher realism of optimal search models and movement ecology in general, in particular for vertebrates with high cognitive capacity.

  13. Development and validation of 26-item dysfunctional attitude scale.

    PubMed

    Ebrahimi, Amrollah; Samouei, Rahele; Mousavii, Sayyed Ghafour; Bornamanesh, Ali Reza

    2013-06-01

    The Dysfunctional Attitude Scale is one of the most common instruments used to assess cognitive vulnerability. This study aimed to develop and validate a short form of the Dysfunctional Attitude Scale appropriate for an Iranian clinical population. Participants were 160 psychiatric patients from medical centers affiliated with Isfahan Medical University, as well as 160 non-patients. Research instruments were clinical interviews based on the Diagnostic and Statistical Manual-IV-TR, the Dysfunctional Attitude Scale and the General Health Questionnaire (GHQ-28). Data were analyzed using multicorrelation calculations and factor analysis. Based on the results of factor analysis and item-total correlation, 14 items were judged candidates for omission. Analysis of the 26-item Dysfunctional Attitude Scale (DAS-26) revealed a Cronbach's alpha of 0.92. Evidence for concurrent criterion validity was obtained by calculating the correlation between the Dysfunctional Attitude Scale and psychiatric diagnosis (r = 0.55), the GHQ-28 (r = 0.56) and its somatization, anxiety, social dysfunction, and depression subscales (0.45, 0.53, 0.48, and 0.57, respectively). Factor analysis deemed a four-factor structure best. The factors were labeled as success-perfectionism, need for approval, need for satisfying others, and vulnerability-performance evaluation. The results showed that the Iranian version of the Dysfunctional Attitude Scale (DAS-26) bears satisfactory psychometric properties, suggesting that this cognitive instrument is appropriate for use in an Iranian cultural context. Copyright © 2012 Wiley Publishing Asia Pty Ltd.
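The Cronbach's alpha of 0.92 reported above is computed from item and total-score variances as α = k/(k−1) · (1 − Σ var_item / var_total); a sketch with hypothetical toy scores, not the study's data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Toy data: 5 respondents x 4 items with fairly consistent answers.
scores = np.array([[3, 4, 3, 4],
                   [2, 2, 3, 2],
                   [4, 4, 4, 5],
                   [1, 2, 1, 1],
                   [3, 3, 4, 3]])
alpha = cronbach_alpha(scores)
print(round(alpha, 2))  # 0.95 for this toy sample
```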

  14. Environmental analysis using integrated GIS and remotely sensed data - Some research needs and priorities

    NASA Technical Reports Server (NTRS)

    Davis, Frank W.; Quattrochi, Dale A.; Ridd, Merrill K.; Lam, Nina S.-N.; Walsh, Stephen J.

    1991-01-01

    This paper discusses some basic scientific issues and research needs in the joint processing of remotely sensed and GIS data for environmental analysis. Two general topics are treated in detail: (1) scale dependence of geographic data and the analysis of multiscale remotely sensed and GIS data, and (2) data transformations and information flow during data processing. The discussion of scale dependence focuses on the theory and applications of spatial autocorrelation, geostatistics, and fractals for characterizing and modeling spatial variation. Data transformations during processing are described within the larger framework of geographical analysis, encompassing sampling, cartography, remote sensing, and GIS. Development of better user interfaces between image processing, GIS, database management, and statistical software is needed to expedite research on these and other impediments to integrated analysis of remotely sensed and GIS data.

  15. Robustness of S1 statistic with Hodges-Lehmann for skewed distributions

    NASA Astrophysics Data System (ADS)

    Ahad, Nor Aishah; Yahaya, Sharipah Soaad Syed; Yin, Lee Ping

    2016-10-01

    Analysis of variance (ANOVA) is a commonly used parametric method to test differences in means for more than two groups when the populations are normally distributed. ANOVA is highly inefficient under non-normal and heteroscedastic settings. When the assumptions are violated, researchers look for alternatives such as the nonparametric Kruskal-Wallis test or robust methods. This study focused on a flexible method, the S1 statistic, for comparing groups using the median as the location estimator. The S1 statistic was modified by substituting the median with the Hodges-Lehmann estimator, and the default scale estimator with the variance of Hodges-Lehmann and MADn, to produce two different test statistics for comparing groups. The bootstrap method was used for testing the hypotheses, since the sampling distributions of these modified S1 statistics are unknown. The performance of the proposed statistics in terms of Type I error was measured and compared against the original S1 statistic, ANOVA and Kruskal-Wallis. The proposed procedures show improvement over the original statistic, especially under extremely skewed distributions.
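The one-sample Hodges-Lehmann estimator substituted into the S1 statistic is the median of all pairwise Walsh averages; a minimal sketch (the modified S1 statistic and its bootstrap test are not reproduced here):

```python
import numpy as np
from itertools import combinations_with_replacement

def hodges_lehmann(x):
    """One-sample Hodges-Lehmann estimator: the median of all Walsh
    averages (x_i + x_j) / 2 over pairs with i <= j."""
    walsh = [(a + b) / 2 for a, b in combinations_with_replacement(x, 2)]
    return float(np.median(walsh))

print(hodges_lehmann([1, 2, 3]))        # 2.0
print(hodges_lehmann([1, 2, 3, 100]))   # 2.75: robust against the outlier
```

Unlike the mean (26.5 for the second sample), the estimate barely moves when an extreme value is added, which is why it suits skewed distributions.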

  16. Assessment of Reliable Change Using 95% Credible Intervals for the Differences in Proportions: A Statistical Analysis for Case-Study Methodology.

    PubMed

    Unicomb, Rachael; Colyvas, Kim; Harrison, Elisabeth; Hewat, Sally

    2015-06-01

    Case-study methodology is often used in the field of speech-language pathology to study change, but it can be criticized for not being statistically robust. Yet with the heterogeneous nature of many communication disorders, case studies allow clinicians and researchers to closely observe and report on change. Such information is valuable and can further inform large-scale experimental designs. In this research note, a statistical analysis for case-study data is outlined that employs a modification to the Reliable Change Index (Jacobson & Truax, 1991). The relationship between reliable change and clinical significance is discussed. Example data are used to guide the reader through the use and application of this analysis. A method of analysis is detailed that is suitable for assessing change in measures with binary categorical outcomes. The analysis is illustrated using data from one individual, measured before and after treatment for stuttering. The application of this approach to assess change in categorical, binary data has potential application in speech-language pathology. It enables clinicians and researchers to analyze results from case studies for their statistical and clinical significance. This new method addresses a gap in the research design literature, namely the lack of analysis methods for noncontinuous data (such as counts, rates, and proportions of events) that may be used in case-study designs.
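    As a rough illustration of the flavor of this analysis (not the authors' exact formulation), a 95% credible interval for a post-minus-pre difference in proportions can be obtained by Monte Carlo sampling from independent Beta posteriors under uniform priors; the counts in the usage example are hypothetical.

    ```python
    import numpy as np

    def credible_interval_diff_proportions(k_pre, n_pre, k_post, n_post,
                                           level=0.95, draws=100_000, seed=0):
        """Monte Carlo credible interval for (post - pre) difference in
        proportions, using independent Beta(1, 1)-prior posteriors.
        A sketch of the general approach; the paper's formulation may differ."""
        rng = np.random.default_rng(seed)
        p_pre = rng.beta(k_pre + 1, n_pre - k_pre + 1, draws)
        p_post = rng.beta(k_post + 1, n_post - k_post + 1, draws)
        diff = p_post - p_pre
        lo, hi = np.quantile(diff, [(1 - level) / 2, (1 + level) / 2])
        # Change is "reliable" if the interval excludes zero
        return lo, hi, not (lo <= 0.0 <= hi)
    ```

    For hypothetical counts of 40/100 stuttered events before treatment and 5/100 after, the interval lies entirely below zero, so the reduction would count as reliable change.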

  17. Robust regression for large-scale neuroimaging studies.

    PubMed

    Fritsch, Virgile; Da Mota, Benoit; Loth, Eva; Varoquaux, Gaël; Banaschewski, Tobias; Barker, Gareth J; Bokde, Arun L W; Brühl, Rüdiger; Butzek, Brigitte; Conrod, Patricia; Flor, Herta; Garavan, Hugh; Lemaitre, Hervé; Mann, Karl; Nees, Frauke; Paus, Tomas; Schad, Daniel J; Schümann, Gunter; Frouin, Vincent; Poline, Jean-Baptiste; Thirion, Bertrand

    2015-05-01

    Multi-subject datasets used in neuroimaging group studies have a complex structure, as they exhibit non-stationary statistical properties across regions and display various artifacts. While studies with small sample sizes can rarely be shown to deviate from standard hypotheses (such as the normality of the residuals) due to the poor sensitivity of normality tests with low degrees of freedom, large-scale studies (e.g. >100 subjects) exhibit more obvious deviations from these hypotheses and call for more refined models for statistical inference. Here, we demonstrate the benefits of robust regression as a tool for analyzing large neuroimaging cohorts. First, we use an analytic test based on robust parameter estimates; based on simulations, this procedure is shown to provide an accurate statistical control without resorting to permutations. Second, we show that robust regression yields more detections than standard algorithms using as an example an imaging genetics study with 392 subjects. Third, we show that robust regression can avoid false positives in a large-scale analysis of brain-behavior relationships with over 1500 subjects. Finally we embed robust regression in the Randomized Parcellation Based Inference (RPBI) method and demonstrate that this combination further improves the sensitivity of tests carried out across the whole brain. Altogether, our results show that robust procedures provide important advantages in large-scale neuroimaging group studies. Copyright © 2015 Elsevier Inc. All rights reserved.
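    A minimal sketch of one common robust-regression scheme, Huber-weighted iteratively reweighted least squares with a MAD scale estimate; the paper's full pipeline (analytic robust tests, RPBI embedding, permutation control) is not reproduced here.

    ```python
    import numpy as np

    def huber_irls(x, y, k=1.345, n_iter=50):
        """Robust simple linear regression via iteratively reweighted least
        squares with Huber weights and a MAD-based scale estimate.
        Returns [intercept, slope]."""
        X = np.column_stack([np.ones(len(y)), np.asarray(x, dtype=float)])
        y = np.asarray(y, dtype=float)
        beta = np.linalg.lstsq(X, y, rcond=None)[0]  # OLS start
        for _ in range(n_iter):
            r = y - X @ beta
            # Robust residual scale: normalized median absolute deviation
            s = max(np.median(np.abs(r - np.median(r))) / 0.6745, 1e-12)
            u = np.abs(r) / s
            w = np.where(u <= k, 1.0, k / u)  # Huber weights
            sw = np.sqrt(w)
            beta = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
        return beta
    ```

    Large residuals get weights proportional to 1/|residual|, so a handful of artifact-driven outliers barely move the fit, which is precisely the property exploited for large cohorts with heterogeneous artifacts.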

  18. GWAMA: software for genome-wide association meta-analysis.

    PubMed

    Mägi, Reedik; Morris, Andrew P

    2010-05-28

    Despite the recent success of genome-wide association studies in identifying novel loci contributing effects to complex human traits, such as type 2 diabetes and obesity, much of the genetic component of variation in these phenotypes remains unexplained. One way to improve power to detect further novel loci is through meta-analysis of studies from the same population, increasing the sample size beyond that of any individual study. Although statistical analysis software packages incorporate routines for meta-analysis, they are ill-equipped to meet the challenges of the scale and complexity of data generated in genome-wide association studies. We have developed flexible, open-source software for the meta-analysis of genome-wide association studies. The software incorporates a variety of error-trapping facilities and provides a range of meta-analysis summary statistics. The software is distributed with scripts that allow simple formatting of files containing the results of each association study and that generate graphical summaries of genome-wide meta-analysis results. The GWAMA (Genome-Wide Association Meta-Analysis) software has been developed to perform meta-analysis of summary statistics generated from genome-wide association studies of dichotomous phenotypes or quantitative traits. Software, source files, documentation, and example data files are freely available online at http://www.well.ox.ac.uk/GWAMA.
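    The core computation behind such meta-analyses is standard inverse-variance pooling; the sketch below shows the fixed-effect estimate and Cochran's Q heterogeneity statistic as a generic illustration, not GWAMA's source code.

    ```python
    import math

    def fixed_effect_meta(betas, ses):
        """Inverse-variance fixed-effect meta-analysis of per-study effect
        estimates and standard errors. Returns the pooled effect, its
        standard error, the z-score, and Cochran's Q heterogeneity statistic."""
        w = [1.0 / se ** 2 for se in ses]
        beta = sum(wi * b for wi, b in zip(w, betas)) / sum(w)
        se = math.sqrt(1.0 / sum(w))
        z = beta / se
        q = sum(wi * (b - beta) ** 2 for wi, b in zip(w, betas))
        return beta, se, z, q
    ```

    Each study contributes in proportion to the precision of its estimate, so a large, well-powered cohort dominates a small noisy one, and Q flags SNPs whose per-study effects disagree more than sampling error allows.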

  19. Assessment of Change in Dynamic Psychotherapy

    PubMed Central

    Høglend, Per; Bøgwald, Kjell-Petter; Amlo, Svein; Heyerdahl, Oscar; Sørbye, Øystein; Marble, Alice; Sjaastad, Mary Cosgrove; Bentsen, Håvard

    2000-01-01

    Five scales have been developed to assess changes that are consistent with the therapeutic rationales and procedures of dynamic psychotherapy. Seven raters evaluated 50 patients before and 36 patients again after brief dynamic psychotherapy. A factor analysis indicated that the scales represent a dimension that is discriminable from general symptoms. A summary measure, Dynamic Capacity, was rated with acceptable reliability by a single rater. However, average scores of three raters were needed for good reliability of change ratings. The scales seem to be sufficiently fine-grained to capture statistically and clinically significant changes during brief dynamic psychotherapy. PMID:11069131
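    The reported gain from averaging three raters follows the classical Spearman-Brown relation; the snippet below is a generic psychometric illustration, not the paper's computation.

    ```python
    def spearman_brown(r_single, m):
        """Predicted reliability of the mean of m raters, given the
        reliability r_single of a single rater (Spearman-Brown formula)."""
        return m * r_single / (1 + (m - 1) * r_single)
    ```

    For example, a single-rater reliability of 0.50 rises to 0.75 when scores from three raters are averaged, which is why change ratings required three raters for good reliability.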

  20. A search for anisotropy in the cosmic microwave background on intermediate angular scales

    NASA Technical Reports Server (NTRS)

    Alsop, D. C.; Cheng, E. S.; Clapp, A. C.; Cottingham, D. A.; Fischer, M. L.; Gundersen, J. O.; Kreysa, E.; Lange, A. E.; Lubin, P. M.; Meinhold, P. R.

    1992-01-01

    The results of a search for anisotropy in the cosmic microwave background on angular scales near 1 deg are presented. Observations were simultaneously performed in bands centered at frequencies of 6, 9, and 12 per cm with a multifrequency bolometric receiver mounted on a balloon-borne telescope. The statistical sensitivity of the data is the highest reported to date at this angular scale, which is of critical importance for understanding the formation of structure in the universe. Signals in excess of random were observed in the data. The experiment, data analysis, and interpretation are described.

  1. Scaling of the velocity fluctuations in turbulent channels up to Reτ=2003

    NASA Astrophysics Data System (ADS)

    Hoyas, Sergio; Jiménez, Javier

    2006-01-01

    A new numerical simulation of a turbulent channel in a large box at Reτ=2003 is described and briefly compared with simulations at lower Reynolds numbers and with experiments. Some of the fluctuation intensities, especially the streamwise velocity, do not scale well in wall units, both near and away from the wall. Spectral analysis traces the near-wall scaling failure to the interaction of the logarithmic layer with the wall. The present statistics can be downloaded from http://torroja.dmt.upm.es/ftp/channels. Further ones will be added to the site as they become available.

  2. The mediating effect of calling on the relationship between medical school students' academic burnout and empathy.

    PubMed

    Chae, Su Jin; Jeong, So Mi; Chung, Yoon-Sok

    2017-09-01

    This study aimed to identify the relationships between medical school students' academic burnout, empathy, and calling, and to determine whether calling has a mediating effect on the relationship between academic burnout and empathy. A mixed-methods study was conducted. One hundred twenty-seven medical students completed a survey using scales measuring academic burnout, empathy, and calling. For the statistical analysis, correlation analysis, descriptive statistics, and hierarchical multiple regression analyses were conducted. For the qualitative approach, eight medical students participated in a focus group interview. The study found that empathy has a statistically significant negative correlation with academic burnout and a significant positive correlation with calling. Sense of calling proved to be an effective mediator of the relationship between academic burnout and empathy. This result demonstrates that calling is a key variable mediating the relationship between medical students' academic burnout and empathy. As such, this study provides baseline data for education that could improve medical students' empathy skills.
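    The mediation logic described (calling mediating the burnout-empathy link) can be sketched with three OLS fits in the Baron-and-Kenny style; the variable roles and noise structure in the usage example are hypothetical, not the study's data.

    ```python
    import numpy as np

    def simple_mediation(x, m, y):
        """Simple mediation decomposition via three OLS regressions:
        total effect (x -> y), path a (x -> m), and paths b and c'
        from the joint regression of y on x and m."""
        x, m, y = (np.asarray(v, dtype=float) for v in (x, m, y))
        X1 = np.column_stack([np.ones_like(x), x])
        total = np.linalg.lstsq(X1, y, rcond=None)[0][1]
        a = np.linalg.lstsq(X1, m, rcond=None)[0][1]
        X2 = np.column_stack([np.ones_like(x), x, m])
        coefs = np.linalg.lstsq(X2, y, rcond=None)[0]
        direct, b = coefs[1], coefs[2]
        return {"total": total, "a": a, "b": b,
                "direct": direct, "indirect": a * b}
    ```

    In a fully mediated setup the indirect effect a*b approaches the total effect and the direct effect shrinks toward zero, which mirrors the hierarchical-regression pattern the study reports.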

  3. Quantifying the Energy Landscape Statistics in Proteins - a Relaxation Mode Analysis

    NASA Astrophysics Data System (ADS)

    Cai, Zhikun; Zhang, Yang

    Energy landscape, the hypersurface in the configurational space, has been a useful concept in describing complex processes that occur over a very long time scale, such as the multistep slow relaxations of supercooled liquids and the folding of polypeptide chains into structured proteins. Despite extensive simulation studies, its experimental characterization remains a challenge. To address this challenge, we developed a relaxation mode analysis (RMA) for liquids under a framework analogous to the normal mode analysis for solids. Using RMA, important statistics of the activation barriers of the energy landscape become accessible from experimentally measurable two-point correlation functions, e.g., from quasi-elastic and inelastic scattering experiments. We observed a prominent coarsening effect of the energy landscape. The results were further confirmed by direct sampling of the energy landscape using a metadynamics-like adaptive autonomous basin climbing computation. We first demonstrate RMA in a supercooled liquid when dynamical cooperativity emerges in the landscape-influenced regime. Then we show that this framework reveals encouraging energy landscape statistics when applied to proteins.

  4. PRANAS: A New Platform for Retinal Analysis and Simulation.

    PubMed

    Cessac, Bruno; Kornprobst, Pierre; Kraria, Selim; Nasser, Hassan; Pamplona, Daniela; Portelli, Geoffrey; Viéville, Thierry

    2017-01-01

    The retina encodes visual scenes as trains of action potentials that are sent to the brain via the optic nerve. In this paper, we describe new, freely accessible end-user software for better understanding this coding. It is called PRANAS (https://pranas.inria.fr), standing for Platform for Retinal ANalysis And Simulation. PRANAS targets neuroscientists and modelers by providing a unique set of retina-related tools. It integrates a retina simulator that permits large-scale simulations while retaining strong biological plausibility, together with a toolbox for analyzing the statistics of spike-train populations. The statistical method (entropy maximization under constraints) takes both spatial and temporal correlations into account as constraints, making it possible to analyze the effects of memory on statistics. PRANAS also integrates a tool that computes and represents receptive fields in 3D (time-space). All these tools are accessible through a friendly graphical user interface, and the most CPU-costly of them have been implemented to run in parallel.

  5. Experimental design and quantitative analysis of microbial community multiomics.

    PubMed

    Mallick, Himel; Ma, Siyuan; Franzosa, Eric A; Vatanen, Tommi; Morgan, Xochitl C; Huttenhower, Curtis

    2017-11-30

    Studies of the microbiome have become increasingly sophisticated, and multiple sequence-based, molecular methods as well as culture-based methods exist for population-scale microbiome profiles. To link the resulting host and microbial data types to human health, several experimental design considerations, data analysis challenges, and statistical epidemiological approaches must be addressed. Here, we survey current best practices for experimental design in microbiome molecular epidemiology, including technologies for generating, analyzing, and integrating microbiome multiomics data. We highlight studies that have identified molecular bioactives that influence human health, and we suggest steps for scaling translational microbiome research to high-throughput target discovery across large populations.

  6. Transitions between superstatistical regimes: Validity, breakdown and applications

    NASA Astrophysics Data System (ADS)

    Jizba, Petr; Korbel, Jan; Lavička, Hynek; Prokš, Martin; Svoboda, Václav; Beck, Christian

    2018-03-01

    Superstatistics is a widely employed tool of non-equilibrium statistical physics that plays an important role in the analysis of hierarchical complex dynamical systems. Yet its "canonical" formulation in terms of a single nuisance parameter is often too restrictive when applied to complex empirical data. Here we show that a multi-scale generalization of the superstatistics paradigm is more versatile, allowing one to address such pertinent issues as the transmutation of statistics or inter-scale stochastic behavior. To put some flesh on the bare bones, we provide numerical evidence for a transition between two superstatistical regimes by analyzing high-frequency (minute-tick) data for the share-price returns of seven selected companies. Salient issues, such as the breakdown of superstatistics in fractional diffusion processes and the connection with Brownian subordination, are also briefly discussed.

  7. Evaluating the assumption of power-law late time scaling of breakthrough curves in highly heterogeneous media

    NASA Astrophysics Data System (ADS)

    Pedretti, Daniele

    2017-04-01

    Power-law (PL) distributions are widely adopted to describe the late-time scaling of solute breakthrough curves (BTCs) during transport experiments in highly heterogeneous media. However, from a statistical perspective, distinguishing between a PL distribution and another tailed distribution is difficult, particularly when a qualitative assessment based on visual inspection of double-logarithmic plots is used. This presentation discusses the results of a recent analysis in which a suite of statistical tools was applied to rigorously evaluate the scaling of BTCs from experiments that generate tailed distributions typically described as PL at late time. To this end, a set of BTCs from numerical simulations in highly heterogeneous media was generated using a transition probability approach (T-PROGS) coupled to a finite-difference numerical solver of the flow equation (MODFLOW) and a random-walk particle-tracking approach for Lagrangian transport (RW3D). The T-PROGS fields assumed randomly distributed hydraulic heterogeneities with long correlation scales, creating solute channeling and anomalous transport. For simplicity, transport was simulated as purely advective. This combination of tools generates strongly non-symmetric BTCs that visually resemble PL distributions at late time when plotted on double-logarithmic scales. Unlike other combinations of modeling parameters and boundary conditions (e.g., matrix diffusion in fractures), at late time no direct link exists between the mathematical functions describing the scaling of these curves and the physical parameters controlling transport. The results suggest that the statistical tests fail to describe the majority of curves as PL distributed. Moreover, they suggest that PL and lognormal distributions are equally likely to represent parametrically the shape of the tails. Notably, forcing a model to reproduce the tails as PL functions results in a distribution of PL slopes between 1.2 and 4, which are the typical values observed during field experiments. We conclude that care must be taken when defining a BTC late-time distribution as a power-law function. Even though the estimated scaling factors fall within traditional ranges, the actual distribution controlling the scaling of concentration may differ from a power law, with direct consequences, for instance, for the selection of effective parameters in upscaling modeling solutions.
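    One quantitative way to compare candidate tail models, in the spirit of the tests discussed, is to fit each by maximum likelihood on the tail sample and compare log-likelihoods. This is a simplified sketch: a rigorous comparison would use truncated fits and Clauset-style likelihood-ratio tests.

    ```python
    import numpy as np

    def pareto_tail_fit(x, xmin):
        """MLE fit of a Pareto (power-law) tail p(x) ~ x^(-alpha) for
        x >= xmin. Returns the estimated exponent and the log-likelihood."""
        x = np.asarray(x, dtype=float)
        x = x[x >= xmin]
        alpha = 1.0 + len(x) / np.sum(np.log(x / xmin))
        loglik = (len(x) * np.log((alpha - 1.0) / xmin)
                  - alpha * np.sum(np.log(x / xmin)))
        return alpha, loglik

    def lognormal_tail_fit(x, xmin):
        """MLE fit of an (unconditioned) lognormal to the same tail sample,
        a simplification of a properly truncated fit."""
        x = np.asarray(x, dtype=float)
        x = x[x >= xmin]
        lx = np.log(x)
        mu, sigma = lx.mean(), lx.std()
        loglik = np.sum(-lx - 0.5 * np.log(2.0 * np.pi * sigma ** 2)
                        - (lx - mu) ** 2 / (2.0 * sigma ** 2))
        return (mu, sigma), loglik
    ```

    On synthetic Pareto data the power-law fit wins; on ambiguous BTC tails the two log-likelihoods can be nearly indistinguishable, which is exactly the equal-likelihood finding reported above.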

  8. Advancing multiscale structural mapping of the brain through fluorescence imaging and analysis across length scales

    PubMed Central

    Hogstrom, L. J.; Guo, S. M.; Murugadoss, K.; Bathe, M.

    2016-01-01

    Brain function emerges from hierarchical neuronal structure that spans orders of magnitude in length scale, from the nanometre-scale organization of synaptic proteins to the macroscopic wiring of neuronal circuits. Because the synaptic electrochemical signal transmission that drives brain function ultimately relies on the organization of neuronal circuits, understanding brain function requires an understanding of the principles that determine hierarchical neuronal structure in living or intact organisms. Recent advances in fluorescence imaging now enable quantitative characterization of neuronal structure across length scales, ranging from single-molecule localization using super-resolution imaging to whole-brain imaging using light-sheet microscopy on cleared samples. These tools, together with correlative electron microscopy and magnetic resonance imaging at the nanoscopic and macroscopic scales, respectively, now facilitate our ability to probe brain structure across its full range of length scales with cellular and molecular specificity. As these imaging datasets become increasingly accessible to researchers, novel statistical and computational frameworks will play an increasing role in efforts to relate hierarchical brain structure to its function. In this perspective, we discuss several prominent experimental advances that are ushering in a new era of quantitative fluorescence-based imaging in neuroscience along with novel computational and statistical strategies that are helping to distil our understanding of complex brain structure. PMID:26855758

  9. Identifying currents in the gene pool for bacterial populations using an integrative approach.

    PubMed

    Tang, Jing; Hanage, William P; Fraser, Christophe; Corander, Jukka

    2009-08-01

    The evolution of bacterial populations has recently become considerably better understood due to large-scale sequencing of population samples. It has become clear that DNA sequences from a multitude of genes, as well as a broad sample coverage of a target population, are needed to obtain a relatively unbiased view of its genetic structure and the patterns of ancestry connected to the strains. However, the traditional statistical methods for evolutionary inference, such as phylogenetic analysis, are associated with several difficulties under such an extensive sampling scenario, in particular when a considerable amount of recombination is anticipated to have taken place. To meet the needs of large-scale analyses of population structure for bacteria, we introduce here several statistical tools for the detection and representation of recombination between populations. Also, we introduce a model-based description of the shape of a population in sequence space, in terms of its molecular variability and affinity towards other populations. Extensive real data from the genus Neisseria are utilized to demonstrate the potential of an approach where these population genetic tools are combined with a phylogenetic analysis. The statistical tools introduced here are freely available in BAPS 5.2 software, which can be downloaded from http://web.abo.fi/fak/mnf/mate/jc/software/baps.html.

  10. Identifying Pleiotropic Genes in Genome-Wide Association Studies for Multivariate Phenotypes with Mixed Measurement Scales

    PubMed Central

    Williams, L. Keoki; Buu, Anne

    2017-01-01

    We propose a multivariate genome-wide association test for mixed continuous, binary, and ordinal phenotypes. A latent response model is used to estimate the correlation between phenotypes with different measurement scales so that the empirical distribution of the Fisher's combination statistic under the null hypothesis is estimated efficiently. The simulation study shows that our proposed correlation estimation methods have high levels of accuracy. More importantly, our approach conservatively estimates the variance of the test statistic so that the type I error rate is controlled. The simulation also shows that the proposed test maintains power at a level very close to that of the ideal analysis based on known latent phenotypes while controlling the type I error. In contrast, conventional approaches (dichotomizing all observed phenotypes or treating them as continuous variables) could either reduce the power or employ a linear regression model unfit for the data. Furthermore, the statistical analysis of the database of the Study of Addiction: Genetics and Environment (SAGE) demonstrates that conducting a multivariate test on multiple phenotypes can increase the power of identifying markers that might not otherwise be chosen using marginal tests. The proposed method also offers a new approach to analyzing the Fagerström Test for Nicotine Dependence as multivariate phenotypes in genome-wide association studies. PMID:28081206
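    The baseline Fisher combination for k independent p-values is simple to state; the paper's contribution is an empirically estimated null that handles correlated phenotypes, which this sketch does not attempt.

    ```python
    import math

    def fisher_combined(pvals):
        """Fisher's combination test: X^2 = -2 * sum(log p_i) follows a
        chi-square distribution with 2k degrees of freedom when the k
        p-values are independent. The survival function below is the exact
        closed form for even degrees of freedom."""
        k = len(pvals)
        x2 = -2.0 * sum(math.log(p) for p in pvals)
        half = x2 / 2.0
        term, total = 1.0, 1.0
        for i in range(1, k):  # sum_{i=0}^{k-1} half^i / i!
            term *= half / i
            total += term
        return x2, math.exp(-half) * total
    ```

    For a single p-value the combined p equals the input, a handy sanity check; correlated phenotypes inflate the statistic, which is why the paper replaces the chi-square null with an empirical one.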

  11. Testing alternative ground water models using cross-validation and other methods

    USGS Publications Warehouse

    Foglia, L.; Mehl, S.W.; Hill, M.C.; Perona, P.; Burlando, P.

    2007-01-01

    Many methods can be used to test alternative ground water models. Of concern in this work are methods able to (1) rank alternative models (also called model discrimination) and (2) identify observations important to parameter estimates and predictions (equivalent to the purpose served by some types of sensitivity analysis). Some of the measures investigated are computationally efficient; others are computationally demanding. The latter are generally needed to account for model nonlinearity. The efficient model discrimination methods investigated include the information criteria: the corrected Akaike information criterion, Bayesian information criterion, and generalized cross-validation. The efficient sensitivity analysis measures used are dimensionless scaled sensitivity (DSS), composite scaled sensitivity, and parameter correlation coefficient (PCC); the other statistics are DFBETAS, Cook's D, and observation-prediction statistic. Acronyms are explained in the introduction. Cross-validation (CV) is a computationally intensive nonlinear method that is used for both model discrimination and sensitivity analysis. The methods are tested using up to five alternative parsimoniously constructed models of the ground water system of the Maggia Valley in southern Switzerland. The alternative models differ in their representation of hydraulic conductivity. A new method for graphically representing CV and sensitivity analysis results for complex models is presented and used to evaluate the utility of the efficient statistics. The results indicate that for model selection, the information criteria produce similar results at much smaller computational cost than CV. For identifying important observations, the only obviously inferior linear measure is DSS; the poor performance was expected because DSS does not include the effects of parameter correlation and PCC reveals large parameter correlations. © 2007 National Ground Water Association.
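    For least-squares models with Gaussian errors, the information criteria mentioned reduce to simple formulas in the residual sum of squares; a generic sketch (the n, k, and SSE values in the test are hypothetical, not the Maggia Valley models):

    ```python
    import math

    def aicc(n, k, sse):
        """Corrected Akaike information criterion for a least-squares model
        with k parameters fit to n observations (Gaussian errors assumed);
        requires n > k + 1."""
        aic = n * math.log(sse / n) + 2 * k
        return aic + 2 * k * (k + 1) / (n - k - 1)

    def bic(n, k, sse):
        """Bayesian information criterion for the same setting; penalizes
        extra parameters by log(n) rather than 2."""
        return n * math.log(sse / n) + k * math.log(n)
    ```

    Ranking candidate models by AICc or BIC costs one fit per model, which is why the study finds the information criteria match cross-validation's selections at a fraction of its computational cost.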

  12. Geovisual analytics to enhance spatial scan statistic interpretation: an analysis of U.S. cervical cancer mortality

    PubMed Central

    Chen, Jin; Roth, Robert E; Naito, Adam T; Lengerich, Eugene J; MacEachren, Alan M

    2008-01-01

    Background Kulldorff's spatial scan statistic and its software implementation – SaTScan – are widely used for detecting and evaluating geographic clusters. However, two issues make using the method and interpreting its results non-trivial: (1) the method lacks cartographic support for understanding the clusters in geographic context and (2) results from the method are sensitive to parameter choices related to cluster scaling (abbreviated as scaling parameters), but the system provides no direct support for making these choices. We employ both established and novel geovisual analytics methods to address these issues and to enhance the interpretation of SaTScan results. We demonstrate our geovisual analytics approach in a case study analysis of cervical cancer mortality in the U.S. Results We address the first issue by providing an interactive visual interface to support the interpretation of SaTScan results. Our research to address the second issue prompted a broader discussion about the sensitivity of SaTScan results to parameter choices. Sensitivity has two components: (1) the method can identify clusters that, while being statistically significant, have heterogeneous contents comprised of both high-risk and low-risk locations and (2) the method can identify clusters that are unstable in location and size as the spatial scan scaling parameter is varied. To investigate cluster result stability, we conducted multiple SaTScan runs with systematically selected parameters. The results, when scanning a large spatial dataset (e.g., U.S. data aggregated by county), demonstrate that no single spatial scan scaling value is known to be optimal to identify clusters that exist at different scales; instead, multiple scans that vary the parameters are necessary. We introduce a novel method of measuring and visualizing reliability that facilitates identification of homogeneous clusters that are stable across analysis scales. 
Finally, we propose a logical approach to proceed through the analysis of SaTScan results. Conclusion The geovisual analytics approach described in this manuscript facilitates the interpretation of spatial cluster detection methods by providing cartographic representation of SaTScan results and by providing visualization methods and tools that support selection of SaTScan parameters. Our methods distinguish between heterogeneous and homogeneous clusters and assess the stability of clusters across analytic scales. Method We analyzed the cervical cancer mortality data for the United States aggregated by county between 2000 and 2004. We ran SaTScan on the dataset fifty times with different parameter choices. Our geovisual analytics approach couples SaTScan with our visual analytic platform, allowing users to interactively explore and compare SaTScan results produced by different parameter choices. The Standardized Mortality Ratio and reliability scores are visualized for all the counties to identify stable, homogeneous clusters. We evaluated our analysis result by comparing it to that produced by other independent techniques including the Empirical Bayes Smoothing and Kafadar spatial smoother methods. The geovisual analytics approach introduced here is developed and implemented in our Java-based Visual Inquiry Toolkit. PMID:18992163
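    Each SaTScan run maximizes, over candidate windows, a Poisson log-likelihood ratio of the form sketched below (the statistic only; window enumeration, Monte Carlo significance testing, and the scaling parameters under discussion are not shown). The counts in the test are hypothetical.

    ```python
    import math

    def poisson_scan_llr(c_in, e_in, c_tot, e_tot):
        """Kulldorff-style log-likelihood ratio for one candidate window
        under the Poisson model: c_in/e_in are observed/expected cases
        inside the window, c_tot/e_tot overall. Only elevated-rate
        (hot-spot) windows receive a positive score."""
        c_out, e_out = c_tot - c_in, e_tot - e_in
        if c_in / e_in <= c_out / e_out:
            return 0.0
        return (c_in * math.log(c_in / e_in)
                + c_out * math.log(c_out / e_out))
    ```

    The scan reports the window with the largest ratio; because windows of many sizes are scored, the same data can yield different "most likely clusters" as the scaling parameter varies, which motivates the multi-run reliability visualization described above.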

  13. Geovisual analytics to enhance spatial scan statistic interpretation: an analysis of U.S. cervical cancer mortality.

    PubMed

    Chen, Jin; Roth, Robert E; Naito, Adam T; Lengerich, Eugene J; Maceachren, Alan M

    2008-11-07

    Kulldorff's spatial scan statistic and its software implementation - SaTScan - are widely used for detecting and evaluating geographic clusters. However, two issues make using the method and interpreting its results non-trivial: (1) the method lacks cartographic support for understanding the clusters in geographic context and (2) results from the method are sensitive to parameter choices related to cluster scaling (abbreviated as scaling parameters), but the system provides no direct support for making these choices. We employ both established and novel geovisual analytics methods to address these issues and to enhance the interpretation of SaTScan results. We demonstrate our geovisual analytics approach in a case study analysis of cervical cancer mortality in the U.S. We address the first issue by providing an interactive visual interface to support the interpretation of SaTScan results. Our research to address the second issue prompted a broader discussion about the sensitivity of SaTScan results to parameter choices. Sensitivity has two components: (1) the method can identify clusters that, while being statistically significant, have heterogeneous contents comprised of both high-risk and low-risk locations and (2) the method can identify clusters that are unstable in location and size as the spatial scan scaling parameter is varied. To investigate cluster result stability, we conducted multiple SaTScan runs with systematically selected parameters. The results, when scanning a large spatial dataset (e.g., U.S. data aggregated by county), demonstrate that no single spatial scan scaling value is known to be optimal to identify clusters that exist at different scales; instead, multiple scans that vary the parameters are necessary. We introduce a novel method of measuring and visualizing reliability that facilitates identification of homogeneous clusters that are stable across analysis scales. 
Finally, we propose a logical approach to proceed through the analysis of SaTScan results. The geovisual analytics approach described in this manuscript facilitates the interpretation of spatial cluster detection methods by providing cartographic representation of SaTScan results and by providing visualization methods and tools that support selection of SaTScan parameters. Our methods distinguish between heterogeneous and homogeneous clusters and assess the stability of clusters across analytic scales. We analyzed the cervical cancer mortality data for the United States aggregated by county between 2000 and 2004. We ran SaTScan on the dataset fifty times with different parameter choices. Our geovisual analytics approach couples SaTScan with our visual analytic platform, allowing users to interactively explore and compare SaTScan results produced by different parameter choices. The Standardized Mortality Ratio and reliability scores are visualized for all the counties to identify stable, homogeneous clusters. We evaluated our analysis result by comparing it to that produced by other independent techniques including the Empirical Bayes Smoothing and Kafadar spatial smoother methods. The geovisual analytics approach introduced here is developed and implemented in our Java-based Visual Inquiry Toolkit.

  14. Delay, change and bifurcation of the immunofluorescence distribution attractors in health statuses diagnostics and in medical treatment

    NASA Astrophysics Data System (ADS)

    Galich, Nikolay E.; Filatov, Michael V.

    2008-07-01

    This communication describes immunology experiments and the treatment of the experimental data. New nonlinear methods for the statistical analysis of immunofluorescence in peripheral-blood neutrophils have been developed. We used the respiratory-burst reaction of DNA fluorescence in neutrophil cell nuclei, which reflects oxidative activity. Histograms of photon-count statistics for radiant neutrophil populations in flow-cytometry experiments are considered, and distributions of fluorescence-flash frequency as functions of fluorescence intensity are analyzed. Statistical peculiarities of the histogram sets for healthy and unhealthy donors allow all histograms to be divided into three classes. The classification is based on three different types of smoothed, long-range-scale-averaged immunofluorescence distributions and their bifurcations. Heterogeneity peculiarities of the long-range-scale immunofluorescence distributions likewise divide the histograms into three groups: the first belongs to healthy donors, while the other two belong to donors with autoimmune and inflammatory diseases, some of which are not diagnosed by standard biochemical methods. Medical standards and statistical data of the immunofluorescence histograms for identifying health and illness are interconnected. The possibilities and alterations of immunofluorescence statistics in the registration, diagnostics, and monitoring of different diseases under various medical treatments have been demonstrated. Health and illness criteria are connected with statistical features of the immunofluorescence histograms; neutrophil population fluorescence thus provides a sensitive, clear indicator of health status.

  15. Effects of spatial variability and scale on areally-averaged evapotranspiration

    NASA Technical Reports Server (NTRS)

    Famiglietti, J. S.; Wood, Eric F.

    1993-01-01

    This paper explores the effect of spatial variability and scale on areally-averaged evapotranspiration. A spatially-distributed water and energy balance model is employed to determine the effect of explicit patterns of model parameters and atmospheric forcing on modeled areally-averaged evapotranspiration over a range of increasing spatial scales. The analysis is performed from the local scale to the catchment scale. The study area is King's Creek catchment, an 11.7 sq km watershed located on the native tallgrass prairie of Kansas. The dominant controls on the scaling behavior of catchment-average evapotranspiration are investigated by simulation, as is the existence of a threshold scale for evapotranspiration modeling, with implications for explicit versus statistical representation of important process controls. It appears that some of our findings are fairly general, and will therefore provide a framework for understanding the scaling behavior of areally-averaged evapotranspiration at the catchment and larger scales.

  16. Feminist identity as a predictor of eating disorder diagnostic status.

    PubMed

    Green, Melinda A; Scott, Norman A; Riopel, Cori M; Skaggs, Anna K

    2008-06-01

    Passive Acceptance (PA) and Active Commitment (AC) subscales of the Feminist Identity Development Scale (FIDS) were examined as predictors of eating disorder diagnostic status as assessed by the Questionnaire for Eating Disorder Diagnoses (Q-EDD). Results of a hierarchical regression analysis revealed PA and AC scores were not statistically significant predictors of ED diagnostic status after controlling for diagnostic subtype. Results of a multiple regression analysis revealed FIDS as a statistically significant predictor of ED diagnostic status when failing to control for ED diagnostic subtype. Discrepancies suggest ED diagnostic subtype may serve as a moderator variable in the relationship between ED diagnostic status and FIDS. (c) 2008 Wiley Periodicals, Inc.

  17. The social construction of "evidence-based" drug prevention programs: a reanalysis of data from the Drug Abuse Resistance Education (DARE) program.

    PubMed

    Gorman, Dennis M; Huber, J Charles

    2009-08-01

    This study explores the possibility that any drug prevention program might be considered "evidence-based" given the use of data analysis procedures that optimize the chance of producing statistically significant results by reanalyzing data from a Drug Abuse Resistance Education (DARE) program evaluation. The analysis produced a number of statistically significant differences between the DARE and control conditions on alcohol and marijuana use measures. Many of these differences occurred at cutoff points on the assessment scales for which post hoc meaningful labels were created. Our results are compared to those from evaluations of programs that appear on evidence-based drug prevention lists.

  18. The Prompt-afterglow Connection in Gamma-ray Bursts: a Comprehensive Statistical Analysis of Swift X-ray Light-curves

    NASA Technical Reports Server (NTRS)

    Margutti, R.; Zaninoni, E.; Bernardini, M. G.; Chincarini, G.; Pasotti, F.; Guidorzi, C.; Angelini, Lorella; Burrows, D. N.; Capalbi, M.; Evans, P. A.; et al.

    2012-01-01

    We present a comprehensive statistical analysis of Swift X-ray light-curves of Gamma-Ray Bursts (GRBs) collecting data from more than 650 GRBs discovered by Swift and other facilities. The unprecedented sample size allows us to constrain the rest-frame X-ray properties of GRBs from a statistical perspective, with particular reference to intrinsic time scales and the energetics of the different light-curve phases in a common rest-frame 0.3-30 keV energy band. Temporal variability episodes are also studied and their properties constrained. Two fundamental questions drive this effort: i) Does the X-ray emission retain any kind of "memory" of the prompt γ-ray phase? ii) Where is the dividing line between long and short GRB X-ray properties? We show that short GRBs decay faster, are less luminous and less energetic than long GRBs in the X-rays, but are interestingly characterized by similar intrinsic absorption. We furthermore reveal the existence of a number of statistically significant relations that link the X-ray to prompt γ-ray parameters in long GRBs; short GRBs are outliers of the majority of these 2-parameter relations. However and more importantly, we report on the existence of a universal 3-parameter scaling that links the X-ray and the γ-ray energy to the prompt spectral peak energy of both long and short GRBs: E_X,iso ∝ E_γ,iso^(1.00+/-0.06) / E_pk^(0.60+/-0.10).

  19. The prompt-afterglow connection in gamma-ray bursts: a comprehensive statistical analysis of Swift X-ray light curves

    NASA Astrophysics Data System (ADS)

    Margutti, R.; Zaninoni, E.; Bernardini, M. G.; Chincarini, G.; Pasotti, F.; Guidorzi, C.; Angelini, L.; Burrows, D. N.; Capalbi, M.; Evans, P. A.; Gehrels, N.; Kennea, J.; Mangano, V.; Moretti, A.; Nousek, J.; Osborne, J. P.; Page, K. L.; Perri, M.; Racusin, J.; Romano, P.; Sbarufatti, B.; Stafford, S.; Stamatikos, M.

    2013-01-01

    We present a comprehensive statistical analysis of Swift X-ray light curves of gamma-ray bursts (GRBs) collecting data from more than 650 GRBs discovered by Swift and other facilities. The unprecedented sample size allows us to constrain the rest-frame X-ray properties of GRBs from a statistical perspective, with particular reference to intrinsic time-scales and the energetics of the different light-curve phases in a common rest-frame 0.3-30 keV energy band. Temporal variability episodes are also studied and their properties constrained. Two fundamental questions drive this effort: (i) Does the X-ray emission retain any kind of `memory' of the prompt γ-ray phase? (ii) Where is the dividing line between long and short GRB X-ray properties? We show that short GRBs decay faster, are less luminous and less energetic than long GRBs in the X-rays, but are interestingly characterized by similar intrinsic absorption. We furthermore reveal the existence of a number of statistically significant relations that link the X-ray to prompt γ-ray parameters in long GRBs; short GRBs are outliers of the majority of these two-parameter relations. However and more importantly, we report on the existence of a universal three-parameter scaling that links the X-ray and the γ-ray energy to the prompt spectral peak energy of both long and short GRBs: E_X,iso ∝ E_γ,iso^(1.00 ± 0.06) / E_pk^(0.60 ± 0.10).
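The three-parameter scaling reported here is a power law, so its exponents can in principle be recovered by ordinary least squares in log space. A hedged sketch on synthetic data generated to obey the quoted relation (not the Swift sample):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data obeying E_X,iso ∝ E_γ,iso^1.0 / E_pk^0.6 (arbitrary units),
# with a small log-normal scatter added.
n = 200
log_Egamma = rng.uniform(50, 54, n)   # log10 of prompt gamma-ray energy
log_Epk = rng.uniform(1, 3, n)        # log10 of spectral peak energy
log_EX = 1.0 * log_Egamma - 0.6 * log_Epk + rng.normal(0, 0.05, n)

# Ordinary least squares in log space: log E_X = a*log E_γ + b*log E_pk + c
A = np.column_stack([log_Egamma, log_Epk, np.ones(n)])
coef, *_ = np.linalg.lstsq(A, log_EX, rcond=None)
a, b, c = coef  # expect a ≈ 1.0 and b ≈ -0.6
```

Because the model is linear in the logs, the fitted slopes directly estimate the power-law exponents of the scaling relation.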

  20. A Novel Genome-Information Content-Based Statistic for Genome-Wide Association Analysis Designed for Next-Generation Sequencing Data

    PubMed Central

    Luo, Li; Zhu, Yun

    2012-01-01

    Genome-wide association studies (GWAS) designed for next-generation sequencing data involve testing the association of genomic variants, including common, low-frequency, and rare variants. The current strategies for association studies are well developed for identifying associations of common variants with common diseases, but may be ill-suited when large amounts of allelic heterogeneity are present in sequence data. Recently, group tests that analyze collective frequency differences between cases and controls have shifted the variant-by-variant analysis paradigm of GWAS for common variants toward collective tests of multiple variants in the association analysis of rare variants. However, group tests ignore differences in genetic effects among SNPs at different genomic locations. As an alternative to group tests, we developed a novel genome-information content-based statistic for testing the association of the entire allele frequency spectrum of genomic variation with disease. To evaluate the performance of the proposed statistic, we use large-scale simulations based on the whole-genome low-coverage pilot data of the 1000 Genomes Project to calculate the type 1 error rates and power of seven alternative statistics: a genome-information content-based statistic, the generalized T2, the collapsing method, the combined multivariate and collapsing (CMC) method, the individual χ2 test, the weighted-sum statistic, and the variable threshold statistic. Finally, we apply the seven statistics to a published resequencing dataset from the ANGPTL3, ANGPTL4, ANGPTL5, and ANGPTL6 genes in the Dallas Heart Study. We report that the genome-information content-based statistic achieves significantly improved type 1 error rates and higher power than the other six statistics in both simulated and empirical datasets. PMID:22651812

  1. A novel genome-information content-based statistic for genome-wide association analysis designed for next-generation sequencing data.

    PubMed

    Luo, Li; Zhu, Yun; Xiong, Momiao

    2012-06-01

    Genome-wide association studies (GWAS) designed for next-generation sequencing data involve testing the association of genomic variants, including common, low-frequency, and rare variants. The current strategies for association studies are well developed for identifying associations of common variants with common diseases, but may be ill-suited when large amounts of allelic heterogeneity are present in sequence data. Recently, group tests that analyze collective frequency differences between cases and controls have shifted the variant-by-variant analysis paradigm of GWAS for common variants toward collective tests of multiple variants in the association analysis of rare variants. However, group tests ignore differences in genetic effects among SNPs at different genomic locations. As an alternative to group tests, we developed a novel genome-information content-based statistic for testing the association of the entire allele frequency spectrum of genomic variation with disease. To evaluate the performance of the proposed statistic, we use large-scale simulations based on the whole-genome low-coverage pilot data of the 1000 Genomes Project to calculate the type 1 error rates and power of seven alternative statistics: a genome-information content-based statistic, the generalized T(2), the collapsing method, the combined multivariate and collapsing (CMC) method, the individual χ(2) test, the weighted-sum statistic, and the variable threshold statistic. Finally, we apply the seven statistics to a published resequencing dataset from the ANGPTL3, ANGPTL4, ANGPTL5, and ANGPTL6 genes in the Dallas Heart Study. We report that the genome-information content-based statistic achieves significantly improved type 1 error rates and higher power than the other six statistics in both simulated and empirical datasets.
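As context for the group tests discussed above, a collapsing-style (burden) test can be sketched in a few lines: each subject is reduced to a carrier/non-carrier indicator for rare variants in a region, and carrier proportions are compared between cases and controls. This toy version uses a two-proportion z test; it is an illustration of the collapsing idea, not the authors' genome-information content-based statistic:

```python
from math import erf, sqrt

def collapse(genotypes):
    """genotypes: per-variant minor-allele counts for one subject.
    Returns 1 if the subject carries any rare variant in the region."""
    return 1 if any(g > 0 for g in genotypes) else 0

def two_proportion_z(carriers_case, n_case, carriers_ctrl, n_ctrl):
    """Pooled two-proportion z statistic and two-sided normal p-value."""
    p1, p2 = carriers_case / n_case, carriers_ctrl / n_ctrl
    p = (carriers_case + carriers_ctrl) / (n_case + n_ctrl)
    se = sqrt(p * (1 - p) * (1 / n_case + 1 / n_ctrl))
    z = (p1 - p2) / se
    return z, 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical region with three rare variants; cases carry them more often
cases = [[0, 1, 0], [1, 0, 0], [0, 0, 0], [0, 0, 1], [1, 0, 0]]  # 4/5 carriers
ctrls = [[0, 0, 0], [0, 0, 0], [1, 0, 0], [0, 0, 0], [0, 0, 0]]  # 1/5 carriers
z, p = two_proportion_z(sum(map(collapse, cases)), 5,
                        sum(map(collapse, ctrls)), 5)
```

The collapsing step is exactly what discards per-variant effect differences, which is the limitation the genome-information content-based statistic is designed to address.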

  2. Variations of attractors and wavelet spectra of the immunofluorescence distributions for women in the pregnant period

    NASA Astrophysics Data System (ADS)

    Galich, Nikolay E.

    2008-07-01

    This communication describes the treatment of immunology data. New nonlinear methods for the statistical analysis of immunofluorescence in peripheral blood neutrophils have been developed, based on the respiratory burst reaction of DNA fluorescence in neutrophil cell nuclei due to oxidative activity. Histograms of photon-count statistics of the radiant neutrophil populations in flow cytometry experiments are considered, and distributions of fluorescence-flash frequency as functions of fluorescence intensity are analyzed. Statistical peculiarities of the histogram sets for women during pregnancy allow all histograms to be divided into three classes; the classification is based on three different types of smoothed, long-range scale-averaged immunofluorescence distributions, their bifurcations, and their wavelet spectra. Heterogeneity peculiarities of the long-range scale immunofluorescence distributions and of the wavelet spectra separate the histograms into three groups: the first group belongs to healthy donors, and the two other groups belong to donors with autoimmune and inflammatory diseases, some of which are not diagnosed by standard biochemical methods. Medical standards and statistical data of the immunofluorescence histograms for identifying health and illness are interconnected, and the peculiarities of immunofluorescence for women during pregnancy are classified. Criteria for health or illness are connected with statistical features of the immunofluorescence histograms; the fluorescence of neutrophil populations is a sensitive, clear indicator of health status.

  3. A Powerful Approach to Estimating Annotation-Stratified Genetic Covariance via GWAS Summary Statistics.

    PubMed

    Lu, Qiongshi; Li, Boyang; Ou, Derek; Erlendsdottir, Margret; Powles, Ryan L; Jiang, Tony; Hu, Yiming; Chang, David; Jin, Chentian; Dai, Wei; He, Qidu; Liu, Zefeng; Mukherjee, Shubhabrata; Crane, Paul K; Zhao, Hongyu

    2017-12-07

    Despite the success of large-scale genome-wide association studies (GWASs) on complex traits, our understanding of their genetic architecture is far from complete. Jointly modeling multiple traits' genetic profiles has provided insights into the shared genetic basis of many complex traits. However, large-scale inference sets a high bar for both statistical power and biological interpretability. Here we introduce a principled framework to estimate annotation-stratified genetic covariance between traits using GWAS summary statistics. Through theoretical and numerical analyses, we demonstrate that our method provides accurate covariance estimates, thereby enabling researchers to dissect both the shared and distinct genetic architecture across traits to better understand their etiologies. Among 50 complex traits with publicly accessible GWAS summary statistics (N total ≈ 4.5 million), we identified more than 170 pairs with statistically significant genetic covariance. In particular, we found strong genetic covariance between late-onset Alzheimer disease (LOAD) and amyotrophic lateral sclerosis (ALS), two major neurodegenerative diseases, in single-nucleotide polymorphisms (SNPs) with high minor allele frequencies and in SNPs located in the predicted functional genome. Joint analysis of LOAD, ALS, and other traits highlights LOAD's correlation with cognitive traits and hints at an autoimmune component for ALS. Copyright © 2017 American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.

  4. Flow topologies and turbulence scales in a jet-in-cross-flow

    DOE PAGES

    Oefelein, Joseph C.; Ruiz, Anthony M.; Lacaze, Guilhem

    2015-04-03

    This study presents a detailed analysis of the flow topologies and turbulence scales in the jet-in-cross-flow experiment of [Su and Mungal JFM 2004]. The analysis is performed using the Large Eddy Simulation (LES) technique with a highly resolved grid and time-step and well controlled boundary conditions. This enables quantitative agreement with the first and second moments of turbulence statistics measured in the experiment. LES is used to perform the analysis since experimental measurements of time-resolved 3D fields are still in their infancy and because sampling periods are generally limited with direct numerical simulation. A major focal point is the comprehensive characterization of the turbulence scales and their evolution. Time-resolved probes are used with long sampling periods to obtain maps of the integral scales, Taylor microscales, and turbulent kinetic energy spectra. Scalar-fluctuation scales are also quantified. In the near-field, coherent structures are clearly identified, both in physical and spectral space. Along the jet centerline, turbulence scales grow according to a classical one-third power law. However, the derived maps of turbulence scales reveal strong inhomogeneities in the flow. From the modeling perspective, these insights are useful to design optimized grids and improve numerical predictions in similar configurations.

  5. The Impact of Financial Sophistication on Adjustable Rate Mortgage Ownership

    ERIC Educational Resources Information Center

    Smith, Hyrum; Finke, Michael S.; Huston, Sandra J.

    2011-01-01

    The influence of a financial sophistication scale on adjustable-rate mortgage (ARM) borrowing is explored. Descriptive statistics and regression analysis using recent data from the Survey of Consumer Finances reveal that ARM borrowing is driven by both the least and most financially sophisticated households but for different reasons. Less…

  6. A Large-Scale Analysis of Variance in Written Language

    ERIC Educational Resources Information Center

    Johns, Brendan T.; Jamieson, Randall K.

    2018-01-01

    The collection of very large text sources has revolutionized the study of natural language, leading to the development of several models of language learning and distributional semantics that extract sophisticated semantic representations of words based on the statistical redundancies contained within natural language (e.g., Griffiths, Steyvers,…

  7. A multi-scale analysis of landscape statistics

    Treesearch

    Douglas H. Cain; Kurt H. Riitters; Kenneth Orvis

    1997-01-01

    It is now feasible to monitor some aspects of landscape ecological condition nationwide using remotely- sensed imagery and indicators of land cover pattern. Previous research showed redundancies among many reported pattern indicators and identified six unique dimensions of land cover pattern. This study tested the stability of those dimensions and representative...

  8. Comment on “Two statistics for evaluating parameter identifiability and error reduction” by John Doherty and Randall J. Hunt

    USGS Publications Warehouse

    Hill, Mary C.

    2010-01-01

    Doherty and Hunt (2009) present important ideas for first-order, second-moment sensitivity analysis, but five issues are discussed in this comment. First, considering the composite scaled sensitivity (CSS) jointly with parameter correlation coefficients (PCC) in a CSS/PCC analysis addresses the difficulties with CSS mentioned in the introduction. Second, their new parameter identifiability statistic is actually likely to do a poor job of measuring parameter identifiability in common situations; the statistic instead performs the very useful role of showing how model parameters are included in the estimated singular value decomposition (SVD) parameters, and its close relation to CSS is shown. Third, the idea from p. 125 that a suitable truncation point for SVD parameters can be identified using the prediction variance is challenged using results from Moore and Doherty (2005). Fourth, the relative error reduction statistic of Doherty and Hunt is shown to belong to an emerging set of statistics here named perturbed calculated-variance statistics. Finally, the perturbed calculated-variance statistics OPR and PPR mentioned on p. 121 are shown to explicitly include the parameter null-space component of uncertainty; indeed, OPR and PPR results that account for null-space uncertainty have appeared in the literature since 2000.
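For readers unfamiliar with the statistic under discussion, composite scaled sensitivity is commonly defined as the root-mean-square over the ND observations of the weighted, scaled sensitivities, css_j = sqrt((1/ND) Σ_i [(∂y_i/∂b_j) b_j ω_i^(1/2)]²). A small sketch under that assumed definition, with a purely hypothetical Jacobian:

```python
from math import sqrt

def composite_scaled_sensitivity(jacobian, params, weights):
    """jacobian[i][j] = d y_i / d b_j; params[j] = b_j; weights[i] = omega_i.
    Returns one CSS value per parameter: the root-mean-square of the
    scaled, weighted sensitivities over the ND observations."""
    nd = len(jacobian)
    css = []
    for j, b in enumerate(params):
        total = sum((row[j] * b * sqrt(w)) ** 2
                    for row, w in zip(jacobian, weights))
        css.append(sqrt(total / nd))
    return css

# Hypothetical two-parameter model with three observations:
# parameter 1 is well informed by the data, parameter 2 barely at all.
J = [[2.0, 0.01], [1.0, 0.02], [3.0, 0.0]]
b = [1.0, 10.0]
w = [1.0, 1.0, 1.0]
css = composite_scaled_sensitivity(J, b, w)
```

A low CSS alone does not prove a parameter is unidentifiable, which is why the comment advocates examining CSS jointly with parameter correlation coefficients.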

  9. Calibration sets and the accuracy of vibrational scaling factors: A case study with the X3LYP hybrid functional

    NASA Astrophysics Data System (ADS)

    Teixeira, Filipe; Melo, André; Cordeiro, M. Natália D. S.

    2010-09-01

    A linear least-squares methodology was used to determine the vibrational scaling factors for the X3LYP density functional. Uncertainties for these scaling factors were calculated according to the method devised by Irikura et al. [J. Phys. Chem. A 109, 8430 (2005)]. The calibration set was systematically partitioned according to several of its descriptors and the scaling factors for X3LYP were recalculated for each subset. The results show that the scaling factors are only significant up to the second digit, irrespective of the calibration set used. Furthermore, multivariate statistical analysis allowed us to conclude that the scaling factors and the associated uncertainties are independent of the size of the calibration set and strongly suggest the practical impossibility of obtaining vibrational scaling factors with more than two significant digits.

  10. Calibration sets and the accuracy of vibrational scaling factors: a case study with the X3LYP hybrid functional.

    PubMed

    Teixeira, Filipe; Melo, André; Cordeiro, M Natália D S

    2010-09-21

    A linear least-squares methodology was used to determine the vibrational scaling factors for the X3LYP density functional. Uncertainties for these scaling factors were calculated according to the method devised by Irikura et al. [J. Phys. Chem. A 109, 8430 (2005)]. The calibration set was systematically partitioned according to several of its descriptors and the scaling factors for X3LYP were recalculated for each subset. The results show that the scaling factors are only significant up to the second digit, irrespective of the calibration set used. Furthermore, multivariate statistical analysis allowed us to conclude that the scaling factors and the associated uncertainties are independent of the size of the calibration set and strongly suggest the practical impossibility of obtaining vibrational scaling factors with more than two significant digits.
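The linear least-squares fit described in these two records conventionally seeks the multiplier c minimizing Σ(c·ω_i − ν_i)² over calculated harmonic frequencies ω and experimental frequencies ν, which gives c = Σω_iν_i / Σω_i². A sketch with hypothetical frequencies (not the X3LYP calibration set, and without Irikura-style uncertainty propagation):

```python
from math import sqrt

def scaling_factor(calc, expt):
    """Least-squares multiplicative scaling factor c minimizing
    sum_i (c*calc_i - expt_i)^2, plus the rms residual of the fit."""
    c = sum(w * v for w, v in zip(calc, expt)) / sum(w * w for w in calc)
    rms = sqrt(sum((c * w - v) ** 2 for w, v in zip(calc, expt)) / len(calc))
    return c, rms

# Hypothetical frequencies in cm^-1 (calculated harmonic vs experimental);
# calculated harmonic frequencies typically overshoot, so c falls below 1.
calc = [3100.0, 1650.0, 1200.0, 900.0]
expt = [2980.0, 1590.0, 1160.0, 870.0]
c, rms = scaling_factor(calc, expt)
```

The paper's conclusion that only two digits of c are significant amounts to saying that the uncertainty attached to c (estimated from residuals like rms above) dominates beyond the second decimal place.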

  11. Development and Psychometric Assessment of the Healthcare Provider Cultural Competence Instrument

    PubMed Central

    Schwarz, Joshua L.; Witte, Raymond; Sellers, Sherrill L.; Luzadis, Rebecca A.; Weiner, Judith L.; Domingo-Snyder, Eloiza; Page, James E.

    2015-01-01

    This study presents the measurement properties of 5 scales used in the Healthcare Provider Cultural Competence Instrument (HPCCI). The HPCCI measures a health care provider’s cultural competence along 5 primary dimensions: (1) awareness/sensitivity, (2) behaviors, (3) patient-centered communication, (4) practice orientation, and (5) self-assessment. Exploratory factor analysis demonstrated that the 5 scales were distinct, and within each scale items loaded as expected. Reliability statistics indicated a high level of internal consistency within each scale. The results indicate that the HPCCI effectively measures the cultural competence of health care providers and can provide useful professional feedback for practitioners and organizations seeking to increase a practitioner’s cultural competence. PMID:25911617
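The internal-consistency reliability reported for each HPCCI scale is typically Cronbach's alpha, α = k/(k−1) · (1 − Σ s²_item / s²_total). A minimal sketch on hypothetical item responses (not the HPCCI data):

```python
def variance(xs):
    """Population variance of a list of scores."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(items):
    """items: one list of scores per item, aligned across respondents.
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent sums
    item_var = sum(variance(it) for it in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# Hypothetical 3-item scale, 5 respondents (higher = stronger agreement);
# responses co-vary across items, so alpha comes out high.
items = [[4, 5, 3, 4, 2],
         [4, 4, 3, 5, 2],
         [5, 5, 2, 4, 1]]
alpha = cronbach_alpha(items)
```

Values of alpha near 1 indicate that the items move together, which is what "a high level of internal consistency within each scale" summarizes.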

  12. Statistical analysis of Hasegawa-Wakatani turbulence

    NASA Astrophysics Data System (ADS)

    Anderson, Johan; Hnat, Bogdan

    2017-06-01

    Resistive drift wave turbulence is a multipurpose paradigm that can be used to understand transport at the edge of fusion devices. The Hasegawa-Wakatani model captures the essential physics of drift turbulence while retaining the simplicity needed to gain a qualitative understanding of this process. We provide a theoretical interpretation of numerically generated probability density functions (PDFs) of intermittent events in Hasegawa-Wakatani turbulence with enforced equipartition of energy between large-scale zonal flows and small-scale drift turbulence. We find that for a wide range of adiabatic index values, the stochastic component representing the small-scale turbulent eddies of the flow, obtained from the autoregressive integrated moving average model, exhibits super-diffusive statistics, consistent with intermittent transport. The PDFs of large events (above one standard deviation) are well approximated by the Laplace distribution, while small events often exhibit a Gaussian character. Furthermore, there exists a strong influence of zonal flows, for example via shearing and subsequent viscous dissipation, maintaining the sub-diffusive character of the fluxes.
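The Laplace fit mentioned for the large-event PDFs is simple enough to sketch: the maximum-likelihood location estimate for a Laplace distribution is the sample median, and the scale estimate is the mean absolute deviation about it. A toy illustration on synthetic data (not the simulation output of this paper):

```python
import random
from statistics import median

def laplace_mle(xs):
    """Maximum-likelihood Laplace parameters: location mu = sample median,
    scale b = mean absolute deviation about the median."""
    mu = median(xs)
    b = sum(abs(x - mu) for x in xs) / len(xs)
    return mu, b

random.seed(1)
# A Laplace(0, b) variate is the difference of two iid Exponential(1/b)
# variates; expovariate takes the rate, so 1/2 gives mean 2 and b = 2.
data = [random.expovariate(1 / 2) - random.expovariate(1 / 2)
        for _ in range(20000)]
mu, b = laplace_mle(data)  # expect mu near 0 and b near 2
```

The heavier exponential tails of the Laplace density, relative to a Gaussian, are what make it a better description of the large intermittent events.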

  13. Statistical characteristics of austral summer cyclones in Southern Ocean

    NASA Astrophysics Data System (ADS)

    Liu, Na; Fu, Gang; Kuo, Ying-Hwa

    2012-06-01

    Characteristics of cyclones and explosively developing cyclones (or `bombs') over the Southern Ocean in austral summer (December, January and February) from 2004 to 2008 are analyzed by using the Final Analysis (FNL) data produced by the National Centers for Environmental Prediction (NCEP) of the United States. Statistical results show that both cyclones and explosively developing cyclones frequently develop in January, and most of them occur within the latitudinal zone between 55°S and 70°S. These cyclones gradually approach the Antarctic Continent from December to February. Generally, cyclones and bombs move east-southeastward, with some exceptions moving northeastward. The lifetime of cyclones is around 2-6 days, and their horizontal scale is about 1000 km. Explosive cyclones have lifetimes of about 1 week, with horizontal scales reaching up to 3000 km. Compared with cyclones developed in the Northern Hemisphere, cyclones over the Southern Ocean have a much higher occurrence frequency, lower central pressure and larger horizontal scale, which may be caused by the unique geographical features of the Southern Hemisphere.

  14. Short forms of the Texas Social Behavior Inventory (TSBI), an objective measure of self-esteem

    NASA Technical Reports Server (NTRS)

    Helmreich, R.; Stapp, J.

    1974-01-01

    Two short (16-item) forms of the Helmreich, Stapp, and Ervin (1974) Texas Social Behavior Inventory, a validated, objective measure of self-esteem or social competence, are presented. Normative data and other statistics are described for males and females. Correlations between each short form and the long (32-item) scale were .97. Factor analysis and part-whole correlations verified the similarity of the two forms. The utility of the scale in research is described.

  15. SQC: secure quality control for meta-analysis of genome-wide association studies.

    PubMed

    Huang, Zhicong; Lin, Huang; Fellay, Jacques; Kutalik, Zoltán; Hubaux, Jean-Pierre

    2017-08-01

    Due to the limited power of small-scale genome-wide association studies (GWAS), researchers tend to collaborate and establish a larger consortium in order to perform large-scale GWAS. Genome-wide association meta-analysis (GWAMA) is a statistical tool that aims to synthesize results from multiple independent studies to increase the statistical power and reduce false-positive findings of GWAS. However, it has been demonstrated that the aggregate data of individual studies are subject to inference attacks, hence privacy concerns arise when researchers share study data in GWAMA. In this article, we propose a secure quality control (SQC) protocol, which enables checking the quality of data in a privacy-preserving way without revealing sensitive information to a potential adversary. SQC employs state-of-the-art cryptographic and statistical techniques for privacy protection. We implement the solution in a meta-analysis pipeline with real data to demonstrate the efficiency and scalability on commodity machines. The distributed execution of SQC on a cluster of 128 cores for one million genetic variants takes less than one hour, which is a modest cost considering the 10-month time span usually observed for the completion of the QC procedure that includes timing of logistics. SQC is implemented in Java and is publicly available at https://github.com/acs6610987/secureqc. jean-pierre.hubaux@epfl.ch. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  16. The Impact of Arts Activity on Nursing Staff Well-Being: An Intervention in the Workplace

    PubMed Central

    Karpavičiūtė, Simona; Macijauskienė, Jūratė

    2016-01-01

    Over 59 million workers are employed in the healthcare sector globally, with a daily risk of being exposed to a complex variety of health and safety hazards. The purpose of this study was to investigate the impact of arts activity on the well-being of nursing staff. During October–December 2014, 115 nursing staff working in a hospital took part in this study, which lasted for 10 weeks. The intervention group (n = 56) took part in silk painting activities once a week. Data were collected using socio-demographic questions, the Warwick-Edinburgh Mental Well-Being Scale, the Short Form-36 Health Survey questionnaire, the Reeder stress scale, and the Multidimensional Fatigue Inventory (before and after the art activities in both groups). Statistical data analysis included descriptive statistics (frequency, percentage, mean, standard deviation), non-parametric statistics (Mann-Whitney U test; Wilcoxon signed-ranks test), Fisher's exact test and reliability analysis (Cronbach's alpha). The level of significance was set at p ≤ 0.05. In the intervention group, there was a tendency for participation in arts activity to have a positive impact on general health and mental well-being, reducing stress and fatigue, awakening creativity and increasing a sense of community at work. The control group did not show any improvements. Of the intervention group, 93% reported enjoyment, with 75% aspiring to continue arts activity in the future. This research suggests that arts activity, as a workplace intervention, can be used to promote nursing staff well-being at work. PMID:27104550

  17. A statistically derived index for classifying East Coast fever reactions in cattle challenged with Theileria parva under experimental conditions.

    PubMed

    Rowlands, G J; Musoke, A J; Morzaria, S P; Nagda, S M; Ballingall, K T; McKeever, D J

    2000-04-01

    A statistically derived disease reaction index based on parasitological, clinical and haematological measurements observed in 309 Boran cattle, aged 5 to 8 months, following laboratory challenge with Theileria parva is described. Principal component analysis was applied to 13 measures, including the first appearance of schizonts, first appearance of piroplasms and first occurrence of pyrexia, together with the duration and severity of these symptoms, and white blood cell count. The first principal component, which was based on approximately equal contributions of the 13 variables, provided the definition of the disease reaction index, defined on a scale of 0-10. As well as providing a more objective measure of the severity of the reaction, the continuous nature of the index score enables more powerful statistical analysis of the data than was previously possible with the clinically derived categories of non-, mild, moderate and severe reactions.
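The index construction described, scoring each animal by the first principal component of its standardized measurements and rescaling to 0-10, can be sketched as follows (toy data with 3 correlated variables standing in for the paper's 13 clinical measures):

```python
import numpy as np

def first_pc_index(X, low=0.0, high=10.0):
    """Score each row of X by the first principal component of its
    standardized columns, then rescale the scores linearly to [low, high].
    Note the sign of a principal component is arbitrary, so the scale
    may need flipping to make 'high' mean 'severe'."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    # Eigenvector of the correlation matrix with the largest eigenvalue
    vals, vecs = np.linalg.eigh(np.corrcoef(Z, rowvar=False))
    pc1 = vecs[:, np.argmax(vals)]
    scores = Z @ pc1
    span = scores.max() - scores.min()
    return low + (high - low) * (scores - scores.min()) / span

rng = np.random.default_rng(2)
# Toy "disease severity" data: 3 noisy, correlated measurements per animal
severity = rng.uniform(0, 1, 50)
X = np.column_stack([severity + rng.normal(0, 0.1, 50) for _ in range(3)])
index = first_pc_index(X)
```

Because the measurements share one dominant source of variation, the first component absorbs it, and the rescaled scores track the underlying severity closely.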

  18. Multivariate statistical characterization of charged and uncharged domain walls in multiferroic hexagonal YMnO3 single crystal visualized by a spherical aberration-corrected STEM.

    PubMed

    Matsumoto, Takao; Ishikawa, Ryo; Tohei, Tetsuya; Kimura, Hideo; Yao, Qiwen; Zhao, Hongyang; Wang, Xiaolin; Chen, Dapeng; Cheng, Zhenxiang; Shibata, Naoya; Ikuhara, Yuichi

    2013-10-09

    A state-of-the-art spherical aberration-corrected STEM was fully utilized to directly visualize the multiferroic domain structure in a hexagonal YMnO3 single crystal at the atomic scale. With the aid of multivariate statistical analysis (MSA), we obtained unbiased, quantitative maps of the ferroelectric domain structures with atomic resolution. This statistical image analysis of the transition region between opposite polarizations confirmed atomically sharp transitions of the ferroelectric polarization at both antiparallel (uncharged) and tail-to-tail 180° (charged) domain boundaries. Through the analysis, a subatomic image shift of the Mn-O layers correlated with that of the Y layers, exhibiting a double-arc shape of reversed curvatures, has been elucidated. The image shift of the Mn-O layers along the c-axis is statistically significant at as little as 0.016 nm, roughly one-third of the evident 0.048 nm image shift of the Y layers. Interestingly, a careful analysis has shown that this subatomic image shift in the Mn-O layers vanishes at the tail-to-tail 180° domain boundaries. Furthermore, taking advantage of the annular bright-field (ABF) imaging technique combined with MSA, the tilting of the MnO5 bipyramids, the core mechanism of multiferroicity in this material, is evaluated.

  19. Measuring trust in nurses - Psychometric properties of the Trust in Nurses Scale in four countries.

    PubMed

    Stolt, Minna; Charalambous, Andreas; Radwin, Laurel; Adam, Christina; Katajisto, Jouko; Lemonidou, Chryssoula; Patiraki, Elisabeth; Sjövall, Katarina; Suhonen, Riitta

    2016-12-01

    The purpose of this study was to examine psychometric properties of three translated versions of the Trust in Nurses Scale (TNS) and cancer patients' perceptions of trust in nurses in a sample of cancer patients from four European countries. A cross-sectional, cross-cultural, multi-site survey design was used. The data were collected with the Trust in Nurses Scale from patients with different types of malignancies in 17 units within five clinical sites (n = 599) between 09/2012 and 06/2014. Data were analyzed using descriptive and inferential statistics, multivariate methods and psychometrics using exploratory factor analysis, Cronbach's alpha coefficients, item analysis and Rasch analysis. The psychometric properties of the data were consistent in all countries. Within the exploratory factor analysis the principal component analysis supported the one component structure (unidimensionality) of the TNS. The internal consistency reliability was acceptable. The Rasch analysis supported the unidimensionality of the TNS cross-culturally. All items of the TNS demonstrated acceptable goodness-of-fit to the Rasch model. Cancer patients trusted nurses to a great extent although between-country differences were found. The Trust in Nurses Scale proved to be a valid and reliable tool for measuring patients' trust in nurses in oncological settings in international contexts. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Periodontal disease and carotid atherosclerosis: A meta-analysis of 17,330 participants.

    PubMed

    Zeng, Xian-Tao; Leng, Wei-Dong; Lam, Yat-Yin; Yan, Bryan P; Wei, Xue-Mei; Weng, Hong; Kwong, Joey S W

    2016-01-15

    The association between periodontal disease and carotid atherosclerosis has been evaluated primarily in single-center studies, and whether periodontal disease is an independent risk factor of carotid atherosclerosis remains uncertain. This meta-analysis aimed to evaluate the association between periodontal disease and carotid atherosclerosis. We searched PubMed and Embase for relevant observational studies up to February 20, 2015. Two authors independently extracted data from included studies, and odds ratios (ORs) with 95% confidence intervals (CIs) were calculated for overall and subgroup meta-analyses. Statistical heterogeneity was assessed by the chi-squared test (P<0.1 for statistical significance) and quantified by the I² statistic. Data analysis was conducted using the Comprehensive Meta-Analysis (CMA) software. Fifteen observational studies involving 17,330 participants were included in the meta-analysis. The overall pooled result showed that periodontal disease was associated with carotid atherosclerosis (OR: 1.27, 95% CI: 1.14-1.41; P<0.001) but statistical heterogeneity was substantial (I² = 78.90%). Subgroup analysis adjusted for smoking and diabetes mellitus showed borderline significance (OR: 1.08; 95% CI: 1.00-1.18; P=0.05). Sensitivity and cumulative analyses both indicated that our results were robust. Findings of our meta-analysis indicated that the presence of periodontal disease was associated with carotid atherosclerosis; however, further large-scale, well-conducted clinical studies are needed to explore the precise risk of developing carotid atherosclerosis in patients with periodontal disease. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
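
    The pooling and heterogeneity statistics mentioned above can be illustrated with the simplest fixed-effect inverse-variance method, recovering standard errors from the reported 95% CIs. The three study results below are made up for illustration, and the abstract's own analysis used the CMA software rather than this sketch:

```python
import math

def pool_fixed(ors, ci_los, ci_his):
    """Fixed-effect inverse-variance pooling of odds ratios; standard errors
    are recovered from the 95% CIs on the log scale. Returns the pooled OR,
    Cochran's Q and I^2 (percent)."""
    logs = [math.log(o) for o in ors]
    ses = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for lo, hi in zip(ci_los, ci_his)]
    ws = [1 / se ** 2 for se in ses]
    pooled = sum(w * l for w, l in zip(ws, logs)) / sum(ws)
    q = sum(w * (l - pooled) ** 2 for w, l in zip(ws, logs))
    df = len(ors) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return math.exp(pooled), q, i2

# hypothetical study results (not the 15 studies pooled in this meta-analysis)
or_, q, i2 = pool_fixed([1.4, 1.1, 1.3], [1.1, 0.9, 1.0], [1.8, 1.35, 1.7])
print(round(or_, 2), round(i2, 1))
```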

  1. Universal fractal scaling in stream chemistry and its implications for solute transport and water quality trend detection

    PubMed Central

    Kirchner, James W.; Neal, Colin

    2013-01-01

    The chemical dynamics of lakes and streams affect their suitability as aquatic habitats and as water supplies for human needs. Because water quality is typically monitored only weekly or monthly, however, the higher-frequency dynamics of stream chemistry have remained largely invisible. To illuminate a wider spectrum of water quality dynamics, rainfall and streamflow were sampled in two headwater catchments at Plynlimon, Wales, at 7-h intervals for 1–2 y and weekly for over two decades, and were analyzed for 45 solutes spanning the periodic table from H+ to U. Here we show that in streamflow, all 45 of these solutes, including nutrients, trace elements, and toxic metals, exhibit fractal 1/f^α scaling on time scales from hours to decades (α = 1.05 ± 0.15, mean ± SD). We show that this fractal scaling can arise through dispersion of random chemical inputs distributed across a catchment. These 1/f time series are non-self-averaging: monthly, yearly, or decadal averages are approximately as variable, one from the next, as individual measurements taken hours or days apart, defying naive statistical expectations. (By contrast, stream discharge itself is nonfractal, and self-averaging on time scales of months and longer.) In the solute time series, statistically significant trends arise much more frequently, on all time scales, than one would expect from conventional t statistics. However, these same trends are poor predictors of future trends, much poorer than one would expect from their calculated uncertainties. Our results illustrate how 1/f time series pose fundamental challenges to trend analysis and change detection in environmental systems. PMID:23842090

  2. Statistical analyses support power law distributions found in neuronal avalanches.

    PubMed

    Klaus, Andreas; Yu, Shan; Plenz, Dietmar

    2011-01-01

    The size distribution of neuronal avalanches in cortical networks has been reported to follow a power law distribution with exponent close to -1.5, which is a reflection of long-range spatial correlations in spontaneous neuronal activity. However, identifying power law scaling in empirical data can be difficult and sometimes controversial. In the present study, we tested the power law hypothesis for neuronal avalanches by using more stringent statistical analyses. In particular, we performed the following steps: (i) analysis of finite-size scaling to identify scale-free dynamics in neuronal avalanches, (ii) model parameter estimation to determine the specific exponent of the power law, and (iii) comparison of the power law to alternative model distributions. Consistent with critical state dynamics, avalanche size distributions exhibited robust scaling behavior in which the maximum avalanche size was limited only by the spatial extent of sampling ("finite size" effect). This scale-free dynamics suggests the power law as a model for the distribution of avalanche sizes. Using both the Kolmogorov-Smirnov statistic and a maximum likelihood approach, we found the slope to be close to -1.5, which is in line with previous reports. Finally, the power law model for neuronal avalanches was compared to the exponential and to various heavy-tail distributions based on the Kolmogorov-Smirnov distance and by using a log-likelihood ratio test. Both the power law distribution without and with exponential cut-off provided significantly better fits to the cluster size distributions in neuronal avalanches than the exponential, the lognormal and the gamma distribution. In summary, our findings strongly support the power law scaling in neuronal avalanches, providing further evidence for critical state dynamics in superficial layers of cortex.
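
    The exponent estimation and goodness-of-fit steps above can be sketched with the continuous maximum-likelihood estimator and the Kolmogorov-Smirnov distance. This is a simplified stand-in for the study's full model-comparison procedure, and the avalanche sizes below are synthetic:

```python
import numpy as np

def fit_power_law(x, xmin):
    """Continuous MLE for the exponent of p(x) ~ x^(-alpha), x >= xmin,
    plus the Kolmogorov-Smirnov distance to the fitted CDF."""
    x = np.sort(x[x >= xmin])
    n = x.size
    alpha = 1 + n / np.sum(np.log(x / xmin))       # maximum-likelihood estimator
    cdf_model = 1 - (x / xmin) ** (1 - alpha)      # fitted CDF at the data points
    cdf_emp = np.arange(1, n + 1) / n              # empirical CDF
    return alpha, np.max(np.abs(cdf_emp - cdf_model))

# synthetic avalanche sizes from a pure power law with exponent 1.5
rng = np.random.default_rng(1)
u = rng.uniform(size=20000)
sizes = (1 - u) ** (-1 / (1.5 - 1))                # inverse-CDF sampling, xmin = 1
alpha, ks = fit_power_law(sizes, xmin=1.0)
print(round(alpha, 2), round(ks, 3))               # alpha close to 1.5, small KS
```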

  3. Universal fractal scaling in stream chemistry and its implications for solute transport and water quality trend detection

    NASA Astrophysics Data System (ADS)

    Kirchner, James W.; Neal, Colin

    2013-07-01

    The chemical dynamics of lakes and streams affect their suitability as aquatic habitats and as water supplies for human needs. Because water quality is typically monitored only weekly or monthly, however, the higher-frequency dynamics of stream chemistry have remained largely invisible. To illuminate a wider spectrum of water quality dynamics, rainfall and streamflow were sampled in two headwater catchments at Plynlimon, Wales, at 7-h intervals for 1-2 y and weekly for over two decades, and were analyzed for 45 solutes spanning the periodic table from H+ to U. Here we show that in streamflow, all 45 of these solutes, including nutrients, trace elements, and toxic metals, exhibit fractal 1/f^α scaling on time scales from hours to decades (α = 1.05 ± 0.15, mean ± SD). We show that this fractal scaling can arise through dispersion of random chemical inputs distributed across a catchment. These 1/f time series are non-self-averaging: monthly, yearly, or decadal averages are approximately as variable, one from the next, as individual measurements taken hours or days apart, defying naive statistical expectations. (By contrast, stream discharge itself is nonfractal, and self-averaging on time scales of months and longer.) In the solute time series, statistically significant trends arise much more frequently, on all time scales, than one would expect from conventional t statistics. However, these same trends are poor predictors of future trends, much poorer than one would expect from their calculated uncertainties. Our results illustrate how 1/f time series pose fundamental challenges to trend analysis and change detection in environmental systems.
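
    The 1/f^α scaling reported above can be illustrated by estimating the spectral exponent as the log-log slope of the periodogram. The series below is synthesized with a known α = 1, and this estimator is a minimal sketch rather than the study's actual spectral method:

```python
import numpy as np

def spectral_slope(x, dt=1.0):
    """Least-squares slope of log power vs log frequency; a 1/f^alpha
    series gives a slope of approximately -alpha."""
    n = x.size
    freqs = np.fft.rfftfreq(n, d=dt)[1:]           # drop the zero frequency
    power = np.abs(np.fft.rfft(x - x.mean()))[1:] ** 2
    return np.polyfit(np.log(freqs), np.log(power), 1)[0]

# synthesize 1/f noise (alpha = 1) by shaping random phases in the frequency domain
rng = np.random.default_rng(2)
n, alpha = 2 ** 14, 1.0
amp = np.zeros(n // 2 + 1)
amp[1:] = np.fft.rfftfreq(n)[1:] ** (-alpha / 2)   # |X(f)| ~ f^(-alpha/2)
phases = np.exp(2j * np.pi * rng.uniform(size=n // 2 + 1))
phases[-1] = 1.0                                   # Nyquist component must be real
series = np.fft.irfft(amp * phases, n)
print(round(-spectral_slope(series), 2))           # recovers alpha = 1.0
```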

  4. Investigation of Statistical Inference Methodologies Through Scale Model Propagation Experiments

    DTIC Science & Technology

    2015-09-30

    statistical inference methodologies for ocean-acoustic problems by investigating and applying statistical methods to data collected from scale-model...to begin planning experiments for statistical inference applications. APPROACH In the ocean acoustics community over the past two decades...solutions for waveguide parameters. With the introduction of statistical inference to the field of ocean acoustics came the desire to interpret marginal

  5. Psychometric properties of the Trust in Physician Scale in Tamil Nadu, India.

    PubMed

    Kalsingh, Maria Jusler; Veliah, Geetha; Gopichandran, Vijayaprasad

    2017-01-01

    Trust in health care is of high intrinsic value. It also leads to positive outcomes such as better treatment adherence and disclosure of sensitive information. Therefore, there is a need to measure trust in health care objectively. To assess the psychometric properties of the Trust in Physician Scale in Tamil Nadu, India, the study was conducted in a private tertiary hospital setting in Tamil Nadu using a cross-sectional survey design. The Trust in Physician Scale and General Trust Scale were administered to 288 participants in the waiting area of a tertiary care hospital in Tamil Nadu. Descriptive statistics, exploratory factor analysis, and Cronbach's alpha statistics were used to assess the validity and reliability of the scale. The respondents were predominantly men from rural areas, older than 35 years of age, and with fewer than 8 years of schooling. The questionnaire had acceptable internal consistency, with a Cronbach's alpha of 0.707 (95% confidence interval 0.654-0.755). Exploratory factor analysis divided the questionnaire into four domains. Seven items loaded on factor 1, which captured the dependability and competence of the physician; two items loaded on factor 2, and one each on factors 3 and 4. The latter four items had very low item-to-total correlations and hence did not contribute much to the questionnaire. The Trust in Physician questionnaire needs to be modified to accurately measure the domains of trust in the context of the study area. More qualitative studies are required to understand the domains of trust in this cultural and social context.
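
    The internal-consistency statistic reported above follows a simple formula: alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A sketch on synthetic Likert-type responses (the data, item count and latent-trait model are illustrative assumptions, not the survey data):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()    # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the total score
    return k / (k - 1) * (1 - item_vars / total_var)

# synthetic 1-5 Likert responses driven by one latent trait (illustrative only)
rng = np.random.default_rng(3)
trait = rng.normal(size=288)
scores = np.clip(np.round(3 + trait[:, None] + rng.normal(scale=1.0, size=(288, 11))), 1, 5)
print(round(cronbach_alpha(scores), 2))            # high internal consistency
```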

  6. Screening for depressive disorders using the Mood and Anxiety Symptoms Questionnaire Anhedonic Depression Scale: a receiver-operating characteristic analysis.

    PubMed

    Bredemeier, Keith; Spielberg, Jeffery M; Silton, Rebecca Levin; Berenbaum, Howard; Heller, Wendy; Miller, Gregory A

    2010-09-01

    The present study examined the utility of the anhedonic depression scale from the Mood and Anxiety Symptoms Questionnaire (MASQ-AD scale) as a way to screen for depressive disorders. Using receiver-operating characteristic analysis, we examined the sensitivity and specificity of the full 22-item MASQ-AD scale, as well as the 8- and 14-item subscales, in relation to both current and lifetime Diagnostic and Statistical Manual of Mental Disorders (4th ed.) depressive disorder diagnoses in two nonpatient samples. As a means of comparison, the sensitivity and specificity of a measure of a relevant personality dimension, Neuroticism, was also examined. Results from both samples support the clinical utility of the MASQ-AD scale as a means of screening for depressive disorders. Findings were strongest for the MASQ-AD 8-item subscale and when predicting current depression status. Furthermore, the MASQ-AD 8-item subscale outperformed the Neuroticism measure under certain conditions. The overall usefulness of the MASQ-AD scale as a screening device is discussed, as are possible cutoff scores for use in research.
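
    The sensitivity/specificity trade-off summarized by the ROC area can be sketched with the rank (Mann-Whitney) formulation of the AUC. The case and control score distributions below are hypothetical, not MASQ-AD data:

```python
import numpy as np

def roc_auc(scores, labels):
    """Area under the ROC curve via the rank (Mann-Whitney) formulation:
    the probability that a random positive outscores a random negative."""
    scores = np.asarray(scores, float)
    labels = np.asarray(labels, bool)
    pos, neg = scores[labels], scores[~labels]
    # count pairwise wins; ties count half
    wins = (pos[:, None] > neg[None, :]).sum() + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (pos.size * neg.size)

# hypothetical screening scores: cases tend to score higher than controls
rng = np.random.default_rng(4)
cases = rng.normal(1.2, 1.0, 60)
controls = rng.normal(0.0, 1.0, 240)
scores = np.concatenate([cases, controls])
labels = np.concatenate([np.ones(60, bool), np.zeros(240, bool)])
print(round(roc_auc(scores, labels), 2))           # near the theoretical 0.80
```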

  7. A fast boosting-based screening method for large-scale association study in complex traits with genetic heterogeneity.

    PubMed

    Wang, Lu-Yong; Fasulo, D

    2006-01-01

    Genome-wide association studies for complex diseases will generate massive amounts of single nucleotide polymorphism (SNP) data. Univariate statistical tests (e.g., the Fisher exact test) are used to single out non-associated SNPs. However, disease-susceptible SNPs may have little marginal effect in the population and are unlikely to be retained after the univariate tests. Also, model-based methods are impractical for large-scale datasets. Moreover, genetic heterogeneity makes it harder for traditional methods to identify the genetic causes of diseases. The more recent random forest method provides a more robust way of screening SNPs on the scale of thousands. However, for still larger data, e.g., Affymetrix Human Mapping 100K GeneChip data, a faster method is required for screening SNPs in whole-genome, large-scale association analysis with genetic heterogeneity. We propose a boosting-based method for rapid screening in large-scale analysis of complex traits in the presence of genetic heterogeneity. It provides a relatively fast and fairly good tool for screening and limiting the candidate SNPs for further, more complex computational modeling tasks.

  8. Percolation Analysis as a Tool to Describe the Topology of the Large Scale Structure of the Universe

    NASA Astrophysics Data System (ADS)

    Yess, Capp D.

    1997-09-01

    Percolation analysis is the study of the properties of clusters. In cosmology, it is the statistics of the size and number of clusters. This thesis presents a refinement of percolation analysis and its application to astronomical data. An overview of the standard model of the universe and the development of large scale structure is presented in order to place the study in historical and scientific context. Then, using percolation statistics, we for the first time demonstrate the universal character of a network pattern in the real-space mass distributions resulting from nonlinear gravitational instability of initial Gaussian fluctuations. We also find that the maximum of the number-of-clusters statistic in the evolved, nonlinear distributions is determined by the effective slope of the power spectrum. Next, we present percolation analyses of Wiener Reconstructions of the IRAS 1.2 Jy Redshift Survey. There are ten reconstructions of galaxy density fields in real space spanning the range β = 0.1 to 1.0, where β = Ω^0.6/b, Ω is the present dimensionless density and b is the linear bias factor. Our method uses the growth of the largest cluster statistic to characterize the topology of a density field, where Gaussian randomized versions of the reconstructions are used as standards for analysis. For the reconstruction volume of radius R ≈ 100 h^-1 Mpc, percolation analysis reveals a slight 'meatball' topology for the real-space galaxy distribution of the IRAS survey. Finally, we employ a percolation technique developed for pointwise distributions to analyze two-dimensional projections of the three northern and three southern slices in the Las Campanas Redshift Survey, and then give consideration to further study of the methodology, errors and application of percolation.
We track the growth of the largest cluster as a topological indicator to a depth of 400 h^-1 Mpc, and report an unambiguous signal, with high signal-to-noise ratio, indicating a network topology which in two dimensions is indicative of a filamentary distribution. It is hoped that one day percolation analysis can characterize the structure of the universe to a degree that will aid theorists in confidently describing the nature of our world.

  9. Wavelet and receiver operating characteristic analysis of heart rate variability

    NASA Astrophysics Data System (ADS)

    McCaffery, G.; Griffith, T. M.; Naka, K.; Frennaux, M. P.; Matthai, C. C.

    2002-02-01

    Multiresolution wavelet analysis has been used to study heart rate variability in two classes of patients with different pathological conditions. The scale-dependent measure of Thurner et al. was found to be statistically significant in discriminating patients suffering from hypertrophic cardiomyopathy from a control set of normal subjects. We have performed receiver operating characteristic (ROC) analysis and found the ROC area to be a useful measure with which to quantify the significance of the discrimination, as well as to describe the severity of heart dysfunction.
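
    The scale-dependent measure referred to above, the standard deviation of wavelet coefficients at each scale, can be sketched with a plain Haar transform (a simplified stand-in for the wavelets typically used in such studies; the test series below are synthetic, not interbeat intervals):

```python
import numpy as np

def haar_scale_measure(x, levels=5):
    """Standard deviation of Haar wavelet coefficients at each scale,
    in the spirit of the Thurner et al. interbeat-interval measure."""
    x = np.asarray(x, dtype=float)
    out = []
    for _ in range(levels):
        n = len(x) // 2 * 2
        d = (x[0:n:2] - x[1:n:2]) / np.sqrt(2)     # detail coefficients
        x = (x[0:n:2] + x[1:n:2]) / np.sqrt(2)     # approximation for next level
        out.append(d.std(ddof=1))
    return out

# illustrative series: white noise vs a smoother random walk
rng = np.random.default_rng(5)
white = rng.normal(size=4096)
walk = np.cumsum(rng.normal(size=4096))
print([round(v, 2) for v in haar_scale_measure(white)])  # roughly flat across scales
print([round(v, 2) for v in haar_scale_measure(walk)])   # grows with scale
```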

  10. Morphological Properties of Siloxane-Hydrogel Contact Lens Surfaces.

    PubMed

    Stach, Sebastian; Ţălu, Ştefan; Trabattoni, Silvia; Tavazzi, Silvia; Głuchaczka, Alicja; Siek, Patrycja; Zając, Joanna; Giovanzana, Stefano

    2017-04-01

    The aim of this study was to quantitatively characterize the micromorphology of contact lens (CL) surfaces using atomic force microscopy (AFM) and multifractal analysis. AFM and multifractal analysis were used to characterize the topography of new and worn siloxane-hydrogel CLs made of Filcon V (I FDA group). CL surface roughness was studied by AFM in intermittent-contact mode, in air, on square areas of 25 and 100 μm², using a Nanoscope V MultiMode (Bruker). Detailed characterization of the surface topography was obtained using statistical parameters of 3-D (three-dimensional) surface roughness, in accordance with ISO 25178-2:2012. Before wear, the surface was found to be characterized by out-of-plane and sharp structures, whilst after a wear of 8 h, two typical morphologies were observed. One morphology (sharp type) has a similar aspect to the unworn CLs and the other morphology (smooth type) is characterized by troughs and bumpy structures. The analysis of the AFM images revealed a multifractal geometry. The generalized dimension Dq and the singularity spectrum f(α) provided quantitative values that characterize the local scale properties of CL surface geometry at the nanometer scale. Surface statistical parameters deduced by multifractal analysis can be used to assess the CL micromorphology and can be used by manufacturers in developing CLs with improved surface characteristics. These parameters can also be used in understanding the tribological interactions of the back surface of the CL with the corneal surface and the front surface of the CL with the under-surface of the eyelid (friction, wear, and micro-elastohydrodynamic lubrication at a nanometer scale).

  11. Spatiotemporal Drought Analysis and Drought Indices Comparison in India

    NASA Astrophysics Data System (ADS)

    Janardhanan, A.

    2017-12-01

    Droughts and floods are ever-occurring phenomena that have been wreaking havoc on humans since the start of time. As droughts occur on very large scales, studying them within a regional context can minimize confounding factors such as climate change. Droughts and floods are extremely erratic and very difficult to predict, and therefore necessitate modeling through advanced statistics. The SPI (Standardized Precipitation Index) and the SPEI (Standardized Precipitation Evapotranspiration Index) are two ways to temporally model drought and flood patterns across each meteorological sub-basin in India over a variety of time scales. The SPI accounts only for precipitation values, while the SPEI accounts for both precipitation and temperature and is commonly regarded as the more reliable drought index. Using monthly rainfall and temperature data from 1871-2016, these two indices were calculated. The results depict the drought and flood severity index, length of drought, and average SPI or SPEI value for each meteorological sub-region in India. A Wilcoxon rank-sum test was then conducted to determine whether these two indices differed for long-term drought analysis, and the drought return periods were analyzed to determine whether the population mean differed between the SPI and SPEI values. Our analysis found no statistical difference between SPI and SPEI with regard to long-term drought analysis. This indicates that temperature is not needed when modeling drought on a long-term time scale and that the SPI is just as effective as the SPEI, which has the potential to save considerable time and resources in calculating drought indices.
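
    The SPI calculation can be sketched in its simplest non-parametric form: convert each precipitation total to a percentile of the record and map it through the inverse standard normal CDF. The operational SPI instead fits a gamma distribution to precipitation (and the SPEI uses precipitation minus potential evapotranspiration); the rainfall values below are made up:

```python
from statistics import NormalDist

def empirical_spi(precip):
    """Empirical (non-parametric) SPI: rank-based plotting positions mapped
    through the inverse standard normal CDF. The operational SPI fits a
    gamma distribution instead; this is a simplified sketch."""
    n = len(precip)
    order = sorted(range(n), key=lambda i: precip[i])
    inv = NormalDist().inv_cdf
    spi = [0.0] * n
    for rank, i in enumerate(order, start=1):
        spi[i] = inv((rank - 0.5) / n)             # Hazen plotting position
    return spi

# hypothetical monthly rainfall totals (mm) for one sub-basin
rain = [12.0, 80.5, 45.2, 150.3, 5.1, 60.0, 33.3, 95.8, 70.2, 20.4, 110.9, 55.5]
spi = empirical_spi(rain)
print(min(spi) < -1.5 and max(spi) > 1.5)          # prints True: extremes flagged
```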

  12. Intercomparison of textural parameters of intertidal sediments generated by different statistical procedures, and implications for a unifying descriptive nomenclature

    NASA Astrophysics Data System (ADS)

    Fan, Daidu; Tu, Junbiao; Cai, Guofu; Shang, Shuai

    2015-06-01

    Grain-size analysis is a basic routine in sedimentology and related fields, but diverse methods of sample collection, processing and statistical analysis often make direct comparisons and interpretations difficult or even impossible. In this paper, 586 published grain-size datasets from the Qiantang Estuary (East China Sea) sampled and analyzed by the same procedures were merged and their textural parameters calculated by a percentile and two moment methods. The aim was to explore which of the statistical procedures performed best in the discrimination of three distinct sedimentary units on the tidal flats of the middle Qiantang Estuary. A Gaussian curve-fitting method served to simulate mixtures of two normal populations having different modal sizes, sorting values and size distributions, enabling a better understanding of the impact of finer tail components on textural parameters, as well as the proposal of a unifying descriptive nomenclature. The results show that percentile and moment procedures yield almost identical results for mean grain size, and that sorting values are also highly correlated. However, more complex relationships exist between percentile and moment skewness (kurtosis), changing from positive to negative correlations when the proportions of the finer populations decrease below 35% (10%). This change results from the overweighting of tail components in moment statistics, which stands in sharp contrast to the underweighting or complete amputation of small tail components by the percentile procedure. Intercomparisons of bivariate plots suggest an advantage of the Friedman & Johnson moment procedure over the McManus moment method in terms of the description of grain-size distributions, and over the percentile method by virtue of a greater sensitivity to small variations in tail components. 
The textural parameter scalings of Folk & Ward were translated into their Friedman & Johnson moment counterparts by application of mathematical functions derived by regression analysis of measured and modeled grain-size data, or by determining the abscissa values of intersections between auxiliary lines running parallel to the x-axis and vertical lines corresponding to the descriptive percentile limits along the ordinate of representative bivariate plots. Twofold limits were extrapolated for the moment statistics in relation to single descriptive terms in the cases of skewness and kurtosis by considering both positive and negative correlations between percentile and moment statistics. The extrapolated descriptive scalings were further validated by examining entire size-frequency distributions simulated by mixing two normal populations of designated modal size and sorting values, but varying in mixing ratios. These were found to match well in most of the proposed scalings, although platykurtic and very platykurtic categories were questionable when the proportion of the finer population was below 5%. Irrespective of the statistical procedure, descriptive nomenclatures should therefore be cautiously used when tail components contribute less than 5% to grain-size distributions.
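
    The contrast between percentile (Folk & Ward) and moment statistics discussed above can be sketched from a binned grain-size distribution in phi units. The distribution below is synthetic, and the half-phi class convention and interpolation at class upper limits are assumptions of the sketch:

```python
import numpy as np

def folk_ward(phi, weight):
    """Graphic (Folk & Ward) mean and sorting from percentiles of the
    cumulative grain-size curve (phi units, half-phi classes assumed)."""
    cum = np.cumsum(weight) / np.sum(weight) * 100
    p = lambda q: np.interp(q, cum, phi + 0.25)    # cumulative % at class upper limits
    mean = (p(16) + p(50) + p(84)) / 3
    sorting = (p(84) - p(16)) / 4 + (p(95) - p(5)) / 6.6
    return mean, sorting

def moment_stats(phi, weight):
    """Method-of-moments mean and sorting from the same binned distribution."""
    w = np.asarray(weight) / np.sum(weight)
    mean = np.sum(w * phi)
    sorting = np.sqrt(np.sum(w * (phi - mean) ** 2))
    return mean, sorting

# synthetic unimodal distribution: roughly normal, mean 3 phi, sd 0.8 phi
phi = np.arange(0.0, 8.0, 0.5)                     # half-phi class midpoints
weight = np.exp(-0.5 * ((phi - 3.0) / 0.8) ** 2)
print(folk_ward(phi, weight))                      # both near (3.0, 0.8)
print(moment_stats(phi, weight))
```

For a near-normal distribution the two procedures agree closely; the divergence discussed in the abstract appears when a second, finer population adds a tail that the percentile method truncates but the moments fully weight.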

  13. Application of multivariate analysis and mass transfer principles for refinement of a 3-L bioreactor scale-down model--when shake flasks mimic 15,000-L bioreactors better.

    PubMed

    Ahuja, Sanjeev; Jain, Shilpa; Ram, Kripa

    2015-01-01

    Characterization of manufacturing processes is key to understanding the effects of process parameters on process performance and product quality. These studies are generally conducted using small-scale model systems. Because of the importance of the results derived from these studies, the small-scale model should be predictive of large scale. Typically, small-scale bioreactors, which are considered superior to shake flasks in simulating large-scale bioreactors, are used as the scale-down models for characterizing mammalian cell culture processes. In this article, we describe a case study in which a cell culture unit operation run in bioreactors using one-sided pH control, together with its satellites (small-scale runs conducted using the same post-inoculation cultures and nutrient feeds) in 3-L bioreactors and shake flasks, indicated that the shake flasks mimicked the large-scale performance better than the 3-L bioreactors did. We detail here how multivariate analysis was used to make the pertinent assessment and to generate the hypothesis for refining the existing 3-L scale-down model. Relevant statistical techniques such as principal component analysis, partial least squares, orthogonal partial least squares, and discriminant analysis were used to identify the outliers and to determine the discriminatory variables responsible for performance differences at different scales. The resulting analysis, in combination with mass transfer principles, led to the hypothesis that the observed similarities between 15,000-L and shake flask runs, and the differences between 15,000-L and 3-L runs, were due to pCO2 and pH values. This hypothesis was confirmed by changing the aeration strategy at the 3-L scale. By reducing the initial sparge rate in the 3-L bioreactor, process performance and product quality data moved closer to those of the large scale. © 2015 American Institute of Chemical Engineers.

  14. The Attitudes to Ageing Questionnaire: Mokken Scaling Analysis

    PubMed Central

    Shenkin, Susan D.; Watson, Roger; Laidlaw, Ken; Starr, John M.; Deary, Ian J.

    2014-01-01

    Background Hierarchical scales are useful in understanding the structure of underlying latent traits in many questionnaires. The Attitudes to Ageing Questionnaire (AAQ) explored the attitudes to ageing of older people themselves, and originally described three distinct subscales: (1) Psychosocial Loss (2) Physical Change and (3) Psychological Growth. This study aimed to use Mokken analysis, a method of Item Response Theory, to test for hierarchies within the AAQ and to explore how these relate to underlying latent traits. Methods Participants in a longitudinal cohort study, the Lothian Birth Cohort 1936, completed a cross-sectional postal survey. Data from 802 participants were analysed using Mokken Scaling analysis. These results were compared with factor analysis using exploratory structural equation modelling. Results Participants were 51.6% male, mean age 74.0 years (SD 0.28). Three scales were identified from 18 of the 24 items: two weak Mokken scales and one moderate Mokken scale. (1) ‘Vitality’ contained a combination of items from all three previously determined factors of the AAQ, with a hierarchy from physical to psychosocial; (2) ‘Legacy’ contained items exclusively from the Psychological Growth scale, with a hierarchy from individual contributions to passing things on; (3) ‘Exclusion’ contained items from the Psychosocial Loss scale, with a hierarchy from general to specific instances. All of the scales were reliable and statistically significant with ‘Legacy’ showing invariant item ordering. The scales correlate as expected with personality, anxiety and depression. Exploratory SEM mostly confirmed the original factor structure. Conclusions The concurrent use of factor analysis and Mokken scaling provides additional information about the AAQ. The previously-described factor structure is mostly confirmed. 
Mokken scaling identifies a new factor relating to vitality, and a hierarchy of responses within three separate scales, referring to vitality, legacy and exclusion. This shows what older people themselves consider important regarding their own ageing. PMID:24892302

  15. An open-access CMIP5 pattern library for temperature and precipitation: description and methodology

    NASA Astrophysics Data System (ADS)

    Lynch, Cary; Hartin, Corinne; Bond-Lamberty, Ben; Kravitz, Ben

    2017-05-01

    Pattern scaling is used to efficiently emulate general circulation models and explore uncertainty in climate projections under multiple forcing scenarios. Pattern scaling methods assume that local climate changes scale with a global mean temperature increase, allowing for spatial patterns to be generated for multiple models for any future emission scenario. For uncertainty quantification and probabilistic statistical analysis, a library of patterns with descriptive statistics for each file would be beneficial, but such a library does not presently exist. Of the possible techniques used to generate patterns, the two most prominent are the delta and least squares regression methods. We explore the differences and statistical significance between patterns generated by each method and assess performance of the generated patterns across methods and scenarios. Differences in patterns across seasons between methods and epochs were largest in high latitudes (60-90° N/S). Bias and mean errors between modeled and pattern-predicted output from the linear regression method were smaller than those of patterns generated by the delta method. Across scenarios, differences in the linear regression method patterns were more statistically significant, especially at high latitudes. We found that pattern generation methodologies were able to approximate the forced signal of change to within 0.5 °C, but the choice of pattern generation methodology for pattern scaling purposes should be informed by user goals and criteria. This paper describes our library of least squares regression patterns from all CMIP5 models for temperature and precipitation on an annual and sub-annual basis, along with the code used to generate these patterns. The dataset and netCDF data generation code are available at doi:10.5281/zenodo.495632.
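
    The least squares regression method mentioned above can be sketched per gridcell: regress the local variable on global mean temperature and keep the slope as the pattern. The grid size, noise levels and variable names below are illustrative assumptions, not the CMIP5 workflow:

```python
import numpy as np

def regression_pattern(local, global_T):
    """Per-gridcell least squares slope of local change on global mean
    temperature (the regression method; the delta method instead divides
    an epoch-mean difference by the global mean change)."""
    g = global_T - global_T.mean()
    L = local - local.mean(axis=0)
    # slope_ij = cov(local_ij, global) / var(global), vectorized over the grid
    return np.tensordot(g, L, axes=(0, 0)) / np.sum(g ** 2)

# synthetic example: 100 years of a 4x5 grid with known per-cell scaling factors
rng = np.random.default_rng(6)
global_T = np.linspace(0.0, 3.0, 100) + rng.normal(scale=0.1, size=100)
true = rng.uniform(0.5, 2.0, size=(4, 5))
local = global_T[:, None, None] * true + rng.normal(scale=0.2, size=(100, 4, 5))
pattern = regression_pattern(local, global_T)
print(np.abs(pattern - true).max() < 0.2)          # recovers the scaling factors
```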

  16. Training in metabolomics research. II. Processing and statistical analysis of metabolomics data, metabolite identification, pathway analysis, applications of metabolomics and its future

    PubMed Central

    Barnes, Stephen; Benton, H. Paul; Casazza, Krista; Cooper, Sara; Cui, Xiangqin; Du, Xiuxia; Engler, Jeffrey; Kabarowski, Janusz H.; Li, Shuzhao; Pathmasiri, Wimal; Prasain, Jeevan K.; Renfrow, Matthew B.; Tiwari, Hemant K.

    2017-01-01

    Metabolomics, a systems biology discipline representing analysis of known and unknown pathways of metabolism, has grown tremendously over the past 20 years. Because of its comprehensive nature, metabolomics requires careful consideration of the question(s) being asked, the scale needed to answer the question(s), collection and storage of the sample specimens, methods for extraction of the metabolites from biological matrices, the analytical method(s) to be employed and the quality control of the analyses, how collected data are correlated, the statistical methods to determine metabolites undergoing significant change, putative identification of metabolites, and the use of stable isotopes to aid in verifying metabolite identity and establishing pathway connections and fluxes. This second part of a comprehensive description of the methods of metabolomics focuses on data analysis, emerging methods in metabolomics and the future of this discipline. PMID:28239968

  17. Reconnection and Bursty Bulk Flow Associated Turbulence in the Earth's Plasma Sheet

    NASA Astrophysics Data System (ADS)

    Voros, Z.; Nakamura, R.; Baumjohann, W.; Runov, A.; Volwerk, M.; Jankovicova, D.; Balogh, A.; Klecker, B.

    2006-12-01

    Reconnection related fast flows in the Earth's plasma sheet can be associated with several accompanying phenomena, such as magnetic field dipolarization, current sheet thinning and turbulence. Statistical analysis of the multi-scale properties of turbulence helps in understanding the interaction of the plasma flow with the dipolar magnetic field and in recognizing the remote or nearby temporal and spatial characteristics of reconnection. The main emphasis of this presentation is on differentiating between the specific statistical features of flow-associated fluctuations at different distances from the reconnection site.

  18. Hydrological responses to dynamically and statistically downscaled climate model output

    USGS Publications Warehouse

    Wilby, R.L.; Hay, L.E.; Gutowski, W.J.; Arritt, R.W.; Takle, E.S.; Pan, Z.; Leavesley, G.H.; Clark, M.P.

    2000-01-01

    Daily rainfall and surface temperature series were simulated for the Animas River basin, Colorado using dynamically and statistically downscaled output from the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis. A distributed hydrological model was then applied to the downscaled data. Relative to raw NCEP output, downscaled climate variables provided more realistic simulations of basin-scale hydrology. However, the results highlight the sensitivity of modeled processes to the choice of downscaling technique and point to the need for caution when interpreting future hydrological scenarios.

  19. Influence of the Time Scale on the Construction of Financial Networks

    PubMed Central

    Emmert-Streib, Frank; Dehmer, Matthias

    2010-01-01

    Background In this paper we investigate the definition and formation of financial networks. Specifically, we study the influence of the time scale on their construction. Methodology/Principal Findings For our analysis we use correlation-based networks obtained from the daily closing prices of stock market data. More precisely, we use the stocks that currently comprise the Dow Jones Industrial Average (DJIA) and estimate financial networks where nodes correspond to stocks and edges correspond to non-vanishing correlation coefficients; that is, an edge is included in the network only if the corresponding correlation coefficient is statistically significantly different from zero. This construction procedure results in unweighted, undirected networks. By separating the time series of stock prices into non-overlapping intervals, we obtain one network per interval. The length of these intervals corresponds to the time scale of the data, whose influence on the construction of the networks is studied in this paper. Conclusions/Significance Numerical analysis of four different measures in dependence on the time scale for network construction allows us to gain insights about the intrinsic time scale of the stock market with respect to a meaningful graph-theoretical analysis. PMID:20949124
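
    The construction procedure described above can be sketched as follows: compute pairwise Pearson correlations of returns within an interval and keep an edge only when a t-test rejects zero correlation. A stdlib sketch with synthetic data (the tickers, return values, and the approximate critical value t_crit are all assumptions, not DJIA data):

```python
# Correlation-network construction sketch: nodes are stocks, and an
# undirected, unweighted edge is added only when the return correlation
# is significantly different from zero (t-test on the Pearson r).
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((v - mx) ** 2 for v in x))
    sy = math.sqrt(sum((v - my) ** 2 for v in y))
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return sxy / (sx * sy)

def significant_edge(x, y, t_crit=2.3):
    # t statistic for H0: rho = 0; t_crit ~ 2.3 approximates the
    # two-sided 5% critical value for the small df (= n - 2 = 8) here.
    r = pearson(x, y)
    t = r * math.sqrt(len(x) - 2) / math.sqrt(1 - r * r)
    return abs(t) > t_crit

# Daily returns for three hypothetical stocks within one interval.
returns = {
    "AAA": [0.01, -0.02, 0.03, 0.00, 0.02, -0.01, 0.01, -0.02, 0.02, 0.01],
    "BBB": [0.012, -0.018, 0.028, 0.001, 0.021, -0.012, 0.009, -0.02, 0.02, 0.01],
    "CCC": [-0.005, 0.01, -0.01, 0.02, -0.02, 0.0, 0.015, -0.01, 0.005, -0.015],
}

tickers = sorted(returns)
edges = {(a, b) for i, a in enumerate(tickers) for b in tickers[i + 1:]
         if significant_edge(returns[a], returns[b])}
```

    Repeating this over each non-overlapping interval yields one network per interval, with the interval length playing the role of the time scale studied in the paper.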

  20. ReSeqTools: an integrated toolkit for large-scale next-generation sequencing based resequencing analysis.

    PubMed

    He, W; Zhao, S; Liu, X; Dong, S; Lv, J; Liu, D; Wang, J; Meng, Z

    2013-12-04

    Large-scale next-generation sequencing (NGS)-based resequencing detects sequence variations, constructs evolutionary histories, and identifies phenotype-related genotypes. However, NGS-based resequencing studies generate extraordinarily large amounts of data, making computations difficult. Effective use and analysis of these data for NGS-based resequencing studies remains a difficult task for individual researchers. Here, we introduce ReSeqTools, a full-featured toolkit for NGS (Illumina sequencing)-based resequencing analysis, which processes raw data, interprets mapping results, and identifies and annotates sequence variations. ReSeqTools provides abundant scalable functions for routine resequencing analysis in different modules to facilitate customization of the analysis pipeline. ReSeqTools is designed to use compressed data files as input or output to save storage space and facilitates faster and more computationally efficient large-scale resequencing studies in a user-friendly manner. It offers abundant practical functions and generates useful statistics during the analysis pipeline, which significantly simplifies resequencing analysis. Its integrated algorithms and abundant sub-functions provide a solid foundation for special demands in resequencing projects. Users can combine these functions to construct their own pipelines for other purposes.

  1. HiQuant: Rapid Postquantification Analysis of Large-Scale MS-Generated Proteomics Data.

    PubMed

    Bryan, Kenneth; Jarboui, Mohamed-Ali; Raso, Cinzia; Bernal-Llinares, Manuel; McCann, Brendan; Rauch, Jens; Boldt, Karsten; Lynn, David J

    2016-06-03

    Recent advances in mass-spectrometry-based proteomics are now facilitating ambitious large-scale investigations of the spatial and temporal dynamics of the proteome; however, the increasing size and complexity of these data sets is overwhelming current downstream computational methods, specifically those that support the postquantification analysis pipeline. Here we present HiQuant, a novel application that enables the design and execution of a postquantification workflow, including common data-processing steps, such as assay normalization and grouping, and experimental replicate quality control and statistical analysis. HiQuant also enables the interpretation of results generated from large-scale data sets by supporting interactive heatmap analysis and also the direct export to Cytoscape and Gephi, two leading network analysis platforms. HiQuant may be run via a user-friendly graphical interface and also supports complete one-touch automation via a command-line mode. We evaluate HiQuant's performance by analyzing a large-scale, complex interactome mapping data set and demonstrate a 200-fold improvement in the execution time over current methods. We also demonstrate HiQuant's general utility by analyzing proteome-wide quantification data generated from both a large-scale public tyrosine kinase siRNA knock-down study and an in-house investigation into the temporal dynamics of the KSR1 and KSR2 interactomes. Download HiQuant, sample data sets, and supporting documentation at http://hiquant.primesdb.eu .

  2. Meta-analysis methods for combining multiple expression profiles: comparisons, statistical characterization and an application guideline

    PubMed Central

    2013-01-01

    Background As high-throughput genomic technologies become accurate and affordable, an increasing number of data sets have been accumulated in the public domain and genomic information integration and meta-analysis have become routine in biomedical research. In this paper, we focus on microarray meta-analysis, where multiple microarray studies with relevant biological hypotheses are combined in order to improve candidate marker detection. Many methods have been developed and applied in the literature, but their performance and properties have only been minimally investigated. There is currently no clear conclusion or guideline as to the proper choice of a meta-analysis method given an application; the decision essentially requires both statistical and biological considerations. Results We applied 12 microarray meta-analysis methods to combine multiple simulated expression profiles; these methods can be categorized by hypothesis setting: (1) HS(A): DE genes with non-zero effect sizes in all studies, (2) HS(B): DE genes with non-zero effect sizes in one or more studies and (3) HS(r): DE genes with non-zero effect sizes in the "majority" of studies. We then performed a comprehensive comparative analysis through six large-scale real applications using four quantitative statistical evaluation criteria: detection capability, biological association, stability and robustness. We elucidated hypothesis settings behind the methods and further applied multi-dimensional scaling (MDS) and an entropy measure to characterize the meta-analysis methods and data structure, respectively. Conclusions The aggregated results from the simulation study categorized the 12 methods into three hypothesis settings (HS(A), HS(B), and HS(r)). Evaluation in real data and results from MDS and entropy analyses provided an insightful and practical guideline to the choice of the most suitable method in a given application. 
All source files for simulation and real data are available on the author’s publication website. PMID:24359104
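
    One classic member of this family of methods, suited to the HS(B) setting (signal in one or more studies), is Fisher's p-value combination. A minimal stdlib sketch with invented per-study p-values:

```python
# Fisher's method: combine k per-study p-values for one gene into a
# single p-value. Under H0, X = -2 * sum(ln p_i) ~ chi-square, df = 2k.
import math

def fisher_combined_p(pvalues):
    """Combined p-value via Fisher's method."""
    k = len(pvalues)
    x = -2.0 * sum(math.log(p) for p in pvalues)
    # The chi-square survival function has a closed form for even df = 2k:
    # P(X > x) = exp(-x/2) * sum_{i=0}^{k-1} (x/2)^i / i!
    half = x / 2.0
    return math.exp(-half) * sum(half ** i / math.factorial(i) for i in range(k))

# A gene differentially expressed in most of four hypothetical studies:
p_combined = fisher_combined_p([0.01, 0.04, 0.03, 0.20])
```

    Because a single very small p-value can dominate the sum, Fisher's method detects genes with signal in only some studies, which is exactly the HS(B) behavior contrasted with HS(A)-style methods such as the maximum p-value statistic.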

  3. Meta-analysis methods for combining multiple expression profiles: comparisons, statistical characterization and an application guideline.

    PubMed

    Chang, Lun-Ching; Lin, Hui-Min; Sibille, Etienne; Tseng, George C

    2013-12-21

    As high-throughput genomic technologies become accurate and affordable, an increasing number of data sets have been accumulated in the public domain and genomic information integration and meta-analysis have become routine in biomedical research. In this paper, we focus on microarray meta-analysis, where multiple microarray studies with relevant biological hypotheses are combined in order to improve candidate marker detection. Many methods have been developed and applied in the literature, but their performance and properties have only been minimally investigated. There is currently no clear conclusion or guideline as to the proper choice of a meta-analysis method given an application; the decision essentially requires both statistical and biological considerations. We applied 12 microarray meta-analysis methods to combine multiple simulated expression profiles; these methods can be categorized by hypothesis setting: (1) HS(A): DE genes with non-zero effect sizes in all studies, (2) HS(B): DE genes with non-zero effect sizes in one or more studies and (3) HS(r): DE genes with non-zero effect sizes in the "majority" of studies. We then performed a comprehensive comparative analysis through six large-scale real applications using four quantitative statistical evaluation criteria: detection capability, biological association, stability and robustness. We elucidated hypothesis settings behind the methods and further applied multi-dimensional scaling (MDS) and an entropy measure to characterize the meta-analysis methods and data structure, respectively. The aggregated results from the simulation study categorized the 12 methods into three hypothesis settings (HS(A), HS(B), and HS(r)). Evaluation in real data and results from MDS and entropy analyses provided an insightful and practical guideline to the choice of the most suitable method in a given application. 
All source files for simulation and real data are available on the author's publication website.

  4. Toward a Better Understanding of the Relationship between Belief in the Paranormal and Statistical Bias: The Potential Role of Schizotypy

    PubMed Central

    Dagnall, Neil; Denovan, Andrew; Drinkwater, Kenneth; Parker, Andrew; Clough, Peter

    2016-01-01

    The present paper examined relationships between schizotypy (measured by the Oxford-Liverpool Inventory of Feelings and Experience; O-LIFE scale brief), belief in the paranormal (assessed via the Revised Paranormal Belief Scale; RPBS) and proneness to statistical bias (i.e., perception of randomness and susceptibility to conjunction fallacy). Participants were 254 volunteers recruited via convenience sampling. Probabilistic reasoning problems appeared framed within both standard and paranormal contexts. Analysis revealed positive correlations between the Unusual Experience (UnExp) subscale of O-LIFE and paranormal belief measures [RPBS full scale, traditional paranormal beliefs (TPB) and new age philosophy]. Performance on standard problems correlated negatively with UnExp and belief in the paranormal (particularly the TPB dimension of the RPBS). Consideration of specific problem types revealed that perception of randomness associated more strongly with belief in the paranormal than conjunction; both problem types related similarly to UnExp. Structural equation modeling specified that belief in the paranormal mediated the indirect relationship between UnExp and statistical bias. For problems presented in a paranormal context a framing effect occurred. Whilst UnExp correlated positively with conjunction proneness (controlling for perception of randomness), there was no association between UnExp and perception of randomness (controlling for conjunction). PMID:27471481

  5. Chlorophyll a and inorganic suspended solids in backwaters of the upper Mississippi River system: Backwater lake effects and their associations with selected environmental predictors

    USGS Publications Warehouse

    Rogala, James T.; Gray, Brian R.

    2006-01-01

    The Long Term Resource Monitoring Program (LTRMP) uses a stratified random sampling design to obtain water quality statistics within selected study reaches of the Upper Mississippi River System (UMRS). LTRMP sampling strata are based on aquatic area types generally found in large rivers (e.g., main channel, side channel, backwater, and impounded areas). For hydrologically well-mixed strata (i.e., main channel), variance associated with spatial scales smaller than the strata scale is a relatively minor issue for many water quality parameters. However, analysis of LTRMP water quality data has shown that within-strata variability at the strata scale is high in off-channel areas (i.e., backwaters). A portion of that variability may be associated with differences among individual backwater lakes (i.e., small and large backwater regions separated by channels) that cumulatively make up the backwater stratum. The objective of the statistical modeling presented here is to determine if differences among backwater lakes account for a large portion of the variance observed in the backwater stratum for selected parameters. If variance associated with backwater lakes is high, then inclusion of backwater lake effects within statistical models is warranted. Further, lakes themselves may represent natural experimental units where associations of interest to management may be estimated.
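
    The variance-partitioning question posed above (do lake-to-lake differences account for a large share of within-stratum variance?) can be sketched as a one-way between/within sum-of-squares decomposition. The measurements below are invented for illustration, not LTRMP data:

```python
# Between/within variance decomposition: what fraction of the total
# sum of squares in a stratum is explained by differences among group
# (backwater lake) means?

def variance_components(groups):
    """Fraction of total sum of squares attributable to group means."""
    all_vals = [v for g in groups for v in g]
    grand = sum(all_vals) / len(all_vals)
    ss_total = sum((v - grand) ** 2 for v in all_vals)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    return ss_between / ss_total

# Hypothetical chlorophyll-like measurements from three backwater lakes:
lakes = [[10, 12, 11, 9], [25, 27, 26, 24], [15, 16, 14, 17]]
frac_between = variance_components(lakes)
```

    A fraction near one, as in this toy example, is the situation where including lake effects in the statistical model is clearly warranted.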

  6. Confirmatory Factor Analysis of the Malay Version of the Confusion, Hubbub and Order Scale (CHAOS-6) among Myocardial Infarction Survivors in a Malaysian Cardiac Healthcare Facility

    PubMed Central

    Ganasegeran, Kurubaran; Selvaraj, Kamaraj; Rashid, Abdul

    2017-01-01

    Background The six item Confusion, Hubbub and Order Scale (CHAOS-6) has been validated as a reliable tool to measure levels of household disorder. We aimed to investigate the goodness of fit and reliability of a new Malay version of the CHAOS-6. Methods The original English version of the CHAOS-6 underwent forward-backward translation into the Malay language. The finalised Malay version was administered to 105 myocardial infarction survivors in a Malaysian cardiac health facility. We performed confirmatory factor analyses (CFAs) using structural equation modelling. A path diagram and fit statistics were yielded to determine the Malay version’s validity. Composite reliability was tested to determine the scale’s reliability. Results All 105 myocardial infarction survivors participated in the study. The CFA yielded a six-item, one-factor model with excellent fit statistics. Composite reliability for the single factor CHAOS-6 was 0.65, confirming that the scale is reliable for Malay speakers. Conclusion The Malay version of the CHAOS-6 was reliable and showed the best fit statistics for our study sample. We thus offer a simple, brief, validated, reliable and novel instrument to measure chaos, the Skala Kecelaruan, Keriuhan & Tertib Terubahsuai (CHAOS-6), for the Malaysian population. PMID:28951688

  7. A statistical study on the F2 layer vertical variation during nighttime medium-scale traveling ionospheric disturbances

    NASA Astrophysics Data System (ADS)

    Ssessanga, Nicholas; Kim, Yong Ha; Jeong, Se-Heon

    2017-03-01

    A statistical study on the relationship between the perturbation component (ΔTEC (total electron content)) and the F2 layer peak height (hmF2) during nighttime medium-scale traveling ionospheric disturbances is presented. The results are obtained by using a time-dependent computerized ionospheric tomography (CIT) technique. This was realized by using slant total electron content observations from a dense Global Positioning System receiver network over Japan (with more than 1000 receivers), together with a multiplicative algebraic reconstruction technique. Reconstructions from CIT were validated by using ionosonde and occultation measurements. A total of 36 different time snapshots of the ionosphere in which medium-scale traveling ionospheric disturbances (MSTIDs) were evident were analyzed. These were obtained from a data set covering the years 2011 to 2014. The reconstructed surface wavefronts of ΔTEC and hmF2 structure were found to be aligned along the northwest-southeast direction. These results confirm that nighttime MSTIDs are driven by electrodynamic forces related to the Perkins instability, which explains the northwest-southeast wavefront alignment based on the F region electrodynamics. Furthermore, from the statistical analysis, hmF2 varied quasiperiodically in altitude with dominant peak-to-peak amplitudes between 10 and 40 km. In addition, ΔTEC and hmF2 were 60% anticorrelated.

  8. Toward a Better Understanding of the Relationship between Belief in the Paranormal and Statistical Bias: The Potential Role of Schizotypy.

    PubMed

    Dagnall, Neil; Denovan, Andrew; Drinkwater, Kenneth; Parker, Andrew; Clough, Peter

    2016-01-01

    The present paper examined relationships between schizotypy (measured by the Oxford-Liverpool Inventory of Feelings and Experience; O-LIFE scale brief), belief in the paranormal (assessed via the Revised Paranormal Belief Scale; RPBS) and proneness to statistical bias (i.e., perception of randomness and susceptibility to conjunction fallacy). Participants were 254 volunteers recruited via convenience sampling. Probabilistic reasoning problems appeared framed within both standard and paranormal contexts. Analysis revealed positive correlations between the Unusual Experience (UnExp) subscale of O-LIFE and paranormal belief measures [RPBS full scale, traditional paranormal beliefs (TPB) and new age philosophy]. Performance on standard problems correlated negatively with UnExp and belief in the paranormal (particularly the TPB dimension of the RPBS). Consideration of specific problem types revealed that perception of randomness associated more strongly with belief in the paranormal than conjunction; both problem types related similarly to UnExp. Structural equation modeling specified that belief in the paranormal mediated the indirect relationship between UnExp and statistical bias. For problems presented in a paranormal context a framing effect occurred. Whilst UnExp correlated positively with conjunction proneness (controlling for perception of randomness), there was no association between UnExp and perception of randomness (controlling for conjunction).

  9. Randomized central limit theorems: A unified theory.

    PubMed

    Eliazar, Iddo; Klafter, Joseph

    2010-08-01

    The central limit theorems (CLTs) characterize the macroscopic statistical behavior of large ensembles of independent and identically distributed random variables. The CLTs assert that the universal probability laws governing ensembles' aggregate statistics are either Gaussian or Lévy, and that the universal probability laws governing ensembles' extreme statistics are Fréchet, Weibull, or Gumbel. The scaling schemes underlying the CLTs are deterministic-scaling all ensemble components by a common deterministic scale. However, there are "random environment" settings in which the underlying scaling schemes are stochastic-scaling the ensemble components by different random scales. Examples of such settings include Holtsmark's law for gravitational fields and the Stretched Exponential law for relaxation times. In this paper we establish a unified theory of randomized central limit theorems (RCLTs)-in which the deterministic CLT scaling schemes are replaced with stochastic scaling schemes-and present "randomized counterparts" to the classic CLTs. The RCLT scaling schemes are shown to be governed by Poisson processes with power-law statistics, and the RCLTs are shown to universally yield the Lévy, Fréchet, and Weibull probability laws.
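
    The deterministic scaling scheme described above can be illustrated by Monte Carlo: block maxima of heavy-tailed Pareto samples, scaled by the common deterministic factor n^(1/α), approach a Fréchet law. A sketch with arbitrarily chosen parameters (α, block size, and block count are all illustrative):

```python
# Extreme-value CLT illustration: maxima of n iid Pareto(alpha) samples,
# scaled by n^(1/alpha), converge in distribution to Frechet(alpha).
import random
import statistics

random.seed(42)
ALPHA = 2.0      # Pareto tail index: P(X > x) = x^(-alpha) for x >= 1
N = 2000         # ensemble (block) size
BLOCKS = 400     # number of independent ensembles

def pareto_sample():
    # Inverse-transform sampling: X = U^(-1/alpha).
    return random.random() ** (-1.0 / ALPHA)

# Scale each block maximum by the deterministic factor n^(1/alpha).
scaled_maxima = [max(pareto_sample() for _ in range(N)) / N ** (1.0 / ALPHA)
                 for _ in range(BLOCKS)]

# The Frechet(alpha=2) median is (1/ln 2)^(1/2) ~ 1.20; the empirical
# median of the scaled maxima should land in that neighborhood.
median = statistics.median(scaled_maxima)
```

    Replacing the common factor n^(1/α) with a different random scale per block is the stochastic-scaling setting the paper's randomized counterparts address.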

  10. Randomized central limit theorems: A unified theory

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo; Klafter, Joseph

    2010-08-01

    The central limit theorems (CLTs) characterize the macroscopic statistical behavior of large ensembles of independent and identically distributed random variables. The CLTs assert that the universal probability laws governing ensembles’ aggregate statistics are either Gaussian or Lévy, and that the universal probability laws governing ensembles’ extreme statistics are Fréchet, Weibull, or Gumbel. The scaling schemes underlying the CLTs are deterministic—scaling all ensemble components by a common deterministic scale. However, there are “random environment” settings in which the underlying scaling schemes are stochastic—scaling the ensemble components by different random scales. Examples of such settings include Holtsmark’s law for gravitational fields and the Stretched Exponential law for relaxation times. In this paper we establish a unified theory of randomized central limit theorems (RCLTs)—in which the deterministic CLT scaling schemes are replaced with stochastic scaling schemes—and present “randomized counterparts” to the classic CLTs. The RCLT scaling schemes are shown to be governed by Poisson processes with power-law statistics, and the RCLTs are shown to universally yield the Lévy, Fréchet, and Weibull probability laws.

  11. Scientific, statistical, practical, and regulatory considerations in design space development.

    PubMed

    Debevec, Veronika; Srčič, Stanko; Horvat, Matej

    2018-03-01

    The quality by design (QbD) paradigm guides the pharmaceutical industry towards improved understanding of products and processes, and at the same time facilitates a high degree of manufacturing and regulatory flexibility throughout the establishment of the design space. This review article presents scientific, statistical and regulatory considerations in design space development. All key development milestones, starting with planning, selection of factors, experimental execution, data analysis, model development and assessment, verification, and validation, and ending with design space submission, are presented and discussed. The focus is especially on frequently ignored topics, like management of factors and CQAs that will not be included in experimental design, evaluation of risk of failure on design space edges, or modeling scale-up strategy. Moreover, development of a design space that is independent of manufacturing scale is proposed as the preferred approach.
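
    The selection-of-factors and experimental-execution milestones above typically start from a designed experiment. A minimal sketch generating a two-level full factorial over three hypothetical process factors (all factor names and levels are invented for illustration):

```python
# Two-level full factorial design generation: every run assigns each
# factor either its low or its high level, covering all 2^k corners of
# the design space.
from itertools import product

factors = {
    "mixing_speed_rpm": (200, 400),      # (low, high)
    "granulation_time_min": (5, 15),
    "binder_pct": (2.0, 4.0),
}

runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
# A 2^3 design yields 8 runs; responses measured at these runs feed the
# model development and design space assessment steps.
```

    Fractional designs or response surface designs would follow the same pattern with a reduced or augmented run list, which is where the risk evaluation at design space edges discussed above comes in.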

  12. Complex Networks, Fractals and Topology Trends for Oxidative Activity of DNA in Cells for Populations of Fluorescing Neutrophils in Medical Diagnostics

    NASA Astrophysics Data System (ADS)

    Galich, N. E.

    A novel nonlinear statistical method of immunofluorescence data analysis is presented. The data of DNA fluorescence due to oxidative activity in the nuclei of peripheral blood neutrophils are analyzed. Histograms of photon counting statistics are generated using the flow cytometry method. The histograms represent the distributions of fluorescence flash frequency as functions of intensity for large populations (~10^4-10^5) of fluorescing cells. We show that these experiments capture 3D correlations of oxidative activity of DNA for the full chromosome set in cells, with a spatial resolution of a few nanometers in the flow direction of the jet of blood. Detailed analysis showed that large-scale correlations in oxidative activity of DNA in cells are described as small-world networks (complex systems with logarithmic scaling), with a distinct small-world network for a given donor at a given time for all states of health. We observed changes in the fractal networks of oxidative activity of DNA in neutrophils in vivo and during medical treatments, for classification and diagnostics of pathologies across a wide spectrum of diseases. Our approach is based on analysis of changes in network topology (fractal dimension) as the scale of the network is varied. We produce a general estimate of the health status of a given donor as a yes/no (healthy/sick) answer, depending on the sign (plus/minus) of the trend in fractal dimension as the scale of the net decreases. We noted increasing biodiversity of neutrophils and the stochastic (Brownian) character of intercellular correlations of different neutrophils in the blood of healthy donors. In the blood of sick people we observed deterministic cell-cell correlations of neutrophils and decreasing biodiversity.

  13. Relationships Among Use of Complementary and Alternative Interventions, Urinary Incontinence, Quality of Life, and Self-esteem in Women With Urinary Incontinence.

    PubMed

    Öz, Özge; Altay, Birsen

    The purpose of this study was to examine associations among sociodemographic characteristics, urinary incontinence (UI) characteristics, UI-specific quality of life and self-esteem, and use of complementary and alternative medicine (CAM) interventions for UI. Correlational-descriptive research. The sample comprised 394 female patients 18 years or older cared for in the urology and gynecology outpatient clinics of a university hospital in Samsun, Turkey. Participants completed an investigator-developed questionnaire that included 2 validated instruments, King's Health Questionnaire (KHQ) and the Rosenberg Self-esteem Scale. Descriptive statistics were used for demographic data and use of CAM interventions. Variables associated with CAM use were assessed using χ² analysis. Differences in KHQ and Rosenberg Self-esteem Scale scores between CAM users and nonusers were assessed using the t test, and the relationship between the KHQ and the Rosenberg Self-esteem Scale was assessed using correlation analysis. Thirty-three percent (n = 130) of women indicated using CAM interventions to manage their UI. The most common CAM intervention, reported by 52.6% of respondents, was prayer. Women with lower UI-specific quality of life and self-esteem scores were more likely to report using CAM interventions (P < .05). Women with a lower education level used CAM more frequently than others (P < .05). Analysis revealed weak but statistically significant positive correlations for the role limitations, physical limitations, social limitations, emotions, sleep/energy level and symptom severity (P < .001), and personal relationships (P < .01) subdimensions of the KHQ. One-third of women indicated using CAM methods to manage their UI; the most commonly used intervention was prayer. Women using CAM reported both higher self-esteem and condition-specific health-related quality of life than women who did not use these interventions.
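
    The χ² analysis of CAM use against a binary characteristic can be sketched for a 2×2 contingency table; the counts below are invented for illustration and are not the study's data:

```python
# Pearson chi-square test of independence for a 2x2 contingency table,
# using the shortcut form n*(ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d)).

def chi_square_2x2(table):
    """Chi-square statistic for a 2x2 table [[a, b], [c, d]]."""
    (a, b), (c, d) = table
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Rows: lower vs. higher education; columns: CAM users vs. non-users.
# Invented counts summing to a sample of 394.
table = [[80, 120],
         [50, 144]]
chi2 = chi_square_2x2(table)
# Compare against the df = 1 critical value 3.841 for p < .05.
significant = chi2 > 3.841
```

    With larger tables or expected cell counts below about five, one would switch to the general r×c formula or an exact test rather than this 2×2 shortcut.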

  14. Topology Analysis of the Sloan Digital Sky Survey. I. Scale and Luminosity Dependence

    NASA Astrophysics Data System (ADS)

    Park, Changbom; Choi, Yun-Young; Vogeley, Michael S.; Gott, J. Richard, III; Kim, Juhan; Hikage, Chiaki; Matsubara, Takahiko; Park, Myeong-Gu; Suto, Yasushi; Weinberg, David H.; SDSS Collaboration

    2005-11-01

    We measure the topology of volume-limited galaxy samples selected from a parent sample of 314,050 galaxies in the Sloan Digital Sky Survey (SDSS), which is now complete enough to describe the fully three-dimensional topology and its dependence on galaxy properties. We compare the observed genus statistic G(νf) to predictions for a Gaussian random field and to the genus measured for mock surveys constructed from new large-volume simulations of the ΛCDM cosmology. In this analysis we carefully examine the dependence of the observed genus statistic on the Gaussian smoothing scale RG from 3.5 to 11 h-1 Mpc and on the luminosity of galaxies over the range -22.50

  15. The development of an instrument to measure quality of vision: the Quality of Vision (QoV) questionnaire.

    PubMed

    McAlinden, Colm; Pesudovs, Konrad; Moore, Jonathan E

    2010-11-01

    To develop an instrument to measure subjective quality of vision: the Quality of Vision (QoV) questionnaire. A 30-item instrument was designed with 10 symptoms rated in each of three scales (frequency, severity, and bothersome). The QoV was completed by 900 subjects in groups of spectacle wearers, contact lens wearers, and those having had laser refractive surgery, intraocular refractive surgery, or eye disease and investigated with Rasch analysis and traditional statistics. Validity and reliability were assessed by Rasch fit statistics, principal components analysis (PCA), person separation, differential item functioning (DIF), item targeting, construct validity (correlation with visual acuity, contrast sensitivity, total root mean square [RMS] higher order aberrations [HOA]), and test-retest reliability (two-way random intraclass correlation coefficients [ICC] and 95% repeatability coefficients [R(c)]). Rasch analysis demonstrated good precision, reliability, and internal consistency for all three scales (mean square infit and outfit within 0.81-1.27; PCA >60% variance explained by the principal component; person separation 2.08, 2.10, and 2.01 respectively; and minimal DIF). Construct validity was indicated by strong correlations with visual acuity, contrast sensitivity and RMS HOA. Test-retest reliability was evidenced by a minimum ICC of 0.867 and a minimum 95% R(c) of 1.55 units. The QoV Questionnaire consists of a Rasch-tested, linear-scaled, 30-item instrument on three scales providing a QoV score in terms of symptom frequency, severity, and bothersome. It is suitable for measuring QoV in patients with all types of refractive correction, eye surgery, and eye disease that cause QoV problems.

  16. Searching for forcing signatures in decadal patterns of shoreline change

    NASA Astrophysics Data System (ADS)

    Burningham, H.; French, J.

    2016-12-01

    Analysis of shoreline position at spatial scales of the order 10 - 100 km and at a multi-decadal time-scale has the potential to reveal regional coherence (or lack of) in the primary controls on shoreline tendencies and trends. Such information is extremely valuable for the evaluation of climate forcing on coastal behaviour. Segmenting a coast into discrete behaviour units based on these types of analyses is often subjective, however, and in the context of pervasive human interventions and alongshore variability in ocean climate, determining the most important controls on shoreline dynamics can be challenging. Multivariate analyses provide one means to resolve common behaviours across shoreline position datasets, thereby underpinning a more objective evaluation of possible coupling between shorelines at different scales. In an analysis of the Suffolk coast (eastern England) we explore the use of multivariate statistics to understand and classify mesoscale coastal behaviour. Suffolk comprises a relatively linear shoreline that shifts from east-facing in the north to southeast-facing in the south. Although primarily formed of a beach foreshore backed by cliffs or shingle barrier, the shoreline is punctuated at 3 locations by narrow tidal inlets with offset entrances that imply a persistent north to south sediment transport direction. Tidal regime decreases south to north from mesotidal (3.6m STR) to microtidal (1.9m STR), and the bimodal wave climate (northeast and southwest modes) presents complex local-scale variability in nearshore conditions. Shorelines exhibit a range of decadal behaviours from rapid erosion (up to 4m/yr) to quasi-stability that cannot be directly explained by the spatial organisation of contemporary landforms or coastal defences. A multivariate statistical approach to shoreline change analysis helps to define the key modes of change and determine the most likely forcing factors.

  17. The minimal SUSY B - L model: simultaneous Wilson lines and string thresholds

    DOE PAGES

    Deen, Rehan; Ovrut, Burt A.; Purves, Austin

    2016-07-08

    In previous work, we presented a statistical scan over the soft supersymmetry breaking parameters of the minimal SUSY B - L model. For specificity of calculation, unification of the gauge parameters was enforced by allowing the two Z3×Z3 Wilson lines to have mass scales separated by approximately an order of magnitude. This introduced an additional “left-right” sector below the unification scale. In this paper, for three important reasons, we modify our previous analysis by demanding that the mass scales of the two Wilson lines be simultaneous and equal to an “average unification” mass ⟨M_U⟩. The present analysis is 1) more “natural” than the previous calculations, which were only valid in a very specific region of the Calabi-Yau moduli space, 2) the theory is conceptually simpler in that the left-right sector has been removed and 3) in the present analysis the lack of gauge unification is due to threshold effects — particularly heavy string thresholds, which we calculate statistically in detail. As in our previous work, the theory is renormalization group evolved from ⟨M_U⟩ to the electroweak scale — being subjected, sequentially, to the requirement of radiative B - L and electroweak symmetry breaking, the present experimental lower bounds on the B - L vector boson and sparticle masses, as well as the lightest neutral Higgs mass of ~125 GeV. The subspace of soft supersymmetry breaking masses that satisfies all such constraints is presented and shown to be substantial.

  18. A comparison of three methods of assessing differential item functioning (DIF) in the Hospital Anxiety Depression Scale: ordinal logistic regression, Rasch analysis and the Mantel chi-square procedure.

    PubMed

    Cameron, Isobel M; Scott, Neil W; Adler, Mats; Reid, Ian C

    2014-12-01

    It is important for clinical practice and research that measurement scales of well-being and quality of life exhibit only minimal differential item functioning (DIF). DIF occurs where different groups of people endorse items in a scale to different extents after being matched by the intended scale attribute. We investigate the equivalence or otherwise of common methods of assessing DIF. Three methods of measuring age- and sex-related DIF (ordinal logistic regression, Rasch analysis and the Mantel χ² procedure) were applied to Hospital Anxiety Depression Scale (HADS) data pertaining to a sample of 1,068 patients consulting primary care practitioners. Three items were flagged by all three approaches as having either age- or sex-related DIF with a consistent direction of effect; a further three items identified did not meet stricter criteria for important DIF using at least one method. When applying strict criteria for significant DIF, ordinal logistic regression was slightly less sensitive. Ordinal logistic regression, Rasch analysis and contingency table methods yielded consistent results when identifying DIF in the HADS depression and HADS anxiety scales. Regardless of methods applied, investigators should use a combination of statistical significance, magnitude of the DIF effect and investigator judgement when interpreting the results.
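    Of the three DIF methods named above, the Mantel χ² procedure is the simplest to sketch. The toy implementation below (an illustration, not the authors' code; the simulated data and all names are assumptions) stratifies examinees on a matching variable and compares the focal group's observed item-score sum with its expectation under no DIF:

    ```python
    import numpy as np
    from scipy.stats import chi2

    def mantel_dif(item, group, matching, n_strata=5):
        """Mantel chi-square DIF statistic (1 df) for an ordinal item.

        item     : ordinal item scores (e.g. 0-3)
        group    : 0 = reference group, 1 = focal group
        matching : matching variable (e.g. rest-score on the scale)
        """
        item, group, matching = map(np.asarray, (item, group, matching))
        edges = np.quantile(matching, np.linspace(0, 1, n_strata + 1))
        strata = np.clip(np.searchsorted(edges, matching, side="right") - 1,
                         0, n_strata - 1)
        obs = exp = var = 0.0
        for k in range(n_strata):
            y, g = item[strata == k], group[strata == k]
            n, nf = len(y), int(g.sum())
            if n < 2 or nf == 0 or nf == n:
                continue  # stratum carries no information
            obs += y[g == 1].sum()                  # focal item-score sum
            exp += nf * y.sum() / n                 # expectation under no DIF
            var += (nf * (n - nf)
                    * (n * (y ** 2).sum() - y.sum() ** 2)
                    / (n ** 2 * (n - 1)))
        stat = (obs - exp) ** 2 / var
        return stat, chi2.sf(stat, df=1)

    # simulated DIF-free data: the item depends on the latent trait only
    rng = np.random.default_rng(0)
    n = 1000
    group = rng.integers(0, 2, n)
    theta = rng.normal(size=n)
    item = np.clip(np.round(theta + rng.normal(scale=0.8, size=n) + 1.5), 0, 3)
    stat, p = mantel_dif(item, group, matching=theta)
    ```

    With DIF-free data the statistic should usually be small; a large χ² (1 df) flags the item for closer inspection by the other two methods.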

  19. An Automated Method of Scanning Probe Microscopy (SPM) Data Analysis and Reactive Site Tracking for Mineral-Water Interface Reactions Observed at the Nanometer Scale

    NASA Astrophysics Data System (ADS)

    Campbell, B. D.; Higgins, S. R.

    2008-12-01

    Developing a method for bridging the gap between macroscopic and microscopic measurements of reaction kinetics at the mineral-water interface has important implications in geological and chemical fields. Investigating these reactions on the nanometer scale with SPM is often limited by image analysis and data extraction due to the large quantity of data usually obtained in SPM experiments. Here we present a computer algorithm for automated analysis of mineral-water interface reactions. This algorithm automates the analysis of sequential SPM images by identifying the kinetically active surface sites (i.e., step edges), and by tracking the displacement of these sites from image to image. The step edge positions in each image are readily identified and tracked through time by a standard edge detection algorithm followed by statistical analysis on the Hough Transform of the edge-mapped image. By quantifying this displacement as a function of time, the rate of step edge displacement is determined. Furthermore, the total edge length, also determined from analysis of the Hough Transform, combined with the computed step speed, yields the surface area normalized rate of the reaction. The algorithm was applied to a study of the spiral growth of the calcite(104) surface from supersaturated solutions, yielding results almost 20 times faster than performing this analysis by hand, with results being statistically similar for both analysis methods. This advance in analysis of kinetic data from SPM images will facilitate the building of experimental databases on the microscopic kinetics of mineral-water interface reactions.
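    The core pipeline described (edge map, then Hough transform, then peak extraction for the dominant step edge) can be sketched on a synthetic image. This is a generic illustration under assumed names, not the authors' algorithm:

    ```python
    import numpy as np

    def hough_peak(edge_img, n_theta=180):
        """Return (rho, theta_degrees) of the strongest straight line in a
        binary edge map, using a standard Hough-transform accumulator."""
        ys, xs = np.nonzero(edge_img)
        thetas = np.deg2rad(np.arange(n_theta) - 90)        # -90..89 degrees
        diag = int(np.ceil(np.hypot(*edge_img.shape)))      # bound on |rho|
        rhos = xs[:, None] * np.cos(thetas) + ys[:, None] * np.sin(thetas)
        acc = np.zeros((2 * diag + 1, n_theta), dtype=int)  # (rho, theta) votes
        idx = np.round(rhos).astype(int) + diag
        for j in range(n_theta):
            np.add.at(acc[:, j], idx[:, j], 1)
        r, t = np.unravel_index(acc.argmax(), acc.shape)
        return r - diag, float(np.rad2deg(thetas[t]))

    # synthetic "SPM frame": a vertical step edge at column 30
    img = np.zeros((64, 64), dtype=bool)
    img[:, 30] = True
    rho, theta = hough_peak(img)   # peak should sit at rho = 30, theta = 0
    ```

    Tracking the peak's rho coordinate across sequential frames then gives the step displacement per unit time, from which a step speed follows.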

  20. Braden scale (ALB) for assessing pressure ulcer risk in hospital patients: A validity and reliability study.

    PubMed

    Chen, Hong-Lin; Cao, Ying-Juan; Zhang, Wei; Wang, Jing; Huai, Bao-Sha

    2017-02-01

    The inter-rater reliability of the Braden Scale is not good. We modified the Braden(ALB) scale by defining the nutrition subscale based on serum albumin, then assessed its validity and reliability in hospital patients. We designed a retrospective study for the validity analysis and a prospective study for the reliability analysis. Receiver operating characteristic (ROC) curves and the area under the curve (AUC) were used to evaluate predictive validity. The intra-class correlation coefficient (ICC) was used to investigate inter-rater reliability. Two thousand five hundred twenty-five patients were included in the validity analysis; 76 patients (3.0%) developed a pressure ulcer. A positive correlation was found between serum albumin and the nutrition score in the Braden scale (Spearman's coefficient 0.2203, P<0.0001). The AUCs for the Braden scale and the Braden(ALB) scale in predicting pressure ulcer risk were 0.813 (95% CI 0.797-0.828; P<0.0001) and 0.859 (95% CI 0.845-0.872; P<0.0001), respectively. The Braden(ALB) scale appeared more valid than the Braden scale, although the difference was not statistically significant (z=1.860, P=0.0628). In the different age subgroups, the Braden(ALB) scale also seemed more valid than the original Braden scale, but no statistically significant differences were found (P>0.05). The inter-rater reliability study showed that the ICC value for nutrition increased by 45.9%, and the ICC for the total score increased by 4.3%. The Braden(ALB) scale thus has validity similar to the original Braden scale for hospital patients, but its inter-rater reliability is significantly higher. Copyright © 2016 Elsevier Inc. All rights reserved.
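    The AUC figures above can be read through the Mann-Whitney identity: the AUC equals the probability that a randomly chosen case receives a higher risk score than a randomly chosen non-case. A minimal sketch with made-up scores (Braden totals decrease with risk, so they are negated before scoring):

    ```python
    import numpy as np

    def auc_mann_whitney(score, outcome):
        """AUC as the Mann-Whitney probability that a random case outscores
        a random non-case (ties counted as one half)."""
        score = np.asarray(score, float)
        outcome = np.asarray(outcome, bool)
        pos, neg = score[outcome], score[~outcome]
        greater = (pos[:, None] > neg[None, :]).sum()
        ties = (pos[:, None] == neg[None, :]).sum()
        return (greater + 0.5 * ties) / (len(pos) * len(neg))

    # hypothetical Braden totals: low total = high risk, so negate
    braden = np.array([6, 9, 10, 12, 14, 15, 18, 20])
    ulcer  = np.array([1, 1,  0,  1,  0,  0,  0,  0])
    auc = auc_mann_whitney(-braden, ulcer)  # 14 of 15 case/non-case pairs ordered correctly
    ```

    Here 14 of the 3 × 5 = 15 case/non-case pairs are correctly ordered, so the AUC is 14/15 ≈ 0.93, the same quantity a fitted ROC curve would integrate to.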

  1. Associations Between Attitudes Toward Physical Education and Aerobic Capacity in Hungarian High School Students.

    PubMed

    Kaj, Mónika; Saint-Maurice, Pedro F; Karsai, István; Vass, Zoltán; Csányi, Tamás; Boronyai, Zoltán; Révész, László

    2015-06-26

    The purpose of this study was to create a physical education (PE) attitude scale and examine how it is associated with aerobic capacity (AC). Participants (n = 961, aged 15-20 years) were randomly selected from 26 Hungarian high schools. AC was estimated from performance on the Progressive Aerobic Cardiovascular and Endurance Run test, and the attitude scale had 31 items measured on a Likert scale that ranged from 1 to 5. Principal component analysis was used to examine the structure of the questionnaire, while several 2-way analyses of variance and multiple linear regression (MLR) were computed to examine trends in AC and to test the association between the component scores obtained from the attitude scale and AC, respectively. Five components were identified: health orientation in PE (C1), avoid failure in PE (C2), success orientation in PE (C3), attitude toward physical activity (C4), and cooperation and social experience in PE (C5). There was a statistically significant main effect of sex on C3 (p < .05), C4 (p < .001), and C5 (p < .05), indicating that boys' scores were higher than girls'. The Sex × Age interaction was also statistically significant (p < .05), and follow-up comparisons revealed sex differences in C5 for 15-year-old participants; girls showed statistically significantly higher values than boys on C5 at the age of 16 years. MLR results revealed that the component scores were significantly associated with AC (p < .05). Statistically significant predictors included C1, C2, C3, and C4 for boys and C2 and C4 for girls. The 5-component scale seems suitable for measuring students' attitudes toward PE. The design of the study does not allow direct causal claims about attitudes and AC but suggests that these 2 might be related.

  2. Scaling properties of the two-dimensional randomly stirred Navier-Stokes equation.

    PubMed

    Mazzino, Andrea; Muratore-Ginanneschi, Paolo; Musacchio, Stefano

    2007-10-05

    We inquire into the scaling properties of the 2D Navier-Stokes equation sustained by a force field with Gaussian statistics, white noise in time, and with a power-law correlation in momentum space of degree 2 - 2 epsilon. This is at variance with the setting usually assumed to derive Kraichnan's classical theory. We contrast accurate numerical experiments with the different predictions provided for the small epsilon regime by Kraichnan's double cascade theory and by renormalization group analysis. We give clear evidence that for all epsilon, Kraichnan's theory is consistent with the observed phenomenology. Our results call for a revision in the renormalization group analysis of (2D) fully developed turbulence.

  3. When human walking becomes random walking: fractal analysis and modeling of gait rhythm fluctuations

    NASA Astrophysics Data System (ADS)

    Hausdorff, Jeffrey M.; Ashkenazy, Yosef; Peng, Chang-K.; Ivanov, Plamen Ch.; Stanley, H. Eugene; Goldberger, Ary L.

    2001-12-01

    We present a random walk, fractal analysis of the stride-to-stride fluctuations in the human gait rhythm. The gait of healthy young adults is scale-free with long-range correlations extending over hundreds of strides. This fractal scaling changes characteristically with maturation in children and older adults and becomes almost completely uncorrelated with certain neurologic diseases. Stochastic modeling of the gait rhythm dynamics, based on transitions between different “neural centers”, reproduces distinctive statistical properties of the gait pattern. By tuning one model parameter, the hopping (transition) range, the model can describe alterations in gait dynamics from childhood to adulthood - including a decrease in the correlation and volatility exponents with maturation.

  4. The calibration analysis of soil infiltration formula in farmland scale

    NASA Astrophysics Data System (ADS)

    Qian, Tao; Han, Na Na; Chang, Shuan Ling

    2018-06-01

    Soil infiltration characteristics are an important basis for parameter estimation at the farmland scale. Twelve groups of double-ring infiltration tests were conducted in the test field of the west campus of Tianjin Agricultural University. Based on calibration theory combined with statistics, a calibration analysis of Philip's infiltration formula was carried out and the spatial variation characteristics of the calibration factor were analyzed. Results show that, in the study area, the calibration factor αA computed from the steady soil infiltration rate A gives the best calibration effect and is suitable for calibrating the infiltration formula in this area; the coefficient of variation of αA is 0.3234, indicating a certain degree of spatial variability.

  5. An Analysis of the Associations between Ambiguity Tolerance and EFL Reading Strategy Awareness

    ERIC Educational Resources Information Center

    Kamran, Saeedeh Karbalaee; Maftoon, Parviz

    2012-01-01

    The current study is an attempt to investigate whether any statistically significant relationship existed between Iranian EFL learners' ambiguity tolerance (AT) and their reading strategy use. To this end, three instruments of Survey of Reading Strategy (Mokhtari & Sheorey, 2002), Second Language Ambiguity Tolerance Scale (Ely, 1995), and a…

  6. Teaching: An Option for Mid-Life Retirees.

    ERIC Educational Resources Information Center

    Bell, David

    This document identifies patterns of characteristics of those who have leisure as an option at mid-life. A comparison was made between individuals electing to enter teaching and those electing to pursue leisure at this life stage. Results of structured interviews, statistical results, and an analysis of a life satisfaction scale is given. In…

  7. When does biodiversity matter? Assessing ecosystem services across broad regions using forest inventory and analysis data

    Treesearch

    Kevin M. Potter; Christopher W. Woodall; Christopher M. Oswalt; Basil V. III Iannone; Songlin Fei

    2015-01-01

    Biodiversity is expected to convey numerous functional benefits to forested ecosystems, including increased productivity and resilience. When assessing biodiversity, however, statistics that account for evolutionary relationships among species may be more ecologically meaningful than traditional measures such as species richness. In three broad-scale studies, we...

  8. Introduction to Multilevel Item Response Theory Analysis: Descriptive and Explanatory Models

    ERIC Educational Resources Information Center

    Sulis, Isabella; Toland, Michael D.

    2017-01-01

    Item response theory (IRT) models are the main psychometric approach for the development, evaluation, and refinement of multi-item instruments and scaling of latent traits, whereas multilevel models are the primary statistical method when considering the dependence between person responses when primary units (e.g., students) are nested within…

  9. Validity Evidence in Scale Development: The Application of Cross Validation and Classification-Sequencing Validation

    ERIC Educational Resources Information Center

    Acar, Tülin

    2014-01-01

    In literature, it has been observed that many enhanced criteria are limited by factor analysis techniques. Besides examinations of statistical structure and/or psychological structure, such validity studies as cross validation and classification-sequencing studies should be performed frequently. The purpose of this study is to examine cross…

  10. Quantitative classification of a historic northern Wisconsin (U.S.A.) landscape: mapping forests at regional scales

    Treesearch

    Lisa A. Schulte; David J. Mladenoff; Erik V. Nordheim

    2002-01-01

    We developed a quantitative and replicable classification system to improve understanding of historical composition and structure within northern Wisconsin's forests. The classification system was based on statistical cluster analysis and two forest metrics, relative dominance (% basal area) and relative importance (mean of relative dominance and relative density...

  11. Descriptive Analysis of Student Ratings

    ERIC Educational Resources Information Center

    Marasini, Donata; Quatto, Piero

    2011-01-01

    Let X be a statistical variable representing student ratings of University teaching. It is natural to assume for X an ordinal scale consisting of k categories (in ascending order of satisfaction). At first glance, student ratings can be summarized by a location index (such as the mode or the median of X) associated with a convenient measure of…

  12. AG Channel Measurement and Modeling Results for Over-Water and Hilly Terrain Conditions

    NASA Technical Reports Server (NTRS)

    Matolak, David W.; Sun, Ruoyu

    2015-01-01

    This report describes work completed over the past year on our project, entitled "Unmanned Aircraft Systems (UAS) Research: The AG Channel, Robust Waveforms, and Aeronautical Network Simulations." This project is funded under the NASA project "Unmanned Aircraft Systems (UAS) in the National Airspace System (NAS)." In this report we provide the following: an update on project progress; a description of the over-freshwater and hilly terrain initial results on path loss, delay spread, small-scale fading, and correlations; complete path loss models for the over-water AG channels; analysis for obtaining parameter statistics required for development of accurate wideband AG channel models; and analysis of an atypical AG channel in which the aircraft flies out of the ground site antenna main beam. We have modeled the small-scale fading of these channels with Ricean statistics, and have quantified the behavior of the Ricean K-factor. We also provide some results for correlations of signal components, both intra-band and inter-band. An updated literature review, and a summary that also describes future work, are also included.

  13. [A comparative study on depression symptoms in Cracow primary school kids in years 1984 and 2001].

    PubMed

    Modrzejewska, Renata; Bomba, Jacek

    2009-01-01

    The study aims to find out whether or not the image of depression in children has remained related to the changing social context within a period of fifteen years. For the depression study, version AO "B1" of the Kraków Depression Inventory (KID) was used. The subject group included 10-year-old fourth-form Kraków primary school students selected in 1984 and 2001 by a two-stage draw. The analysis included subjects with a screening diagnosis of depression: in 1984 this was a group of 160 persons, and in 2001, 200 persons. Statistical analysis showed significant differences between the groups of pupils studied in 1984 and 2001 on the mood disorder scale (a reduction in the intensity of symptoms in girls and an increase in boys) and on the somatic symptoms scale (an increase of symptoms in girls and a reduction in boys). On the other scales, no statistically significant differences were found between the groups. In conclusion, the changing social conditions have a relatively low effect on the symptomatic image of depression in preadolescent children.

  14. Slow and fast solar wind - data selection and statistical analysis

    NASA Astrophysics Data System (ADS)

    Wawrzaszek, Anna; Macek, Wiesław M.; Bruno, Roberto; Echim, Marius

    2014-05-01

    In this work we consider the important problem of selecting slow and fast solar wind data measured in situ by the Ulysses spacecraft during two solar minima (1995-1997, 2007-2008) and a solar maximum (1999-2001). To recognise different types of solar wind we use the following set of parameters: radial velocity, proton density, proton temperature, the distribution of charge states of oxygen ions, and the compressibility of the magnetic field. We show how this data-selection scheme works on Ulysses data. In the next step we consider the chosen intervals of fast and slow solar wind and perform a statistical analysis of the fluctuating magnetic field components. In particular, we check the possibility of identifying the inertial range by considering the scale dependence of the third- and fourth-order scaling exponents of the structure functions. We also examine how the size of the inertial range depends on heliographic latitude, heliocentric distance and the phase of the solar cycle. Research supported by the European Community's Seventh Framework Programme (FP7/2007-2013) under grant agreement no 313038/STORM.

  15. Calculation of Weibull strength parameters and Batdorf flaw-density constants for volume- and surface-flaw-induced fracture in ceramics

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Gyekenyesi, John P.

    1988-01-01

    The calculation of the shape and scale parameters of the two-parameter Weibull distribution is described using least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics, with complete and censored samples. Detailed procedures are given for evaluating 90 percent confidence intervals for maximum likelihood estimates of shape and scale parameters, the unbiased estimates of the shape parameters, and the Weibull mean values and corresponding standard deviations. Furthermore, the necessary steps are described for detecting outliers and for calculating the Kolmogorov-Smirnov and the Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull distribution. It also shows how to calculate the Batdorf flaw-density constants by using the Weibull distribution statistical parameters. The techniques described were verified with several example problems from the open literature, and were coded in the Structural Ceramics Analysis and Reliability Evaluation (SCARE) design program.
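    A minimal sketch of the maximum-likelihood step for a complete (uncensored) sample: the profile-likelihood equation for the Weibull shape is solved numerically, after which the scale follows in closed form. This is a generic illustration on simulated strengths, not the SCARE program's implementation:

    ```python
    import numpy as np

    def weibull_mle(x, tol=1e-12):
        """MLE of the two-parameter Weibull shape and scale from a complete
        sample: solve the profile-likelihood equation g(k) = 0 for the shape
        by bisection (g is monotone increasing in k), then get the scale."""
        x = np.asarray(x, float)
        lx = np.log(x)

        def g(k):
            w = (x / x.max()) ** k          # normalized to avoid overflow
            return (w * lx).sum() / w.sum() - 1.0 / k - lx.mean()

        lo, hi = 1e-3, 1e3
        for _ in range(200):                # bisect in log space
            mid = np.sqrt(lo * hi)
            lo, hi = (lo, mid) if g(mid) > 0 else (mid, hi)
            if hi - lo < tol * lo:
                break
        k = 0.5 * (lo + hi)
        scale = ((x ** k).mean()) ** (1.0 / k)
        return k, scale

    # synthetic fracture strengths: shape 8, scale 50 (illustrative values)
    rng = np.random.default_rng(1)
    strengths = 50.0 * rng.weibull(8.0, size=5000)
    shape_hat, scale_hat = weibull_mle(strengths)
    ```

    With censored samples the likelihood gains survival terms for the censored observations, but the same one-dimensional root-finding structure survives.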

  16. Quantitative Analysis of Repertoire-Scale Immunoglobulin Properties in Vaccine-Induced B Cell Responses

    DTIC Science & Technology

    Immunosequencing now readily generates 10³-10⁵ sequences per sample; however, statistical analysis of these repertoires is challenging because of the high genetic diversity of BCRs and the elaborate clonal relationships among them. To date, most immunosequencing analyses have focused on reporting qualitative … repertoire differences, (2) identifying how two repertoires differ, and (3) determining appropriate confidence intervals for assessing the size of the differences and their potential biological relevance.

  17. Ultrascalable Techniques Applied to the Global Intelligence Community Information Awareness Common Operating Picture (IA COP)

    DTIC Science & Technology

    2005-11-01

    … more random. Autonomous systems can exchange entropy statistics for packet streams with no confidentiality concerns, potentially enabling timely and … analysis began with simulation results, which were validated by analysis of actual data from an Autonomous System (AS). A scale-free network is one … traffic—for example, time series of flux at given nodes and mean path length; outputs the time series from any node queried; calculates …

  18. Multiple Phenotype Association Tests Using Summary Statistics in Genome-Wide Association Studies

    PubMed Central

    Liu, Zhonghua; Lin, Xihong

    2017-01-01

    We study in this paper jointly testing the associations of a genetic variant with correlated multiple phenotypes using the summary statistics of individual phenotype analysis from Genome-Wide Association Studies (GWASs). We estimated the between-phenotype correlation matrix using the summary statistics of individual phenotype GWAS analyses, and developed genetic association tests for multiple phenotypes by accounting for between-phenotype correlation without the need to access individual-level data. Since genetic variants often affect multiple phenotypes differently across the genome and the between-phenotype correlation can be arbitrary, we proposed robust and powerful multiple phenotype testing procedures by jointly testing a common mean and a variance component in linear mixed models for summary statistics. We computed the p-values of the proposed tests analytically. This computational advantage makes our methods practically appealing in large-scale GWASs. We performed simulation studies to show that the proposed tests maintained correct type I error rates, and to compare their powers in various settings with the existing methods. We applied the proposed tests to a GWAS Global Lipids Genetics Consortium summary statistics data set and identified additional genetic variants that were missed by the original single-trait analysis. PMID:28653391
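    The idea of testing multiple phenotypes from summary statistics alone can be illustrated with the standard omnibus chi-square test, a simpler relative of the mixed-model tests proposed above (not the authors' method): under the global null, the phenotype Z-scores are jointly normal with the between-phenotype correlation matrix R. The Z-scores and R below are invented for illustration:

    ```python
    import numpy as np
    from scipy.stats import chi2

    def omnibus_test(z, R):
        """Omnibus association test for one variant across K correlated
        phenotypes using only summary statistics: under the global null,
        z ~ N(0, R), so z' R^{-1} z is chi-square with K degrees of freedom."""
        z = np.asarray(z, float)
        stat = float(z @ np.linalg.solve(R, z))
        return stat, chi2.sf(stat, df=len(z))

    # hypothetical Z-scores for three lipid traits, with an assumed
    # between-phenotype correlation matrix estimated from genome-wide nulls
    z = np.array([2.1, -1.8, 2.5])
    R = np.array([[1.0, 0.3, 0.2],
                  [0.3, 1.0, 0.4],
                  [0.2, 0.4, 1.0]])
    stat, p = omnibus_test(z, R)
    ```

    No phenotype reaches genome-wide strength individually, yet the joint statistic is large; this is the gain from borrowing strength across correlated traits without touching individual-level data.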

  19. Multiple phenotype association tests using summary statistics in genome-wide association studies.

    PubMed

    Liu, Zhonghua; Lin, Xihong

    2018-03-01

    We study in this article jointly testing the associations of a genetic variant with correlated multiple phenotypes using the summary statistics of individual phenotype analysis from Genome-Wide Association Studies (GWASs). We estimated the between-phenotype correlation matrix using the summary statistics of individual phenotype GWAS analyses, and developed genetic association tests for multiple phenotypes by accounting for between-phenotype correlation without the need to access individual-level data. Since genetic variants often affect multiple phenotypes differently across the genome and the between-phenotype correlation can be arbitrary, we proposed robust and powerful multiple phenotype testing procedures by jointly testing a common mean and a variance component in linear mixed models for summary statistics. We computed the p-values of the proposed tests analytically. This computational advantage makes our methods practically appealing in large-scale GWASs. We performed simulation studies to show that the proposed tests maintained correct type I error rates, and to compare their powers in various settings with the existing methods. We applied the proposed tests to a GWAS Global Lipids Genetics Consortium summary statistics data set and identified additional genetic variants that were missed by the original single-trait analysis. © 2017, The International Biometric Society.

  20. Angular filter refractometry analysis using simulated annealing.

    PubMed

    Angland, P; Haberberger, D; Ivancic, S T; Froula, D H

    2017-10-01

    Angular filter refractometry (AFR) is a novel technique used to characterize the density profiles of laser-produced, long-scale-length plasmas [Haberberger et al., Phys. Plasmas 21, 056304 (2014)]. A new method of analysis for AFR images was developed using an annealing algorithm to iteratively converge upon a solution. A synthetic AFR image is constructed from a user-defined density profile described by eight parameters, and the algorithm systematically alters the parameters until the comparison with the measured image is optimized. The optimization and the statistical uncertainty calculation are based on the minimization of the χ² test statistic. The algorithm was successfully applied to experimental data of plasma expanding from a flat, laser-irradiated target, resulting in an average uncertainty in the density profile of 5%-20% in the region of interest.
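    A generic simulated-annealing loop of the kind described, minimizing a χ² objective for a toy parameterized profile (here a two-parameter exponential rather than the paper's eight-parameter density profile; all specifics are assumptions):

    ```python
    import numpy as np

    def anneal(cost, x0, step=0.05, n_iter=20000, t0=1.0, seed=0):
        """Simulated annealing: Gaussian parameter proposals, Metropolis
        acceptance of worse fits with probability exp(-delta/T), and a
        geometric cooling schedule; the best point found is tracked."""
        rng = np.random.default_rng(seed)
        x = np.asarray(x0, float)
        f = cost(x)
        best_x, best_f = x.copy(), f
        for i in range(n_iter):
            T = t0 * 0.999 ** i
            cand = x + rng.normal(scale=step, size=x.size)
            fc = cost(cand)
            if fc < f or rng.random() < np.exp(-(fc - f) / T):
                x, f = cand, fc
                if f < best_f:
                    best_x, best_f = x.copy(), f
        return best_x, best_f

    # toy chi-square objective: fit n(r) = n0 * exp(-r/L) to synthetic data
    r = np.linspace(0.0, 5.0, 50)
    rng = np.random.default_rng(1)
    data = 3.0 * np.exp(-r / 1.5) + rng.normal(scale=0.05, size=r.size)

    def chi2_fn(p):
        if p[1] <= 0.05:                   # keep the scale length physical
            return np.inf
        return float((((p[0] * np.exp(-r / p[1]) - data) / 0.05) ** 2).sum())

    params, best_chi2 = anneal(chi2_fn, x0=[1.0, 1.0])
    ```

    The statistical uncertainty on each parameter then follows from the curvature of χ² around the minimum (e.g. the Δχ² = 1 contour), the same criterion the abstract's uncertainty calculation rests on.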

  1. Wavelet analysis of birefringence images of myocardium tissue

    NASA Astrophysics Data System (ADS)

    Sakhnovskiy, M. Yu.; Ushenko, Yu. O.; Kushnerik, L.; Soltys, I. V.; Pavlyukovich, N.; Pavlyukovich, O.

    2018-01-01

    The paper consists of two parts. The first part presents the theoretical basics of the method of azimuthally invariant Mueller-matrix description of optical anisotropy of biological tissues. Experimentally measured coordinate distributions of the Mueller-matrix invariants (MMI) of linear and circular birefringence of skeletal muscle tissue are provided, and the values of the statistical moments that characterize the distributions of the amplitudes of the wavelet coefficients of the MMI at different scanning scales are determined. The second part presents a statistical analysis of the distributions of the amplitudes of the wavelet coefficients of the linear birefringence distributions of myocardium tissue from subjects who died of infarction and of ischemic heart disease, and defines objective criteria for differentiating the cause of death.

  2. Wavelet analysis of polarization azimuths maps for laser images of myocardial tissue for the purpose of diagnosing acute coronary insufficiency

    NASA Astrophysics Data System (ADS)

    Wanchuliak, O. Ya.; Peresunko, A. P.; Bakko, Bouzan Adel; Kushnerick, L. Ya.

    2011-09-01

    This paper presents the foundations of a large-scale localized wavelet analysis of polarization-inhomogeneous laser images of histological sections of myocardial tissue. Relations between the structures of the wavelet coefficients and the causes of death were identified. An optical model of the polycrystalline networks of myocardium protein fibrils is presented, and a technique for determining the coordinate distribution of the polarization azimuth over the points of laser images of myocardium histological sections is suggested. The results of investigating the interrelation between the values of the statistical parameters (statistical moments of the 1st-4th order) that characterize the distributions of the wavelet coefficients of the polarization maps of myocardium layers and the causes of death are presented.

  3. Parametric analyses of summative scores may lead to conflicting inferences when comparing groups: A simulation study.

    PubMed

    Khan, Asaduzzaman; Chien, Chi-Wen; Bagraith, Karl S

    2015-04-01

    To investigate whether using a parametric statistic in comparing groups leads to different conclusions when using summative scores from rating scales compared with using their corresponding Rasch-based measures. A Monte Carlo simulation study was designed to examine between-group differences in the change scores derived from summative scores from rating scales, and those derived from their corresponding Rasch-based measures, using 1-way analysis of variance. The degree of inconsistency between the 2 scoring approaches (i.e. summative and Rasch-based) was examined, using varying sample sizes, scale difficulties and person ability conditions. This simulation study revealed scaling artefacts that could arise from using summative scores rather than Rasch-based measures for determining the changes between groups. The group differences in the change scores were statistically significant for summative scores under all test conditions and sample size scenarios. However, none of the group differences in the change scores were significant when using the corresponding Rasch-based measures. This study raises questions about the validity of the inference on group differences of summative score changes in parametric analyses. Moreover, it provides a rationale for the use of Rasch-based measures, which can allow valid parametric analyses of rating scale data.

  4. Willingness of minorities to participate in biomedical studies: confirmatory findings from a follow-up study using the Tuskegee Legacy Project Questionnaire.

    PubMed

    Katz, Ralph V; Green, B Lee; Kressin, Nancy R; Claudio, Cristina; Wang, Min Qi; Russell, Stefanie L

    2007-09-01

    The purposes of this analysis were to compare the self-reported willingness of blacks, Puerto-Rican Hispanics and whites to participate as research subjects in biomedical studies, and to determine the reliability of the Tuskegee Legacy Project Questionnaire (TLP). The TLP Questionnaire, initially used in a four-city study in 1999-2000, was administered in a follow-up study within a random-digit-dial telephone survey to a stratified random sample of adults in three different U.S. cities: Baltimore, MD; New York City; and San Juan, PR. The questionnaire, a 60-item instrument, contains two validated scales: the Likelihood of Participation (LOP) Scale and the Guinea Pig Fear Factor (GPFF) Scale. Adjusting for age, sex, education, income and city, the LOP Scale was not statistically significantly different across the racial/ethnic groups (ANCOVA, p=.87). The GPFF Scale was statistically significantly higher for blacks and Hispanics as compared to whites (adjusted ANCOVA, p<0.001). The findings from the current three-city study, as well as from our prior four-city study, are remarkably similar and reinforce the conclusion that blacks and Hispanics self-report that, despite having a higher fear of participation, they are just as likely as whites to participate in biomedical research.

  5. Statistical downscaling of general-circulation-model- simulated average monthly air temperature to the beginning of flowering of the dandelion (Taraxacum officinale) in Slovenia

    NASA Astrophysics Data System (ADS)

    Bergant, Klemen; Kajfež-Bogataj, Lučka; Črepinšek, Zalika

    2002-02-01

    Phenological observations are a valuable source of information for investigating the relationship between climate variation and plant development. Potential climate change in the future will shift the occurrence of phenological phases, and information about future climate conditions is needed in order to estimate this shift. General circulation models (GCM) provide the best information about future climate change. They are able to simulate reliably the most important mean features on a large scale, but they fail on a regional scale because of their low spatial resolution. A common approach to bridging the scale gap is statistical downscaling, which was used to relate the beginning of flowering of Taraxacum officinale in Slovenia with the monthly mean near-surface air temperature for January, February and March in Central Europe. Statistical models were developed and tested with NCAR/NCEP Reanalysis predictor data and EARS predictand data for the period 1960-1999. Prior to developing the statistical models, empirical orthogonal function (EOF) analysis was employed on the predictor data. Multiple linear regression was used to relate the beginning of flowering with the expansion coefficients of the first three EOFs for the January, February and March air temperatures, and a strong correlation was found between them. The developed statistical models were applied to the results of two GCMs (HadCM3 and ECHAM4/OPYC3) to estimate the potential shifts in the beginning of flowering for the periods 1990-2019 and 2020-2049 in comparison with the period 1960-1989. The HadCM3 model predicts, on average, 4 days earlier occurrence and ECHAM4/OPYC3 5 days earlier occurrence of flowering in the period 1990-2019. The analogous results for the period 2020-2049 are a 10- and 11-day earlier occurrence.
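The EOF-plus-regression pipeline this abstract describes can be sketched as follows. This is a hedged illustration on synthetic data; the array sizes and the coefficient linking flowering date to the first EOF are invented, and it is not the authors' model:

```python
import numpy as np

rng = np.random.default_rng(0)
n_years, n_grid = 40, 100          # e.g. 40 years of data over 100 grid cells

# Synthetic large-scale predictor field (years x grid points), as anomalies
field = rng.normal(size=(n_years, n_grid))
anom = field - field.mean(axis=0)

# EOF analysis via SVD of the anomaly matrix; expansion coefficients
# (principal components) of the first three EOFs serve as predictors
U, s, Vt = np.linalg.svd(anom, full_matrices=False)
pcs = U[:, :3] * s[:3]

# Synthetic predictand: a flowering date driven by the first EOF plus noise
flowering = 100.0 - 2.0 * pcs[:, 0] + rng.normal(scale=1.0, size=n_years)

# Multiple linear regression of the predictand on the 3 expansion coefficients
X = np.column_stack([np.ones(n_years), pcs])
coef, *_ = np.linalg.lstsq(X, flowering, rcond=None)
pred = X @ coef
r = np.corrcoef(pred, flowering)[0, 1]
print(f"correlation between modelled and 'observed' dates: {r:.2f}")
```

In a real application the fitted `coef` would then be applied to GCM-simulated fields projected onto the same EOFs, which is the downscaling step the abstract describes.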

  6. Evaluation of a New Mean Scaled and Moment Adjusted Test Statistic for SEM

    ERIC Educational Resources Information Center

    Tong, Xiaoxiao; Bentler, Peter M.

    2013-01-01

    Recently a new mean scaled and skewness adjusted test statistic was developed for evaluating structural equation models in small samples and with potentially nonnormal data, but this statistic has received only limited evaluation. The performance of this statistic is compared to normal theory maximum likelihood and 2 well-known robust test…

  7. An Efficient Objective Analysis System for Parallel Computers

    NASA Technical Reports Server (NTRS)

    Stobie, J.

    1999-01-01

    A new atmospheric objective analysis system designed for parallel computers will be described. The system can produce a global analysis (on a 1 x 1 lat-lon grid with 18 levels of heights and winds and 10 levels of moisture) using 120,000 observations in 17 minutes on 32 CPUs (SGI Origin 2000). No special parallel code is needed (e.g. MPI or multitasking) and the 32 CPUs do not have to be on the same platform. The system is totally portable and can run on several different architectures at once. In addition, the system can easily scale up to 100 or more CPUs, which will allow for much higher resolution and significant increases in input data. The system scales linearly with the number of observations and the number of grid points. The cost overhead in going from 1 to 32 CPUs is 18%. In addition, the analysis results are identical regardless of the number of processors used. This system has all the characteristics of optimal interpolation, combining detailed instrument and first-guess error statistics to produce the best estimate of the atmospheric state. Static tests with a 2 x 2.5 resolution version of this system showed its analysis increments are comparable to those of the latest NASA operational system, including maintenance of mass-wind balance. Results from several months of cycling tests in the Goddard EOS Data Assimilation System (GEOS DAS) show this new analysis retains the same level of agreement between the first guess and observations (O-F statistics) as the current operational system.

  8. Lagrangian statistics of mesoscale turbulence in a natural environment: The Agulhas return current.

    PubMed

    Carbone, Francesco; Gencarelli, Christian N; Hedgecock, Ian M

    2016-12-01

    The properties of mesoscale geophysical turbulence in an oceanic environment have been investigated through the Lagrangian statistics of sea surface temperature measured by a drifting buoy within the Agulhas return current, where strong temperature mixing produces locally sharp temperature gradients. By disentangling the large-scale forcing which affects the small-scale statistics, we found that the statistical properties of intermittency are identical to those obtained from the multifractal prediction in the Lagrangian frame for the velocity trajectory. The results suggest a possible universality of turbulence scaling.

  9. Characterizing and understanding the climatic determinism of high- to low-frequency variations in precipitation in northwestern France using a coupled wavelet multiresolution/statistical downscaling approach

    NASA Astrophysics Data System (ADS)

    Massei, Nicolas; Dieppois, Bastien; Hannah, David; Lavers, David; Fossa, Manuel; Laignel, Benoit; Debret, Maxime

    2017-04-01

    Geophysical signals oscillate over several time-scales that explain different amounts of their overall variability and may be related to different physical processes. Characterizing and understanding such variability in hydrological variations and investigating its determinism is an important issue in a context of climate change, as these variabilities can occasionally be superimposed on a long-term trend possibly due to climate change. It is also important to refine our understanding of time-scale-dependent linkages between large-scale climatic variations and hydrological responses on the regional or local scale. Here we investigate such links by conducting a wavelet multiresolution statistical downscaling approach of precipitation in northwestern France (Seine river catchment) over 1950-2016, using sea level pressure (SLP) and sea surface temperature (SST) as indicators of atmospheric and oceanic circulations, respectively. Previous results demonstrated that including multiresolution decomposition in a statistical downscaling model (a so-called multiresolution ESD model) using SLP as the large-scale predictor greatly improved simulation of the low-frequency, i.e. interannual to interdecadal, fluctuations observed in precipitation. Building on these results, continuous wavelet transform of precipitation simulated using multiresolution ESD confirmed the good performance of the model in explaining variability at all time-scales. A sensitivity analysis of the model to the choice of scale and wavelet function was also conducted; whatever the wavelet used, the model performed similarly. The spatial patterns of SLP found as the best predictors for all time-scales, which resulted from the wavelet decomposition, revealed different structures according to time-scale, suggesting possibly different determinisms. In particular, some low-frequency components (~3.2-yr and ~19.3-yr) showed a much more widespread spatial extension across the Atlantic. Moreover, in accordance with other previous studies, the wavelet components detected in SLP and precipitation on interannual to interdecadal time-scales could be interpreted in terms of the influence of the Gulf Stream oceanic front on atmospheric circulation. Work is now under way including SST over the Atlantic in order to gain further insight into this mechanism.

  10. Measures of large-scale structure in the CfA redshift survey slices

    NASA Technical Reports Server (NTRS)

    De Lapparent, Valerie; Geller, Margaret J.; Huchra, John P.

    1991-01-01

    Variations of the counts-in-cells with cell size are used here to define two statistical measures of large-scale clustering in three 6 deg slices of the CfA redshift survey. A percolation criterion is used to estimate the filling factor f, which measures the fraction of the total volume in the survey occupied by the large-scale structures. For the full 18 deg slice of the CfA redshift survey, f ≈ 0.25 ± 0.05. After removing groups with more than five members from two of the slices, variations of the counts in occupied cells with cell size have a power-law behavior with a slope beta of about 2.2 on scales from 1-10/h Mpc. Application of both this statistic and the percolation analysis to simulations suggests that a network of two-dimensional structures is a better description of the geometry of the clustering in the CfA slices than a network of one-dimensional structures. Counts-in-cells are also used to estimate the average galaxy surface density in sheets like the Great Wall at about 0.3 galaxies h²/Mpc².
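As a rough illustration of the counts-in-cells statistic used above, the sketch below histograms synthetic 2-D points onto grids of decreasing cell size; the point set and the grid sizes are arbitrary stand-ins with no cosmological content:

```python
import numpy as np

rng = np.random.default_rng(7)
pts = rng.random((5000, 2))        # synthetic "galaxy" positions in the unit square

def counts_in_cells(pts, n_cells):
    """Histogram 2-D points in [0,1)^2 onto an n_cells x n_cells grid."""
    idx = np.floor(pts * n_cells).astype(int)
    counts = np.zeros((n_cells, n_cells), dtype=int)
    np.add.at(counts, (idx[:, 0], idx[:, 1]), 1)
    return counts

# Track how the counts in occupied cells vary with cell size
for n in (4, 8, 16):
    c = counts_in_cells(pts, n)
    occ = c[c > 0]
    print(f"{n:2d} cells/side: {occ.size} occupied, mean count {occ.mean():.1f}")
```

In the paper's analysis, the scaling of such occupied-cell counts with cell size is what yields the power-law slope quoted above.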

  11. INTERPRETATION OF THE STRUCTURE FUNCTION OF ROTATION MEASURE IN THE INTERSTELLAR MEDIUM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Siyao; Zhang, Bing, E-mail: syxu@pku.edu.cn, E-mail: zhang@physics.unlv.edu

    2016-06-20

    The observed structure function (SF) of rotation measure (RM) varies as a broken power-law function of angular scale. The systematic shallowness of its spectral slope is inconsistent with the standard Kolmogorov scaling, which motivates us to examine the statistical analysis of RM fluctuations. The correlations of RM constructed by Lazarian and Pogosyan are demonstrated to be adequate in explaining the observed features of RM SFs through a direct comparison between the theoretically obtained and observationally measured SF results. By segregating the density and magnetic field fluctuations and adopting arbitrary indices for their respective power spectra, we find that when the SFs of RM and emission measure have a similar form over the same range of angular scales, the statistics of the RM fluctuations reflect the properties of density fluctuations. RM SFs can be used to evaluate the mean magnetic field along the line of sight, but cannot serve as an informative source on the properties of the turbulent magnetic field in the interstellar medium. We identify the spectral break of RM SFs as the inner scale of a shallow spectrum of electron density fluctuations, which characterizes the typical size of discrete electron density structures in the observed region.

  12. Is There a Critical Distance for Fickian Transport? - a Statistical Approach to Sub-Fickian Transport Modelling in Porous Media

    NASA Astrophysics Data System (ADS)

    Most, S.; Nowak, W.; Bijeljic, B.

    2014-12-01

    Transport processes in porous media are frequently simulated as particle movement, which can be formulated as a stochastic process of particle position increments. At the pore scale, the geometry and micro-heterogeneities prohibit the commonly made assumption of independent and normally distributed increments to represent dispersion. Many recent particle methods seek to loosen this assumption. Recent experimental data suggest that we have not yet reached the end of the need to generalize, because particle increments show statistical dependency beyond linear correlation and over many time steps. The goal of this work is to better understand the validity regions of commonly made assumptions. We investigate after what transport distances one can observe: (1) a statistical dependence between increments that can be modelled as an order-k Markov process reducing to order 1; this would be the Markovian distance for the process, where the validity of yet-unexplored non-Gaussian-but-Markovian random walks would start; (2) a bivariate statistical dependence that simplifies to a multi-Gaussian dependence based on simple linear correlation (validity of correlated PTRW); (3) complete absence of statistical dependence (validity of classical PTRW/CTRW). The approach is to derive a statistical model for pore-scale transport from a powerful experimental data set via copula analysis. The model is formulated as a non-Gaussian, mutually dependent Markov process of higher order, which allows us to investigate the validity ranges of simpler models.

  13. The effect of nurses' ethical leadership and ethical climate perceptions on job satisfaction.

    PubMed

    Özden, Dilek; Arslan, Gülşah Gürol; Ertuğrul, Büşra; Karakaya, Salih

    2017-01-01

    The development of ethical leadership approaches plays an important role in achieving better patient care. Although studies that analyze the impact of ethical leadership on ethical climate and job satisfaction have gained importance in recent years, there is no study on ethical leadership and its relation to ethical climate and job satisfaction in our country. This descriptive and cross-sectional study aimed to determine the effect of nurses' ethical leadership and ethical climate perceptions on their job satisfaction. The study sample is composed of 285 nurses who agreed to participate in this research and who work at the internal, surgical, and intensive care units of a university hospital and a training and research hospital in İzmir, Turkey. Data were collected using the Ethical Leadership Scale, Hospital Ethical Climate Scale, and Minnesota Satisfaction Scale. While the independent-sample t-test, analysis of variance, Mann-Whitney U test, and Kruskal-Wallis test were used to analyze the data, correlation analysis was used to determine the relationship between the scales. Ethical considerations: The study proposal was approved by the ethics committee of the Faculty of Medicine, Dokuz Eylül University. The nurses' mean scores were 59.05 ± 14.78 for ethical leadership, 92.62 ± 17 for ethical climate, and 62.15 ± 13.46 for job satisfaction. The correlation between the nurses' ethical leadership and ethical climate mean scores was moderately positive and statistically significant (r = +0.625, p < 0.001), was weak but statistically significant between their ethical leadership and job satisfaction mean scores (r = +0.461, p < 0.001), and was moderately positive and statistically significant between their ethical climate and job satisfaction mean scores (r = +0.603, p < 0.001). The nurses' ethical leadership, ethical climate, and job satisfaction levels are moderate, and there is a positive relationship between them. The nurses' perceptions of ethical leadership are influenced by their educational status, workplace, and length of service.

  14. Analysing attitude data through ridit schemes.

    PubMed

    El-rouby, M G

    1994-12-02

    The attitudes of individuals and populations on various issues are usually assessed through sample surveys. Responses to survey questions are then scaled and combined into a meaningful whole which defines the measured attitude. The applied scales may be of nominal, ordinal, interval, or ratio nature depending upon the degree of sophistication the researcher wants to introduce into the measurement. This paper discusses methods of analysis for categorical variables of the type used in attitude and human behavior research, and recommends adoption of ridit analysis, a technique which has been successfully applied to epidemiological, clinical investigation, laboratory, and microbiological data. The ridit methodology is described after reviewing some general attitude scaling methods and problems of analysis related to them. The ridit method is then applied to a recent study conducted to assess health care service quality in North Carolina. This technique is conceptually and computationally more simple than other conventional statistical methods, and is also distribution-free. Basic requirements and limitations on its use are indicated.
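For readers unfamiliar with ridit scoring, a minimal sketch follows. The category counts are invented for illustration; the construction itself (the ridit of category i is the proportion of the reference distribution below i plus half the proportion in i) is the standard one:

```python
import numpy as np

def ridits(reference_counts):
    """Ridit of category i: P(below i) + P(in i)/2, w.r.t. the reference group."""
    p = np.asarray(reference_counts, dtype=float)
    p /= p.sum()
    cum_below = np.concatenate(([0.0], np.cumsum(p)[:-1]))
    return cum_below + p / 2

def mean_ridit(group_counts, reference_counts):
    """Average ridit of a comparison group; 0.5 means 'like the reference'."""
    g = np.asarray(group_counts, dtype=float)
    g /= g.sum()
    return float(g @ ridits(reference_counts))

ref = [10, 20, 40, 20, 10]   # reference distribution over 5 ordered categories
grp = [5, 10, 30, 30, 25]    # comparison group shifted toward higher categories

print(round(mean_ridit(ref, ref), 3))   # reference against itself gives 0.5
print(round(mean_ridit(grp, ref), 3))   # above 0.5: group tends toward higher categories
```

The mean ridit is interpretable as the probability that a randomly chosen member of the group exceeds a randomly chosen member of the reference, which is what makes the method distribution-free.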

  15. A worldwide analysis of the impact of forest cover change on annual runoff across multiple spatial scales

    NASA Astrophysics Data System (ADS)

    Zhang, M.; Liu, S.

    2017-12-01

    Despite extensive studies on hydrological responses to forest cover change in small watersheds, the hydrological responses to forest change and the associated mechanisms across multiple spatial scales have not been fully understood. This review therefore examined 312 watersheds worldwide to provide a generalized framework for evaluating hydrological responses to forest cover change and to identify the contributions of spatial scale, climate, forest type and hydrological regime in determining the intensity of forest-change-related hydrological responses in small (<1000 km²) and large (≥1000 km²) watersheds. Key findings include: 1) the increase in annual runoff associated with forest cover loss is statistically significant at multiple spatial scales, whereas the effect of forest cover gain is statistically inconsistent; 2) the sensitivity of annual runoff to forest cover change tends to attenuate as watershed size increases, but only in large watersheds; 3) annual runoff is more sensitive to forest cover change in water-limited watersheds than in energy-limited watersheds across all spatial scales; and 4) small mixed-forest-dominated watersheds and large snow-dominated watersheds are more hydrologically resilient to forest cover change. These findings improve the understanding of hydrological responses to forest cover change at different spatial scales and provide a scientific underpinning for future watershed management in the context of climate change and increasing anthropogenic disturbances.

  16. The Impact of Medication Anticholinergic Burden on Cognitive Performance in People With Schizophrenia.

    PubMed

    Ang, Mei San; Abdul Rashid, Nur Amirah; Lam, Max; Rapisarda, Attilio; Kraus, Michael; Keefe, Richard S E; Lee, Jimmy

    2017-12-01

    Cognitive deficits are prevalent in people with schizophrenia and associated with functional impairments. In addition to antipsychotics, pharmacotherapy in schizophrenia often includes other psychotropics, and some of these agents possess anticholinergic properties, which may impair cognition. The objective of this study was to explore the association between medication anticholinergic burden and cognition in schizophrenia. Seven hundred five individuals with schizophrenia completed a neuropsychological battery comprising Judgment of Line Orientation Test, Wechsler Abbreviated Scale of Intelligence Matrix Reasoning, Continuous Performance Test-Identical Pairs Version, and the Brief Assessment of Cognition in Schizophrenia. Cognitive g and 3 cognitive factor scores that include executive function, memory/fluency, and speed of processing/vigilance, which were derived from a previously published analysis, were entered as cognitive variables. Anticholinergic burden was computed using 2 anticholinergic scales: Anticholinergic Burden Scale and Anticholinergic Drug Scale. Duration and severity of illness, antipsychotic dose, smoking status, age, and sex were included as covariates. Anticholinergic burden was associated with poorer cognitive performance in cognitive g, all 3 cognitive domains and most cognitive tasks in multivariate analyses. The associations were statistically significant, but the effect sizes were small (for Anticholinergic Burden Scale, Cohen f = 0.008; for Anticholinergic Drug Scale, Cohen f = 0.017). Although our results showed a statistically significant association between medications with anticholinergic properties and cognition in people with schizophrenia, the impact is of doubtful or minimal clinical significance.

  17. Climate Change Assessment of Precipitation in Tandula Reservoir System

    NASA Astrophysics Data System (ADS)

    Jaiswal, Rahul Kumar; Tiwari, H. L.; Lohani, A. K.

    2018-02-01

    Precipitation is the principal input to the hydrological cycle and, under widely accepted climate change, affects the spatial and temporal availability of water in a basin. The present study applies statistical downscaling with the Statistical DownScaling Model (SDSM) to rainfall at five rain gauge stations (Ambagarh, Bhanpura, Balod, Chamra and Gondli) in the Tandula, Kharkhara and Gondli reservoir catchments of Chhattisgarh state, India, to forecast future rainfall in three different periods under SRES A1B and A2 climatic forcing conditions. In the analysis, twenty-six climatic variables obtained from the National Centers for Environmental Prediction were statistically tested to select the best-fit predictors. Conditional-process-based statistical correlation was used to develop multiple linear relations, calibrated on the period 1981-1995 and tested with independent data from 1996-2003 for validation. The developed relations were then used to predict future rainfall scenarios for three periods, 2020-2035 (FP-1), 2046-2064 (FP-2) and 2081-2100 (FP-3), and compared with monthly rainfall during the base period (1981-2003) for each station and for all three reservoir catchments. The analysis indicates that most of the rain gauge stations and all three reservoir catchments may receive significantly less rainfall in the future. Thiessen-polygon-based annual and seasonal rainfall for the catchments confirmed a reduction of seasonal rainfall of 5.1-14.1% in the Tandula reservoir, 11-19.2% in the Kharkhara reservoir and 15.1-23.8% in the Gondli reservoir. The Gondli reservoir may be affected the most in terms of water availability in the future prediction periods.

  18. The Effects of Perioperative Music Interventions in Pediatric Surgery: A Systematic Review and Meta-Analysis of Randomized Controlled Trials

    PubMed Central

    van der Heijden, Marianne J. E.; Oliai Araghi, Sadaf; van Dijk, Monique; Jeekel, Johannes; Hunink, M. G. Myriam

    2015-01-01

    Objective Music interventions are widely used, but have not yet gained a place in guidelines for pediatric surgery or pediatric anesthesia. In this systematic review and meta-analysis we examined the effects of music interventions on pain, anxiety and distress in children undergoing invasive surgery. Data Sources We searched 25 electronic databases from their first available date until October 2014. Study Selection Included were all randomized controlled trials with a parallel group, crossover or cluster design that included pediatric patients from 1 month to 18 years old undergoing minimally invasive or invasive surgical procedures, and receiving either live music therapy or recorded music. Data Extraction and Synthesis 4846 records were retrieved from the searches, 26 full-text reports were evaluated and data were extracted by two independent investigators. Main Outcome Measures Pain was measured with the Visual Analogue Scale, the Coloured Analogue Scale and the Facial Pain Scale. Anxiety and distress were measured with an emotional index scale (not validated), the Spielberger short State Trait Anxiety Inventory and a Facial Affective Scale. Results Three RCTs were eligible for inclusion, encompassing 196 orthopedic, cardiac and day surgery patients (aged 1 day to 18 years) receiving either live music therapy or recorded music. Overall a statistically significant positive effect was demonstrated on postoperative pain (SMD -1.07; 95% CI -2.08, -0.07) and on anxiety and distress (SMD -0.34; 95% CI -0.66, -0.01 and SMD -0.50; 95% CI -0.84, -0.16). Conclusions and Relevance This systematic review and meta-analysis indicates that music interventions may have a statistically significant effect in reducing post-operative pain, anxiety and distress in children undergoing a surgical procedure. Evidence from this review and other reviews suggests music therapy may be considered for clinical use. PMID:26247769

  19. Inference of median difference based on the Box-Cox model in randomized clinical trials.

    PubMed

    Maruo, K; Isogawa, N; Gosho, M

    2015-05-10

    In randomized clinical trials, many medical and biological measurements are not normally distributed and are often skewed. The Box-Cox transformation is a powerful procedure for comparing two treatment groups for skewed continuous variables in terms of a statistical test. However, it is difficult to directly estimate and interpret the location difference between the two groups on the original scale of the measurement. We propose a helpful method that infers the difference of the treatment effect on the original scale in a more easily interpretable form. We also provide statistical analysis packages that consistently include an estimate of the treatment effect, covariance adjustments, standard errors, and statistical hypothesis tests. The simulation study that focuses on randomized parallel group clinical trials with two treatment groups indicates that the performance of the proposed method is equivalent to or better than that of the existing non-parametric approaches in terms of the type-I error rate and power. We illustrate our method with cluster of differentiation 4 data in an acquired immune deficiency syndrome clinical trial. Copyright © 2015 John Wiley & Sons, Ltd.
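The general idea behind the abstract, transforming to near-normality, testing on the transformed scale, and reading off a median difference on the original scale, can be sketched as below. This is a simplified illustration on simulated lognormal arms using SciPy, not the authors' exact estimator or software package:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
a = rng.lognormal(mean=1.0, sigma=0.6, size=200)   # skewed control arm
b = rng.lognormal(mean=1.3, sigma=0.6, size=200)   # skewed treatment arm

# Estimate one common Box-Cox lambda from the pooled sample
pooled, lam = stats.boxcox(np.concatenate([a, b]))

# Transform each arm with the common lambda and test on the transformed scale
ta = stats.boxcox(a, lmbda=lam)
tb = stats.boxcox(b, lmbda=lam)
t, p = stats.ttest_ind(ta, tb)

def inv_boxcox(y, lam):
    """Invert the Box-Cox transform; valid because it is strictly monotone."""
    return np.exp(y) if lam == 0 else (lam * y + 1) ** (1 / lam)

# Monotone transforms preserve medians, so back-transformed medians give a
# location difference that is interpretable on the original measurement scale
med_diff = inv_boxcox(np.median(tb), lam) - inv_boxcox(np.median(ta), lam)
print(f"p = {p:.4g}, estimated median difference = {med_diff:.2f}")
```

The key design point is the last comment: because the Box-Cox transform is monotone, the median survives back-transformation, which is what makes the original-scale treatment effect easy to interpret.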

  20. The role of enamel thickness and refractive index on human tooth colour.

    PubMed

    Oguro, Rena; Nakajima, Masatoshi; Seki, Naoko; Sadr, Alireza; Tagami, Junji; Sumi, Yasunori

    2016-08-01

    To investigate the role of enamel thickness and refractive index (n) on tooth colour. The colour and enamel thickness of fifteen extracted human central incisors were determined according to the CIELab colour scale using a spectrophotometer (Crystaleye) and swept-source optical coherence tomography (SS-OCT), respectively. Subsequently, the labial enamel was trimmed by approximately 100 μm, and the colour and remaining enamel thickness were measured again. This cycle was repeated until dentin appeared. Enamel blocks were prepared from the same teeth and their n values were obtained using SS-OCT. Multiple regression analysis was performed to reveal any effects of enamel thickness and n on colour difference (ΔE00) and on differences in colour parameters on the CIELCh and CIELab colour scales. Multiple regression analysis revealed that enamel thickness (p=0.02) and the n of enamel (p<0.001) were statistically significant predictors of ΔE00 after complete enamel trimming. The n was also a significant predictor of ΔH' (p=0.01). Enamel thickness and n were not statistically significant predictors of ΔL', ΔC', Δa* and Δb*. Enamel affected tooth colour, and n was a statistically significant predictor of tooth colour change. Understanding the role of enamel in tooth colour could contribute to the development of aesthetic restorative materials that mimic the colour of natural teeth with minimal reduction of the existing enamel. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. An observational method for fast stochastic X-ray polarimetry timing

    NASA Astrophysics Data System (ADS)

    Ingram, Adam R.; Maccarone, Thomas J.

    2017-11-01

    The upcoming launch of the first space-based X-ray polarimeter in ~40 yr will provide powerful new diagnostic information to study accreting compact objects. In particular, analysis of rapid variability of the polarization degree and angle will provide the opportunity to probe the relativistic motions of material in the strong gravitational fields close to the compact objects, and enable new methods to measure black hole and neutron star parameters. However, polarization properties are measured in a statistical sense, and a statistically significant polarization detection requires a fairly long exposure, even for the brightest objects. Therefore, the sub-minute time-scales of interest are not accessible using a direct time-resolved analysis of polarization degree and angle. Phase-folding can be used for coherent pulsations, but not for stochastic variability such as quasi-periodic oscillations. Here, we introduce a Fourier method that enables statistically robust detection of stochastic polarization variability for arbitrarily short variability time-scales. Our method is analogous to commonly used spectral-timing techniques. We find that it should be possible in the near future to detect the quasi-periodic swings in polarization angle predicted by Lense-Thirring precession of the inner accretion flow. This is contingent on the mean polarization degree of the source being greater than ~4-5 per cent, which is consistent with the best current constraints on Cygnus X-1 from the late 1970s.

  2. Using relational databases for improved sequence similarity searching and large-scale genomic analyses.

    PubMed

    Mackey, Aaron J; Pearson, William R

    2004-10-01

    Relational databases are designed to integrate diverse types of information and manage large sets of search results, greatly simplifying genome-scale analyses. Relational databases are essential for management and analysis of large-scale sequence analyses, and can also be used to improve the statistical significance of similarity searches by focusing on subsets of sequence libraries most likely to contain homologs. This unit describes using relational databases to improve the efficiency of sequence similarity searching and to demonstrate various large-scale genomic analyses of homology-related data. This unit describes the installation and use of a simple protein sequence database, seqdb_demo, which is used as a basis for the other protocols. These include basic use of the database to generate a novel sequence library subset, how to extend and use seqdb_demo for the storage of sequence similarity search results and making use of various kinds of stored search results to address aspects of comparative genomic analysis.

  3. The mediating effect of calling on the relationship between medical school students’ academic burnout and empathy

    PubMed Central

    2017-01-01

    Purpose This study is aimed at identifying the relationships between medical school students’ academic burnout, empathy, and calling, and determining whether their calling has a mediating effect on the relationship between academic burnout and empathy. Methods A mixed method study was conducted. One hundred twenty-seven medical students completed a survey. Scales measuring academic burnout, medical students’ empathy, and calling were utilized. For statistical analysis, correlation analysis, descriptive statistics analysis, and hierarchical multiple regression analyses were conducted. For qualitative approach, eight medical students participated in a focus group interview. Results The study found that empathy has a statistically significant, negative correlation with academic burnout, while having a significant, positive correlation with calling. Sense of calling proved to be an effective mediator of the relationship between academic burnout and empathy. Conclusion This result demonstrates that calling is a key variable that mediates the relationship between medical students’ academic burnout and empathy. As such, this study provides baseline data for an education that could improve medical students’ empathy skills. PMID:28870019

  4. Recurrence interval analysis of trading volumes

    NASA Astrophysics Data System (ADS)

    Ren, Fei; Zhou, Wei-Xing

    2010-06-01

    We study the statistical properties of the recurrence intervals τ between successive trading volumes exceeding a certain threshold q. The recurrence interval analysis is carried out for the 20 liquid Chinese stocks covering a period from January 2000 to May 2009, and two Chinese indices from January 2003 to April 2009. Similar to the recurrence interval distribution of the price returns, the tail of the recurrence interval distribution of the trading volumes follows a power-law scaling, and the results are verified by the goodness-of-fit tests using the Kolmogorov-Smirnov (KS) statistic, the weighted KS statistic and the Cramér-von Mises criterion. The measurements of the conditional probability distribution and the detrended fluctuation function show that both short-term and long-term memory effects exist in the recurrence intervals between trading volumes. We further study the relationship between trading volumes and price returns based on the recurrence interval analysis method. It is found that large trading volumes are more likely to occur following large price returns, and the comovement between trading volumes and price returns is more pronounced for large trading volumes.
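The recurrence-interval construction itself is simple to sketch. Below, a heavy-tailed synthetic series stands in for trading volumes and the threshold q is set at the 90th percentile; both choices are illustrative only, not the paper's data or thresholds:

```python
import numpy as np

rng = np.random.default_rng(42)
volume = rng.pareto(a=3.0, size=10_000) + 1.0   # heavy-tailed stand-in series

q = np.quantile(volume, 0.90)                   # threshold: top 10% of "volumes"
exceed_times = np.flatnonzero(volume > q)       # time indices of exceedances
tau = np.diff(exceed_times)                     # recurrence intervals between them

print(f"threshold q = {q:.2f}")
print(f"intervals: {tau.size}, mean interval: {tau.mean():.1f}")
```

With 10% of samples exceeding q, the mean recurrence interval is close to 10 by construction; the paper's analysis concerns the shape (power-law tail) and memory of the distribution of `tau`, not its mean.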

  5. Genome-wide association analysis of secondary imaging phenotypes from the Alzheimer's disease neuroimaging initiative study.

    PubMed

    Zhu, Wensheng; Yuan, Ying; Zhang, Jingwen; Zhou, Fan; Knickmeyer, Rebecca C; Zhu, Hongtu

    2017-02-01

    The aim of this paper is to systematically evaluate a biased sampling issue associated with genome-wide association analysis (GWAS) of imaging phenotypes in most imaging genetic studies, including the Alzheimer's Disease Neuroimaging Initiative (ADNI). Specifically, the original sampling scheme of these studies is primarily a retrospective case-control design, whereas most existing statistical analyses ignore this sampling scheme and directly correlate imaging phenotypes (the secondary traits) with genotype. Although it is well documented in genetic epidemiology that ignoring the case-control sampling scheme can produce highly biased estimates, and subsequently lead to misleading results and suspicious associations, such findings are not well documented in imaging genetics. We use extensive simulations and a large-scale imaging genetic data analysis of the ADNI data to evaluate the effects of the case-control sampling scheme on GWAS results obtained with standard statistical methods, such as linear regression, comparing them with several advanced statistical methods that appropriately adjust for the case-control sampling scheme. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Recurrence interval analysis of trading volumes.

    PubMed

    Ren, Fei; Zhou, Wei-Xing

    2010-06-01

    We study the statistical properties of the recurrence intervals τ between successive trading volumes exceeding a certain threshold q. The recurrence interval analysis is carried out for the 20 liquid Chinese stocks covering a period from January 2000 to May 2009, and two Chinese indices from January 2003 to April 2009. Similar to the recurrence interval distribution of the price returns, the tail of the recurrence interval distribution of the trading volumes follows a power-law scaling, and the results are verified by the goodness-of-fit tests using the Kolmogorov-Smirnov (KS) statistic, the weighted KS statistic and the Cramér-von Mises criterion. The measurements of the conditional probability distribution and the detrended fluctuation function show that both short-term and long-term memory effects exist in the recurrence intervals between trading volumes. We further study the relationship between trading volumes and price returns based on the recurrence interval analysis method. It is found that large trading volumes are more likely to occur following large price returns, and the comovement between trading volumes and price returns is more pronounced for large trading volumes.

  7. 2D Affine and Projective Shape Analysis.

    PubMed

    Bryner, Darshan; Klassen, Eric; Huiling Le; Srivastava, Anuj

    2014-05-01

    Current techniques for shape analysis tend to seek invariance to similarity transformations (rotation, translation, and scale), but certain imaging situations require invariance to larger groups, such as affine or projective groups. Here we present a general Riemannian framework for shape analysis of planar objects where metrics and related quantities are invariant to affine and projective groups. Highlighting two possibilities for representing object boundaries-ordered points (or landmarks) and parameterized curves-we study different combinations of these representations (points and curves) and transformations (affine and projective). Specifically, we provide solutions to three out of four situations and develop algorithms for computing geodesics and intrinsic sample statistics, leading up to Gaussian-type statistical models, and classifying test shapes using such models learned from training data. In the case of parameterized curves, we also achieve the desired goal of invariance to re-parameterizations. The geodesics are constructed by particularizing the path-straightening algorithm to geometries of current manifolds and are used, in turn, to compute shape statistics and Gaussian-type shape models. We demonstrate these ideas using a number of examples from shape and activity recognition.

  8. Quantitative Analysis of Venus Radar Backscatter Data in ArcGIS

    NASA Technical Reports Server (NTRS)

    Long, S. M.; Grosfils, E. B.

    2005-01-01

    Ongoing mapping of the Ganiki Planitia (V14) quadrangle of Venus and definition of material units has involved an integrated but qualitative analysis of Magellan radar backscatter images and topography using standard geomorphological mapping techniques. However, such analyses do not take full advantage of the quantitative information contained within the images. Analysis of the backscatter coefficient allows a much more rigorous statistical comparison between mapped units, permitting first-order self-similarity tests of geographically separated materials assigned identical geomorphological labels. Such analyses cannot be performed directly on pixel (DN) values from Magellan backscatter images, because the pixels are scaled to the Muhleman law for radar echoes on Venus and are not corrected for latitudinal variations in incidence angle. Therefore, DN values must be converted, based on pixel latitude, back to their backscatter coefficient values before accurate statistical analysis can occur. Here we present a method for performing the conversions and analysis of Magellan backscatter data using commonly available ArcGIS software and illustrate the advantages of the process for geological mapping.
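The DN-to-backscatter conversion can be sketched roughly as below. The Muhleman coefficients are the commonly quoted Venus values, but the DN offset and dB-per-DN step are placeholder assumptions; the actual encoding must be taken from the Magellan data documentation:

```python
import numpy as np

def muhleman(theta_rad, k1=0.0118, k2=0.111):
    """Muhleman backscatter law; k1, k2 are the commonly quoted
    Venus coefficients."""
    c, s = np.cos(theta_rad), np.sin(theta_rad)
    return k1 * c / (s + k2 * c) ** 3

def dn_to_sigma0(dn, incidence_deg, dn_offset=101.0, db_per_dn=0.2):
    """Convert image DN to an absolute backscatter coefficient, assuming
    DN encodes backscatter in dB relative to the Muhleman law at the
    pixel's incidence angle. dn_offset and db_per_dn are placeholder
    values, NOT the documented Magellan constants."""
    rel_db = (np.asarray(dn, float) - dn_offset) * db_per_dn
    return muhleman(np.radians(incidence_deg)) * 10.0 ** (rel_db / 10.0)

# Under these placeholders, DN 101 maps to the Muhleman prediction
# itself and DN 151 is +10 dB relative to it.
sigma0 = dn_to_sigma0([101.0, 151.0], incidence_deg=30.0)
print(sigma0)
```

In practice the incidence angle varies with latitude, which is exactly why a per-pixel correction is needed before units can be compared statistically.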

  9. A naturalistic study on the relationship among resilient factors, psychiatric symptoms, and psychosocial functioning in a sample of residential patients with psychosis.

    PubMed

    Poloni, Nicola; Zizolfi, Daniele; Ielmini, Marta; Pagani, Roberto; Caselli, Ivano; Diurni, Marcello; Milano, Anna; Callegari, Camilla

    2018-01-01

    Resilience is a multidimensional process of adaptation aimed at overcoming stressful or traumatic life experiences; only in the last few years has it been considered a personal resource in psychosis and schizophrenia. This study aimed to assess the relationship between intrapersonal and interpersonal resilience factors and schizophrenia, particularly whether and how resilience can improve the course of psychotic illness. In this observational study, all patients recruited had to fulfill the following inclusion criteria: diagnosis of a schizophrenia spectrum disorder (Diagnostic and Statistical Manual of Mental Disorders-5); aged between 18 and 65 years; written informed consent provided; clinically stable (Clinical Global Impression Scale <3); history of illness ≥5 years; compliant with antipsychotic therapy over the last year; and regular attendance at monthly psychiatric visits. Patients were evaluated with the following scales: Resilience Scale for Adults (RSA) for resilience; Brief Psychiatric Rating Scale-Anchored version (BPRS-A), Scale for the Assessment of Negative Symptoms (SANS), and Scale for the Assessment of Positive Symptoms (SAPS) for psychotic symptomatology; and Life Skills Profile (LSP) for psychosocial functioning. Statistical analysis was performed with SPSS. Partial correlations were evaluated to assess the relationship between RSA total scores and subscores and BPRS-A, SANS, SAPS, and LSP total scores, removing the common variance among variables. Then, a series of hierarchical multiple linear regression models were used to examine the association between resilience, psychopathology, and psychosocial functioning. A statistically significant negative correlation between intrapersonal resilience factors and BPRS-A total score emerged, predicting psychiatric symptom severity and explaining approximately 31% of the BPRS-A variance; conversely, only the interpersonal resilience factors associated with social support were statistically and positively correlated with LSP total score, predicting psychosocial functioning and explaining 11% of the LSP variance. The specific contribution that resilience factors may make in predicting the severity of symptoms and the extent of psychosocial functioning emphasizes the importance of personalizing treatment for patients affected by schizophrenia, promoting personal resources, and translating them into better outcomes.

  10. Towards an automatic statistical model for seasonal precipitation prediction and its application to Central and South Asian headwater catchments

    NASA Astrophysics Data System (ADS)

    Gerlitz, Lars; Gafurov, Abror; Apel, Heiko; Unger-Sayesteh, Katy; Vorogushyn, Sergiy; Merz, Bruno

    2016-04-01

    Statistical climate forecast applications typically utilize a small set of large-scale SST or climate indices, such as ENSO, PDO or AMO, as predictor variables. If the predictive skill of these large-scale modes is insufficient, specific predictor variables such as customized SST patterns are frequently included. Hence, statistically based climate forecast models are either based on a fixed number of climate indices (and thus might not consider important predictor variables) or are highly site specific and barely transferable to other regions. With the aim of developing an operational seasonal forecast model that is easily transferable to any region in the world, we present a generic data mining approach which automatically selects potential predictors from gridded SST observations and reanalysis-derived large-scale atmospheric circulation patterns and generates robust statistical relationships with subsequent precipitation anomalies for user-selected target regions. Potential predictor variables are derived by means of a cell-wise correlation analysis of precipitation anomalies with gridded global climate variables under consideration of varying lead times. Significantly correlated grid cells are subsequently aggregated into predictor regions by means of a variability-based cluster analysis. Finally, for every month and lead time, an individual random-forest-based forecast model is automatically calibrated and evaluated using the previously generated predictor variables. The model is applied and evaluated for selected headwater catchments in Central and South Asia. Particularly for winter and spring precipitation (which is associated with westerly disturbances in the entire target domain), the model shows solid results with correlation coefficients up to 0.7, although the variability of precipitation rates is strongly underestimated. Likewise, the model shows some skill for the monsoonal precipitation amounts in the South Asian target areas. The skill of the model for the dry summer season in Central Asia and the transition seasons over South Asia is low. A sensitivity analysis using well-known climate indices reveals the major large-scale controlling mechanisms of the seasonal precipitation climate of each target area. For the Central Asian target areas, both the El Niño-Southern Oscillation (ENSO) and the North Atlantic Oscillation (NAO) are identified as important controlling factors for precipitation totals during the moist spring season. Drought conditions are found to be triggered by a warm ENSO phase in combination with a positive phase of the NAO. For the monsoonal summer precipitation amounts over South Asia, the model suggests a distinct negative response to El Niño events.
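The predictor-selection step (cell-wise correlation of precipitation anomalies with a lagged gridded field) might look like this on toy data; the field, lead time, and significance cutoff are all illustrative assumptions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_months, n_cells = 120, 200  # 10 years of monthly data on a toy grid
lead = 3                      # hypothetical lead time in months

# Hypothetical gridded predictor field (e.g. SST anomalies); cell 0 is
# constructed to carry real signal at the chosen lead time.
sst = rng.normal(size=(n_months, n_cells))
precip = rng.normal(scale=0.6, size=n_months)
precip[lead:] += 0.8 * sst[:-lead, 0]

# Cell-wise correlation of precipitation anomalies with the lagged field.
x, y = sst[:-lead], precip[lead:]
p = np.array([stats.pearsonr(x[:, j], y)[1] for j in range(n_cells)])

# Keep significantly correlated cells as candidate predictor regions;
# in the full method these would then be clustered and fed to a
# random-forest forecast model.
significant = np.flatnonzero(p < 0.01)
print(significant)
```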

  11. Quantification of improvements in an operational global-scale ocean thermal analysis system. (Reannouncement with new availability information)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clancy, R.M.; Harding, J.M.; Pollak, K.D.

    1992-02-01

    Global-scale analyses of ocean thermal structure produced operationally at the U.S. Navy's Fleet Numerical Oceanography Center are verified, along with an ocean thermal climatology, against unassimilated bathythermograph (bathy), satellite multichannel sea surface temperature (MCSST), and ship sea surface temperature (SST) data. Verification statistics are calculated from the three types of data for February-April of 1988 and February-April of 1990 in nine verification areas covering most of the open ocean in the Northern Hemisphere. The analyzed thermal fields were produced by version 1.0 of the Optimum Thermal Interpolation System (OTIS 1.0) in 1988, but by an upgraded version of this model, referred to as OTIS 1.1, in 1990. OTIS 1.1 employs exactly the same analysis methodology as OTIS 1.0. The principal difference is that OTIS 1.1 has twice the spatial resolution of OTIS 1.0 and consequently uses smaller spatial decorrelation scales and noise-to-signal ratios. As a result, OTIS 1.1 is able to represent more horizontal detail in the ocean thermal fields than its predecessor. Verification statistics for the SST fields derived from bathy and MCSST data are consistent with each other, showing similar trends and error levels. These data indicate that the analyzed SST fields are more accurate in 1990 than in 1988, and generally more accurate than climatology for both years. Verification statistics for the SST fields derived from ship data are inconsistent with those derived from the bathy and MCSST data, and show much higher error levels indicative of observational noise.

  12. The discriminative capacity of CBCL/1½-5-DSM5 scales to identify disruptive and internalizing disorders in preschool children.

    PubMed

    de la Osa, Nuria; Granero, Roser; Trepat, Esther; Domenech, Josep Maria; Ezpeleta, Lourdes

    2016-01-01

    This paper studies the discriminative capacity of the CBCL/1½-5 (Manual for the ASEBA Preschool-Age Forms & Profiles, University of Vermont, Research Center for Children, Youth, & Families, Burlington, 2000) DSM5 scales for attention deficit and hyperactivity disorder (ADHD), oppositional defiant disorder (ODD), anxiety and depressive problems in detecting the presence of DSM5 (Diagnostic and Statistical Manual of Mental Disorders, APA, Arlington, 2013) ADHD, ODD, anxiety and mood disorders, assessed through a diagnostic interview, in children aged 3-5. Additionally, we compare the clinical utility of the CBCL/1½-5-DSM5 scales with that of the analogous CBCL/1½-5 syndrome scales. A large community sample of 616 preschool children was assessed longitudinally across this age range. Statistical analysis was based on ROC procedures and binary logistic regressions. The ADHD and ODD CBCL/1½-5-DSM5 scales achieved good discriminative ability in identifying ADHD and ODD interview diagnoses at every age. The discriminative capacity of the CBCL/1½-5-DSM5 anxiety scale was fair for unspecific anxiety disorders in all age groups. The CBCL/1½-5-DSM5 depressive problems scale showed the poorest discriminative capacity for mood disorders (including depressive episode with insufficient symptoms), oscillating in the poor-to-fair range. As a whole, the DSM5-oriented scales generally did not show better discriminative capacity than the syndrome scales in identifying DSM5 diagnoses. The CBCL/1½-5-DSM5 scales discriminate externalizing disorders better than internalizing disorders at ages 3-5. Scores on the ADHD and ODD CBCL/1½-5-DSM5 scales can be used to screen for DSM5 ADHD and ODD disorders in general populations of preschool children.
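A minimal sketch of the ROC machinery behind such discriminative-capacity analyses, using the rank-based (Mann-Whitney) definition of the area under the curve on simulated scale scores; the score distributions are hypothetical:

```python
import numpy as np

def roc_auc(scores_pos, scores_neg):
    """AUC as the probability that a randomly chosen case scores higher
    than a randomly chosen non-case (Mann-Whitney U / (n_pos * n_neg)),
    counting ties as one half."""
    pos = np.asarray(scores_pos, float)[:, None]
    neg = np.asarray(scores_neg, float)[None, :]
    wins = (pos > neg).sum() + 0.5 * (pos == neg).sum()
    return wins / (pos.size * neg.size)

rng = np.random.default_rng(3)
# Hypothetical checklist-style scale scores for children with and
# without an interview-based diagnosis.
diagnosed = rng.normal(10, 3, 60)
not_diagnosed = rng.normal(6, 3, 400)
auc = roc_auc(diagnosed, not_diagnosed)
print(round(auc, 3))
```

An AUC near 0.5 means no discrimination and values approaching 1.0 mean good discrimination, which is the scale on which "poor", "fair" and "good" discriminative capacity are usually judged.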

  13. Impact of Deforestation and Recovery on Streamflow Recession Statistics

    NASA Astrophysics Data System (ADS)

    Krapu, C.; Kumar, M.

    2016-12-01

    Deforestation is known to influence streamflow, and baseflow in particular, in sub-humid environments. Baseflow contributions to the recession limb of a flood hydrograph convey information about the subsurface stores from which trees also draw water. Recent works based on the assumptions outlined by Brutsaert and Nieber (1977) have proposed analyzing streamflow recession curves on a per-event basis. In this framework, each event's recession curve is governed by a power-law relation with per-event scale and shape coefficients. As streamflow recession depends in part upon evapotranspiration demand from trees, these coefficients are hypothesized to contain useful information about catchment vegetation. We analyzed 13 small experimental catchments in the eastern United States with known forest treatment histories to determine whether streamflow recession behavior, as observed from daily discharge records, could serve as an indicator of deforestation in the drainage basin. Power-law scale coefficients were calculated for each major stormflow event at each test site, and the distributions of fitted coefficients were statistically compared between pre-treatment and post-treatment events as well as between pre-treatment and post-recovery events. A second method, using these fitted coefficients in conjunction with Gaussian process regression, was employed to track the change in the scale coefficient in the 13 catchments described previously as well as in two medium-sized catchments in the North Carolina portion of the American Piedmont that did not have extensive records of forest cover. A linear trend analysis of precipitation was performed to determine whether nonstationarity in rainfall could be a confounding cause of changes in event scale coefficients. The results show a statistically significant difference in scale coefficient values in 5/8 treatment catchments and 0/5 control catchments. This suggests that lesser alterations to forest cover may not be detectable, but that the method is robust against changes in precipitation. Additionally, we found clear evidence that forest regrowth in the Piedmont sites continued from 1940 to 1970. As a proof of concept, this work suggests that major alterations to forest cover can be inferred from daily stream discharge data.
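The per-event Brutsaert-Nieber fit (-dQ/dt = aQ^b, estimated in log-log space) can be sketched as follows on a synthetic recession; the coefficient values and event length are illustrative:

```python
import numpy as np

def recession_coefficients(q):
    """Fit -dQ/dt = a * Q^b to one recession limb (daily discharge,
    strictly decreasing) by least squares in log-log space; returns
    (a, b), the per-event scale and shape coefficients."""
    q = np.asarray(q, float)
    dqdt = np.diff(q)                # per-day change, negative on a recession
    qm = 0.5 * (q[1:] + q[:-1])      # midpoint discharge for each step
    keep = dqdt < 0
    X = np.column_stack([np.ones(keep.sum()), np.log(qm[keep])])
    beta, *_ = np.linalg.lstsq(X, np.log(-dqdt[keep]), rcond=None)
    return np.exp(beta[0]), beta[1]

# Synthetic recession generated from dQ/dt = -a Q^b with a=0.05, b=1.5.
a_true, b_true = 0.05, 1.5
q = [10.0]
for _ in range(30):
    q.append(q[-1] - a_true * q[-1] ** b_true)
a_fit, b_fit = recession_coefficients(np.array(q))
print(a_fit, b_fit)
```

Repeating this fit for every stormflow event yields the per-event distribution of the scale coefficient a that the study compares before and after forest treatment.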

  14. A randomized approach to speed up the analysis of large-scale read-count data in the application of CNV detection.

    PubMed

    Wang, WeiBo; Sun, Wei; Wang, Wei; Szatkiewicz, Jin

    2018-03-01

    The application of high-throughput sequencing in a broad range of quantitative genomic assays (e.g., DNA-seq, ChIP-seq) has created a high demand for the analysis of large-scale read-count data. Typically, the genome is divided into tiling windows and windowed read-count data are generated for the entire genome, from which genomic signals are detected (e.g., copy number changes in DNA-seq, enrichment peaks in ChIP-seq). For accurate analysis of read-count data, many state-of-the-art statistical methods use generalized linear models (GLM) coupled with the negative-binomial (NB) distribution, leveraging its ability for simultaneous bias correction and signal detection. However, although statistically powerful, the GLM+NB method has a quadratic computational complexity and therefore suffers from slow running time when applied to large-scale windowed read-count data. In this study, we aimed to substantially speed up the GLM+NB method by using a randomized algorithm, and we demonstrate the utility of our approach in the application of detecting copy number variants (CNVs) using a real example. We propose an efficient estimator, the randomized GLM+NB coefficients estimator (RGE), for speeding up the GLM+NB method. RGE samples the read-count data and solves the estimation problem on a smaller scale. We first theoretically validated the consistency and the variance properties of RGE. We then applied RGE to GENSENG, a GLM+NB based method for detecting CNVs, naming the resulting method "R-GENSENG". Based on extensive evaluation using both simulated and empirical data, we conclude that R-GENSENG is ten times faster than the original GENSENG while maintaining GENSENG's accuracy in CNV detection. Our results suggest that the RGE strategy could be applied to other GLM+NB based read-count analyses, e.g., ChIP-seq data analysis, to substantially improve their computational efficiency while preserving analytic power.
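The randomized-subsampling idea behind RGE can be illustrated with a simple sketch. Note that plain least squares on log counts stands in here for the paper's NB GLM, purely to show the estimator structure (fit on a uniform subsample instead of all windows); the data and dimensions are simulated:

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 200_000, 5  # genomic windows x covariates (e.g. GC, mappability)

# Hypothetical windowed read-count design with a known log-linear signal.
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = np.array([2.0, 0.3, -0.2, 0.1, 0.05])
counts = rng.poisson(np.exp(X @ beta_true))
y = np.log1p(counts)  # stand-in response; the real method models counts via NB

def rge_like(X, y, m, rng):
    """Randomized coefficient estimator: solve the regression on a
    uniform subsample of m rows instead of all n."""
    idx = rng.choice(len(y), size=m, replace=False)
    beta, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
    return beta

beta_full, *_ = np.linalg.lstsq(X, y, rcond=None)
beta_sub = rge_like(X, y, m=20_000, rng=rng)
print(np.max(np.abs(beta_sub - beta_full)))
```

With a tenth of the rows, the subsampled coefficients track the full-data fit closely, which is the consistency property the paper establishes for the actual NB-GLM estimator.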

  15. Relative spatial soil geochemical variability along two transects across the United States and Canada

    USGS Publications Warehouse

    Garrett, Robert G.

    2009-01-01

    The patterns of relative variability differ by transect and horizon. The N–S transect A-horizon soils show significant between-40-km scale variability for 29 elements, with only 4 elements (Ca, Mg, Pb and Sr) showing in excess of 50% of their variability at the within-40-km and ‘at-site’ scales. In contrast, the C-horizon data demonstrate significant between-40-km scale variability for 26 elements, with 21 having in excess of 50% of their variability at the within-40-km and ‘at-site’ scales. In 36 instances, the ‘at-site’ variability is statistically significant in terms of the sample preparation and analysis variability. It is postulated that this contrast between the A- and C- horizons along the N–S transect, that is dominated by agricultural land uses, is due to the local homogenization of Ap-horizon soils by tillage reducing the ‘at-site’ variability. The spatial variability is distributed similarly between scales for the A- and C-horizon soils of the E–W transect. For all elements, there is significant variability at the within-40-km scale. Notwithstanding this, there is significant between-40-km variability for 28 and 20 of the elements in the A- and C-horizon data, respectively. The differences between the two transects are attributed to (1) geology, the N–S transect runs generally parallel to regional strikes, whereas the E–W transect runs across regional structures and lithologies; and (2) land use, with agricultural tillage dominating along the N–S transect. The spatial analysis of the transect data indicates that continental-scale maps demonstrating statistically significant patterns of geochemical variability may be prepared for many elements from data on soil samples collected on a 40 x 40 km grid or similar sampling designs resulting in a sample density of 1 site per 1600 km2.

  16. Clinical utility and validation of the Couple's Communicative Evaluation Scale.

    PubMed

    West, Craig E

    2005-10-01

    This study assessed the validity and clinical utility of a new test, the Couple's Communicative Evaluation Scale. With 24 couples recruited from a variety of sources, e.g., churches, newspapers, and colleges, a discriminant analysis using the Dyadic Adjustment Scale indicated that satisfied couples could be discriminated from dissatisfied couples with 91-96% accuracy. Significant differences in scale means were found between 7 distressed and 16 nondistressed couples using the satisfaction/dissatisfaction cutoff score of 200 on the Dyadic Adjustment Scale, and significant differences in individual scale means were found between 16 distressed and 31 nondistressed individuals using the satisfaction/dissatisfaction cutoff score of 100 on the Dyadic Adjustment Scale. Demographic variables, e.g., age and marriage length, were statistically significant. Scale scores were highly correlated with those on the Dyadic Adjustment Scale, indicating good validity. Using all 400 items, an alpha of .99 indicated good internal consistency for the verbal, nonverbal, and listening communication scores.

  17. Polypropylene Production Optimization in Fluidized Bed Catalytic Reactor (FBCR): Statistical Modeling and Pilot Scale Experimental Validation

    PubMed Central

    Khan, Mohammad Jakir Hossain; Hussain, Mohd Azlan; Mujtaba, Iqbal Mohammed

    2014-01-01

    Polypropylene is a plastic that is widely used in everyday life. This study focuses on the identification and justification of the optimum process parameters for polypropylene production in a novel pilot-plant-based fluidized bed reactor. This first-of-its-kind statistical modeling with experimental validation of the process parameters for polypropylene production was conducted by applying the ANOVA (analysis of variance) method within Response Surface Methodology (RSM). Three important process variables, i.e., reaction temperature, system pressure and hydrogen percentage, were considered as the important input factors for polypropylene production in the analysis performed. In order to examine the effect of process parameters and their interactions, the ANOVA method was utilized among a range of other statistical diagnostic tools, such as the correlation between actual and predicted values, the residuals and predicted response, the outlier t plot, and 3D response surface and contour analysis plots. The statistical analysis showed that the proposed quadratic model had a good fit with the experimental results. At the optimum conditions, with a temperature of 75°C, system pressure of 25 bar and hydrogen percentage of 2%, the highest polypropylene production obtained is 5.82% per pass. Hence it is concluded that the developed experimental design and proposed model can be successfully employed, with over a 95% confidence level, for optimum polypropylene production in a fluidized bed catalytic reactor (FBCR). PMID:28788576
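A rough sketch of the RSM workflow described above — fitting a full second-order response surface and locating its stationary point — on simulated coded factors; the simulated yield surface, noise level, and optimum are hypothetical, not the study's values:

```python
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical coded factors: temperature, pressure, hydrogen fraction.
T, P, H = [rng.uniform(-1, 1, 60) for _ in range(3)]

def quad_design(T, P, H):
    """Full second-order (linear + interaction + quadratic) RSM design."""
    return np.column_stack([np.ones_like(T), T, P, H,
                            T * P, T * H, P * H, T**2, P**2, H**2])

# Simulated yield with a known optimum so the fitted surface is checkable.
y = (5.8 - 1.2 * (T - 0.5)**2 - 0.8 * (P - 0.2)**2 - 0.5 * (H + 0.3)**2
     + rng.normal(scale=0.05, size=T.size))

beta, *_ = np.linalg.lstsq(quad_design(T, P, H), y, rcond=None)
# Stationary point of the fitted quadratic: solve grad = 0.
A = np.array([[2 * beta[7], beta[4], beta[5]],
              [beta[4], 2 * beta[8], beta[6]],
              [beta[5], beta[6], 2 * beta[9]]])
opt = np.linalg.solve(A, -beta[1:4])
print(opt)
```

ANOVA on the fitted terms would then test which linear, interaction, and quadratic effects are significant, as the abstract describes.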

  18. Statistical trend analysis and extreme distribution of significant wave height from 1958 to 1999 - an application to the Italian Seas

    NASA Astrophysics Data System (ADS)

    Martucci, G.; Carniel, S.; Chiggiato, J.; Sclavo, M.; Lionello, P.; Galati, M. B.

    2010-06-01

    The study is a statistical analysis of sea-state time series derived using the wave model WAM forced by the ERA-40 dataset in selected areas near the Italian coasts. For the period 1 January 1958 to 31 December 1999 the analysis yields: (i) the existence of a negative trend in the annual- and winter-averaged sea-state heights; (ii) the existence of a turning point in the late 1980s in the annual-averaged trend of sea-state heights at a site in the Northern Adriatic Sea; (iii) the overall absence of a significant trend in the annual-averaged mean durations of sea states over thresholds; (iv) the assessment of the extreme values on a time scale of a thousand years. The analysis uses two methods to obtain samples of extremes from the independent sea states: the r-largest annual maxima and the peak-over-threshold. The two methods show statistical differences in retrieving the return values and, more generally, in describing the significant wave field. The r-largest annual maxima method provides more reliable predictions of the extreme values, especially for small return periods (<100 years). Finally, the study statistically confirms the existence of decadal negative trends in the significant wave heights and thereby conveys useful information on the wave climatology of the Italian seas during the second half of the 20th century.
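The two extreme-sampling methods compared in the study can be sketched as follows (with r = 1 the r-largest method reduces to classic annual maxima; a proper r-largest fit uses a joint likelihood not shown here). The wave-height data below are synthetic:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
# Hypothetical 42 years of 6-hourly significant wave heights (m).
years, per_year = 42, 1460
hs = rng.weibull(1.6, size=(years, per_year)) * 1.2

# Method 1: r-largest annual values (r=1 gives classic annual maxima),
# fitted with a GEV distribution.
annual_max = hs.max(axis=1)
c, loc, scale = stats.genextreme.fit(annual_max)
# 100-year return level = level with exceedance probability 1/100 per year.
rl_100 = stats.genextreme.ppf(1 - 1 / 100, c, loc, scale)

# Method 2: peak-over-threshold exceedances above a high quantile,
# fitted with a generalized Pareto distribution.
u = np.quantile(hs, 0.995)
excess = hs[hs > u] - u
c2, loc2, scale2 = stats.genpareto.fit(excess, floc=0.0)

print(rl_100, excess.size)
```

Comparing return levels from the two fits is exactly where the study finds the methods diverge, particularly for short return periods.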

  19. Dynamics of land change in India: a fine-scale spatial analysis

    NASA Astrophysics Data System (ADS)

    Meiyappan, P.; Roy, P. S.; Sharma, Y.; Jain, A. K.; Ramachandran, R.; Joshi, P. K.

    2015-12-01

    Land is scarce in India: India occupies 2.4% of worlds land area, but supports over 1/6th of worlds human and livestock population. This high population to land ratio, combined with socioeconomic development and increasing consumption has placed tremendous pressure on India's land resources for food, feed, and fuel. In this talk, we present contemporary (1985 to 2005) spatial estimates of land change in India using national-level analysis of Landsat imageries. Further, we investigate the causes of the spatial patterns of change using two complementary lines of evidence. First, we use statistical models estimated at macro-scale to understand the spatial relationships between land change patterns and their concomitant drivers. This analysis using our newly compiled extensive socioeconomic database at village level (~630,000 units), is 100x higher in spatial resolution compared to existing datasets, and covers over 200 variables. The detailed socioeconomic data enabled the fine-scale spatial analysis with Landsat data. Second, we synthesized information from over 130 survey based case studies on land use drivers in India to complement our macro-scale analysis. The case studies are especially useful to identify unobserved variables (e.g. farmer's attitude towards risk). Ours is the most detailed analysis of contemporary land change in India, both in terms of national extent, and the use of detailed spatial information on land change, socioeconomic factors, and synthesis of case studies.

  20. The Validity and Reliability of the Turkish Version of the Neonatal Skin Risk Assessment Scale.

    PubMed

    Sari, Çiğdem; Altay, Naime

    2017-03-01

    The study created a Turkish translation of the Neonatal Skin Risk Assessment Scale (NSRAS), which was developed by Huffines and Longsdon in 1997. The authors used a cross-sectional survey design to determine the validity and reliability of the Turkish translation. The study was conducted at the neonatal intensive care unit of a university hospital in Ankara between March 15 and June 30, 2014. The research sample included 130 neonatal assessments of 17 patients. Data were collected by questionnaire regarding the characteristics of the participating neonates, 7 nurse observers, and the NSRAS and its subarticles. After translation and back-translation were performed to assess the language validity of the scale, necessary corrections were made in line with expert suggestions, and content validity was ensured. Internal consistency of the scale was assessed by its homogeneity, Cronbach's α, and subarticle-to-overall-scale grade correlation. Cronbach's α for the scale overall was .88, and Cronbach's α values for the subarticles were between .83 and .90. Results showed a positive relationship among all the subarticles and the overall NSRAS scale grade (P < .01), with correlation values between 0.333 and 0.721. Exploratory and predictive factor analysis was applied for structural validity. Kaiser-Meyer-Olkin analysis was applied for sample sufficiency, and the Bartlett test was applied to assess the suitability of the sample for factor analysis. The Kaiser-Meyer-Olkin coefficient was 0.73, and the χ² value from the Bartlett test was statistically significant at an advanced level (P < .05). In the 6 subarticles of the scale and in the overall scale total grade, a high, positive, and significant relationship between the grades given by the researcher and those given by the nurse observers was found (P < .05). The Turkish NSRAS is reliable and valid.
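Cronbach's α, the internal-consistency statistic reported above, can be computed directly from an observations-by-items score matrix; the simulated six-subscale data below are illustrative, not the NSRAS data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (observations x items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total)."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(7)
# Hypothetical 130 assessments on a 6-subscale instrument sharing a
# common factor, loosely mimicking the NSRAS setting.
latent = rng.normal(size=(130, 1))
scores = latent + 0.7 * rng.normal(size=(130, 6))
print(round(cronbach_alpha(scores), 2))
```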

  1. Landscape object-based analysis of wetland plant functional types: the effects of spatial scale, vegetation classes and classifier methods

    NASA Astrophysics Data System (ADS)

    Dronova, I.; Gong, P.; Wang, L.; Clinton, N.; Fu, W.; Qi, S.

    2011-12-01

    Remote sensing-based vegetation classifications representing plant function, such as photosynthesis and productivity, are challenging in wetlands with complex cover and difficult field access. Recent advances in object-based image analysis (OBIA) and machine-learning algorithms offer new classification tools; however, few comparisons of different algorithms and spatial scales have been discussed to date. We applied OBIA to delineate wetland plant functional types (PFTs) for Poyang Lake, the largest freshwater lake in China and a Ramsar wetland conservation site, from a 30-m Landsat TM scene at the peak of the spring growing season. We targeted major PFTs (C3 grasses, C3 forbs and different types of C4 grasses and aquatic vegetation) that are both key players in the system's biogeochemical cycles and critical providers of waterbird habitat. Classification results were compared among: a) several object segmentation scales (with average object sizes 900-9000 m2); b) several families of statistical classifiers (including Bayesian, Logistic, Neural Network, Decision Trees and Support Vector Machines); and c) two hierarchical levels of vegetation classification, a generalized 3-class set and a more detailed 6-class set. We found that classification benefited from the object-based approach, which allowed object shape, texture and context descriptors to be included in classification. While a number of classifiers achieved high accuracy at the finest, pixel-equivalent segmentation scale, the highest accuracies and best agreement among algorithms occurred at coarser object scales. No single classifier was consistently superior across all scales, although selected algorithms of the Neural Network, Logistic and K-Nearest Neighbors families frequently provided the best discrimination of classes at different scales. The choice of vegetation categories also affected classification accuracy.
The 6-class set allowed for higher individual class accuracies but lower overall accuracies than the 3-class set because individual classes differed in scales at which they were best discriminated from others. Main classification challenges included a) presence of C3 grasses in C4-grass areas, particularly following harvesting of C4 reeds and b) mixtures of emergent, floating and submerged aquatic plants at sub-object and sub-pixel scales. We conclude that OBIA with advanced statistical classifiers offers useful instruments for landscape vegetation analyses, and that spatial scale considerations are critical in mapping PFTs, while multi-scale comparisons can be used to guide class selection. Future work will further apply fuzzy classification and field-collected spectral data for PFT analysis and compare results with MODIS PFT products.
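
    The overall-versus-individual accuracy trade-off noted above can be illustrated with a small sketch: overall accuracy is the confusion-matrix diagonal fraction, while per-class (producer's) accuracy is the diagonal cell over its row total. The counts and class labels below are invented, not from the study.

    ```python
    # Toy confusion matrix: rows = reference class, columns = mapped class.
    # Illustrates how overall and per-class accuracies are computed; the
    # counts and class names are hypothetical.

    def overall_accuracy(cm):
        """Fraction of all samples on the confusion-matrix diagonal."""
        total = sum(sum(row) for row in cm)
        correct = sum(cm[i][i] for i in range(len(cm)))
        return correct / total

    def per_class_accuracy(cm):
        """Producer's accuracy (recall) for each class: diagonal / row sum."""
        return [cm[i][i] / sum(cm[i]) for i in range(len(cm))]

    cm3 = [
        [80, 10, 10],   # C3 vegetation
        [12, 75, 13],   # C4 grasses
        [ 5, 15, 80],   # aquatic vegetation
    ]

    print(round(overall_accuracy(cm3), 3))          # diagonal fraction
    print([round(a, 2) for a in per_class_accuracy(cm3)])
    ```

    With a finer class set, individual rows can score well while the diagonal fraction drops, which is the pattern the abstract reports.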

  2. Effects of Large-Scale Solar Installations on Dust Mobilization and Air Quality

    NASA Astrophysics Data System (ADS)

    Pratt, J. T.; Singh, D.; Diffenbaugh, N. S.

    2012-12-01

    Large-scale solar projects are increasingly being developed worldwide, and many of these installations are located in arid, desert regions. To examine the effects of these projects on regional dust mobilization and air quality, we analyze aerosol product data from NASA's Multi-angle Imaging SpectroRadiometer (MISR) at annual and seasonal time intervals near fifteen photovoltaic and solar thermal stations ranging from 5 to 200 MW (12-4,942 acres) in size. The stations are distributed over eight different countries and were chosen based on size, location, and installation date; most of the installations are large-scale, are sited in desert climates, and were installed between 2006 and 2010. We also consider air quality measurements of particulate matter between 2.5 and 10 micrometers (PM10) from Environmental Protection Agency (EPA) monitoring sites near and downwind of the project installations in the U.S. We use monthly wind data from NOAA's National Centers for Environmental Prediction (NCEP) Global Reanalysis to select the stations downwind of the installations, and then perform statistical analysis on the data to identify any significant changes in these quantities. We find that fourteen of the fifteen regions have lower aerosol product values after the start of the installations, and all six PM10 monitoring stations show lower particulate matter measurements after construction commenced. The results nevertheless fail to show any statistically significant differences in aerosol optical index or PM10 measurements before and after the large-scale solar installations. However, many of the large installations are very recent, and there are insufficient data to fully understand the long-term effects on air quality. More data and higher-resolution analysis are necessary to better understand the relationship between large-scale solar power, dust, and air quality.
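
    A minimal sketch of the kind of before/after comparison described above, assuming invented monthly aerosol values (not the study's data): a Welch two-sample t statistic computed from scratch with the standard library.

    ```python
    # Hypothetical before/after site comparison; values are invented.
    # Welch's t allows unequal variances between the two periods.
    from statistics import mean, variance
    from math import sqrt

    before = [0.21, 0.19, 0.24, 0.22, 0.20, 0.23]  # monthly aerosol values
    after  = [0.18, 0.20, 0.19, 0.17, 0.21, 0.18]

    def welch_t(a, b):
        """Welch's two-sample t statistic (unequal-variance form)."""
        se = sqrt(variance(a) / len(a) + variance(b) / len(b))
        return (mean(a) - mean(b)) / se

    t = welch_t(before, after)
    print(round(t, 2))
    ```

    A real analysis would also compute degrees of freedom and a p-value (and, as the abstract notes, a non-significant result can coexist with nominally lower post-installation means).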

  3. [Effect of occupational stress on mental health].

    PubMed

    Yu, Shan-fa; Zhang, Rui; Ma, Liang-qing; Gu, Gui-zhen; Yang, Yan; Li, Kui-rong

    2003-02-01

    To study the effects of job psychological demands and job control on mental health, and their interaction, 93 male freight train dispatchers were evaluated using the revised Job Demand-Control Scale and 7 strain scales. Stepwise regression analysis, univariate ANOVA, and the Kruskal-Wallis H and Median tests were used in the statistical analysis. The Kruskal-Wallis H and Median test analysis revealed a difference in mental health scores among groups of decision latitude (mean ranks 55.57, 47.95, 48.42, 33.50, P < 0.05); the differences in scores of mental health (37.45, 40.01, 58.35), job satisfaction (53.18, 46.91, 32.43), daily life strains (33.00, 44.96, 56.12), and depression (36.45, 42.25, 53.61) among groups of job time demands were all statistically significant (P < 0.05). ANOVA showed that job time demands and decision latitude had interaction effects on physical complaints (R(2) = 0.24), state anxiety (R(2) = 0.26), and daytime fatigue (R(2) = 0.28) (P < 0.05). Regression analysis revealed a significant job time demands by job decision latitude interaction effect, as well as significant main effects of some of the independent variables on different job strains (R(2) > 0.05). Job time demands and job decision latitude have direct and interactive effects on psychosomatic health: the greater the time demands, the greater the psychological strain, and the effect of job time demands is greater than that of job decision latitude.
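
    A demands-by-latitude interaction of the kind reported above can be illustrated for a balanced 2x2 design as a difference of differences of cell means; the strain scores below are invented for illustration.

    ```python
    # Toy 2x2 design: (job time demands, decision latitude) -> strain scores.
    # The interaction contrast is nonzero when the effect of high demands
    # differs between the two latitude levels. All numbers are hypothetical.
    from statistics import mean

    cells = {
        ("low", "low"):   [40, 42, 41],
        ("low", "high"):  [38, 37, 39],
        ("high", "low"):  [55, 57, 56],
        ("high", "high"): [45, 44, 46],
    }

    m = {k: mean(v) for k, v in cells.items()}

    # Effect of high demands, separately under low and high latitude:
    effect_low_lat  = m[("high", "low")]  - m[("low", "low")]
    effect_high_lat = m[("high", "high")] - m[("low", "high")]
    interaction = effect_low_lat - effect_high_lat  # difference of differences

    print(effect_low_lat, effect_high_lat, interaction)
    ```

    A formal two-way ANOVA tests whether this contrast is statistically distinguishable from zero.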

  4. Atomic force microscopy and scanning electron microscopy evaluation of efficacy of scaling and root planing using magnification: A randomized controlled clinical study

    PubMed Central

    Mohan, Ranjana; Agrawal, Sudhanshu; Gundappa, Mohan

    2013-01-01

    Aim: A randomized controlled clinical study was undertaken to evaluate the effectiveness of scaling and root planing (SRP) using magnifying loupes (ML) and a dental operating microscope (DOM). Materials and Methods: A total of 90 human teeth scheduled for extraction from 18 patients aged between 25 and 65 years, suffering from generalized chronic severe periodontitis, were randomly assigned to three treatment groups. Group 1 consisted of SRP performed without magnification (unaided), Group 2 of SRP with ML, and Group 3 of SRP with DOM. Following extraction, samples were prepared for (i) evaluation of surface topography by atomic force microscopy, (ii) assessment of smear layer and debris by scanning electron microscopy, and (iii) elemental analysis by energy-dispersive X-ray analysis. Data were subjected to statistical analysis using analysis of variance, post-hoc (Tukey HSD), and Chi-square tests. Results: A statistically significant (P < 0.001) difference was found among the treatment groups. Group 3 was the most effective and Group 1 the least effective technique for SRP. The order of efficacy by surface was Palatal < Lingual < Distal ≃ Mesial < Buccal. Differences in efficacy between mandibular and maxillary teeth were significant (P < 0.05), as were those between anterior and posterior teeth (P < 0.05). Conclusion: Magnification tools significantly enhance the efficacy of supragingival and subgingival SRP. PMID:24124292

  5. Gender-, age-, and race/ethnicity-based differential item functioning analysis of the movement disorder society-sponsored revision of the Unified Parkinson's disease rating scale.

    PubMed

    Goetz, Christopher G; Liu, Yuanyuan; Stebbins, Glenn T; Wang, Lu; Tilley, Barbara C; Teresi, Jeanne A; Merkitch, Douglas; Luo, Sheng

    2016-12-01

    Assess MDS-UPDRS items for gender-, age-, and race/ethnicity-based differential item functioning. Assessing differential item functioning is a core rating-scale validation step. For the MDS-UPDRS, differential item functioning occurs if item-score probabilities among people with similar levels of parkinsonism differ according to selected covariates (gender, age, race/ethnicity). If the magnitude of differential item functioning is clinically relevant, item-score interpretation must consider influences by these covariates. Differential item functioning can be nonuniform (the covariate variably influences an item score across different levels of parkinsonism) or uniform (the covariate influences an item score consistently over all levels of parkinsonism). Using the MDS-UPDRS translation database of more than 5,000 PD patients from 14 languages, we tested gender-, age-, and race/ethnicity-based differential item functioning. To designate an item as having clinically relevant differential item functioning, we required statistical confirmation by 2 independent methods, along with a McFadden pseudo-R2 magnitude statistic greater than "negligible." Most items showed no gender-, age-, or race/ethnicity-based differential item functioning. When differential item functioning was identified, the magnitude statistic was always in the "negligible" range, and the scale-level impact was minimal. The absence of clinically relevant differential item functioning across all items and all parts of the MDS-UPDRS is strong evidence that the scale can be used confidently. As studies of Parkinson's disease increasingly involve multinational efforts and the MDS-UPDRS has several validated non-English translations, the findings support the scale's broad applicability in populations with varying gender, age, and race/ethnicity distributions. © 2016 International Parkinson and Movement Disorder Society.

  6. Analysis of Flow and Transport in non-Gaussian Heterogeneous Formations Using a Generalized Sub-Gaussian Model

    NASA Astrophysics Data System (ADS)

    Guadagnini, A.; Riva, M.; Neuman, S. P.

    2016-12-01

    Environmental quantities such as log hydraulic conductivity (or transmissivity), Y(x) = ln K(x), and their spatial (or temporal) increments, ΔY, are known to be generally non-Gaussian. Documented evidence of such behavior includes symmetric increment distributions at all separation scales (or lags) between incremental values of Y, with sharp peaks and heavy tails that decay asymptotically as lag increases. This statistical scaling occurs in porous as well as fractured media characterized by either one or a hierarchy of spatial correlation scales. In hierarchical media one observes a range of additional statistical ΔY scaling phenomena, all of which are captured comprehensively by a novel generalized sub-Gaussian (GSG) model. In this model Y forms a mixture Y(x) = U(x) G(x) of single- or multi-scale Gaussian processes G having random variances, U being a non-negative subordinator independent of G. Elsewhere we developed ways to generate unconditional and conditional random realizations of isotropic or anisotropic GSG fields which can be embedded in numerical Monte Carlo flow and transport simulations. Here we present and discuss expressions for probability distribution functions of Y and ΔY as well as their leading statistical moments. We then focus on a simple setting of mean uniform steady-state flow in an unbounded, two-dimensional domain, exploring ways in which non-Gaussian heterogeneity affects stochastic flow and transport descriptions. Our expressions represent (a) leading-order autocovariance and cross-covariance functions of hydraulic head, velocity, and advective particle displacement, as well as (b) analogues of preasymptotic and asymptotic Fickian dispersion coefficients. We compare them with corresponding expressions developed in the literature for Gaussian Y.
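
    A minimal sketch of the GSG construction Y(x) = U(x) G(x), simplifying G to i.i.d. standard normals and assuming a lognormal subordinator U; the paper's actual subordinator and spatial correlation structure are not reproduced here. The hallmark of such a normal variance mixture is positive excess kurtosis (sharp peak, heavy tails).

    ```python
    # Hedged sketch: draw Y = U * G with U lognormal (non-negative) and G
    # standard normal, then check that the sample has heavier-than-Gaussian
    # tails. The lognormal choice and sigma_u = 0.5 are illustrative.
    import random

    random.seed(7)

    def gsg_samples(n, sigma_u=0.5):
        """Draw n samples of Y = U * G, U lognormal, G standard normal."""
        return [random.lognormvariate(0.0, sigma_u) * random.gauss(0.0, 1.0)
                for _ in range(n)]

    y = gsg_samples(10_000)

    # Sample excess kurtosis: positive for any nondegenerate mixing U.
    m = sum(y) / len(y)
    m2 = sum((v - m) ** 2 for v in y) / len(y)
    m4 = sum((v - m) ** 4 for v in y) / len(y)
    excess = m4 / m2**2 - 3
    print(excess > 0)
    ```

    Replacing the i.i.d. normals with a correlated Gaussian field would recover the spatially structured version of the model.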

  7. Structure of small-scale magnetic fields in the kinematic dynamo theory.

    PubMed

    Schekochihin, Alexander; Cowley, Steven; Maron, Jason; Malyshkin, Leonid

    2002-01-01

    A weak fluctuating magnetic field embedded in a turbulent conducting medium grows exponentially while its characteristic scale decays. In the interstellar medium and protogalactic plasmas, the magnetic Prandtl number is very large, so a broad spectrum of growing magnetic fluctuations is excited at small (subviscous) scales. The condition for the onset of nonlinear back reaction depends on the structure of the field lines. We study the statistical correlations that are set up in the field pattern and show that the magnetic-field lines possess a folding structure, where most of the scale decrease is due to the field variation across itself (rapid transverse direction reversals), while the scale of the field variation along itself stays approximately constant. Specifically, we find that, though both the magnetic energy and the mean-square curvature of the field lines grow exponentially, the field strength and the field-line curvature are anticorrelated: the curved field is relatively weak, while the growing field is relatively flat. A detailed analysis of the curvature statistics shows that the curvature possesses a stationary limiting distribution, with its bulk located at values of curvature comparable to the characteristic wave number of the velocity field and a power tail extending to large values of curvature, where it is eventually cut off by resistive regularization. The regions of large curvature therefore occupy only a small fraction of the total volume of the system. Our theoretical results are corroborated by direct numerical simulations. The implication of the folding effect is that the onset of the Lorentz back reaction occurs when the magnetic energy approaches that of the smallest turbulent eddies. Our results also apply directly to the problem of the statistical geometry of material lines in a random flow.

  8. A PRACTICAL ONTOLOGY FOR THE LARGE-SCALE MODELING OF SCHOLARLY ARTIFACTS AND THEIR USAGE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    RODRIGUEZ, MARKO A.; BOLLEN, JOHAN; VAN DE SOMPEL, HERBERT

    2007-01-30

    The large-scale analysis of scholarly artifact usage is constrained primarily by current practices in usage data archiving, privacy issues concerned with the dissemination of usage data, and the lack of a practical ontology for modeling the usage domain. As a remedy to the third constraint, this article presents a scholarly ontology that was engineered to represent those classes for which large-scale bibliographic and usage data exist, supports usage research, and whose instantiation is scalable to the order of 50 million articles along with their associated artifacts (e.g., authors and journals) and an accompanying 1 billion usage events. The real-world instantiation of the presented abstract ontology is a semantic network model of the scholarly community which lends the scholarly process to statistical analysis and computational support. We present the ontology, discuss its instantiation, and provide some example inference rules for calculating various scholarly artifact metrics.

  9. Detector Development for the MARE Neutrino Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galeazzi, M.; Bogorin, D.; Molina, R.

    2009-12-16

    The MARE experiment is designed to measure the mass of the neutrino with sub-eV sensitivity by measuring the beta decay of {sup 187}Re with cryogenic microcalorimeters. A preliminary analysis shows that, to achieve the necessary statistics, between 10,000 and 50,000 detectors are likely necessary. We have fabricated and characterized Iridium transition edge sensors with high reproducibility and uniformity for such a large scale experiment. We have also started a full scale simulation of the experimental setup for MARE, including thermalization in the absorber, detector response, and optimum filter analysis, to understand the issues related to reaching a sub-eV sensitivity and to optimize the design of the MARE experiment. We present our characterization of the Ir devices, including reproducibility, uniformity, and sensitivity, and we discuss the implementation and capabilities of our full scale simulation.

  10. Optical technique for inner-scale measurement: possible astronomical applications.

    PubMed

    Masciadri, E; Vernin, J

    1997-02-20

    We propose an optical technique that allows us to estimate the inner scale by measuring the variance of angle-of-arrival fluctuations of collimated laser beams of different cross sections w(i) passing through a turbulent layer. To test the potential efficiency of the system, we made measurements on a turbulent air flow generated in the laboratory, the statistical properties of which are known and controlled, unlike atmospheric turbulence. We deduced a Kolmogorov behavior with a 6-mm inner scale and a 90-mm outer scale, in accordance with measurements by a more complicated technique using the same turbulent channel. Our proposed method is especially sensitive to inner-scale measurement and can be adapted easily to atmospheric turbulence analysis. We propose an outdoor experimental setup that should work in the less controlled conditions that can affect astronomical observations. The inner-scale assessment might be important when phase retrieval with Laplacian methods is used for adaptive optics purposes.

  11. Home Safety, Safe Behaviors of Elderly People, and Fall Accidents At Home

    ERIC Educational Resources Information Center

    Erkal, Sibel

    2010-01-01

    The present study analyzed home safety and safe behaviors against fall accidents of elderly people living at home. The study group comprised 121 people aged 65+ living in the catchment area of Ankara Mamak Halil Ulgen Health Center. Data were collected via a personal information form and Home-Screen Scale. Statistical analysis used an independent…

  12. Assessing Beaked Whale Reproduction and Stress Response Relative to Sonar Activity at the Atlantic Undersea Test and Evaluation Center (AUTEC)

    DTIC Science & Technology

    2015-09-30

    oil spills (unpublished data, Kellar). The second will be to conduct a more fine-scale analysis of the areas examined during this study. For this... REFERENCES: Carlin BP, Chib S (1995) Bayesian model choice via Markov-chain Monte-Carlo methods. Journal of the Royal Statistical Society.

  13. Data-Mining Techniques in Detecting Factors Linked to Academic Achievement

    ERIC Educational Resources Information Center

    Martínez Abad, Fernando; Chaparro Caso López, Alicia A.

    2017-01-01

    In light of the emergence of statistical analysis techniques based on data mining in education sciences, and the potential they offer to detect non-trivial information in large databases, this paper presents a procedure used to detect factors linked to academic achievement in large-scale assessments. The study is based on a non-experimental,…

  14. Applications of artificial intelligence systems in the analysis of epidemiological data.

    PubMed

    Flouris, Andreas D; Duffy, Jack

    2006-01-01

    A brief review of the germane literature suggests that the use of artificial intelligence (AI) statistical algorithms in epidemiology has been limited. We discuss the advantages and disadvantages of using AI systems in large-scale sets of epidemiological data to extract inherent, formerly unidentified, and potentially valuable patterns that human-driven deductive models may miss.

  15. Summary of mortality statistics and forest health monitoring results for the Northeastern United States

    Treesearch

    William H. McWilliams; Stanford L. Arner; Charles J. Barnett

    1997-01-01

    The USDA Forest Service's Forest Inventory and Analysis (FIA) program and the Forest Health Monitoring (FHM) program maintain networks of sample locations providing coarse-scale information that characterize general indicators of forest health. Tree mortality is the primary FIA variable for analyzing forest health. Recent FIA inventories of New York, Pennsylvania...

  16. A practical approach for the scale-up of roller compaction process.

    PubMed

    Shi, Weixian; Sprockel, Omar L

    2016-09-01

    An alternative approach to the scale-up of ribbon formation during roller compaction was investigated, requiring only one batch at the commercial scale to set the operational conditions. The scale-up of ribbon formation was based on a probability method, which was sufficient to describe the mechanism of ribbon formation at both scales. In this method, a statistical relationship between roller compaction parameters and ribbon attributes (thickness and density) was first defined by a design of experiments (DoE) using a pilot Alexanderwerk WP120 roller compactor. While milling speed was included in the design, it had no practical effect on granule properties within the study range despite its statistical significance. The statistical relationship was then adapted to a commercial Alexanderwerk WP200 roller compactor with one experimental run, which served as a calibration of the statistical model parameters. The proposed transfer method was then confirmed by conducting a mapping study on the Alexanderwerk WP200 using a factorial DoE, which showed a match between the predictions and the verification experiments. The study demonstrates the applicability of the roller compaction transfer method using the statistical model from the development scale, calibrated with one experimental point at the commercial scale. Copyright © 2016 Elsevier B.V. All rights reserved.
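
    The one-batch transfer idea can be sketched as a pilot-scale model whose intercept is recalibrated with a single commercial-scale run; the linear form, the variable choices, and all numbers below are illustrative assumptions, not the study's actual model.

    ```python
    # Hypothetical sketch: fit ribbon density vs roll force at pilot scale,
    # then shift the intercept using one commercial-scale calibration point
    # while retaining the pilot-scale slope. All data are invented.

    def fit_line(xs, ys):
        """Ordinary least squares for y = a + b*x; returns (a, b)."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
        return my - b * mx, b

    # Pilot-scale DoE: roll force (kN/cm) vs ribbon density (g/cm^3)
    force   = [4.0, 6.0, 8.0, 10.0]
    density = [0.95, 1.05, 1.15, 1.25]
    a_pilot, b = fit_line(force, density)

    # One commercial-scale run recalibrates the intercept only.
    f_cal, d_cal = 8.0, 1.10
    a_commercial = d_cal - b * f_cal

    def predict(f):
        """Predicted commercial-scale ribbon density at roll force f."""
        return a_commercial + b * f

    print(round(b, 3), round(predict(10.0), 3))
    ```

    The design choice mirrors the abstract: the shape of the parameter-attribute relationship is assumed to transfer across scales, and the single run fixes the scale-specific offset.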

  17. Recommendations to improve the positive and negative syndrome scale (PANSS) based on item response theory.

    PubMed

    Levine, Stephen Z; Rabinowitz, Jonathan; Rizopoulos, Dimitris

    2011-08-15

    The adequacy of the Positive and Negative Syndrome Scale (PANSS) items in measuring symptom severity in schizophrenia was examined using Item Response Theory (IRT). Baseline PANSS assessments were analyzed from two multi-center clinical trials of antipsychotic medication in chronic schizophrenia (n=1872). Generally, the results showed that the PANSS (a) item ratings discriminated symptom severity best for the negative symptoms; (b) has an excess of "Severe" and "Extremely severe" rating options; and (c) assessments are more reliable at medium than at very low or high levels of symptom severity. Analysis also showed that the detection of statistically and non-statistically significant differences in treatment was highly similar for the original and the IRT-modified PANSS. In clinical trials of chronic schizophrenia, the PANSS appears to require the following modifications: fewer rating options, adjustment of 'Lack of judgment and insight', and improved severe-symptom assessment. Copyright © 2011 Elsevier Ltd. All rights reserved.

  18. On an additive partial correlation operator and nonparametric estimation of graphical models.

    PubMed

    Lee, Kuang-Yao; Li, Bing; Zhao, Hongyu

    2016-09-01

    We introduce an additive partial correlation operator as an extension of partial correlation to the nonlinear setting, and use it to develop a new estimator for nonparametric graphical models. Our graphical models are based on additive conditional independence, a statistical relation that captures the spirit of conditional independence without having to resort to high-dimensional kernels for its estimation. The additive partial correlation operator completely characterizes additive conditional independence, and has the additional advantage of putting marginal variation on appropriate scales when evaluating interdependence, which leads to more accurate statistical inference. We establish the consistency of the proposed estimator. Through simulation experiments and analysis of the DREAM4 Challenge dataset, we demonstrate that our method performs better than existing methods in cases where the Gaussian or copula Gaussian assumption does not hold, and that a more appropriate scaling for our method further enhances its performance.

  19. On an additive partial correlation operator and nonparametric estimation of graphical models

    PubMed Central

    Li, Bing; Zhao, Hongyu

    2016-01-01

    We introduce an additive partial correlation operator as an extension of partial correlation to the nonlinear setting, and use it to develop a new estimator for nonparametric graphical models. Our graphical models are based on additive conditional independence, a statistical relation that captures the spirit of conditional independence without having to resort to high-dimensional kernels for its estimation. The additive partial correlation operator completely characterizes additive conditional independence, and has the additional advantage of putting marginal variation on appropriate scales when evaluating interdependence, which leads to more accurate statistical inference. We establish the consistency of the proposed estimator. Through simulation experiments and analysis of the DREAM4 Challenge dataset, we demonstrate that our method performs better than existing methods in cases where the Gaussian or copula Gaussian assumption does not hold, and that a more appropriate scaling for our method further enhances its performance. PMID:29422689

  20. Statistics of Magnetic Reconnection X-Lines in Kinetic Turbulence

    NASA Astrophysics Data System (ADS)

    Haggerty, C. C.; Parashar, T.; Matthaeus, W. H.; Shay, M. A.; Wan, M.; Servidio, S.; Wu, P.

    2016-12-01

    In this work we examine the statistics of magnetic reconnection sites (x-lines) and their associated reconnection rates in intermittent current sheets generated in turbulent plasmas. Although such statistics have been studied previously for fluid simulations (e.g., [1]), they have not yet been generalized to fully kinetic particle-in-cell (PIC) simulations. A significant problem with PIC simulations, however, is electrostatic fluctuations generated by numerical particle-counting statistics. We find that analyzing gradients of the magnetic vector potential from the raw PIC field data identifies numerous artificial (non-physical) x-points. Using small Orszag-Tang vortex PIC simulations, we analyze x-line identification and show that these artificial x-lines can be removed using sub-Debye-length filtering of the data. We examine how turbulent properties such as the magnetic spectrum and the scale-dependent kurtosis are affected by particle noise and sub-Debye-length filtering. We subsequently apply these analysis methods to a large-scale kinetic PIC turbulence simulation. Consistent with previous fluid models, we find a range of normalized reconnection rates as large as ½, with the bulk of the rates below approximately 0.1. [1] Servidio, S., W. H. Matthaeus, M. A. Shay, P. A. Cassak, and P. Dmitruk (2009), Magnetic reconnection and two-dimensional magnetohydrodynamic turbulence, Phys. Rev. Lett., 102, 115003.
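
    In 2D, x-points are saddle points of the magnetic vector potential A(x, y): critical points with a negative Hessian determinant. The toy sketch below (not the authors' pipeline, and omitting the sub-Debye-length filtering step) scans a small grid with finite differences, using A = x*y, which has a single x-line at the origin.

    ```python
    # Illustrative saddle-point (x-point) finder on a uniform grid.
    # The field A = x*y and the grid size are hypothetical.

    def saddle_points(A, h=1.0, tol=1e-6):
        """Interior grid indices (i, j) where grad A ~ 0 and det(Hess) < 0."""
        ny, nx = len(A), len(A[0])
        hits = []
        for i in range(1, ny - 1):
            for j in range(1, nx - 1):
                # Central differences for the gradient:
                ax = (A[i][j + 1] - A[i][j - 1]) / (2 * h)
                ay = (A[i + 1][j] - A[i - 1][j]) / (2 * h)
                # Second derivatives for the Hessian:
                axx = (A[i][j + 1] - 2 * A[i][j] + A[i][j - 1]) / h**2
                ayy = (A[i + 1][j] - 2 * A[i][j] + A[i - 1][j]) / h**2
                axy = (A[i + 1][j + 1] - A[i + 1][j - 1]
                       - A[i - 1][j + 1] + A[i - 1][j - 1]) / (4 * h**2)
                if abs(ax) < tol and abs(ay) < tol and axx * ayy - axy**2 < 0:
                    hits.append((i, j))
        return hits

    n = 5  # grid centered on the origin: x, y in {-2, ..., 2}
    A = [[(i - 2) * (j - 2) for j in range(n)] for i in range(n)]
    print(saddle_points(A))
    ```

    On noisy PIC fields, the tolerance check alone flags many spurious critical points, which is why the abstract's sub-Debye-length smoothing matters before counting x-lines.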
